Click on the cells to turn them on or off. Adjust the bias of each cell to see how it affects the output of the perceptron.
Adjust the bias of the corresponding cell to make the perceptron learn.
This Perceptron Simulator is a visual and interactive representation of a simplified single-layer perceptron. The single-layer perceptron is one of the oldest and most fundamental neural network models, originally proposed by Frank Rosenblatt in 1958. It is primarily used for binary classification tasks and can compute simple logic gates such as AND, OR, and NOR.
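To make the logic-gate claim concrete, here is a minimal sketch of a perceptron computing the AND gate. The function name, weights, and threshold are illustrative choices (hand-picked, not learned), not taken from the simulator's code:

```javascript
// A step-activation perceptron: output 1 if the weighted sum
// (plus bias) is positive, otherwise 0.
function perceptron(inputs, weights, bias) {
  const sum = inputs.reduce((acc, x, i) => acc + x * weights[i], bias);
  return sum > 0 ? 1 : 0;
}

// AND gate: fires only when both inputs are 1.
const andWeights = [1, 1];
const andBias = -1.5;
console.log(perceptron([1, 1], andWeights, andBias)); // 1
console.log(perceptron([1, 0], andWeights, andBias)); // 0
console.log(perceptron([0, 0], andWeights, andBias)); // 0
```

Swapping the weights and bias (e.g. bias -0.5 with the same weights) turns the same structure into an OR gate.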
The updatePotentiometer() function calculates the weighted sum of the inputs. When a cell is "on," its bias (weight) is added to the sum; when it is "off," the bias is subtracted. This is a form of weighted summation.
While this simulation uses a 4x4 grid, the MNIST dataset involves 28x28 pixel images. A single-layer perceptron could be used to classify MNIST digits, but it would require a much larger input layer (784 inputs) and more sophisticated training. However, single-layer perceptrons are limited to linearly separable problems, and therefore have very limited use with complex datasets like MNIST.
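For a sense of what "more sophisticated training" means even in the separable case, here is a sketch of the classic perceptron learning rule (error-driven weight updates) trained on the OR gate, which is linearly separable. The function names, learning rate, and epoch count are illustrative assumptions, not part of the simulator:

```javascript
// Rosenblatt's perceptron learning rule: nudge each weight by
// lr * (target - output) * input, and the bias by lr * (target - output).
function trainPerceptron(samples, epochs = 20, lr = 0.1) {
  let weights = [0, 0];
  let bias = 0;
  for (let e = 0; e < epochs; e++) {
    for (const { x, y } of samples) {
      const out = x[0] * weights[0] + x[1] * weights[1] + bias > 0 ? 1 : 0;
      const err = y - out; // -1, 0, or +1 with a step activation
      weights = weights.map((w, i) => w + lr * err * x[i]);
      bias += lr * err;
    }
  }
  return { weights, bias };
}

// OR gate: linearly separable, so the rule converges.
const orData = [
  { x: [0, 0], y: 0 },
  { x: [0, 1], y: 1 },
  { x: [1, 0], y: 1 },
  { x: [1, 1], y: 1 },
];
const model = trainPerceptron(orData);
const predict = (x) =>
  x[0] * model.weights[0] + x[1] * model.weights[1] + model.bias > 0 ? 1 : 0;
console.log(orData.map(({ x }) => predict(x))); // [0, 1, 1, 1]
```

Running the same loop on XOR data never converges, no matter how many epochs you allow: no single line separates XOR's classes, which is exactly the limitation noted above.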
In summary, this simulator provides a valuable visual and interactive way to understand the fundamental building blocks of a single-layer perceptron. It effectively demonstrates the concepts of inputs, weights, weighted sums, and simplified activation.
This project was developed by Priyangsu Banerjee.
Inspiration: Welch Labs
Source Code: View on GitHub
Connect with me: LinkedIn | Website
Last updated: 06.03.2025