Perceptron Simulator

Click on the cells to turn them on or off. Adjust the bias of each cell to see how it affects the output of the perceptron.




Input matrix

Create shapes / letters / numbers by turning on the cells.





Bias Adjustment

Adjust the bias of the corresponding cell to make the perceptron learn.
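
The page does not spell out the exact computation behind the grid, but a rough sketch of what a single-layer perceptron over a 4x4 grid typically computes is shown below. This is an assumption about how the simulator works, not its actual source code: each cell's on/off state is multiplied by its adjustable per-cell value, the products are summed, and the perceptron outputs 1 when the sum crosses a threshold.

import numpy as np

# Assumed model of the simulator (not its actual source code): each cell's
# on/off state is multiplied by its adjustable per-cell value, the products
# are summed, and the neuron fires when the sum crosses a threshold.

GRID_SIZE = 4  # the simulator uses a 4x4 grid

def perceptron_output(cells, values, threshold=0.0):
    # cells:  4x4 array of 0/1 (which cells are turned on)
    # values: 4x4 array of the adjustable per-cell numbers
    weighted_sum = float(np.sum(cells * values))
    return 1 if weighted_sum > threshold else 0

# Hypothetical example: turn on the diagonal and favour those cells.
cells = np.eye(GRID_SIZE, dtype=int)
values = np.where(cells == 1, 0.5, -0.25)
print(perceptron_output(cells, values))  # -> 1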






Single Layer Perceptron Explanation

This Perceptron Simulator is a visual and interactive representation of a simplified Single Layer Perceptron. A Single Layer Perceptron is one of the oldest and most fundamental neural network models, originally proposed by Frank Rosenblatt in 1958. It is primarily used for binary classification tasks and can compute simple, linearly separable logical gates such as AND, OR, and NOR.
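
To make the classification idea concrete, here is a minimal sketch (in Python, separate from the simulator itself) of a single-layer perceptron computing the AND gate with hand-picked weights and a bias:

import numpy as np

def step(z):
    # Simplified activation: fire (1) if the weighted sum is positive.
    return 1 if z > 0 else 0

w = np.array([1.0, 1.0])  # one weight per input
b = -1.5                  # bias chosen so only (1, 1) pushes the sum above zero

for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    y = step(np.dot(w, x) + b)
    print(x, "->", y)  # prints the AND truth table: 0, 0, 0, 1

Changing only the bias to -0.5 turns the same neuron into an OR gate, which is exactly the kind of effect the bias adjustment above lets you explore.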


Core Functionality of a Single Layer Perceptron (as Simulated)


Key Concepts Demonstrated by This Simulation


Relation to MNIST Dataset

While this simulation uses a 4x4 grid, the MNIST dataset consists of 28x28 pixel images. A single-layer perceptron could be used to classify MNIST digits, but it would require a much larger input layer (784 inputs) and proper training on labelled examples. However, single-layer perceptrons are limited to linearly separable problems, and therefore have very limited use with complex datasets like MNIST.
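
As a rough illustration of that scale difference, the sketch below applies the classic perceptron learning rule to a 784-dimensional input. It is a sketch only: the "images" and labels are random placeholders, not real MNIST data, which would have to be loaded separately.

import numpy as np

# Placeholder data, not real MNIST: 100 flattened 28x28 binary inputs
# and 100 binary labels (e.g. "is this digit a 0?").
rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(100, 784))
y = rng.integers(0, 2, size=100)

w = np.zeros(784)  # one weight per pixel
b = 0.0

for _ in range(10):  # a few passes over the data
    for xi, target in zip(X, y):
        pred = 1 if np.dot(w, xi) + b > 0 else 0
        update = target - pred  # Rosenblatt's rule: -1, 0, or +1
        w += update * xi
        b += update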

In summary, this simulator provides a valuable visual and interactive way to understand the fundamental building blocks of a single-layer perceptron. It effectively demonstrates the concepts of inputs, weights, weighted sums, and simplified activation.



This project was developed by Priyangsu Banerjee.
Inspiration: Welch Labs
Source Code: View on GitHub
Connect with me: LinkedIn | Website
Last updated: 06.03.2025
