
Various Logic Gates Using Perceptron (ANN) in Python

By Tanveer Chawla

This project demonstrates how logic gates such as AND and OR can be built with the perceptron algorithm, the basic unit of an artificial neural network model.

Artificial neural networks (ANNs), usually called simply neural networks (NNs) or connectionist systems, are computing systems vaguely inspired by the biological neural networks that constitute animal brains. The data structures and functionality of neural nets are designed to simulate associative memory. Neural nets learn by processing examples, each of which contains a known "input" and "result," forming probability-weighted associations between the two, which are stored within the data structure of the net itself.

The perceptron is an algorithm for supervised learning of binary classifiers. A binary classifier is a function that can decide whether or not an input, represented by a vector of numbers, belongs to some specific class. It is a type of linear classifier, i.e. a classification algorithm that makes its predictions based on a linear predictor function combining a set of weights with the feature vector.
The perceptron model is a more general computational model than the McCulloch-Pitts neuron. It takes an input, aggregates it (as a weighted sum), and returns 1 only if the aggregated sum exceeds some threshold, else returns 0. The threshold can be rewritten as a constant input with a variable weight, i.e. a bias term. A single perceptron can only be used to implement linearly separable functions. It takes both real and Boolean inputs and associates a set of weights with them, along with a bias (the threshold mentioned above). Once we learn the weights, we have the function. Let's use a perceptron to learn an OR function, as sketched below.
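As a quick illustration (a minimal sketch, not the packet's exact code), a single perceptron with hand-picked weights can already compute OR. The weights 1, 1 and bias -0.5 are illustrative values, chosen so the weighted sum crosses the threshold whenever either input is 1:

def perceptron_or(x1, x2):
    # aggregate the inputs as a weighted sum, with the threshold
    # folded in as a bias of -0.5
    weighted_sum = 1 * x1 + 1 * x2 - 0.5
    return 1 if weighted_sum > 0 else 0  # fire only above the threshold

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", perceptron_or(a, b))  # 0 only for (0, 0)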

The Perceptron is inspired by the information processing of a single neural cell called a neuron. A neuron accepts input signals via its dendrites, which pass the electrical signal down to the cell body. In a similar way, the Perceptron receives input signals from examples of training data, which are weighted and combined in a linear equation called the activation.

1. activation = sum(weight_i * x_i) + bias

The activation is then transformed into an output value or prediction using a transfer function, such as the step transfer function.

2. prediction = 1.0 if activation >= 0.0 else 0.0
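Put together, steps 1 and 2 amount to a few lines of Python. This is a minimal sketch; the function and variable names are illustrative, not necessarily the packet's:

def predict(inputs, weights, bias):
    # step 1: activation as a weighted sum of the inputs plus the bias
    activation = sum(w * x for w, x in zip(weights, inputs)) + bias
    # step 2: step transfer function
    return 1.0 if activation >= 0.0 else 0.0

# Illustrative weights and bias for an AND gate:
print(predict([1, 1], [0.5, 0.5], -0.7))  # 1.0
print(predict([1, 0], [0.5, 0.5], -0.7))  # 0.0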

In this way, the Perceptron is a classification algorithm for problems with two classes (0 and 1) where a linear equation (like a line or hyperplane) can be used to separate the two classes. It is closely related to linear regression and logistic regression, which make predictions in a similar way (e.g. a weighted sum of inputs). The weights of the Perceptron algorithm must be estimated from your training data using stochastic gradient descent, as sketched below.
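A hedged sketch of that weight estimation, using the classic perceptron update (after each training example, nudge each weight by learning_rate * error * input); the function name, learning rate, and epoch count here are illustrative choices, not taken from the packet:

def train_perceptron(data, labels, learning_rate=0.1, epochs=20):
    weights = [0.0] * len(data[0])
    bias = 0.0
    for _ in range(epochs):
        for inputs, target in zip(data, labels):
            activation = sum(w * x for w, x in zip(weights, inputs)) + bias
            prediction = 1.0 if activation >= 0.0 else 0.0
            error = target - prediction  # 0 when the prediction is right
            bias += learning_rate * error
            weights = [w + learning_rate * error * x
                       for w, x in zip(weights, inputs)]
    return weights, bias

# Learn the OR truth table:
weights, bias = train_perceptron([[0, 0], [0, 1], [1, 0], [1, 1]],
                                 [0, 1, 1, 1])

Because OR is linearly separable, this loop converges to weights and a bias that classify all four input rows correctly.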

The following project implements the AND, OR, NAND, and NOR gates through functions such as the perceptron and its activation function. The activation function used is the sigmoid, which can be changed according to the requirements; a sketch of this setup follows.
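As a rough sketch of that setup (the exact function names in the packet may differ): the sigmoid squashes the activation into (0, 1), and thresholding its output at 0.5 gives the gate's binary result. The weights -2, -2 and bias 3 below are illustrative values for a NAND gate:

import math

def sigmoid(z):
    # squashes any real activation into the range (0, 1)
    return 1.0 / (1.0 + math.exp(-z))

def gate_output(inputs, weights, bias):
    activation = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1 if sigmoid(activation) >= 0.5 else 0

# Illustrative weights and bias for a NAND gate:
print([gate_output([a, b], [-2, -2], 3)
       for a in (0, 1) for b in (0, 1)])  # [1, 1, 1, 0]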

