Implementation of a neural network from scratch, using the Sigmoid, tanh, and ReLU activation functions.
Coded a neural network (NN) with two hidden layers in addition to the input and output layers. Implemented the Sigmoid, tanh, and ReLU activation functions, and the backpropagation algorithm for training the network.
The network was then used to make predictions on the three datasets below:
- Car Evaluation Dataset: https://archive.ics.uci.edu/ml/datasets/Car+Evaluation
- Iris Dataset: https://archive.ics.uci.edu/ml/datasets/Iris
- Adult Census Income Dataset: https://archive.ics.uci.edu/ml/datasets/Census+Income
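A training loop for such a two-hidden-layer network can be sketched as below. This is a hedged, self-contained toy sketch, not the repository's code: it uses sigmoid activations throughout, mean-squared-error loss, XOR as a stand-in dataset, and illustrative layer sizes.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)

# Toy stand-in dataset (XOR) so the sketch is self-contained.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Two hidden layers of 8 units each (sizes are illustrative).
W1 = rng.normal(0, 1, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 1, (8, 8)); b2 = np.zeros(8)
W3 = rng.normal(0, 1, (8, 1)); b3 = np.zeros(1)

lr = 0.5
history = []
for _ in range(5000):
    # Forward pass through both hidden layers and the output layer.
    a1 = sigmoid(X @ W1 + b1)
    a2 = sigmoid(a1 @ W2 + b2)
    out = sigmoid(a2 @ W3 + b3)
    history.append(float(np.mean((out - y) ** 2)))

    # Backward pass: propagate the error, scaling by the sigmoid
    # derivative a * (1 - a) at each layer.
    d_out = (out - y) * out * (1 - out)
    d_a2 = (d_out @ W3.T) * a2 * (1 - a2)
    d_a1 = (d_a2 @ W2.T) * a1 * (1 - a1)

    # Gradient-descent weight updates.
    W3 -= lr * a2.T @ d_out; b3 -= lr * d_out.sum(0)
    W2 -= lr * a1.T @ d_a2; b2 -= lr * d_a2.sum(0)
    W1 -= lr * X.T @ d_a1;  b1 -= lr * d_a1.sum(0)

preds = (out > 0.5).astype(int)
```

For the real datasets, the categorical features (e.g. in Car Evaluation and Census Income) would additionally need to be encoded numerically before being fed to the network.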