Recently I stumbled upon an excellent demo of a 2-layer neural network written by Florian Muellerklein: https://github.com/FlorianMuellerklein/Machine-Learning.
It is written in Python using numpy and focuses on digit recognition based on the sklearn digits dataset.
I decided to play around with it and add a visualization of the learning process for a sine function.
I used matplotlib to create the animation.
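For reference, training data for this kind of experiment can be generated along these lines (a minimal sketch; the sample count and input range are my assumptions, not taken from the repository):

```python
import numpy as np

# Sample inputs uniformly from one period of the sine function.
# The sample count (200) and range are illustrative assumptions.
X = np.random.uniform(0, 2 * np.pi, size=(200, 1))
y = np.sin(X)  # target values the network should learn to reproduce
```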
The neural network implementation is typical: it uses standard gradient descent with a few optimizations, such as momentum, regularization, and random weight initialization.
If you are interested in understanding exactly how gradient descent works, I highly recommend this article by Matt Mazur: http://mattmazur.com/2015/03/17/a-step-by-step-backpropagation-example.
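To give a rough idea, a single weight update with momentum and L2 regularization looks something like the sketch below. This is an illustration only, not the repository's actual code; the function name and hyperparameter values are my assumptions.

```python
import numpy as np

def update_weights(weights, gradient, velocity,
                   learning_rate=0.01, momentum=0.9, decay=0.0001):
    """One gradient descent step with momentum and L2 regularization.

    A sketch only; hyperparameter values are illustrative assumptions.
    """
    # L2 regularization adds a penalty proportional to the weights themselves.
    gradient = gradient + decay * weights
    # Momentum blends the previous update direction into the current one.
    velocity = momentum * velocity - learning_rate * gradient
    return weights + velocity, velocity
```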
The network architecture I used was the following (sketched in code after the list):
- 1 input neuron (x parameter of sine function)
- 60 hidden neurons
- 1 output neuron (the function result)
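In code, a forward pass through this 1-60-1 network can be sketched as below. The tanh hidden activation and the initialization scale are my assumptions; the actual implementation may differ.

```python
import numpy as np

# Random initialization: small weights help gradient descent start smoothly.
# The 1-60-1 shapes mirror the architecture above; the 0.1 scale is an assumption.
w_hidden = np.random.randn(1, 60) * 0.1
b_hidden = np.zeros(60)
w_output = np.random.randn(60, 1) * 0.1
b_output = np.zeros(1)

def forward(x):
    """Forward pass: x is an (n, 1) array of sine inputs."""
    hidden = np.tanh(x @ w_hidden + b_hidden)  # 60 hidden activations
    return hidden @ w_output + b_output        # linear output for regression
```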
Here is a link to the source code: https://github.com/rafalrusin/Machine-Learning/tree/master2
It converges pretty well. Here's an animation showing the convergence to the sine function over consecutive learning iterations (10 learning iterations per frame):
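An animation like this can be produced roughly as follows (a minimal sketch assuming the `forward` function from the sketch above and a hypothetical `train_step` helper that runs one gradient descent iteration; the frame count and interval are illustrative):

```python
import numpy as np
import matplotlib.pyplot as plt
from matplotlib.animation import FuncAnimation

xs = np.linspace(0, 2 * np.pi, 200).reshape(-1, 1)

fig, ax = plt.subplots()
ax.plot(xs, np.sin(xs), label='target sine')
line, = ax.plot(xs, forward(xs), label='network output')
ax.legend()

def animate(frame):
    # Run 10 learning iterations between frames, matching the animation above.
    for _ in range(10):
        train_step()  # hypothetical helper: one gradient descent iteration
    line.set_ydata(forward(xs))
    return line,

anim = FuncAnimation(fig, animate, frames=100, interval=50, blit=True)
plt.show()
```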
Machine learning is a fascinating domain that has been advancing rapidly in recent years, mainly in visual object recognition. I hope it keeps up this pace in the future (or even exceeds it!).