njblur/mini_neural

A tiny neural network implementation

The code here is mainly intended to demonstrate the concepts and mathematical theory behind neural networks. Everything is written from scratch, with NumPy as the only math library. A standard feed-forward neural network has been implemented with the following activation and loss functions: linear, sigmoid, ReLU, and softmax. Weight decay (L2 regularization) is also included. An MNIST example is built on top of this experimental network. A simple LSTM is implemented as well, though without sequence or batch support yet.
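Since the repository's actual API is not shown on this page, the following is only a minimal NumPy sketch of the kind of building blocks described above (sigmoid and softmax activations, an L2 weight-decay term, and a feed-forward pass); all function names and shapes are illustrative assumptions, not the repo's real interface.

```python
import numpy as np

def sigmoid(x):
    # Logistic sigmoid activation.
    return 1.0 / (1.0 + np.exp(-x))

def softmax(x):
    # Subtract the row-wise max for numerical stability.
    e = np.exp(x - np.max(x, axis=-1, keepdims=True))
    return e / np.sum(e, axis=-1, keepdims=True)

def l2_penalty(weights, lam):
    # Weight decay (L2 regularization) term added to the data loss.
    return 0.5 * lam * sum(np.sum(w * w) for w in weights)

# Forward pass of a one-hidden-layer network (illustrative shapes,
# e.g. 784 inputs and 10 classes as in MNIST).
rng = np.random.default_rng(0)
W1, b1 = rng.standard_normal((784, 128)) * 0.01, np.zeros(128)
W2, b2 = rng.standard_normal((128, 10)) * 0.01, np.zeros(10)

x = rng.standard_normal((32, 784))         # a batch of 32 inputs
h = sigmoid(x @ W1 + b1)                   # hidden layer
p = softmax(h @ W2 + b2)                   # class probabilities
loss_reg = l2_penalty([W1, W2], lam=1e-4)  # added to the data loss
```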

Update 2017-09-07

I have just finished implementing a convolutional neural network and a LeNet-like MNIST example.
It is a fully functional convolutional network with padding, strides, batching, ReLU, dropout, and a momentum optimizer. With an RNN implemented a few days ago, mini_neural is no longer just 'mini'; it is on its way to becoming a deep learning framework like Caffe or TensorFlow. Cheers!
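The actual implementation is not reproduced on this page, so here is a rough NumPy sketch, under assumed names and default values, of three of the pieces mentioned above: the output-size arithmetic for a convolution with padding and strides, inverted dropout, and a momentum update with weight decay.

```python
import numpy as np

def conv_out_size(n, k, pad, stride):
    # Output spatial size of a convolution over an input of size n
    # with kernel size k, zero-padding pad, and the given stride.
    return (n + 2 * pad - k) // stride + 1

def dropout(x, p_drop=0.5, training=True):
    # Inverted dropout: scale surviving units at train time so
    # inference needs no rescaling.
    if not training:
        return x
    mask = (np.random.rand(*x.shape) >= p_drop) / (1.0 - p_drop)
    return x * mask

def momentum_step(w, grad, velocity, lr=0.01, mu=0.9, weight_decay=1e-4):
    # Classic momentum SGD with L2 weight decay folded into the gradient:
    # v <- mu * v - lr * (grad + decay * w);  w <- w + v.
    # Hyperparameter values here are illustrative defaults.
    velocity = mu * velocity - lr * (grad + weight_decay * w)
    return w + velocity, velocity

# Example: a 28x28 MNIST image, 5x5 kernel, stride 1, no padding -> 24.
print(conv_out_size(28, 5, pad=0, stride=1))
```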

Contact me at [email protected] if you find this interesting.
