
neural-net-test10

This is an edited version of the go-deep library: it has been converted to 32-bit floats for better performance, and some extra activation functions have been added (ELU, Mish, and Swish).

**Update:** Three new activation functions have been added: one custom function for experimenting with your own activation functions, plus two new activations, RootX and DoubleRoot. The latter two seem to make it possible to train unusually deep neural networks.

I have also added an experimental Bonds system, designed to strengthen or weaken the bonds between the neurons in each layer. It is not yet clear how well it works, but it is easy to turn off.
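
This README does not document the formulas behind RootX and DoubleRoot. Purely as an illustration of the general idea (the name and formula below are hypothetical, not the library's code), a signed-root activation grows sub-linearly for large inputs, which keeps activations from blowing up as networks get deeper:

package main

import (
	"fmt"
	"math"
)

// signedRoot is a hypothetical root-style activation, NOT the library's
// RootX or DoubleRoot: it is odd-symmetric, smooth at zero, and grows
// like sqrt(|x|) for large |x|, so very deep stacks stay numerically tame.
func signedRoot(x float32) float32 {
	v := float64(x)
	return float32(math.Copysign(math.Sqrt(math.Abs(v)+1)-1, v))
}

func main() {
	for _, x := range []float32{-9, -1, 0, 1, 9} {
		fmt.Printf("signedRoot(%v) = %v\n", x, signedRoot(x))
	}
}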

neural-net

Feed-forward/backpropagation neural network implementation. Currently supports:

  • Activation functions: sigmoid, hyperbolic tangent, ReLU, ELU, Mish, Swish, plus the newer RootX and DoubleRoot (textbook forms of ELU, Swish, and Mish are sketched after this list)
  • Solvers: SGD, SGD with momentum/nesterov, Adam
  • Classification modes: regression, multi-class, multi-label, binary
  • Parallel batch training
  • Bias nodes
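
For reference, these are the textbook forms of ELU, Swish, and Mish listed above. The library's float32 implementations may differ in details (constants, clamping, approximations), so treat this as the standard math rather than the fork's exact code:

package main

import (
	"fmt"
	"math"
)

// Textbook definitions of the added activations; the library's
// float32 versions may differ slightly.

// ELU: identity for x > 0, alpha*(e^x - 1) otherwise.
func elu(x, alpha float64) float64 {
	if x > 0 {
		return x
	}
	return alpha * (math.Exp(x) - 1)
}

// Swish: x * sigmoid(x).
func swish(x float64) float64 { return x / (1 + math.Exp(-x)) }

// Mish: x * tanh(softplus(x)), where softplus(x) = ln(1 + e^x).
func mish(x float64) float64 { return x * math.Tanh(math.Log1p(math.Exp(x))) }

func main() {
	fmt.Println(elu(-1, 1.0), swish(1), mish(1))
}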

Networks are modeled as a set of neurons connected through synapses. There are no GPU computations, so don't use this for any large-scale applications.
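
To make the neurons-and-synapses model concrete, here is an illustrative sketch of such an object graph. The type and field names are hypothetical, not this library's actual API:

// Illustrative object model (hypothetical names, not the library's API):
// each synapse carries one weight between two neurons and caches its
// last forward-pass values for use during backpropagation.
type Synapse struct {
	Weight  float32
	In, Out float32
}

type Neuron struct {
	In  []*Synapse // connections from the previous layer
	Out []*Synapse // connections to the next layer
}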

Todo:

  • Dropout
  • Batch normalization

Install

go get -u github.com/nathanleary/neural-net-test10

Usage

Import the package (aliased deep) and its training subpackage:

import (
	"fmt"
	deep "github.com/nathanleary/neural-net-test10"
	"github.com/nathanleary/neural-net-test10/training"
)

Define some data...

var data = training.Examples{
	{[]float32{2.7810836, 2.550537003}, []float32{0}},
	{[]float32{1.465489372, 2.362125076}, []float32{0}},
	{[]float32{3.396561688, 4.400293529}, []float32{0}},
	{[]float32{1.38807019, 1.850220317}, []float32{0}},
	{[]float32{7.627531214, 2.759262235}, []float32{1}},
	{[]float32{5.332441248, 2.088626775}, []float32{1}},
	{[]float32{6.922596716, 1.77106367}, []float32{1}},
	{[]float32{8.675418651, -0.242068655}, []float32{1}},
}
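
Each example pairs a feature vector with the expected output (here, a 0/1 label). The sketch below shows the assumed shape of one example; Input appears in the prediction snippet further down, while the Response field name follows the upstream go-deep API and may differ in this fork:

// Assumed shape of one training example (field names per upstream
// go-deep; this fork stores float32 rather than float64):
type Example struct {
	Input    []float32 // feature vector, 2 features in the data above
	Response []float32 // expected output, a 0/1 label here
}

// training.Examples is then a slice of examples:
type Examples []Example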

Create a network with two hidden layers of size 2 and 2 respectively:

n := deep.NewNeural(&deep.Config{
	/* Input dimensionality */
	Inputs: 2,
	/* Two hidden layers consisting of two neurons each, and a single output */
	Layout: []int{2, 2, 1},
	/* Activation functions: Sigmoid, Tanh, ReLU, Linear */
	Activation: deep.ActivationSigmoid,
	/* Determines output layer activation & loss function: 
	ModeRegression: linear outputs with MSE loss
	ModeMultiClass: softmax output with Cross Entropy loss
	ModeMultiLabel: sigmoid output with Cross Entropy loss
	ModeBinary: sigmoid output with binary CE loss */
	Mode: deep.ModeBinary,
	/* Weight initializers: {deep.NewNormal(μ, σ), deep.NewUniform(μ, σ)} */
	Weight: deep.NewNormal(1.0, 0.0),
	/* Apply bias */
	Bias: true,
	/* To activate the Bonds system during training, set this to a non-zero value */
	Bonds: 0.00001,
})
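
ModeBinary, selected above, pairs a sigmoid output with binary cross-entropy loss. For reference, this is what that pairing computes (textbook forms, not this library's exact code):

package main

import (
	"fmt"
	"math"
)

// Textbook sigmoid output and binary cross-entropy loss, as selected
// by ModeBinary; not the library's exact implementation.
func sigmoid(x float64) float64 { return 1 / (1 + math.Exp(-x)) }

func binaryCrossEntropy(target, predicted float64) float64 {
	const eps = 1e-12 // clamp to avoid log(0)
	p := math.Min(math.Max(predicted, eps), 1-eps)
	return -(target*math.Log(p) + (1-target)*math.Log(1-p))
}

func main() {
	p := sigmoid(2.0) // raw output 2.0 -> probability ~0.88
	fmt.Println(p, binaryCrossEntropy(1, p))
}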

Train:

// params: learning rate, momentum, alpha decay, nesterov
optimizer := training.NewSGD(0.05, 0.1, 1e-6, true)
// params: optimizer, verbosity (print stats at every 50th iteration)
trainer := training.NewTrainer(optimizer, 50)

training, heldout := data.Split(0.5)
trainer.Train(n, training, heldout, 1000) // training, validation, iterations

resulting in:

Epochs        Elapsed       Error         
---           ---           ---           
5             12.938µs      0.36438       
10            125.691µs     0.02261       
15            177.194µs     0.00404       
...     
1000          10.703839ms   0.00000       
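
For reference, the standard update rule behind NewSGD's momentum and nesterov options; the library's handling of the decay parameter may differ:

// Standard SGD-with-momentum step for a single weight (Nesterov variant).
// lr, momentum, and nesterov mirror the training.NewSGD arguments above;
// this is a reference sketch, not the library's exact code.
func sgdStep(w, velocity, grad, lr, momentum float32, nesterov bool) (float32, float32) {
	velocity = momentum*velocity - lr*grad
	if nesterov {
		// Look-ahead: apply the momentum term once more on top.
		return w + momentum*velocity - lr*grad, velocity
	}
	return w + velocity, velocity
}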

Finally, make some predictions:

fmt.Println(data[0].Input, "=>", n.Predict(data[0].Input))
fmt.Println(data[5].Input, "=>", n.Predict(data[5].Input))
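
In ModeBinary each prediction is a probability, so a common follow-up is to threshold it at 0.5 (plain Go, not a library feature):

// Turn the predicted probability into a hard 0/1 class label.
pred := n.Predict(data[0].Input)
label := 0
if pred[0] > 0.5 {
	label = 1
}
fmt.Println(data[0].Input, "=> class", label)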

Alternatively, batch training can be performed in parallel:

optimizer := training.NewAdam(0.001, 0.9, 0.999, 1e-8)
// params: optimizer, verbosity (print info every nth iteration), batch size, number of workers
trainer := training.NewBatchTrainer(optimizer, 1, 200, 4)

training, heldout := data.Split(0.75)
trainer.Train(n, training, heldout, 1000) // training, validation, iterations
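
The four NewAdam arguments are the standard Adam hyperparameters: learning rate, β1, β2, and ε. For reference, the per-weight update they control (textbook Adam, not the library's exact code; requires importing math):

// Textbook Adam update for one weight at step t (t starts at 1).
// lr, beta1, beta2, eps correspond to training.NewAdam(0.001, 0.9, 0.999, 1e-8).
func adamStep(w, m, v, grad float64, t int, lr, beta1, beta2, eps float64) (float64, float64, float64) {
	m = beta1*m + (1-beta1)*grad      // first-moment (mean) estimate
	v = beta2*v + (1-beta2)*grad*grad // second-moment (variance) estimate
	mHat := m / (1 - math.Pow(beta1, float64(t))) // bias correction
	vHat := v / (1 - math.Pow(beta2, float64(t)))
	return w - lr*mHat/(math.Sqrt(vHat)+eps), m, v
}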

Examples

See training/trainer_test.go for a variety of toy examples of regression, multi-class classification, binary classification, etc.

See examples/ for more realistic examples:

Dataset       Topology      Epochs        Accuracy
---           ---           ---           ---
wines         [5 5]         10000         ~98%
mnist         [50]          25            ~97%
