
SimpleDeepNetToolbox : Simple Deep Net Toolbox in MATLAB


Authors: Hiroyuki Kasai

Last page update: November 14, 2018

Latest library version: 1.0.3 (see Release notes for more info.)


Introduction

The SimpleDeepNetToolbox is a simple, pure-MATLAB toolbox for deep learning. It was originally ported from a Python library; however, major modifications have been made for an efficient MATLAB implementation.

There are other, much better toolboxes available for deep learning, e.g., Theano, Torch, or TensorFlow, and I would definitely recommend using one of them for the problem at hand. The main purpose of this toolbox is instead to allow researchers, especially "MATLAB-lovers", to understand deep learning techniques through simple, "non-black-box" implementations.


Supported networks, layers, and solvers:

  • Networks
    • Feedforward backpropagation neural networks
    • Convolutional neural networks
  • Layers
    • Affine layer
    • Convolution layer
    • Pooling layer
    • Dropout layer
    • Batch normalization layer (under construction)
    • ReLU (rectified linear unit) layer
    • Sigmoid layer
    • Softmax layer
  • Optimization solvers
    • Vanilla SGD
    • AdaGrad
    • Momentum SGD

Folders and files

./                      - Top directory.
./README.md             - This readme file.
./run_me_first.m        - The script that you need to run first.
./demo.m                - Demonstration script to check and understand this package easily. 
./download.m            - Script to download datasets.
|networks/              - Contains various network classes.
|layers/                - Contains various layer classes.
|optimizer/             - Contains optimization solvers.
|test_samples/          - Contains test samples.
|datasets/              - Contains datasets (to be downloaded).

First to do: configure path

Run run_me_first for path configurations.

%% First run the setup script
run_me_first;
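
For reference, a path-setup script of this kind can be sketched roughly as follows. This is a minimal illustration of what such a script typically does, not the actual contents of run_me_first.m, which may differ.

```matlab
% Minimal sketch of a typical path-setup script (assumption: the real
% run_me_first.m may add subfolders more selectively).
addpath(genpath(pwd));   % add the toolbox root and all subfolders to the path
```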

Second to do: download datasets

Run download for downloading datasets.

%% Run the downloading script
download;
  • If your computer is behind a proxy server, please configure your MATLAB proxy settings accordingly.

Simplest usage example: 5 steps!

Just execute demo_two_layer_neuralnet for the simplest demonstration of this package, which trains a two-layer feedforward neural network.

%% load dataset
[x_train, t_train, train_num, x_test, t_test, test_num, class_num, dimension, ~, ~] = ...
    load_dataset('mnist', './datasets/',  inf, inf, false);

%% set network
network = two_layer_net(x_train, t_train, x_test, t_test, 784, 50, 10, []);

%% set trainer
trainer = nn_trainer(network);


%% train
info = trainer.train(); 

%% plot
display_graph('epoch', 'cost', {'Two layer net'}, {}, {info});    

train_info = info;
test_info = info;
train_info.accuracy = info.train_acc;
test_info.accuracy = info.test_acc;
display_graph('epoch', 'accuracy', {'Train', 'Test'}, {}, {train_info, test_info});   


Let's take a closer look at the code above bit by bit. The procedure has only 5 steps!

Step 1: Load dataset

First, we load a dataset, including the train set and the test set, using the data loader function load_dataset(). The outputs include the train set, the test set, and other related data.

[x_train, t_train, train_num, x_test, t_test, test_num, class_num, dimension, ~, ~] = ...
    load_dataset('mnist', './datasets/',  inf, inf, false);

Step 2: Set network

The next step defines the network architecture. This example uses a two-layer neural network with input size 784, hidden layer size 50, and output layer size 10. The datasets are also passed to this class.

%% set network
network = two_layer_net(x_train, t_train, x_test, t_test, 784, 50, 10, []);

Step 3: Set trainer

Next, create a trainer and pass it the network. Training options can be configured via the second argument, which is not used in this example.

%% set trainer
trainer = nn_trainer(network);
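
If you do want to tweak the training behavior, the second argument accepts an options structure. The field names below (max_epoch, learning_rate, verbose) are hypothetical and shown only to illustrate the pattern; check the nn_trainer class for the actual option names.

```matlab
% Hypothetical options structure -- the field names below are illustrative
% assumptions, not confirmed by this toolbox's documentation.
options.max_epoch     = 30;     % number of training epochs
options.learning_rate = 0.1;    % step size for the SGD solver
options.verbose       = true;   % print progress during training
trainer = nn_trainer(network, options);
```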

Step 4: Train the network

Now, you start to train the network.

%% train
info = trainer.train(); 

It returns statistics that include the histories of epoch numbers, cost values, train and test accuracies, and so on.
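
For example, you can read off the final accuracies directly from the returned structure. The field names train_acc and test_acc appear in this demo's plotting code; treat any other field names as assumptions.

```matlab
% Inspect the training statistics returned by trainer.train().
% train_acc / test_acc are used by this demo's plotting code;
% other field names would be assumptions.
final_train_acc = info.train_acc(end);
final_test_acc  = info.test_acc(end);
fprintf('final accuracy: train %.3f, test %.3f\n', ...
        final_train_acc, final_test_acc);
```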

Step 5: Show result

Finally, display_graph() plots how the cost value decreases as the number of epochs grows. The accuracy results for the train set and the test set are also shown.

% plot
display_graph('epoch', 'cost', {'Two layer net'}, {}, {info});    

train_info = info;
test_info = info;
train_info.accuracy = info.train_acc;
test_info.accuracy = info.test_acc;
display_graph('epoch', 'accuracy', {'Train', 'Test'}, {}, {train_info, test_info}); 

That's it!


More plots

TBA.


License

  • The SimpleDeepNetToolbox is free and open source.
  • The code provided in SimpleDeepNetToolbox should only be used for academic/research purposes.
  • This toolbox was originally ported from a Python library.

Notes

  • As always, parameters such as the learning rate should be configured properly for each dataset and network.

Problems or questions

If you have any problems or questions, please contact the author: Hiroyuki Kasai (email: kasai at is dot uec dot ac dot jp)


Release notes

  • Version 1.0.3 (Nov. 14, 2018)
    • Some class structures are re-configured.
  • Version 1.0.2 (Nov. 09, 2018)
    • Some class structures are re-configured.
  • Version 1.0.1 (Nov. 07, 2018)
    • Some class structures are re-configured.
  • Version 1.0.0 (Oct. 08, 2018)
    • Initial version.


