
deep-learning

A fast neural network implementation in C++

Compile with build.sh

Uncomment the required files for the build, then run:

./build.sh

Prerequisites

Boost C++

sudo apt-get install libboost-all-dev

g++

sudo apt-get install g++

Sample: Building a neural network in 5 lines of code.

NeuralNetwork nn = NeuralNetwork();
nn.add_layer(new ReluLayer(28 * 28, 128, LEARNING_RATE, LAMBDA));
nn.add_layer(new ReluLayer(128, 32, LEARNING_RATE, LAMBDA));
nn.add_layer(new SoftmaxLayer(32, 10, LEARNING_RATE, LAMBDA));
nn.add_loss_function(new SoftmaxCrossEntropyLoss());

Check out the notebook branch

Test the implementation with an interactive Jupyter notebook.


Documentation

Building blocks of the model

  • class Matrix

    file: deep-learning/matrix/matrix.h

    Supports all basic matrix operations, including fast matrix dot products.
    All inputs must be fed into the network using this class, and all internal
    calculations operate on these Matrix objects.

    Constructor: Matrix(int n, int m, char rand_init)

    Initializes an n x m matrix with suitable random values.
    rand_init = 'u': uniform distribution between 0 and 1
    rand_init = 'n': normal distribution with mean 0 and variance 1
    rand_init = 0 (default): zero initialization
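
As an illustration of the three rand_init modes (a standalone sketch using the standard <random> facilities, not the library's actual code), the initialization logic looks like this:

```cpp
#include <random>
#include <vector>

// Standalone sketch of the Matrix initialization modes.
// The real class lives in deep-learning/matrix/matrix.h.
std::vector<std::vector<float>> make_matrix(int n, int m, char rand_init) {
    std::mt19937 gen(42);  // fixed seed for reproducibility in this sketch
    std::uniform_real_distribution<float> uniform(0.0f, 1.0f);
    std::normal_distribution<float> normal(0.0f, 1.0f);  // mean 0, variance 1

    std::vector<std::vector<float>> a(n, std::vector<float>(m, 0.0f));
    for (int i = 0; i < n; i++) {
        for (int j = 0; j < m; j++) {
            if (rand_init == 'u') a[i][j] = uniform(gen);
            else if (rand_init == 'n') a[i][j] = normal(gen);
            // default (rand_init = 0): entry stays zero-initialized
        }
    }
    return a;
}
```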

  • class Layer

    file: deep-learning/layer/layer.h

    The Layer class does what its name says: it represents a layer of the neural
    network. It is an abstract class that implements the key forward propagation,
    backward propagation, and optimization steps. Concrete layers must extend this
    class with their own activation function by overriding the
    Matrix activation(Matrix Z); and Matrix backward_activation(Matrix Z); methods.

    Constructor: Layer(int l_, int l, float learning_rate, float lambda, float beta1, float beta2)

    Initialize the layer with the given parameters.
    l_: number of units in previous layer
    l: number of units for the layer
    learning_rate: step size for optimization algorithm
    lambda: regularization hyperparameter
    beta1: first-moment decay rate for the 'adam' optimizer
    beta2: second-moment decay rate for the 'adam' optimizer

    The following activations are already implemented:

    1. class SigmoidLayer: public Layer
    2. class ReluLayer: public Layer
    3. class SoftmaxLayer: public Layer

    Mathematical details of the forward and backward propagation steps are described later.
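
As an example, here is a standalone sketch (not the repo's implementation, which operates on Matrix objects) of the two methods a ReLU layer overrides: the elementwise activation and its derivative used in backpropagation:

```cpp
#include <algorithm>
#include <vector>

// activation(Z) for ReLU: elementwise max(0, z).
std::vector<float> relu(const std::vector<float>& z) {
    std::vector<float> a(z.size());
    for (size_t i = 0; i < z.size(); i++) a[i] = std::max(0.0f, z[i]);
    return a;
}

// backward_activation(Z) for ReLU: derivative is 1 where z > 0, else 0.
std::vector<float> relu_backward(const std::vector<float>& z) {
    std::vector<float> d(z.size());
    for (size_t i = 0; i < z.size(); i++) d[i] = z[i] > 0.0f ? 1.0f : 0.0f;
    return d;
}
```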

  • class LossFunction

    file: deep-learning/loss/loss.h

    This is the abstract base class for all loss functions. Concrete loss
    functions must implement the float cost(Matrix A, Matrix Y); and
    Matrix derivative(Matrix A, Matrix Y); methods.

    The following loss functions are already implemented:

    1. class BinaryCrossEntropyLoss: public LossFunction
    2. class SoftmaxCrossEntropyLoss: public LossFunction

    NOTE: SoftmaxCrossEntropyLoss::derivative computes the derivative of the cross-entropy loss combined with the softmax activation layer, which simplifies to A - Y.

    Mathematical details of the forward and backward propagation steps are described later.
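
The combined derivative in the note above is worth illustrating: for softmax outputs A and one-hot labels Y, the gradient of the cross-entropy loss with respect to the pre-softmax logits is simply A - Y. A standalone sketch (not the library's Matrix-based code) over a single column of logits:

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

// Numerically stable softmax over one column of logits.
std::vector<float> softmax(const std::vector<float>& z) {
    float mx = z[0];
    for (float v : z) mx = std::max(mx, v);
    float sum = 0.0f;
    std::vector<float> a(z.size());
    for (size_t i = 0; i < z.size(); i++) {
        a[i] = std::exp(z[i] - mx);  // subtract max to avoid overflow
        sum += a[i];
    }
    for (float& v : a) v /= sum;
    return a;
}

// Derivative of cross-entropy loss taken through the softmax layer:
// dL/dZ = A - Y (A = softmax output, Y = one-hot labels).
std::vector<float> softmax_ce_derivative(const std::vector<float>& a,
                                         const std::vector<float>& y) {
    std::vector<float> d(a.size());
    for (size_t i = 0; i < a.size(); i++) d[i] = a[i] - y[i];
    return d;
}
```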

Putting it all together

  • class NeuralNetwork

    file: deep-learning/neuralnetwork/neuralnetwork.h

    Methods:

    1. void add_layer(Layer * L);
      Adds a fully connected layer to the network.

    2. void add_loss_function(LossFunction * J);
      Adds target loss function to the network.

    3. float train_batch(Matrix X, Matrix Y, int num_iterations);
      Trains the weights for num_iterations iterations on the given inputs and labels.
      Returns the cost.

    4. Matrix predict(Matrix X);
      Returns the network's output for the given input.

    5. void save(string filename);
      Saves the network and its state to a file.

    6. void load(string filename);
      Loads a pre-trained network from file.

Sample: Logistic Regression

NeuralNetwork nn = NeuralNetwork();
nn.add_layer(new SigmoidLayer(NUM_INPUTS, 1, LEARNING_RATE, LAMBDA));
nn.add_loss_function(new BinaryCrossEntropyLoss());
nn.train_batch(X_train, Y_train, 1000);
Matrix Y_pred = nn.predict(X_test);
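
To make the sample concrete, here is a standalone single-feature sketch of what train_batch and predict do for a sigmoid unit with binary cross-entropy: plain batch gradient descent. The struct and parameter names are illustrative, not the library's API:

```cpp
#include <cmath>
#include <vector>

float sigmoid(float z) { return 1.0f / (1.0f + std::exp(-z)); }

// One-feature logistic regression trained with batch gradient descent,
// mirroring what the library does for a single SigmoidLayer with
// BinaryCrossEntropyLoss (illustrative sketch only).
struct LogisticModel {
    float w = 0.0f, b = 0.0f;

    float train_batch(const std::vector<float>& x, const std::vector<float>& y,
                      int num_iterations, float lr) {
        int n = static_cast<int>(x.size());
        float cost = 0.0f;
        for (int it = 0; it < num_iterations; it++) {
            float dw = 0.0f, db = 0.0f;
            cost = 0.0f;
            for (int i = 0; i < n; i++) {
                float a = sigmoid(w * x[i] + b);
                // binary cross-entropy: -(y log a + (1 - y) log(1 - a))
                cost += -(y[i] * std::log(a) + (1 - y[i]) * std::log(1 - a));
                dw += (a - y[i]) * x[i];  // dL/dw
                db += (a - y[i]);         // dL/db
            }
            w -= lr * dw / n;
            b -= lr * db / n;
        }
        return cost / n;  // mean cost over the batch
    }

    float predict(float x) const { return sigmoid(w * x + b); }
};
```

Each iteration computes activations for the whole batch, accumulates the gradients, and takes one step; the library's train_batch follows the same pattern with Matrix operations.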
