Silicon Photonic Architecture for Training Deep Neural Networks

Simulation code for our research article "Silicon Photonic Architecture for Training Deep Neural Networks", published in Optica.

In our paper, we propose on-chip training of neural networks enabled by a CMOS-compatible silicon photonic architecture to harness the potential for massively parallel, efficient, and fast data operations. Our scheme employs the direct feedback alignment training algorithm, which trains neural networks using error feedback rather than error backpropagation, and can operate at speeds of trillions of multiply-accumulate (MAC) operations per second while consuming less than one picojoule per MAC operation. The photonic architecture exploits parallelized matrix-vector multiplications using arrays of microring resonators for processing multi-channel analog signals along single waveguide buses to calculate the gradient vector for each neural network layer in situ. Our novel approach for efficient, ultra-fast neural network training showcases photonics as a promising platform for executing AI applications.
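For reference, direct feedback alignment projects the output error through a fixed random feedback matrix rather than through the transposed forward weights used by backpropagation. The following is a minimal NumPy sketch of a single DFA update for a network with one hidden layer; it is illustrative only and does not reproduce the repository's implementation (all variable names are hypothetical).

import numpy as np

# Minimal sketch of one direct feedback alignment (DFA) update
# for a single-hidden-layer network (illustrative, not the repo's code).
rng = np.random.default_rng(0)

n_in, n_hidden, n_out = 784, 800, 10
W1 = rng.normal(0, 0.1, (n_hidden, n_in))    # input -> hidden weights
W2 = rng.normal(0, 0.1, (n_out, n_hidden))   # hidden -> output weights
B1 = rng.normal(0, 0.1, (n_hidden, n_out))   # fixed random feedback matrix

x = rng.normal(0, 1, n_in)                   # one input sample
t = np.eye(n_out)[3]                         # one-hot target

a1 = W1 @ x                                  # hidden pre-activation
h1 = np.tanh(a1)                             # hidden activation
y = W2 @ h1                                  # output (linear readout for brevity)

e = y - t                                    # output error
# DFA: project the output error through the fixed random matrix B1
# instead of W2.T, then apply the local activation derivative.
delta1 = (B1 @ e) * (1 - np.tanh(a1) ** 2)

lr = 0.01
W2 -= lr * np.outer(e, h1)                   # output-layer update
W1 -= lr * np.outer(delta1, x)               # hidden-layer update via DFA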

Simulation

This Python program simulates the training of feedforward neural networks on the MNIST dataset using analog photonic hardware. In our architecture, each MAC operation is performed by a microring resonator. The simulation injects Gaussian noise, scaled to represent the error in our experimental inner-product measurements, into the output of each MAC operation in the matrix-vector multiplication used to calculate the gradient. Further details on the simulation of our photonic architecture are given in the Supplementary Material.
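As an illustration of this noise model, the sketch below adds Gaussian noise to each element of a matrix-vector product, standing in for the per-MAC measurement error. The relative scaling by the magnitude of the ideal output is an assumption for illustration; the exact scaling used in the repository (controlled by the --error-mean and --error-std arguments) may differ.

import numpy as np

def noisy_matvec(W, x, error_mean=0.03, error_std=0.618, rng=None):
    """Matrix-vector product with Gaussian noise added to each MAC output.

    Assumption: noise is drawn per output element with the given mean and
    standard deviation and scaled by the magnitude of the ideal result;
    the repository's exact scaling may differ.
    """
    rng = np.random.default_rng() if rng is None else rng
    y = W @ x                                         # ideal (noise-free) result
    noise = rng.normal(error_mean, error_std, y.shape)
    return y + noise * np.abs(y)                      # relative-error noise model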

Usage

Neural network training is run with the main.py script, which accepts command-line arguments specifying the simulation parameters:

python main.py --hidden-layers 800 800 --error-std 0.618 --error-mean 0.03

The full list of command-line arguments for main.py can be displayed with:

python main.py --help
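The exact argument handling lives in main.py; the snippet below is only a sketch of how the flags shown above might be declared with argparse (the defaults are hypothetical).

import argparse

# Sketch of an argument parser for the flags shown above (defaults are hypothetical).
parser = argparse.ArgumentParser(description="Simulate photonic DFA training on MNIST.")
parser.add_argument("--hidden-layers", type=int, nargs="+", default=[800, 800],
                    help="Sizes of the hidden layers.")
parser.add_argument("--error-std", type=float, default=0.618,
                    help="Standard deviation of the injected Gaussian MAC noise.")
parser.add_argument("--error-mean", type=float, default=0.03,
                    help="Mean of the injected Gaussian MAC noise.")
args = parser.parse_args()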

License

This work is distributed under the MIT License. See LICENSE for more information.

Acknowledgements

This code was written by Matthew Filipovich as part of a graduate research project at Queen's University (Kingston, Canada), supervised by Bhavin Shastri.
