My implementations of various machine learning algorithms and architectures, for self-education purposes:
- MNIST Linear Model
- MNIST Convolutional Neural Network
- Neural Style Transfer (VGG16, TensorFlow) (work in progress)
- Neural Style Transfer (VGG19, PyTorch)
- Simple Gradient Descent
- Simple Autoencoder
- MNIST GAN (Generative Adversarial Network) (work in progress)
- MNIST Convolutional GAN (work in progress)
- LSTM (Long Short Term Memory) Cell (work in progress)
- GRU (Gated Recurrent Unit) Cell (work in progress)
- 3-layer RNN (LSTM) Network (text generation) (work in progress)
- Dynamic RNN Loop (TensorFlow)
- Encoder-Decoder Seq2Seq Model (work in progress)
- Encoder-Decoder Seq2Seq Attention Model
- Seq2Seq Convolutional Model (PyTorch)
- Word Embedding with Word2Vec
- Automatic Symbolic Differentiation
- Neural Network Framework (work in progress)
- Neural Network Framework V2 (TensorFlow-like API, with automatic differentiation and computation graphs) (work in progress)
- Neural Network Framework V2 MNIST (work in progress)
- Cheatsheet
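To give a flavor of the simplest entry above (Simple Gradient Descent), here is a minimal toy sketch, not the repository's actual code: it minimizes f(x) = (x - 3)² by repeatedly stepping against the analytic gradient.

```python
# Toy gradient descent on f(x) = (x - 3)^2, whose minimum is at x = 3.
def grad(x):
    # Analytic derivative: f'(x) = 2 * (x - 3)
    return 2.0 * (x - 3.0)

x = 0.0           # starting point (arbitrary)
lr = 0.1          # learning rate
for _ in range(100):
    x -= lr * grad(x)

print(round(x, 4))  # converges close to 3.0
```

The same update rule, applied per-parameter to a loss over a dataset, underlies the training loops in the neural-network entries above.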