sezercakir / mlp-autoencoder
This project is forked from sumeyyeozturkk/mlp-autoencoder.
Implements a multilayer perceptron (MLP) autoencoder with one hidden layer: 64 input, 2 hidden, and 64 output units, with a tanh activation function on the hidden layer only. The network is trained with mini-batch stochastic gradient descent, using mean squared error (MSE) as the loss function, for multiple epochs on optdigits.tra, and the results are visualized.
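The training setup described above can be sketched as follows. This is a minimal NumPy illustration, not the repository's actual code: the learning rate, batch size, epoch count, and weight initialization are assumptions, and random vectors stand in for the optdigits.tra data, which is not bundled here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in data: optdigits.tra is not available here, so we use random
# 64-dimensional vectors in [0, 1] with the same input width.
X = rng.random((256, 64))

# 64 -> 2 -> 64 autoencoder; tanh on the hidden layer only.
W1 = rng.normal(0.0, 0.1, (64, 2))
b1 = np.zeros(2)
W2 = rng.normal(0.0, 0.1, (2, 64))
b2 = np.zeros(64)

lr, batch_size, epochs = 0.1, 32, 50  # assumed hyperparameters

def forward(x):
    h = np.tanh(x @ W1 + b1)   # hidden layer with tanh activation
    return h, h @ W2 + b2      # linear output (reconstruction)

losses = []
for epoch in range(epochs):
    perm = rng.permutation(len(X))           # shuffle each epoch
    for i in range(0, len(X), batch_size):   # mini-batch SGD
        xb = X[perm[i:i + batch_size]]
        h, out = forward(xb)
        n = len(xb)
        err = out - xb                        # gradient of MSE w.r.t. output
        gW2 = h.T @ err / n
        gb2 = err.mean(axis=0)
        dh = (err @ W2.T) * (1.0 - h ** 2)    # backprop through tanh
        gW1 = xb.T @ dh / n
        gb1 = dh.mean(axis=0)
        W2 -= lr * gW2; b2 -= lr * gb2
        W1 -= lr * gW1; b1 -= lr * gb1
    losses.append(((forward(X)[1] - X) ** 2).mean())  # epoch MSE

print(f"MSE: {losses[0]:.4f} -> {losses[-1]:.4f}")
```

With only 2 hidden units, the bottleneck forces a 2-D code per sample, which is what makes the hidden activations directly plottable for visualization; the reconstruction loss should drop over the epochs even on this stand-in data.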
License: GNU General Public License v3.0