
Comments (4)

martin-gorner commented on August 30, 2024

> I didn't find how […] to have a minimum learning rate

Hmm, I hear the ancients knew of a secret way. The knowledge is lost now, only an arcane symbol reached us through time, its meaning shrouded in mystery: +

As in:
learning_rate_with_min = min + tf.train.exponential_decay(...)

;-)
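A minimal numeric sketch of that trick, in plain Python with no TensorFlow; the floor and decay parameters below are illustrative values, not taken from the tutorial:

```python
# Sketch of the "+" trick: an exponential decay offset by a floor.
# min_lr, max_lr, decay_steps and decay_rate are hypothetical values.
def lr_with_min(step, min_lr=0.0001, max_lr=0.003,
                decay_steps=2000, decay_rate=0.5):
    # exponential_decay alone: max_lr * decay_rate ** (step / decay_steps)
    # adding min_lr shifts the whole curve up so it converges to min_lr
    return min_lr + (max_lr - min_lr) * decay_rate ** (step / decay_steps)

print(lr_with_min(0))      # starts at max_lr (0.003)
print(lr_with_min(10**6))  # converges to min_lr (~0.0001)
```

Scaling the decayed term by `(max_lr - min_lr)` keeps the starting rate at `max_lr`; a bare `min_lr + decay` would start at `min_lr + max_lr` instead.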

from tensorflow-mnist-tutorial.

kakawait commented on August 30, 2024

I was so focused on the API signature and searching the docs that I forgot basic arithmetic... Good catch :)


kakawait commented on August 30, 2024

Another possible solution would be to delegate the decay computation to TensorFlow:

import tensorflow as tf  # TF 1.x API

global_step = tf.Variable(0, trainable=False)
# multiply the rate by 0.97 every 100 steps, in discrete jumps (staircase)
learning_rate = tf.train.exponential_decay(0.003, global_step, 100, 0.97, staircase=True)
# passing global_step lets minimize() increment it on every training step
train_op = tf.train.AdamOptimizer(learning_rate).minimize(loss, global_step=global_step)

Some results from the training phase (3.0 convolutional):

Epoch: 0001 - cost = 0.317358517 - learning rate = 0.002576
Epoch: 0002 - cost = 0.068440056 - learning rate = 0.002212
Epoch: 0003 - cost = 0.044457262 - learning rate = 0.001843
Epoch: 0004 - cost = 0.030760335 - learning rate = 0.001582
Epoch: 0005 - cost = 0.021672835 - learning rate = 0.001318
Epoch: 0006 - cost = 0.014874582 - learning rate = 0.001132
Epoch: 0007 - cost = 0.012011148 - learning rate = 0.000943
Epoch: 0008 - cost = 0.006284746 - learning rate = 0.000810
Epoch: 0009 - cost = 0.004536705 - learning rate = 0.000674
Epoch: 0010 - cost = 0.003143254 - learning rate = 0.000579
Epoch: 0011 - cost = 0.001789587 - learning rate = 0.000482
Epoch: 0012 - cost = 0.001161188 - learning rate = 0.000414
Epoch: 0013 - cost = 0.000715020 - learning rate = 0.000345
Epoch: 0014 - cost = 0.000526037 - learning rate = 0.000296
Epoch: 0015 - cost = 0.000266390 - learning rate = 0.000247
Epoch: 0016 - cost = 0.000154369 - learning rate = 0.000212
Epoch: 0017 - cost = 0.000115868 - learning rate = 0.000177
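For reference, the staircase schedule behind these numbers can be reproduced in plain Python; the ~550 steps per epoch below is an assumption (55,000 MNIST training images at batch size 100), not something stated in the log:

```python
def staircase_decay(initial_lr, step, decay_steps, decay_rate):
    # tf.train.exponential_decay with staircase=True computes
    # initial_lr * decay_rate ** floor(step / decay_steps)
    return initial_lr * decay_rate ** (step // decay_steps)

# Assuming ~550 steps per epoch, the end of epoch 1 lands near step 550,
# i.e. 5 completed decay periods of 100 steps each:
print(round(staircase_decay(0.003, 550, 100, 0.97), 6))  # -> 0.002576
```

That value matches the rate logged for epoch 1 above, which suggests the schedule is applied per optimizer step, not per epoch.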

The only thing you can't reproduce this way (I didn't find how, I mean) is enforcing a minimum learning rate.
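One common workaround (a sketch, not something from the tutorial) is to clamp the decayed value with a floor, e.g. wrapping the decay tensor in `tf.maximum(decayed, min_lr)` in TF 1.x. The plain-Python equivalent, with hypothetical parameter values:

```python
def decay_with_floor(initial_lr, step, decay_steps, decay_rate, min_lr):
    # clamp the staircase-decayed rate so it never drops below min_lr;
    # all parameter values used below are illustrative
    decayed = initial_lr * decay_rate ** (step // decay_steps)
    return max(decayed, min_lr)

print(decay_with_floor(0.003, 10**6, 100, 0.97, 0.0001))  # -> 0.0001
```

Unlike the additive trick earlier in the thread, clamping leaves the early part of the schedule completely unchanged and only kicks in once the decay crosses the floor.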


martin-gorner commented on August 30, 2024

Python 2 compatibility tested. Everything OK.

