
Comments (7)

lukaszkaiser avatar lukaszkaiser commented on May 3, 2024

There are very many hparams that you can set for Transformer in T2T, and I believe there is a version that'll make it match Trax... but it's gotten really hard to find. Well - that's why we made Trax!

I feel like finding the exact hparams to match Trax may be quite a lot of work, so maybe instead we correct inference in Trax? What are the bugs?

from trax.

lukaszkaiser avatar lukaszkaiser commented on May 3, 2024

Hmm, it may also be that some dense layers in Trax have biases that we don't use in T2T?
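A bias mismatch like that would make the two parameter trees structurally different, so a checkpoint trained under one convention could not be loaded into the other, and the models would not be numerically equivalent. A minimal numpy sketch of the difference (the function names here are illustrative, not either library's API):

```python
import numpy as np

def dense_params(d_in, d_out, use_bias, rng):
    """Initialize a dense layer's parameters; the bias is optional."""
    w = rng.normal(size=(d_in, d_out)).astype(np.float32)
    return (w, np.zeros(d_out, dtype=np.float32)) if use_bias else (w,)

def dense_apply(params, x):
    """Apply y = x @ W (+ b, if the layer has one)."""
    y = x @ params[0]
    if len(params) == 2:
        y = y + params[1]
    return y

rng = np.random.default_rng(0)
with_bias = dense_params(512, 512, use_bias=True, rng=rng)   # 2 arrays: W, b
no_bias = dense_params(512, 512, use_bias=False, rng=rng)    # 1 array: W

# The parameter trees have different shapes, so weights trained with one
# convention cannot be transplanted directly into the other.
print(len(with_bias), len(no_bias))  # 2 1
```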


DevKretov avatar DevKretov commented on May 3, 2024

> There are very many hparams that you can set for Transformer in T2T, and I believe there is a version that'll make it match Trax... but it's gotten really hard to find. Well - that's why we made Trax!

> I feel like finding the exact hparams to match Trax may be quite a lot of work, so maybe instead we correct inference in Trax? What are the bugs?

What puzzles me is that, training the same network on the same data for a relatively easy task, my T2T Transformer fails to converge, while Trax's Transformer learns very well and converges after a few hundred steps. That is the behaviour I expect, since the task is mostly to copy the input and restore the missing diacritics.
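For context, (input, target) pairs for a diacritics-restoration task like the one described above can be built with the standard library alone. This is a hypothetical data-prep sketch, not the actual pipeline used here:

```python
import unicodedata

def strip_diacritics(text):
    """Remove combining marks, e.g. 'háček' -> 'hacek'."""
    decomposed = unicodedata.normalize("NFD", text)
    return "".join(c for c in decomposed if not unicodedata.combining(c))

# Targets carry diacritics; inputs are the same strings with them stripped,
# so the model mostly copies and occasionally restores a mark.
targets = ["háček", "příliš žluťoučký kůň"]
pairs = [(strip_diacritics(t), t) for t in targets]
print(pairs[0])  # ('hacek', 'háček')
```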

One more thing worth mentioning: the learning rate follows a different curve in each framework, even when I fix all of the T2T learning-rate scheduler's parameters and set them equal to Trax's. Take a look. This is Trax:

[Screenshot 2020-04-23 09:37: Trax learning-rate curve]

And this is T2T:

[Screenshot 2020-04-23 09:39: T2T learning-rate curve]
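For reference, both schedulers are nominally variants of the inverse-square-root warmup schedule from the original Transformer paper, so any divergence between the two curves suggests differing constants rather than a different formula. A sketch of that baseline (the d_model and warmup_steps defaults here are illustrative):

```python
def transformer_lr(step, d_model=512, warmup_steps=8000):
    """lr = d_model^-0.5 * min(step^-0.5, step * warmup_steps^-1.5):
    linear warmup, then 1/sqrt(step) decay, peaking at step == warmup_steps."""
    step = max(step, 1)
    return d_model ** -0.5 * min(step ** -0.5, step * warmup_steps ** -1.5)

# Any difference in d_model, warmup_steps, or an extra multiplier shifts
# or rescales the entire curve, which would explain mismatched plots.
print(round(transformer_lr(8000), 6))  # 0.000494
```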

The T2T Transformer also shows huge variance in its training loss. Given the numerous T2T issues reporting the same problem, that was my motivation to get to the bottom of it.

Thank you for your help.


DevKretov avatar DevKretov commented on May 3, 2024

> I feel like finding the exact hparams to match Trax may be quite a lot of work, so maybe instead we correct inference in Trax? What are the bugs?

I would really appreciate it if we could fix these bugs in Trax. One thing I have managed is inference, but in a roundabout way: I load my model with mode='train' and run inference by feeding the Transformer the whole input together with an empty array as the currently decoded output, then feed the prediction back in to get the next token. This effectively replicates greedy beam search with beam size 1, but it still seems a very cumbersome and awkward way to do it.
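The workaround described above can be sketched framework-agnostically. Here `model` is a stand-in for any seq2seq forward pass that returns per-position logits, and `copy_model` is a toy substitute for a trained network; neither is the Trax API:

```python
import numpy as np

def greedy_decode(model, inputs, max_len, eos_id=1, pad_id=0):
    """Greedy (beam size 1) decoding: repeatedly run the full model on the
    source plus the prefix decoded so far, and append the argmax token."""
    decoded = [pad_id]  # start-of-sequence placeholder
    for _ in range(max_len):
        logits = model(inputs, np.array(decoded))  # (len(decoded), vocab)
        next_id = int(np.argmax(logits[-1]))       # greedy pick at last position
        decoded.append(next_id)
        if next_id == eos_id:
            break
    return decoded[1:]

def copy_model(inputs, targets):
    """Toy 'model' that copies the input token at each decode position,
    then emits EOS, mimicking a diacritics-style copy task."""
    vocab = 8
    logits = np.full((len(targets), vocab), -1e9)
    for pos in range(len(targets)):
        tok = inputs[pos] if pos < len(inputs) else 1  # EOS after the input ends
        logits[pos, tok] = 0.0
    return logits

print(greedy_decode(copy_model, np.array([5, 3, 2]), max_len=10))  # [5, 3, 2, 1]
```

Re-running the whole model per token is quadratic in output length, which is exactly why a dedicated eval/predict mode with cached decoder state is preferable.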

One more thing I really want to extract from the model is the attention weights. I'll look into hacking those out based on the strategy from T2T (fortunately, getting attention weights there didn't involve many complications).
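For visualization purposes, the tensor in question is just the softmax matrix computed inside scaled dot-product attention. A self-contained numpy sketch of that computation, returning the weights alongside the output (this illustrates the math, not Trax's internals):

```python
import numpy as np

def attention_with_weights(q, k, v):
    """Scaled dot-product attention that also returns the (n_q, n_k) weight
    matrix, i.e. the tensor one would expose for attention visualization."""
    d_k = q.shape[-1]
    scores = q @ k.T / np.sqrt(d_k)
    scores -= scores.max(axis=-1, keepdims=True)  # for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over keys
    return weights @ v, weights

rng = np.random.default_rng(0)
q, k, v = (rng.normal(size=(4, 8)) for _ in range(3))
out, w = attention_with_weights(q, k, v)
print(w.shape, np.allclose(w.sum(axis=-1), 1.0))  # (4, 4) True
```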


lukaszkaiser avatar lukaszkaiser commented on May 3, 2024

Ok: let's focus on repairing the bugs in Trax! What is the current inference bug? Could you open a separate Issue for that? I'll then close this one, so we can focus on the bugs one-by-one.


DevKretov avatar DevKretov commented on May 3, 2024

Well, I'll open some new issues tomorrow morning. Thanks.


lukaszkaiser avatar lukaszkaiser commented on May 3, 2024

Thanks!

