Comments (7)
There are very many hparams that you can set for Transformer in T2T, and I believe there is a version that'll make it match Trax... but it's gotten really hard to find. Well - that's why we made Trax!
I feel like finding the exact hparams to match Trax may be quite a lot of work, so maybe instead we correct inference in Trax? What are the bugs?
from trax.
Hmm, it may also be that some dense layers in Trax have biases that we don't use in T2T?
What blows my mind is that, when training the same network on the same data for a relatively easy task, my T2T Transformer fails to converge, while the Trax Transformer learns well and converges after a few hundred steps, which is the behaviour I expect, since the task is mostly to copy the input and restore the missing diacritics.
One more thing worth mentioning: the learning rate follows a different curve in the two frameworks, even when I fix all of the T2T learning-rate scheduler's parameters and set them equal to those Trax uses. Take a look. This is Trax:
And this is T2T:
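Both curves are presumably meant to trace the standard Transformer schedule (linear warmup, then inverse-square-root decay), so a minimal reference implementation is useful for checking which framework deviates. The `d_model` and `warmup` values below are assumptions, not the exact hparams from either run:

```python
import math

def transformer_lr(step, d_model=512, warmup=4000):
    """Standard Transformer learning-rate schedule: linear warmup for
    `warmup` steps, then decay proportional to 1/sqrt(step)."""
    step = max(step, 1)  # avoid division by zero at step 0
    return d_model ** -0.5 * min(step ** -0.5, step * warmup ** -1.5)
```

The two branches of the `min` meet exactly at `step == warmup`, which is where the peak learning rate should appear in both plots; printing the schedule at a few steps from each framework and comparing against this function quickly shows where they diverge.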
T2T's Transformer also shows huge variance in the loss during training. Given the numerous T2T issues reporting the same problem, that was my motivation to get this straightened out.
Thank you for your help.
I would really appreciate it if we could fix these bugs in Trax. One thing I have managed is inference, but in a roundabout way: I load the model with mode='train' and run inference by feeding the Transformer the whole input together with an empty array as the currently decoded output, then feed the prediction back in to get the next token. This process effectively replicates greedy beam search with beam size 1, but it still seems a very cumbersome and awkward way to do it.
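The loop described above can be sketched framework-agnostically. This is a hedged sketch of the workaround, not the Trax API: `model` stands in for a forward pass in `mode='train'` that takes the full source plus the partial target and returns per-position logits, and `copy_model` is a toy stub used only to make the example runnable:

```python
import numpy as np

def greedy_decode(model, inputs, eos_id=1, max_len=32):
    """Greedy autoregressive decoding (beam search with beam size 1):
    repeatedly run the model on (inputs, decoded-so-far) and append the
    argmax of the final position's logits, until EOS or max_len."""
    decoded = []
    for _ in range(max_len):
        # The model sees the full source and the partial target;
        # logits has shape [len(decoded) + 1, vocab].
        logits = model(inputs, decoded)
        next_id = int(np.argmax(logits[-1]))
        if next_id == eos_id:
            break
        decoded.append(next_id)
    return decoded

def copy_model(inputs, decoded, vocab=8):
    """Toy stand-in: deterministically 'copies' the source, then emits EOS."""
    pos = len(decoded)
    target = inputs[pos] if pos < len(inputs) else 1  # 1 = EOS
    logits = np.zeros((pos + 1, vocab))
    logits[-1, target] = 1.0
    return logits

print(greedy_decode(copy_model, [5, 3, 7]))  # → [5, 3, 7]
```

The awkward part the comment complains about is visible here: every step re-runs the model on the entire prefix from scratch, instead of caching decoder state between steps as a dedicated inference mode would.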
One more thing I really want to get out of the model is the attention weights. I'll take a look at how to extract them based on the strategy from T2T (fortunately, getting attention weights there didn't involve many complications).
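For reference, the quantities in question are just the softmaxed score matrices of scaled dot-product attention. This minimal NumPy definition shows what a single head's weights look like; it is not the Trax extraction mechanism itself, only the tensor one would want to surface:

```python
import numpy as np

def attention_weights(q, k):
    """Scaled dot-product attention weights for one head:
    softmax(Q K^T / sqrt(d_k)), shape [n_queries, n_keys]."""
    d_k = q.shape[-1]
    scores = q @ k.T / np.sqrt(d_k)
    scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
    w = np.exp(scores)
    return w / w.sum(axis=-1, keepdims=True)
```

Each row is a probability distribution over the keys (rows sum to 1), which is what makes these matrices directly plottable as the attention heatmaps T2T exposes.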
Ok: let's focus on repairing the bugs in Trax! What is the current inference bug? Could you open a separate Issue for that? I'll then close this one, so we can focus on the bugs one-by-one.
Well, I'll open some new issues tomorrow morning. Thanks.
Thanks!