Comments (2)
I got the same bug from NeMo. When I comment out lines 339-342 in
.local/lib/python3.10/site-packages/nemo/collections/asr/parts/utils/asr_confidence_utils.py
the script works:
# OmegaConf.structured ensures that post_init check is always executed
#confidence_cfg = OmegaConf.structured(
# ConfidenceConfig() if confidence_cfg is None else ConfidenceConfig(**confidence_cfg)
#)
#self.confidence_method_cfg = confidence_cfg.method_cfg
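For context, here is a minimal, stdlib-only sketch of why lines like those can crash. The field names (`exclude_blank`, `method_cfg`) and the `MethodConfig` and `build_confidence_cfg` helpers below are illustrative stand-ins, not NeMo's actual schema: the point is that `ConfidenceConfig(**confidence_cfg)` re-instantiates a dataclass from a user-supplied dict, so any key the dataclass does not define raises a `TypeError` before `OmegaConf.structured` ever runs.

```python
from dataclasses import dataclass, field

# Illustrative stand-ins for NeMo's config classes -- NOT the real schema.
@dataclass
class MethodConfig:
    name: str = "max_prob"

@dataclass
class ConfidenceConfig:
    exclude_blank: bool = True
    method_cfg: MethodConfig = field(default_factory=MethodConfig)

def build_confidence_cfg(confidence_cfg=None):
    # Mirrors the shape of the commented-out lines: rebuild the dataclass
    # from a user dict so its fields are validated against the schema.
    if confidence_cfg is None:
        return ConfidenceConfig()
    return ConfidenceConfig(**confidence_cfg)

# A dict whose keys match the schema passes:
ok = build_confidence_cfg({"exclude_blank": False})

# A dict carrying a key the schema does not define fails with TypeError,
# which is one way the original lines can crash on old/stale configs:
try:
    build_confidence_cfg({"stale_or_renamed_key": 1})
    raised = False
except TypeError:
    raised = True
```

Note that commenting the lines out skips this validation entirely, so it is a workaround rather than a fix; aligning the config dict's keys with the current `ConfidenceConfig` fields would address the root cause.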
Related Issues (20)
- Unusually high initial loss during continual pre-training of the Gemma2-2B model.
- Can't run basic inference
- Continual training error: FileNotFoundError: [Errno 2] No such file or directory: '/tmp/tmpbqgpune1/model_weights/model.decoder.layers.self_attention.linear_proj._extra_state/shard_0_16.pt'
- 00_NeMo_Primer.ipynb in Google Collab fail
- Convert Mamba2 Hybrid .nemo model to .safetensors / .bin
- RuntimeError: stack expects each tensor to be equal size (when using lhotse shar data sets)
- megatron.core.dist_checkpointing.core.CheckpointingException: Object shard /ckpt/model_weights/model.decoder.layers.self_attention.core_attention._extra_state/shard_0_80.pt not found
- Problem running LoRA PEFT on Llama 3 8B Instruct using NeMo docker container
- Cosine Similarity to Probability
- Add a checkpoint averaging script for the new .distcp checkpoint format
- ImportError: cannot import name '_TORCH_GREATER_EQUAL_2_0' from 'lightning_fabric.utilities.imports' (/usr/local/lib/python3.10/dist-packages/lightning_fabric/utilities/imports.py)
- NeMo Container 24.07: NLPSaveRestoreConnector.save_to() is calling modelopt.torch.opt.plugins.save_sharded_modelopt_state with an unsupported parameter
- Megatron -> .nemo checkpoint conversion script `megatron_lm_ckpt_to_nemo.py` fails
- `always_save_nemo` not working properly
- Canary-1b on long audio file.
- Sortformer Integration Release Inquiry
- RuntimeError: Internal: could not parse ModelProto from /tmp/tmpnf2ficir/tokenizer.decoder.32000.BPE.model
- Continued PreTraining Notebook of llama3.1,mistral-nemo
- Neva - Llama-3.1 support
- Mamba inference server init_process_group error for 1 gpu