Comments (5)
Full-parameter training?
Do you mean fine-tuning all the parameters of an LLM? If so, there is a PR for that you could check out: #645
from mlx-examples.
Not fine-tuning. I mean pre-training the model.
For example, I need to use the model in certain professional domains.
With a model like Llama, I want to add some tokens and retrain it.
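Adding tokens means the model's embedding (and output) matrices must grow to the new vocabulary size before any retraining. A minimal, framework-agnostic sketch of that step (NumPy, with toy shapes as illustrative assumptions), initializing the new rows to the mean of the existing embeddings, a common heuristic:

```python
import numpy as np

def resize_embeddings(emb: np.ndarray, num_new_tokens: int) -> np.ndarray:
    """Grow an embedding matrix of shape (vocab_size, dim) by num_new_tokens rows.

    New rows are initialized to the mean of the existing embeddings so the
    added tokens start near the model's existing distribution.
    """
    mean_row = emb.mean(axis=0, keepdims=True)
    new_rows = np.repeat(mean_row, num_new_tokens, axis=0)
    return np.concatenate([emb, new_rows], axis=0)

# Toy example: a 5-token vocabulary with 4-dim embeddings, adding 2 tokens.
emb = np.random.randn(5, 4).astype(np.float32)
resized = resize_embeddings(emb, 2)
print(resized.shape)  # (7, 4)
```

The same reshaping would be done on the tied or separate output projection; in MLX the resized array can then be loaded back into the model's embedding layer.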
Based on your comment it sounds like you want to fine-tune the model. But are you saying you want an example that trains something like a Llama model from scratch? That is very computationally expensive.
Yes, because a Mac Studio is much cheaper than NVIDIA hardware.
On a Mac with 192 GB of memory, I want to see whether I can train a model like Llama from scratch.
Training something like Llama from scratch on a single Mac Studio is a tall order. Llama 3 was trained on 15 trillion tokens on a cluster of 24,000 H100 GPUs. The difference in compute between that and a single Mac Studio is probably well over 100,000x.
Training a much smaller model, or fine-tuning, would be far more feasible. Here is an example in MLX of training a Transformer LM from scratch: https://github.com/ml-explore/mlx-examples/tree/main/transformer_lm
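To make that compute gap concrete, here is a rough back-of-the-envelope sketch using the common ~6·N·D approximation for training FLOPs (N parameters, D tokens). The model size and sustained-throughput numbers are round illustrative assumptions, not measured figures:

```python
def training_flops(num_params: float, num_tokens: float) -> float:
    """Approximate total training compute via the common ~6 * N * D rule."""
    return 6.0 * num_params * num_tokens

# Illustrative assumption: an 8B-parameter model trained on 15T tokens.
total = training_flops(8e9, 15e12)
print(f"total training FLOPs: {total:.2e}")

# Assumed sustained throughputs (round, illustrative numbers):
mac_studio_flops_per_s = 20e12          # ~20 TFLOPS on one Mac Studio
cluster_flops_per_s = 24_000 * 400e12   # 24,000 GPUs at ~400 TFLOPS each

seconds_per_year = 365 * 24 * 3600
print(f"single Mac Studio: ~{total / mac_studio_flops_per_s / seconds_per_year:.0f} years")
print(f"compute ratio, cluster vs. one Mac: ~{cluster_flops_per_s / mac_studio_flops_per_s:.0e}x")
```

Under these assumptions the single-machine estimate comes out in centuries, which is why a much smaller model is the realistic target for one Mac.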
Related Issues (20)
- CLIP Tokenizer unable to take text of unequal length (or token length) HOT 7
- Package 'mlx_whisper.assets' is absent from the `packages` configuration HOT 1
- [Feature Request] Add support for logprobs to the mlx_lm server HOT 3
- Phi-3 128K Context Variants' `su` RoPE Scaling HOT 12
- [REGRESSION] Some MoE models display 0% GPU utilization with mlx-ops 0.14.0 HOT 3
- I would like to inquire about a solution to the following problem. HOT 1
- link to Phi-2 example in readme broken HOT 1
- SPMStreamingDetokenizer sometimes outputs incorrect multi-byte characters HOT 1
- Why change the module decomposition of whisper HOT 3
- A simple enhancement, in dataset creation time HOT 1
- [Question]about creating the 'adapters.npz' file HOT 3
- [QUESTION] Is there a way to provide a Huggingface access token for downloading models that are private? HOT 1
- [Model Request] Add support for IBM's Granite model HOT 2
- [Feature] Export Lora Adapters as GGML HOT 3
- Error when running inference on newly converted OpenELM MLX model, ValueError(f"Received parameters not in model: {extras}.") HOT 1
- LLMEvaluator : libc++abi: terminating due to uncaught exception of type std::invalid_argument: [matmul] Last dimension of first input with shape (1,916,2048) must match second to last dimension of second input with shape (256,32000)
- Unable to allocate memory
- Proposal: Add mypy to .pre-commit-config.yml HOT 2
- Fusing adapters with llama3 cause bad performances HOT 7