Comments (6)
@KillianLucas Is it supposed to behave like this? It only answers "yes" without actually doing anything:
Can you speak English?
yes, I can.
Can you write C++ code?
yes, I can.
Can you write a bubble sort algorithm?
yes, I can.
from open-interpreter.
[?] Parameter count (smaller is faster, larger is more capable): 7B
    7B
    16B
    34B
[?] Quality (lower is faster, higher is more capable): Low | Size: 3.01 GB, RAM usage: 5.51 GB
    Low | Size: 3.01 GB, RAM usage: 5.51 GB
    Medium | Size: 4.24 GB, RAM usage: 6.74 GB
    High | Size: 7.16 GB, RAM usage: 9.66 GB
[?] Use GPU? (Large models might crash on GPU, but will run more quickly) (...: y
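One quick way to check whether the GPU is actually being used after answering "y" here (this assumes an NVIDIA card with working drivers, which the thread doesn't confirm for every system):

```shell
# Poll GPU utilization and memory once per second while the model runs.
# If llama.cpp offloading is working, a python process appears in the
# process list and GPU memory usage jumps when the model loads.
watch -n 1 nvidia-smi
```

If GPU memory stays flat while the model generates, the wheel was most likely built without GPU support.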
I'm hitting the same issue.
Hey @hotwa and @mathpopo! I've been seeing this across several systems. We're going to change interface packages for CodeLlama in the next week or so, and hopefully that should solve this problem. I'll keep you updated. In the meantime, this works for some users:
pip install --force-reinstall --upgrade llama-cpp-python
Let me know if the GPU gets used after rebuilding llama-cpp-python like that. Thanks!
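For CUDA users, a plain reinstall often rebuilds a CPU-only wheel; llama-cpp-python's 0.1.x releases documented a `CMAKE_ARGS` build switch for enabling cuBLAS. A hedged sketch of that rebuild (adjust the flag for your hardware; not an official open-interpreter instruction):

```shell
# Rebuild llama-cpp-python from source with cuBLAS (NVIDIA) support.
# --no-cache-dir prevents pip from reusing a previously built CPU-only wheel.
CMAKE_ARGS="-DLLAMA_CUBLAS=on" FORCE_CMAKE=1 \
  pip install --force-reinstall --upgrade --no-cache-dir llama-cpp-python
```

Without `--no-cache-dir`, pip may silently reinstall the cached wheel and the GPU flags never take effect.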
@KillianLucas Sorry, that didn't work for me:
(base) chenxin@chenxin-Nitro-AN515-52:$ conda activate open-interpreter
(open-interpreter) chenxin@chenxin-Nitro-AN515-52:$ pip install --force-reinstall --upgrade llama-cpp-python
Looking in indexes: https://pypi.org/simple, https://pypi.ngc.nvidia.com
Collecting llama-cpp-python
Downloading llama_cpp_python-0.1.83.tar.gz (1.8 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.8/1.8 MB 935.3 kB/s eta 0:00:00
Installing build dependencies ... done
Getting requirements to build wheel ... done
Preparing metadata (pyproject.toml) ... done
Collecting typing-extensions>=4.5.0 (from llama-cpp-python)
Obtaining dependency information for typing-extensions>=4.5.0 from https://files.pythonhosted.org/packages/ec/6b/63cc3df74987c36fe26157ee12e09e8f9db4de771e0f3404263117e75b95/typing_extensions-4.7.1-py3-none-any.whl.metadata
Downloading typing_extensions-4.7.1-py3-none-any.whl.metadata (3.1 kB)
Collecting numpy>=1.20.0 (from llama-cpp-python)
Obtaining dependency information for numpy>=1.20.0 from https://files.pythonhosted.org/packages/32/6a/65dbc57a89078af9ff8bfcd4c0761a50172d90192eaeb1b6f56e5fbf1c3d/numpy-1.25.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata
Downloading numpy-1.25.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (5.6 kB)
Collecting diskcache>=5.6.1 (from llama-cpp-python)
Obtaining dependency information for diskcache>=5.6.1 from https://files.pythonhosted.org/packages/3f/27/4570e78fc0bf5ea0ca45eb1de3818a23787af9b390c0b0a0033a1b8236f9/diskcache-5.6.3-py3-none-any.whl.metadata
Downloading diskcache-5.6.3-py3-none-any.whl.metadata (20 kB)
Downloading diskcache-5.6.3-py3-none-any.whl (45 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 45.5/45.5 kB 1.6 MB/s eta 0:00:00
Downloading numpy-1.25.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (18.2 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 18.2/18.2 MB 1.2 MB/s eta 0:00:00
Downloading typing_extensions-4.7.1-py3-none-any.whl (33 kB)
Building wheels for collected packages: llama-cpp-python
Building wheel for llama-cpp-python (pyproject.toml) ... done
Created wheel for llama-cpp-python: filename=llama_cpp_python-0.1.83-cp311-cp311-linux_x86_64.whl size=413883 sha256=54697e6f41cc6048446d5f80e174a933487d406e5fb75302d4d55dc65bfe5426
Stored in directory: /tmp/pip-ephem-wheel-cache-l9dvavp4/wheels/25/d2/5b/9f3c919284f260835ba686d041a70e06ae7c35adc62493188e
Successfully built llama-cpp-python
Installing collected packages: typing-extensions, numpy, diskcache, llama-cpp-python
Attempting uninstall: typing-extensions
Found existing installation: typing_extensions 4.7.1
Uninstalling typing_extensions-4.7.1:
Successfully uninstalled typing_extensions-4.7.1
Attempting uninstall: numpy
Found existing installation: numpy 1.25.2
Uninstalling numpy-1.25.2:
Successfully uninstalled numpy-1.25.2
Attempting uninstall: diskcache
Found existing installation: diskcache 5.6.3
Uninstalling diskcache-5.6.3:
Successfully uninstalled diskcache-5.6.3
Attempting uninstall: llama-cpp-python
Found existing installation: llama-cpp-python 0.1.83
Uninstalling llama-cpp-python-0.1.83:
Successfully uninstalled llama-cpp-python-0.1.83
Successfully installed diskcache-5.6.3 llama-cpp-python-0.1.83 numpy-1.25.2 typing-extensions-4.7.1
This is now a duplicate of #168. If you still need help, please leave a comment on that issue.