Comments (17)
@Arnold1 Yes, I run most models on the M2 using llama.cpp and alpaca.cpp. I believe they run on the GPU when using llama.cpp. Vicuna runs very fast and well; Alpaca 7B runs faster. Alpaca 30B runs slowly but surely. My MacBook Air has 24 GB, so most models fit just fine. In fact, I've had more success running models locally on Mac than on Windows, since most repos somehow focus on Linux and Mac.
from minigpt-4.
@DexterLagan How do you run it? What commands? I have an M1 with 64 GB of RAM...
Thanks for your interest! Our code is currently only tested on Linux, and I'm also not sure whether the Mac M2 has a powerful enough GPU to run our model. As for the error you show: basically, that step is trying to install CUDA inside the new environment. So if you already have CUDA installed on your machine, you can comment it out.
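For context, the step in question is the cudatoolkit entry in the conda environment file. A sketch of the edit (file layout and version are assumptions, not verified against the repo):

```yaml
# environment.yml (excerpt; entries and versions assumed)
dependencies:
  - python=3.9
  # - cudatoolkit=11.3   # comment this out if CUDA is already installed,
  #                       # or if the machine (e.g. Apple Silicon) has no CUDA
```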
Unfortunately, I tried creating a requirements.txt file for Windows to avoid conda, and I got the same error.
I'll find out how to install cudatoolkit with conda. Vicuna runs perfectly on my M2 with 24 GB, so I'm hoping MiniGPT will also run. I can't see the visual encoder taking that much more RAM, considering Vicuna runs on 8 GB of RAM or less. Cheers
I was able to install nearly all packages into the conda env by chmodding conda's environments.txt to 666 and removing cudatoolkit from environment.yaml. There should be a way to run this model on CPU only.
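A sketch of those two steps, assuming the default conda registry path (adjust to your install):

```shell
# Make conda's environment registry writable (path assumed: ~/.conda)
ENVS_TXT="$HOME/.conda/environments.txt"
[ -f "$ENVS_TXT" ] && chmod 666 "$ENVS_TXT"

# Drop the CUDA-only dependency before creating the env
# (sed -i.bak keeps environment.yml.bak as a backup)
[ -f environment.yml ] && sed -i.bak '/cudatoolkit/d' environment.yml

# Create the environment if conda is on PATH
command -v conda >/dev/null && conda env create -f environment.yml
echo done
```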
@DexterLagan Are you able to run training and inference of Vicuna and MiniGPT on the M2 already? Is it able to use the GPU on the M2?
Follow the instructions in the llama.cpp repo for Mac. The goal was to run on Mac in the first place. Cheers!
I want to run it on a MacBook Pro M1, but I get ValueError: Tokenizer class LLaMATokenizer does not exist or is not currently imported after executing this command: python -m fastchat.model.apply_delta --base llama-7b-hf/ --target vicuna-7b/ --delta vicuna-7b-delta-v1.1/. Help me please! My email: [email protected]
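One frequently reported cause of this error (an assumption here, worth cross-checking against the FastChat issues): older llama-7b-hf checkpoints spell the class LLaMATokenizer in tokenizer_config.json, while newer transformers releases expect LlamaTokenizer. A minimal patch sketch:

```python
import json
from pathlib import Path

# Checkpoint path assumed from the apply_delta command above
cfg_path = Path("llama-7b-hf/tokenizer_config.json")

if cfg_path.exists():
    cfg = json.loads(cfg_path.read_text())
    if cfg.get("tokenizer_class") == "LLaMATokenizer":
        # Newer transformers releases renamed the class to LlamaTokenizer
        cfg["tokenizer_class"] = "LlamaTokenizer"
        cfg_path.write_text(json.dumps(cfg, indent=2))
```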
@Coca-Cola1999 Make sure you search the issues on the llama.cpp repo, and post one if nothing matches. I haven't encountered this error. Cheers!
@DexterLagan Thanks so much for sharing your experience! Could you also share how you connect Vicuna (running through llama.cpp) with MiniGPT-4? I'm running llama.cpp Vicuna on my MacBook now, but I'm still trying to figure out how to connect it to MiniGPT-4. Some directions would be appreciated!
Can we use Docker to run it on Apple Silicon?
See pull request #174.
I changed max_new_tokens=3000 and max_length=20000 in demo.py, then prompted: "On the basis of this picture, create a detailed, wonderful, fascinating novel story."
It generated the following story:
In a land far, far away, there was a young woman named Li who lived in a small village on the outskirts of the city. She was known for her beauty and grace, and many men in the village had asked for her hand in marriage. However, Li had always declined, saying that she was not ready to settle down.
One day, while Li was out walking near the river, she saw a man standing on the bridge. He was dressed in a white robe with gold embroidery and held a fan in his hand. As she approached, he turned to her and bowed deeply.
“Hello, miss,” he said in a smooth, melodious voice. “My name is Tao, and I am a traveler. I have been traveling for many years and have seen many beautiful sights, but none as lovely as you.”
Li blushed at the compliment and lowered her eyes. “Thank you, sir,” she said softly. “But I am just a simple village girl. There is nothing special about me.”
Tao chuckled. “Oh, but there is, miss. You have a kind heart and a gentle spirit, and those are qualities that are rare and precious. I have never met anyone like you before.”
Li looked up at him, surprised. “You have never met anyone like me?” she asked.
Tao smiled. “No, miss. You are unique and special, and I would like to get to know you better.”
Li hesitated for a moment, wondering if she should trust this stranger. But something about him seemed kind and sincere, and she found herself wanting to believe him.
“Very well,” she said finally. “I will allow you to get to know me better. But you must promise to be respectful and kind.”
Tao bowed again. “I promise, miss. I will do my best to make you happy.”
And with that, the two of them began a journey of discovery and love that would take them to the farthest corners of the earth and beyond.
Translated into Chinese and polished with GPT-4:
在一个遥远的土地上,有一个名叫丽的年轻女子,她居住在城市边缘的一个小村庄。她因美貌和优雅而闻名于世,村里许多男子都向她求婚。然而,丽总是婉拒他们,称自己还未准备好安定下来。
一天,丽在河边散步时,看到一名男子站在桥上。他身着白色镶金的长袍,手持一把折扇。当丽走近时,他转身向她鞠躬行礼。
“小姐好,”他用柔和悦耳的声音说道,“我叫涛,是一名行者。我游历多年,目睹过世间诸多美景,但都不及你的美丽动人。”
丽听到这番赞美,脸颊泛红,羞涩地低下了头。“谢谢您,先生,”她轻声说,“但我只是一个普通的乡村姑娘,没有什么特别之处。”
涛笑了笑。“哦,但是有的,小姐。你拥有一颗善良的心和温柔的气质,这样的品质实属罕见且珍贵。我从未遇见过像你这样的人。”
丽抬起头,惊讶地看着他。“您从未遇到过像我这样的人?”她问道。
涛微笑着说:“是的,小姐。你独一无二、与众不同,我希望能更多地了解你。”
丽犹豫了一会儿,不知道是否该信任这个陌生人。但他身上似乎有种善意和真诚,她情不自禁地想要相信他。
“好吧,”她最终说道,“我愿意让你更了解我。但你必须承诺尊重我,对我好。”
涛再次鞠躬。“我承诺,小姐。我会竭尽所能让你幸福。”
就这样,他们开始了一段探索与爱情的旅程,这段旅程将带他们到地球的最远角落,乃至更遥远的地方。
The image was created with Stable Diffusion.
@wacdev Thank you for the fixes. Tested on an M1 16 GB with max_new_tokens=150, max_length=1000 and all your other changes. It runs, but very slowly because of excessive swap usage; describing one image took 25 minutes in my case.
Trying to figure out if the mps device can be used to utilize the GPU.
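A tiny device-selection sketch for that experiment. The availability flags would come from torch.cuda.is_available() and torch.backends.mps.is_available(); they are passed in as booleans here so the logic stays testable without torch installed:

```python
# Sketch: choose the best available torch device string on Apple Silicon.
def pick_device(cuda_available: bool, mps_available: bool) -> str:
    if cuda_available:
        return "cuda"
    if mps_available:
        return "mps"   # Apple Metal backend (PyTorch >= 1.12)
    return "cpu"
```

The returned string would then go into model.to(device); whether all of MiniGPT-4's ops are supported on mps is an open question in this thread.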
@CoruNethron I just tried to run it on an M1 64 GB; not sure what is missing. After "Loading LLAMA", it reported this error:
CUDA SETUP: Required library version not found: libsbitsandbytes_cpu.so. Maybe you need to compile it from source?
...
Is there anything I missed during the installation? I'm using a conda environment created specifically for testing MiniGPT.
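One possible cause (an assumption based on how the demo loads the LLM, not verified here): bitsandbytes ships CUDA-only binaries, and it is pulled in when the eval config requests 8-bit loading. Disabling low-resource mode sidesteps it at the cost of more RAM; a sketch, with the path and key name assumed:

```yaml
# eval_configs/minigpt4_eval.yaml (path assumed)
# low_resource: True loads the LLM in 8-bit via bitsandbytes, which has no
# Apple Silicon build; disabling it avoids the libsbitsandbytes_cpu.so error.
model:
  low_resource: False
```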
@simongcc Please check this fork https://github.com/wacfork/MiniGPT-4/tree/main by @wacdev; there were a few changes, like the device type (if a CUDA device is not available). As far as I remember, I just took all the changes from that fork and also slightly changed max_new_tokens and max_length because of the low RAM in my system.
@CoruNethron Thank you so much for shedding light on this! I will check it out and try. 🙏🏽😃
Update: it can run now, but when it runs, it encounters a small error:
/miniforge3/envs/minigpt4-arm/lib/python3.9/site-packages/torchvision/io/image.py:13: UserWarning: Failed to load image Python extension: dlopen(/miniforge3/envs/minigpt4-arm/lib/python3.9/site-packages/torchvision/image.so, 0x0006): Symbol not found: (__ZN2at4_ops19empty_memory_format4callEN3c108ArrayRefIxEENS2_8optionalINS2_10ScalarTypeEEENS5_INS2_6LayoutEEENS5_INS2_6DeviceEEENS5_IbEENS5_INS2_12MemoryFormatEEE)
so the resulting MiniGPT seems unable to receive the image after uploading.
Did you encounter the same error?