
test-agent's People

Contributors

hailianzhl, jglee2046, mcc202307, pandadevs, ss41979310


test-agent's Issues

Problems encountered during environment installation

Environment: Mac M1 Pro
Python version: 3.10.7

When setting up the environment, pip install -r requirements.txt runs for a very long time (about 6-7 hours) while pip keeps backtracking through versions of llama_index and langchain, and it eventually fails with an error.

(screenshots of the resolver output and the final error omitted)
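Long backtracking runs like this can sometimes be cut short by constraining the packages pip keeps retrying. A minimal sketch; the pins below are illustrative only, not the project's tested versions:

```shell
# Write illustrative pins for the two packages pip keeps backtracking
# over; the exact versions here are examples, adjust to what the repo
# was actually tested against.
cat > constraints.txt <<'EOF'
llama-index==0.8.38
langchain==0.0.312
EOF
```

Then re-run the install as pip install -r requirements.txt -c constraints.txt, so the resolver no longer explores other versions of those two packages.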

Unable to send messages

Deployment completed, but sending a message returns: NETWORK ERROR DUE TO HIGH TRAFFIC. PLEASE REGENERATE OR REFRESH THIS PAGE.

Web service starts but does not respond

Ran the following commands in order, each in its own terminal:
python -m chat.server.controller
(screenshot omitted)

Then in another terminal:
python -m chat.server.model_worker --model-path models/TestGPT-7B --device cpu
(screenshot omitted)

Finally in another terminal:
python -m chat.server.gradio_testgpt
(screenshot omitted)

Nothing happens at all. Why is this? Any help would be appreciated.
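When the page shows nothing, a first step is to confirm the controller and worker are actually listening before suspecting the UI. A small probe sketch; ports 21001 and 21002 are the defaults that appear in this project's worker log, so adjust them if your setup differs:

```python
import urllib.request
import urllib.error

def service_up(url: str, timeout: float = 2.0) -> bool:
    """Best-effort reachability probe: True if anything answers at url."""
    try:
        urllib.request.urlopen(url, timeout=timeout)
        return True
    except urllib.error.HTTPError:
        return True   # the server answered, even if with an error status
    except (urllib.error.URLError, OSError):
        return False  # nothing listening on that port

# Default ports from the worker log: controller 21001, worker 21002.
for name, url in [("controller", "http://localhost:21001"),
                  ("worker", "http://localhost:21002")]:
    print(name, "up" if service_up(url) else "down")
```

If either service reports down, restart it and check its terminal for a traceback before launching the Gradio frontend.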

Error when running pip install -r requirements.txt

Preparing metadata (setup.py) ... error
error: subprocess-exited-with-error

× python setup.py egg_info did not run successfully.
│ exit code: 1
╰─> [17 lines of output]
No CUDA runtime is found, using CUDA_HOME='C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.3'
test.c
LINK : fatal error LNK1181: cannot open input file "aio.lib"
Traceback (most recent call last):
File "", line 2, in
File "", line 34, in
File "C:\Users\33\AppData\Local\Temp\pip-install-7rd2r2v0\deepspeed_0aefcc0fcaa04450b49543c945a2152f\setup.py", line 182, in
abort(f"Unable to pre-compile {op_name}")
File "C:\Users\33\AppData\Local\Temp\pip-install-7rd2r2v0\deepspeed_0aefcc0fcaa04450b49543c945a2152f\setup.py", line 52, in abort
assert False, msg
AssertionError: Unable to pre-compile async_io
[WARNING] Torch did not find cuda available, if cross-compiling or running with cpu only you can ignore this message. Adding compute capability for Pascal, Volta, and Turing (compute capabilities 6.0, 6.1, 6.2)
DS_BUILD_OPS=1
[WARNING] async_io requires the dev libaio .so object and headers but these were not found.
[WARNING] If libaio is already installed (perhaps from source), try setting the CFLAGS and LDFLAGS environment variables to where it can be found.
[WARNING] One can disable async_io with DS_BUILD_AIO=0
[ERROR] Unable to pre-compile async_io
[end of output]

note: This error originates from a subprocess, and is likely not a problem with pip.
error: metadata-generation-failed

× Encountered error while generating package metadata.
╰─> See above for output.

note: This is an issue with the package mentioned above, not pip.
hint: See above for details.
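The log's own warning points at the workaround: the async_io op needs the libaio development files, which are not available on Windows, and it can be skipped with DS_BUILD_AIO=0. A sketch in POSIX shell syntax (on Windows cmd use set DS_BUILD_AIO=0, or $env:DS_BUILD_AIO="0" in PowerShell):

```shell
# Skip building deepspeed's async_io op, per the hint in the log above.
export DS_BUILD_AIO=0
```

Then re-run pip install -r requirements.txt in the same shell so the variable is visible to the deepspeed build.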

Error when starting the model worker

Command:
python3 -m chat.server.model_worker --model-path ../TestGPT-7B --device cpu
Error:
/deepspeed/accelerator/real_accelerator.py", line 155, in get_accelerator
ds_accelerator = MPS_Accelerator()
TypeError: Can't instantiate abstract class MPS_Accelerator with abstract method supported_dtypes
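This TypeError is a plain Python abstract-class failure: the installed deepspeed version defines MPS_Accelerator without implementing the abstract supported_dtypes method, which usually points to a version mismatch (upgrading deepspeed is the usual remedy). The mechanism, reproduced with hypothetical stand-in classes:

```python
from abc import ABC, abstractmethod

class Accelerator(ABC):
    """Stand-in for deepspeed's abstract accelerator base class."""
    @abstractmethod
    def supported_dtypes(self):
        ...

class IncompleteAccelerator(Accelerator):
    """Like the failing MPS_Accelerator: supported_dtypes is missing."""
    pass

try:
    IncompleteAccelerator()  # instantiating an abstract class fails
except TypeError as exc:
    print(type(exc).__name__)
```

Any deepspeed release in which MPS_Accelerator actually implements supported_dtypes will not hit this error, which is why pinning or upgrading the package resolves it.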

Error starting the model worker on a MacBook M1

Command used:
python3 -m chat.server.model_worker --model-path models/testgpt --device mps

The execution log is as follows:
2024-01-22 17:08:04 | INFO | model_worker | args: Namespace(host='localhost', port=21002, worker_address='http://localhost:21002', controller_address='http://localhost:21001', model_path='models/testgpt', revision='main', device='mps', gpus=None, num_gpus=1, max_gpu_memory=None, load_8bit=False, cpu_offloading=False, gptq_ckpt=None, gptq_wbits=16, gptq_groupsize=-1, gptq_act_order=False, awq_ckpt=None, awq_wbits=16, awq_groupsize=-1, model_names=None, conv_template=None, embed_in_truncate=False, limit_worker_concurrency=5, stream_interval=2, no_register=False)
2024-01-22 17:08:04 | INFO | stdout | testgpt!!!!!!
2024-01-22 17:08:04 | INFO | model_worker | Loading the model ['testgpt'] on worker c61383a5 ...
2024-01-22 17:08:04 | ERROR | stderr | Traceback (most recent call last):
2024-01-22 17:08:04 | ERROR | stderr | File "/opt/homebrew/lib/python3.11/site-packages/huggingface_hub/utils/_errors.py", line 286, in hf_raise_for_status
2024-01-22 17:08:04 | ERROR | stderr | response.raise_for_status()
2024-01-22 17:08:04 | ERROR | stderr | File "/opt/homebrew/lib/python3.11/site-packages/requests/models.py", line 1021, in raise_for_status
2024-01-22 17:08:04 | ERROR | stderr | raise HTTPError(http_error_msg, response=self)
2024-01-22 17:08:04 | ERROR | stderr | requests.exceptions.HTTPError: 401 Client Error: Unauthorized for url: https://huggingface.co/models/testgpt/resolve/main/tokenizer_config.json
2024-01-22 17:08:04 | ERROR | stderr |
2024-01-22 17:08:04 | ERROR | stderr | The above exception was the direct cause of the following exception:
2024-01-22 17:08:04 | ERROR | stderr |
2024-01-22 17:08:04 | ERROR | stderr | Traceback (most recent call last):
2024-01-22 17:08:04 | ERROR | stderr | File "/opt/homebrew/lib/python3.11/site-packages/transformers/utils/hub.py", line 429, in cached_file
2024-01-22 17:08:04 | ERROR | stderr | resolved_file = hf_hub_download(
2024-01-22 17:08:04 | ERROR | stderr | ^^^^^^^^^^^^^^^^
2024-01-22 17:08:04 | ERROR | stderr | File "/opt/homebrew/lib/python3.11/site-packages/huggingface_hub/utils/_validators.py", line 118, in _inner_fn
2024-01-22 17:08:04 | ERROR | stderr | return fn(*args, **kwargs)
2024-01-22 17:08:04 | ERROR | stderr | ^^^^^^^^^^^^^^^^^^^
2024-01-22 17:08:04 | ERROR | stderr | File "/opt/homebrew/lib/python3.11/site-packages/huggingface_hub/file_download.py", line 1368, in hf_hub_download
2024-01-22 17:08:04 | ERROR | stderr | raise head_call_error
2024-01-22 17:08:04 | ERROR | stderr | File "/opt/homebrew/lib/python3.11/site-packages/huggingface_hub/file_download.py", line 1238, in hf_hub_download
2024-01-22 17:08:04 | ERROR | stderr | metadata = get_hf_file_metadata(
2024-01-22 17:08:04 | ERROR | stderr | ^^^^^^^^^^^^^^^^^^^^^
2024-01-22 17:08:04 | ERROR | stderr | File "/opt/homebrew/lib/python3.11/site-packages/huggingface_hub/utils/_validators.py", line 118, in _inner_fn
2024-01-22 17:08:04 | ERROR | stderr | return fn(*args, **kwargs)
2024-01-22 17:08:04 | ERROR | stderr | ^^^^^^^^^^^^^^^^^^^
2024-01-22 17:08:04 | ERROR | stderr | File "/opt/homebrew/lib/python3.11/site-packages/huggingface_hub/file_download.py", line 1631, in get_hf_file_metadata
2024-01-22 17:08:04 | ERROR | stderr | r = _request_wrapper(
2024-01-22 17:08:04 | ERROR | stderr | ^^^^^^^^^^^^^^^^^
2024-01-22 17:08:04 | ERROR | stderr | File "/opt/homebrew/lib/python3.11/site-packages/huggingface_hub/file_download.py", line 385, in _request_wrapper
2024-01-22 17:08:04 | ERROR | stderr | response = _request_wrapper(
2024-01-22 17:08:04 | ERROR | stderr | ^^^^^^^^^^^^^^^^^
2024-01-22 17:08:04 | ERROR | stderr | File "/opt/homebrew/lib/python3.11/site-packages/huggingface_hub/file_download.py", line 409, in _request_wrapper
2024-01-22 17:08:04 | ERROR | stderr | hf_raise_for_status(response)
2024-01-22 17:08:04 | ERROR | stderr | File "/opt/homebrew/lib/python3.11/site-packages/huggingface_hub/utils/_errors.py", line 323, in hf_raise_for_status
2024-01-22 17:08:04 | ERROR | stderr | raise RepositoryNotFoundError(message, response) from e
2024-01-22 17:08:04 | ERROR | stderr | huggingface_hub.utils._errors.RepositoryNotFoundError: 401 Client Error. (Request ID: Root=1-65ae3074-2287e248178f4d250c755ad1;7b85dc4b-9482-4cb3-95c0-2ce23b61028f)
2024-01-22 17:08:04 | ERROR | stderr |
2024-01-22 17:08:04 | ERROR | stderr | Repository Not Found for url: https://huggingface.co/models/testgpt/resolve/main/tokenizer_config.json.
2024-01-22 17:08:04 | ERROR | stderr | Please make sure you specified the correct repo_id and repo_type.
2024-01-22 17:08:04 | ERROR | stderr | If you are trying to access a private or gated repo, make sure you are authenticated.
2024-01-22 17:08:04 | ERROR | stderr | Invalid username or password.
2024-01-22 17:08:04 | ERROR | stderr |
2024-01-22 17:08:04 | ERROR | stderr | The above exception was the direct cause of the following exception:
2024-01-22 17:08:04 | ERROR | stderr |
2024-01-22 17:08:04 | ERROR | stderr | Traceback (most recent call last):
2024-01-22 17:08:04 | ERROR | stderr | File "", line 198, in _run_module_as_main
2024-01-22 17:08:04 | ERROR | stderr | File "", line 88, in _run_code
2024-01-22 17:08:04 | ERROR | stderr | File "/Users/fanjunyang/Documents/Tools/Test-Agent/chat/server/model_worker.py", line 521, in
2024-01-22 17:08:04 | ERROR | stderr | args, worker = create_model_worker()
2024-01-22 17:08:04 | ERROR | stderr | ^^^^^^^^^^^^^^^^^^^^^
2024-01-22 17:08:04 | ERROR | stderr | File "/Users/fanjunyang/Documents/Tools/Test-Agent/chat/server/model_worker.py", line 498, in create_model_worker
2024-01-22 17:08:04 | ERROR | stderr | worker = ModelWorker(
2024-01-22 17:08:04 | ERROR | stderr | ^^^^^^^^^^^^
2024-01-22 17:08:04 | ERROR | stderr | File "/Users/fanjunyang/Documents/Tools/Test-Agent/chat/server/model_worker.py", line 213, in init
2024-01-22 17:08:04 | ERROR | stderr | self.model, self.tokenizer = load_model(
2024-01-22 17:08:04 | ERROR | stderr | ^^^^^^^^^^^
2024-01-22 17:08:04 | ERROR | stderr | File "/Users/fanjunyang/Documents/Tools/Test-Agent/chat/model/model_adapter.py", line 278, in load_model
2024-01-22 17:08:04 | ERROR | stderr | model, tokenizer = adapter.load_model(model_path, kwargs)
2024-01-22 17:08:04 | ERROR | stderr | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-01-22 17:08:04 | ERROR | stderr | File "/Users/fanjunyang/Documents/Tools/Test-Agent/chat/model/model_adapter.py", line 1589, in load_model
2024-01-22 17:08:04 | ERROR | stderr | model, tokenizer = super().load_model(model_path, from_pretrained_kwargs)
2024-01-22 17:08:04 | ERROR | stderr | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-01-22 17:08:04 | ERROR | stderr | File "/Users/fanjunyang/Documents/Tools/Test-Agent/chat/model/model_adapter.py", line 62, in load_model
2024-01-22 17:08:04 | ERROR | stderr | tokenizer = AutoTokenizer.from_pretrained(
2024-01-22 17:08:04 | ERROR | stderr | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-01-22 17:08:04 | ERROR | stderr | File "/opt/homebrew/lib/python3.11/site-packages/transformers/models/auto/tokenization_auto.py", line 686, in from_pretrained
2024-01-22 17:08:04 | ERROR | stderr | tokenizer_config = get_tokenizer_config(pretrained_model_name_or_path, **kwargs)
2024-01-22 17:08:04 | ERROR | stderr | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-01-22 17:08:04 | ERROR | stderr | File "/opt/homebrew/lib/python3.11/site-packages/transformers/models/auto/tokenization_auto.py", line 519, in get_tokenizer_config
2024-01-22 17:08:04 | ERROR | stderr | resolved_config_file = cached_file(
2024-01-22 17:08:04 | ERROR | stderr | ^^^^^^^^^^^^
2024-01-22 17:08:04 | ERROR | stderr | File "/opt/homebrew/lib/python3.11/site-packages/transformers/utils/hub.py", line 450, in cached_file
2024-01-22 17:08:04 | ERROR | stderr | raise EnvironmentError(
2024-01-22 17:08:04 | ERROR | stderr | OSError: models/testgpt is not a local folder and is not a valid model identifier listed on 'https://huggingface.co/models'
2024-01-22 17:08:04 | ERROR | stderr | If this is a private repository, make sure to pass a token having permission to this repo either by logging in with huggingface-cli login or by passing token=<your_token>
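The final OSError explains the 401: models/testgpt does not exist relative to the current working directory, so transformers falls back to treating the string as a Hugging Face Hub repo id and fails to fetch it. A pre-flight check sketch (resolve_model_path is a hypothetical helper, not part of this project):

```python
import os

def resolve_model_path(path: str) -> str:
    """Return an absolute path if the model folder exists locally;
    otherwise fail fast instead of letting transformers query the Hub."""
    if os.path.isdir(path):
        return os.path.abspath(path)
    raise FileNotFoundError(
        f"{path!r} is not a local folder; transformers would treat it "
        "as a huggingface.co repo id and fail like the 401 seen above"
    )
```

In practice: run the worker from the repository root so the relative path resolves, or pass an absolute path to --model-path.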

Worker startup fails

Command: python -m chat.server.model_worker --model-path ../TestGPT-7B --device cpu --num-gpus 2
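No log accompanies this report, but the flag combination itself is suspect: --num-gpus 2 together with --device cpu. A hypothetical pre-flight validation (not from the project's code) that would surface such a mismatch early:

```python
def check_worker_args(device: str, num_gpus: int) -> None:
    """Hypothetical sanity check: multi-GPU settings only make sense
    when the worker is running on CUDA devices."""
    if num_gpus > 1 and device != "cuda":
        raise ValueError(
            f"--num-gpus {num_gpus} requires --device cuda, "
            f"but --device {device} was given"
        )

# The failing invocation from this issue:
try:
    check_worker_args(device="cpu", num_gpus=2)
except ValueError as exc:
    print(exc)
```

Dropping --num-gpus 2 (or switching to --device cuda on a machine with GPUs) is the first thing to try before digging deeper.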
