codefuse-ai / test-agent
Agent that empowers software testing with LLMs; industrial-first in China
License: Other
ImportError:
2024-04-08 09:54:28 | ERROR | stderr | CodeLlamaTokenizer requires the protobuf library but it was not found in your environment. Checkout the instructions on the
2024-04-08 09:54:28 | ERROR | stderr | installation page of its repo: https://github.com/protocolbuffers/protobuf/tree/master/python#installation and follow the ones
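A possible fix, assuming the error is simply a missing dependency in the Python environment rather than a packaging bug in test-agent: install the protobuf bindings the tokenizer asks for.

```shell
# CodeLlamaTokenizer needs the protobuf Python package at load time;
# installing it into the same environment usually clears this ImportError
pip install protobuf
```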
After starting the frontend in step 3, visiting http://0.0.0.0:7860 shows (failed) net::ERR_EMPTY_RESPONSE.
I can see the backend receives the request, but I cannot figure out why the page is unreachable.
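One possible cause, offered as an assumption: 0.0.0.0 is the address a server binds to in order to listen on all interfaces, not an address a browser should dial, so trying http://127.0.0.1:7860 (or the machine's LAN IP) may help. A minimal socket sketch of the distinction:

```python
import socket
import threading

# Minimal sketch: a server bound to 0.0.0.0 listens on all interfaces,
# but clients connect via a concrete address such as 127.0.0.1.
srv = socket.socket()
srv.bind(("0.0.0.0", 0))            # all interfaces, OS-chosen port
port = srv.getsockname()[1]
srv.listen(1)

def serve():
    conn, _ = srv.accept()
    conn.sendall(b"ok")
    conn.close()

threading.Thread(target=serve, daemon=True).start()

# Dial the loopback address, not 0.0.0.0
client = socket.create_connection(("127.0.0.1", port))
reply = client.recv(2).decode()
print(reply)  # → ok
```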
As per the title.
This is far too sloppy — the requirements file is a pip freeze of the entire environment, with 300+ dependency packages. Please curate a proper minimal list.
What hardware is required?
After finishing deployment, sending a message returns: NETWORK ERROR DUE TO HIGH TRAFFIC. PLEASE REGENERATE OR REFRESH THIS PAGE.
Running python -m chat.server.controller fails with: Error while finding module specification for 'chat.server.controller' (ModuleNotFoundError: No module named 'chat')
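This usually means the command was not run from the repository root, so the parent directory of the chat package is not on sys.path. A toy reproduction of the mechanism (the package and module names here are invented for illustration):

```python
import os
import runpy
import sys
import tempfile

# Build a throwaway package to show that `python -m pkg.mod` (which
# runpy emulates) only resolves when the package's parent directory
# is on sys.path.
root = tempfile.mkdtemp()
pkg = os.path.join(root, "chatdemo")
os.makedirs(pkg)
open(os.path.join(pkg, "__init__.py"), "w").close()
with open(os.path.join(pkg, "mod.py"), "w") as f:
    f.write("VALUE = 42\n")

sys.path.insert(0, root)            # equivalent to running from the repo root
ns = runpy.run_module("chatdemo.mod")
print(ns["VALUE"])  # → 42
```

With the real repository, the equivalent fix is to cd into the Test-Agent checkout (or export PYTHONPATH to it) before invoking python -m chat.server.controller.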
Preparing metadata (setup.py) ... error
error: subprocess-exited-with-error
× python setup.py egg_info did not run successfully.
│ exit code: 1
╰─> [17 lines of output]
No CUDA runtime is found, using CUDA_HOME='C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.3'
test.c
LINK : fatal error LNK1181: cannot open input file "aio.lib"
Traceback (most recent call last):
File "", line 2, in
File "", line 34, in
File "C:\Users\33\AppData\Local\Temp\pip-install-7rd2r2v0\deepspeed_0aefcc0fcaa04450b49543c945a2152f\setup.py", line 182, in
abort(f"Unable to pre-compile {op_name}")
File "C:\Users\33\AppData\Local\Temp\pip-install-7rd2r2v0\deepspeed_0aefcc0fcaa04450b49543c945a2152f\setup.py", line 52, in abort
assert False, msg
AssertionError: Unable to pre-compile async_io
[WARNING] Torch did not find cuda available, if cross-compiling or running with cpu only you can ignore this message. Adding compute capability for Pascal, Volta, and Turing (compute capabilities 6.0, 6.1, 6.2)
DS_BUILD_OPS=1
[WARNING] async_io requires the dev libaio .so object and headers but these were not found.
[WARNING] If libaio is already installed (perhaps from source), try setting the CFLAGS and LDFLAGS environment variables to where it can be found.
[WARNING] One can disable async_io with DS_BUILD_AIO=0
[ERROR] Unable to pre-compile async_io
[end of output]
note: This error originates from a subprocess, and is likely not a problem with pip.
error: metadata-generation-failed
× Encountered error while generating package metadata.
╰─> See above for output.
note: This is an issue with the package mentioned above, not pip.
hint: See above for details.
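As the build log itself suggests, the async_io op can be skipped when libaio (or aio.lib on Windows) is unavailable. A sketch of that workaround:

```shell
# Disable the async_io op so DeepSpeed's setup.py stops trying to
# pre-compile against libaio (per the [WARNING] lines in the log above)
DS_BUILD_AIO=0 pip install deepspeed
```

Note the `VAR=value command` prefix is a POSIX shell form; on Windows cmd, set the variable first with `set DS_BUILD_AIO=0`.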
Command:
python3 -m chat.server.model_worker --model-path ../TestGPT-7B --device cpu
Error:
/deepspeed/accelerator/real_accelerator.py", line 155, in get_accelerator
ds_accelerator = MPS_Accelerator()
TypeError: Can't instantiate abstract class MPS_Accelerator with abstract method supported_dtypes
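This TypeError comes from DeepSpeed's accelerator autodetection selecting the MPS backend on Apple Silicon even though --device cpu was requested, and the installed DeepSpeed's MPS_Accelerator class being incomplete. As an assumption (whether the override is honored depends on the installed DeepSpeed version), forcing the CPU accelerator may sidestep the broken class:

```shell
# Hypothetical workaround: tell DeepSpeed to use its CPU accelerator
# so the abstract MPS_Accelerator is never instantiated
export DS_ACCELERATOR=cpu
python3 -m chat.server.model_worker --model-path ../TestGPT-7B --device cpu
```

Upgrading deepspeed to a release that implements supported_dtypes for MPS is another likely fix.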
Command used:
python3 -m chat.server.model_worker --model-path models/testgpt --device mps
The execution log is as follows:
2024-01-22 17:08:04 | INFO | model_worker | args: Namespace(host='localhost', port=21002, worker_address='http://localhost:21002', controller_address='http://localhost:21001', model_path='models/testgpt', revision='main', device='mps', gpus=None, num_gpus=1, max_gpu_memory=None, load_8bit=False, cpu_offloading=False, gptq_ckpt=None, gptq_wbits=16, gptq_groupsize=-1, gptq_act_order=False, awq_ckpt=None, awq_wbits=16, awq_groupsize=-1, model_names=None, conv_template=None, embed_in_truncate=False, limit_worker_concurrency=5, stream_interval=2, no_register=False)
2024-01-22 17:08:04 | INFO | stdout | testgpt!!!!!!
2024-01-22 17:08:04 | INFO | model_worker | Loading the model ['testgpt'] on worker c61383a5 ...
2024-01-22 17:08:04 | ERROR | stderr | Traceback (most recent call last):
2024-01-22 17:08:04 | ERROR | stderr | File "/opt/homebrew/lib/python3.11/site-packages/huggingface_hub/utils/_errors.py", line 286, in hf_raise_for_status
2024-01-22 17:08:04 | ERROR | stderr | response.raise_for_status()
2024-01-22 17:08:04 | ERROR | stderr | File "/opt/homebrew/lib/python3.11/site-packages/requests/models.py", line 1021, in raise_for_status
2024-01-22 17:08:04 | ERROR | stderr | raise HTTPError(http_error_msg, response=self)
2024-01-22 17:08:04 | ERROR | stderr | requests.exceptions.HTTPError: 401 Client Error: Unauthorized for url: https://huggingface.co/models/testgpt/resolve/main/tokenizer_config.json
2024-01-22 17:08:04 | ERROR | stderr |
2024-01-22 17:08:04 | ERROR | stderr | The above exception was the direct cause of the following exception:
2024-01-22 17:08:04 | ERROR | stderr |
2024-01-22 17:08:04 | ERROR | stderr | Traceback (most recent call last):
2024-01-22 17:08:04 | ERROR | stderr | File "/opt/homebrew/lib/python3.11/site-packages/transformers/utils/hub.py", line 429, in cached_file
2024-01-22 17:08:04 | ERROR | stderr | resolved_file = hf_hub_download(
2024-01-22 17:08:04 | ERROR | stderr | ^^^^^^^^^^^^^^^^
2024-01-22 17:08:04 | ERROR | stderr | File "/opt/homebrew/lib/python3.11/site-packages/huggingface_hub/utils/_validators.py", line 118, in _inner_fn
2024-01-22 17:08:04 | ERROR | stderr | return fn(*args, **kwargs)
2024-01-22 17:08:04 | ERROR | stderr | ^^^^^^^^^^^^^^^^^^^
2024-01-22 17:08:04 | ERROR | stderr | File "/opt/homebrew/lib/python3.11/site-packages/huggingface_hub/file_download.py", line 1368, in hf_hub_download
2024-01-22 17:08:04 | ERROR | stderr | raise head_call_error
2024-01-22 17:08:04 | ERROR | stderr | File "/opt/homebrew/lib/python3.11/site-packages/huggingface_hub/file_download.py", line 1238, in hf_hub_download
2024-01-22 17:08:04 | ERROR | stderr | metadata = get_hf_file_metadata(
2024-01-22 17:08:04 | ERROR | stderr | ^^^^^^^^^^^^^^^^^^^^^
2024-01-22 17:08:04 | ERROR | stderr | File "/opt/homebrew/lib/python3.11/site-packages/huggingface_hub/utils/_validators.py", line 118, in _inner_fn
2024-01-22 17:08:04 | ERROR | stderr | return fn(*args, **kwargs)
2024-01-22 17:08:04 | ERROR | stderr | ^^^^^^^^^^^^^^^^^^^
2024-01-22 17:08:04 | ERROR | stderr | File "/opt/homebrew/lib/python3.11/site-packages/huggingface_hub/file_download.py", line 1631, in get_hf_file_metadata
2024-01-22 17:08:04 | ERROR | stderr | r = _request_wrapper(
2024-01-22 17:08:04 | ERROR | stderr | ^^^^^^^^^^^^^^^^^
2024-01-22 17:08:04 | ERROR | stderr | File "/opt/homebrew/lib/python3.11/site-packages/huggingface_hub/file_download.py", line 385, in _request_wrapper
2024-01-22 17:08:04 | ERROR | stderr | response = _request_wrapper(
2024-01-22 17:08:04 | ERROR | stderr | ^^^^^^^^^^^^^^^^^
2024-01-22 17:08:04 | ERROR | stderr | File "/opt/homebrew/lib/python3.11/site-packages/huggingface_hub/file_download.py", line 409, in _request_wrapper
2024-01-22 17:08:04 | ERROR | stderr | hf_raise_for_status(response)
2024-01-22 17:08:04 | ERROR | stderr | File "/opt/homebrew/lib/python3.11/site-packages/huggingface_hub/utils/_errors.py", line 323, in hf_raise_for_status
2024-01-22 17:08:04 | ERROR | stderr | raise RepositoryNotFoundError(message, response) from e
2024-01-22 17:08:04 | ERROR | stderr | huggingface_hub.utils._errors.RepositoryNotFoundError: 401 Client Error. (Request ID: Root=1-65ae3074-2287e248178f4d250c755ad1;7b85dc4b-9482-4cb3-95c0-2ce23b61028f)
2024-01-22 17:08:04 | ERROR | stderr |
2024-01-22 17:08:04 | ERROR | stderr | Repository Not Found for url: https://huggingface.co/models/testgpt/resolve/main/tokenizer_config.json.
2024-01-22 17:08:04 | ERROR | stderr | Please make sure you specified the correct repo_id and repo_type.
2024-01-22 17:08:04 | ERROR | stderr | If you are trying to access a private or gated repo, make sure you are authenticated.
2024-01-22 17:08:04 | ERROR | stderr | Invalid username or password.
2024-01-22 17:08:04 | ERROR | stderr |
2024-01-22 17:08:04 | ERROR | stderr | The above exception was the direct cause of the following exception:
2024-01-22 17:08:04 | ERROR | stderr |
2024-01-22 17:08:04 | ERROR | stderr | Traceback (most recent call last):
2024-01-22 17:08:04 | ERROR | stderr | File "", line 198, in _run_module_as_main
2024-01-22 17:08:04 | ERROR | stderr | File "", line 88, in _run_code
2024-01-22 17:08:04 | ERROR | stderr | File "/Users/fanjunyang/Documents/Tools/Test-Agent/chat/server/model_worker.py", line 521, in
2024-01-22 17:08:04 | ERROR | stderr | args, worker = create_model_worker()
2024-01-22 17:08:04 | ERROR | stderr | ^^^^^^^^^^^^^^^^^^^^^
2024-01-22 17:08:04 | ERROR | stderr | File "/Users/fanjunyang/Documents/Tools/Test-Agent/chat/server/model_worker.py", line 498, in create_model_worker
2024-01-22 17:08:04 | ERROR | stderr | worker = ModelWorker(
2024-01-22 17:08:04 | ERROR | stderr | ^^^^^^^^^^^^
2024-01-22 17:08:04 | ERROR | stderr | File "/Users/fanjunyang/Documents/Tools/Test-Agent/chat/server/model_worker.py", line 213, in init
2024-01-22 17:08:04 | ERROR | stderr | self.model, self.tokenizer = load_model(
2024-01-22 17:08:04 | ERROR | stderr | ^^^^^^^^^^^
2024-01-22 17:08:04 | ERROR | stderr | File "/Users/fanjunyang/Documents/Tools/Test-Agent/chat/model/model_adapter.py", line 278, in load_model
2024-01-22 17:08:04 | ERROR | stderr | model, tokenizer = adapter.load_model(model_path, kwargs)
2024-01-22 17:08:04 | ERROR | stderr | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-01-22 17:08:04 | ERROR | stderr | File "/Users/fanjunyang/Documents/Tools/Test-Agent/chat/model/model_adapter.py", line 1589, in load_model
2024-01-22 17:08:04 | ERROR | stderr | model, tokenizer = super().load_model(model_path, from_pretrained_kwargs)
2024-01-22 17:08:04 | ERROR | stderr | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-01-22 17:08:04 | ERROR | stderr | File "/Users/fanjunyang/Documents/Tools/Test-Agent/chat/model/model_adapter.py", line 62, in load_model
2024-01-22 17:08:04 | ERROR | stderr | tokenizer = AutoTokenizer.from_pretrained(
2024-01-22 17:08:04 | ERROR | stderr | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-01-22 17:08:04 | ERROR | stderr | File "/opt/homebrew/lib/python3.11/site-packages/transformers/models/auto/tokenization_auto.py", line 686, in from_pretrained
2024-01-22 17:08:04 | ERROR | stderr | tokenizer_config = get_tokenizer_config(pretrained_model_name_or_path, **kwargs)
2024-01-22 17:08:04 | ERROR | stderr | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-01-22 17:08:04 | ERROR | stderr | File "/opt/homebrew/lib/python3.11/site-packages/transformers/models/auto/tokenization_auto.py", line 519, in get_tokenizer_config
2024-01-22 17:08:04 | ERROR | stderr | resolved_config_file = cached_file(
2024-01-22 17:08:04 | ERROR | stderr | ^^^^^^^^^^^^
2024-01-22 17:08:04 | ERROR | stderr | File "/opt/homebrew/lib/python3.11/site-packages/transformers/utils/hub.py", line 450, in cached_file
2024-01-22 17:08:04 | ERROR | stderr | raise EnvironmentError(
2024-01-22 17:08:04 | ERROR | stderr | OSError: models/testgpt is not a local folder and is not a valid model identifier listed on 'https://huggingface.co/models'
2024-01-22 17:08:04 | ERROR | stderr | If this is a private repository, make sure to pass a token having permission to this repo either by logging in with huggingface-cli login or by passing token=<your_token>
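The root cause here is that models/testgpt is resolved relative to the current working directory; when no such folder is found, transformers treats the string as a Hugging Face Hub repo id instead, which is why the log shows a 401 from huggingface.co. A small sketch of that resolution rule (the helper name is invented for illustration):

```python
import os
import tempfile

def resolve_model_ref(path_or_id):
    # Mimic transformers' lookup rule: an existing local directory is
    # loaded from disk; any other string is treated as a Hub repo id.
    if os.path.isdir(path_or_id):
        return ("local", os.path.abspath(path_or_id))
    return ("hub", path_or_id)

local_dir = tempfile.mkdtemp()                 # stands in for ./models/testgpt
kind_local, _ = resolve_model_ref(local_dir)
kind_hub, _ = resolve_model_ref("no/such-dir-here")
print(kind_local, kind_hub)  # → local hub
```

So the fix is to launch the worker from the directory that actually contains models/testgpt, or pass an absolute --model-path; only genuinely private Hub repos need the authentication the error message describes.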
Command: python -m chat.server.model_worker --model-path ../TestGPT-7B --device cpu --num-gpus 2