
nczkevin / chatglm-web

424 stars · 10 watchers · 74 forks · 6.5 MB

A ChatGLM web UI built with FastAPI and Vue 3 (frontend styling modeled on chatgpt-web; supports ChatGLM streaming output, frontend parameter tuning, context selection, saving chats as images, knowledge-base Q&A, and more)

Home Page: http://chatglm.nczkevin.com

License: MIT License

JavaScript 0.37% Shell 0.41% Dockerfile 0.47% TypeScript 28.23% HTML 1.08% Python 5.53% Vue 46.08% Less 17.80% CSS 0.04%

chatglm-web's People

Contributors

nczkevin


chatglm-web's Issues

Running python main.py raises the error below. Please help take a look.

Explicitly passing a revision is encouraged when loading a model with custom code to ensure no malicious code has been contributed in a newer revision.
Traceback (most recent call last):
File "/root/chat/lib/python3.11/site-packages/urllib3/connectionpool.py", line 790, in urlopen
response = self._make_request(
^^^^^^^^^^^^^^^^^^^
File "/root/chat/lib/python3.11/site-packages/urllib3/connectionpool.py", line 536, in _make_request
response = conn.getresponse()
^^^^^^^^^^^^^^^^^^
File "/root/chat/lib/python3.11/site-packages/urllib3/connection.py", line 454, in getresponse
httplib_response = super().getresponse()
^^^^^^^^^^^^^^^^^^^^^
File "/usr/python3.11/lib/python3.11/http/client.py", line 1374, in getresponse
response.begin()
File "/usr/python3.11/lib/python3.11/http/client.py", line 318, in begin
version, status, reason = self._read_status()
^^^^^^^^^^^^^^^^^^^
File "/usr/python3.11/lib/python3.11/http/client.py", line 287, in _read_status
raise RemoteDisconnected("Remote end closed connection without"
http.client.RemoteDisconnected: Remote end closed connection without response

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "/root/chat/lib/python3.11/site-packages/requests/adapters.py", line 486, in send
resp = conn.urlopen(
^^^^^^^^^^^^^
File "/root/chat/lib/python3.11/site-packages/urllib3/connectionpool.py", line 844, in urlopen
retries = retries.increment(
^^^^^^^^^^^^^^^^^^
File "/root/chat/lib/python3.11/site-packages/urllib3/util/retry.py", line 470, in increment
raise reraise(type(error), error, _stacktrace)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/root/chat/lib/python3.11/site-packages/urllib3/util/util.py", line 38, in reraise
raise value.with_traceback(tb)
File "/root/chat/lib/python3.11/site-packages/urllib3/connectionpool.py", line 790, in urlopen
response = self._make_request(
^^^^^^^^^^^^^^^^^^^
File "/root/chat/lib/python3.11/site-packages/urllib3/connectionpool.py", line 536, in _make_request
response = conn.getresponse()
^^^^^^^^^^^^^^^^^^
File "/root/chat/lib/python3.11/site-packages/urllib3/connection.py", line 454, in getresponse
httplib_response = super().getresponse()
^^^^^^^^^^^^^^^^^^^^^
File "/usr/python3.11/lib/python3.11/http/client.py", line 1374, in getresponse
response.begin()
File "/usr/python3.11/lib/python3.11/http/client.py", line 318, in begin
version, status, reason = self._read_status()
^^^^^^^^^^^^^^^^^^^
File "/usr/python3.11/lib/python3.11/http/client.py", line 287, in _read_status
raise RemoteDisconnected("Remote end closed connection without"
urllib3.exceptions.ProtocolError: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response'))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "/root/chatglm-web/service/main.py", line 178, in <module>
tokenizer = AutoTokenizer.from_pretrained(model_path, trust_remote_code=True)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/root/chat/lib/python3.11/site-packages/transformers/models/auto/tokenization_auto.py", line 663, in from_pretrained
tokenizer_class = get_class_from_dynamic_module(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/root/chat/lib/python3.11/site-packages/transformers/dynamic_module_utils.py", line 388, in get_class_from_dynamic_module
final_module = get_cached_module_file(
^^^^^^^^^^^^^^^^^^^^^^^
File "/root/chat/lib/python3.11/site-packages/transformers/dynamic_module_utils.py", line 286, in get_cached_module_file
commit_hash = model_info(pretrained_model_name_or_path, revision=revision, token=use_auth_token).sha
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/root/chat/lib/python3.11/site-packages/huggingface_hub/utils/_validators.py", line 118, in _inner_fn
return fn(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^
File "/root/chat/lib/python3.11/site-packages/huggingface_hub/hf_api.py", line 1675, in model_info
r = get_session().get(path, headers=headers, timeout=timeout, params=params)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/root/chat/lib/python3.11/site-packages/requests/sessions.py", line 602, in get
return self.request("GET", url, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/root/chat/lib/python3.11/site-packages/requests/sessions.py", line 589, in request
resp = self.send(prep, **send_kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/root/chat/lib/python3.11/site-packages/requests/sessions.py", line 703, in send
r = adapter.send(request, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/root/chat/lib/python3.11/site-packages/requests/adapters.py", line 501, in send
raise ConnectionError(err, request=request)
requests.exceptions.ConnectionError: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response'))
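The RemoteDisconnected error happens while from_pretrained contacts the Hugging Face Hub to resolve the custom tokenizer code. One workaround, sketched below under the assumption that the checkpoint has already been fully downloaded, is to point model_path at a local directory so no network request is needed (pick_model_source is a hypothetical helper, not part of the project):

```python
import os

def pick_model_source(local_dir: str, hub_id: str = "THUDM/chatglm-6b") -> str:
    """Prefer a complete local copy of the model so from_pretrained
    never contacts the Hub (avoids RemoteDisconnected behind proxies).

    A fully downloaded checkpoint directory contains config.json; if it
    is missing, fall back to the Hub repo id.
    """
    if os.path.isfile(os.path.join(local_dir, "config.json")):
        return local_dir
    return hub_id

# Usage (in service/main.py, path is illustrative):
# tokenizer = AutoTokenizer.from_pretrained(
#     pick_model_source("./models/chatglm-6b"), trust_remote_code=True)
```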

WARNING - asyncio - socket.send() raised exception.

After my client stops receiving data, the server throws a flood of errors:
07/18/2023 16:14:17 - WARNING - asyncio - socket.send() raised exception.
07/18/2023 16:14:17 - WARNING - asyncio - socket.send() raised exception.
07/18/2023 16:14:17 - WARNING - asyncio - socket.send() raised exception.
07/18/2023 16:14:18 - WARNING - asyncio - socket.send() raised exception.
... (the same warning repeats many more times)
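These warnings appear because the server keeps calling socket.send() on a connection the client has already closed. A common fix with FastAPI/Starlette streaming is to poll request.is_disconnected() between chunks and stop early; here is a minimal sketch with the disconnect check injected as a parameter so the logic stands alone (stream_until_disconnect is a hypothetical helper):

```python
import asyncio
from typing import AsyncIterator, Awaitable, Callable, List

async def stream_until_disconnect(
    chunks: AsyncIterator[str],
    is_disconnected: Callable[[], Awaitable[bool]],
) -> List[str]:
    """Forward chunks until the client disconnects.

    In a real FastAPI endpoint you would pass request.is_disconnected
    and yield each chunk instead of collecting it; breaking out of the
    loop stops the repeated failing socket.send() calls.
    """
    sent: List[str] = []
    async for chunk in chunks:
        if await is_disconnected():
            break  # client is gone; stop producing output
        sent.append(chunk)
    return sent
```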

404 Not Found

Using the backend API from https://github.com/THUDM/ChatGLM-6B with this project's frontend.
The backend log shows:
INFO: Started server process [7348]
INFO: Waiting for application startup.
INFO: Application startup complete.
INFO: Uvicorn running on http://0.0.0.0:3002 (Press CTRL+C to quit)
INFO: 127.0.0.1:50608 - "POST /chat-process HTTP/1.1" 404 Not Found
INFO: 127.0.0.1:50609 - "POST /chat-process HTTP/1.1" 404 Not Found
Frontend error:
2023/4/16 15:03:40: Hello

2023/4/16 15:03:40: Request failed with status code 404

How can this be fixed?
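The 404 suggests a route mismatch: this frontend POSTs to /chat-process, while the upstream THUDM/ChatGLM-6B api.py (as far as I can tell) serves POST only at the root path, so either the backend needs a /chat-process route or the frontend's request path needs to change. A minimal stdlib sketch of the same mismatch:

```python
import http.client
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

class Handler(BaseHTTPRequestHandler):
    """Only POST /chat-process is routed; every other path gets a 404."""

    def do_POST(self):
        if self.path == "/chat-process":
            body = b'{"ok": true}'
            self.send_response(200)
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)  # mirrors: "POST ... HTTP/1.1" 404 Not Found

    def log_message(self, *args):  # keep demo output quiet
        pass

server = HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

def post(path: str) -> int:
    conn = http.client.HTTPConnection("127.0.0.1", port)
    conn.request("POST", path, b"{}", {"Content-Type": "application/json"})
    status = conn.getresponse().status
    conn.close()
    return status

matched = post("/chat-process")  # route exists -> 200
missing = post("/other-path")    # no such route -> 404
server.shutdown()
```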

Local knowledge base

How do I set up a local knowledge base so that the model answers questions based on my own documents?

frontend root_path support?

  • I have read this README and the upstream project's README
  • There is no similar existing issue

Can the frontend be mounted under a sub-path?
For example, changing the home page from http://chatglm.nczkevin.com/ to http://nczkevin.com/chatglm/.
Simply reverse-proxying the frontend runs into path problems in main.js and src/client.js.

connect ECONNREFUSED

Frontend
➜ Local: http://localhost:3000/
➜ Network: http://192.168.1.9:3000/
➜ Network: http://172.28.80.1:3000/
➜ press h to show help
11:08:46 [vite] http proxy error at /chat-process:
Error: connect ECONNREFUSED ::1:3002
at TCPConnectWrap.afterConnect [as oncomplete] (node:net:1494:16)
Backend
Loading checkpoint shards: 100%|█████████████████████████████████████████████████████████| 8/8 [00:15<00:00, 1.89s/it]
INFO: Started server process [17736]
INFO: Waiting for application startup.
INFO: Application startup complete.
INFO: Uvicorn running on http://0.0.0.0:3002 (Press CTRL+C to quit)
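The likely cause: the vite dev proxy resolves localhost to the IPv6 loopback ::1, while uvicorn bound to 0.0.0.0 listens on IPv4 only, so the IPv6 connection attempt is refused. Pointing the proxy target at 127.0.0.1 instead of localhost usually fixes it. A small stdlib demonstration of the mismatch:

```python
import socket

# Bind an IPv4-only listener, like uvicorn on 0.0.0.0.
srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.bind(("0.0.0.0", 0))
srv.listen(1)
port = srv.getsockname()[1]

# IPv4 loopback connects fine.
ipv4_ok = True
try:
    socket.create_connection(("127.0.0.1", port), timeout=2).close()
except OSError:
    ipv4_ok = False

# IPv6 loopback (what "localhost" resolved to for the proxy) is refused,
# producing the same ECONNREFUSED ::1:<port> as in the vite log above.
ipv6_refused = False
try:
    s6 = socket.socket(socket.AF_INET6, socket.SOCK_STREAM)
    s6.settimeout(2)
    s6.connect(("::1", port))
    s6.close()
except OSError:
    ipv6_refused = True

srv.close()
```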

Feature suggestions and ideas for chatglm-web

Suggestions are welcome. Planned ideas so far:

  1. Catch up with features from the upstream chatgpt-web repo, including access control and the Prompt Store
  2. Side-by-side output comparison with ChatGPT (dual-model mode)
  3. Data import and export

Help needed: what is this problem?

CUDA is installed and working correctly.

Traceback (most recent call last):
File "D:\chatglm-web-main\service\main.py", line 186, in <module>
model = AutoModel.from_pretrained(model_path, trust_remote_code=True).half().quantize(quantize).cuda()
File "C:\Users\Administrator/.cache\huggingface\modules\transformers_modules\THUDM\chatglm-6b\658202d88ac4bb782b99e99ac3adff58b4d0b813\modeling_chatglm.py", line 1434, in quantize
self.transformer = quantize(self.transformer, bits, empty_init=empty_init, **kwargs)
File "C:\Users\Administrator/.cache\huggingface\modules\transformers_modules\THUDM\chatglm-6b\658202d88ac4bb782b99e99ac3adff58b4d0b813\quantization.py", line 159, in quantize
weight_tensor=layer.attention.query_key_value.weight.to(torch.cuda.current_device()),
File "C:\Users\Administrator\AppData\Local\Programs\Python\Python310\lib\site-packages\torch\cuda\__init__.py", line 674, in current_device
_lazy_init()
File "C:\Users\Administrator\AppData\Local\Programs\Python\Python310\lib\site-packages\torch\cuda\__init__.py", line 239, in _lazy_init
raise AssertionError("Torch not compiled with CUDA enabled")
AssertionError: Torch not compiled with CUDA enabled
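This AssertionError means the installed PyTorch is a CPU-only build: .cuda() (and quantize, which calls torch.cuda.current_device()) fail immediately. Either install a CUDA-enabled torch wheel, or fall back to CPU. Below is the fallback decision factored into a pure, testable helper (pick_device_chain is hypothetical; it lists the conversions to apply to the model returned by from_pretrained):

```python
from typing import List, Optional

def pick_device_chain(cuda_available: bool,
                      quantize_bits: Optional[int]) -> List[str]:
    """Decide which conversions to apply after AutoModel.from_pretrained.

    half/quantize/cuda only make sense on a CUDA build; on CPU-only
    torch, .cuda() raises "Torch not compiled with CUDA enabled", so
    use .float() and skip quantization instead.
    """
    if cuda_available:
        chain = ["half"]
        if quantize_bits:
            chain.append(f"quantize({quantize_bits})")
        chain.append("cuda")
        return chain
    return ["float"]  # CPU fallback avoids the .cuda() AssertionError

# In service/main.py this would translate to something like:
# if torch.cuda.is_available():
#     model = model.half().quantize(quantize).cuda()
# else:
#     model = model.float()
```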

The service won't start after cloning

python main.py --device='cpu' --host='127.0.0.1' --port='7680'
Traceback (most recent call last):
File "/Users/weicheng/Documents/Dev/llm/chatglm-web/service/main.py", line 16, in <module>
import knowledge
File "/Users/weicheng/Documents/Dev/llm/chatglm-web/service/knowledge.py", line 4, in <module>
ix = storage.open_index()
File "/opt/homebrew/lib/python3.10/site-packages/whoosh/filedb/filestore.py", line 176, in open_index
return indexclass(self, schema=schema, indexname=indexname)
File "/opt/homebrew/lib/python3.10/site-packages/whoosh/index.py", line 421, in __init__
TOC.read(self.storage, self.indexname, schema=self._schema)
File "/opt/homebrew/lib/python3.10/site-packages/whoosh/index.py", line 618, in read
raise EmptyIndexError("Index %r does not exist in %r"
whoosh.index.EmptyIndexError: Index 'MAIN' does not exist in FileStorage('knowdata')
