daodao97 / chatdoc
Chat with your doc by OpenAI
When starting the backend, it reports a package import error. Full message:
Requirement already satisfied: mypy-extensions>=0.3.0 in /usr/local/lib/python3.10/dist-packages (from typing-inspect>=0.4.0->dataclasses-json<0.6.0,>=0.5.7->langchain==0.0.123->-r requirements.txt (line 5)) (1.0.0)
Traceback (most recent call last):
File "/app/chatdoc/server/main.py", line 3, in <module>
from doc_util import Doc
File "/app/chatdoc/server/doc_util.py", line 9, in <module>
from epub2txt import epub2txt
File "/usr/local/lib/python3.10/dist-packages/epub2txt/__init__.py", line 2, in <module>
from .epub2txt import epub2txt
File "/usr/local/lib/python3.10/dist-packages/epub2txt/epub2txt.py", line 10, in <module>
from collections import Iterable # < py38
ImportError: cannot import name 'Iterable' from 'collections' (/usr/lib/python3.10/collections/__init__.py)
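This import fails because on Python 3.10 the abstract base classes were removed from the top-level `collections` module. A minimal compatibility shim (a sketch you could patch into the offending module; the helper function is illustrative) looks like:

```python
# `Iterable` moved to collections.abc in Python 3.3 and was removed from the
# top-level `collections` module in Python 3.10, which is what breaks epub2txt.
try:
    from collections.abc import Iterable
except ImportError:  # very old Pythons only
    from collections import Iterable

def is_iterable(obj) -> bool:
    # Note that strings count as iterable too under this check.
    return isinstance(obj, Iterable)
```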
Invalid URL (POST /engines/text-embedding-ada-002/embeddings): the openai 0.28.0 package no longer supports this path; it uses /embeddings instead. Which route in chatdoc/server needs to be changed?
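Newer OpenAI API versions take the model name in the request body and POST to /v1/embeddings rather than the old engine-scoped path. A sketch of building such a request with only the standard library (the API key and base URL here are placeholders, not values from chatdoc):

```python
import json
import urllib.request

API_BASE = "https://api.openai.com/v1"  # assumption: the default OpenAI base URL

def build_embedding_request(text: str, api_key: str,
                            model: str = "text-embedding-ada-002") -> urllib.request.Request:
    # New-style endpoint: the model goes in the JSON body and the path is just
    # /embeddings, replacing the deprecated /engines/<model>/embeddings form.
    payload = {"model": model, "input": text}
    return urllib.request.Request(
        f"{API_BASE}/embeddings",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )

req = build_embedding_request("hello world", api_key="sk-placeholder")
```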
1. Is your Python version 3.8? You still need to add python-multipart to the requirements.txt file.
2. Get an error with "can not find data directory".
3. When I try to upload a file, nothing happens.
Having looked at chatpdf, chatdoc, and similar projects, this doesn't seem that hard (though it is for me; I'm a poor coder, not a practitioner, just reading code for fun).
But there doesn't seem to be a project like this one; everyone is building on the idea of "let ChatGPT read a document for me".
app.copilothub.co seems to be the exception: upload a document and it generates a persona based on the document that chats with the user. But they're not open source, and my programming background is too weak, so I can only guess.
Have you seen similar projects on GitHub?
Interact with large language models through the local API provided by Ollama
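Ollama's local server listens on port 11434 and exposes a /api/generate endpoint. A minimal sketch of building such a request (the model name and host are assumptions; nothing here actually contacts a server):

```python
import json
import urllib.request

def build_ollama_request(prompt: str, model: str = "llama2",
                         host: str = "http://localhost:11434") -> urllib.request.Request:
    # Ollama's generate endpoint takes the model and prompt in a JSON body;
    # stream=False asks for one complete JSON response instead of chunked output.
    payload = {"model": model, "prompt": prompt, "stream": False}
    return urllib.request.Request(
        f"{host}/api/generate",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

req = build_ollama_request("Summarize this document.")
# Sending it (with an Ollama server running) would be:
#   urllib.request.urlopen(req)  -> JSON body with a "response" field
```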
daodao, after running it this way, I can't connect at the http://localhost:5173/ page. What is wrong with my steps?
Here is the error from the build:
chatdoc-backend | Requirement already satisfied: pip in /usr/local/lib/python3.9/site-packages (22.0.4)
chatdoc-backend | Collecting pip
chatdoc-backend | Downloading pip-23.0.1-py3-none-any.whl (2.1 MB)
chatdoc-backend | ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 2.1/2.1 MB 3.6 MB/s eta 0:00:00
chatdoc-backend | Installing collected packages: pip
chatdoc-backend | Attempting uninstall: pip
chatdoc-backend | Found existing installation: pip 22.0.4
chatdoc-backend | Uninstalling pip-22.0.4:
chatdoc-backend | Successfully uninstalled pip-22.0.4
chatdoc-backend | Successfully installed pip-23.0.1
chatdoc-backend | WARNING: Running pip as the 'root' user can result in broken permissions and conflicting behaviour with the system package manager. It is recommended to use a virtual environment instead: https://pip.pypa.io/warnings/venv
chatdoc-backend | /bin/sh: 3: ./start: not found
chatdoc-backend exited with code 127
chatdoc-frontend |
chatdoc-frontend | added 1 package in 9s
chatdoc-frontend |
chatdoc-frontend | 1 package is looking for funding
chatdoc-frontend | run `npm fund` for details
chatdoc-frontend | npm notice
chatdoc-frontend | npm notice New minor version of npm available! 9.5.1 -> 9.6.3
chatdoc-frontend | npm notice Changelog: <https://github.com/npm/cli/releases/tag/v9.6.3>
chatdoc-frontend | npm notice Run `npm install -g [email protected]` to update!
chatdoc-frontend | npm notice
chatdoc-frontend | /bin/sh: 3: ./start: not found
chatdoc-frontend exited with code 127
This is the error at startup; it looks like the start command wasn't copied into the image?
3:31:21 AM [vite] http proxy error at /upload:
chatdoc-frontend | Error: connect ECONNREFUSED 127.0.0.1:8000
chatdoc-frontend | at TCPConnectWrap.afterConnect [as oncomplete] (node:net:1571:16)
Requirement already satisfied: mypy-extensions>=0.3.0 in /usr/local/lib/python3.9/site-packages (from typing-inspect>=0.4.0->dataclasses-json<0.6.0,>=0.5.7->langchain==0.0.123->-r requirements.txt (line 5)) (1.0.0)
Installing collected packages: fastapi
Successfully installed fastapi-0.95.0
WARNING: Running pip as the 'root' user can result in broken permissions and conflicting behaviour with the system package manager. It is recommended to use a virtual environment instead: https://pip.pypa.io/warnings/venv
Traceback (most recent call last):
File "main.py", line 1, in <module>
from fastapi import FastAPI, File, UploadFile, BackgroundTasks
ModuleNotFoundError: No module named 'fastapi'
How do I get an OPENAI_API_KEY?
Can I run the front end and back end directly without Docker? When I run them directly and upload a file, it stays stuck at "building index" forever; what could be the reason?
epub and pdf files fail to load and display correctly; web pages and txt display fine. Console errors:
1. Failed to load resource: the server responded with a status of 404 (Not Found)
viewer.html:1
2. Access to fetch at 'http://192.168.31.240:8000/static/927c0d75c7e9c278e0e2f862ccfa05b7/Proof%20of%20employment.pdf' from origin 'http://192.168.31.240:5173' has been blocked by CORS policy: No 'Access-Control-Allow-Origin' header is present on the requested resource. If an opaque response serves your needs, set the request's mode to 'no-cors' to fetch the resource with CORS disabled.
:8000/static/927c0d75c7e9c278e0e2f862ccfa05b7/Proof%20of%20employment.pdf:1
3. Failed to load resource: net::ERR_FAILED
app.js:1109
4. An error occurred while loading the PDF.
PDF.js v3.4.120 (build: af6414988)
Message: Failed to fetch
_otherError @ app.js:1109
5. Uncaught (in promise) UnknownErrorException {details: "TypeError: Failed to fetch", message: "Failed to fetch", name: "UnknownErrorException"}
Stack: Error at BaseExceptionClosure (http://192.168.31.240:5173/pdfjs/build/pdf.js:482:29) at Object.defineProperty.value (http://192.168.31.240:5173/pdfjs/build/pdf.js:485:2) at w_pdfjs_require (http://192.168.31.240:5173/pdfjs/build/pdf.js:15512:41) at http://192.168.31.240:5173/pdfjs/build/pdf.js:15774:13 at http://192.168.31.240:5173/pdfjs/build/pdf.js:15786:3 at http://192.168.31.240:5173/pdfjs/build/pdf.js:15789:12 at webpackUniversalModuleDefinition (http://192.168.31.240:5173/pdfjs/build/pdf.js:31:50) at http://192.168.31.240:5173/pdfjs/build/pdf.js:32:3
chatdoc-frontend | > [email protected] dev /data
chatdoc-frontend | > vite --host
chatdoc-frontend |
chatdoc-frontend |
chatdoc-frontend | VITE v4.2.1 ready in 906 ms
chatdoc-frontend |
chatdoc-frontend | ➜ Local: http://localhost:5173/
chatdoc-frontend | ➜ Network: http://172.29.0.3:5173/
chatdoc-frontend | node:internal/errors:490
chatdoc-frontend | ErrorCaptureStackTrace(err);
chatdoc-frontend | ^
chatdoc-frontend |
chatdoc-frontend | Error: ENOSPC: System limit for number of file watchers reached, watch '/data/.pnpm-store/v3/files/91/4a1eaca75f67e09109a8ba403cf5d07fe3107d55cfeca64592d63c5ced82e81403d699297cfc45272d4026cf094ebee301db105d868a1c952fdf0aea81d4f0'
chatdoc-frontend | at FSWatcher. (node:internal/fs/watchers:247:19)
chatdoc-frontend | at Object.watch (node:fs:2350:36)
chatdoc-frontend | at createFsWatchInstance (file:///data/node_modules/.pnpm/[email protected][email protected]/node_modules/vite/dist/node/chunks/dep-79892de8.js:50328:17)
chatdoc-frontend | at setFsWatchListener (file:///data/node_modules/.pnpm/[email protected][email protected]/node_modules/vite/dist/node/chunks/dep-79892de8.js:50375:15)
chatdoc-frontend | at NodeFsHandler._watchWithNodeFs (file:///data/node_modules/.pnpm/[email protected][email protected]/node_modules/vite/dist/node/chunks/dep-79892de8.js:50530:14)
chatdoc-frontend | at NodeFsHandler._handleFile (file:///data/node_modules/.pnpm/[email protected][email protected]/node_modules/vite/dist/node/chunks/dep-79892de8.js:50594:23)
chatdoc-frontend | at NodeFsHandler._addToNodeFs (file:///data/node_modules/.pnpm/[email protected][email protected]/node_modules/vite/dist/node/chunks/dep-79892de8.js:50836:21)
chatdoc-frontend | Emitted 'error' event on FSWatcher instance at:
chatdoc-frontend | at FSWatcher._handleError (file:///data/node_modules/.pnpm/[email protected][email protected]/node_modules/vite/dist/node/chunks/dep-79892de8.js:52028:10)
chatdoc-frontend | at NodeFsHandler._addToNodeFs (file:///data/node_modules/.pnpm/[email protected][email protected]/node_modules/vite/dist/node/chunks/dep-79892de8.js:50844:18) {
chatdoc-frontend | errno: -28,
chatdoc-frontend | syscall: 'watch',
chatdoc-frontend | code: 'ENOSPC',
chatdoc-frontend | path: '/data/.pnpm-store/v3/files/91/4a1eaca75f67e09109a8ba403cf5d07fe3107d55cfeca64592d63c5ced82e81403d699297cfc45272d4026cf094ebee301db105d868a1c952fdf0aea81d4f0',
chatdoc-frontend | filename: '/data/.pnpm-store/v3/files/91/4a1eaca75f67e09109a8ba403cf5d07fe3107d55cfeca64592d63c5ced82e81403d699297cfc45272d4026cf094ebee301db105d868a1c952fdf0aea81d4f0'
chatdoc-frontend | }
chatdoc-frontend |
I can't find the cause.
I am getting error message when i use both localhost and 127.0.0.1. Any help would be appreciated
[vite] http proxy error at /my_docs:
Error: connect ECONNREFUSED ::1:8000
at TCPConnectWrap.afterConnect [as oncomplete] (node:net:1494:16)
chatdoc-backend | WARNING: Running pip as the 'root' user can result in broken permissions and conflicting behaviour with the system package manager. It is recommended to use a virtual environment instead: https://pip.pypa.io/warnings/venv
chatdoc-backend | INFO: Will watch for changes in these directories: ['/data']
chatdoc-backend | INFO: Uvicorn running on http://0.0.0.0:8000 (Press CTRL+C to quit)
chatdoc-backend | INFO: Started reloader process [1] using StatReload
chatdoc-backend | INFO: Started server process [98]
chatdoc-backend | INFO: Waiting for application startup.
chatdoc-backend | INFO: Application startup complete.
chatdoc-frontend | 6:33:04 AM [vite] http proxy error at /my_docs:
chatdoc-frontend | Error: connect ECONNREFUSED 127.0.0.1:8000
chatdoc-frontend | at TCPConnectWrap.afterConnect [as oncomplete] (node:net:1532:16)
chatdoc-frontend | 6:33:13 AM [vite] http proxy error at /upload:
chatdoc-frontend | Error: connect ECONNREFUSED 127.0.0.1:8000
chatdoc-frontend | at TCPConnectWrap.afterConnect [as oncomplete] (node:net:1532:16)
At the moment the frontend page renders, but no functionality works; the errors are above. What needs to be changed? Thanks.
Hi! I want to modify some of chatdoc's backend features, but when debugging the backend I get the following error:
Traceback (most recent call last):
File "/Users/nancygetup/PycharmProjects/chatdoc-final/server/main.py", line 6, in <module>
from doc_util import Doc
File "/Users/nancygetup/PycharmProjects/chatdoc-final/server/doc_util.py", line 17, in <module>
llm_predictor = LLMPredictor(llm=ChatOpenAI(
File "pydantic/main.py", line 341, in pydantic.main.BaseModel.__init__
pydantic.error_wrappers.ValidationError: 1 validation error for ChatOpenAI
__root__
Did not find openai_api_key, please add an environment variable `OPENAI_API_KEY` which contains it, or pass `openai_api_key` as a named parameter. (type=value_error)
I've already put my API key in the yml file; I don't know what the problem is here. Could you advise? Thanks.
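langchain's ChatOpenAI reads the key from the process environment, so a key that only exists in docker-compose.yml must actually reach the backend process's environment (or be passed as `openai_api_key=`). A small sketch for failing fast with a clearer message (the helper name is illustrative, not part of chatdoc):

```python
import os

def require_openai_key(env=os.environ) -> str:
    # ChatOpenAI looks for OPENAI_API_KEY in the environment unless you pass
    # openai_api_key=... explicitly; checking early gives a clearer error.
    key = env.get("OPENAI_API_KEY")
    if not key:
        raise RuntimeError(
            "OPENAI_API_KEY not set in this process; export it, put it in the "
            "container's environment, or pass openai_api_key= to ChatOpenAI"
        )
    return key
```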
> Hello guys.
I know that we can use OpenAI as a RESTful API.
But this is not streaming.
I need to implement typing action.
How can I do that?
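For a typing effect you want the streaming variant of the chat endpoint (`stream=True` in the pre-1.0 openai package), which yields incremental "delta" chunks. A sketch of the chunk-handling side, run against a fake stream so nothing touches the network:

```python
def iter_delta_text(stream):
    # With stream=True, openai.ChatCompletion.create yields chunks whose
    # choices[0]["delta"] carries at most a fragment of the reply text;
    # emitting each fragment as it arrives produces the "typing" effect.
    for chunk in stream:
        delta = chunk["choices"][0]["delta"]
        if "content" in delta:
            yield delta["content"]

# Fake chunks shaped like the API's streaming output, for illustration only:
fake_stream = [
    {"choices": [{"delta": {"role": "assistant"}}]},
    {"choices": [{"delta": {"content": "Hel"}}]},
    {"choices": [{"delta": {"content": "lo!"}}]},
    {"choices": [{"delta": {}}]},
]
typed = "".join(iter_delta_text(fake_stream))
```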
I haven't used Next.js; you could ask ChatGPT about it and should get a satisfactory answer.
Originally posted by @daodao97 in #36 (comment)
Hi daodao, chatdoc is impressive; I'd like to get to know you. I'm at a large-model company. WeChat: 13261569307
Is the cause in the Prompt? How do I change it?
How do I get an OPEN_AI_KEY?
Hello guys.
How can I use the OpenAI GPT-4 version? Is that not free?
chatdoc-backend | Traceback (most recent call last):
chatdoc-backend | File "/data/main.py", line 3, in <module>
chatdoc-backend | from doc_util import Doc
chatdoc-backend | File "/data/doc_util.py", line 12, in <module>
chatdoc-backend | import docx2txt
chatdoc-backend | ModuleNotFoundError: No module named 'docx2txt'
chatdoc-backend exited with code 1
PDF uploads are limited to 2 MB; where do I change that limit?
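Where chatdoc actually enforces the 2 MB cap is not shown here (it could be the frontend, a proxy, or the upload handler), but a server-side size guard in the upload path would look roughly like this; the limit and names are illustrative:

```python
MAX_UPLOAD_BYTES = 20 * 1024 * 1024  # e.g. raise the cap from 2 MB to 20 MB

def check_upload_size(data: bytes, limit: int = MAX_UPLOAD_BYTES) -> bytes:
    # Reject oversized uploads before indexing; change `limit` to move the cap.
    if len(data) > limit:
        raise ValueError(f"file too large: {len(data)} bytes > {limit} bytes")
    return data
```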
After I run docker-compose up, I get this error:
chatdoc-backend | Traceback (most recent call last):
chatdoc-backend | File "/data/main.py", line 104, in <module>
chatdoc-backend | uvicorn.run('main:app', host="0.0.0.0", reload=True)
chatdoc-backend | File "/usr/local/lib/python3.9/site-packages/uvicorn/main.py", line 563, in run
chatdoc-backend | ChangeReload(config, target=server.run, sockets=[sock]).run()
chatdoc-backend | File "/usr/local/lib/python3.9/site-packages/uvicorn/supervisors/basereload.py", line 45, in run
chatdoc-backend | for changes in self:
chatdoc-backend | File "/usr/local/lib/python3.9/site-packages/uvicorn/supervisors/basereload.py", line 64, in __next__
chatdoc-backend | return self.should_restart()
chatdoc-backend | File "/usr/local/lib/python3.9/site-packages/uvicorn/supervisors/statreload.py", line 32, in should_restart
chatdoc-backend | for file in self.iter_py_files():
chatdoc-backend | File "/usr/local/lib/python3.9/site-packages/uvicorn/supervisors/statreload.py", line 52, in iter_py_files
chatdoc-backend | for path in list(reload_dir.rglob("*.py")):
chatdoc-backend | File "/usr/local/lib/python3.9/pathlib.py", line 1190, in rglob
chatdoc-backend | for p in selector.select_from(self):
chatdoc-backend | File "/usr/local/lib/python3.9/pathlib.py", line 611, in _select_from
chatdoc-backend | for p in successor_select(starting_point, is_dir, exists, scandir):
chatdoc-backend | File "/usr/local/lib/python3.9/pathlib.py", line 558, in _select_from
chatdoc-backend | with scandir(parent_path) as scandir_it:
chatdoc-backend | FileNotFoundError: [Errno 2] No such file or directory: '/sys/devices/LNXSYSTM:00/LNXSYBUS:00/ACPI0004:00/VMBUS:00/383305f1-d159-11ed-85a4-8038fb684b70'
But in Docker Desktop neither container looks down, yet http://localhost:5173/ won't open. I searched everywhere without finding the cause, so I have to ask you.
Many other chatpaper-style apps have implemented automatic brainstorming. I asked ChatGPT how a brainstorming prompt should be written: https://poe.com/s/bxwSkBQdat9OZCJSBsv0
For reference:
Human: I want you to act like a prompt engineer who is building a chatbot for research papers. Your goal is to write a prompt for GPT and have it generate brainstorm questions based on the current user conversation and the current paper.
ChatGPT: Sure, I can help you with that! Here's a prompt that you can use to generate brainstorm questions based on the user's conversation and the current research paper:
Prompt: Given the current conversation and the research paper being discussed, generate brainstorm questions that can help deepen the user's understanding of the paper. Your questions should be clear, concise, and relevant to the paper's main topic.
Example conversation:
User: I'm interested in learning more about the use of neural networks in natural language processing.
Bot: Sure, I can help with that! I have a research paper here that explores the use of neural networks for sentiment analysis in social media data. Would you like to discuss that?
Example paper:
Title: A Deep Learning Approach to Sentiment Analysis in Social Media Data
Authors: John Smith, Jane Doe, and Sarah Lee
Abstract: In this paper, we present a deep learning approach to sentiment analysis in social media data. Specifically, we use a convolutional neural network (CNN) to classify tweets as positive, negative, or neutral based on their content. Our model achieves state-of-the-art performance on several benchmark datasets, demonstrating the effectiveness of the CNN architecture for this task.
Generated questions:
How does the use of a convolutional neural network improve sentiment analysis in social media data?
What benchmark datasets were used to evaluate the performance of the model?
Can this approach be applied to other types of social media data besides Twitter?
How does this approach compare to other sentiment analysis techniques, such as rule-based or machine learning-based methods?
What are the implications of this research for businesses or organizations that use social media for customer engagement?
What are the limitations of using a CNN for sentiment analysis, and how can they be addressed?
Are there any potential ethical concerns related to using this approach to analyze social media data?
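The prompt above can be assembled programmatically before each model call; a sketch of that (the function and field names are made up for illustration):

```python
def brainstorm_prompt(conversation: str, paper_summary: str) -> str:
    # Combine the standing instruction with the live conversation and paper,
    # mirroring the prompt structure suggested above.
    return (
        "Given the current conversation and the research paper being discussed, "
        "generate brainstorm questions that can help deepen the user's "
        "understanding of the paper. Your questions should be clear, concise, "
        "and relevant to the paper's main topic.\n\n"
        f"Conversation:\n{conversation}\n\n"
        f"Paper:\n{paper_summary}\n\n"
        "Brainstorm questions:"
    )

prompt = brainstorm_prompt(
    "User: I'm interested in neural networks for NLP.",
    "A Deep Learning Approach to Sentiment Analysis in Social Media Data",
)
```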
Hi @daodao97, Your work is awesome. I've tried using another tool called ChatDoc, which is quite similar to your work. This tool has a select, click, and ask feature that allows users to select a segment in a PDF or DOC file and display it on the right side. I would like to implement this feature in my own tool, but I'm not sure how to do it. Could you please provide me with some advice or guidelines to help me? Thank you!
chatdoc-frontend | 3:43:15 PM [vite] http proxy error at /my_docs:
chatdoc-frontend | Error: connect ECONNREFUSED 127.0.0.1:8000
chatdoc-frontend | at TCPConnectWrap.afterConnect [as oncomplete] (node:net:1606:16)
chatdoc-backend | INFO: 172.18.0.1:42538 - "GET / HTTP/1.1" 404 Not Found
chatdoc-backend | INFO: 172.18.0.1:42550 - "GET /upload HTTP/1.1" 405 Method Not Allowed
chatdoc-backend | INFO: 172.18.0.1:43416 - "GET /my_docs HTTP/1.1" 200 OK
chatdoc-backend | INFO: 172.18.0.1:57268 - "GET /my_docs HTTP/1.1" 200 OK
chatdoc-frontend | 3:49:52 PM [vite] http proxy error at /my_docs:
chatdoc-frontend | Error: connect ECONNREFUSED 127.0.0.1:8000
chatdoc-frontend | at TCPConnectWrap.afterConnect [as oncomplete] (node:net:1606:16)
The backend is definitely running.
Hi, I'm trying to run the application in a local environment, but when I run the command
docker-compose up --build
docker start to build the container but after a while it fails and returns this error
Traceback (most recent call last):
File "/data/main.py", line 3, in <module>
from doc_util import Doc
File "/data/doc_util.py", line 19, in <module>
CJKPDFReader = download_loader("CJKPDFReader")
File "/usr/local/lib/python3.9/site-packages/llama_index/readers/download.py", line 125, in download_loader
library = json.loads(library_raw_content)
File "/usr/local/lib/python3.9/json/__init__.py", line 346, in loads
return _default_decoder.decode(s)
File "/usr/local/lib/python3.9/json/decoder.py", line 340, in decode
raise JSONDecodeError("Extra data", s, end)
json.decoder.JSONDecodeError: Extra data: line 1 column 4 (char 3)
Can someone help with this issue? Am I doing something wrong?
Hello guys.
I know that we can use OpenAI as a RESTful API.
But this is not streaming.
I need to implement typing action.
How can I do that?
Hello,
Could you spell out this project's limitations? I'd like to know what constraints there are on conversations.
When starting with docker compose up, I get an error:
chatdoc-backend | ModuleNotFoundError: No module named 'docx2txt'
ERROR: Could not find a version that satisfies the requirement beautifulsoup4==4.12.0 (from versions: 4.0.1, 4.0.2, 4.0.3, 4.0.4, 4.0.5, 4.1.0, 4.1.1, 4.1.2, 4.1.3, 4.2.0, 4.2.1, 4.3.0, 4.3.1, 4.3.2, 4.4.0, 4.4.1, 4.5.0, 4.5.1, 4.5.2, 4.5.3, 4.6.0, 4.6.1, 4.6.2, 4.6.3, 4.7.0, 4.7.1, 4.8.0, 4.8.1, 4.8.2, 4.9.0, 4.9.1, 4.9.2, 4.9.3, 4.10.0, 4.11.0, 4.11.1, 4.11.2)
Several packages fail the same way; what's going on?
@daodao97 Hi! After getting the front end and back end running, uploads stay stuck at "building index". Checking the server terminal, I found the following error:
PermissionError: [Errno 13] Permission denied: '/tmp/data-gym-cache/6d1cbeee0f20b3d9449abfede4726ed8212e3aee.4ce1aec6-8596-4927-9911-41a1d1accd83.tmp'
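That path is tiktoken's token-data cache; the server user can't write to a cache file another user created in /tmp. One workaround (assuming the backend uses tiktoken, which the data-gym-cache path suggests) is to point the cache at a directory this process can write, before tiktoken is imported:

```python
import os
import tempfile

# tiktoken honors the TIKTOKEN_CACHE_DIR environment variable; give this
# process its own writable cache instead of the shared /tmp/data-gym-cache
# directory that caused the PermissionError above.
cache_dir = os.path.join(tempfile.gettempdir(), f"tiktoken-cache-{os.getpid()}")
os.makedirs(cache_dir, exist_ok=True)
os.environ["TIKTOKEN_CACHE_DIR"] = cache_dir
```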
A fresh install now errors; older installs didn't. The Python environment has probably changed.
chatdoc-backend | Traceback (most recent call last):
chatdoc-backend | File "/data/main.py", line 3, in <module>
chatdoc-backend | from doc_util import Doc
chatdoc-backend | File "/data/doc_util.py", line 19, in <module>
chatdoc-backend | CJKPDFReader = download_loader("CJKPDFReader")
chatdoc-backend | File "/usr/local/lib/python3.9/site-packages/llama_index/readers/download.py", line 125, in download_loader
chatdoc-backend | library = json.loads(library_raw_content)
chatdoc-backend | File "/usr/local/lib/python3.9/json/__init__.py", line 346, in loads
chatdoc-backend | return _default_decoder.decode(s)
chatdoc-backend | File "/usr/local/lib/python3.9/json/decoder.py", line 340, in decode
chatdoc-backend | raise JSONDecodeError("Extra data", s, end)
chatdoc-backend | json.decoder.JSONDecodeError: Extra data: line 1 column 4 (char 3)
Hello, after uploading a file I got the following error. Could you tell me why?
ValueError: The constructor now takes in a list of Node objects. Since you are passing in a list of Document objects, please use `from_documents` instead.
The full error is:
ERROR: Exception in ASGI application
Traceback (most recent call last):
File "C:\Users\15030\Desktop\chatdoc-master\venv\lib\site-packages\uvicorn\protocols\http\h11_impl.py", line 429, in run_asgi
result = await app( # type: ignore[func-returns-value]
File "C:\Users\15030\Desktop\chatdoc-master\venv\lib\site-packages\uvicorn\middleware\proxy_headers.py", line 78, in __call__
return await self.app(scope, receive, send)
File "C:\Users\15030\Desktop\chatdoc-master\venv\lib\site-packages\fastapi\applications.py", line 276, in __call__
await super().__call__(scope, receive, send)
File "C:\Users\15030\Desktop\chatdoc-master\venv\lib\site-packages\starlette\applications.py", line 122, in __call__
await self.middleware_stack(scope, receive, send)
File "C:\Users\15030\Desktop\chatdoc-master\venv\lib\site-packages\starlette\middleware\errors.py", line 184, in __call__
raise exc
File "C:\Users\15030\Desktop\chatdoc-master\venv\lib\site-packages\starlette\middleware\errors.py", line 162, in __call__
await self.app(scope, receive, _send)
File "C:\Users\15030\Desktop\chatdoc-master\venv\lib\site-packages\starlette\middleware\cors.py", line 92, in __call__
await self.simple_response(scope, receive, send, request_headers=headers)
File "C:\Users\15030\Desktop\chatdoc-master\venv\lib\site-packages\starlette\middleware\cors.py", line 147, in simple_response
await self.app(scope, receive, send)
File "C:\Users\15030\Desktop\chatdoc-master\venv\lib\site-packages\starlette\middleware\exceptions.py", line 79, in __call__
raise exc
File "C:\Users\15030\Desktop\chatdoc-master\venv\lib\site-packages\starlette\middleware\exceptions.py", line 68, in __call__
await self.app(scope, receive, sender)
File "C:\Users\15030\Desktop\chatdoc-master\venv\lib\site-packages\fastapi\middleware\asyncexitstack.py", line 21, in __call__
raise e
File "C:\Users\15030\Desktop\chatdoc-master\venv\lib\site-packages\fastapi\middleware\asyncexitstack.py", line 18, in __call__
await self.app(scope, receive, send)
File "C:\Users\15030\Desktop\chatdoc-master\venv\lib\site-packages\starlette\routing.py", line 718, in __call__
await route.handle(scope, receive, send)
File "C:\Users\15030\Desktop\chatdoc-master\venv\lib\site-packages\starlette\routing.py", line 276, in handle
await self.app(scope, receive, send)
File "C:\Users\15030\Desktop\chatdoc-master\venv\lib\site-packages\starlette\routing.py", line 69, in app
await response(scope, receive, send)
File "C:\Users\15030\Desktop\chatdoc-master\venv\lib\site-packages\starlette\responses.py", line 174, in __call__
await self.background()
File "C:\Users\15030\Desktop\chatdoc-master\venv\lib\site-packages\starlette\background.py", line 43, in __call__
await task()
File "C:\Users\15030\Desktop\chatdoc-master\venv\lib\site-packages\starlette\background.py", line 28, in __call__
await run_in_threadpool(self.func, *self.args, **self.kwargs)
File "C:\Users\15030\Desktop\chatdoc-master\venv\lib\site-packages\starlette\concurrency.py", line 41, in run_in_threadpool
return await anyio.to_thread.run_sync(func, *args)
File "C:\Users\15030\Desktop\chatdoc-master\venv\lib\site-packages\anyio\to_thread.py", line 33, in run_sync
return await get_async_backend().run_sync_in_worker_thread(
File "C:\Users\15030\Desktop\chatdoc-master\venv\lib\site-packages\anyio\_backends\_asyncio.py", line 2106, in run_sync_in_worker_thread
return await future
File "C:\Users\15030\Desktop\chatdoc-master\venv\lib\site-packages\anyio\_backends\_asyncio.py", line 833, in run
result = context.run(func, *args)
File "C:\Users\15030\Desktop\chatdoc-master\server\main.py", line 98, in file_task
doc.build_index(res["doc_type"])
File "C:\Users\15030\Desktop\chatdoc-master\server\doc_util.py", line 101, in build_index
index = GPTSimpleVectorIndex(documents)
File "C:\Users\15030\Desktop\chatdoc-master\venv\lib\site-packages\llama_index\indices\vector_store\base.py", line 44, in __init__
super().__init__(
File "C:\Users\15030\Desktop\chatdoc-master\venv\lib\site-packages\llama_index\indices\base.py", line 51, in __init__
raise ValueError(
ValueError: The constructor now takes in a list of Node objects. Since you are passing in a list of Document objects, please use `from_documents` instead.
INFO: 192.168.110.1:57003 - "GET /eea8fcc666120d03c38118dd3be3dceb/%E8%AE%BA%E6%96%87%E4%BA%8C%E7%89%88.docx HTTP/1.1" 200 OK
Traceback (most recent call last):
File "/usr/local/python3.9.7/lib/python3.9/site-packages/uvicorn/protocols/http/h11_impl.py", line 429, in run_asgi
result = await app( # type: ignore[func-returns-value]
File "/usr/local/python3.9.7/lib/python3.9/site-packages/uvicorn/middleware/proxy_headers.py", line 78, in __call__
return await self.app(scope, receive, send)
File "/usr/local/python3.9.7/lib/python3.9/site-packages/fastapi/applications.py", line 276, in __call__
await super().__call__(scope, receive, send)
File "/usr/local/python3.9.7/lib/python3.9/site-packages/starlette/applications.py", line 122, in __call__
await self.middleware_stack(scope, receive, send)
File "/usr/local/python3.9.7/lib/python3.9/site-packages/starlette/middleware/errors.py", line 184, in __call__
raise exc
File "/usr/local/python3.9.7/lib/python3.9/site-packages/starlette/middleware/errors.py", line 162, in __call__
await self.app(scope, receive, _send)
File "/usr/local/python3.9.7/lib/python3.9/site-packages/starlette/middleware/cors.py", line 92, in __call__
await self.simple_response(scope, receive, send, request_headers=headers)
File "/usr/local/python3.9.7/lib/python3.9/site-packages/starlette/middleware/cors.py", line 147, in simple_response
await self.app(scope, receive, send)
File "/usr/local/python3.9.7/lib/python3.9/site-packages/starlette/middleware/exceptions.py", line 79, in __call__
raise exc
File "/usr/local/python3.9.7/lib/python3.9/site-packages/starlette/middleware/exceptions.py", line 68, in __call__
await self.app(scope, receive, sender)
File "/usr/local/python3.9.7/lib/python3.9/site-packages/fastapi/middleware/asyncexitstack.py", line 21, in __call__
raise e
File "/usr/local/python3.9.7/lib/python3.9/site-packages/fastapi/middleware/asyncexitstack.py", line 18, in __call__
await self.app(scope, receive, send)
File "/usr/local/python3.9.7/lib/python3.9/site-packages/starlette/routing.py", line 718, in __call__
await route.handle(scope, receive, send)
File "/usr/local/python3.9.7/lib/python3.9/site-packages/starlette/routing.py", line 276, in handle
await self.app(scope, receive, send)
File "/usr/local/python3.9.7/lib/python3.9/site-packages/starlette/routing.py", line 69, in app
await response(scope, receive, send)
File "/usr/local/python3.9.7/lib/python3.9/site-packages/starlette/responses.py", line 174, in __call__
await self.background()
File "/usr/local/python3.9.7/lib/python3.9/site-packages/starlette/background.py", line 43, in __call__
await task()
File "/usr/local/python3.9.7/lib/python3.9/site-packages/starlette/background.py", line 28, in __call__
await run_in_threadpool(self.func, *self.args, **self.kwargs)
File "/usr/local/python3.9.7/lib/python3.9/site-packages/starlette/concurrency.py", line 41, in run_in_threadpool
return await anyio.to_thread.run_sync(func, *args)
File "/usr/local/python3.9.7/lib/python3.9/site-packages/anyio/to_thread.py", line 33, in run_sync
return await get_asynclib().run_sync_in_worker_thread(
File "/usr/local/python3.9.7/lib/python3.9/site-packages/anyio/_backends/_asyncio.py", line 877, in run_sync_in_worker_thread
return await future
File "/usr/local/python3.9.7/lib/python3.9/site-packages/anyio/_backends/_asyncio.py", line 807, in run
result = context.run(func, *args)
File "/home/chat/chatdoc-master/server/main.py", line 98, in file_task
doc.build_index(res["doc_type"])
File "/home/chat/chatdoc-master/server/doc_util.py", line 100, in build_index
index = GPTSimpleVectorIndex(documents)
File "/usr/local/python3.9.7/lib/python3.9/site-packages/llama_index/indices/vector_store/base.py", line 44, in __init__
super().__init__(
File "/usr/local/python3.9.7/lib/python3.9/site-packages/llama_index/indices/base.py", line 51, in __init__
raise ValueError(
ValueError: The constructor now takes in a list of Node objects. Since you are passing in a list of Document objects, please use `from_documents` instead.