
Comments (34)

XianYue0125 commented on August 17, 2024

Another issue: when using streaming responses, an error is raised at unpredictable times:
httpx.RemoteProtocolError: peer closed connection without sending complete message body (incomplete chunked read)

[screenshot]
I also modified the nginx config, but it still doesn't work.

I haven't been able to track down the cause. I don't know much about the networking side, so any pointers would be appreciated. Thanks.
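For reference, when nginx sits in front of a streaming (SSE/chunked) backend, the usual culprits are proxy buffering and the default 60s read timeout. A minimal location block, with illustrative values that would need adjusting to the actual deployment:

```nginx
# Hypothetical sketch for proxying a streaming backend through nginx.
location / {
    proxy_pass http://127.0.0.1:8000;
    proxy_http_version 1.1;        # needed for chunked transfer to the upstream
    proxy_set_header Connection "";
    proxy_buffering off;           # don't buffer the whole streamed body
    proxy_cache off;
    proxy_read_timeout 600s;       # the 60s default commonly causes 504 on long completions
    proxy_send_timeout 600s;
}
```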

from openai-forward.

KenyonY commented on August 17, 2024

Is this the error log from the running openai-forward process?

XianYue0125 commented on August 17, 2024

The logs in all four folders under log/openai are empty. Everything in .env is set to true, but there are still no logs.

XianYue0125 commented on August 17, 2024

[screenshot]
My guess is that because I started the service with nohup, the logs all went to nohup.out. Here's a screenshot; everything else looks normal.

KenyonY commented on August 17, 2024
  1. If you don't proxy port 8000 through nginx, does everything work normally?
  2. Does the timeout only occur with gpt-4 + non-streaming? That is, does everything else work fine?
  3. I suggest testing the v1/chat/completions endpoint directly with requests.
  4. Does https://api.openai-forward.com/v1/chat/completions show the same kind of problem?
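Point 3 above can be sketched as a direct streaming test of the endpoint with requests. The base URL, key, and model below are placeholders for the reader's own deployment:

```python
import json

def parse_sse_line(line: str):
    """Parse one Server-Sent-Events line from a chat completions stream.

    Returns the decoded JSON payload, the string "DONE" for the terminal
    sentinel, or None for blank/non-data lines.
    """
    if not line.startswith("data:"):
        return None
    payload = line[len("data:"):].strip()
    if payload == "[DONE]":
        return "DONE"
    return json.loads(payload)

def stream_chat(base_url: str, api_key: str, prompt: str):
    """Send a streaming chat request and yield parsed chunks."""
    import requests  # lazy import so the parser above works without requests installed
    resp = requests.post(
        f"{base_url}/v1/chat/completions",
        headers={"Authorization": f"Bearer {api_key}"},
        json={
            "model": "gpt-4-1106-preview",
            "messages": [{"role": "user", "content": prompt}],
            "stream": True,
        },
        stream=True,
        timeout=600,
    )
    resp.raise_for_status()
    for raw in resp.iter_lines(decode_unicode=True):
        chunk = parse_sse_line(raw or "")
        if chunk not in (None, "DONE"):
            yield chunk

if __name__ == "__main__":
    # Placeholder URL/key; replace with your own deployment.
    for chunk in stream_chat("http://127.0.0.1:8000", "sk-xxx", "hello"):
        print(chunk["choices"][0]["delta"].get("content", ""), end="", flush=True)
```

If the RemoteProtocolError reproduces here too, that rules out the openai/langchain client layers.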

XianYue0125 commented on August 17, 2024

It doesn't work even without nginx forwarding port 8000.
With gpt-4 + streaming it frequently times out; typically around 3,000 tokens are sent and about 1,500 tokens come back.
It isn't a langchain problem: the same thing happens when calling the official openai client directly through the proxy.

With gpt-4 + streaming, the error is usually "peer closed connection without sending complete message body".
With gpt-4 + non-streaming, the response is 504 Gateway Time-out from nginx/1.18.0 (Ubuntu).

Also, running the same GPT code through a different PHP-based forwarding service shows no problems.

XianYue0125 commented on August 17, 2024

I've also tried accessing it by IP address and port (plain http, xxx.xxx.xxx.xxx:8000/v1), and the problem persists.

XianYue0125 commented on August 17, 2024

# OpenAI API
from openai import OpenAI

client = OpenAI(api_key='', base_url='')
completion = client.chat.completions.create(
    model="gpt-4-1106-preview",
    # model="gpt-3.5-turbo",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": prompt},
    ],
    stream=True,
)

# langchain API
from langchain_openai import ChatOpenAI
from langchain.callbacks.streaming_stdout import StreamingStdOutCallbackHandler

chat_model = ChatOpenAI(
    model='gpt-4-1106-preview',
    # model="gpt-3.5-turbo",
    openai_api_key='',
    openai_api_base='',
    streaming=True,
    callbacks=[StreamingStdOutCallbackHandler()],
    # temperature=0.2
)
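One way to narrow down where the stream dies is to wrap the iterator so any mid-stream error reports how many chunks arrived first. A generic sketch that works with any iterable of chunks, including the `completion` object above:

```python
def count_stream(stream):
    """Yield chunks from `stream`, re-raising any mid-stream error with a
    note saying how many chunks were received before it died."""
    n = 0
    try:
        for chunk in stream:
            n += 1
            yield chunk
    except Exception as exc:
        raise RuntimeError(f"stream died after {n} chunks") from exc

# Hypothetical usage with the openai client above:
#   for chunk in count_stream(completion):
#       ...
```

A count that varies run to run points at a connection being cut mid-body rather than a malformed request.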

KenyonY commented on August 17, 2024

Could you test setting base_url to

https://api.openai-forward.com
or
https://render.openai-forward.com/

and check whether the same problem occurs?

XianYue0125 commented on August 17, 2024

I've tried using a local VPN without setting any base_url, going straight to the official endpoint, and there was no problem.

XianYue0125 commented on August 17, 2024

I don't have an official OpenAI key anymore, so I can't test that.

My current request flow: traffic first hits the cloud server, i.e. the deployed openai-forward port; from there it is forwarded locally to another port, where the key is mapped to an Azure key; that second service converts the OpenAI request into an Azure OpenAI request and sends it straight to Azure.

All the errors occur on the openai-forward side, before any data is forwarded to the other port. The Azure conversion is handled by the GitHub project azure-openai-proxy, running as a prebuilt container.

XianYue0125 commented on August 17, 2024

[screenshot]
Sometimes the response cuts off partway through.

XianYue0125 commented on August 17, 2024

[screenshot]
I found one problem: my old .env file uses LOG_OPENAI while the new version expects OPENAI_LOG, so setting logging to true had no effect. After fixing that, the logs are printed, and the error above shows up occasionally.
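A mismatch like LOG_OPENAI vs OPENAI_LOG is easy to catch with a quick sanity check of the .env file. A small sketch; the rename table mirrors the one change described above and would need extending for other renamed keys:

```python
# Flag deprecated keys in .env content (old name -> new name).
RENAMED_KEYS = {"LOG_OPENAI": "OPENAI_LOG"}  # rename described in this thread

def find_stale_keys(env_text: str):
    """Return (old, new) pairs for deprecated keys found in env_text."""
    stale = []
    for line in env_text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key = line.split("=", 1)[0].strip()
        if key in RENAMED_KEYS:
            stale.append((key, RENAMED_KEYS[key]))
    return stale
```

Running `find_stale_keys(open(".env").read())` after an upgrade would flag any silently ignored settings.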

XianYue0125 commented on August 17, 2024
      │    └ <class 'functools.partial'>
      └ <function StreamingResponse.__call__.<locals>.wrap at 0x7fdcf0347b50>

File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/starlette/responses.py", line 257, in wrap
await func()
└ functools.partial(<bound method StreamingResponse.listen_for_disconnect of <starlette.middleware.base._StreamingResponse obje...
File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/starlette/responses.py", line 234, in listen_for_disconnect
message = await receive()
└ <bound method _CachedRequest.wrapped_receive of <starlette.middleware.base._CachedRequest object at 0x7fdcf0342e00>>
File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/starlette/middleware/base.py", line 52, in wrapped_receive
msg = await self.receive()
│ └ <property object at 0x7fdd039bd120>
└ <starlette.middleware.base._CachedRequest object at 0x7fdcf0342e00>
File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/uvicorn/protocols/http/h11_impl.py", line 534, in receive
await self.message_event.wait()
│ │ └ <function Event.wait at 0x7fdd05097910>
│ └ <asyncio.locks.Event object at 0x7fdcf0341c30 [unset]>
└ <uvicorn.protocols.http.h11_impl.RequestResponseCycle object at 0x7fdcf0342fb0>
File "/usr/lib/python3.10/asyncio/locks.py", line 214, in wait
await fut

asyncio.exceptions.CancelledError: Cancelled by cancel scope 7fdcf03a2b00

During handling of the above exception, another exception occurred:

  • Exception Group Traceback (most recent call last):
    |
    | File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/starlette/middleware/base.py", line 192, in call
    | await response(scope, wrapped_receive, send)
    | │ │ │ └ <function ServerErrorMiddleware.call.._send at 0x7fdcf1105f30>
    | │ │ └ <bound method _CachedRequest.wrapped_receive of <starlette.middleware.base._CachedRequest object at 0x7fdcf0342e00>>
    | │ └ {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.3'}, 'http_version': '1.0', 'server': ('127.0.0.1', 8000), 'cl...
    | └ <starlette.middleware.base._StreamingResponse object at 0x7fdcf03a1750>
    | File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/starlette/responses.py", line 254, in call
    | async with anyio.create_task_group() as task_group:
    | │ │ └ <anyio._backends._asyncio.TaskGroup object at 0x7fdcf03a3190>
    | │ └ <function create_task_group at 0x7fdd03d71990>
    | └ <module 'anyio' from '/var/www/openai-forward/myenv/lib/python3.10/site-packages/anyio/init.py'>
    | File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 678, in aexit
    | raise BaseExceptionGroup(
    | └ <class 'exceptiongroup.BaseExceptionGroup'>
    |
    | exceptiongroup.ExceptionGroup: unhandled errors in a TaskGroup (1 sub-exception)
    +-+---------------- 1 ----------------
    | Traceback (most recent call last):
    |
    | File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/starlette/responses.py", line 261, in call
    | await wrap(partial(self.listen_for_disconnect, receive))
    | │ │ │ │ └ <function BaseHTTPMiddleware.call..call_next..receive_or_disconnect at 0x7fdcf0345b40>
    | │ │ │ └ <function StreamingResponse.listen_for_disconnect at 0x7fdd03bc7f40>
    | │ │ └ <starlette.responses.StreamingResponse object at 0x7fdcf03a2560>
    | │ └ <class 'functools.partial'>
    | └ <function StreamingResponse.call..wrap at 0x7fdcf0347640>
    | File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/starlette/responses.py", line 257, in wrap
    | await func()
    | └ functools.partial(<bound method StreamingResponse.listen_for_disconnect of <starlette.responses.StreamingResponse object at 0...
    | File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/starlette/responses.py", line 234, in listen_for_disconnect
    | message = await receive()
    | └ <function BaseHTTPMiddleware.call..call_next..receive_or_disconnect at 0x7fdcf0345b40>
    | File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/starlette/middleware/base.py", line 128, in receive_or_disconnect
    | message = await wrap(wrapped_receive)
    | │ └ <bound method _CachedRequest.wrapped_receive of <starlette.middleware.base._CachedRequest object at 0x7fdcf0342e00>>
    | └ <function BaseHTTPMiddleware.call..call_next..receive_or_disconnect..wrap at 0x7fdcf0345c60>
    | File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/starlette/middleware/base.py", line 123, in wrap
    | result = await func()
    | └ <bound method _CachedRequest.wrapped_receive of <starlette.middleware.base._CachedRequest object at 0x7fdcf0342e00>>
    | File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/starlette/middleware/base.py", line 52, in wrapped_receive
    | msg = await self.receive()
    | │ └ <property object at 0x7fdd039bd120>
    | └ <starlette.middleware.base._CachedRequest object at 0x7fdcf0342e00>
    | File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/uvicorn/protocols/http/h11_impl.py", line 534, in receive
    | await self.message_event.wait()
    | │ │ └ <function Event.wait at 0x7fdd05097910>
    | │ └ <asyncio.locks.Event object at 0x7fdcf0341c30 [unset]>
    | └ <uvicorn.protocols.http.h11_impl.RequestResponseCycle object at 0x7fdcf0342fb0>
    | File "/usr/lib/python3.10/asyncio/locks.py", line 214, in wait
    | await fut
    | └
    |
    | asyncio.exceptions.CancelledError: Cancelled by cancel scope 7fdcf03a1450
    |
    |
    | During handling of the above exception, another exception occurred:
    |
    |
    | Exception Group Traceback (most recent call last):
    |
    | File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/starlette/responses.py", line 257, in wrap
    | await func()
    | └ functools.partial(<bound method _StreamingResponse.stream_response of <starlette.middleware.base._StreamingResponse object at...
    | File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/starlette/middleware/base.py", line 217, in stream_response
    | return await super().stream_response(send)
    | └ <function ServerErrorMiddleware.call.._send at 0x7fdcf1105f30>
    | File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/starlette/responses.py", line 246, in stream_response
    | async for chunk in self.body_iterator:
    | │ │ └ <async_generator object BaseHTTPMiddleware.call..call_next..body_stream at 0x7fdcf03319c0>
    | │ └ <starlette.middleware.base._StreamingResponse object at 0x7fdcf03a1750>
    | └ b'\n'
    | File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/starlette/middleware/base.py", line 181, in body_stream
    | raise app_exc
    | └ ExceptionGroup('unhandled errors in a TaskGroup', [TypeError("unhashable type: 'list'")])
    | File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/starlette/middleware/base.py", line 151, in coro
    | await self.app(scope, receive_or_disconnect, send_no_error)
    | │ │ │ │ └ <function BaseHTTPMiddleware.call..call_next..send_no_error at 0x7fdcf0347400>
    | │ │ │ └ <function BaseHTTPMiddleware.call..call_next..receive_or_disconnect at 0x7fdcf0345b40>
    | │ │ └ {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.3'}, 'http_version': '1.0', 'server': ('127.0.0.1', 8000), 'cl...
    | │ └ <starlette.middleware.cors.CORSMiddleware object at 0x7fdcf10e9780>
    | └ <starlette.middleware.base.BaseHTTPMiddleware object at 0x7fdcf10e8a00>
    | File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/starlette/middleware/cors.py", line 83, in call
    | await self.app(scope, receive, send)
    | │ │ │ │ └ <function BaseHTTPMiddleware.call..call_next..send_no_error at 0x7fdcf0347400>
    | │ │ │ └ <function BaseHTTPMiddleware.call..call_next..receive_or_disconnect at 0x7fdcf0345b40>
    | │ │ └ {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.3'}, 'http_version': '1.0', 'server': ('127.0.0.1', 8000), 'cl...
    | │ └ <starlette.middleware.exceptions.ExceptionMiddleware object at 0x7fdcf10e8c10>
    | └ <starlette.middleware.cors.CORSMiddleware object at 0x7fdcf10e9780>
    | File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/starlette/middleware/exceptions.py", line 62, in call
    | await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
    | │ │ │ │ │ │ └ <function BaseHTTPMiddleware.call..call_next..send_no_error at 0x7fdcf0347400>
    | │ │ │ │ │ └ <function BaseHTTPMiddleware.call..call_next..receive_or_disconnect at 0x7fdcf0345b40>
    | │ │ │ │ └ {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.3'}, 'http_version': '1.0', 'server': ('127.0.0.1', 8000), 'cl...
    | │ │ │ └ <starlette.requests.Request object at 0x7fdcf03a1960>
    | │ │ └ <fastapi.routing.APIRouter object at 0x7fdcf10e8d60>
    | │ └ <starlette.middleware.exceptions.ExceptionMiddleware object at 0x7fdcf10e8c10>
    | └ <function wrap_app_handling_exceptions at 0x7fdd03a64b80>
    | File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
    | raise exc
    | File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
    | await app(scope, receive, sender)
    | │ │ │ └ <function wrap_app_handling_exceptions..wrapped_app..sender at 0x7fdcf03475b0>
    | │ │ └ <function BaseHTTPMiddleware.call..call_next..receive_or_disconnect at 0x7fdcf0345b40>
    | │ └ {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.3'}, 'http_version': '1.0', 'server': ('127.0.0.1', 8000), 'cl...
    | └ <fastapi.routing.APIRouter object at 0x7fdcf10e8d60>
    | File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/starlette/routing.py", line 762, in call
    | await self.middleware_stack(scope, receive, send)
    | │ │ │ │ └ <function wrap_app_handling_exceptions..wrapped_app..sender at 0x7fdcf03475b0>
    | │ │ │ └ <function BaseHTTPMiddleware.call..call_next..receive_or_disconnect at 0x7fdcf0345b40>
    | │ │ └ {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.3'}, 'http_version': '1.0', 'server': ('127.0.0.1', 8000), 'cl...
    | │ └ <bound method Router.app of <fastapi.routing.APIRouter object at 0x7fdcf10e8d60>>
    | └ <fastapi.routing.APIRouter object at 0x7fdcf10e8d60>
    | File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/starlette/routing.py", line 782, in app
    | await route.handle(scope, receive, send)
    | │ │ │ │ └ <function wrap_app_handling_exceptions..wrapped_app..sender at 0x7fdcf03475b0>
    | │ │ │ └ <function BaseHTTPMiddleware.call..call_next..receive_or_disconnect at 0x7fdcf0345b40>
    | │ │ └ {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.3'}, 'http_version': '1.0', 'server': ('127.0.0.1', 8000), 'cl...
    | │ └ <function Route.handle at 0x7fdd03a66200>
    | └ Route(path='/{api_path:path}', name='reverse_proxy', methods=['DELETE', 'GET', 'HEAD', 'OPTIONS', 'PATCH', 'POST', 'PUT', 'TR...
    | File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/starlette/routing.py", line 297, in handle
    | await self.app(scope, receive, send)
    | │ │ │ │ └ <function wrap_app_handling_exceptions..wrapped_app..sender at 0x7fdcf03475b0>
    | │ │ │ └ <function BaseHTTPMiddleware.call..call_next..receive_or_disconnect at 0x7fdcf0345b40>
    | │ │ └ {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.3'}, 'http_version': '1.0', 'server': ('127.0.0.1', 8000), 'cl...
    | │ └ <function request_response..app at 0x7fdcf11057e0>
    | └ Route(path='/{api_path:path}', name='reverse_proxy', methods=['DELETE', 'GET', 'HEAD', 'OPTIONS', 'PATCH', 'POST', 'PUT', 'TR...
    | File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/starlette/routing.py", line 77, in app
    | await wrap_app_handling_exceptions(app, request)(scope, receive, send)
    | │ │ │ │ │ └ <function wrap_app_handling_exceptions..wrapped_app..sender at 0x7fdcf03475b0>
    | │ │ │ │ └ <function BaseHTTPMiddleware.call..call_next..receive_or_disconnect at 0x7fdcf0345b40>
    | │ │ │ └ {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.3'}, 'http_version': '1.0', 'server': ('127.0.0.1', 8000), 'cl...
    | │ │ └ <starlette.requests.Request object at 0x7fdcf03a2290>
    | │ └ <function request_response..app..app at 0x7fdcf0347130>
    | └ <function wrap_app_handling_exceptions at 0x7fdd03a64b80>
    | File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
    | raise exc
    | File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
    | await app(scope, receive, sender)
    | │ │ │ └ <function wrap_app_handling_exceptions..wrapped_app..sender at 0x7fdcf0347520>
    | │ │ └ <function BaseHTTPMiddleware.call..call_next..receive_or_disconnect at 0x7fdcf0345b40>
    | │ └ {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.3'}, 'http_version': '1.0', 'server': ('127.0.0.1', 8000), 'cl...
    | └ <function request_response..app..app at 0x7fdcf0347130>
    | File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/starlette/routing.py", line 75, in app
    | await response(scope, receive, send)
    | │ │ │ └ <function wrap_app_handling_exceptions..wrapped_app..sender at 0x7fdcf0347520>
    | │ │ └ <function BaseHTTPMiddleware.call..call_next..receive_or_disconnect at 0x7fdcf0345b40>
    | │ └ {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.3'}, 'http_version': '1.0', 'server': ('127.0.0.1', 8000), 'cl...
    | └ <starlette.responses.StreamingResponse object at 0x7fdcf03a2560>
    | File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/starlette/responses.py", line 254, in call
    | async with anyio.create_task_group() as task_group:
    | │ │ └ <anyio._backends._asyncio.TaskGroup object at 0x7fdcf03a0220>
    | │ └ <function create_task_group at 0x7fdd03d71990>
    | └ <module 'anyio' from '/var/www/openai-forward/myenv/lib/python3.10/site-packages/anyio/init.py'>
    | File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 678, in aexit
    | raise BaseExceptionGroup(
    | └ <class 'exceptiongroup.BaseExceptionGroup'>
    |
    | exceptiongroup.ExceptionGroup: unhandled errors in a TaskGroup (1 sub-exception)
    +-+---------------- 1 ----------------
    | Traceback (most recent call last):
    |
    | File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/uvicorn/protocols/http/h11_impl.py", line 404, in run_asgi
    | result = await app( # type: ignore[func-returns-value]
    | └ <uvicorn.middleware.proxy_headers.ProxyHeadersMiddleware object at 0x7fdd0494a2c0>
    | File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/uvicorn/middleware/proxy_headers.py", line 84, in call
    | return await self.app(scope, receive, send)
    | │ │ │ │ └ <bound method RequestResponseCycle.send of <uvicorn.protocols.http.h11_impl.RequestResponseCycle object at 0x7fdcf0342fb0>>
    | │ │ │ └ <bound method RequestResponseCycle.receive of <uvicorn.protocols.http.h11_impl.RequestResponseCycle object at 0x7fdcf0342fb0>>
    | │ │ └ {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.3'}, 'http_version': '1.0', 'server': ('127.0.0.1', 8000), 'cl...
    | │ └ <fastapi.applications.FastAPI object at 0x7fdcf10ebc70>
    | └ <uvicorn.middleware.proxy_headers.ProxyHeadersMiddleware object at 0x7fdd0494a2c0>
    | File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/fastapi/applications.py", line 1054, in call
    | await super().call(scope, receive, send)
    | │ │ └ <bound method RequestResponseCycle.send of <uvicorn.protocols.http.h11_impl.RequestResponseCycle object at 0x7fdcf0342fb0>>
    | │ └ <bound method RequestResponseCycle.receive of <uvicorn.protocols.http.h11_impl.RequestResponseCycle object at 0x7fdcf0342fb0>>
    | └ {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.3'}, 'http_version': '1.0', 'server': ('127.0.0.1', 8000), 'cl...
    | File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/starlette/applications.py", line 123, in call
    | await self.middleware_stack(scope, receive, send)
    | │ │ │ │ └ <bound method RequestResponseCycle.send of <uvicorn.protocols.http.h11_impl.RequestResponseCycle object at 0x7fdcf0342fb0>>
    | │ │ │ └ <bound method RequestResponseCycle.receive of <uvicorn.protocols.http.h11_impl.RequestResponseCycle object at 0x7fdcf0342fb0>>
    | │ │ └ {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.3'}, 'http_version': '1.0', 'server': ('127.0.0.1', 8000), 'cl...
    | │ └ <starlette.middleware.errors.ServerErrorMiddleware object at 0x7fdcf11201f0>
    | └ <fastapi.applications.FastAPI object at 0x7fdcf10ebc70>
    | File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/starlette/middleware/errors.py", line 186, in call
    | raise exc
    | File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/starlette/middleware/errors.py", line 164, in call
    | await self.app(scope, receive, _send)
    | │ │ │ │ └ <function ServerErrorMiddleware.call.._send at 0x7fdcf1105f30>
    | │ │ │ └ <bound method RequestResponseCycle.receive of <uvicorn.protocols.http.h11_impl.RequestResponseCycle object at 0x7fdcf0342fb0>>
    | │ │ └ {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.3'}, 'http_version': '1.0', 'server': ('127.0.0.1', 8000), 'cl...
    | │ └ <starlette.middleware.base.BaseHTTPMiddleware object at 0x7fdcf10e8a00>
    | └ <starlette.middleware.errors.ServerErrorMiddleware object at 0x7fdcf11201f0>
    | File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/starlette/middleware/base.py", line 189, in call
    | with collapse_excgroups():
    | └ <function collapse_excgroups at 0x7fdd0399ab90>
    | File "/usr/lib/python3.10/contextlib.py", line 153, in exit
    | self.gen.throw(typ, value, traceback)
    | │ │ │ │ │ └ <traceback object at 0x7fdcf03c0780>
    | │ │ │ │ └ ExceptionGroup('unhandled errors in a TaskGroup', [ExceptionGroup('unhandled errors in a TaskGroup', [ExceptionGroup('unhandl...
    | │ │ │ └ <class 'exceptiongroup.ExceptionGroup'>
    | │ │ └ <method 'throw' of 'generator' objects>
    | │ └ <generator object collapse_excgroups at 0x7fdcf1110970>
    | └ <contextlib._GeneratorContextManager object at 0x7fdcf0343e80>
    | File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/starlette/_utils.py", line 91, in collapse_excgroups
    | raise exc
    | File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/starlette/responses.py", line 257, in wrap
    | await func()
    | └ functools.partial(<bound method StreamingResponse.stream_response of <starlette.responses.StreamingResponse object at 0x7fdcf...
    | File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/starlette/responses.py", line 246, in stream_response
    | async for chunk in self.body_iterator:
    | │ │ └ <async_generator object OpenaiForward.aiter_bytes at 0x7fdcf0330dc0>
    | │ └ <starlette.responses.StreamingResponse object at 0x7fdcf03a2560>
    | └ b'\n'
    | File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/openai_forward/decorators.py", line 157, in wrapper
    | async for value in async_gen:
    | │ └ <async_generator object OpenaiForward.aiter_bytes at 0x7fdcf0331940>
    | └ b'\n'
    | File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/openai_forward/forward/core.py", line 422, in aiter_bytes
    | cache_response(cache_key, target_info, route_path, chunk_list)
    | │ │ │ │ └ [b'data: {"id":"","object":"","created":0,"model":"","prompt_filter_results":[{"prompt_index":0,"content_filter_results":{"ha...
    | │ │ │ └ '/v1/chat/completions'
    | │ │ └ {}
    | │ └ b'\x98\x01\x92\x82\xa4role\xa6system\xa7content\xbcYou are a helpful assistant.\x82\xa4role\xa4user\xa7content\xda\x1b\x9b\xe...
    | └ <function cache_response at 0x7fdcf1070dc0>
    | File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/openai_forward/cache/init.py", line 58, in cache_response
    | cache_generic_response(cache_key, route_path, chunk_list)
    | │ │ │ └ [b'data: {"id":"","object":"","created":0,"model":"","prompt_filter_results":[{"prompt_index":0,"content_filter_results":{"ha...
    | │ │ └ '/v1/chat/completions'
    | │ └ b'\x98\x01\x92\x82\xa4role\xa6system\xa7content\xbcYou are a helpful assistant.\x82\xa4role\xa4user\xa7content\xda\x1b\x9b\xe...
    | └ <function cache_generic_response at 0x7fdcf1070e50>
    | File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/openai_forward/cache/init.py", line 62, in cache_generic_response
    | if cache_key and route_path in CACHE_ROUTE_SET:
    | │ │ └ {'/v1/chat/completions', '/v1/embeddings'}
    | │ └ [b'data: {"id":"","object":"","created":0,"model":"","prompt_filter_results":[{"prompt_index":0,"content_filter_results":{"ha...
    | └ b'\x98\x01\x92\x82\xa4role\xa6system\xa7content\xbcYou are a helpful assistant.\x82\xa4role\xa4user\xa7content\xda\x1b\x9b\xe...
    |
    | TypeError: unhashable type: 'list'
    +------------------------------------

During handling of the above exception, another exception occurred:

  • Exception Group Traceback (most recent call last):
    |
    | File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/starlette/_utils.py", line 85, in collapse_excgroups
    | yield
    | File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/starlette/middleware/base.py", line 190, in call
    | async with anyio.create_task_group() as task_group:
    | │ │ └ <anyio._backends._asyncio.TaskGroup object at 0x7fdcf0343f40>
    | │ └ <function create_task_group at 0x7fdd03d71990>
    | └ <module 'anyio' from '/var/www/openai-forward/myenv/lib/python3.10/site-packages/anyio/init.py'>
    | File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 678, in aexit
    | raise BaseExceptionGroup(
    | └ <class 'exceptiongroup.BaseExceptionGroup'>
    |
    | exceptiongroup.ExceptionGroup: unhandled errors in a TaskGroup (1 sub-exception)
    +-+---------------- 1 ----------------
    | Exception Group Traceback (most recent call last):
    |
    | File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/starlette/middleware/base.py", line 192, in call
    | await response(scope, wrapped_receive, send)
    | │ │ │ └ <function ServerErrorMiddleware.call.._send at 0x7fdcf1105f30>
    | │ │ └ <bound method _CachedRequest.wrapped_receive of <starlette.middleware.base._CachedRequest object at 0x7fdcf0342e00>>
    | │ └ {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.3'}, 'http_version': '1.0', 'server': ('127.0.0.1', 8000), 'cl...
    | └ <starlette.middleware.base._StreamingResponse object at 0x7fdcf03a1750>
    | File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/starlette/responses.py", line 254, in call
    | async with anyio.create_task_group() as task_group:
    | │ │ └ <anyio._backends._asyncio.TaskGroup object at 0x7fdcf03a3190>
    | │ └ <function create_task_group at 0x7fdd03d71990>
    | └ <module 'anyio' from '/var/www/openai-forward/myenv/lib/python3.10/site-packages/anyio/init.py'>
    | File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 678, in aexit
    | raise BaseExceptionGroup(
    | └ <class 'exceptiongroup.BaseExceptionGroup'>
    |
    | exceptiongroup.ExceptionGroup: unhandled errors in a TaskGroup (1 sub-exception)
    +-+---------------- 1 ----------------
    | Exception Group Traceback (most recent call last):
    |
    | File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/starlette/responses.py", line 257, in wrap
    | await func()
    | └ functools.partial(<bound method _StreamingResponse.stream_response of <starlette.middleware.base._StreamingResponse object at...
    | File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/starlette/middleware/base.py", line 217, in stream_response
    | return await super().stream_response(send)
    | └ <function ServerErrorMiddleware.call.._send at 0x7fdcf1105f30>
    | File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/starlette/responses.py", line 246, in stream_response
    | async for chunk in self.body_iterator:
    | │ │ └ <async_generator object BaseHTTPMiddleware.call..call_next..body_stream at 0x7fdcf03319c0>
    | │ └ <starlette.middleware.base._StreamingResponse object at 0x7fdcf03a1750>
    | └ b'\n'
    | File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/starlette/middleware/base.py", line 181, in body_stream
    | raise app_exc
    | └ ExceptionGroup('unhandled errors in a TaskGroup', [TypeError("unhashable type: 'list'")])
    | File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/starlette/middleware/base.py", line 151, in coro
    | await self.app(scope, receive_or_disconnect, send_no_error)
    | │ │ │ │ └ <function BaseHTTPMiddleware.call..call_next..send_no_error at 0x7fdcf0347400>
    | │ │ │ └ <function BaseHTTPMiddleware.call..call_next..receive_or_disconnect at 0x7fdcf0345b40>
    | │ │ └ {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.3'}, 'http_version': '1.0', 'server': ('127.0.0.1', 8000), 'cl...
    | │ └ <starlette.middleware.cors.CORSMiddleware object at 0x7fdcf10e9780>
    | └ <starlette.middleware.base.BaseHTTPMiddleware object at 0x7fdcf10e8a00>
    | File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/starlette/middleware/cors.py", line 83, in call
    | await self.app(scope, receive, send)
    | │ │ │ │ └ <function BaseHTTPMiddleware.call..call_next..send_no_error at 0x7fdcf0347400>
    | │ │ │ └ <function BaseHTTPMiddleware.call..call_next..receive_or_disconnect at 0x7fdcf0345b40>
    | │ │ └ {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.3'}, 'http_version': '1.0', 'server': ('127.0.0.1', 8000), 'cl...
    | │ └ <starlette.middleware.exceptions.ExceptionMiddleware object at 0x7fdcf10e8c10>
    | └ <starlette.middleware.cors.CORSMiddleware object at 0x7fdcf10e9780>
    | File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/starlette/middleware/exceptions.py", line 62, in call
    | await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
    | │ │ │ │ │ │ └ <function BaseHTTPMiddleware.call..call_next..send_no_error at 0x7fdcf0347400>
    | │ │ │ │ │ └ <function BaseHTTPMiddleware.call..call_next..receive_or_disconnect at 0x7fdcf0345b40>
    | │ │ │ │ └ {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.3'}, 'http_version': '1.0', 'server': ('127.0.0.1', 8000), 'cl...
    | │ │ │ └ <starlette.requests.Request object at 0x7fdcf03a1960>
    | │ │ └ <fastapi.routing.APIRouter object at 0x7fdcf10e8d60>
    | │ └ <starlette.middleware.exceptions.ExceptionMiddleware object at 0x7fdcf10e8c10>
    | └ <function wrap_app_handling_exceptions at 0x7fdd03a64b80>
    | File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
    | raise exc
    | File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
    | await app(scope, receive, sender)
    | │ │ │ └ <function wrap_app_handling_exceptions..wrapped_app..sender at 0x7fdcf03475b0>
    | │ │ └ <function BaseHTTPMiddleware.call..call_next..receive_or_disconnect at 0x7fdcf0345b40>
    | │ └ {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.3'}, 'http_version': '1.0', 'server': ('127.0.0.1', 8000), 'cl...
    | └ <fastapi.routing.APIRouter object at 0x7fdcf10e8d60>
    | File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/starlette/routing.py", line 762, in call
    | await self.middleware_stack(scope, receive, send)
    | │ │ │ │ └ <function wrap_app_handling_exceptions..wrapped_app..sender at 0x7fdcf03475b0>
    | │ │ │ └ <function BaseHTTPMiddleware.call..call_next..receive_or_disconnect at 0x7fdcf0345b40>
    | │ │ └ {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.3'}, 'http_version': '1.0', 'server': ('127.0.0.1', 8000), 'cl...
    | │ └ <bound method Router.app of <fastapi.routing.APIRouter object at 0x7fdcf10e8d60>>
    | └ <fastapi.routing.APIRouter object at 0x7fdcf10e8d60>
    | File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/starlette/routing.py", line 782, in app
    | await route.handle(scope, receive, send)
    | │ │ │ │ └ <function wrap_app_handling_exceptions..wrapped_app..sender at 0x7fdcf03475b0>
    | │ │ │ └ <function BaseHTTPMiddleware.call..call_next..receive_or_disconnect at 0x7fdcf0345b40>
    | │ │ └ {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.3'}, 'http_version': '1.0', 'server': ('127.0.0.1', 8000), 'cl...
    | │ └ <function Route.handle at 0x7fdd03a66200>
    | └ Route(path='/{api_path:path}', name='reverse_proxy', methods=['DELETE', 'GET', 'HEAD', 'OPTIONS', 'PATCH', 'POST', 'PUT', 'TR...
    | File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/starlette/routing.py", line 297, in handle
    | await self.app(scope, receive, send)
    | │ │ │ │ └ <function wrap_app_handling_exceptions..wrapped_app..sender at 0x7fdcf03475b0>
    | │ │ │ └ <function BaseHTTPMiddleware.call..call_next..receive_or_disconnect at 0x7fdcf0345b40>
    | │ │ └ {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.3'}, 'http_version': '1.0', 'server': ('127.0.0.1', 8000), 'cl...
    | │ └ <function request_response..app at 0x7fdcf11057e0>
    | └ Route(path='/{api_path:path}', name='reverse_proxy', methods=['DELETE', 'GET', 'HEAD', 'OPTIONS', 'PATCH', 'POST', 'PUT', 'TR...
    | File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/starlette/routing.py", line 77, in app
    | await wrap_app_handling_exceptions(app, request)(scope, receive, send)
    | │ │ │ │ │ └ <function wrap_app_handling_exceptions..wrapped_app..sender at 0x7fdcf03475b0>
    | │ │ │ │ └ <function BaseHTTPMiddleware.call..call_next..receive_or_disconnect at 0x7fdcf0345b40>
    | │ │ │ └ {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.3'}, 'http_version': '1.0', 'server': ('127.0.0.1', 8000), 'cl...
    | │ │ └ <starlette.requests.Request object at 0x7fdcf03a2290>
    | │ └ <function request_response..app..app at 0x7fdcf0347130>
    | └ <function wrap_app_handling_exceptions at 0x7fdd03a64b80>
    | File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
    | raise exc
    | File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
    | await app(scope, receive, sender)
    | │ │ │ └ <function wrap_app_handling_exceptions..wrapped_app..sender at 0x7fdcf0347520>
    | │ │ └ <function BaseHTTPMiddleware.call..call_next..receive_or_disconnect at 0x7fdcf0345b40>
    | │ └ {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.3'}, 'http_version': '1.0', 'server': ('127.0.0.1', 8000), 'cl...
    | └ <function request_response..app..app at 0x7fdcf0347130>
    | File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/starlette/routing.py", line 75, in app
    | await response(scope, receive, send)
    | │ │ │ └ <function wrap_app_handling_exceptions..wrapped_app..sender at 0x7fdcf0347520>
    | │ │ └ <function BaseHTTPMiddleware.call..call_next..receive_or_disconnect at 0x7fdcf0345b40>
    | │ └ {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.3'}, 'http_version': '1.0', 'server': ('127.0.0.1', 8000), 'cl...
    | └ <starlette.responses.StreamingResponse object at 0x7fdcf03a2560>
    | File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/starlette/responses.py", line 254, in __call__
    | async with anyio.create_task_group() as task_group:
    | │ │ └ <anyio._backends._asyncio.TaskGroup object at 0x7fdcf03a0220>
    | │ └ <function create_task_group at 0x7fdd03d71990>
    | └ <module 'anyio' from '/var/www/openai-forward/myenv/lib/python3.10/site-packages/anyio/__init__.py'>
    | File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 678, in __aexit__
    | raise BaseExceptionGroup(
    | └ <class 'exceptiongroup.BaseExceptionGroup'>
    |
    | exceptiongroup.ExceptionGroup: unhandled errors in a TaskGroup (1 sub-exception)
    +-+---------------- 1 ----------------
    | Traceback (most recent call last):
    |
    | File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/uvicorn/protocols/http/h11_impl.py", line 404, in run_asgi
    | result = await app( # type: ignore[func-returns-value]
    | └ <uvicorn.middleware.proxy_headers.ProxyHeadersMiddleware object at 0x7fdd0494a2c0>
    | File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/uvicorn/middleware/proxy_headers.py", line 84, in __call__
    | return await self.app(scope, receive, send)
    | │ │ │ │ └ <bound method RequestResponseCycle.send of <uvicorn.protocols.http.h11_impl.RequestResponseCycle object at 0x7fdcf0342fb0>>
    | │ │ │ └ <bound method RequestResponseCycle.receive of <uvicorn.protocols.http.h11_impl.RequestResponseCycle object at 0x7fdcf0342fb0>>
    | │ │ └ {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.3'}, 'http_version': '1.0', 'server': ('127.0.0.1', 8000), 'cl...
    | │ └ <fastapi.applications.FastAPI object at 0x7fdcf10ebc70>
    | └ <uvicorn.middleware.proxy_headers.ProxyHeadersMiddleware object at 0x7fdd0494a2c0>
    | File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/fastapi/applications.py", line 1054, in __call__
    | await super().__call__(scope, receive, send)
    | │ │ └ <bound method RequestResponseCycle.send of <uvicorn.protocols.http.h11_impl.RequestResponseCycle object at 0x7fdcf0342fb0>>
    | │ └ <bound method RequestResponseCycle.receive of <uvicorn.protocols.http.h11_impl.RequestResponseCycle object at 0x7fdcf0342fb0>>
    | └ {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.3'}, 'http_version': '1.0', 'server': ('127.0.0.1', 8000), 'cl...
    | File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/starlette/applications.py", line 123, in __call__
    | await self.middleware_stack(scope, receive, send)
    | │ │ │ │ └ <bound method RequestResponseCycle.send of <uvicorn.protocols.http.h11_impl.RequestResponseCycle object at 0x7fdcf0342fb0>>
    | │ │ │ └ <bound method RequestResponseCycle.receive of <uvicorn.protocols.http.h11_impl.RequestResponseCycle object at 0x7fdcf0342fb0>>
    | │ │ └ {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.3'}, 'http_version': '1.0', 'server': ('127.0.0.1', 8000), 'cl...
    | │ └ <starlette.middleware.errors.ServerErrorMiddleware object at 0x7fdcf11201f0>
    | └ <fastapi.applications.FastAPI object at 0x7fdcf10ebc70>
    | File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/starlette/middleware/errors.py", line 186, in __call__
    | raise exc
    | File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/starlette/middleware/errors.py", line 164, in __call__
    | await self.app(scope, receive, _send)
    | │ │ │ │ └ <function ServerErrorMiddleware.call.._send at 0x7fdcf1105f30>
    | │ │ │ └ <bound method RequestResponseCycle.receive of <uvicorn.protocols.http.h11_impl.RequestResponseCycle object at 0x7fdcf0342fb0>>
    | │ │ └ {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.3'}, 'http_version': '1.0', 'server': ('127.0.0.1', 8000), 'cl...
    | │ └ <starlette.middleware.base.BaseHTTPMiddleware object at 0x7fdcf10e8a00>
    | └ <starlette.middleware.errors.ServerErrorMiddleware object at 0x7fdcf11201f0>
    | File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/starlette/middleware/base.py", line 189, in __call__
    | with collapse_excgroups():
    | └ <function collapse_excgroups at 0x7fdd0399ab90>
    | File "/usr/lib/python3.10/contextlib.py", line 153, in __exit__
    | self.gen.throw(typ, value, traceback)
    | │ │ │ │ │ └ <traceback object at 0x7fdcf03c0780>
    | │ │ │ │ └ ExceptionGroup('unhandled errors in a TaskGroup', [ExceptionGroup('unhandled errors in a TaskGroup', [ExceptionGroup('unhandl...
    | │ │ │ └ <class 'exceptiongroup.ExceptionGroup'>
    | │ │ └ <method 'throw' of 'generator' objects>
    | │ └ <generator object collapse_excgroups at 0x7fdcf1110970>
    | └ <contextlib._GeneratorContextManager object at 0x7fdcf0343e80>
    | File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/starlette/_utils.py", line 91, in collapse_excgroups
    | raise exc
    | File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/starlette/responses.py", line 257, in wrap
    | await func()
    | └ functools.partial(<bound method StreamingResponse.stream_response of <starlette.responses.StreamingResponse object at 0x7fdcf...
    | File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/starlette/responses.py", line 246, in stream_response
    | async for chunk in self.body_iterator:
    | │ │ └ <async_generator object OpenaiForward.aiter_bytes at 0x7fdcf0330dc0>
    | │ └ <starlette.responses.StreamingResponse object at 0x7fdcf03a2560>
    | └ b'\n'
    | File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/openai_forward/decorators.py", line 157, in wrapper
    | async for value in async_gen:
    | │ └ <async_generator object OpenaiForward.aiter_bytes at 0x7fdcf0331940>
    | └ b'\n'
    | File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/openai_forward/forward/core.py", line 422, in aiter_bytes
    | cache_response(cache_key, target_info, route_path, chunk_list)
    | │ │ │ │ └ [b'data: {"id":"","object":"","created":0,"model":"","prompt_filter_results":[{"prompt_index":0,"content_filter_results":{"ha...
    | │ │ │ └ '/v1/chat/completions'
    | │ │ └ {}
    | │ └ b'\x98\x01\x92\x82\xa4role\xa6system\xa7content\xbcYou are a helpful assistant.\x82\xa4role\xa4user\xa7content\xda\x1b\x9b\xe...
    | └ <function cache_response at 0x7fdcf1070dc0>
    | File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/openai_forward/cache/__init__.py", line 58, in cache_response
    | cache_generic_response(cache_key, route_path, chunk_list)
    | │ │ │ └ [b'data: {"id":"","object":"","created":0,"model":"","prompt_filter_results":[{"prompt_index":0,"content_filter_results":{"ha...
    | │ │ └ '/v1/chat/completions'
    | │ └ b'\x98\x01\x92\x82\xa4role\xa6system\xa7content\xbcYou are a helpful assistant.\x82\xa4role\xa4user\xa7content\xda\x1b\x9b\xe...
    | └ <function cache_generic_response at 0x7fdcf1070e50>
    | File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/openai_forward/cache/__init__.py", line 62, in cache_generic_response
    | if cache_key and route_path in CACHE_ROUTE_SET:
    | │ │ └ {'/v1/chat/completions', '/v1/embeddings'}
    | │ └ [b'data: {"id":"","object":"","created":0,"model":"","prompt_filter_results":[{"prompt_index":0,"content_filter_results":{"ha...
    | └ b'\x98\x01\x92\x82\xa4role\xa6system\xa7content\xbcYou are a helpful assistant.\x82\xa4role\xa4user\xa7content\xda\x1b\x9b\xe...
    |
    | TypeError: unhashable type: 'list'
    +------------------------------------

During handling of the above exception, another exception occurred:

Traceback (most recent call last):

File "/var/www/openai-forward/myenv/bin/aifd", line 8, in <module>
sys.exit(main())
│ │ └ <function main at 0x7fdd055fd870>
│ └
└ <module 'sys' (built-in)>
File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/openai_forward/__main__.py", line 177, in main
fire.Fire(Cli)
│ │ └ <class 'openai_forward.main.Cli'>
│ └ <function Fire at 0x7fdd05391d80>
└ <module 'fire' from '/var/www/openai-forward/myenv/lib/python3.10/site-packages/fire/__init__.py'>
File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/fire/core.py", line 141, in Fire
component_trace = _Fire(component, args, parsed_flag_args, context, name)
│ │ │ │ │ └ 'aifd'
│ │ │ │ └ {}
│ │ │ └ Namespace(verbose=False, interactive=False, separator='-', completion=None, help=False, trace=False)
│ │ └ ['run']
│ └ <class 'openai_forward.main.Cli'>
└ <function _Fire at 0x7fdd04a96e60>
File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/fire/core.py", line 475, in _Fire
component, remaining_args = _CallAndUpdateTrace(
│ └ <function _CallAndUpdateTrace at 0x7fdd04a96f80>
└ <bound method Cli.run of <openai_forward.main.Cli object at 0x7fdd0494a530>>
File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/fire/core.py", line 691, in _CallAndUpdateTrace
component = fn(*varargs, **kwargs)
│ │ └ {}
│ └ [8000, 1, False, 8001]
└ <bound method Cli.run of <openai_forward.main.Cli object at 0x7fdd0494a530>>
File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/openai_forward/__main__.py", line 35, in run
uvicorn.run(
│ └ <function run at 0x7fdd04a3dc60>
└ <module 'uvicorn' from '/var/www/openai-forward/myenv/lib/python3.10/site-packages/uvicorn/__init__.py'>
File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/uvicorn/main.py", line 587, in run
server.run()
│ └ <function Server.run at 0x7fdd04a3d6c0>
└ <uvicorn.server.Server object at 0x7fdd049496f0>
File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/uvicorn/server.py", line 62, in run
return asyncio.run(self.serve(sockets=sockets))
│ │ │ │ └ None
│ │ │ └ <function Server.serve at 0x7fdd04a3d750>
│ │ └ <uvicorn.server.Server object at 0x7fdd049496f0>
│ └ <function run at 0x7fdd05280f70>
└ <module 'asyncio' from '/usr/lib/python3.10/asyncio/__init__.py'>
File "/usr/lib/python3.10/asyncio/runners.py", line 44, in run
return loop.run_until_complete(main)
│ │ └ <coroutine object Server.serve at 0x7fdd0493c350>
│ └ <function BaseEventLoop.run_until_complete at 0x7fdd050af880>
└ <_UnixSelectorEventLoop running=True closed=False debug=False>
File "/usr/lib/python3.10/asyncio/base_events.py", line 636, in run_until_complete
self.run_forever()
│ └ <function BaseEventLoop.run_forever at 0x7fdd050af7f0>
└ <_UnixSelectorEventLoop running=True closed=False debug=False>
File "/usr/lib/python3.10/asyncio/base_events.py", line 603, in run_forever
self._run_once()
│ └ <function BaseEventLoop._run_once at 0x7fdd050bd360>
└ <_UnixSelectorEventLoop running=True closed=False debug=False>
File "/usr/lib/python3.10/asyncio/base_events.py", line 1909, in _run_once
handle._run()
│ └ <function Handle._run at 0x7fdd05348d30>
└ <Handle Task.task_wakeup()>
File "/usr/lib/python3.10/asyncio/events.py", line 80, in _run
self._context.run(self._callback, *self._args)
│ │ │ │ │ └ <member '_args' of 'Handle' objects>
│ │ │ │ └ <Handle Task.task_wakeup()>
│ │ │ └ <member '_callback' of 'Handle' objects>
│ │ └ <Handle Task.task_wakeup()>
│ └ <member '_context' of 'Handle' objects>
└ <Handle Task.task_wakeup()>

File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/uvicorn/protocols/http/h11_impl.py", line 404, in run_asgi
result = await app( # type: ignore[func-returns-value]
└ <uvicorn.middleware.proxy_headers.ProxyHeadersMiddleware object at 0x7fdd0494a2c0>
File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/uvicorn/middleware/proxy_headers.py", line 84, in __call__
return await self.app(scope, receive, send)
│ │ │ │ └ <bound method RequestResponseCycle.send of <uvicorn.protocols.http.h11_impl.RequestResponseCycle object at 0x7fdcf0342fb0>>
│ │ │ └ <bound method RequestResponseCycle.receive of <uvicorn.protocols.http.h11_impl.RequestResponseCycle object at 0x7fdcf0342fb0>>
│ │ └ {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.3'}, 'http_version': '1.0', 'server': ('127.0.0.1', 8000), 'cl...
│ └ <fastapi.applications.FastAPI object at 0x7fdcf10ebc70>
└ <uvicorn.middleware.proxy_headers.ProxyHeadersMiddleware object at 0x7fdd0494a2c0>
File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/fastapi/applications.py", line 1054, in __call__
await super().__call__(scope, receive, send)
│ │ └ <bound method RequestResponseCycle.send of <uvicorn.protocols.http.h11_impl.RequestResponseCycle object at 0x7fdcf0342fb0>>
│ └ <bound method RequestResponseCycle.receive of <uvicorn.protocols.http.h11_impl.RequestResponseCycle object at 0x7fdcf0342fb0>>
└ {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.3'}, 'http_version': '1.0', 'server': ('127.0.0.1', 8000), 'cl...
File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/starlette/applications.py", line 123, in __call__
await self.middleware_stack(scope, receive, send)
│ │ │ │ └ <bound method RequestResponseCycle.send of <uvicorn.protocols.http.h11_impl.RequestResponseCycle object at 0x7fdcf0342fb0>>
│ │ │ └ <bound method RequestResponseCycle.receive of <uvicorn.protocols.http.h11_impl.RequestResponseCycle object at 0x7fdcf0342fb0>>
│ │ └ {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.3'}, 'http_version': '1.0', 'server': ('127.0.0.1', 8000), 'cl...
│ └ <starlette.middleware.errors.ServerErrorMiddleware object at 0x7fdcf11201f0>
└ <fastapi.applications.FastAPI object at 0x7fdcf10ebc70>
File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/starlette/middleware/errors.py", line 186, in __call__
raise exc
File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/starlette/middleware/errors.py", line 164, in __call__
await self.app(scope, receive, _send)
│ │ │ │ └ <function ServerErrorMiddleware.call.._send at 0x7fdcf1105f30>
│ │ │ └ <bound method RequestResponseCycle.receive of <uvicorn.protocols.http.h11_impl.RequestResponseCycle object at 0x7fdcf0342fb0>>
│ │ └ {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.3'}, 'http_version': '1.0', 'server': ('127.0.0.1', 8000), 'cl...
│ └ <starlette.middleware.base.BaseHTTPMiddleware object at 0x7fdcf10e8a00>
└ <starlette.middleware.errors.ServerErrorMiddleware object at 0x7fdcf11201f0>
File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/starlette/middleware/base.py", line 189, in __call__
with collapse_excgroups():
└ <function collapse_excgroups at 0x7fdd0399ab90>
File "/usr/lib/python3.10/contextlib.py", line 153, in __exit__
self.gen.throw(typ, value, traceback)
│ │ │ │ │ └ <traceback object at 0x7fdcf03c0780>
│ │ │ │ └ ExceptionGroup('unhandled errors in a TaskGroup', [ExceptionGroup('unhandled errors in a TaskGroup', [ExceptionGroup('unhandl...
│ │ │ └ <class 'exceptiongroup.ExceptionGroup'>
│ │ └ <method 'throw' of 'generator' objects>
│ └ <generator object collapse_excgroups at 0x7fdcf1110970>
└ <contextlib._GeneratorContextManager object at 0x7fdcf0343e80>
File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/starlette/_utils.py", line 91, in collapse_excgroups
raise exc
File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/starlette/responses.py", line 257, in wrap
await func()
└ functools.partial(<bound method StreamingResponse.stream_response of <starlette.responses.StreamingResponse object at 0x7fdcf...
File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/starlette/responses.py", line 246, in stream_response
async for chunk in self.body_iterator:
│ │ └ <async_generator object OpenaiForward.aiter_bytes at 0x7fdcf0330dc0>
│ └ <starlette.responses.StreamingResponse object at 0x7fdcf03a2560>
└ b'\n'
File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/openai_forward/decorators.py", line 157, in wrapper
async for value in async_gen:
│ └ <async_generator object OpenaiForward.aiter_bytes at 0x7fdcf0331940>
└ b'\n'
File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/openai_forward/forward/core.py", line 422, in aiter_bytes
cache_response(cache_key, target_info, route_path, chunk_list)
│ │ │ │ └ [b'data: {"id":"","object":"","created":0,"model":"","prompt_filter_results":[{"prompt_index":0,"content_filter_results":{"ha...
│ │ │ └ '/v1/chat/completions'
│ │ └ {}
│ └ b'\x98\x01\x92\x82\xa4role\xa6system\xa7content\xbcYou are a helpful assistant.\x82\xa4role\xa4user\xa7content\xda\x1b\x9b\xe...
└ <function cache_response at 0x7fdcf1070dc0>
File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/openai_forward/cache/__init__.py", line 58, in cache_response
cache_generic_response(cache_key, route_path, chunk_list)
│ │ │ └ [b'data: {"id":"","object":"","created":0,"model":"","prompt_filter_results":[{"prompt_index":0,"content_filter_results":{"ha...
│ │ └ '/v1/chat/completions'
│ └ b'\x98\x01\x92\x82\xa4role\xa6system\xa7content\xbcYou are a helpful assistant.\x82\xa4role\xa4user\xa7content\xda\x1b\x9b\xe...
└ <function cache_generic_response at 0x7fdcf1070e50>
File "/var/www/openai-forward/myenv/lib/python3.10/site-packages/openai_forward/cache/__init__.py", line 62, in cache_generic_response
if cache_key and route_path in CACHE_ROUTE_SET:
│ │ └ {'/v1/chat/completions', '/v1/embeddings'}
│ └ [b'data: {"id":"","object":"","created":0,"model":"","prompt_filter_results":[{"prompt_index":0,"content_filter_results":{"ha...
└ b'\x98\x01\x92\x82\xa4role\xa6system\xa7content\xbcYou are a helpful assistant.\x82\xa4role\xa4user\xa7content\xda\x1b\x9b\xe...

TypeError: unhashable type: 'list'

from openai-forward.

KenyonY avatar KenyonY commented on August 17, 2024

I suggest upgrading openai-forward to the latest version first:

pip install git+https://github.com/KenyonY/openai-forward

Then configure it correctly following the .env example; you can comment out the cache-related settings for now and test whether the problem persists.
PS: I have not run into these issues in my own deployment.

XianYue0125 avatar XianYue0125 commented on August 17, 2024

image
Running 0.7.2 reports the error above and the service won't start; the .env file is already the new version.

XianYue0125 avatar XianYue0125 commented on August 17, 2024

0.7.1 reports the same error as above; 0.7.0 works.

XianYue0125 avatar XianYue0125 commented on August 17, 2024

image
This is the current .env configuration.

KenyonY avatar KenyonY commented on August 17, 2024

The .env configuration example in the repository has been fixed. The correct way to set OPENAI_API_KEY is:

OPENAI_API_KEY={"sk-xxx": [0], "sk-xxx": [1], "sk-xxx": [1,2]}

Thanks for the feedback.
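As a rough illustration of the mapping format above, the value can be read as a JSON object mapping each upstream key to the forward-key permission levels allowed to use it. This is only a sketch of how such a mapping could be consumed; the key names and the helper function are placeholders, not openai-forward internals.

```python
import json
import os

# Hypothetical example of the OPENAI_API_KEY format shown above:
# each upstream key maps to a list of permitted forward-key levels.
os.environ["OPENAI_API_KEY"] = '{"sk-aaa": [0], "sk-bbb": [1], "sk-ccc": [1, 2]}'

key_levels = json.loads(os.environ["OPENAI_API_KEY"])

def keys_for_level(level: int) -> list:
    """Return the upstream keys usable at a given permission level."""
    return [key for key, levels in key_levels.items() if level in levels]

print(keys_for_level(1))  # ['sk-bbb', 'sk-ccc']
```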

XianYue0125 avatar XianYue0125 commented on August 17, 2024

After configuring the parameters in the new .env format, both 0.7.1 and 0.7.2 run normally.

In this round of testing, openai-forward itself reported no errors, but the client program still raises peer closed connection without sending complete message body (incomplete chunked read). I want to look further into nginx and the responses coming back from Microsoft, and I'll report back as soon as I make progress.

XianYue0125 avatar XianYue0125 commented on August 17, 2024

Accessing the service directly by IP, the problem persists, so it is not an nginx issue. I'm still not sure where the problem is; perhaps the data returned to openai-forward is malformed. Where should I look to see the returned data? The chat log only shows what was sent, not what was received.

KenyonY avatar KenyonY commented on August 17, 2024

Accessing the service directly by IP, the problem persists, so it is not an nginx issue. I'm still not sure where the problem is; perhaps the data returned to openai-forward is malformed. Where should I look to see the returned data? The chat log only shows what was sent, not what was received.

Then you'll have to modify the source:
you can go to this line


and print each returned chunk just above it.
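For reference, the kind of chunk logging suggested above can be sketched as a wrapper around the async byte iterator that yields upstream chunks. The names here are illustrative only, not openai-forward's actual internals.

```python
import asyncio
from typing import AsyncIterator

async def debug_chunks(aiter: AsyncIterator) -> AsyncIterator:
    """Print every chunk as it arrives, then pass it through unchanged."""
    async for chunk in aiter:
        print(f"[chunk] {len(chunk)} bytes: {chunk[:80]!r}")
        yield chunk

# Minimal demo with a fake upstream stream standing in for the real one:
async def fake_stream():
    for part in (b'data: {"a": 1}\n\n', b'data: [DONE]\n\n'):
        yield part

async def main():
    return [c async for c in debug_chunks(fake_stream())]

chunks = asyncio.run(main())
print(len(chunks))  # 2
```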

XianYue0125 avatar XianYue0125 commented on August 17, 2024

OK, thanks. I'll dig into it on Monday and reply as soon as I have news.

XianYue0125 avatar XianYue0125 commented on August 17, 2024

Hi, I ran the source code, printed the returned content, and found the problem:
image
An incomplete chunk is returned here, and then the code errors with peer closed connection without sending complete message body.

XianYue0125 avatar XianYue0125 commented on August 17, 2024

From what I found online, when a stream returns an incomplete message, you may need to buffer the last (partial) line and concatenate it with the next received chunk before returning the complete content.

I hit peer closed connection without sending complete message body with both the langchain and openai APIs. My guess is that these packages expect each returned message to be complete, so when openai-forward passes through an incomplete JSON chunk, the client code raises an error.
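The buffering idea described above can be sketched roughly like this. It is illustrative only, and assumes server-sent events are delimited by a blank line ("\n\n"); the class name is hypothetical.

```python
class SSEReassembler:
    """Hold any trailing partial SSE line until its terminator arrives."""

    def __init__(self) -> None:
        self.buffer = b""

    def feed(self, chunk: bytes) -> list:
        """Return only the complete events received so far; keep the rest buffered."""
        self.buffer += chunk
        *events, self.buffer = self.buffer.split(b"\n\n")
        return [e for e in events if e]

r = SSEReassembler()
print(r.feed(b'data: {"id": 1'))         # [] -- partial, held in the buffer
print(r.feed(b'}\n\ndata: [DONE]\n\n'))  # [b'data: {"id": 1}', b'data: [DONE]']
```

Only once an event's terminating blank line arrives is it released to the caller, which is why the first partial JSON chunk above yields nothing.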

XianYue0125 avatar XianYue0125 commented on August 17, 2024

image
This looks like a similar but slightly different error: the JSON content itself is malformed, so the program errors as soon as it receives it.

XianYue0125 avatar XianYue0125 commented on August 17, 2024

image
The error also occurred after these messages were returned.

The blue text shows the program erroring after receiving the incomplete message and then automatically resending the request.

XianYue0125 avatar XianYue0125 commented on August 17, 2024

image
This error is reported consistently after the run finishes. What is the cause?

KenyonY avatar KenyonY commented on August 17, 2024
  1. In principle, openai-forward returns every chunk it receives without any modification, so if the final response body it received was incomplete, I'd say the problem is with the upstream response. But you said earlier that other forwarding services work fine, which is strange.
  2. The IndexError in the log is also caused by an incomplete response.
  3. openai-forward only retries when sending fails; the retry your program performs after receiving an incomplete message should be the automatic retry built into OpenAI() or LangChain.

XianYue0125 avatar XianYue0125 commented on August 17, 2024

The problem must be that the upstream returns incomplete messages. Other forwarders work because they reassemble the JSON into complete messages before returning it; if the data is passed straight through, then programs using either the openai or langchain clients will error on receiving incomplete content.

KenyonY avatar KenyonY commented on August 17, 2024

@XianYue0125
I just submitted PR #121, which distinguishes between stream and non-stream requests internally. Try deploying that branch and see whether it solves the problem.

github-actions avatar github-actions commented on August 17, 2024

This issue is stale because it has been open for 30 days with no activity.


github-actions avatar github-actions commented on August 17, 2024

This issue was closed because it has been inactive for 7 days since being marked as stale.


XianYue0125 avatar XianYue0125 commented on August 17, 2024

Sorry, I was busy with other things and forgot to follow up on that branch.

My testing led to a conclusion: when using streaming, because the service is Azure's, Azure cuts off the streaming response as soon as the per-minute token limit is reached, so the receiving end never gets the complete message and errors with peer closed connection without sending complete message body.

I can confirm this is not a forwarding problem: other forwarding services behave similarly, and even using OpenAI's official servers and API directly produces the same error when the limit is hit.

Some authors handle this error with extra logic on the receiving end. I wonder whether that could be built into the forwarding server: handle the server closing the connection and notify the client that it has exceeded its sending rate, and so on.
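One possible client-side stopgap for the rate-limit cutoff described above is to treat a stream truncated mid-body as retryable with backoff. This is a sketch of the exception-handling structure only; the exception class is stubbed (the real one would be httpx.RemoteProtocolError), and `call_stream` is a placeholder for the actual streaming request.

```python
import time

class RemoteProtocolError(Exception):
    """Stand-in for httpx.RemoteProtocolError, to keep this sketch self-contained."""

def with_retries(call_stream, max_retries=3, base_delay=1.0):
    """Retry a streaming call when the peer closes the connection mid-body."""
    for attempt in range(max_retries + 1):
        try:
            return call_stream()
        except RemoteProtocolError:
            if attempt == max_retries:
                raise
            time.sleep(base_delay * 2 ** attempt)  # simple exponential backoff

# Demo: fail twice (truncated stream), then succeed on the third attempt.
attempts = {"n": 0}
def flaky_stream():
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise RemoteProtocolError("peer closed connection")
    return "complete response"

print(with_retries(flaky_stream, base_delay=0.01))  # complete response
```

Building the equivalent into the forwarder itself would additionally need a way to signal the client (for example a final SSE error event) rather than silently dropping the connection.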
