Comments (10)
Fantastic repo. May I ask if there is any update on this thread?
from lanarky.
@ajndkr would you be able to elaborate on this? Is Lanarky meant to focus solely on enabling streaming, or does it have a broader goal?
@awtkns yeah sure. The project still holds streaming as its key feature, but in my experience there are other common functionalities which can be offered via Lanarky. The broader goal is to provide the necessary tooling for deployment without users having to build it from scratch.
For example, tracking tokens per user question is not trivial to build, and it needs a SQL connection. Hence this particular issue.
A user id, conversation id, or session id may also be useful for vector store retrieval.
hi! this feature is still part of the roadmap, but I'm unable to find time for it currently. Would you like to contribute?
@ajndkr glad to contribute if I have time.
One question: currently the langchain object is created before being passed to the `LangchainRouter` and wrapped in a simple `create_langchain_dependency` for dependency injection. LangChain already has session support for memory using sqlalchemy, redis, or motorhead, but in order to bind to some user session id, the dependency injection should be more flexible.
For example, instead of the simple `create_langchain_dependency`, it may be better to let the user pass the dependency, which could use the `session_id` in the input query or body to construct the memory used for the langchain object. Do you have any insights on the ideal API for such things?
https://python.langchain.com/docs/integrations/memory/sql_chat_message_history
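The pattern being proposed above can be sketched in plain Python. The classes below (`InMemoryHistory`, `FakeChain`) are stand-ins for LangChain's `SQLChatMessageHistory` and chain objects, used only to illustrate the idea of injecting a factory keyed on `session_id` rather than a pre-built chain; none of this is Lanarky's actual API.

```python
# Sketch: a session-aware dependency factory. Instead of injecting one
# pre-built chain, inject a factory that builds the chain from the
# session_id found in the request.
from typing import Callable, Dict, List


class InMemoryHistory:
    """Stand-in for SQLChatMessageHistory: one message list per session_id."""
    _store: Dict[str, List[str]] = {}

    def __init__(self, session_id: str):
        self.messages = self._store.setdefault(session_id, [])

    def add_message(self, message: str) -> None:
        self.messages.append(message)


class FakeChain:
    """Stand-in for a LangChain chain bound to a per-session memory."""
    def __init__(self, memory: InMemoryHistory):
        self.memory = memory

    def run(self, query: str) -> str:
        self.memory.add_message(query)
        return f"answer to: {query}"


def create_session_chain_dependency() -> Callable[[str], FakeChain]:
    """The flexible variant of create_langchain_dependency sketched above."""
    def dependency(session_id: str) -> FakeChain:
        return FakeChain(memory=InMemoryHistory(session_id))
    return dependency


dependency = create_session_chain_dependency()
chain_a = dependency("session-a")
chain_a.run("hello")
chain_b = dependency("session-a")    # same session -> same history
print(len(chain_b.memory.messages))  # 1
```

With real LangChain memory backends, `InMemoryHistory(session_id)` would be replaced by a db-backed history constructed from the same `session_id`.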
@ajndkr In the current implementation, the request and response models are dynamically determined by the `langchain_object` with `create_request_from_langchain_dependency`. What if the construction of the `langchain_object` needs to parse the request first to get something like a `session_id` for db memory? Can you give me some advice on this?
@npuichigo hi! `create_request_from_langchain_dependency` as a feature isn't mature enough to handle all use cases. We add new parameters to handle new requirements, but in the meantime you can check this implementation: https://github.com/menloparklab/chatbot-ui/blob/d2d4aa84ebb6351bfacee99429228d091221b60b/backend/app.py#L101-L114
You can use the `StreamingResponse` class directly and add custom logic in the wrapped function.
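The shape of that advice can be sketched without any framework imports: parse the raw body yourself to get the `session_id`, build the chain from it, then yield tokens from a wrapped generator. `build_chain` and the token source below are stand-ins, not the linked code's actual functions.

```python
# Sketch: what a plain route handler does instead of relying on
# request models generated by create_request_from_langchain_dependency.
import json
from typing import Dict, Iterator


def build_chain(session_id: str):
    """Stand-in for constructing a langchain object with db-backed memory."""
    def run(query: str) -> Iterator[str]:
        for token in f"echo[{session_id}]: {query}".split():
            yield token
    return run


def handle_request(raw_body: bytes) -> Iterator[str]:
    """Parse the body first, then build the chain, then stream tokens."""
    body: Dict[str, str] = json.loads(raw_body)
    chain = build_chain(body["session_id"])  # memory keyed by session_id
    for token in chain(body["query"]):       # custom logic goes here
        yield token + " "


tokens = "".join(handle_request(b'{"session_id": "s1", "query": "hi"}'))
print(tokens.strip())  # echo[s1]: hi
```

In the real linked implementation, the generator body would be passed to FastAPI's `StreamingResponse` rather than joined into a string.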
The linked code uses all the features except the `add_langchain_api_route` part of `LangchainRouter`. So is that also immature, since it uses `create_request_from_langchain_dependency`?
nope! the linked code is more low-level, so you can skip `add_langchain_api_route` and use the base FastAPI route decorator instead.
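For reference, when a base route streams as `text/event-stream`, the only framing the response body needs is the server-sent-events wire format: each chunk prefixed with `data: ` and terminated by a blank line. A stdlib-only sketch of that framing (the token list is hypothetical):

```python
# Sketch: frame tokens as server-sent events, the format a
# text/event-stream StreamingResponse emits on the wire.
from typing import Iterable, Iterator


def sse_frames(tokens: Iterable[str]) -> Iterator[str]:
    """Yield each token as one SSE event: 'data: <payload>\n\n'."""
    for token in tokens:
        yield f"data: {token}\n\n"


stream = "".join(sse_frames(["Hello", "world"]))
print(repr(stream))  # 'data: Hello\n\ndata: world\n\n'
```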