
rags's People

Contributors

alexfilothodoros, anoopshrma, eltociear, jerryjliu, logan-markewich


rags's Issues


Can multiple PDF agents be defined?

Hi,

I succeeded in creating a PDF document agent. I wonder if it's possible to create an agent over several PDFs, or to create one agent per PDF and reference them by name (or in a similar way) in the "Generated RAG Agent" chat?

"Update Agent" creates an error and deletes the Agent

After creating an agent, navigating to "RAG Config", and selecting the checkbox labeled "Include Summarization (only works for GPT-4)", clicking "Update Agent" produces an error and the agent is deleted.

Error:
[screenshot of the error attached]

About llama_index version

Why is only llama-index==0.9.7 supported?
I want to use an LLM like Gemini, which is not available in 0.9.7.
Hoping for your reply.

Error when submitting an answer on the Home page

Hi,

I am getting this error when submitting an answer to the "What RAG bot do you want to build?" question:

[screenshot of the error attached]

I guess it relates to OpenAI credentials. What do you mean by "Please .streamlit/secrets.toml in the home folder."?

Is the "home" folder the "rags" folder (the root folder of the repo), or the home folder of the host machine?

Deployment?

Is there a way to deploy only the created agent? Thanks.

Stop response generation in the LangChain framework

Python code:

# turbo_llm and compression_retriever are defined earlier in my script
from langchain.chains import RetrievalQA

qa_chain = RetrievalQA.from_chain_type(
    llm=turbo_llm,
    chain_type="stuff",
    retriever=compression_retriever,
    return_source_documents=True,
)
response = qa_chain("What is Langchain?")

This is the Python code I am using to query a PDF following the RAG approach.
My requirement is: if it takes more than 1 minute to generate the response, response generation should be stopped from the backend.
How can I do that? Is there any Python code pattern available for this?
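
For illustration, a minimal sketch of one way to enforce such a timeout from the caller's side; this is a generic Python pattern, not a LangChain feature, and it only stops waiting for the result rather than aborting the backend request:

from concurrent.futures import ThreadPoolExecutor, TimeoutError

executor = ThreadPoolExecutor(max_workers=1)
# qa_chain is the RetrievalQA chain built above.
future = executor.submit(qa_chain, "What is Langchain?")
try:
    response = future.result(timeout=60)  # give up after 60 seconds
except TimeoutError:
    response = None  # handle the timeout however the application needs
finally:
    executor.shutdown(wait=False)  # don't block on the still-running call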

Installation issue

I have now tried several times to install rags, but I always get this error message:

(base) kalle@MacBook-Air rags % poetry install --with dev
Installing dependencies from lock file

No dependencies to install or update

Installing the current project: rags (0.0.5)
The current project could not be installed: No file/folder found for package rags
If you do not want to install the current project use --no-root

Any suggestions?

BadRequestError: Error code: 400 & General Observations

@jerryjliu I was having a great session building out a bot, then things started to get weird. In the conversation on Home (setting up the bot) I could not really tell whether I was getting RAGs advice and information or general GPT-4; that is, after a while it seemed the setup process was being hallucinated. I then went to Generated RAG Agent to test how much of the system-prompt conversation had been internalized. The results were pretty poor. I copied and pasted the conversation from the generated agent, fed it to Home (we need names for these different actors, it's confusing), and asked Home whether the responses were good or not. It said they were not. We talked about modifications; it made them; the results were no better, so we did the same again. When I went to test the latest tweak in GenRAG, my first prompt was "can you try that last one again"... then poof:

BadRequestError: Error code: 400 - {'error': {'message': "Invalid value for 'content': expected a string, got null.", 'type': 'invalid_request_error', 'param': 'messages.[122].content', 'code': None}}

While this is probably some trivial issue, I believe there are issues to address regarding the general behavior of the system and how some aspects present to the user.

To that end, I have attached my project (minus the .toml with my key). I have also copied and pasted the conversations from both Home and GenRAG; they are in the folder _trouble.

Please let me know if there is any way I can be helpful. I need this to be awesome ;^)
I have also opened a thread on Discord with you and @logan-markewich tagged; it has some other conceptual questions and ideas. Thanks for all you do.
rags_error_build_nickknyc.zip
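
For illustration, a minimal guard against this specific failure mode, assuming the chat history is a list of OpenAI-style message dicts; the helper name is hypothetical, not from the rags source:

def sanitize_history(messages):
    # Drop history entries whose "content" is None, since the chat completions
    # API rejects null content for ordinary messages.
    return [m for m in messages if m.get("content") is not None]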


Agent is ready, how do I query the PDF?

Hi,

I got the chance to set the file path during this conversation on the Home page:

[screenshots of the conversation attached]

But the bot on the Generated RAG Agent page is still unable to answer questions about the PDF:

[screenshot attached]

How do I query a specific PDF? Where exactly should I put the PDF? Right now it is in a sub-folder of "rags" named "files".

Integrate Google Colab

Suggestion: it would be great if you integrated Google Colab support so we could run the project there and help improve it.

Why RAG?

What is RAG, and why does one need it?

I don't see any .streamlit/secrets.toml

Can someone please explain this step?
"By default, we use OpenAI for both the builder agent as well as the generated RAG agent. Please .streamlit/secrets.toml in the home folder."

I don't see any .streamlit/secrets.toml downloaded, although I did run requirements.txt.

How to upload files?

I went to the RAG Config page, but it shows "File/URL paths (not editable)". So where can I upload PDFs?

Can the agent be persisted?

Once I have loaded the PDF and can ask questions about it, I want to save the agent so I can use it between launches of 1_🏠_Home.py.
Or is 1_🏠_Home.py intended to run as an online service? Is there any way to re-create the agent programmatically after the service goes down or is migrated?

[ENH] can we use azure openai endpoints?

Hi.

What would be needed in order to use Azure OpenAI endpoints?
I think some changes in utils/_resolve_llm are certainly needed?

Any advice? I'd like to work on this topic, but I was thinking that someone else might have already thought about it and could provide some feedback.

Cheers
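
For illustration, a rough sketch of what the LLM construction might look like, assuming llama_index.llms.AzureOpenAI is available in the pinned llama-index version; the deployment name, endpoint, key, and API version below are placeholders:

from llama_index.llms import AzureOpenAI

llm = AzureOpenAI(
    engine="my-gpt-4-deployment",  # Azure deployment name (placeholder)
    model="gpt-4",
    api_key="<azure-openai-api-key>",
    azure_endpoint="https://<resource-name>.openai.azure.com/",
    api_version="2023-07-01-preview",
)

Wiring this into utils/_resolve_llm would presumably also need a way to select Azure via the llm string or extra secrets, which is the part that could use feedback.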

Support streaming chat responses

Description

LlamaIndex chat engines support streaming responses. It would be a small UX improvement if rags could support streaming the engine's responses to the Streamlit frontend such that users don't have to wait until the entire response is generated.

The only issue is that .stream_chat uses async functions. But Streamlit runs in a separate thread that doesn’t have an event loop by default. To make it work, the implementation will need to create an event loop and run the .stream_chat call inside it.

Happy to submit a PR for this!
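
For illustration, a minimal sketch of the Streamlit side, assuming the agent exposes a LlamaIndex chat engine whose stream_chat result provides a response_gen token generator; chat_engine and user_message are assumed to be in scope:

import streamlit as st

placeholder = st.empty()
streamed_text = ""
streaming_response = chat_engine.stream_chat(user_message)
for token in streaming_response.response_gen:
    streamed_text += token
    placeholder.markdown(streamed_text)  # re-render the partial response as tokens arrive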

How to run this project?

After streamlit run 1_home.py, how can I build an agent? What should I input, how can I upload or point to a file, and is there an example? No matter what I input, it always prints:
system_prompt=None file_paths=[] docs=[] tools=[] rag_params=RAGParams(include_summarization=False, top_k=2, chunk_size=1024, embed_model='default', llm='gpt-4-1106-preview') agent=None
Thanks a lot!

Installation failed with poetry

Hi,
I am running on Windows 11, and I have created a separate venv for this project.

When I run "poetry install --with dev", I get the following error message:
Installing dependencies from lock file
No dependencies to install or update
Installing the current project: rags (0.0.2)
The current project could not be installed: No file/folder found for package rags
If you do not want to install the current project use --no-root

To resolve the problem, I deleted the venv and the GitHub repo, then recreated the venv and re-cloned the repo, but the problem above persists.

Please help,
thanks,
Sean

Metaphor Key - for cloud deploy - no .toml and no .env

I am trying to deploy to Azure App Services. I have ARM templates that work fine. The one issue I am having is that I need to set API keys as if they are stored as environment variables.
For OpenAI -
os.environ["OPENAI_API_KEY"] = st.secrets.openai_key
from utils.py makes sense,

but I am not seeing anything that straightforward for Metaphor.

Am I missing something?
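
For illustration, a rough sketch of one way to mirror that OpenAI pattern for Metaphor; the metaphor_key secret name and the METAPHOR_API_KEY variable are assumptions, not names confirmed by the rags source:

import os
import streamlit as st

# Prefer a Streamlit secret if present, otherwise fall back to an environment
# variable set in the App Service configuration (both names are hypothetical).
metaphor_key = st.secrets.get("metaphor_key") or os.environ.get("METAPHOR_API_KEY")
if metaphor_key:
    os.environ["METAPHOR_API_KEY"] = metaphor_key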

Implement using llamacpp as the LLM

I am trying to use an open-source LLM with llamacpp, but I am getting this error:

"ValueError: Must pass in vector index for CondensePlusContextChatEngine."
I am also new to LlamaIndex; can anyone help me with what exactly I need to configure in order to run RAGs?
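
For illustration, a rough sketch of constructing a llama.cpp-backed LLM through llama-index's LlamaCPP wrapper; the model path and settings are placeholders, and this alone may not resolve the CondensePlusContextChatEngine error, which appears to concern the vector index rather than the LLM:

from llama_index.llms import LlamaCPP

llm = LlamaCPP(
    model_path="/path/to/model.gguf",   # local model file (placeholder)
    temperature=0.1,
    max_new_tokens=256,
    context_window=3900,
    model_kwargs={"n_gpu_layers": 1},   # offload layers to GPU if available
)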

Add metaphor-python to requirements.txt

When deploying to Azure App Services, setup uses requirements.txt to build the virtual environment.
I believe that metaphor-python needs to be in there.

Or am I missing something ;^)
