
ksm26 / langchain-chat-with-your-data

Stars: 114 · Watchers: 3 · Forks: 76 · Size: 73.05 MB

Explore LangChain and build powerful chatbots that interact with your own data. Gain insights into document loading, splitting, retrieval, question answering, and more.

Home Page: https://learn.deeplearning.ai/langchain-chat-with-your-data/lesson/1/introduction

Jupyter Notebook 100.00%
langchain chat-with-your-data chatbot-development language-models llms document-loading conversational-ai embeddings question-answering document-splitting

langchain-chat-with-your-data's Introduction

📚 Welcome to the "LangChain: Chat with Your Data" course! Learn directly from the LangChain creator, Harrison Chase, and discover the power of LangChain in building chatbots that interact with information from your own documents and data.

LangChain: 🔗GitHub, 📚Documentation

Course Summary

📖 A short course on LangChain: Chat With Your Data! Explore two main topics: Retrieval Augmented Generation (RAG) and building a chatbot. Unlock the potential of Large Language Models (LLMs) to retrieve contextual documents and create chatbots that respond using your own data.

You'll learn about:

  1. 📥 Document Loading: Access over 80 unique loaders provided by LangChain to handle various data sources, including audio and video.

  2. ✂️ Document Splitting: Discover best practices and considerations for splitting data effectively.

  3. 🧮 Vector Stores and Embeddings: Dive into embeddings and explore vector store integrations within LangChain.

  4. 🔄 Retrieval: Grasp advanced techniques for accessing and indexing data in the vector store to retrieve relevant information beyond semantic queries.

  5. 🤔 Question Answering: Build a one-pass question-answering solution.

  6. 💬 Chat: Track and select pertinent information from conversations and data sources to build your own chatbot using LangChain.
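The splitting step above can be illustrated with a minimal, dependency-free sketch (plain Python standing in for LangChain's character splitters, whose real API lives in `langchain.text_splitter`; `split_text` here is a hypothetical helper, not the library function):

```python
def split_text(text: str, chunk_size: int = 40, chunk_overlap: int = 10) -> list[str]:
    """Split text into fixed-size character chunks with overlap,
    mimicking the idea behind LangChain's character splitters."""
    if chunk_overlap >= chunk_size:
        raise ValueError("overlap must be smaller than chunk size")
    chunks = []
    step = chunk_size - chunk_overlap  # advance less than chunk_size so chunks overlap
    for start in range(0, len(text), step):
        chunk = text[start:start + chunk_size]
        if chunk:
            chunks.append(chunk)
        if start + chunk_size >= len(text):
            break
    return chunks

doc = "LangChain loads documents, splits them, embeds the chunks, and retrieves them."
chunks = split_text(doc, chunk_size=30, chunk_overlap=5)
```

The overlap means the tail of each chunk repeats at the head of the next, which helps retrieval later: a sentence cut at a chunk boundary still appears whole in at least one chunk.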

💡 Start building practical applications that allow you to interact with data using LangChain and LLMs.
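To make the vector-store and retrieval ideas concrete, here is a toy, dependency-free sketch of what a vector store does under the hood: embed texts as vectors, then return the nearest ones by cosine similarity. Real embeddings come from a model; a bag-of-words counter stands in here, and `ToyVectorStore` is a hypothetical illustration, not the LangChain API:

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Stand-in "embedding": a bag-of-words count vector (punctuation stripped).
    return Counter(w.strip(".,?!").lower() for w in text.split())

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse count vectors.
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

class ToyVectorStore:
    def __init__(self, docs: list[str]):
        self.docs = [(d, embed(d)) for d in docs]

    def similarity_search(self, query: str, k: int = 1) -> list[str]:
        # Rank stored documents by similarity to the query embedding.
        q = embed(query)
        ranked = sorted(self.docs, key=lambda p: cosine(q, p[1]), reverse=True)
        return [d for d, _ in ranked[:k]]

store = ToyVectorStore([
    "MATLAB is used for the machine learning homework",
    "The lecture covers probability basics",
])
top = store.similarity_search("what did they say about matlab?", k=1)
```

In the course, Chroma plays the role of `ToyVectorStore` and an embedding model replaces the word counter, but the search step is the same nearest-neighbor lookup.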

Key Points

  • 🔑 Learn directly from the LangChain creator, Harrison Chase.
  • 📊 Apply LLMs to your proprietary data and develop personalized assistants and specialized chatbots.
  • 💡 Expand your utilization of LLMs through agents, chained calls, and memories.

About the Instructors

🌟Harrison Chase is Co-Founder and CEO at LangChain.

🌟Andrew Ng is a renowned AI researcher, co-founder of Coursera, and the founder of DeepLearning.AI. With a wealth of knowledge and expertise in the field, Andrew has played a pivotal role in popularizing AI education.

🔗 Reference: "LangChain: Chat with Your Data" course. To enroll in the course or for further information, visit deeplearning.ai.

langchain-chat-with-your-data's People

Contributors

ksm26


langchain-chat-with-your-data's Issues

pip install chromadb is failing with an error (using a Python 3.12.1 kernel)

Collecting chromadb
Using cached chromadb-0.4.22-py3-none-any.whl.metadata (7.3 kB)
Collecting build>=1.0.3 (from chromadb)
Using cached build-1.0.3-py3-none-any.whl.metadata (4.2 kB)
Requirement already satisfied: requests>=2.28 in /Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages (from chromadb) (2.31.0)
Requirement already satisfied: pydantic>=1.9 in /Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages (from chromadb) (2.5.3)
Collecting chroma-hnswlib==0.7.3 (from chromadb)
Using cached chroma-hnswlib-0.7.3.tar.gz (31 kB)
Installing build dependencies ... done
  Getting requirements to build wheel ... done
  Preparing metadata (pyproject.toml) ... done
Collecting fastapi>=0.95.2 (from chromadb)
Using cached fastapi-0.109.0-py3-none-any.whl.metadata (24 kB)
Collecting uvicorn>=0.18.3 (from uvicorn[standard]>=0.18.3->chromadb)
Using cached uvicorn-0.26.0-py3-none-any.whl.metadata (6.4 kB)
Requirement already satisfied: numpy>=1.22.5 in /Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages (from chromadb) (1.26.3)
Collecting posthog>=2.4.0 (from chromadb)
Using cached posthog-3.3.2-py2.py3-none-any.whl.metadata (2.0 kB)
Requirement already satisfied: typing-extensions>=4.5.0 in /Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages (from chromadb) (4.9.0)
Collecting pulsar-client>=3.1.0 (from chromadb)
Using cached pulsar_client-3.4.0-cp312-cp312-macosx_10_15_universal2.whl.metadata (1.0 kB)
INFO: pip is looking at multiple versions of chromadb to determine which version is compatible with other requirements. This could take a while.
Collecting chromadb
Using cached chromadb-0.4.21-py3-none-any.whl.metadata (7.3 kB)
Using cached chromadb-0.4.20-py3-none-any.whl.metadata (7.3 kB)
Using cached chromadb-0.4.19-py3-none-any.whl.metadata (7.3 kB)
Using cached chromadb-0.4.18-py3-none-any.whl.metadata (7.4 kB)
Using cached chromadb-0.4.17-py3-none-any.whl.metadata (7.3 kB)
Using cached chromadb-0.4.16-py3-none-any.whl.metadata (7.3 kB)
Using cached chromadb-0.4.15-py3-none-any.whl.metadata (7.2 kB)
INFO: pip is still looking at multiple versions of chromadb to determine which version is compatible with other requirements. This could take a while.
Using cached chromadb-0.4.14-py3-none-any.whl.metadata (7.0 kB)
Using cached chromadb-0.4.13-py3-none-any.whl.metadata (7.0 kB)
Using cached chromadb-0.4.12-py3-none-any.whl.metadata (7.0 kB)
Collecting pydantic<2.0,>=1.9 (from chromadb)
Using cached pydantic-1.10.14-py3-none-any.whl.metadata (150 kB)
Collecting fastapi<0.100.0,>=0.95.2 (from chromadb)
Using cached fastapi-0.99.1-py3-none-any.whl.metadata (23 kB)
Collecting chromadb
Using cached chromadb-0.4.11-py3-none-any.whl.metadata (7.0 kB)
Using cached chromadb-0.4.10-py3-none-any.whl.metadata (7.0 kB)
INFO: This is taking longer than usual. You might need to provide the dependency resolver with stricter constraints to reduce runtime. See https://pip.pypa.io/warnings/backtracking for guidance. If you want to abort this run, press Ctrl + C.
Using cached chromadb-0.4.9-py3-none-any.whl.metadata (7.0 kB)
Collecting chroma-hnswlib==0.7.2 (from chromadb)
Using cached chroma-hnswlib-0.7.2.tar.gz (31 kB)
Installing build dependencies ... done
  Getting requirements to build wheel ... done
  Preparing metadata (pyproject.toml) ... done
Collecting chromadb
Using cached chromadb-0.4.8-py3-none-any.whl.metadata (6.9 kB)
Using cached chromadb-0.4.7-py3-none-any.whl.metadata (6.9 kB)
Using cached chromadb-0.4.6-py3-none-any.whl.metadata (6.8 kB)
Using cached chromadb-0.4.5-py3-none-any.whl.metadata (6.8 kB)
Using cached chromadb-0.4.4-py3-none-any.whl.metadata (6.8 kB)
Using cached chromadb-0.4.3-py3-none-any.whl.metadata (6.9 kB)
Collecting pandas>=1.3 (from chromadb)
Using cached pandas-2.2.0-cp312-cp312-macosx_11_0_arm64.whl.metadata (19 kB)
Collecting chroma-hnswlib==0.7.1 (from chromadb)
Using cached chroma-hnswlib-0.7.1.tar.gz (30 kB)
Installing build dependencies ... done
  Getting requirements to build wheel ... done
  Preparing metadata (pyproject.toml) ... done
Collecting chromadb
Using cached chromadb-0.4.2-py3-none-any.whl.metadata (6.9 kB)
Using cached chromadb-0.4.1-py3-none-any.whl.metadata (6.9 kB)
Using cached chromadb-0.4.0-py3-none-any.whl.metadata (6.9 kB)
Using cached chromadb-0.3.29-py3-none-any.whl.metadata (6.9 kB)
Collecting hnswlib>=0.7 (from chromadb)
Using cached hnswlib-0.8.0.tar.gz (36 kB)
Installing build dependencies ... done
  Getting requirements to build wheel ... done
  Preparing metadata (pyproject.toml) ... done
Collecting clickhouse-connect>=0.5.7 (from chromadb)
Using cached clickhouse_connect-0.6.23-cp312-cp312-macosx_11_0_arm64.whl.metadata (2.8 kB)
Collecting duckdb>=0.7.1 (from chromadb)
Using cached duckdb-0.9.2.tar.gz (10.7 MB)
Preparing metadata (setup.py) ... done
Collecting fastapi==0.85.1 (from chromadb)
Using cached fastapi-0.85.1-py3-none-any.whl (55 kB)
Collecting chromadb
Using cached chromadb-0.3.27-py3-none-any.whl.metadata (6.8 kB)
Collecting pydantic==1.9 (from chromadb)
Using cached pydantic-1.9.0-py3-none-any.whl (140 kB)
Collecting chromadb
Using cached chromadb-0.3.26-py3-none-any.whl.metadata (6.8 kB)
Using cached chromadb-0.3.25-py3-none-any.whl.metadata (6.7 kB)
Using cached chromadb-0.3.23-py3-none-any.whl.metadata (6.3 kB)
Collecting sentence-transformers>=2.2.2 (from chromadb)
Using cached sentence-transformers-2.2.2.tar.gz (85 kB)
Preparing metadata (setup.py) ... done
Requirement already satisfied: certifi in /Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages (from clickhouse-connect>=0.5.7->chromadb) (2023.11.17)
Requirement already satisfied: urllib3>=1.26 in /Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages (from clickhouse-connect>=0.5.7->chromadb) (2.1.0)
Collecting pytz (from clickhouse-connect>=0.5.7->chromadb)
Using cached pytz-2023.3.post1-py2.py3-none-any.whl.metadata (22 kB)
Collecting zstandard (from clickhouse-connect>=0.5.7->chromadb)
Using cached zstandard-0.22.0-cp312-cp312-macosx_11_0_arm64.whl.metadata (2.9 kB)
Collecting lz4 (from clickhouse-connect>=0.5.7->chromadb)
Using cached lz4-4.3.3-cp312-cp312-macosx_11_0_arm64.whl.metadata (3.7 kB)
Collecting starlette<0.36.0,>=0.35.0 (from fastapi>=0.95.2->chromadb)
Using cached starlette-0.35.1-py3-none-any.whl.metadata (5.8 kB)
Requirement already satisfied: python-dateutil>=2.8.2 in /Users/raj/Library/Python/3.12/lib/python/site-packages (from pandas>=1.3->chromadb) (2.8.2)
Collecting tzdata>=2022.7 (from pandas>=1.3->chromadb)
Using cached tzdata-2023.4-py2.py3-none-any.whl.metadata (1.4 kB)
Requirement already satisfied: six>=1.5 in /Users/raj/Library/Python/3.12/lib/python/site-packages (from posthog>=2.4.0->chromadb) (1.16.0)
Collecting monotonic>=1.5 (from posthog>=2.4.0->chromadb)
Using cached monotonic-1.6-py2.py3-none-any.whl (8.2 kB)
Collecting backoff>=1.10.0 (from posthog>=2.4.0->chromadb)
Using cached backoff-2.2.1-py3-none-any.whl (15 kB)
Requirement already satisfied: annotated-types>=0.4.0 in /Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages (from pydantic>=1.9->chromadb) (0.6.0)
Requirement already satisfied: pydantic-core==2.14.6 in /Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages (from pydantic>=1.9->chromadb) (2.14.6)
Requirement already satisfied: charset-normalizer<4,>=2 in /Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages (from requests>=2.28->chromadb) (3.3.2)
Requirement already satisfied: idna<4,>=2.5 in /Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages (from requests>=2.28->chromadb) (3.6)
Collecting transformers<5.0.0,>=4.6.0 (from sentence-transformers>=2.2.2->chromadb)
Using cached transformers-4.36.2-py3-none-any.whl.metadata (126 kB)
Requirement already satisfied: tqdm in /Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages (from sentence-transformers>=2.2.2->chromadb) (4.66.1)
INFO: pip is looking at multiple versions of sentence-transformers to determine which version is compatible with other requirements. This could take a while.
Collecting chromadb
Using cached chromadb-0.3.22-py3-none-any.whl (69 kB)
Using cached chromadb-0.3.21-py3-none-any.whl (46 kB)
Using cached chromadb-0.3.20-py3-none-any.whl (46 kB)
Using cached chromadb-0.3.18-py3-none-any.whl (46 kB)
Using cached chromadb-0.3.17-py3-none-any.whl (46 kB)
Using cached chromadb-0.3.16-py3-none-any.whl (46 kB)
Using cached chromadb-0.3.15-py3-none-any.whl (46 kB)
INFO: pip is still looking at multiple versions of sentence-transformers to determine which version is compatible with other requirements. This could take a while.
Using cached chromadb-0.3.14-py3-none-any.whl (45 kB)
Using cached chromadb-0.3.13-py3-none-any.whl (45 kB)
Using cached chromadb-0.3.12-py3-none-any.whl (45 kB)
Using cached chromadb-0.3.11-py3-none-any.whl (41 kB)
Using cached chromadb-0.3.10-py3-none-any.whl (40 kB)
INFO: This is taking longer than usual. You might need to provide the dependency resolver with stricter constraints to reduce runtime. See https://pip.pypa.io/warnings/backtracking for guidance. If you want to abort this run, press Ctrl + C.
Using cached chromadb-0.3.8-py3-none-any.whl (40 kB)
Using cached chromadb-0.3.7-py3-none-any.whl (39 kB)
Using cached chromadb-0.3.6-py3-none-any.whl (39 kB)
Using cached chromadb-0.3.5-py3-none-any.whl (38 kB)
Using cached chromadb-0.3.4-py3-none-any.whl (38 kB)
Using cached chromadb-0.3.3-py3-none-any.whl (38 kB)
Using cached chromadb-0.3.2-py3-none-any.whl (37 kB)
Collecting pandas~=1.3 (from chromadb)
Using cached pandas-1.5.3.tar.gz (5.2 MB)
Installing build dependencies ... done
  Getting requirements to build wheel ... done
  Preparing metadata (pyproject.toml) ... done
Collecting clickhouse-connect~=0.5.7 (from chromadb)
Using cached clickhouse-connect-0.5.25.tar.gz (70 kB)
Installing build dependencies ... done
  Getting requirements to build wheel ... done
  Installing backend dependencies ... done
  Preparing metadata (pyproject.toml) ... done
Collecting duckdb~=0.5.1 (from chromadb)
Using cached duckdb-0.5.1.tar.gz (13.5 MB)
Preparing metadata (setup.py) ... done
Collecting fastapi~=0.85.1 (from chromadb)
Using cached fastapi-0.85.2-py3-none-any.whl (55 kB)
Collecting uvicorn~=0.18.3 (from uvicorn[standard]~=0.18.3->chromadb)
Using cached uvicorn-0.18.3-py3-none-any.whl (57 kB)
Collecting chromadb
Using cached chromadb-0.3.1-py3-none-any.whl (37 kB)
Using cached chromadb-0.3.0-py3-none-any.whl (36 kB)
Using cached chromadb-0.2.0-py3-none-any.whl (36 kB)
Collecting uuid>=1.30 (from chromadb)
Using cached uuid-1.30.tar.gz (5.8 kB)
Preparing metadata (setup.py) ... done
Collecting chromadb
Using cached chromadb-0.1.0-py3-none-any.whl (34 kB)
ERROR: Cannot install chromadb==0.1.0, chromadb==0.2.0, chromadb==0.3.0, chromadb==0.3.1, chromadb==0.3.2, chromadb==0.3.25, chromadb==0.3.26, chromadb==0.3.27, chromadb==0.3.29, chromadb==0.4.0, chromadb==0.4.1, chromadb==0.4.10, chromadb==0.4.11, chromadb==0.4.12, chromadb==0.4.13, chromadb==0.4.14, chromadb==0.4.15, chromadb==0.4.16, chromadb==0.4.17, chromadb==0.4.18, chromadb==0.4.19, chromadb==0.4.2, chromadb==0.4.20, chromadb==0.4.21, chromadb==0.4.22, chromadb==0.4.3, chromadb==0.4.4, chromadb==0.4.5, chromadb==0.4.6, chromadb==0.4.7, chromadb==0.4.8 and chromadb==0.4.9 because these package versions have conflicting dependencies.
The conflict is caused by:
chromadb 0.4.22 depends on onnxruntime>=1.14.1
chromadb 0.4.21 depends on onnxruntime>=1.14.1
chromadb 0.4.20 depends on onnxruntime>=1.14.1
chromadb 0.4.19 depends on onnxruntime>=1.14.1
chromadb 0.4.18 depends on onnxruntime>=1.14.1
chromadb 0.4.17 depends on onnxruntime>=1.14.1
chromadb 0.4.16 depends on onnxruntime>=1.14.1
chromadb 0.4.15 depends on onnxruntime>=1.14.1
chromadb 0.4.14 depends on onnxruntime>=1.14.1
chromadb 0.4.13 depends on onnxruntime>=1.14.1
chromadb 0.4.12 depends on onnxruntime>=1.14.1
chromadb 0.4.11 depends on onnxruntime>=1.14.1
chromadb 0.4.10 depends on onnxruntime>=1.14.1
chromadb 0.4.9 depends on onnxruntime>=1.14.1
chromadb 0.4.8 depends on onnxruntime>=1.14.1
chromadb 0.4.7 depends on onnxruntime>=1.14.1
chromadb 0.4.6 depends on onnxruntime>=1.14.1
chromadb 0.4.5 depends on onnxruntime>=1.14.1
chromadb 0.4.4 depends on onnxruntime>=1.14.1
chromadb 0.4.3 depends on onnxruntime>=1.14.1
chromadb 0.4.2 depends on onnxruntime>=1.14.1
chromadb 0.4.1 depends on onnxruntime>=1.14.1
chromadb 0.4.0 depends on onnxruntime>=1.14.1
chromadb 0.3.29 depends on onnxruntime>=1.14.1
chromadb 0.3.27 depends on onnxruntime>=1.14.1
chromadb 0.3.26 depends on onnxruntime>=1.14.1
chromadb 0.3.25 depends on onnxruntime>=1.14.1
chromadb 0.3.2 depends on numpy~=1.21.6
chromadb 0.3.1 depends on numpy~=1.21.6
chromadb 0.3.0 depends on numpy~=1.21.6
chromadb 0.2.0 depends on numpy~=1.21.6
chromadb 0.1.0 depends on numpy~=1.21.6

To fix this you could try to:

  1. loosen the range of package versions you've specified
  2. remove package versions to allow pip attempt to solve the dependency conflict

ERROR: ResolutionImpossible: for help visit https://pip.pypa.io/en/latest/topics/dependency-resolution/#dealing-with-dependency-conflicts
Note: you may need to restart the kernel to use updated packages.

InvalidRequestError: The model `text-davinci-003` has been deprecated

InvalidRequestError                       Traceback (most recent call last)
Cell In[43], line 2
      1 question = "what did they say about matlab?"
----> 2 compressed_docs = compression_retriever.get_relevant_documents(question)
      3 pretty_print_docs(compressed_docs)

The error points to this line of code (in the L4-Retrieval notebook):

InvalidRequestError: The model text-davinci-003 has been deprecated, learn more here: https://platform.openai.com/docs/deprecations

I have used the gpt-3.5-turbo-instruct model as per the instructions in the notebook.
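A likely cause (my assumption, not confirmed in the thread): the compression retriever builds its extractor from an `OpenAI` LLM that still defaults to `text-davinci-003`, so changing the model elsewhere in the notebook is not enough; it must be passed to the LLM that backs the extractor. A sketch against the langchain 0.0.x API used in the course (`vectordb` is assumed to exist from the earlier lessons, and an OpenAI API key must be configured):

```python
from langchain.llms import OpenAI
from langchain.retrievers import ContextualCompressionRetriever
from langchain.retrievers.document_compressors import LLMChainExtractor

# Pass a supported completion model explicitly instead of relying on the
# deprecated default (text-davinci-003).
llm = OpenAI(model_name="gpt-3.5-turbo-instruct", temperature=0)
compressor = LLMChainExtractor.from_llm(llm)
compression_retriever = ContextualCompressionRetriever(
    base_compressor=compressor,
    base_retriever=vectordb.as_retriever(),  # vectordb built in the earlier lesson
)
compressed_docs = compression_retriever.get_relevant_documents(
    "what did they say about matlab?"
)
```

If the error persists, every `OpenAI(...)` constructed in the notebook should be checked for the same default.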
