
🦜️🔗 LangChain.rb

⚡ Building applications with LLMs through composability ⚡

👨‍💻👩‍💻 CURRENTLY SEEKING PEOPLE TO FORM THE CORE GROUP OF MAINTAINERS

โš ๏ธ UNDER ACTIVE AND RAPID DEVELOPMENT (MAY BE BUGGY AND UNTESTED)


Langchain.rb is a library that acts as an abstraction layer on top of many emergent AI, ML, and other data science tools. The goal is to abstract away complexity and difficult concepts to make building AI/ML-supercharged applications approachable for traditional software engineers.

Installation

Install the gem and add it to the application's Gemfile by executing:

$ bundle add langchainrb

If bundler is not being used to manage dependencies, install the gem by executing:

$ gem install langchainrb

Usage

require "langchain"

Supported vector search databases and features:

Database | Querying | Storage | Schema Management | Backups | Rails Integration
Chroma   | ✅       | ✅      | ✅                | WIP     | WIP
Milvus   | ✅       | ✅      | ✅                | WIP     | WIP
Pinecone | ✅       | ✅      | ✅                | WIP     | WIP
Qdrant   | ✅       | ✅      | ✅                | WIP     | WIP
Weaviate | ✅       | ✅      | ✅                | WIP     | WIP
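
For reference, the client gems are collected below from the per-database gem requirements mentioned in the section that follows. This grouping is just a convenience sketch; you only need the client gem for the database you actually use.

# Gemfile (sketch): add only the client gem for your chosen vector search database.
# Version constraints are the ones referenced elsewhere in this README.
gem "weaviate-ruby", "~> 0.8.0" # Weaviate
gem "milvus", "~> 0.9.0"        # Milvus
gem "qdrant-ruby", "~> 0.9.0"   # Qdrant
gem "pinecone", "~> 0.1.6"      # Pinecone
gem "chroma-db", "~> 0.3.0"     # Chroma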

Using Vector Search Databases 🔍

Choose the LLM provider you'll be using (OpenAI or Cohere) and retrieve the API key.

Add gem "weaviate-ruby", "~> 0.8.0" to your Gemfile.

Pick the vector search database you'll be using and instantiate the client:

client = Vectorsearch::Weaviate.new(
    url: ENV["WEAVIATE_URL"],
    api_key: ENV["WEAVIATE_API_KEY"],
    llm: :openai, # or :cohere
    llm_api_key: ENV["OPENAI_API_KEY"]
)

# You can instantiate any other supported vector search database:
client = Vectorsearch::Milvus.new(...) # `gem "milvus", "~> 0.9.0"`
client = Vectorsearch::Qdrant.new(...) # `gem "qdrant-ruby", "~> 0.9.0"`
client = Vectorsearch::Pinecone.new(...) # `gem "pinecone", "~> 0.1.6"`
client = Vectorsearch::Chroma.new(...) # `gem "chroma-db", "~> 0.3.0"`
# Creating the default schema
client.create_default_schema
# Store plain texts in your vector search database
client.add_texts(
    texts: [
        "Begin by preheating your oven to 375ยฐF (190ยฐC). Prepare four boneless, skinless chicken breasts by cutting a pocket into the side of each breast, being careful not to cut all the way through. Season the chicken with salt and pepper to taste. In a large skillet, melt 2 tablespoons of unsalted butter over medium heat. Add 1 small diced onion and 2 minced garlic cloves, and cook until softened, about 3-4 minutes. Add 8 ounces of fresh spinach and cook until wilted, about 3 minutes. Remove the skillet from heat and let the mixture cool slightly.",
        "In a bowl, combine the spinach mixture with 4 ounces of softened cream cheese, 1/4 cup of grated Parmesan cheese, 1/4 cup of shredded mozzarella cheese, and 1/4 teaspoon of red pepper flakes. Mix until well combined. Stuff each chicken breast pocket with an equal amount of the spinach mixture. Seal the pocket with a toothpick if necessary. In the same skillet, heat 1 tablespoon of olive oil over medium-high heat. Add the stuffed chicken breasts and sear on each side for 3-4 minutes, or until golden brown."
    ]
)
# Store the contents of your files in your vector search database
my_pdf = Langchain.root.join("path/to/my.pdf")
my_text = Langchain.root.join("path/to/my.txt")
my_docx = Langchain.root.join("path/to/my.docx")

client.add_data(paths: [my_pdf, my_text, my_docx])
# Retrieve similar documents based on the query string passed in
client.similarity_search(
    query: "how do I prepare the stuffed chicken breasts?", # example query string
    k: 4 # number of results to be retrieved
)
# Retrieve similar documents based on the embedding passed in
client.similarity_search_by_vector(
    embedding: embedding, # a previously generated embedding vector
    k: 4 # number of results to be retrieved
)
# Q&A-style querying based on the question passed in
client.ask(
    question: "How long should the stuffed chicken breasts sear on each side?"
)

Using Standalone LLMs 🗣️

Add gem "ruby-openai", "~> 4.0.0" to your Gemfile.

OpenAI

openai = LLM::OpenAI.new(api_key: ENV["OPENAI_API_KEY"])
openai.embed(text: "foo bar")
openai.complete(prompt: "What is the meaning of life?")

Cohere

Add gem "cohere-ruby", "~> 0.9.3" to your Gemfile.

cohere = LLM::Cohere.new(api_key: ENV["COHERE_API_KEY"])
cohere.embed(text: "foo bar")
cohere.complete(prompt: "What is the meaning of life?")

HuggingFace

Add gem "hugging-face", "~> 0.3.2" to your Gemfile.

hugging_face = LLM::HuggingFace.new(api_key: ENV["HUGGING_FACE_API_KEY"])

Replicate

Add gem "replicate-ruby", git: "https://github.com/andreibondarev/replicate-ruby.git", branch: "faraday-1.x" to your Gemfile.

replicate = LLM::Replicate.new(api_key: ENV["REPLICATE_API_KEY"])
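
Only instantiation is shown for the HuggingFace and Replicate clients. Assuming they follow the same adapter interface as LLM::OpenAI and LLM::Cohere above (an assumption — check each class for the methods it actually implements), embedding would look like this:

# Assumption: these adapters expose the same embed(text:) interface as LLM::OpenAI / LLM::Cohere.
hugging_face.embed(text: "foo bar")
replicate.embed(text: "foo bar")
# complete(prompt: ...) may not be implemented for every adapter; verify before relying on it.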

Using Prompts 📋

Prompt Templates

Create a prompt with one input variable:

prompt = Prompt::PromptTemplate.new(template: "Tell me a {adjective} joke.", input_variables: ["adjective"])
prompt.format(adjective: "funny") # "Tell me a funny joke."

Create a prompt with multiple input variables:

prompt = Prompt::PromptTemplate.new(template: "Tell me a {adjective} joke about {content}.", input_variables: ["adjective", "content"])
prompt.format(adjective: "funny", content: "chickens") # "Tell me a funny joke about chickens."

Creating a PromptTemplate using just a prompt and no input_variables:

prompt = Prompt::PromptTemplate.from_template("Tell me a {adjective} joke about {content}.")
prompt.input_variables # ["adjective", "content"]
prompt.format(adjective: "funny", content: "chickens") # "Tell me a funny joke about chickens."

Save prompt template to JSON file:

prompt.save(file_path: "spec/fixtures/prompt/prompt_template.json")

Loading a new prompt template using a JSON file:

prompt = Prompt.load_from_path(file_path: "spec/fixtures/prompt/prompt_template.json")
prompt.input_variables # ["adjective", "content"]

Few Shot Prompt Templates

Create a prompt with a few shot examples:

prompt = Prompt::FewShotPromptTemplate.new(
  prefix: "Write antonyms for the following words.",
  suffix: "Input: {adjective}\nOutput:",
  example_prompt: Prompt::PromptTemplate.new(
    input_variables: ["input", "output"],
    template: "Input: {input}\nOutput: {output}"
  ),
  examples: [
    { "input": "happy", "output": "sad" },
    { "input": "tall", "output": "short" }
  ],
   input_variables: ["adjective"]
)

prompt.format(adjective: "good")

# Write antonyms for the following words.
#
# Input: happy
# Output: sad
#
# Input: tall
# Output: short
#
# Input: good
# Output:

Save prompt template to JSON file:

prompt.save(file_path: "spec/fixtures/prompt/few_shot_prompt_template.json")

Loading a new prompt template using a JSON file:

prompt = Prompt.load_from_path(file_path: "spec/fixtures/prompt/few_shot_prompt_template.json")
prompt.prefix # "Write antonyms for the following words."

Using Agents 🤖

Agents are semi-autonomous bots that can respond to user questions and use the Tools available to them to provide informed replies. They break problems down into a series of steps and define Actions (and Action Inputs) along the way, which are executed and fed back to them as additional information. Once an Agent decides that it has the Final Answer, it responds with it.

Chain-of-Thought Agent

Add gem "openai-ruby", gem "eqn", and gem "google_search_results" to your Gemfile

agent = Agent::ChainOfThoughtAgent.new(llm: :openai, llm_api_key: ENV["OPENAI_API_KEY"], tools: ['search', 'calculator'])

agent.tools
# => ["search", "calculator"]
agent.run(question: "How many full soccer fields would be needed to cover the distance between NYC and DC in a straight line?")
#=> "Approximately 2,945 soccer fields would be needed to cover the distance between NYC and DC in a straight line."

Demo

(Screen-recording demos: May-12-2023 13-09-13 and May-12-2023 13-07-45.)

Available Tools 🛠️

Name        | Description                                         | ENV Requirements                                          | Gem Requirements
"calculator" | Useful for getting the result of a math expression |                                                           | gem "eqn", "~> 1.6.5"
"search"     | A wrapper around Google Search                      | ENV["SERPAPI_API_KEY"] (https://serpapi.com/manage-api-key) | gem "google_search_results", "~> 2.0.0"
"wikipedia"  | Calls Wikipedia API to retrieve the summary         |                                                           | gem "wikipedia-client", "~> 1.17.0"

Loaders 🚚

Need to read data from various sources? Load it up.

Name | Class          | Gem Requirements
docx | Loaders::Docx  | gem "docx", branch: "master", git: "https://github.com/ruby-docx/docx.git"
pdf  | Loaders::PDF   | gem "pdf-reader", "~> 1.4"
text | Loaders::Text  |
html | Loaders::HTML  | gem "nokogiri", "~> 1.13"
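
These loaders correspond to the file types you can feed into the vector search client via client.add_data, shown earlier. A minimal sketch, assuming the relevant loader gems are installed (the file paths here are placeholders):

# Placeholder paths; requires the loader gems for the file types you pass in.
my_html = Langchain.root.join("path/to/page.html")
my_docx = Langchain.root.join("path/to/report.docx")

client.add_data(paths: [my_html, my_docx])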

Examples

Additional examples available: /examples

Logging

LangChain.rb uses standard logging mechanisms and defaults to the :debug level. Most messages are at the :info level, but we will add :debug or :warn statements as needed. To show all log messages:

Langchain.logger.level = :info

Development

  1. git clone https://github.com/andreibondarev/langchainrb.git
  2. cp .env.example .env, then fill out the environment variables in .env
  3. rspec spec/ to ensure that the tests pass
  4. bin/console to load the gem in a REPL session. Feel free to add your own instances of LLMs, Tools, Agents, etc. and experiment with them (see the sketch below for an example session).
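
As an example of the kind of experimentation step 4 suggests, here is a minimal sketch using only the APIs documented above (the prompt text is just an illustration):

# Inside bin/console, with OPENAI_API_KEY set in your .env:
openai = LLM::OpenAI.new(api_key: ENV["OPENAI_API_KEY"])
openai.complete(prompt: "Summarize what LangChain.rb does in one sentence.")

prompt = Prompt::PromptTemplate.from_template("Tell me a {adjective} joke.")
prompt.format(adjective: "short") # => "Tell me a short joke."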

Core Contributors

Andrei Bondarev

Honorary Contributors

Andrei Bondarev, Rafael Figueiredo, Ricky Chilcott

(Criteria for becoming an Honorary Contributor or Core Contributor are pending...)

Contributing

Bug reports and pull requests are welcome on GitHub at https://github.com/andreibondarev/langchain.

License

The gem is available as open source under the terms of the MIT License.
