
⛓️ Langflow is a UI for LangChain, designed with react-flow to provide an effortless way to experiment and prototype flows.

Home Page: http://www.logspace.ai

License: MIT License


⛓️ Langflow

~ An effortless way to experiment and prototype LangChain pipelines ~


📦 Installation

Locally

You can install Langflow with pip:

# This installs the package without dependencies for local models
pip install langflow

To use local models (e.g., llama-cpp-python), run:

pip install langflow[local]

This installs the additional dependencies needed to run local models.

You can still use models from projects like LocalAI.

Next, run:

python -m langflow

or

langflow # or langflow --help

HuggingFace Spaces

You can also check it out on HuggingFace Spaces and run it in your browser! You can even clone it and have your own copy of Langflow to play with.

🖥️ Command Line Interface (CLI)

Langflow provides a command-line interface (CLI) for easy management and configuration.

Usage

You can run Langflow using the following command:

langflow [OPTIONS]

Each option is detailed below; a combined example follows the list:

  • --help: Displays all available options.
  • --host: Defines the host to bind the server to. Can be set using the LANGFLOW_HOST environment variable. The default is 127.0.0.1.
  • --workers: Sets the number of worker processes. Can be set using the LANGFLOW_WORKERS environment variable. The default is 1.
  • --timeout: Sets the worker timeout in seconds. The default is 60.
  • --port: Sets the port to listen on. Can be set using the LANGFLOW_PORT environment variable. The default is 7860.
  • --config: Defines the path to the configuration file. The default is config.yaml.
  • --env-file: Specifies the path to the .env file containing environment variables. The default is .env.
  • --log-level: Defines the logging level. Can be set using the LANGFLOW_LOG_LEVEL environment variable. The default is critical.
  • --components-path: Specifies the path to the directory containing custom components. Can be set using the LANGFLOW_COMPONENTS_PATH environment variable. The default is langflow/components.
  • --log-file: Specifies the path to the log file. Can be set using the LANGFLOW_LOG_FILE environment variable. The default is logs/langflow.log.
  • --cache: Selects the type of cache to use. Options are InMemoryCache and SQLiteCache. Can be set using the LANGFLOW_LANGCHAIN_CACHE environment variable. The default is SQLiteCache.
  • --jcloud/--no-jcloud: Toggles the option to deploy on Jina AI Cloud. The default is no-jcloud.
  • --dev/--no-dev: Toggles the development mode. The default is no-dev.
  • --path: Specifies the path to the frontend directory containing build files. This option is for development purposes only. Can be set using the LANGFLOW_FRONTEND_PATH environment variable.
  • --open-browser/--no-open-browser: Toggles the option to open the browser after starting the server. Can be set using the LANGFLOW_OPEN_BROWSER environment variable. The default is open-browser.
  • --remove-api-keys/--no-remove-api-keys: Toggles the option to remove API keys from the projects saved in the database. Can be set using the LANGFLOW_REMOVE_API_KEYS environment variable. The default is no-remove-api-keys.
  • --install-completion [bash|zsh|fish|powershell|pwsh]: Installs completion for the specified shell.
  • --show-completion [bash|zsh|fish|powershell|pwsh]: Shows completion for the specified shell, allowing you to copy it or customize the installation.
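
As an illustration, several of these options can be combined in a single invocation. The flag names below come from the list above; the values are only examples, not recommended defaults:

# Serve on all interfaces with two workers, info-level logging, and no browser auto-open
langflow --host 0.0.0.0 --port 7860 --workers 2 --log-level info --no-open-browser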

Environment Variables

You can configure many of the CLI options using environment variables. These can be exported in your operating system or added to a .env file and loaded using the --env-file option.

A sample .env file named .env.example is included with the project. Copy this file to a new file named .env and replace the example values with your actual settings. If you're setting values in both your OS and the .env file, the .env settings will take precedence.
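
For example, a minimal .env using the variable names listed above might look like this (the values are placeholders, not recommendations):

# Sample .env (variable names are taken from the CLI options above)
LANGFLOW_HOST=127.0.0.1
LANGFLOW_PORT=7860
LANGFLOW_WORKERS=1
LANGFLOW_LOG_LEVEL=info
LANGFLOW_LANGCHAIN_CACHE=SQLiteCache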

Deployment

Deploy Langflow on Google Cloud Platform

Follow our step-by-step guide to deploy Langflow on Google Cloud Platform (GCP) using Google Cloud Shell. The guide is available in the Langflow in Google Cloud Platform document.

Alternatively, click the "Open in Cloud Shell" button below to launch Google Cloud Shell, clone the Langflow repository, and start an interactive tutorial that will guide you through the process of setting up the necessary resources and deploying Langflow on your GCP project.

Open in Cloud Shell

Deploy Langflow on Jina AI Cloud

Langflow integrates with langchain-serve to provide a one-command deployment to Jina AI Cloud.

Start by installing langchain-serve:

pip install langflow[deploy]
# or
pip install -U langchain-serve

Then, run:

langflow --jcloud
🎉 Langflow server successfully deployed on Jina AI Cloud 🎉
🔗 Click on the link to open the server (please allow ~1-2 minutes for the server to startup): https://<your-app>.wolf.jina.ai/
📖 Read more about managing the server: https://github.com/jina-ai/langchain-serve
Complete (example) output:
  🚀 Deploying Langflow server on Jina AI Cloud
  ╭───────────────────────── 🎉 Flow is available! ──────────────────────────╮
  │                                                                          │
  │   ID                    langflow-e3dd8820ec                              │
  │   Gateway (Websocket)   wss://langflow-e3dd8820ec.wolf.jina.ai           │
  │   Dashboard             https://dashboard.wolf.jina.ai/flow/e3dd8820ec   │
  │                                                                          │
  ╰──────────────────────────────────────────────────────────────────────────╯
  ╭──────────────┬──────────────────────────────────────────────────────────────────────────────╮
  │ App ID       │                     langflow-e3dd8820ec                                      │
  ├──────────────┼──────────────────────────────────────────────────────────────────────────────┤
  │ Phase        │                            Serving                                           │
  ├──────────────┼──────────────────────────────────────────────────────────────────────────────┤
  │ Endpoint     │          wss://langflow-e3dd8820ec.wolf.jina.ai                              │
  ├──────────────┼──────────────────────────────────────────────────────────────────────────────┤
  │ App logs     │                  dashboards.wolf.jina.ai                                     │
  ├──────────────┼──────────────────────────────────────────────────────────────────────────────┤
  │ Swagger UI   │          https://langflow-e3dd8820ec.wolf.jina.ai/docs                       │
  ├──────────────┼──────────────────────────────────────────────────────────────────────────────┤
  │ OpenAPI JSON │        https://langflow-e3dd8820ec.wolf.jina.ai/openapi.json                 │
  ╰──────────────┴──────────────────────────────────────────────────────────────────────────────╯

  🎉 Langflow server successfully deployed on Jina AI Cloud 🎉
  🔗 Click on the link to open the server (please allow ~1-2 minutes for the server to startup): https://langflow-e3dd8820ec.wolf.jina.ai/
  📖 Read more about managing the server: https://github.com/jina-ai/langchain-serve

API Usage

You can use Langflow directly in your browser, or use the API endpoints on Jina AI Cloud to interact with the server.

Example API usage (with Python):
import requests

BASE_API_URL = "https://langflow-e3dd8820ec.wolf.jina.ai/api/v1/predict"
FLOW_ID = "864c4f98-2e59-468b-8e13-79cd8da07468"
# You can tweak the flow by adding a tweaks dictionary
# e.g {"OpenAI-XXXXX": {"model_name": "gpt-4"}}
TWEAKS = {
"ChatOpenAI-g4jEr": {},
"ConversationChain-UidfJ": {}
}

def run_flow(message: str, flow_id: str, tweaks: dict = None) -> dict:
  """
  Run a flow with a given message and optional tweaks.

  :param message: The message to send to the flow
  :param flow_id: The ID of the flow to run
  :param tweaks: Optional tweaks to customize the flow
  :return: The JSON response from the flow
  """
  api_url = f"{BASE_API_URL}/{flow_id}"

  payload = {"message": message}

  if tweaks:
      payload["tweaks"] = tweaks

  response = requests.post(api_url, json=payload)
  return response.json()

# Set up any tweaks you want to apply to the flow
print(run_flow("Your message", flow_id=FLOW_ID, tweaks=TWEAKS))
{
  "result": "Great choice! Bangalore in the 1920s was a vibrant city with a rich cultural and political scene. Here are some suggestions for things to see and do:\n\n1. Visit the Bangalore Palace - built in 1887, this stunning palace is a perfect example of Tudor-style architecture. It was home to the Maharaja of Mysore and is now open to the public.\n\n2. Attend a performance at the Ravindra Kalakshetra - this cultural center was built in the 1920s and is still a popular venue for music and dance performances.\n\n3. Explore the neighborhoods of Basavanagudi and Malleswaram - both of these areas have retained much of their old-world charm and are great places to walk around and soak up the atmosphere.\n\n4. Check out the Bangalore Club - founded in 1868, this exclusive social club was a favorite haunt of the British expat community in the 1920s.\n\n5. Attend a meeting of the Indian National Congress - founded in 1885, the INC was a major force in the Indian independence movement and held many meetings and rallies in Bangalore in the 1920s.\n\nHope you enjoy your trip to 1920s Bangalore!"
}
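
The same endpoint can also be called from a shell. A minimal sketch with curl, assuming the URL and payload shape used in the Python example above:

curl -X POST "https://langflow-e3dd8820ec.wolf.jina.ai/api/v1/predict/864c4f98-2e59-468b-8e13-79cd8da07468" \
  -H "Content-Type: application/json" \
  -d '{"message": "Your message"}'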

Read more about resource customization, cost, and management of Langflow apps on Jina AI Cloud in the langchain-serve repository.

Deploy on Railway

Deploy on Railway

Deploy on Render

Deploy to Render

🎨 Creating Flows

Creating flows with Langflow is easy. Simply drag sidebar components onto the canvas and connect them together to create your pipeline. Langflow provides a range of LangChain components to choose from, including LLMs, prompt serializers, agents, and chains.

Explore by editing prompt parameters, linking chains and agents, tracking an agent's thought process, and exporting your flow.

Once you're done, you can export your flow as a JSON file to use with LangChain. To do so, click the "Export" button in the top-right corner of the canvas. Then, in Python, you can load the flow with:

from langflow import load_flow_from_json

flow = load_flow_from_json("path/to/flow.json")
# Now you can use it like any chain
flow("Hey, have you heard of Langflow?")

👋 Contributing

We welcome contributions from developers of all levels to our open-source project on GitHub. If you'd like to contribute, please check our contributing guidelines and help make Langflow more accessible.

Join our Discord server to ask questions, make suggestions and showcase your projects! 🦾

Star History Chart

📄 License

Langflow is released under the MIT License. See the LICENSE file for details.

