langfuse-python's Introduction

Langfuse: Open Source LLM Observability & Engineering Platform

Debug and improve your LLM app

LLM Observability, Prompt Management, LLM Evaluations, Datasets, LLM Metrics and Prompt Playground

Langfuse uses GitHub Discussions for support and feature requests.
We're hiring. Join us in Backend Engineering, Product Engineering, and Developer Relations.

Langfuse Overview

Video: Langfuse overview (3 min, langfuse-overview-3min.mp4); unmute for voice-over.

Develop

Monitor

Test

  • Experiments: Track and test app behaviour before deploying a new version

Get started

Langfuse Cloud

Managed deployment by the Langfuse team, generous free tier (Hobby plan), no credit card required.

» Langfuse Cloud

Self-Hosting Open Source LLM Observability with Langfuse

Localhost (docker)

# Clone repository
git clone https://github.com/langfuse/langfuse.git
cd langfuse

# Run server and database
docker compose up -d

→ Learn more about deploying locally

Self-host (docker)

Langfuse is simple to self-host and keep updated. It currently requires only a single Docker container. → Self Hosting Instructions

Templated deployments: Railway, GCP Cloud Run, AWS Fargate, Kubernetes and others

Get Started

API Keys

You need a Langfuse public and secret key to get started. Sign up here and find them in your project settings.

Ingesting Data · Instrumenting Your Application · LLM Observability with Langfuse

Note: We recommend using our fully async, typed SDKs that allow you to instrument any LLM application with any underlying model. They are available in Python (Decorators) & JS/TS. The SDKs will always be the most fully featured and stable way to ingest data into Langfuse.

You may want to use another integration to get started quickly or to implement a use case that we do not yet support. However, we recommend migrating to the Langfuse SDKs over time to ensure performance and stability.

See the → Quickstart to integrate Langfuse.

LLM Observability Integrations

| Integration | Supports | Description |
| --- | --- | --- |
| SDK | Python, JS/TS | Manual instrumentation using the SDKs for full flexibility. |
| OpenAI | Python, JS/TS | Automated instrumentation using a drop-in replacement of the OpenAI SDK. |
| Langchain | Python, JS/TS | Automated instrumentation by passing a callback handler to the Langchain application. |
| LlamaIndex | Python | Automated instrumentation via the LlamaIndex callback system. |
| Haystack | Python | Automated instrumentation via the Haystack content tracing system. |
| LiteLLM | Python, JS/TS (proxy only) | Use any LLM as a drop-in replacement for GPT. Use Azure, OpenAI, Cohere, Anthropic, Ollama, VLLM, Sagemaker, HuggingFace, Replicate (100+ LLMs). |
| API | | Directly call the public API. OpenAPI spec available. |

Packages that integrate with Langfuse:

| Name | Description |
| --- | --- |
| Instructor | Library to get structured LLM outputs (JSON, Pydantic). |
| Mirascope | Python toolkit for building LLM applications. |
| AI SDK by Vercel | TypeScript SDK that makes streaming LLM outputs super easy. |
| Flowise | JS/TS no-code builder for customized LLM flows. |
| Langflow | Python-based UI for LangChain, designed with react-flow to provide an effortless way to experiment and prototype flows. |

Questions and feedback

Ideas and roadmap

Support and feedback

In order of preference, the best ways to communicate with us:

Contributing to Langfuse

  • Vote on Ideas
  • Raise and comment on Issues
  • Open a PR - see CONTRIBUTING.md for details on how to set up a development environment.

License

This repository is MIT licensed, except for the ee folders. See LICENSE and docs for more details.

Misc

GET API to export your data

GET routes to use data in downstream applications (e.g. embedded analytics). You can also access them conveniently via the SDKs (docs).

Security & Privacy

We take data security and privacy seriously. Please refer to our Security and Privacy page for more information.

Telemetry

By default, Langfuse automatically reports basic usage statistics of self-hosted instances to a centralized server (PostHog).

This helps us to:

  1. Understand how Langfuse is used and improve the most relevant features.
  2. Track overall usage for internal and external (e.g. fundraising) reporting.

None of the data is shared with third parties, and it does not include any sensitive information. We want to be fully transparent about this, and you can find the exact data we collect here.

You can opt-out by setting TELEMETRY_ENABLED=false.

langfuse-python's People

Contributors

bell-steven, brandonkzw, christho23, davidlms, dependabot[bot], dev-khant, hassiebp, hubert-springbok, hugomichard, jan-kubica, kobrinartem, marcklingen, marliessophie, maxdeichmann, noble-varghese, richardkruemmel, rohan-mehta, rubms, samyxdev, singhcoder, yigitbey


langfuse-python's Issues

Host on CallbackHandler should be optional

from langfuse.callback import CallbackHandler
handler = CallbackHandler(PUBLIC_KEY, SECRET_KEY)

Currently, host=None must be passed manually because the positional argument is required.
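A possible shape of the fix is sketched below. The class body, default URL, and environment-variable fallback are assumptions for illustration, not the actual SDK code: host becomes an optional keyword argument that falls back to an environment variable and then to a default.

```python
import os
from typing import Optional

# Hypothetical sketch of the requested change, NOT the actual SDK code:
# `host` becomes optional, falling back to LANGFUSE_HOST, then to a default,
# so callers no longer have to pass host=None explicitly.
class CallbackHandler:
    def __init__(self, public_key: str, secret_key: str, host: Optional[str] = None):
        self.public_key = public_key
        self.secret_key = secret_key
        self.host = host or os.environ.get("LANGFUSE_HOST", "https://cloud.langfuse.com")

handler = CallbackHandler("pk-...", "sk-...")  # no host argument needed
```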

Python SDK not compatible with Pydantic 2.0

poetry add langfuse           
The currently activated Python version 3.10.10 is not supported by the project (^3.11).
Trying to find and use a compatible version. 
Using python3 (3.11.0)
Using version ^1.1.2 for langfuse

Updating dependencies
Resolving dependencies... (0.7s)

Because no versions of pydantic-settings match >2.0.2,<2.0.3 || >2.0.3,<3.0.0
 and pydantic-settings (2.0.2) depends on pydantic (>=2.0.1), pydantic-settings (>=2.0.2,<2.0.3 || >2.0.3,<3.0.0) requires pydantic (>=2.0.1).
And because pydantic-settings (2.0.3) depends on pydantic (>=2.0.1), pydantic-settings (>=2.0.2,<3.0.0) requires pydantic (>=2.0.1).
Because no versions of langfuse match >1.1.2,<2.0.0
 and langfuse (1.1.2) depends on pydantic (>=1.10.7,<2.0), langfuse (>=1.1.2,<2.0.0) requires pydantic (>=1.10.7,<2.0).
Thus, langfuse (>=1.1.2,<2.0.0) is incompatible with pydantic-settings (>=2.0.2,<3.0.0).
So, because my-llm-app depends on both pydantic-settings (^2.0.2) and langfuse (^1.1.2), version solving failed.

[Langchain Integration] Support HuggingFaceHub as LLM

A user got the following error when using the Langchain integration with HuggingFaceHub for LLMs.

ERROR:root:'model_name'
ERROR:root:run not found

Steps

  • Investigate the issue to find the root cause
  • Implement a fix
  • Add additional tests to the Langchain test suite

The user's implementation


# Imports assumed from context; the original snippet omitted them
import os

from dotenv import load_dotenv
from langchain import LLMChain, PromptTemplate
from langchain.llms import HuggingFaceHub
from langfuse.callback import CallbackHandler


def initialize_huggingface_llm(prompt: PromptTemplate, temperature: float, max_length: int) -> LLMChain:
    repo_id = "google/flan-t5-xxl"

    # Experiment with the max_length parameter and temperature
    llm = HuggingFaceHub(
        repo_id=repo_id, model_kwargs={"temperature": temperature, "max_length": max_length}
    )
    return LLMChain(prompt=prompt, llm=llm)

def generate_prompt() -> PromptTemplate:
    # You can play around with the prompt, see the results change if you make small changes to the prompt
    template = """Given the name of the country, give the languages that are spoken in that country. 
    Start with the official languages of the country and continue with the other languages of that country.
    Country: {country}?
    Languages: 
    """

    return PromptTemplate(template=template, input_variables=["country"])


if __name__ == '__main__':
    load_dotenv()

    handler = CallbackHandler(os.getenv('LANGFUSE_PUBLIC_KEY'),
                              os.getenv('LANGFUSE_SECRET_KEY'),
                              os.getenv('LANGFUSE_HOST'))

    # Try other values to see impact on results
    country = "belgium"
    country_max_length = 100
    country_temperature = 0.1

    country_prompt = generate_prompt()

    hugging_chain = initialize_huggingface_llm(prompt=country_prompt,
                                               temperature=country_temperature,
                                               max_length=country_max_length)
    
    print("HuggingFace")
    print(hugging_chain.run(country, callbacks=[handler]))

Testing with local Langfuse server

Context
Currently, tests depend on an instance of a Langfuse server and require the HOST, LF_PK, and LF_SK environment secrets.

Goal
Tests of the SDK should not require any secrets, so that they can safely be run on forks of this project (except for E2E tests that use external APIs, e.g. OpenAI, Hugging Face Hub).

Potential solution
Run a dockerized Langfuse (langfuse/langfuse) in CI, according to the instructions here: https://langfuse.com/docs/deployment/local

Blocked by
Need to add a DB seeder to langfuse/langfuse to create a default user, project, public key, and secret key.

Add end() method to spans and generations

Users want to be able to quickly end spans and generations.

generation = langfuse.generation(...)
generation.end()

  • For StatefulGenerationClient and StatefulSpanClient, add end() methods that send endTime to the Langfuse server. Under the hood, they can use the existing update functions.
  • Create test coverage for both functions.
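A minimal sketch of how end() could be layered on the existing update mechanism. The class body below is a local stub for illustration, not the real client:

```python
from datetime import datetime, timezone

# Stub illustrating the proposed API; the real StatefulSpanClient sends
# events to the Langfuse server instead of storing them locally.
class StatefulSpanClient:
    def __init__(self):
        self.state = {}

    def update(self, **kwargs):
        self.state.update(kwargs)  # real SDK: enqueue an update event
        return self

    def end(self, **kwargs):
        # end() is just update() with endTime defaulted to "now"
        kwargs.setdefault("end_time", datetime.now(timezone.utc))
        return self.update(**kwargs)

span = StatefulSpanClient()
span.end()
```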

bug: `meta.update()` fails when `meta = None`

This was incorrectly filed against the langfuse repo first (see langfuse/langfuse#1775)

Describe the bug

I believe commit da264a9 from last week introduced a bug at https://github.com/langfuse/langfuse-python/blame/main/langfuse/callback/langchain.py#L722 by allowing None to be returned (edit: it seems the version before the commit also allowed returning None, so it is interesting that I only see this now). This then fails at https://github.com/langfuse/langfuse-python/blame/main/langfuse/callback/langchain.py#L445, because None does not have an update() method.

To reproduce

Run a callback with no metadata or tags and you will hit an error:

File "/..../lib/python3.10/site-packages/langfuse/callback/langchain.py", line 443, in on_tool_start
    meta.update(
    └ None
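A defensive fix along these lines would avoid calling .update() on None. The helper below (merge_metadata is a hypothetical name, not the SDK's actual function) sketches the idea:

```python
# Hypothetical helper sketching the fix: normalize metadata to a dict
# before calling .update(), so metadata=None no longer raises AttributeError.
def merge_metadata(metadata, tags):
    meta = dict(metadata or {})  # guard: metadata may be None
    if tags:
        meta.update({"tags": tags})
    return meta
```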

Additional information

No response

Missing `backoff` dependency

backoff is missing when installing langfuse in a fresh environment.

sh-3.2$ python3 -m venv env
sh-3.2$ source env/bin/activate
(env) sh-3.2$ python3 -V
Python 3.11.4
(env) sh-3.2$ python3 -m pip install langfuse
Collecting langfuse
  Using cached langfuse-1.0.18-py3-none-any.whl (45 kB)
Collecting ... (skipping a few lines here)
Installing collected packages: pytz, urllib3, typing-extensions, tenacity, sniffio, six, PyYAML, packaging, numpy, mypy-extensions, multidict, idna, h11, frozenlist, charset-normalizer, certifi, attrs, async-timeout, yarl, typing-inspect, SQLAlchemy, requests, python-dateutil, pydantic, numexpr, marshmallow, anyio, aiosignal, langsmith, httpcore, dataclasses-json, aiohttp, langchain, httpx, langfuse
Successfully installed PyYAML-6.0.1 SQLAlchemy-2.0.20 aiohttp-3.8.5 aiosignal-1.3.1 anyio-4.0.0 async-timeout-4.0.3 attrs-23.1.0 certifi-2023.7.22 charset-normalizer-3.2.0 dataclasses-json-0.5.14 frozenlist-1.4.0 h11-0.14.0 httpcore-0.17.3 httpx-0.24.1 idna-3.4 langchain-0.0.286 langfuse-1.0.18 langsmith-0.0.35 marshmallow-3.20.1 multidict-6.0.4 mypy-extensions-1.0.0 numexpr-2.8.6 numpy-1.25.2 packaging-23.1 pydantic-1.10.12 python-dateutil-2.8.2 pytz-2023.3.post1 requests-2.31.0 six-1.16.0 sniffio-1.3.0 tenacity-8.2.3 typing-extensions-4.7.1 typing-inspect-0.9.0 urllib3-2.0.4 yarl-1.9.2
(env) sh-3.2$ python3 -c "from langfuse.model import CreateTrace"
Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "/Users/alex/env/lib/python3.11/site-packages/langfuse/__init__.py", line 1, in <module>
    from .client import Langfuse
  File "/Users/alex/env/lib/python3.11/site-packages/langfuse/client.py", line 28, in <module>
    from langfuse.task_manager import TaskManager
  File "/Users/alex/env/lib/python3.11/site-packages/langfuse/task_manager.py", line 8, in <module>
    import backoff
ModuleNotFoundError: No module named 'backoff'
(env) sh-3.2$ python3 -m pip install backoff
Collecting backoff
  Using cached backoff-2.2.1-py3-none-any.whl (15 kB)
Installing collected packages: backoff
Successfully installed backoff-2.2.1
(env) sh-3.2$ python3 -c "from langfuse.model import CreateTrace"
(env) sh-3.2$

Amazon Bedrock and Anthropic Claude issue with identifying model_name

How can I set the model name so that Langfuse can pick it up? I get these errors:

'model_name'
Traceback (most recent call last):
File "/opt/conda/lib/python3.10/site-packages/langfuse/callback.py", line 479, in __on_llm_action
model_name = kwargs["invocation_params"]["model_name"]
KeyError: 'model_name'
run not found
Traceback (most recent call last):
File "/opt/conda/lib/python3.10/site-packages/langfuse/callback.py", line 541, in on_llm_end
raise Exception("run not found")
Exception: run not found
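One way to make the callback tolerant is sketched below. The function name and the fallback keys are assumptions about what Bedrock/Claude invocation params might contain, not verified SDK behavior:

```python
# Hypothetical sketch: look for the model under several plausible keys
# instead of raising KeyError when "model_name" is absent, as happens
# with Bedrock/Claude invocation params.
def extract_model_name(kwargs):
    params = kwargs.get("invocation_params", {})
    for key in ("model_name", "model", "model_id"):
        if key in params:
            return params[key]
    return None  # unknown model; don't crash the handler
```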

Failure to format api errors

I'm having trouble trying out langfuse due to some error formatting bad status codes in this block:

if 200 <= _response.status_code < 300:
    return pydantic.parse_obj_as(Trace, _response.json())  # type: ignore
if _response.status_code == 400:
    raise Error(pydantic.parse_obj_as(str, _response.json()))  # type: ignore
if _response.status_code == 401:
    raise UnauthorizedError(pydantic.parse_obj_as(str, _response.json()))  # type: ignore
if _response.status_code == 403:
    raise AccessDeniedError(pydantic.parse_obj_as(str, _response.json()))  # type: ignore
if _response.status_code == 405:
    raise MethodNotAllowedError(pydantic.parse_obj_as(str, _response.json()))  # type: ignore
if _response.status_code == 404:
    raise NotFoundError(pydantic.parse_obj_as(str, _response.json()))  # type: ignore

  1. httpx.Response.json() surprisingly returns a dict object, not a str (meanwhile, it is typed as Any):

https://github.com/encode/httpx/blob/053bc57c3799801ff11273dd393cb0715e63ecf9/httpx/_models.py#L756

  2. This causes pydantic.parse_obj_as(str, _) to fail:

pydantic.error_wrappers.ValidationError: 1 validation error for ParsingModel[str]
__root__
  str type expected (type=type_error.str)

This pattern is used a lot throughout the Python SDK; I found 110 instances of "parse_obj_as(str".
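A tolerant error-message extractor could look like this. It is a sketch of one possible fix, not the generated client's actual code, and the dict key names are assumptions about common API error shapes:

```python
# Sketch: accept both str and dict JSON error bodies instead of forcing
# pydantic.parse_obj_as(str, ...) onto a dict.
def error_message(body):
    if isinstance(body, str):
        return body
    if isinstance(body, dict):
        # assumed common shapes: {"message": ...} or {"error": ...}
        return str(body.get("message") or body.get("error") or body)
    return str(body)
```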

Replace dependency on langchain with langchain-core

langchain has been split into several slim packages, including langchain-core and langchain-<aws, ...>. Users, especially those using LCEL, don't need to install the whole langchain package in their working environment.

However, the Langfuse CallbackHandler still takes a hard dependency on langchain instead of langchain-core. See https://github.com/langfuse/langfuse-python/blob/main/langfuse/callback/langchain.py#L7

The modules there could be imported from langchain_core instead.
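A generic helper illustrating the suggested migration: try the slim langchain_core module path first and fall back to the legacy langchain path, so users who only installed langchain-core are not forced to pull in the full package. The helper name and the example module paths are assumptions for illustration:

```python
import importlib

# Return the first importable module from a list of candidates, or None.
# This lets a callback prefer langchain_core and gracefully fall back.
def first_importable(*module_names):
    for name in module_names:
        try:
            return importlib.import_module(name)
        except ImportError:
            continue
    return None

# e.g. callbacks = first_importable("langchain_core.callbacks", "langchain.callbacks.base")
```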
