cometLLM

CometLLM is a tool to log and visualize your LLM prompts and chains. Use CometLLM to identify effective prompt strategies, streamline your troubleshooting, and ensure reproducible workflows!

[Screenshot: CometLLM Preview]

⚡️ Quickstart

Install the comet_llm Python library with pip:

pip install comet_llm

If you don't have one already, create your free Comet account and grab your API Key from the account settings page.

Now you are all set to log your first prompt and response:

import comet_llm

comet_llm.log_prompt(
    prompt="What is your name?",
    output=" My name is Alex.",
    api_key="<YOUR_COMET_API_KEY>",
)

🎯 Features

  • Log your prompts and responses, including the prompt template, variables, timestamps, duration, and any metadata that you need.
  • Visualize your prompts and responses in the UI.
  • Log your chain execution down to the level of granularity that you need.
  • Visualize your chain execution in the UI.
  • Diff your prompts and chain execution in the UI.

👀 Examples

To log a single LLM call as an individual prompt, use comet_llm.log_prompt. If you require more granularity, use comet_llm.start_chain to log a chain of executions that may include more than one LLM call, context retrieval, or data pre- or post-processing.

Log a full prompt and response

import comet_llm

comet_llm.log_prompt(
    prompt="Answer the question and if the question can't be answered, say \"I don't know\"\n\n---\n\nQuestion: What is your name?\nAnswer:",
    prompt_template="Answer the question and if the question can't be answered, say \"I don't know\"\n\n---\n\nQuestion: {{question}}?\nAnswer:",
    prompt_template_variables={"question": "What is your name?"},
    metadata={
        "usage.prompt_tokens": 7,
        "usage.completion_tokens": 5,
        "usage.total_tokens": 12,
    },
    output=" My name is Alex.",
    duration=16.598,
)

Read the full documentation for more details about logging a prompt.

Log an LLM chain

from comet_llm import Span, end_chain, start_chain
import datetime
from time import sleep


def retrieve_context(user_question):
    # Toy context retrieval: return the opening hours when the question mentions "open"
    if "open" in user_question:
        return "Opening hours: 08:00 to 17:00 all days"


def llm_answering(user_question, current_time, context):
    prompt_template = """You are a helpful chatbot. You have access to the following context:
    {context}
    The current time is: {current_time}
    Analyze the following user question and decide if you can answer it, if the question can't be answered, say \"I don't know\":
    {user_question}
    """

    prompt = prompt_template.format(
        user_question=user_question, current_time=current_time, context=context
    )

    with Span(
        category="llm-call",
        inputs={"prompt_template": prompt_template, "prompt": prompt},
    ) as span:
        # Call your LLM model here
        sleep(0.1)
        result = "Yes we are currently open"
        usage = {"prompt_tokens": 52, "completion_tokens": 12, "total_tokens": 64}

        span.set_outputs(outputs={"result": result}, metadata={"usage": usage})

    return result


def main(user_question, current_time):
    start_chain(inputs={"user_question": user_question, "current_time": current_time})

    with Span(
        category="context-retrieval",
        name="Retrieve Context",
        inputs={"user_question": user_question},
    ) as span:
        context = retrieve_context(user_question)

        span.set_outputs(outputs={"context": context})

    with Span(
        category="llm-reasoning",
        inputs={
            "user_question": user_question,
            "current_time": current_time,
            "context": context,
        },
    ) as span:
        result = llm_answering(user_question, current_time, context)

        span.set_outputs(outputs={"result": result})

    end_chain(outputs={"result": result})


main("Are you open?", str(datetime.datetime.now().time()))

Read the full documentation for more details about logging a chain.

⚙️ Configuration

You can configure your Comet credentials and where your data is logged:

| Name                 | Python parameter name | Environment variable name |
| -------------------- | --------------------- | ------------------------- |
| Comet API key        | api_key               | COMET_API_KEY             |
| Comet Workspace name | workspace             | COMET_WORKSPACE           |
| Comet Project name   | project               | COMET_PROJECT_NAME        |
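
For example, the same credentials can be supplied either through environment variables or as parameters on the logging call itself. Below is a minimal sketch, assuming (per the table above and the Quickstart) that log_prompt also accepts workspace and project; the workspace and project values are placeholders:

import os

import comet_llm

# Option 1: set the environment variables before logging
# (assumes comet_llm reads them when the logging call is made).
os.environ["COMET_API_KEY"] = "<YOUR_COMET_API_KEY>"
os.environ["COMET_WORKSPACE"] = "<YOUR_WORKSPACE>"      # placeholder
os.environ["COMET_PROJECT_NAME"] = "my-llm-prompts"     # placeholder

# Option 2: pass the same values as parameters on the call itself
# (api_key is shown in the Quickstart; workspace/project follow the table above).
comet_llm.log_prompt(
    prompt="What is your name?",
    output=" My name is Alex.",
    api_key="<YOUR_COMET_API_KEY>",
    workspace="<YOUR_WORKSPACE>",   # placeholder
    project="my-llm-prompts",       # placeholder
)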

📝 License

Copyright (c) Comet 2023-present. cometLLM is free and open-source software licensed under the MIT License.
