undertone0809 / promptulate

🚀Large language model automation and Autonomous Language Agents development framework. Build your LLM Agent Application in a pythonic way!

Home Page: https://www.promptulate.cn/

License: Apache License 2.0

Python 75.09% Makefile 0.98% Jupyter Notebook 23.93%
chatgpt gpt-4 langchain llm python prompt prompt-engineering pne promptulate agent

promptulate's Introduction

Promptulate

English 中文

News

  • 2024.5.19 Now you can use import pne to import the promptulate package. ref: https://github.com/Undertone0809/pne
  • 2024.5.14 OpenAI launched their newest "omni" model, offering improved speed and pricing compared to turbo.

Overview

Promptulate is an AI Agent application development framework crafted by Cogit Lab, which offers developers an extremely concise and efficient way to build Agent applications through a Pythonic development paradigm. The core philosophy of Promptulate is to borrow and integrate the wisdom of the open-source community, incorporating the highlights of various development frameworks to lower the barrier to entry and unify the consensus among developers. With Promptulate, you can manipulate components like LLM, Agent, Tool, RAG, etc., with the most succinct code, as most tasks can be easily completed with just a few lines of code. 🚀

💡 Features

  • 🐍 Pythonic Code Style: Embraces the habits of Python developers, providing a Pythonic SDK calling approach, putting everything within your grasp with just one pne.chat function to encapsulate all essential functionalities.
  • 🧠 Model Compatibility: Supports nearly all types of large models on the market and allows for easy customization to meet specific needs.
  • 🕵️‍♂️ Diverse Agents: Offers various types of Agents, such as WebAgent, ToolAgent, CodeAgent, etc., capable of planning, reasoning, and acting to handle complex problems. Atomize the Planner and other components to simplify the development process.
  • 🔗 Low-Cost Integration: Effortlessly integrates tools from different frameworks like LangChain, significantly reducing integration costs.
  • 🔨 Functions as Tools: Converts any Python function directly into a tool usable by Agents, simplifying the tool creation and usage process.
  • 🪝 Lifecycle and Hooks: Provides a wealth of Hooks and comprehensive lifecycle management, allowing the insertion of custom code at various stages of Agents, Tools, and LLMs.
  • 💻 Terminal Integration: Easily integrates application terminals, with built-in client support, offering rapid debugging capabilities for prompts.
  • ⏱️ Prompt Caching: Offers a caching mechanism for LLM Prompts to reduce repetitive work and enhance development efficiency.
  • 🤖 Powerful OpenAI Wrapper: With pne, you no longer need to use the OpenAI SDK directly; the core functions can be replaced with pne.chat, which also provides enhanced features that reduce development effort.
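To illustrate the "Functions as Tools" idea above, here is a rough, framework-free sketch of how a plain Python function could be turned into a tool description using only the standard inspect module. This is illustrative only, not pne's actual implementation; web_search and function_to_tool are hypothetical names.

```python
import inspect

def web_search(query: str) -> str:
    """Search the web for the given query."""
    return f"results for {query}"  # stub; a real tool would call an API

def function_to_tool(fn) -> dict:
    """Derive a minimal tool description from a plain Python function."""
    sig = inspect.signature(fn)
    return {
        "name": fn.__name__,                 # tool name from the function name
        "description": inspect.getdoc(fn),   # description from the docstring
        "parameters": list(sig.parameters),  # argument names from the signature
    }

print(function_to_tool(web_search))
# -> {'name': 'web_search', 'description': 'Search the web for the given query.', 'parameters': ['query']}
```

A framework built on this pattern can hand the resulting description to an LLM so the model knows when and how to invoke the function.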

The following diagram shows the core architecture of promptulate:

promptulate-architecture

The core idea of Promptulate is to provide a simple, Pythonic, and efficient way to build AI applications, which means you don't need to spend a lot of time learning the framework. We hope pne.chat() can do most of the work, so that you can build almost any AI application with just a few lines of code.

Throughout this document, pne stands for Promptulate and is its nickname: p and e are the first and last letters of the name, and n stands for 9, the number of letters between them.
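The numeronym can be checked mechanically:

```python
# "pne" is a numeronym: 9 letters sit between the leading "p"
# and the trailing "e" of "promptulate".
name = "promptulate"
middle = name[1:-1]  # the letters between the first and last characters
print(name[0], len(middle), name[-1])  # -> p 9 e
```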

Supported Base Models

Promptulate integrates the capabilities of litellm, supporting nearly all types of large models on the market, including but not limited to the following models:

Supported providers include: openai, azure, aws - sagemaker, aws - bedrock, google - vertex_ai [Gemini], google - palm, google AI Studio - gemini, mistral ai api, cloudflare AI Workers, cohere, anthropic, huggingface, replicate, together_ai, openrouter, ai21, baseten, vllm, nlp_cloud, aleph alpha, petals, ollama, deepinfra, perplexity-ai, Groq AI, anyscale, voyage ai, and xinference [Xorbits Inference]. The per-provider capability matrix (completion, streaming, async completion, async streaming, async embedding, async image generation) is maintained in the litellm documentation.

pne's broad model support makes it easy to call almost any third-party model through the same interface.

Now let's see how to run a local Llama 2 model served by Ollama with pne.

import promptulate as pne

resp: str = pne.chat(model="ollama/llama2", messages=[{"content": "Hello, how are you?", "role": "user"}])

🌟 2024.5.14 OpenAI launched their newest "omni" model, offering improved speed and pricing compared to turbo.

You can use its multimodal capabilities in any of your promptulate applications!

import promptulate as pne

messages=[
    {
        "role": "user",
        "content": [
            {"type": "text", "text": "What's in this image?"},
            {
                "type": "image_url",
                "image_url": "https://upload.wikimedia.org/wikipedia/commons/thumb/d/dd/Gfp-wisconsin-madison-the-nature-boardwalk.jpg/2560px-Gfp-wisconsin-madison-the-nature-boardwalk.jpg",
            },
        ],
    }
]
resp = pne.chat(model="gpt-4o", messages=messages)
print(resp)

Use the provider/model_name format to specify the model, and you can easily call any third-party model.
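Conceptually, the provider prefix works like this sketch (an illustrative helper, not promptulate's internal code): the part before the first slash selects the backend, and the rest is the provider-specific model name.

```python
def split_model_id(model_id: str) -> tuple:
    """Split a 'provider/model_name' identifier into its two parts.

    Illustrative helper only: the segment before the first '/' selects
    the backend; the remainder is the provider-specific model name.
    """
    provider, _, model_name = model_id.partition("/")
    return provider, model_name

print(split_model_id("ollama/llama2"))         # -> ('ollama', 'llama2')
print(split_model_id("openai/deepseek-chat"))  # -> ('openai', 'deepseek-chat')
```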

For more models, please visit the litellm documentation.

You can also see how to use pne.chat() in the Getting Started/Official Documentation.

📗 Related Documentation

📝 Examples

🛠 Quick Start

  • Open the terminal and enter the following command to install the framework:
pip install -U pne

Note: Your Python version should be 3.8 or higher.

Although pne provides many modules, in 90% of LLM application development scenarios you only need the pne.chat() function. Start with chat to learn the basics of pne, and explore the other modules when you need their additional features.

Chat like OpenAI

You can use pne.chat() to chat just as you would with the OpenAI SDK. OpenAI chat API documentation: https://platform.openai.com/docs/api-reference/chat

import promptulate as pne

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Who are you?"},
]
response: str = pne.chat(messages=messages, model="gpt-4-turbo")
print(response)

Replace the OpenAI SDK

Many third-party services expose OpenAI-compatible APIs, such as Deepseek. With pne, you can call these models directly through pne.chat() without using the OpenAI SDK, while gaining pne's enhanced features. Use the openai/xxx provider prefix in the model name to route the call through the OpenAI-compatible interface.

import os
import promptulate as pne

os.environ["DEEPSEEK_API_KEY"] = "your api key"

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "How are you?"},
]
response = pne.chat(
    messages=messages,
    model="openai/deepseek-chat",
)
print(response)

Structured Output

Robust output formatting is fundamental to LLM application development: we want LLMs to return stable, structured data. With pne, you can easily get formatted output. In the following example, we use Pydantic's BaseModel to define the data structure to be returned.

from typing import List
import promptulate as pne
from pydantic import BaseModel, Field

class LLMResponse(BaseModel):
    provinces: List[str] = Field(description="List of provinces' names")

resp: LLMResponse = pne.chat("Please tell me all provinces in China.", output_schema=LLMResponse)
print(resp)

Output:

provinces=['Anhui', 'Fujian', 'Gansu', 'Guangdong', 'Guizhou', 'Hainan', 'Hebei', 'Heilongjiang', 'Henan', 'Hubei', 'Hunan', 'Jiangsu', 'Jiangxi', 'Jilin', 'Liaoning', 'Qinghai', 'Shaanxi', 'Shandong', 'Shanxi', 'Sichuan', 'Yunnan', 'Zhejiang', 'Taiwan', 'Guangxi', 'Nei Mongol', 'Ningxia', 'Xinjiang', 'Xizang', 'Beijing', 'Chongqing', 'Shanghai', 'Tianjin', 'Hong Kong', 'Macao']
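Under the hood, frameworks typically achieve structured output by instructing the model to reply in JSON that matches the declared schema, then decoding that reply into the target type. A rough stdlib-only sketch of that idea (not pne's actual implementation; a dataclass stands in for the Pydantic model):

```python
import json
from dataclasses import dataclass
from typing import List

@dataclass
class LLMResponse:
    provinces: List[str]

def parse_structured(raw: str) -> LLMResponse:
    """Decode a JSON reply into the declared schema."""
    data = json.loads(raw)
    return LLMResponse(provinces=data["provinces"])

# In practice raw_reply would come from the LLM, which was instructed
# to answer with JSON matching the schema.
raw_reply = '{"provinces": ["Anhui", "Fujian", "Guangdong"]}'
resp = parse_structured(raw_reply)
print(resp.provinces)  # -> ['Anhui', 'Fujian', 'Guangdong']
```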

Agent with Plan, Tool-Using and Reflection

Additionally, influenced by the Plan-and-Solve paper, pne also allows developers to build Agents capable of dealing with complex problems through planning, reasoning, and action. The Agent's planning abilities can be activated using the enable_plan parameter.

plan-and-execute.png

In this example, we use Tavily as the search engine, which is a powerful tool for searching information on the web. To use Tavily, you need to obtain an API key from Tavily.

import os

os.environ["TAVILY_API_KEY"] = "your_tavily_api_key"
os.environ["OPENAI_API_KEY"] = "your_openai_api_key"

In this case, we are using the TavilySearchResults Tool wrapped by LangChain.

from langchain_community.tools.tavily_search import TavilySearchResults

tools = [TavilySearchResults(max_results=5)]
import promptulate as pne

pne.chat("what is the hometown of the 2024 Australia open winner?", model="gpt-4-1106-preview", enable_plan=True)

Output:

[Agent] Assistant Agent start...
[User instruction] what is the hometown of the 2024 Australia open winner?
[Plan] {"goals": ["Find the hometown of the 2024 Australian Open winner"], "tasks": [{"task_id": 1, "description": "Identify the winner of the 2024 Australian Open."}, {"task_id": 2, "description": "Research the identified winner to find their place of birth or hometown."}, {"task_id": 3, "description": "Record the hometown of the 2024 Australian Open winner."}], "next_task_id": 1}
[Agent] Tool Agent start...
[User instruction] Identify the winner of the 2024 Australian Open.
[Thought] Since the current date is March 26, 2024, and the Australian Open typically takes place in January, the event has likely concluded for the year. To identify the winner, I should use the Tavily search tool to find the most recent information on the 2024 Australian Open winner.
[Action] tavily_search_results_json args: {'query': '2024 Australian Open winner'}
[Observation] [{'url': 'https://ausopen.com/articles/news/sinner-winner-italian-takes-first-major-ao-2024', 'content': 'The agile right-hander, who had claimed victory from a two-set deficit only once previously in his young career, is the second Italian man to achieve singles glory at a major, following Adriano Panatta in1976.With victories over Andrey Rublev, 10-time AO champion Novak Djokovic, and Medvedev, the Italian is the youngest player to defeat top 5 opponents in the final three matches of a major since Michael Stich did it at Wimbledon in 1991 – just weeks before Sinner was born.\n He saved the only break he faced with an ace down the tee, and helped by scoreboard pressure, broke Medvedev by slamming a huge forehand to force an error from his more experienced rival, sealing the fourth set to take the final to a decider.\n Sensing a shift in momentum as Medvedev served to close out the second at 5-3, Sinner set the RLA crowd alight with a pair of brilliant passing shots en route to creating a break point opportunity, which Medvedev snuffed out with trademark patience, drawing a forehand error from his opponent. 
“We are trying to get better every day, even during the tournament we try to get stronger, trying to understand every situation a little bit better, and I’m so glad to have you there supporting me, understanding me, which sometimes it’s not easy because I am a little bit young sometimes,” he said with a smile.\n Medvedev, who held to love in his first three service games of the second set, piled pressure on the Italian, forcing the right-hander to produce his best tennis to save four break points in a nearly 12-minute second game.\n'}, {'url': 'https://www.cbssports.com/tennis/news/australian-open-2024-jannik-sinner-claims-first-grand-slam-title-in-epic-comeback-win-over-daniil-medvedev/', 'content': '"\nOur Latest Tennis Stories\nSinner makes epic comeback to win Australian Open\nSinner, Sabalenka win Australian Open singles titles\n2024 Australian Open odds, Sinner vs. Medvedev picks\nSabalenka defeats Zheng to win 2024 Australian Open\n2024 Australian Open odds, Sabalenka vs. Zheng picks\n2024 Australian Open odds, Medvedev vs. Zverev picks\nAustralian Open odds: Djokovic vs. Sinner picks, bets\nAustralian Open odds: Gauff vs. Sabalenka picks, bets\nAustralian Open odds: Zheng vs. Yastremska picks, bets\nNick Kyrgios reveals he\'s contemplating retirement\n© 2004-2024 CBS Interactive. Jannik Sinner claims first Grand Slam title in epic comeback win over Daniil Medvedev\nSinner, 22, rallied back from a two-set deficit to become the third ever Italian Grand Slam men\'s singles champion\nAfter almost four hours, Jannik Sinner climbed back from a two-set deficit to win his first ever Grand Slam title with an epic 3-6, 3-6, 6-4, 6-4, 6-3 comeback victory against Daniil Medvedev. 
Sinner became the first Italian man to win the Australian Open since 1976, and just the eighth man to successfully come back from two sets down in a major final.\n He did not drop a single set until his meeting with Djokovic, and that win in itself was an accomplishment as Djokovic was riding a 33-match winning streak at the Australian Open and had never lost a semifinal in Melbourne.\n @janniksin • @wwos • @espn • @eurosport • @wowowtennis pic.twitter.com/DTCIqWoUoR\n"We are trying to get better everyday, and even during the tournament, trying to get stronger and understand the situation a little bit better," Sinner said.'}, {'url': 'https://www.bbc.com/sport/tennis/68120937', 'content': 'Live scores, results and order of play\nAlerts: Get tennis news sent to your phone\nRelated Topics\nTop Stories\nFA Cup: Blackburn Rovers v Wrexham - live text commentary\nRussian skater Valieva given four-year ban for doping\nLinks to Barcelona are \'totally untrue\' - Arteta\nElsewhere on the BBC\nThe truth behind the fake grooming scandal\nFeaturing unseen police footage and interviews with the officers at the heart of the case\nDid their father and uncle kill Nazi war criminals?\n A real-life murder mystery following three brothers in their quest for the truth\nWhat was it like to travel on the fastest plane?\nTake a behind-the-scenes look at the supersonic story of the Concorde\nToxic love, ruthless ambition and shocking betrayal\nTell Me Lies follows a passionate college relationship with unimaginable consequences...\n "\nMarathon man Medvedev runs out of steam\nMedvedev is the first player to lose two Grand Slam finals after winning the opening two sets\nSo many players with the experience of a Grand Slam final have talked about how different the occasion can be, particularly if it is the first time, and potentially overwhelming.\n Jannik Sinner beats Daniil Medvedev in Melbourne final\nJannik Sinner is the youngest player to win the Australian Open men\'s title since 
Novak Djokovic in 2008\nJannik Sinner landed the Grand Slam title he has long promised with an extraordinary fightback to beat Daniil Medvedev in the Australian Open final.\n "\nSinner starts 2024 in inspired form\nSinner won the first Australian Open men\'s final since 2005 which did not feature Roger Federer, Rafael Nadal or Novak Djokovic\nSinner was brought to the forefront of conversation when discussing Grand Slam champions in 2024 following a stunning end to last season.\n'}]
[Execute Result] {'thought': "The search results have provided consistent information about the winner of the 2024 Australian Open. Jannik Sinner is mentioned as the winner in multiple sources, which confirms the answer to the user's question.", 'action_name': 'finish', 'action_parameters': {'content': 'Jannik Sinner won the 2024 Australian Open.'}}
[Execute] Execute End.
[Revised Plan] {"goals": ["Find the hometown of the 2024 Australian Open winner"], "tasks": [{"task_id": 2, "description": "Research Jannik Sinner to find his place of birth or hometown."}, {"task_id": 3, "description": "Record the hometown of Jannik Sinner, the 2024 Australian Open winner."}], "next_task_id": 2}
[Agent] Tool Agent start...
[User instruction] Research Jannik Sinner to find his place of birth or hometown.
[Thought] To find Jannik Sinner's place of birth or hometown, I should use the search tool to find the most recent and accurate information.
[Action] tavily_search_results_json args: {'query': 'Jannik Sinner place of birth hometown'}
[Observation] [{'url': 'https://www.sportskeeda.com/tennis/jannik-sinner-nationality', 'content': "During the semifinal of the Cup, Sinner faced Djokovic for the third time in a row and became the first player to defeat him in a singles match. Jannik Sinner Nationality\nJannik Sinner is an Italian national and was born in Innichen, a town located in the mainly German-speaking area of South Tyrol in northern Italy. A. Jannik Sinner won his maiden Masters 1000 title at the 2023 Canadian Open defeating Alex de Minaur in the straight sets of the final.\n Apart from his glorious triumph at Melbourne Park in 2024, Jannik Sinner's best Grand Slam performance came at the 2023 Wimbledon, where he reached the semifinals. In 2020, Sinner became the youngest player since Novak Djokovic in 2006 to reach the quarter-finals of the French Open."}, {'url': 'https://en.wikipedia.org/wiki/Jannik_Sinner', 'content': "At the 2023 Australian Open, Sinner lost in the 4th round to eventual runner-up Stefanos Tsitsipas in 5 sets.[87]\nSinner then won his seventh title at the Open Sud de France in Montpellier, becoming the first player to win a tour-level title in the season without having dropped a single set and the first since countryman Lorenzo Musetti won the title in Naples in October 2022.[88]\nAt the ABN AMRO Open he defeated top seed and world No. 
3 Stefanos Tsitsipas taking his revenge for the Australian Open loss, for his biggest win ever.[89] At the Cincinnati Masters, he lost in the third round to Félix Auger-Aliassime after being up a set, a break, and 2 match points.[76]\nSeeded 11th at the US Open, he reached the fourth round after defeating Brandon Nakashima in four sets.[77] Next, he defeated Ilya Ivashka in a five set match lasting close to four hours to reach the quarterfinals for the first time at this Major.[78] At five hours and 26 minutes, it was the longest match of Sinner's career up until this point and the fifth-longest in the tournament history[100] as well as the second longest of the season after Andy Murray against Thanasi Kokkinakis at the Australian Open.[101]\nHe reached back to back quarterfinals in Wimbledon after defeating Juan Manuel Cerundolo, Diego Schwartzman, Quentin Halys and Daniel Elahi Galan.[102] He then reached his first Major semifinal after defeating Roman Safiullin, before losing to Novak Djokovic in straight sets.[103] In the following round in the semifinals, he lost in straight sets to career rival and top seed Carlos Alcaraz who returned to world No. 1 following the tournament.[92] In Miami, he reached the quarterfinals of this tournament for a third straight year after defeating Grigor Dimitrov and Andrey Rublev, thus returning to the top 10 in the rankings at world No. 
In the final, he came from a two-set deficit to beat Daniil Medvedev to become the first Italian player, male or female, to win the Australian Open singles title, and the third man to win a Major (the second of which is in the Open Era), the first in 48 years.[8][122]"}, {'url': 'https://www.thesportreview.com/biography/jannik-sinner/', 'content': '• Date of birth: 16 August 2001\n• Age: 22 years old\n• Place of birth: San Candido, Italy\n• Nationality: Italian\n• Height: 188cm / 6ft 2ins\n• Weight: 76kg / 167lbs\n• Plays: Right-handed\n• Turned Pro: 2018\n• Career Prize Money: US$ 4,896,338\n• Instagram: @janniksin\nThe impressive 22-year-old turned professional back in 2018 and soon made an impact on the tour, breaking into the top 100 in the world rankings for the first time in 2019.\n Jannik Sinner (Photo: Dubai Duty Free Tennis Championships)\nSinner ended the season as number 78 in the world, becoming the youngest player since Rafael Nadal in 2003 to end the year in the top 80.\n The Italian then ended the 2019 season in style, qualifying for the 2019 Next Gen ATP Finals and going on to win the tournament with a win over Alex de Minaur in the final.\n Sinner then reached the main draw of a grand slam for the first time at the 2019 US Open, when he came through qualifying to reach the first round, where he lost to Stan Wawrinka.\n Asked to acknowledge some of the key figures in his development, Sinner replied: “I think first of all, my family who always helped me and gave me the confidence to actually change my life when I was 13-and-a-half, 14 years old.\n'}]
[Execute Result] {'thought': 'The search results have provided two different places of birth for Jannik Sinner: Innichen and San Candido. These are actually the same place, as San Candido is the Italian name and Innichen is the German name for the town. Since the user asked for the place of birth or hometown, I can now provide this information.', 'action_name': 'finish', 'action_parameters': {'content': 'Jannik Sinner was born in San Candido (Italian) / Innichen (German), Italy.'}}
[Execute] Execute End.
[Revised Plan] {"goals": ["Find the hometown of the 2024 Australian Open winner"], "tasks": [], "next_task_id": null}
[Agent Result] Jannik Sinner was born in San Candido (Italian) / Innichen (German), Italy.
[Agent] Agent End.

Atomize the Agent structure

In Agent development scenarios, we often need to break an Agent into its atomic components for better customization. pne provides atomized Agent components, such as the Planner. The following example uses a standalone Planner component for task planning.

import promptulate as pne

model = pne.LLMFactory.build("gpt-4-turbo")
planner = pne.Planner(model, system_prompt="You are a planner")
plans = planner.run("Plan a trip to Paris")
print(plans)

Output:

('goals', ['Plan a trip to Paris'])
('tasks', [Task(task_id=1, description='Check passport validity', status=<TaskStatus.TODO: 'todo'>), Task(task_id=2, description='Determine travel dates', status=<TaskStatus.TODO: 'todo'>), Task(task_id=3, description='Research and book flights', status=<TaskStatus.TODO: 'todo'>), Task(task_id=4, description='Book accommodations', status=<TaskStatus.TODO: 'todo'>), Task(task_id=5, description='Plan itinerary for the trip', status=<TaskStatus.TODO: 'todo'>), Task(task_id=6, description='Investigate and purchase travel insurance', status=<TaskStatus.TODO: 'todo'>), Task(task_id=7, description='Set a budget for the trip', status=<TaskStatus.TODO: 'todo'>), Task(task_id=8, description='Pack luggage', status=<TaskStatus.TODO: 'todo'>), Task(task_id=9, description='Notify bank of international travel', status=<TaskStatus.TODO: 'todo'>), Task(task_id=10, description='Check weather forecast and pack accordingly', status=<TaskStatus.TODO: 'todo'>)])
('next_task_id', 1)

For more detailed information, please check the Getting Started/Official Documentation.

📚 Design Principles

The design principles of the pne framework include modularity, extensibility, interoperability, robustness, maintainability, security, efficiency, and usability.

  • Modularity refers to using modules as the basic unit, allowing for easy integration of new components, models, and tools.
  • Extensibility refers to the framework's ability to handle large amounts of data, complex tasks, and high concurrency.
  • Interoperability means the framework is compatible with various external systems, tools, and services and can achieve seamless integration and communication.
  • Robustness indicates the framework has strong error handling, fault tolerance, and recovery mechanisms to ensure reliable operation under various conditions.
  • Security implies the framework has implemented strict measures to protect against unauthorized access and malicious behavior.
  • Efficiency is about optimizing the framework's performance, resource usage, and response times to ensure a smooth and responsive user experience.
  • Usability means the framework uses user-friendly interfaces and clear documentation, making it easy to use and understand.

Following these principles and applying the latest artificial intelligence technologies, pne aims to provide a powerful and flexible framework for creating automated agents.

💌 Contact

For more information, please contact: [email protected]

⭐ Contribution

We appreciate your interest in contributing to our open-source initiative. We have provided a Developer's Guide outlining the steps to contribute to Promptulate. Please refer to this guide to ensure smooth collaboration and successful contributions. Additionally, you can view the Current Development Plan to see the latest development progress 🤝🚀

promptulate's People

Contributors

dependabot[bot], eltociear, hizeros, longsihua2003, longsihua2026, ruanrongman, sweep-ai[bot], undertone0809


promptulate's Issues

ChatBot v2

Planning to refactor for v2; the rough intended usage is as follows:

from cushy_chat import ChatBot


bot = ChatBot(key='key')
# ask a question; omit conversation_id to start a new conversation
ret, conversation_id = bot.ask(prompt="hello")


# output the conversation in markdown format
# method 1: return as a string
md_output = bot.output(conversation_id=123)

# method 2: write to a file
md_output = bot.output(conversation_id=123, type='file', file_name='Conversation.md')


# read history
history = bot.get_history(conversation_id=123)

More feature ideas and solutions are welcome. If you have thoughts, please discuss them in this issue.

import promptulate as pne

🚀 Feature Request

I think we can build a more elegant way to use the promptulate SDK, and I recommend the following style:

import promptulate as pne

pne.xxx()

To achieve this goal, we need to make overall modifications to the project, optimize the implementation of delayed imports for all modules, and build a global configuration.
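One standard mechanism for such delayed imports is PEP 562 module-level __getattr__; the sketch below is illustrative, not the actual promptulate code, and uses json purely as a stand-in for a heavy submodule.

```python
import importlib
import sys

# Maps a public attribute to the (module, attribute) it is loaded from.
_LAZY_ATTRS = {"dumps": ("json", "dumps")}

def __getattr__(name):  # PEP 562: called when a module attribute is missing
    if name in _LAZY_ATTRS:
        mod_name, attr = _LAZY_ATTRS[name]
        return getattr(importlib.import_module(mod_name), attr)
    raise AttributeError(name)

# Accessing the attribute through the module object triggers the lazy load.
this_module = sys.modules[__name__]
print(this_module.dumps({"lazy": True}))  # -> {"lazy": true}
```

With this pattern, `import promptulate as pne` stays cheap because submodules are only imported when `pne.xxx` is first accessed.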

🔈 Motivation

Currently, the promptulate SDK feels like LangChain: a complex way to call components. The pne style would give users a much simpler way to write code.

build output parser

from v2.0.0 plan #20

Proposal

We can build an output parser. Sometimes we want to format the output, e.g. as JSON matching a schema, so an output parser is necessary.

Expectation

  • OutputParser can parse the output of the LLM and format it.
  • Developers can use the output parser with LLMs and Agents.
  • Build some common parsers, such as parsers for list, JSON, and YAML data.

Demo

The following code is a rough sketch:

class OutputParser:
    """Sketch: formats LLM output according to a target schema."""
    pass


def main():
    output_parser = OutputParser()
    llm = ChatOpenAI()
    # the parser would post-process the raw completion
    llm("hello", output_parser=output_parser)
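A slightly more concrete stdlib-only sketch of the same idea (hypothetical class name; JSON only):

```python
import json

class JSONOutputParser:
    """Hypothetical parser that extracts a JSON object from raw LLM text."""

    def parse(self, text: str) -> dict:
        # LLMs often wrap JSON in prose or code fences, so keep only the
        # outermost {...} span before decoding.
        start, end = text.find("{"), text.rfind("}")
        if start == -1 or end == -1:
            raise ValueError("no JSON object found in output")
        return json.loads(text[start : end + 1])

parser = JSONOutputParser()
print(parser.parse('Sure! here it is: {"answer": 42}'))  # -> {'answer': 42}
```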

Functions

Specifically, we

References

We can refer to other solutions like:

'setup.py install' is deprecated

In recent Python / setuptools versions, invoking setup.py directly is deprecated and pyproject.toml is recommended instead. promptulate looks like it can be converted to pyproject.toml directly; making this change would improve compatibility with future versions.
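An illustrative skeleton of such a pyproject.toml (shown with setuptools as one possible backend; the real version, dependencies, and build backend would be migrated from the existing setup.py):

```toml
[build-system]
requires = ["setuptools>=61.0"]
build-backend = "setuptools.build_meta"

[project]
name = "promptulate"
version = "0.0.0"  # placeholder; take the real version from setup.py
description = "Large language model automation and Autonomous Language Agents development framework"
requires-python = ">=3.8"
```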

ModuleNotFoundError: No module named 'paho'

I ran pne-chat after pip install promptulate, and it shows the following error:

Traceback (most recent call last):
  File "/home/zeeland/.local/bin/pne-chat", line 5, in <module>
    from promptulate.client.chat import main
  File "/home/zeeland/.local/lib/python3.10/site-packages/promptulate/client/chat.py", line 28, in <module>
    from promptulate.agents import ToolAgent
  File "/home/zeeland/.local/lib/python3.10/site-packages/promptulate/agents/__init__.py", line 2, in <module>
    from promptulate.agents.tool_agent.agent import ToolAgent
  File "/home/zeeland/.local/lib/python3.10/site-packages/promptulate/agents/tool_agent/__init__.py", line 1, in <module>
    from promptulate.agents.tool_agent.agent import ToolAgent
  File "/home/zeeland/.local/lib/python3.10/site-packages/promptulate/agents/tool_agent/agent.py", line 11, in <module>
    from promptulate.tools import BaseTool, Tool
  File "/home/zeeland/.local/lib/python3.10/site-packages/promptulate/tools/__init__.py", line 10, in <module>
    from promptulate.tools.iot_swith_mqtt import IotSwitchTool
  File "/home/zeeland/.local/lib/python3.10/site-packages/promptulate/tools/iot_swith_mqtt/__init__.py", line 1, in <module>
    from promptulate.tools.iot_swith_mqtt.tools import IotSwitchTool
  File "/home/zeeland/.local/lib/python3.10/site-packages/promptulate/tools/iot_swith_mqtt/tools.py", line 4, in <module>
    import paho.mqtt.client as mqtt
ModuleNotFoundError: No module named 'paho'

Importing paho unconditionally is not good for developers; we should not import paho at initialization time.

I think we can import paho when the tool runs:

class IotSwitchTool(Tool):

    def _run(self, question: str, *args, **kwargs) -> str:
        # defer the optional dependency to first use
        import paho.mqtt.client as mqtt
        ...
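The same deferred-import pattern, demonstrated with a standard-library stand-in so it runs anywhere (csv stands in for an optional package like paho-mqtt):

```python
import io

def run_tool(question: str) -> str:
    # The optional dependency is imported only when the tool actually
    # runs, so importing the package that defines the tool stays cheap
    # and never fails just because the dependency is missing.
    import csv
    buf = io.StringIO()
    csv.writer(buf).writerow(["question", question])
    return buf.getvalue().strip()

print(run_tool("turn on the light"))  # -> question,turn on the light
```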

add knowledge module

1.add Ernie embedding support
2.add OpenAi embedding support
3.add vectordb support ,example chromdb
4.add knowledge abstract module

Optimize Config

Many instance parameters of prompt.config.Config can be changed to class methods; using class methods would give a cleaner code experience.
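A sketch of the proposed direction (hypothetical, stdlib-only): resolving keys through class methods removes the need to construct and pass around a singleton instance.

```python
import os

class Config:
    """Hypothetical class-method style config (sketch of the proposal)."""

    enable_cache: bool = True

    @classmethod
    def get_openai_api_key(cls) -> str:
        # No singleton instance needed: the key is resolved on access.
        key = os.environ.get("OPENAI_API_KEY")
        if key is None:
            raise ValueError("OPENAI API key is not provided. Please set your key.")
        return key

os.environ["OPENAI_API_KEY"] = "sk-demo"
print(Config.get_openai_api_key())  # -> sk-demo
```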

  • current Config
class Config(metaclass=Singleton):
    def __init__(self):
        logger.info(f"[pne config] Config initialization")
        self.enable_cache: bool = True
        self._proxy_mode: str = PROXY_MODE[0]
        self._proxies: Optional[dict] = None
        self.openai_chat_api_url = "https://api.openai.com/v1/chat/completions"
        self.openai_completion_api_url = "https://api.openai.com/v1/completions"
        self.openai_proxy_url = "https://chatgpt-api.shn.hk/v1/"  # FREE API
        self.ernie_bot_url = "https://aip.baidubce.com/rpc/2.0/ai_custom/v1/wenxinworkshop/chat/completions"
        self.ernie_bot_turbo_url = "https://aip.baidubce.com/rpc/2.0/ai_custom/v1/wenxinworkshop/chat/eb-instant"
        self.ernie_bot_token = "https://aip.baidubce.com/oauth/2.0/token"
        self.key_default_retry_times = 5
        """If llm(like OpenAI) unable to obtain data, retry request until the data is obtained."""
        self.enable_stdout_hook = True
        ernie_token_pool: ErnieTokenPool = ErnieTokenPool()
        ernie_token_pool.start(self.get_ernie_api_key, self.get_ernie_api_secret)

        if self.enable_stdout_hook:
            StdOutHook.registry_stdout_hooks()

    def turn_off_stdout_hook(self):
        if self.enable_stdout_hook:
            self.enable_stdout_hook = False
            StdOutHook.unregister_stdout_hooks()

    @property
    def openai_api_key(self):
        """This attribution has deprecated to use. Using `get_openai_api_key`"""
        if "OPENAI_API_KEY" in os.environ.keys():
            if self.enable_cache:
                utils.get_cache()["OPENAI_API_KEY"] = os.getenv("OPENAI_API_KEY")
            return os.getenv("OPENAI_API_KEY")
        if self.enable_cache and "OPENAI_API_KEY" in utils.get_cache():
            return utils.get_cache()["OPENAI_API_KEY"]
        raise ValueError("OPENAI API key is not provided. Please set your key.")

    def get_openai_api_key(self, model: str) -> str:
        """Get openai key from KeyPool and environ"""
        if self.enable_cache:
            openai_key_pool: OpenAIKeyPool = OpenAIKeyPool()
            key = openai_key_pool.get(model)
            if key:
                return key
        return self.openai_api_key

    @property
    def get_ernie_api_key(self) -> str:
        if "ERNIE_API_KEY" in os.environ.keys():
            if self.enable_cache:
                utils.get_cache()["ERNIE_API_KEY"] = os.getenv("ERNIE_API_KEY")
            return os.getenv("ERNIE_API_KEY")
        if self.enable_cache and "ERNIE_API_KEY" in utils.get_cache():
            return utils.get_cache()["ERNIE_API_KEY"]
        raise ValueError("ERNIE_API_KEY is not provided. Please set your key.")

    @property
    def get_ernie_api_secret(self) -> str:
        if "ERNIE_API_SECRET" in os.environ.keys():
            if self.enable_cache:
                utils.get_cache()["ERNIE_API_SECRET"] = os.getenv("ERNIE_API_SECRET")
            return os.getenv("ERNIE_API_SECRET")
        if self.enable_cache and "ERNIE_API_SECRET" in utils.get_cache():
            return utils.get_cache()["ERNIE_API_SECRET"]
        raise ValueError("ERNIE_API_SECRET is not provided. Please set your secret.")

    def get_ernie_token(self) -> str:
        ernie_token_pool: ErnieTokenPool = ErnieTokenPool()
        if self.enable_cache:
            if "ERNIE_TOKEN" in utils.get_cache():
                return ernie_token_pool.get_token()
        return ernie_token_pool.nocache_get_token(
            self.get_ernie_api_key, self.get_ernie_api_secret
        )

    def get_key_retry_times(self, model: str) -> int:
        if self.enable_cache:
            openai_key_pool: OpenAIKeyPool = OpenAIKeyPool()
            return openai_key_pool.get_num(model)
        return self.key_default_retry_times

    @property
    def proxy_mode(self) -> str:
        if self.enable_cache and "PROXY_MODE" in utils.get_cache():
            return utils.get_cache()["PROXY_MODE"]
        return self._proxy_mode

    @proxy_mode.setter
    def proxy_mode(self, value):
        self._proxy_mode = value
        if self.enable_cache:
            utils.get_cache()["PROXY_MODE"] = value

    @property
    def proxies(self) -> Optional[dict]:
        if self.enable_cache and "PROXIES" in utils.get_cache():
            return utils.get_cache()["PROXIES"] if self.proxy_mode == "custom" else None
        return self._proxies

    @proxies.setter
    def proxies(self, value):
        self._proxies = value
        if self.enable_cache:
            utils.get_cache()["PROXIES"] = value

    @property
    def openai_chat_request_url(self) -> str:
        if self.proxy_mode == PROXY_MODE[2]:
            self.proxies = None
            return self.openai_proxy_url
        return self.openai_chat_api_url

    @property
    def openai_completion_request_url(self) -> str:
        if self.proxy_mode == PROXY_MODE[2]:
            self.proxies = None
            return f"{self.openai_proxy_url}completions"
        return self.openai_completion_api_url

    def set_proxy_mode(self, mode: str, proxies: Optional[dict] = None):
        self.proxy_mode = mode
        self.proxies = proxies
        logger.info(f"[pne] proxy mode: {mode}, proxies: {proxies}")
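The proxy-mode logic above selects the request URL from the configured mode. The following standalone sketch mirrors that selection logic; note the concrete `PROXY_MODE` values here are assumptions inferred from the property code, not promptulate's actual constants.

```python
# Minimal standalone sketch of the proxy-mode URL selection above.
# The PROXY_MODE values are assumptions, not promptulate's real constants.
PROXY_MODE = ["off", "custom", "promptulate"]

OPENAI_CHAT_API_URL = "https://api.openai.com/v1/chat/completions"
OPENAI_PROXY_URL = "https://chatgpt-api.shn.hk/v1/"  # free proxy endpoint


def chat_request_url(proxy_mode: str) -> str:
    # PROXY_MODE[2] routes chat requests through the free proxy endpoint,
    # mirroring the `openai_chat_request_url` property; every other mode
    # uses the official OpenAI endpoint.
    if proxy_mode == PROXY_MODE[2]:
        return OPENAI_PROXY_URL
    return OPENAI_CHAT_API_URL
```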

add file operator for ToolAgent

🚀 Feature Request

Add file operator tools similar to LangChain's file tools.

We seem to need the following tools:

WriteFileTool
AppendFileTool # Append information to an existing file
ReadFileTool
DeleteFileTool
ListDirectoryTool
CopyFileTool
MoveFileTool

A ToolKit may be a better solution if you want to use multiple tools at once.

ref: https://python.langchain.com/docs/integrations/tools/filesystem

A Possible Example Usage

Here is a possible way to use it.

from promptulate.tools.file import WriteFileTool, ReadFileTool
from promptulate.agent import ToolAgent

def main():
    tools = [WriteFileTool(), ReadFileTool()]
    agent = ToolAgent(tools=tools)
    agent.run("Please help me write a poem in poetry.txt")


if __name__ == "__main__":
    main()

Challenges

  • How to initialize the working directory.
  • How to define the file toolkit in a Pythonic way, with a better and simpler usage.
  • File read/write, delete, and create permissions.

Goal

  • ToolAgent can use file operator tool.
  • Provide relevant docs showing how to use it in ToolAgent. Write at docs/tools/file.md.
  • Add relevant unit tests.
  • Implement a tiny example:
    ToolAgent can CRUD the file workspace, e.g. "Please write a bubble_sort in 'sort.py'".

ROADMAP 2024

This issue tracks the development plan for promptulate v2.0.0. The In Development status indicates ongoing progress on a feature, while Planned Development covers functionality intended for this version that has not yet started. If you are interested in any of the points, you are welcome to create an issue with a simple proposal describing your general implementation ideas for discussion.

Please note:

  1. If you want to develop a certain feature, I highly recommend you start a new issue mentioning that content to avoid duplicate development efforts. On the other hand, issues provide a channel for developers to communicate.
  2. If someone proposes to develop a content listed under Planned Development, I will move it to In Development.
  3. There are some other development plans in the official documentation that are not yet included in the current version's development plan. If you see a feature you would like to work on, we would be very happy to add it to the development plan for this version!

If you have more interesting feature ideas, you are very welcome to suggest them, as well as submit related issues and pull requests!

Completed

  • Support for Wenxin Yiyan model #22
  • IoT control support #64
  • Shell Tool support #55 #62
  • Compatibility with langchain tools #67
  • Compatibility with huggingface tools #67
  • Transition from BaseTool to Tool type
  • Build vercel-promptulate-flask https://github.com/Undertone0809/vercel-promptulate-flask
    • Compatibility with huggingface models
  • Unit testing and black formatting checks during pull requests

In Development

  • Providing Chroma integration solution
  • Providing various external data access solutions, such as documents, URLs, etc.
  • Using OpenAI for GPT capability driving
  • Providing lazy loading for all modules

Planned Development

Agent

  • Support for more types of Agents

Tool

LLM

  • Adaptation to more LLMs, such as current various open-source models
    • ChatGLM
    • ChatGLM2-6B
    • iFlytek Spark Large Model

Memory

  • Memory storage for Agent and LLM related configuration information
  • Memory compatibility with relational database storage

Others

  • Improve documentation, add new sections, provide knowledge on LLM, prompt techniques
  • Develop a simple feature demonstration server based on gradio (to be discussed)
  • Build a plugin marketplace
    • Build plugin templates
    • Integrate third-party plugins quickly via command line
  • Build multi-Agent scheduling model
  • Provide proxy configuration for DuckDuckGo
  • Build a custom Hook system
  • Build streaming transmission
  • Explore multimodal capabilities: text-to-image, integration with Huggingface models
  • Output parser

Discuss

  • Should the role of BaseFramework and the functionality of presets be removed, or is there a better alternative?
  • Should the embedding method of Tree of Thoughts be introduced as a mixin? Further discussion is needed.
  • Is it possible to build an adapter for langchain, thereby connecting to the langchain ecosystem?
  • Is it feasible to build an LLM's huggingface, offering a more streamlined and elegant way to quickly deploy a large language application model?


Add CodeAgent

🚀 Feature Request

Provide an Agent that offers coding capabilities.

Unexpected bug in PaperSummaryTool

The following problem occurs when using PaperSummaryTool:

from promptulate.tools.paper.tools import PaperSummaryTool
from promptulate.utils.logger import get_logger, enable_log


def main():
    enable_log()
    tool = PaperSummaryTool()
    result = tool.run("Attention is all your need.")
    print(result)


if __name__ == "__main__":
    main()
[DEBUG] 2023-08-09 16:20:16 [broadcast-service] <get_paper_references> listen <PaperSummaryTool.run.get_paper_references> topic
[DEBUG] 2023-08-09 16:20:16 [broadcast-service] <get_translate> listen <PaperSummaryTool.run.get_translate> topic
[DEBUG] 2023-08-09 16:20:16 [broadcast-service] <get_opinion> listen <PaperSummaryTool.run.get_opinion> topic
[DEBUG] 2023-08-09 16:20:16 [broadcast-service] <get_keywords> listen <PaperSummaryTool.run.get_keywords> topic
[DEBUG] 2023-08-09 16:20:16 [broadcast-service] <get_advice> listen <PaperSummaryTool.run.get_advice> topic
[DEBUG] 2023-08-09 16:20:16 Starting new HTTPS connection (1): api.semanticscholar.org:443
[DEBUG] 2023-08-09 16:20:17 https://api.semanticscholar.org:443 "GET /graph/v1/paper/autocomplete?query=Attention%20is%20all%20your%20need.&limit=10 HTTP/1.1" 200 449
[DEBUG] 2023-08-09 16:20:17 Starting new HTTPS connection (1): api.semanticscholar.org:443
[DEBUG] 2023-08-09 16:20:19 https://api.semanticscholar.org:443 "POST /graph/v1/paper/batch?fields=title%2Curl%2Cabstract HTTP/1.1" 429 174
Traceback (most recent call last):
  File "D:\Programming\Python\Project\promptulate\private\paper_tool.py", line 19, in <module>
    main()
  File "D:\Programming\Python\Project\promptulate\private\paper_tool.py", line 14, in main
    result = tool.run("Attention is all your need.")
  File "D:\Programming\Python\Project\promptulate\promptulate\tools\base.py", line 50, in run
    result: Any = self._run(*args, **kwargs)
  File "D:\Programming\Python\Project\promptulate\promptulate\tools\paper\tools.py", line 132, in _run
    paper_info = self.semantic_scholar_query_tool.run(
  File "D:\Programming\Python\Project\promptulate\promptulate\tools\base.py", line 50, in run
    result: Any = self._run(*args, **kwargs)
  File "D:\Programming\Python\Project\promptulate\promptulate\tools\semantic_scholar\tools.py", line 26, in _run
    result = self.api_wrapper.get_paper(query, **kwargs)
  File "D:\Programming\Python\Project\promptulate\promptulate\tools\semantic_scholar\api_wrapper.py", line 95, in get_paper
    get_detail()
  File "D:\Programming\Python\Project\promptulate\promptulate\tools\semantic_scholar\api_wrapper.py", line 74, in get_detail
    if detail_item["paperId"] == original["id"]:
TypeError: string indices must be integers

Process finished with exit code 1
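The traceback suggests the batch endpoint returned HTTP 429 (rate limited) and its error body was then iterated as if it were a list of paper dicts, producing `TypeError: string indices must be integers`. A hedged sketch of a defensive check (the function name and shape are illustrative, not promptulate's actual code):

```python
def parse_batch_response(status_code: int, payload):
    """Hypothetical sketch: validate a Semantic Scholar batch response before
    indexing into it, instead of assuming it is always a list of paper dicts."""
    if status_code == 429:
        # Rate limited; the caller should back off and retry instead of parsing.
        raise RuntimeError("Rate limited by api.semanticscholar.org; retry later")
    if status_code != 200:
        raise RuntimeError(f"Unexpected status {status_code}: {payload!r}")
    # The API returns null for unknown ids, so keep only real paper dicts.
    return [item for item in payload if isinstance(item, dict)]
```

Wiring a check like this (plus exponential backoff on 429) into `api_wrapper.get_paper` would turn the confusing `TypeError` into a clear rate-limit error.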

Add sliding window for WebAgent

🚀 Feature Request

WebAgent obtains information and inserts it into the prompt context, which produces a long prompt for the LLM. Sometimes this exceeds the LLM's context limit, so we should reduce the prompt size. We can use a sliding window to solve this problem.

We can use MessageSet and set a fixed length for it.

🔈 Motivation

Optimize WebAgent. Reduce token usage.
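A minimal sketch of the fixed-length idea, using a bounded deque so the oldest messages fall out of the window automatically. This is an illustrative standalone class; promptulate's real MessageSet API may differ.

```python
from collections import deque


class SlidingWindowMessageSet:
    """Hypothetical sketch of a fixed-length MessageSet: once the window is
    full, the oldest messages are dropped so the prompt stays bounded."""

    def __init__(self, max_messages: int = 10):
        # deque(maxlen=...) discards items from the opposite end on overflow.
        self.messages = deque(maxlen=max_messages)

    def add(self, role: str, content: str) -> None:
        self.messages.append({"role": role, "content": content})

    def to_prompt(self) -> str:
        # Only the most recent max_messages entries ever reach the LLM.
        return "\n".join(f"{m['role']}: {m['content']}" for m in self.messages)
```

A token-count-based window (truncating by token budget rather than message count) would reduce usage even further, at the cost of a tokenizer dependency.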

Add google serper tool

🚀 Feature Request

I think we can build more tools. We can refer to the code implementation of the DuckDuckGo search tool, and also to the Google Serper implementation in LangChain. It is highly recommended to read the relevant implementation code.

We may use the following method to call tool:

import promptulate as pne
from promptulate.tools.serper import Serper


serper = Serper()
agent = pne.ToolAgent(tools=[serper])
agent.run("What's the weather in Shanghai tomorrow?")

Support for domestic Chinese large models

We plan to extend the framework to support the ERNIE (Wenxin) series of models (including ERNIE Bot, ERNIE Bot turbo, etc.). Domestic large models respond quickly, and they performed well in my own testing.

add humanfeedback tool

Adding this tool makes the agent more interactive and better suited to smart-home scenarios. Below is the result after adding it:
Question: You are now a smart speaker. It feels quite dark in here.
Thought:
I need to find a way to help the user with their lighting issue. I could potentially use the IoT Switch MQTT tool to control the lights in the room. However, I should confirm with the user if they want me to do that or if they have another solution in mind.

Action: human_feedback
Action Input: "Would you like me to adjust the lighting in the room for you?"

Observation: yes
Thought: The user wants me to adjust the lighting in the room. I can use the IoT Switch MQTT tool to do that.

Action: Iot_Switch_Mqtt
Action Input: "Turn on the lights"

Observation: ok
Thought: The lights have been turned on. The user should now have better lighting.

Final Answer: The lights in the room have been turned on.

[DEBUG] 2023-09-08 17:23:58 [pne hook] hooks <('Agent', 'on_agent_result')> mounted_obj call hook: <[BaseHookSchema(hook_name='on_agent_result', callback=<function StdOutHook.handle_agent_result at 0x0000015223F73CA0>, component_type='Agent')]>
[Agent Result] The lights in the room have been turned on.
Agent End.
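The transcript above can be supported by a very small tool. Here is a hypothetical sketch of `human_feedback` with injectable I/O functions (useful for testing); a real promptulate tool would subclass the Tool base class.

```python
class HumanFeedbackTool:
    """Hypothetical sketch of a human_feedback tool: the agent asks the user
    a question and the user's answer is returned as the observation."""

    name = "human_feedback"
    description = "Ask the human a question and return their answer."

    def __init__(self, input_func=input, output_func=print):
        # I/O is injectable so the tool can be unit-tested without a terminal.
        self.input_func = input_func
        self.output_func = output_func

    def run(self, question: str) -> str:
        self.output_func(f"[Agent asks] {question}")
        return self.input_func()
```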

Plugin Market

I hope to build a plugin market where users can quickly customize plugins via the command line or plugin templates, and use git to quickly fetch plugins shared by other users on GitHub. I am trying to build an elegant solution and would welcome your suggestions.

error in pne-chat: click.exceptions.ClickException: vi: Editing failed

🐛 Bug Report

error in pne-chat click.exceptions.ClickException: vi: Editing failed

🔬 How To Reproduce

Steps to reproduce the behavior:

zeeland@zeeland-virtual-machine:~/.gazebo$ pne-chat
Hi there, here is promptulate chat terminal.
? Choose a chat terminal: Web Agent Chat
? Choose a llm model: OpenAI
[User] 
/bin/sh: 1: vi: not found
Traceback (most recent call last):
  File "/home/zeeland/.local/bin/pne-chat", line 8, in <module>
    sys.exit(main())
  File "/home/zeeland/.local/lib/python3.10/site-packages/promptulate/client/chat.py", line 160, in main
    chat()
  File "/home/zeeland/.local/lib/python3.10/site-packages/promptulate/client/chat.py", line 156, in chat
    web_chat(llm)
  File "/home/zeeland/.local/lib/python3.10/site-packages/promptulate/client/chat.py", line 93, in web_chat
    prompt = get_user_input()
  File "/home/zeeland/.local/lib/python3.10/site-packages/promptulate/client/chat.py", line 59, in get_user_input
    message = click.edit(marker)
  File "/home/zeeland/.local/lib/python3.10/site-packages/click/termui.py", line 677, in edit
    return ed.edit(text)
  File "/home/zeeland/.local/lib/python3.10/site-packages/click/_termui_impl.py", line 557, in edit
    self.edit_file(name)
  File "/home/zeeland/.local/lib/python3.10/site-packages/click/_termui_impl.py", line 517, in edit_file
    raise ClickException(
click.exceptions.ClickException: vi: Editing failed


Environment

  • OS: Ubuntu 22.04
  • Python version: 3.10

📈 Expected behavior

Work normally.
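A likely cause: `click.edit()` falls back to `vi` when no editor is configured, and `vi` is absent on a minimal Ubuntu install (note the `/bin/sh: 1: vi: not found` line above). A possible workaround, assuming click honors the standard `EDITOR` variable:

```shell
# Either install an editor so the vi fallback exists:
sudo apt-get install -y vim
# or point click at an editor that is present before running pne-chat:
export EDITOR=nano
pne-chat
```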

add iot tool

This tool enables the LLM to control IoT devices, which can be used in smart homes and other scenarios.
