- Mention (@) the bot and it will reply to your message. Reply to the bot's message to continue the conversation from that point. Build conversations with reply chains!
- You can reply to any of the bot's messages to continue from wherever you want, or reply to a friend's message (and @ the bot) to ask a question about it, so you can branch the conversation freely.
- You can also seamlessly move any conversation into a thread. When you @ the bot in a thread, it remembers the conversation from the parent channel.
- Supports models from the OpenAI API and Mistral's La Plateforme, or run a local model with LM Studio.
- The bot's responses are generated dynamically and turn green when complete.
- The bot can see image attachments when you choose a vision model.
- Easily set a custom personality (aka system prompt)
- User identity aware
- Fully asynchronous
- 1 Python file, ~200 lines of code
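The reply-chain behavior described above can be pictured as walking message reply references backward to rebuild the conversation before sending it to the LLM. Here is a minimal, framework-free sketch of that idea; the `reply_to` field stands in for Discord's message reference, and all names are hypothetical rather than the bot's actual code:

```python
# Sketch of reply-chain traversal: each message stores the ID of the message
# it replies to. Walking that link backward from the newest message rebuilds
# the conversation, capped at MAX_MESSAGES like the bot's setting.
MAX_MESSAGES = 20  # mirrors the MAX_MESSAGES setting from .env

def build_chain(messages: dict, leaf_id: int) -> list:
    """Follow reply references from the newest message back to the start."""
    chain = []
    current = leaf_id
    while current is not None and len(chain) < MAX_MESSAGES:
        msg = messages[current]
        chain.append({"role": msg["role"], "content": msg["content"]})
        current = msg["reply_to"]
    return list(reversed(chain))  # oldest first, as chat-completion APIs expect

# Example chain: user -> bot -> user reply
history = {
    1: {"role": "user", "content": "@bot hello", "reply_to": None},
    2: {"role": "assistant", "content": "Hi!", "reply_to": 1},
    3: {"role": "user", "content": "tell me more", "reply_to": 2},
}
print(build_chain(history, 3))
```

Replying to an older bot message simply picks a different `leaf_id`, which is why you can branch the conversation from any point in the chain.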
Before you start, install Python and clone this git repo.
- Install Python requirements:

  ```bash
  pip install -r requirements.txt
  ```
- Create a copy of `.env.example`, rename it to `.env` and set it up:
| Setting | Instructions |
| --- | --- |
| `DISCORD_BOT_TOKEN` | Create a new Discord application at discord.com/developers/applications and generate a token under the Bot tab. Also enable MESSAGE CONTENT INTENT. |
| `OPENAI_API_KEY` | Only required if you choose an OpenAI API model. Generate an OpenAI API key at platform.openai.com/account/api-keys. You must also add a payment method to your OpenAI account at platform.openai.com/account/billing/payment-methods. |
| `MISTRAL_API_KEY` | Only required if you choose a Mistral API model. Generate a Mistral API key at console.mistral.ai/user/api-keys. You must also add a payment method to your Mistral account at console.mistral.ai/billing. |
| `LOCAL_SERVER_URL` | Only required if you choose to run a local model with LM Studio. Load your desired model and start the Local Inference Server. (Default: `http://localhost:1234/v1`) |
| `LLM` | OpenAI API models: `gpt-3.5-turbo`, `gpt-4-turbo-preview`, `gpt-4-vision-preview`<br>Mistral API models: `mistral-tiny` (Mistral-7B), `mistral-small` (Mixtral-8X7B), `mistral-medium` (Mistral internal prototype)<br>LM Studio: `local-model` (use this regardless of the model you choose) |
| `CUSTOM_SYSTEM_PROMPT` | Write practically anything you want to customize the bot's behavior! |
| `ALLOWED_CHANNEL_IDS` | Discord channel IDs where the bot can send messages, separated by commas. Leave blank to allow all channels. |
| `ALLOWED_ROLE_IDS` | Discord role IDs that can use the bot, separated by commas. Leave blank to allow everyone. |
| `MAX_IMAGES` | The maximum number of image attachments allowed in a single message. Only applicable when using a vision model. (Default: `5`) |
| `MAX_MESSAGES` | The maximum number of messages allowed in a reply chain. (Default: `20`) |
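Putting the table together, a filled-in `.env` for an OpenAI model might look like the sketch below. The token and prompt values are placeholders, not working credentials; `LOCAL_SERVER_URL` only matters if you switch `LLM` to `local-model`:

```ini
DISCORD_BOT_TOKEN=your-discord-bot-token
OPENAI_API_KEY=sk-your-openai-key
MISTRAL_API_KEY=
LOCAL_SERVER_URL=http://localhost:1234/v1
LLM=gpt-4-turbo-preview
CUSTOM_SYSTEM_PROMPT=You are a helpful Discord bot.
ALLOWED_CHANNEL_IDS=
ALLOWED_ROLE_IDS=
MAX_IMAGES=5
MAX_MESSAGES=20
```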
- Invite the bot to your Discord server with this URL (replace `<CLIENT_ID>` with your Discord application's client ID found under the OAuth2 tab):

  ```
  https://discord.com/api/oauth2/authorize?client_id=<CLIENT_ID>&permissions=412317273088&scope=bot
  ```
- Run the bot:

  ```bash
  python llmcord.py
  ```