Comments (10)
But how can it keep track of a specific conversation?
There could be a command to start a new chat; once started, future chat commands stay within that chat until you end it - or switch to another one.
Chats will make a lot more sense in the web interface, see:
Maybe there's a mode where a chat session is an interactive terminal process where you type in more text and hit enter to send a line at a time.
from llm.
The logging database schema should take chats into account as well.
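One way the schema could take chats into account is a `chats` table plus a nullable `chat_id` foreign key on the log table, so one-off prompts and chat turns share the same log. Table and column names here are assumptions for illustration, not llm's actual schema:

```python
# Hedged sketch of a chat-aware logging schema using SQLite.
import sqlite3


def create_schema(db: sqlite3.Connection) -> None:
    db.executescript("""
    CREATE TABLE IF NOT EXISTS chats (
        id INTEGER PRIMARY KEY,
        started_at TEXT DEFAULT (datetime('now'))
    );
    CREATE TABLE IF NOT EXISTS log (
        id INTEGER PRIMARY KEY,
        chat_id INTEGER REFERENCES chats(id),  -- NULL for one-off prompts
        model TEXT,
        prompt TEXT,
        response TEXT
    );
    """)
```

Continuing a chat is then a query for the rows with that `chat_id`, in `id` order.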
Demo of the freshly-merged #14 by @amjith:
$ llm 'What do otters like to eat?'
Otters are carnivores and they mainly eat fish, shellfish, crustaceans, and other aquatic animals. They have a varied diet, which includes clams, mussels, snails, crabs, crayfish, frogs, small mammals, and birds. They also enjoy eating insects and some vegetation occasionally.
$ llm -c 'where do they live?'
Usage: llm chatgpt [OPTIONS] [PROMPT]
Try 'llm chatgpt --help' for help.
Error: Invalid value for '-c' / '--continue': 'where do they live?' is not a valid integer.
$ llm 'where do they live?' -c
Otters are found in various parts of the world, inhabiting both freshwater and marine environments. There are 13 different species of otters, and their habitats vary depending on the species.
River otters are commonly found in North America, from Alaska to Mexico, while sea otters live along the west coast of North America, from California to Alaska. Eurasian otters inhabit the rivers, lakes, and wetlands of Europe and Asia.
Giant river otters, as their name suggests, are the largest of the otter species and can be found in the Amazon River basin of South America.
Other species such as the smooth-coated otter, clawless otter and African clawless otter can be found in various parts of Asia and Africa, respectively.
It's a bit annoying that `llm -c 'where do they live?'` doesn't work, because the `-c` option needs to come at the very end. I wonder if we can improve that.
Idea: `llm -c/--continue` could always act as a flag. That way you can use `-c` anywhere in the command.

Then `llm --chat 44` could be an alternative option which requires a valid chat ID argument.
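That split - a bare flag for "continue the most recent chat" plus a separate option taking a chat ID - is easy to express in an argument parser. llm itself is built on Click, but a minimal argparse sketch shows the same idea; the key point is that a flag with no value can appear anywhere on the command line:

```python
# Sketch of -c/--continue as a bare flag and --chat as an option
# that takes an explicit chat ID. Illustrative only, not llm's
# actual CLI definition.
import argparse


def build_parser() -> argparse.ArgumentParser:
    parser = argparse.ArgumentParser(prog="llm")
    parser.add_argument("prompt")
    parser.add_argument(
        "-c", "--continue", dest="continue_last", action="store_true",
        help="continue the most recent chat",
    )
    parser.add_argument(
        "--chat", dest="chat_id", type=int,
        help="continue the chat with this ID",
    )
    return parser
```

With this shape, both `llm -c 'where do they live?'` and `llm 'where do they live?' -c` parse identically.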
I think `--continue` should reuse the same model as the logged conversation, unless the user specifies `-m` directly.
I broke this - I forgot to set the default model, so now `llm "prompt"` complains that no model was specified.
I'm having second thoughts about the `--chat 123` option now - because maybe I want to use `--chat` as an option that starts an interactive chat UI instead?
Options for an interactive chat mode:

- It's a new command - `llm chat -m gpt-4` for example. This feels a bit odd since the current default command is actually `llm chatgpt ...`, so `llm chat` feels confusing.
- It's part of the default command: `llm --chat -4` starts one running.

Maybe the `llm chatgpt` command is mis-named, especially since it can be used to work with GPT-4.
I named it `llm chatgpt` because I thought I'd have a separate command for `bard` and for `llama` and so on, and because I thought the other OpenAI completion APIs (the non-chat ones, like GPT-3) might end up with a separate command.
I'm closing this in favour of an issue to redesign the top-level set of commands.