drorm / gish
GPT command line
License: MIT License
Notice the lack of a space in "approximately12,742".
The same "how big is the earth" request returns the same text from the API, except with the expected space. So it seems like there may be a text-parsing issue in gish?
$ gish how big is the earth | cat
The Earth has a diameter of approximately12,742 kilometers (7,918 miles) and a circumference of about 40,075 kilometers (24,901 miles).
Tokens: 39 Cost: $0.000078 Elapsed: 1.422 Seconds. Tokens and cost are estimates when streaming.
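One plausible cause (an assumption, not confirmed from the gish source): OpenAI streaming deltas often carry their leading whitespace attached to the next token, so trimming each delta before appending it reproduces exactly this symptom. A minimal sketch:

```javascript
// Hypothetical stream deltas; with BPE tokenization the space typically
// arrives attached to the front of the next token (" 12").
const deltas = ["The Earth has a diameter of approximately", " 12", ",742 kilometers"];

// Buggy accumulation: trimming each delta drops the inter-token space.
const buggy = deltas.map((d) => d.trim()).join("");

// Correct accumulation: concatenate deltas verbatim.
const correct = deltas.join("");
```

If gish trims or re-tokenizes chunks anywhere in its streaming path, that would explain why the non-streaming API response has the space but the CLI output doesn't.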
Provide a way to view the history of previous requests.
While it's always possible to run less ~/.gish/history.json, we should provide a built-in way to view the history.
To support #13, "Provide a way to continue a chat to a specific previous request", we need to show the history of requests.
gish -h n
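A minimal sketch of what gish -h n could do, assuming (hypothetically) that history.json holds an array of { prompt, response } entries; the file path and entry shape here are assumptions, not the actual gish format:

```javascript
// Return the last n history entries, newest last (assumes n >= 1).
// Kept pure so it can be tested without touching the filesystem.
function lastEntries(history, n) {
  return history.slice(-n);
}

// Hypothetical CLI wiring (path and entry shape are assumptions):
// const fs = require("fs");
// const os = require("os");
// const history = JSON.parse(
//   fs.readFileSync(`${os.homedir()}/.gish/history.json`, "utf8")
// );
// lastEntries(history, n).forEach((e) => console.log(e.prompt));
```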
Background
See https://platform.openai.com/chat for details about what chat means, but fundamentally, we're providing a way to send the following:
messages=[
{"role": "system", "content": "You are a helpful assistant."},
{"role": "user", "content": "Who won the world series in 2020?"},
{"role": "assistant", "content": "The Los Angeles Dodgers won the World Series in 2020."},
{"role": "user", "content": "Where was it played?"}
]
This is a similar experience to the one at https://chat.openai.com/chat.
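The messages array above maps directly onto the chat completions request body. A sketch of building that body (the model name and helper function are illustrative assumptions, not gish's actual code):

```javascript
// Build a chat completions request body from an accumulated message list.
// The default model name here is just an example.
function buildChatRequest(messages, model = "gpt-3.5-turbo") {
  return { model, messages };
}

// Usage: POST this body to https://api.openai.com/v1/chat/completions
// with an "Authorization: Bearer <API key>" header.
const body = buildChatRequest([
  { role: "system", content: "You are a helpful assistant." },
  { role: "user", content: "Who won the world series in 2020?" },
]);
```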
Implementation details
There are different ways to start a chat. For example, in bash:
alias gchat='gish -c'
and then call gchat instead of gish when you want to be in a chat.
Limitations
Since GPT-3.5 is limited to 4K tokens, it can be quite easy to reach the limit in a conversation.
We currently have no way to prevent this, and probably don't handle the resulting exception gracefully. This needs to be explored in a separate ticket.
It seems like https://chat.openai.com/chat handles these issues gracefully; probably, when it reaches the limit, it summarizes the previous conversation and then continues.
We are sticking to the roles: system, user, assistant for this initial version.
First of all, nice project. I appreciate a clean CLI template for JS :)
I think a useful addition would be to allow history.
let messages = [];
let n = 5; // keep at most n messages in the conversation
let limit = 2000; // token budget for the request

function onNewMessage(msg) {
  msg.count = gptoken.countTokens(msg.content);
  messages.push(msg);
  // prune the trailing conversation once it grows past n messages
  if (messages.length > n) {
    messages.shift();
  }
  // enforce an input length limit in tokens
  let len = messages.reduce((sum, m) => sum + m.count, 0);
  while (len > limit && messages.length > 1) {
    const trash = messages.shift();
    len -= trash.count;
  }
}
// pseudo
user submits -> addMessage
submit messages -> get response -> addMessage
await input
This adds to the request cost but will improve performance. I think you can just pass in the array of messages; it should be a minimal tweak to the API call.
Currently the chat feature is only available to the last request as seen in
https://github.com/drorm/gish/blob/main/README.md#interactive-chat-mode
When implementing that request, the idea was to provide a way to reference a range of requests, but that idea was dropped:
#5 (comment)
Now that gish is such a central part of my daily workflow, I've realized there are cases where I want to be able to continue a previous request into a conversation.
The syntax in CLI and interactive is parallel:
gish -c n
or
chat n
Depends on #14 (viewing the history) to choose the right chat.
This is similar to the shell workflow: you use the history to find the chat you want to continue, and then pass its number to -c.
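A sketch of what continuing request n could look like, assuming (hypothetically) that each history entry stores the prompt and response of its exchange; the entry shape is an assumption, not gish's actual history format:

```javascript
// Rebuild the message list for a chat continued from history entry n.
// The entry shape ({ prompt, response }) is a hypothetical example.
function continueFrom(history, n, newPrompt) {
  const entry = history[n];
  return [
    { role: "user", content: entry.prompt },
    { role: "assistant", content: entry.response },
    { role: "user", content: newPrompt },
  ];
}
```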
Currently the only way to change the config is to edit settings.ts, which is not ideal.
Provide a way to pass various flags from https://platform.openai.com/docs/api-reference/chat to the API.
Rather than try to handle each flag, we'll use one flag on our side and pass them together to the API.
--extra-flags '{"temperature": 1.3, "max_tokens": 2000}'
The argument needs to be a valid JSON object (note the quoted keys).
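A sketch of handling the flag, assuming the parsed object is merged into the request body; the flag name comes from this proposal, but the helper and merge point are illustrative assumptions:

```javascript
// Parse --extra-flags and merge it into the chat request body.
// Later keys win, so user-supplied flags override defaults.
function applyExtraFlags(body, extraFlags) {
  let parsed;
  try {
    parsed = JSON.parse(extraFlags);
  } catch (e) {
    throw new Error(`--extra-flags must be a valid JSON object: ${e.message}`);
  }
  return { ...body, ...parsed };
}
```

Letting JSON.parse do the validation keeps gish agnostic about which API flags exist, which is the point of having one pass-through flag.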
Consider the following workflow/loop:
This can be incredibly useful for a variety of tasks, including writing new features where GPT doesn't get it right the first time and needs to iterate several times before getting it right.
Explore
#diff
GPT4 is capable of generating full solutions that include multiple files, enclosed in the conventional format of triple backticks. In general, it seems to have the inclination to include these even when not asked to.
You can ask it to generate a file like this (because of the markdown in the file, can't include it inline):
example.txt
So in effect, we could, with a simple gish request, generate a full, multi-file application, or library.
Generate the files in a directory, and for extra points, we can launch a local server, and the app is right there in the user's browser :-).
When you save to a file, say "foo.ts", gish will check if it exists and append a number: foo-1.ts, foo-2.ts, etc., using the first one that's available.
Instead, always save the file under the original name, and rename the existing file to the first available slot, same as before.
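The versioning logic above can be sketched as a pure function; the exists predicate is injected so it can be tested without touching the filesystem (in gish it would presumably be fs.existsSync, which is an assumption):

```javascript
// Find the first free versioned name: foo.ts -> foo-1.ts, foo-2.ts, ...
function firstAvailable(name, exists) {
  const dot = name.lastIndexOf(".");
  const base = dot === -1 ? name : name.slice(0, dot);
  const ext = dot === -1 ? "" : name.slice(dot);
  let i = 1;
  while (exists(`${base}-${i}${ext}`)) i++;
  return `${base}-${i}${ext}`;
}

// Per the proposal: move the existing file aside, then write the new
// content under the original name, e.g.:
// if (fs.existsSync("foo.ts")) {
//   fs.renameSync("foo.ts", firstAvailable("foo.ts", fs.existsSync));
// }
```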
Currently the interactive history uses the built-in Readline library. Migrate to use our log in history.json, so that you get the same history from interactive mode and the CLI.
Now that we have multiple config files (#2) and the history file, we need to migrate them to one place.
The history file needs to live at ~/.gish/history rather than ~/.gish.json.
sudo npm install -g gish-gpt
gish -c .. hi
file:///usr/lib/node_modules/gish-gpt/dist/gptRequest.js:101
messages.push({ role: "user", content: request });
^
TypeError: Cannot read properties of undefined (reading 'push')
at GptRequest.fetch (file:///usr/lib/node_modules/gish-gpt/dist/gptRequest.js:101:18)
at GptRequest.submitChat (file:///usr/lib/node_modules/gish-gpt/dist/gptRequest.js:64:37)
at Interactive.submitChat (file:///usr/lib/node_modules/gish-gpt/dist/interactive.js:120:26)
at Interactive.handlCommand (file:///usr/lib/node_modules/gish-gpt/dist/interactive.js:114:28)
at Interface.<anonymous> (file:///usr/lib/node_modules/gish-gpt/dist/interactive.js:76:24)
at Interface.emit (node:events:513:28)
at [_onLine] [as _onLine] (node:internal/readline/interface:422:12)
at [_line] [as _line] (node:internal/readline/interface:893:18)
at [_ttyWrite] [as _ttyWrite] (node:internal/readline/interface:1271:22)
at ReadStream.onkeypress (node:internal/readline/interface:270:20)
Node.js v18.15.0
Line 111 in 6ec8161
Looks like we just need to add a check here for when the previous message is not found.
if (!Array.isArray(messages)) {
  console.warn("Unable to process the previous message. Is this a new chat? Is the history corrupt?");
  this.messages = [];
}
There's definitely a better way of implementing this, but it would work.
This might only be a problem on the initial migration from an older version.
gish "Return the first 5 lines only, nothing else" -i 10LinePoem.txt
or
gish -i 10LinePoem.txt "Return the first 5 lines only, nothing else"
It would be dope if functionality like this could be implemented. It would be helpful to string together with -s to summarize notes, manipulate documents, etc.
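A sketch of how a -i flag could combine the input file with the prompt; how gish actually assembles its prompts is an assumption here, and buildPrompt is a hypothetical helper:

```javascript
// Combine the instruction and the input file's contents into one prompt.
function buildPrompt(instruction, fileText) {
  return `${instruction}\n\n${fileText}`;
}

// Hypothetical wiring:
// const fileText = require("fs").readFileSync("10LinePoem.txt", "utf8");
// const prompt = buildPrompt("Return the first 5 lines only, nothing else", fileText);
```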
As part of #12 I added src/generateFiles.ts but did not remove the test at the bottom of the file. As a result, gish generates a directory "output" with some files in it.
This is a specific case of #10.
GPT4 is capable of generating web applications that include HTML, CSS, and JavaScript, enclosing each portion in ```.
In general, it seems to have the inclination to include these even when not asked to.
You can ask it to generate a file like this (because of the markdown in the file, can't include it inline):
example.txt
So in effect, we could, with a simple gish request, generate a full web application.
Generate the files in a directory, and for extra points, we can launch a local server, and the app is right there in the user's browser :-).
For a standard web app it could be
For this initial version, use a regex to parse the file; for future versions, consider using https://github.com/pegjs/pegjs.
The advantage of going in that direction is that a proper grammar handles nesting and edge cases more robustly than a regex.
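A regex-based sketch of the initial version, assuming (hypothetically) that each code block is fenced with triple backticks and an optional language tag; the exact format GPT emits is an assumption:

```javascript
// Extract fenced code blocks from a GPT response.
// Returns [{ lang, code }, ...]; lang is "" when no language tag is given.
function extractBlocks(text) {
  const ticks = "`".repeat(3); // avoids writing a literal fence in this snippet
  const fence = new RegExp(ticks + "([\\w-]*)\\n([\\s\\S]*?)" + ticks, "g");
  const blocks = [];
  let m;
  while ((m = fence.exec(text)) !== null) {
    blocks.push({ lang: m[1], code: m[2] });
  }
  return blocks;
}
```

Each extracted block could then be written out with the file-generation logic from #12; nested fences are exactly the case where a PEG grammar would beat this regex.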