gish's People

Contributors

dependabot[bot], drorm


gish's Issues

Gish seems to have dropped a space character in the first prompt I tried?

Notice the missing space in "approximately12,742".

The same "how big is the earth" prompt returns the same text from the API, except with the expected space, so it seems like there may be a text-parsing issue in gish?

$ gish how big is the earth | cat
The Earth has a diameter of approximately12,742 kilometers (7,918 miles) and a circumference of about 40,075 kilometers (24,901 miles).
Tokens: 39 Cost: $0.000078 Elapsed: 1.422 Seconds. Tokens and cost are estimates when streaming.
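
A minimal sketch of how the space could get lost, assuming the response is assembled from streamed chunks and each chunk is trimmed before being appended (the chunk boundary and the trimming are assumptions, not confirmed from gish's source):

// Hypothetical streamed chunks: the boundary happens to fall before " 12,742".
const chunks = ["The Earth has a diameter of approximately", " 12,742 kilometers"];

// Buggy assembly: trimming each chunk silently drops the leading space.
const buggy = chunks.map((c) => c.trim()).join("");
console.log(buggy); // "...approximately12,742 kilometers"

// Correct assembly: concatenate the chunks verbatim.
const intact = chunks.join("");
console.log(intact); // "...approximately 12,742 kilometers"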

Provide a way to view the history

Provide a way to view the history of previous requests.
While it's always possible to run less ~/.gish/history.json, we need a built-in way to view the history.
To support #13, "Provide a way to continue a chat to a specific previous request", we need to show the history of requests.

gish -h n
  • If n is omitted, it defaults to the last 100 requests.
  • Requests can be thousands of tokens long, so the history view only displays the first 80 characters of each request (this limit can be changed in your config).
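
A minimal sketch of what gish -h could do, assuming ~/.gish/history.json holds an array of entries with a prompt field (the file layout and field names are assumptions, not gish's actual format):

import { readFileSync } from "fs";
import { homedir } from "os";
import { join } from "path";

// Hypothetical shape of a history entry; the real format may differ.
interface HistoryEntry {
  prompt: string;
}

// Show the last n requests, truncating each prompt to `width` characters.
function showHistory(n = 100, width = 80): void {
  const file = join(homedir(), ".gish", "history.json");
  const entries: HistoryEntry[] = JSON.parse(readFileSync(file, "utf8"));
  const start = Math.max(entries.length - n, 0);
  entries.slice(start).forEach((entry, i) => {
    console.log(`${start + i}: ${entry.prompt.slice(0, width)}`);
  });
}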

Support openai chat

Background
See https://platform.openai.com/chat for details about what chat means, but fundamentally, we're providing a way to send the following:

  messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Who won the world series in 2020?"},
        {"role": "assistant", "content": "The Los Angeles Dodgers won the World Series in 2020."},
        {"role": "user", "content": "Where was it played?"}
    ]

This is a similar experience to the one at https://chat.openai.com/chat.

Implementation details
The different ways to start a chat.

  • CLI: gish -c. To keep the conversation, you need to give the flag each time, but it's simple to do something like
alias gchat gish -c

and then call gchat instead of gish when you want to be in a chat.

  • Interactive: chat. In interactive mode, your prompt changes from '> ' to 'chat >' to remind you that you're in chat mode. To exit chat mode you type 'exit'.

Limitations
Since GPT-3.5 is limited to 4K tokens, it can be quite easy to hit that limit in a conversation.
We currently have no way to prevent this, and probably don't handle the resulting error gracefully; this needs to be explored in a separate ticket.
It seems like https://chat.openai.com/chat handles these limits gracefully, probably by summarizing the previous conversation when it reaches the limit and then continuing.

We are sticking to the roles: system, user, assistant for this initial version.
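
A minimal sketch of how chat mode could accumulate messages across turns and send them to the chat completions endpoint (the function and variable names are illustrative, not gish's actual implementation):

type Role = "system" | "user" | "assistant";
interface ChatMessage { role: Role; content: string; }

// The conversation so far; each turn appends the user's message and the model's reply.
const messages: ChatMessage[] = [
  { role: "system", content: "You are a helpful assistant." },
];

async function chatTurn(userInput: string, apiKey: string): Promise<string> {
  messages.push({ role: "user", content: userInput });
  const res = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify({ model: "gpt-3.5-turbo", messages }),
  });
  const data = await res.json();
  const reply: string = data.choices[0].message.content;
  // Keep the assistant's reply so the next turn has the full context.
  messages.push({ role: "assistant", content: reply });
  return reply;
}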

Cool stuff. Feature: enable history to be a trailing context of n messages.

First of all, nice project; I appreciate a clean CLI template for JS :)

I think a useful addition would be to allow a trailing history of messages.

// Keep a trailing window of the last n messages, capped at `limit` tokens.
let messages = [];
let n = 5;
let limit = 2000;

function onNewMessage(msg) {
  // assumes a token-counting helper like gptoken
  msg.count = gptoken.countTokens(msg.content);

  messages.push(msg);

  // prune the conversation to the trailing n messages
  if (messages.length > n) {
    messages.shift();
  }

  // enforce an input length limit in tokens
  let len = messages.reduce((sum, m) => sum + m.count, 0);
  while (len > limit && messages.length > 0) {
    let trash = messages.shift();
    len -= trash.count;
  }
}

// pseudo flow:
// user submits -> onNewMessage(userMessage)
// submit messages -> get response -> onNewMessage(assistantMessage)
// await next input

This adds to the request cost but will improve performance. I think you can just pass in the array of messages; it should be a minimal tweak to the API call.

Provide a way to continue a chat to a specific previous request

Currently the chat feature is only available for the last request, as seen in
https://github.com/drorm/gish/blob/main/README.md#interactive-chat-mode

When implementing that request, the idea was to provide a way to reference a range of requests, but that idea was dropped:
#5 (comment)

Now that gish is such a central part of my daily workflow, I've realized that there are cases where I want to continue a previous request as a conversation.

The syntax in CLI and interactive is parallel:

gish -c n

or

chat n

Depends on #14, viewing the history, to choose the right chat.

This is similar to how history works in the shell: you use the history to find the request you want to continue, and then pass its number to -c.
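
A minimal sketch of seeding a chat from a chosen history entry, assuming each entry records the prompt and the response (the field names are assumptions, not gish's actual format):

import { readFileSync } from "fs";
import { homedir } from "os";
import { join } from "path";

// Hypothetical history entry shape; the real format may differ.
interface HistoryEntry { prompt: string; response: string; }

// Turn history entry n into the opening messages of a new chat.
function seedChatFromHistory(n: number): { role: string; content: string }[] {
  const file = join(homedir(), ".gish", "history.json");
  const entries: HistoryEntry[] = JSON.parse(readFileSync(file, "utf8"));
  const entry = entries[n];
  return [
    { role: "user", content: entry.prompt },
    { role: "assistant", content: entry.response },
  ];
}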

Custom config in ~/.gish/config.json

Currently the only way to change the config is to edit settings.ts, which is not ideal.

  • On startup, check whether ~/.gish/config.json exists.
  • If it doesn't, create it and populate it with the defaults.
  • Use the flags from the user's config.
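
A minimal sketch of that startup check (the default values shown here are placeholders; in gish they would come from settings.ts):

import { existsSync, mkdirSync, readFileSync, writeFileSync } from "fs";
import { homedir } from "os";
import { join } from "path";

// Placeholder defaults; the real ones live in settings.ts.
const defaultSettings = { model: "gpt-3.5-turbo", historyLength: 100 };

function loadConfig(): typeof defaultSettings {
  const dir = join(homedir(), ".gish");
  const file = join(dir, "config.json");
  if (!existsSync(file)) {
    // First run: create ~/.gish/config.json and populate it with the defaults.
    mkdirSync(dir, { recursive: true });
    writeFileSync(file, JSON.stringify(defaultSettings, null, 2));
    return { ...defaultSettings };
  }
  // Merge the user's config over the defaults so missing keys still work.
  const userConfig = JSON.parse(readFileSync(file, "utf8"));
  return { ...defaultSettings, ...userConfig };
}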

Use as an agent: running gish until GPT completes a task.

Consider the following workflow/loop:

  1. Run gish to generate a unit test
  2. Run the unit test
  3. Get an error
  4. Run Gish in chat mode with the error
  5. Go back to 2.

This can be incredibly useful for a variety of tasks, including writing new features where GPT doesn't get it right the first time and needs to iterate several times before it does.

Explore

  • Can we do the plumbing using a shell script as we've done in the git commit script, or do we need some special hooks as we did with #diff?
  • How do we tell if gish is stuck and needs human help :-)?
    • Limit the number of iterations, tell it to ask for help?
    • Provide a y/n continue after each iteration?
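
A minimal sketch of the loop, using child processes and an iteration cap as the stuck-detection guard (the prompts, test command, and quoting are placeholders, not the actual plumbing):

import { execSync } from "child_process";

const maxIterations = 5; // stop after a few rounds instead of looping forever

// 1. Ask gish to generate a unit test (prompt and file names are placeholders).
execSync(`gish "write a unit test for src/foo.ts and save it to test/foo.test.ts"`);

for (let i = 0; i < maxIterations; i++) {
  let errorOutput = "";
  try {
    execSync("npm test", { encoding: "utf8" }); // 2. run the unit test
    break; // tests pass, task complete
  } catch (err: any) {
    errorOutput = String(err.stdout ?? err.message); // 3. capture the error
  }
  // 4. Feed the failure back to gish in chat mode (naive quoting for brevity), then retry.
  execSync(`gish -c "the test failed with: ${errorOutput}"`, { encoding: "utf8" });
}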

Handle saving multiple files enclosed in ``` to generate an app

GPT-4 is capable of generating full solutions that include multiple files, each enclosed in the conventional triple-backtick format. In general, it tends to include these even when not asked to.

You can ask it to generate a file like this (because of the markdown in the file, can't include it inline):
example.txt

So in effect, we could, with a simple gish request, generate a full, multi-file application, or library.
Generate the files in a directory, and for extra points, we can launch a local server, and the app is right there in the user's browser :-).

When saving a file and it exists, rename the old rather than the new

When you save to a file, say "foo.ts", gish will check if it exists and append a number (foo-1.ts, foo-2.ts, etc.), using the first one that's available.
Instead, always save the new content under the original name and rename the existing file to the first available slot, same as before.
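
A minimal sketch of the proposed behavior (the helper name is illustrative):

import { existsSync, renameSync, writeFileSync } from "fs";
import { parse, format } from "path";

// Save `content` under `file`. If the file already exists, move the old copy
// to the first available foo-1.ts, foo-2.ts, ... slot and write the new
// content under the original name.
function saveWithBackup(file: string, content: string): void {
  if (existsSync(file)) {
    const { dir, name, ext } = parse(file);
    let i = 1;
    while (existsSync(format({ dir, name: `${name}-${i}`, ext }))) {
      i++;
    }
    renameSync(file, format({ dir, name: `${name}-${i}`, ext }));
  }
  writeFileSync(file, content);
}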

Failing to read messages with -c on first message, with an old config

sudo npm install -g gish-gpt

gish -c .. hi 


file:///usr/lib/node_modules/gish-gpt/dist/gptRequest.js:101
        messages.push({ role: "user", content: request });
                 ^

TypeError: Cannot read properties of undefined (reading 'push')
    at GptRequest.fetch (file:///usr/lib/node_modules/gish-gpt/dist/gptRequest.js:101:18)
    at GptRequest.submitChat (file:///usr/lib/node_modules/gish-gpt/dist/gptRequest.js:64:37)
    at Interactive.submitChat (file:///usr/lib/node_modules/gish-gpt/dist/interactive.js:120:26)
    at Interactive.handlCommand (file:///usr/lib/node_modules/gish-gpt/dist/interactive.js:114:28)
    at Interface.<anonymous> (file:///usr/lib/node_modules/gish-gpt/dist/interactive.js:76:24)
    at Interface.emit (node:events:513:28)
    at [_onLine] [as _onLine] (node:internal/readline/interface:422:12)
    at [_line] [as _line] (node:internal/readline/interface:893:18)
    at [_ttyWrite] [as _ttyWrite] (node:internal/readline/interface:1271:22)
    at ReadStream.onkeypress (node:internal/readline/interface:270:20)

Node.js v18.15.0

if (this.options.chat) {

Looks like we just need to add a check here for the case where the previous messages are not found.

if (!Array.isArray(this.messages)) {
  console.warn("Unable to process previous messages. Is this a new chat? Is the history corrupt?");
  this.messages = [];
}

There's definitely a better way of implementing this, but it would work.

This might only be a problem on the initial migration from an older version.

Prepend a prompt when passing a file

gish "Return the first 5 lines only, nothing else' -i 10LinePoem.txt
or
gish -i 10LinePoem.txt "Return the first 5 lines only , nothing else"

It would be dope if functionality like this could be implemented. It would be helpful to string together with -s to summarize notes, manipulate documents, etc.
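
A minimal sketch of combining the prompt with the file passed via -i (argument handling is simplified; this is not gish's actual flag parsing):

import { readFileSync } from "fs";

// Build the request by prepending the prompt to the file's contents,
// whether the prompt appears before or after -i on the command line.
function buildRequest(prompt: string, inputFile?: string): string {
  if (!inputFile) return prompt;
  const fileText = readFileSync(inputFile, "utf8");
  return `${prompt}\n\n${fileText}`;
}

// gish "Return the first 5 lines only, nothing else" -i 10LinePoem.txt
const request = buildRequest("Return the first 5 lines only, nothing else", "10LinePoem.txt");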

Use gish to generate a web application

This is a specific case of #10.
GPT-4 is capable of generating web applications that include HTML, CSS, and JavaScript, enclosing each portion in ```.
In general, it tends to include these even when not asked to.

You can ask it to generate a file like this (because of the markdown in the file, can't include it inline):
example.txt

So in effect, we could, with a simple gish request, generate a full web application.
Generate the files in a directory, and for extra points, we can launch a local server, and the app is right there in the user's browser :-).

For a standard web app it could be

  • ```index.html
  • ```styles.css
  • ```index.js
  • ``` README.md -- optional for any comments.

For this initial version, use a regex to parse the file (a sketch follows the list below), but for future versions, consider using https://github.com/pegjs/pegjs.
The advantage of going in that direction is that

  1. It is more robust.
  2. It can give us a definitive answer whether GPT's response is compliant: it only has files enclosed in ```.
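
A minimal sketch of the regex approach, assuming the response labels each block as ```filename on the opening fence (the pattern and file handling are illustrative only):

import { mkdirSync, writeFileSync } from "fs";
import { join } from "path";

// Match blocks of the form ```index.html ... ``` and capture the file name
// and body. A naive pattern, not a full markdown parser.
const blockPattern = /```\s*(\S+)\n([\s\S]*?)```/g;

function saveGeneratedFiles(response: string, outDir: string): string[] {
  mkdirSync(outDir, { recursive: true });
  const written: string[] = [];
  for (const match of response.matchAll(blockPattern)) {
    const [, fileName, body] = match;
    writeFileSync(join(outDir, fileName), body);
    written.push(fileName);
  }
  return written;
}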
