Comments (9)

minad commented on May 25, 2024

Which one do you think should be the default?

I don't care much, but personally I would for sure disable the header line.

The reason I picked the header-line is because there are too many mode-line packages out there that transform/hide mode-line elements and minor-mode lighter info.

I agree. In my setup I only use minions and have made a few simple adjustments to the mode line myself, but besides that my mode-line is mostly vanilla.

Since the header-line is only intended for dedicated gptel buffers, it seemed like a safe bet.

Indeed, it is. The mode-line-process variable should be a reasonably safe alternative.
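
A minimal sketch of what that could look like (not gptel's actual code; the status strings are placeholders):

    ;; Minimal sketch: report request status through the buffer-local
    ;; `mode-line-process' variable and refresh the mode line.
    (setq mode-line-process " Waiting...")   ; request sent
    (force-mode-line-update)
    ;; ...and clear the indicator once the response has arrived:
    (setq mode-line-process nil)
    (force-mode-line-update)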

karthink commented on May 25, 2024

@minad Streaming responses have now been added and are turned on by default.
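
For reference, disabling it again would be a one-line setting; the option name gptel-stream below is my assumption of how the toggle is exposed, so check the variable name in your gptel version:

    ;; Streaming is the default; to fall back to single-shot responses,
    ;; disable the toggle (assumed here to be `gptel-stream'):
    (setq gptel-stream nil)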

karthink commented on May 25, 2024

Done, finally! Set gptel-use-header-line to nil for less obtrusive status display/messaging when using gptel-mode.
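
In an init file this is a single setting, e.g.:

    ;; Show gptel's status in the mode line instead of the header line:
    (setq gptel-use-header-line nil)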

karthink commented on May 25, 2024

Sure, the header-line is just a stopgap solution. I'm sure there are better alternatives.

To get more progress updates, there are a few possibilities:

  1. The API has a stream option that sends chunks of data. It is possible to write a process filter that updates the buffer live as these come in, like in the browser. This is the correct solution, but I've been working on other aspects. (PRs are welcome! A rough sketch follows this list.)
  2. There is a gptel-playback option in the package that plays back data in chunks. This doesn't save you any time; it waits until it receives the full response and then plays it back continuously, so it's actually slower! It's silly, but it does make the process feel somewhat interactive.
  3. Another cheap trick I've tried is to pulse-momentary-highlight-region the response when it comes in. It's still slow, but like the previous option it feels a little more interactive.
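
A rough sketch of possibility 1 (hypothetical names, not the code that later landed in gptel); note that real streamed chunks arrive as server-sent-event lines and would need to be parsed before insertion:

    ;; Process filter that appends each chunk of output to the end of the
    ;; chat buffer as it arrives, so the response appears to stream in.
    (defun my/gptel-stream-filter (process chunk)
      "Append CHUNK of output from PROCESS to its associated buffer."
      (when (buffer-live-p (process-buffer process))
        (with-current-buffer (process-buffer process)
          (save-excursion
            (goto-char (point-max))
            (insert chunk)))))

    ;; Attach it to the process that makes the API request:
    ;; (set-process-filter my-request-process #'my/gptel-stream-filter)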

minad commented on May 25, 2024

The API has a stream option that sends chunks of data. It is possible to write a process filter that updates the buffer live as these come in, like in the browser. This is the correct solution, but I've been working on other aspects. (PRs are welcome!)

This is the approach I favor. I am happy to help out if I find time, but I have only just started experimenting with this. The overall approach of your package, with the list of prompts kept in a plain text document, seems good, but I haven't checked out the other solutions.

minad commented on May 25, 2024

Would you accept a quick PR which replaces the header-line with a mode-line indicator? I would add a gptel-status-function, which could be set either to a function that displays the status in the header-line or to one that displays it in the mode-line.
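
To illustrate the proposal, a sketch of what such an option could look like (nothing below is part of gptel; the two display functions are hypothetical):

    ;; Proposed `gptel-status-function': a customizable function that
    ;; receives the current status string (or nil to clear it) and
    ;; decides where to display it.
    (defcustom gptel-status-function #'gptel-status-mode-line
      "Function called with gptel's current status string, or nil to clear it."
      :type 'function
      :group 'gptel)

    (defun gptel-status-mode-line (status)
      "Show STATUS in the mode line via `mode-line-process'."
      (setq mode-line-process (and status (concat " " status))))

    (defun gptel-status-header-line (status)
      "Show STATUS in the header line."
      (setq header-line-format (and status (concat " " status))))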

karthink commented on May 25, 2024

Sure, that would be a welcome addition! Which one do you think should be the default?

The reason I picked the header-line is that there are too many mode-line packages out there that transform or hide mode-line elements and minor-mode lighter info. doom-modeline is a popular example of this: no Doom users would see any status messages until someone adds explicit support for doom-modeline. This makes my job of reporting errors, readiness, etc. harder.

Since the header-line is only intended for dedicated gptel buffers, it seemed like a safe bet.

karthink commented on May 25, 2024

Indeed, it is. The mode-line-process variable should be a reasonably safe alternative.

Sounds good to me.

karthink commented on May 25, 2024

This is the approach I favor. I am happy to help out if I find time, but I have only just started experimenting with this. The overall approach of your package, with the list of prompts kept in a plain text document, seems good, but I haven't checked out the other solutions.

@minad I added continuous response streaming -- it's now much faster (and continues to be async):

[demo video: stream-demo.mp4]

I haven't merged it yet (it's in the streaming branch) since the markdown -> Org conversion from an incomplete parse state is proving tricky.
