GPTel: A simple ChatGPT client for Emacs

https://melpa.org/packages/gptel-badge.svg

GPTel is a simple, no-frills ChatGPT client for Emacs.

Demo videos: intro-demo.mp4, intro-demo-2.mp4
  • Requires an OpenAI API key.
  • It’s async and fast; it streams responses.
  • Interact with ChatGPT from anywhere in Emacs (any buffer, shell, minibuffer, wherever).
  • ChatGPT’s responses are in Markdown or Org markup.
  • Supports conversations and multiple independent sessions.
  • You can go back and edit your previous prompts, or even ChatGPT’s previous responses when continuing a conversation. These will be fed back to ChatGPT.

GPTel uses Curl if available, but falls back to url-retrieve to work without external dependencies.

Breaking Changes

  • gptel-api-key-from-auth-source now searches for the API key using the value of gptel-host, i.e. “api.openai.com” instead of the original “openai.com”. You need to update your ~/.authinfo.

Installation

GPTel is on MELPA. Install it with M-x package-install⏎ gptel.

(Optional: Install markdown-mode.)

Straight

(straight-use-package 'gptel)

Installing the markdown-mode package is optional.

Manual

Clone or download this repository and run M-x package-install-file⏎ on the repository directory.

Installing the markdown-mode package is optional.

Doom Emacs

In packages.el

(package! gptel)

In config.el

(use-package! gptel
 :config
 (setq! gptel-api-key "your key"))

Spacemacs

After installation with M-x package-install⏎ gptel:

  • Add gptel to dotspacemacs-additional-packages
  • Add (require 'gptel) to dotspacemacs/user-config
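
A minimal sketch of what the relevant fragments of ~/.spacemacs might look like (everything else is elided; exact placement depends on your dotfile):

(defun dotspacemacs/layers ()
  (setq-default
   ;; ... other layer settings ...
   dotspacemacs-additional-packages '(gptel)))

(defun dotspacemacs/user-config ()
  ;; ... other user configuration ...
  (require 'gptel))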

Usage

Procure an OpenAI API key.

Optional: Set gptel-api-key to the key. Alternatively, you may choose a more secure method such as:

  • Storing in ~/.authinfo. By default, “api.openai.com” is used as HOST and “apikey” as USER.
    machine api.openai.com login apikey password TOKEN
        
  • Storing in any file and manually reading it
    (use-package! gptel
      :config
      (with-temp-buffer
        (insert-file-contents "/file/containing/openai.token")
        (setq! gptel-api-key (buffer-string))))
        
  • Setting it to a function that returns the key.
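    For example, such a function could read the key from auth-source at request time. A minimal sketch, assuming the ~/.authinfo entry shown above (gptel’s own gptel-api-key-from-auth-source works along these lines):
    ;; Called with no arguments whenever the key is needed.
    (setq gptel-api-key
          (lambda ()
            (auth-source-pick-first-password :host "api.openai.com"
                                             :user "apikey")))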

In any buffer:

  1. Select a region of text and call M-x gptel-send. The response will be inserted below your region.
  2. You can select both the original prompt and the response and call M-x gptel-send again to continue the conversation.
  3. Call M-x gptel-send with a prefix argument to
  • set chat parameters (GPT model, directives, etc.) for this buffer,
  • read the prompt from elsewhere or redirect the response elsewhere,
  • or replace the prompt with the response.

https://user-images.githubusercontent.com/8607532/230770018-9ce87644-6c17-44af-bd39-8c899303dce1.png

With a region selected, you can also rewrite prose or refactor code from here:

Code:

https://user-images.githubusercontent.com/8607532/230770162-1a5a496c-ee57-4a67-9c95-d45f238544ae.png

Prose:

https://user-images.githubusercontent.com/8607532/230770352-ee6f45a3-a083-4cf0-b13c-619f7710e9ba.png
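
Since gptel-send works from any buffer, a global key binding can be convenient. The key chosen below is purely illustrative (inside its own chat buffers, gptel already binds the command to C-c RET, as described in the next section):

(global-set-key (kbd "C-c g") #'gptel-send)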

In a dedicated chat buffer:

  1. Run M-x gptel to start or switch to the ChatGPT buffer. It will ask you for the key if you skipped the previous step. Run it with a prefix-arg (C-u M-x gptel) to start a new session.
  2. In the gptel buffer, send your prompt with M-x gptel-send, bound to C-c RET.
  3. Set chat parameters (GPT model, directives etc) for the session by calling gptel-send with a prefix argument (C-u C-c RET):

https://user-images.githubusercontent.com/8607532/224946059-9b918810-ab8b-46a6-b917-549d50c908f2.png

That’s it. You can go back and edit previous prompts and responses if you want.

The default mode is markdown-mode if available, else text-mode. You can set gptel-default-mode to org-mode if desired.
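
For example:

(setq gptel-default-mode 'org-mode)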

Using it your way

GPTel’s default usage pattern is simple, and will stay this way: Read input in any buffer and insert the response below it.

If you want custom behavior, such as

  • reading input from or sending output to the echo area,
  • doing the same with pop-up windows,
  • or sending only the current line, etc.,

GPTel provides a general gptel-request function that accepts a custom prompt and a callback to act on the response. You can use this to build custom workflows not supported by gptel-send. See the documentation of gptel-request, and the wiki for examples.
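
As a concrete illustration, here is a sketch of a small command that sends the current line and echoes the reply. It assumes the callback convention described in gptel-request’s documentation (a response string, or nil on failure, plus an info plist); check the docstring for the full set of keyword arguments:

;; An illustrative command (not part of gptel); the name is arbitrary.
(defun my/gptel-explain-line ()
  "Ask ChatGPT to explain the current line, showing the answer in the echo area."
  (interactive)
  (gptel-request
   (concat "Explain this line briefly:\n"
           (buffer-substring-no-properties (line-beginning-position)
                                           (line-end-position)))
   :callback (lambda (response info)
               (if response
                   (message "%s" response)
                 (message "gptel-request failed: %s" (plist-get info :status))))))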

Extensions using GPTel

These are packages that depend on GPTel to provide additional functionality.

Additional Configuration

  • gptel-host: Overrides the OpenAI API host. This is useful if you convert the Azure API to the OpenAI API format, use a reverse proxy, or rely on a third-party proxy service for the OpenAI API.
  • gptel-proxy: Path to a proxy to use for GPTel interactions. This is passed to Curl via the --proxy argument.
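
For example (the values here are placeholders, not recommendations):

(setq gptel-host "api.my-openai-proxy.example"    ; an OpenAI-compatible endpoint
      gptel-proxy "socks5://127.0.0.1:9050")      ; handed to Curl as --proxy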

Why another ChatGPT client?

Other Emacs clients for ChatGPT prescribe the format of the interaction (a comint shell, org-babel blocks, etc). I wanted:

  1. Something that is as free-form as possible: query ChatGPT using any text in any buffer, and redirect the response as required. Using a dedicated gptel buffer just adds some visual flair to the interaction.
  2. Integration with org-mode, not using a walled-off org-babel block, but as regular text. This way ChatGPT can generate code blocks that I can run.

Will you add feature X?

Maybe; I’d like to experiment a bit more first. Features added since the inception of this package include:

  • Curl support (gptel-use-curl)
  • Streaming responses (gptel-stream)
  • Cancelling requests in progress (gptel-abort)
  • General API for writing your own commands (gptel-request, wiki)
  • Dispatch menus using Transient (gptel-send with a prefix arg)
  • Specifying the conversation context size
  • GPT-4 support
  • Response redirection (to the echo area, another buffer, etc)
  • A built-in refactor/rewrite prompt

Features being considered or in the pipeline:

  • Limiting conversation context to Org headings using properties (#58)
  • Stateless design (#17)

Alternatives

Other Emacs clients for ChatGPT include

  • chatgpt-shell: comint-shell based interaction with ChatGPT. Also supports DALL-E, executable code blocks in the responses, and more.
  • org-ai: Interaction through special #+begin_ai ... #+end_ai Org-mode blocks. Also supports DALL-E, querying ChatGPT with the contents of project files, and more.

There are several more, including chatgpt-arcana, leafy-mode and chat.el.

Acknowledgments

Contributors: karthink, akirak, alessandrow, minad, fmguerreiro, marcuskammer, palacechan, ridaayed, schroedi, tshu-w, tony-dwire-paradigm, tmr08c.
