Comments (14)
Fixed by:
(setq gptel-use-curl nil)
from gptel.
Hi @jsevo, I'm reopening this, as this is a bug; avoiding curl doesn't fix it. Could you describe the problem exactly? Do you get this error:
Symbol’s function definition is void: gptel-curl-get-response
AND is the status stuck on "Waiting"?
from gptel.
May I ask how you installed gptel? gptel-curl-get-response should be available with a regular package-install.
from gptel.
Not OP, just speaking for myself.
I had this issue too. Fixed by (require 'gptel-curl).
I didn't install gptel from a package archive, but simply cloned the repo and added it to load-path, so the loaddefs file was never generated.
(I believe this is not a bug per se, but it may be helpful to add an autoload call in gptel.el.)
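For a plain git clone like this, the autoloads file can also be generated by hand. A minimal sketch (the clone path is illustrative; loaddefs-generate requires Emacs 29+, and older versions have update-directory-autoloads instead):

```elisp
;; Generate and load gptel's autoloads once for a manual checkout (Emacs 29+).
(require 'loaddefs-gen)
(let ((dir "~/src/gptel"))                                 ; adjust to your clone
  (loaddefs-generate dir (expand-file-name "gptel-autoloads.el" dir))
  (load (expand-file-name "gptel-autoloads.el" dir) nil t))
```

After this, gptel-curl-get-response is bound to an autoload stub and (require 'gptel-curl) is no longer needed in the init file.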
from gptel.
I had this same issue, also as a result of doing a manual install. (I don't want to add the MELPA archive since I'm only using MELPA Stable, so I didn't install through package-install.)
from gptel.
Sorry for the late reply. I just found your note here after encountering the same issue again.
The only reason was that I didn't know that was the correct way. Thanks for the pointer!
from gptel.
Cool, I've updated the README with better instructions. Thanks for bringing it up.
from gptel.
I got the same issue:
Querying ChatGPT...
gptel-send: Symbol’s function definition is void: gptel-curl-get-response
I can fix it by using:
(require 'gptel-curl)
(use-package gptel
  :config
  (require 'gptel-curl)
  ;; ...
  )
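A small variation on the workaround above (my suggestion, not from the thread) guards the require so the curl backend is only loaded when curl is actually on PATH, which mirrors how gptel-use-curl picks its default:

```elisp
(use-package gptel
  :config
  ;; Load the curl backend only when curl is available; otherwise
  ;; gptel falls back to the built-in url-retrieve.
  (when (executable-find "curl")
    (require 'gptel-curl)))
```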
from gptel.
@willbush How are you installing gptel?
from gptel.
> @willbush How are you installing gptel?
I use https://github.com/nix-community/emacs-overlay in NixOS.
I bring that into my entire system configuration here.
I define what external packages I need here.
Then, I use use-package to configure it here.
Basically, I don't use a run-time package manager to fetch packages in Emacs. The command I use to update my system also updates Emacs and its packages based on a flake.lock file.
So in my load-path I have:
"/nix/store/b94k756s95c21h5k3rvn30n8q3br6rhj-emacs-packages-deps/share/emacs/site-lisp/elpa/gptel-20231210.334"
Package gptel is external.
Status: External in ‘/nix/store/b94k756s95c21h5k3rvn30n8q3br6rhj-emacs-packages-deps/share/emacs/site-lisp/elpa/gptel-20231210.334/’ (unsigned).
Version: 20231210.334
Commit: 1f3b911c4e7c718ae5940f33ab12910e1cc7fac8
Summary: A simple multi-LLM client
Requires: emacs-27.1, transient-0.4.0
Website: https://github.com/karthink/gptel
Keywords: convenience
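A quick way to check whether the autoload stub for the missing function was ever registered at startup (a diagnostic sketch, not from the thread):

```elisp
;; If gptel-autoloads.el was loaded, gptel-curl-get-response is bound
;; to an autoload stub until gptel-curl itself gets loaded.
(fboundp 'gptel-curl-get-response)       ; nil => the autoloads never ran
(when (fboundp 'gptel-curl-get-response)
  ;; t => only the stub is present; the real definition loads on first call.
  (autoloadp (symbol-function 'gptel-curl-get-response)))
```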
from gptel.
It does. I'm not too sure what's going wrong. Perhaps you might notice something?
dired of gptel-20231210.334:
/nix/store/b94k756s95c21h5k3rvn30n8q3br6rhj-emacs-packages-deps/share/emacs/site-lisp/elpa/gptel-20231210.334:
dr-xr-xr-x 2 root root 4.0K Dec 31 1969 .
dr-xr-xr-x 135 root root 12K Dec 31 1969 ..
lrwxrwxrwx 2 root root 123 Dec 31 1969 gptel.el -> /nix/store/0jpkpc67k0xzjk2dqbpr2jb56jrjs06b-emacs-gptel-20231210.334/share/emacs/site-lisp/elpa/gptel-20231210.334/gptel.el
lrwxrwxrwx 2 root root 124 Dec 31 1969 gptel.elc -> /nix/store/0jpkpc67k0xzjk2dqbpr2jb56jrjs06b-emacs-gptel-20231210.334/share/emacs/site-lisp/elpa/gptel-20231210.334/gptel.elc
lrwxrwxrwx 2 root root 133 Dec 31 1969 gptel-autoloads.el -> /nix/store/0jpkpc67k0xzjk2dqbpr2jb56jrjs06b-emacs-gptel-20231210.334/share/emacs/site-lisp/elpa/gptel-20231210.334/gptel-autoloads.el
lrwxrwxrwx 2 root root 128 Dec 31 1969 gptel-curl.el -> /nix/store/0jpkpc67k0xzjk2dqbpr2jb56jrjs06b-emacs-gptel-20231210.334/share/emacs/site-lisp/elpa/gptel-20231210.334/gptel-curl.el
lrwxrwxrwx 2 root root 129 Dec 31 1969 gptel-curl.elc -> /nix/store/0jpkpc67k0xzjk2dqbpr2jb56jrjs06b-emacs-gptel-20231210.334/share/emacs/site-lisp/elpa/gptel-20231210.334/gptel-curl.elc
lrwxrwxrwx 2 root root 130 Dec 31 1969 gptel-ollama.el -> /nix/store/0jpkpc67k0xzjk2dqbpr2jb56jrjs06b-emacs-gptel-20231210.334/share/emacs/site-lisp/elpa/gptel-20231210.334/gptel-ollama.el
lrwxrwxrwx 2 root root 131 Dec 31 1969 gptel-ollama.elc -> /nix/store/0jpkpc67k0xzjk2dqbpr2jb56jrjs06b-emacs-gptel-20231210.334/share/emacs/site-lisp/elpa/gptel-20231210.334/gptel-ollama.elc
lrwxrwxrwx 2 root root 130 Dec 31 1969 gptel-openai.el -> /nix/store/0jpkpc67k0xzjk2dqbpr2jb56jrjs06b-emacs-gptel-20231210.334/share/emacs/site-lisp/elpa/gptel-20231210.334/gptel-openai.el
lrwxrwxrwx 2 root root 131 Dec 31 1969 gptel-openai.elc -> /nix/store/0jpkpc67k0xzjk2dqbpr2jb56jrjs06b-emacs-gptel-20231210.334/share/emacs/site-lisp/elpa/gptel-20231210.334/gptel-openai.elc
lrwxrwxrwx 2 root root 127 Dec 31 1969 gptel-pkg.el -> /nix/store/0jpkpc67k0xzjk2dqbpr2jb56jrjs06b-emacs-gptel-20231210.334/share/emacs/site-lisp/elpa/gptel-20231210.334/gptel-pkg.el
lrwxrwxrwx 2 root root 133 Dec 31 1969 gptel-transient.el -> /nix/store/0jpkpc67k0xzjk2dqbpr2jb56jrjs06b-emacs-gptel-20231210.334/share/emacs/site-lisp/elpa/gptel-20231210.334/gptel-transient.el
lrwxrwxrwx 2 root root 134 Dec 31 1969 gptel-transient.elc -> /nix/store/0jpkpc67k0xzjk2dqbpr2jb56jrjs06b-emacs-gptel-20231210.334/share/emacs/site-lisp/elpa/gptel-20231210.334/gptel-transient.elc
gptel-autoloads.el:
;;; gptel-autoloads.el --- automatically extracted autoloads (do not edit) -*- lexical-binding: t -*-
;; Generated by the `loaddefs-generate' function.
;; This file is part of GNU Emacs.
;;; Code:
(add-to-list 'load-path (or (and load-file-name (directory-file-name (file-name-directory load-file-name))) (car load-path)))
;;; Generated autoloads from gptel.el
(autoload 'gptel-mode "gptel" "\
Minor mode for interacting with ChatGPT.
This is a minor mode. If called interactively, toggle the `GPTel
mode' mode. If the prefix argument is positive, enable the mode,
and if it is zero or negative, disable the mode.
If called from Lisp, toggle the mode if ARG is `toggle'. Enable
the mode if ARG is nil, omitted, or is a positive number.
Disable the mode if ARG is a negative number.
To check whether the minor mode is enabled in the current buffer,
evaluate `gptel-mode'.
The mode's hook is called both when the mode is enabled and when
it is disabled.
(fn &optional ARG)" t)
(autoload 'gptel-send "gptel" "\
Submit this prompt to ChatGPT.
With prefix arg ARG activate a transient menu with more options
instead.
(fn &optional ARG)" t)
(autoload 'gptel "gptel" "\
Switch to or start ChatGPT session with NAME.
With a prefix arg, query for a (new) session name.
Ask for API-KEY if `gptel-api-key' is unset.
If region is active, use it as the INITIAL prompt. Returns the
buffer created or switched to.
(fn NAME &optional _ INITIAL)" t)
(register-definition-prefixes "gptel" '("gptel-"))
;;; Generated autoloads from gptel-curl.el
(autoload 'gptel-curl-get-response "gptel-curl" "\
Retrieve response to prompt in INFO.
INFO is a plist with the following keys:
- :prompt (the prompt being sent)
- :buffer (the gptel buffer)
- :position (marker at which to insert the response).
Call CALLBACK with the response and INFO afterwards. If omitted
the response is inserted into the current buffer after point.
(fn INFO &optional CALLBACK)")
(register-definition-prefixes "gptel-curl" '("gptel-"))
;;; Generated autoloads from gptel-ollama.el
(autoload 'gptel-make-ollama "gptel-ollama" "\
Register an Ollama backend for gptel with NAME.
Keyword arguments:
HOST is where Ollama runs (with port), typically localhost:11434
MODELS is a list of available model names.
STREAM is a boolean to toggle streaming responses, defaults to
false.
PROTOCOL (optional) specifies the protocol, http by default.
ENDPOINT (optional) is the API endpoint for completions, defaults to
\"/api/generate\".
HEADER (optional) is for additional headers to send with each
request. It should be an alist or a function that retuns an
alist, like:
((\"Content-Type\" . \"application/json\"))
KEY (optional) is a variable whose value is the API key, or
function that returns the key. This is typically not required for
local models like Ollama.
Example:
-------
(gptel-make-ollama
\"Ollama\"
:host \"localhost:11434\"
:models \\='(\"mistral:latest\")
:stream t)
(fn NAME &key HOST HEADER KEY MODELS STREAM (PROTOCOL \"http\") (ENDPOINT \"/api/generate\"))")
(register-definition-prefixes "gptel-ollama" '("gptel--ollama-context"))
;;; Generated autoloads from gptel-openai.el
(autoload 'gptel-make-openai "gptel-openai" "\
Register a ChatGPT backend for gptel with NAME.
Keyword arguments:
HOST (optional) is the API host, typically \"api.openai.com\".
MODELS is a list of available model names.
STREAM is a boolean to toggle streaming responses, defaults to
false.
PROTOCOL (optional) specifies the protocol, https by default.
ENDPOINT (optional) is the API endpoint for completions, defaults to
\"/v1/chat/completions\".
HEADER (optional) is for additional headers to send with each
request. It should be an alist or a function that retuns an
alist, like:
((\"Content-Type\" . \"application/json\"))
KEY (optional) is a variable whose value is the API key, or
function that returns the key.
(fn NAME &key HEADER MODELS STREAM (KEY \\='gptel-api-key) (HOST \"api.openai.com\") (PROTOCOL \"https\") (ENDPOINT \"/v1/chat/completions\"))")
(autoload 'gptel-make-azure "gptel-openai" "\
Register an Azure backend for gptel with NAME.
Keyword arguments:
HOST is the API host.
MODELS is a list of available model names.
STREAM is a boolean to toggle streaming responses, defaults to
false.
PROTOCOL (optional) specifies the protocol, https by default.
ENDPOINT is the API endpoint for completions.
HEADER (optional) is for additional headers to send with each
request. It should be an alist or a function that retuns an
alist, like:
((\"Content-Type\" . \"application/json\"))
KEY (optional) is a variable whose value is the API key, or
function that returns the key.
Example:
-------
(gptel-make-azure
\"Azure-1\"
:protocol \"https\"
:host \"RESOURCE_NAME.openai.azure.com\"
:endpoint
\"/openai/deployments/DEPLOYMENT_NAME/completions?api-version=2023-05-15\"
:stream t
:models \\='(\"gpt-3.5-turbo\" \"gpt-4\"))
(fn NAME &key HOST (PROTOCOL \"https\") (HEADER (lambda nil \\=`((\"api-key\" \\=\\, (gptel--get-api-key))))) (KEY \\='gptel-api-key) MODELS STREAM ENDPOINT)")
(defalias 'gptel-make-gpt4all 'gptel-make-openai "\
Register a GPT4All backend for gptel with NAME.
Keyword arguments:
HOST is where GPT4All runs (with port), typically localhost:8491
MODELS is a list of available model names.
STREAM is a boolean to toggle streaming responses, defaults to
false.
PROTOCOL specifies the protocol, https by default.
ENDPOINT (optional) is the API endpoint for completions, defaults to
\"/api/v1/completions\"
HEADER (optional) is for additional headers to send with each
request. It should be an alist or a function that retuns an
alist, like:
((\"Content-Type\" . \"application/json\"))
KEY (optional) is a variable whose value is the API key, or
function that returns the key. This is typically not required for
local models like GPT4All.
Example:
-------
(gptel-make-gpt4all
\"GPT4All\"
:protocol \"http\"
:host \"localhost:4891\"
:models \\='(\"mistral-7b-openorca.Q4_0.gguf\"))")
;;; Generated autoloads from gptel-transient.el
(autoload 'gptel-menu "gptel-transient" nil t)
(register-definition-prefixes "gptel-transient" '("gptel-"))
;;; End of scraped data
(provide 'gptel-autoloads)
;; Local Variables:
;; version-control: never
;; no-byte-compile: t
;; no-update-autoloads: t
;; no-native-compile: t
;; coding: utf-8-emacs-unix
;; End:
;;; gptel-autoloads.el ends here
I did check that gptel-curl.el has:
;;TODO: The :transformer argument here is an alternate implementation of
;;`gptel-response-filter-functions'. The two need to be unified.
;;;###autoload
(defun gptel-curl-get-response (info &optional callback)
;;; elided ...
from gptel.
Hmm, I tried with emacs -Q and got the same thing. I'm going to try without native-comp, and perhaps installing manually.
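For the emacs -Q test, something like this evaluated in *scratch* (path illustrative) shows whether loading the autoloads file by hand fixes the lookup:

```elisp
;; After `emacs -Q', put only gptel on load-path and load its autoloads.
(add-to-list 'load-path "/path/to/gptel")  ; illustrative, not my real path
(load "gptel-autoloads" t)                 ; NOERROR: no-op if file is missing
(fboundp 'gptel-curl-get-response)         ; t once the autoloads are in effect
```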
from gptel.