valentinegb / openai

An unofficial Rust library for the OpenAI API.
Home Page: https://crates.io/crates/openai/
License: MIT License
Sometimes, for whatever reason, a developer may prefer not to use an environment variable for authentication. Currently, this is impossible.
I'm thinking of a function something like `set_key(key: &str)` that stores the key in a variable that can only be read within the library. Using an environment variable for the API key would then look something like this:
use dotenvy::dotenv;
use std::env;

#[tokio::main]
async fn main() {
    // Load variables from a `.env` file, if one is present.
    dotenv().ok();
    openai::set_key(&env::var("OPENAI_KEY").unwrap());
}
Alternatively, there could just be a public `key` variable in the library that can be read and changed from anywhere, but that doesn't sound very safe to me.
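The idea above can be sketched with nothing but the standard library. This is a hypothetical illustration, not the crate's actual implementation: the static is private to the crate, `set_key` is the only public way to write it, and `key()` (a name invented here) is what internal request code would call.

```rust
use std::sync::RwLock;

// Hypothetical library-side storage: a private static that only
// code inside the crate can read directly.
static API_KEY: RwLock<Option<String>> = RwLock::new(None);

/// Public setter exposed by the library.
pub fn set_key(key: &str) {
    *API_KEY.write().unwrap() = Some(key.to_string());
}

/// Crate-private getter used internally when building requests
/// (the `pub(crate)` visibility keeps it out of the public API).
pub(crate) fn key() -> Option<String> {
    API_KEY.read().unwrap().clone()
}

fn main() {
    set_key("sk-test");
    // Internal code can now read the key; external code cannot.
    assert_eq!(key().as_deref(), Some("sk-test"));
    println!("key set: {}", key().is_some());
}
```

A `RwLock` is used rather than a plain static so the key can be set (or replaced) at runtime without `unsafe`.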
Currently, there's a little bit of boilerplate when creating, for example, a `Completion` (namely the `Default::default()` calls). It doesn't look all that nice.
We can get rid of this boilerplate by introducing builders. This would also allow for more flexible ways of creating completions and such, like setting `echo` later in code if a certain condition is met. With builders, creating a `Completion` would look a bit like this:
let completion = Completion::new(ModelID::TextDavinci003)
.prompt("Say this is a test")
.max_tokens(7)
.temperature(0.0)
.create()
.await
.unwrap()
.unwrap();
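A minimal sketch of how such a builder could be structured, with hypothetical field names and a trimmed field set (the real `Completion` has many more parameters, and `create()` would perform the actual API call; `build()` stands in for it here):

```rust
// A few illustrative request fields; not the crate's real struct.
#[derive(Debug, Default)]
struct Completion {
    model: String,
    prompt: String,
    max_tokens: u16,
    temperature: f32,
}

// The builder owns a partially-filled request and returns `self`
// from each setter so calls can be chained.
struct CompletionBuilder(Completion);

impl Completion {
    fn new(model: &str) -> CompletionBuilder {
        CompletionBuilder(Completion {
            model: model.into(),
            ..Default::default()
        })
    }
}

impl CompletionBuilder {
    fn prompt(mut self, prompt: &str) -> Self {
        self.0.prompt = prompt.into();
        self
    }
    fn max_tokens(mut self, n: u16) -> Self {
        self.0.max_tokens = n;
        self
    }
    fn temperature(mut self, t: f32) -> Self {
        self.0.temperature = t;
        self
    }
    // Stand-in for the async `create()` call.
    fn build(self) -> Completion {
        self.0
    }
}

fn main() {
    let completion = Completion::new("text-davinci-003")
        .prompt("Say this is a test")
        .max_tokens(7)
        .temperature(0.0)
        .build();
    println!("{:?}", completion);
}
```

Unset parameters simply keep their `Default` values, so callers never have to write `..Default::default()` themselves.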
The current version of this library does not support setting `stream: true` for EventSource responses. This means that when a user makes a request to the API that expects an EventSource response, the library cannot properly parse the incoming events.
In addition to the existing `create()` function, add a new function `stream()` that builds something like a `CompletionStream` implementing the `futures::Stream` trait. The user could then call the API in streaming mode like this:
// `next()` comes from `futures::StreamExt` and needs a mutable binding.
let mut chat_completion_stream = ChatCompletion::builder()
    .model(ModelID::Gpt3_5Turbo)
    .messages(vec![...])
    .stream()
    .unwrap();

while let Some(event) = chat_completion_stream.next().await {
    println!("Got {:?}", event);
}
Sometimes, you might not want to, or be able to, use an environment variable to store the API key. For example, in a Shuttle project, Shuttle Secrets are used instead of environment variables. However, the `openai` library currently requires you to use an environment variable regardless.
I'm pretty sure anyone will only want to use one API key per project, so it's not necessary to allow specifying which key to use every time an API function/method is executed. (E.g., having a `key` parameter, or being a method on a structure that is initialized with a key; we don't want that.)
I think the best way to do this would be to have a public `set_key()` function that sets a private variable that can only be read within the library. Functions and methods within the library can then reference this variable when they need the API key.
There's still one problem, though: currently, the key is needed at compile time to generate the `ModelID` enum. I don't yet have a solution to this. If anyone has any ideas, definitely do tell me here, please.
Documentation cannot be hosted on docs.rs because the project requires an API key to build.
Create a GitHub action that builds the documentation and hosts it with GitHub Pages.
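A workflow along these lines could do it. This is a hedged sketch: the `OPENAI_KEY` secret name and the use of `peaceiris/actions-gh-pages` for publishing are assumptions, and the key is needed only because the crate currently requires it to build.

```yaml
# Sketch of a docs-publishing workflow (not the repo's actual workflow).
name: Docs
on:
  push:
    branches: [main]
jobs:
  docs:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      # Build rustdoc output; the crate needs the key at build time.
      - run: cargo doc --no-deps
        env:
          OPENAI_KEY: ${{ secrets.OPENAI_KEY }}
      # Publish target/doc to the gh-pages branch.
      - uses: peaceiris/actions-gh-pages@v3
        with:
          github_token: ${{ secrets.GITHUB_TOKEN }}
          publish_dir: ./target/doc
```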
When the Rust workflow is triggered by a pull request, it checks out the last commit on the `main` branch instead of the last commit in the pull request / forked branch. The following log is from the `Run actions/checkout@v3` step of the Rust check on #50, as an example.
2023-03-26T20:01:30.8912244Z ##[group]Run actions/checkout@v3
2023-03-26T20:01:30.8912552Z with:
2023-03-26T20:01:30.8912795Z repository: valentinegb/openai
2023-03-26T20:01:30.8913221Z token: ***
2023-03-26T20:01:30.8913433Z ssh-strict: true
2023-03-26T20:01:30.8913651Z persist-credentials: true
2023-03-26T20:01:30.8913876Z clean: true
2023-03-26T20:01:30.8914078Z fetch-depth: 1
2023-03-26T20:01:30.8914276Z lfs: false
2023-03-26T20:01:30.8914458Z submodules: false
2023-03-26T20:01:30.8914681Z set-safe-directory: true
2023-03-26T20:01:30.8914899Z env:
2023-03-26T20:01:30.8915084Z CARGO_TERM_COLOR: always
2023-03-26T20:01:30.8915450Z OPENAI_KEY: ***
2023-03-26T20:01:30.8915660Z ##[endgroup]
2023-03-26T20:01:31.3029806Z Syncing repository: valentinegb/openai
2023-03-26T20:01:31.3032228Z ##[group]Getting Git version info
2023-03-26T20:01:31.3032907Z Working directory is '/home/runner/work/openai/openai'
2023-03-26T20:01:31.3054321Z [command]/usr/bin/git version
2023-03-26T20:01:31.3131653Z git version 2.40.0
2023-03-26T20:01:31.3246224Z ##[endgroup]
2023-03-26T20:01:31.3264186Z Temporarily overriding HOME='/home/runner/work/_temp/1452584f-27dd-44d3-8302-14ec8bc73ad9' before making global git config changes
2023-03-26T20:01:31.3264747Z Adding repository directory to the temporary git global config as a safe directory
2023-03-26T20:01:31.3265340Z [command]/usr/bin/git config --global --add safe.directory /home/runner/work/openai/openai
2023-03-26T20:01:31.3333231Z Deleting the contents of '/home/runner/work/openai/openai'
2023-03-26T20:01:31.3337016Z ##[group]Initializing the repository
2023-03-26T20:01:31.3339808Z [command]/usr/bin/git init /home/runner/work/openai/openai
2023-03-26T20:01:31.3411568Z hint: Using 'master' as the name for the initial branch. This default branch name
2023-03-26T20:01:31.3414301Z hint: is subject to change. To configure the initial branch name to use in all
2023-03-26T20:01:31.3417670Z hint: of your new repositories, which will suppress this warning, call:
2023-03-26T20:01:31.3418420Z hint:
2023-03-26T20:01:31.3418855Z hint: git config --global init.defaultBranch <name>
2023-03-26T20:01:31.3419123Z hint:
2023-03-26T20:01:31.3419609Z hint: Names commonly chosen instead of 'master' are 'main', 'trunk' and
2023-03-26T20:01:31.3508556Z hint: 'development'. The just-created branch can be renamed via this command:
2023-03-26T20:01:31.3508923Z hint:
2023-03-26T20:01:31.3509221Z hint: git branch -m <name>
2023-03-26T20:01:31.3509680Z Initialized empty Git repository in /home/runner/work/openai/openai/.git/
2023-03-26T20:01:31.3510773Z [command]/usr/bin/git remote add origin https://github.com/valentinegb/openai
2023-03-26T20:01:31.3519639Z ##[endgroup]
2023-03-26T20:01:31.3520124Z ##[group]Disabling automatic garbage collection
2023-03-26T20:01:31.3522047Z [command]/usr/bin/git config --local gc.auto 0
2023-03-26T20:01:31.3558206Z ##[endgroup]
2023-03-26T20:01:31.3559093Z ##[group]Setting up auth
2023-03-26T20:01:31.3566159Z [command]/usr/bin/git config --local --name-only --get-regexp core\.sshCommand
2023-03-26T20:01:31.3604809Z [command]/usr/bin/git submodule foreach --recursive sh -c "git config --local --name-only --get-regexp 'core\.sshCommand' && git config --local --unset-all 'core.sshCommand' || :"
2023-03-26T20:01:31.4258738Z [command]/usr/bin/git config --local --name-only --get-regexp http\.https\:\/\/github\.com\/\.extraheader
2023-03-26T20:01:31.4260261Z [command]/usr/bin/git submodule foreach --recursive sh -c "git config --local --name-only --get-regexp 'http\.https\:\/\/github\.com\/\.extraheader' && git config --local --unset-all 'http.https://github.com/.extraheader' || :"
2023-03-26T20:01:31.4495613Z [command]/usr/bin/git config --local http.https://github.com/.extraheader AUTHORIZATION: basic ***
2023-03-26T20:01:31.4546984Z ##[endgroup]
2023-03-26T20:01:31.4547582Z ##[group]Fetching the repository
2023-03-26T20:01:31.4572295Z [command]/usr/bin/git -c protocol.version=2 fetch --no-tags --prune --progress --no-recurse-submodules --depth=1 origin +a9514d747bc67f416370d671c8db7ae5663ab3ea:refs/remotes/origin/main
2023-03-26T20:01:31.8514624Z remote: Enumerating objects: 38, done.
2023-03-26T20:01:31.8531807Z remote: Counting objects: 100% (38/38), done.
2023-03-26T20:01:31.8614777Z remote: Compressing objects: 100% (32/32), done.
2023-03-26T20:01:31.8787888Z remote: Total 38 (delta 4), reused 13 (delta 2), pack-reused 0
2023-03-26T20:01:31.8891488Z From https://github.com/valentinegb/openai
2023-03-26T20:01:31.8892633Z * [new ref] a9514d747bc67f416370d671c8db7ae5663ab3ea -> origin/main
2023-03-26T20:01:31.8931550Z ##[endgroup]
2023-03-26T20:01:31.8932002Z ##[group]Determining the checkout info
2023-03-26T20:01:31.8933443Z ##[endgroup]
2023-03-26T20:01:31.8934799Z ##[group]Checking out the ref
2023-03-26T20:01:31.8943618Z [command]/usr/bin/git checkout --progress --force -B main refs/remotes/origin/main
2023-03-26T20:01:31.9031252Z Switched to a new branch 'main'
2023-03-26T20:01:31.9037129Z branch 'main' set up to track 'origin/main'.
2023-03-26T20:01:31.9045012Z ##[endgroup]
2023-03-26T20:01:31.9113715Z [command]/usr/bin/git log -1 --format='%H'
2023-03-26T20:01:31.9155848Z 'a9514d747bc67f416370d671c8db7ae5663ab3ea'
The Rust check should check out the last commit in the pull request / forked branch, not the last commit on the `main` branch.
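If the job needs the PR's head commit specifically, one common approach (a hedged sketch, not verified against this repo's workflow) is to pass an explicit ref to `actions/checkout`:

```yaml
# On pull_request events, check out the PR head rather than main.
- uses: actions/checkout@v3
  with:
    ref: ${{ github.event.pull_request.head.sha }}
```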
1.0.0-alpha.7
1.68.0
1.68.0
Linux
GPT-4 is available in limited beta. I have access, but it seems this crate doesn't support it yet. I went through the source to figure out how to add support for it, but I'm unable to figure it out. The enums seem to not actually be defined here?
Support for GPT-4 :)
I have set up a Cloudflare Worker as my proxy using cloudflare-proxy, but I cannot configure `BASE_URL` to point to my proxy URL.
Thanks.
Make `BASE_URL` configurable.
The license of this package (AGPL) is so prohibitive that the package cannot be used in a lot of environments. Would it be possible to relicense this under MIT/BSD/Apache-2.0?
Pick a more liberal license.
Kind of need `ChatCompletionMessage` to implement the `Clone` trait, which you did yesterday.
I know I could fork the repo and point my Cargo to a local reference (which I have), but it'd be nice if you'd just issue a new tag and publish it on crates.io :), pretty please!
Thank you for this work!
let returned_message = chat_completion.choices.first().unwrap().message.clone();
| ^^^^^ method not found in `ChatCompletionMessage`
Just tagging and publishing the latest changes.
1.0.0-alpha.5
1.67.0
1.67.0
macOS
If the API responds with an error, the program panics because it fails to deserialize the response; it never expects an error.
We could deserialize responses into an enum with an error variant and a non-error variant. The non-error variant should probably take a generic parameter, which would be the struct expected for a successful request.
We could possibly use the pre-existing `Result` enum and create a custom error struct, but I'm not sure whether we can deserialize into `Result`.
`CreateCompletionRequestBody`'s `temperature` and `top_p` properties have type `u8`; however, the API expects a value between 0.0 and 1.0.
The properties should be a type that can accept floating-point numbers.
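The fix could look like the following sketch, assuming `f32` precision is sufficient and trimming the struct down to the two fields in question (`in_unit_range` is a helper invented here for illustration):

```rust
// Trimmed, hypothetical version of the corrected struct.
#[derive(Debug, Default)]
struct CreateCompletionRequestBody {
    temperature: Option<f32>, // fractional values like 0.7 now representable
    top_p: Option<f32>,
}

// Illustrative range check; `None` (parameter omitted) is allowed.
fn in_unit_range(value: Option<f32>) -> bool {
    value.map_or(true, |v| (0.0..=1.0).contains(&v))
}

fn main() {
    let body = CreateCompletionRequestBody {
        temperature: Some(0.7),
        top_p: Some(1.0),
    };
    assert!(in_unit_range(body.temperature));
    assert!(in_unit_range(body.top_p));
    println!("{:?}", body);
}
```

With `u8`, a caller could only ever send whole numbers, so every in-between value the API accepts (such as 0.7) was unrepresentable.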
1.0.0-alpha.2
1.66.0 (69f9c33d7 2022-12-12)
1.66.0 (d65d197ad 2022-11-15)
macOS