Comments (11)
I don't think using an API key from the environment is a problem, but requiring the environment variable to be set just to compile your project is annoying.
I think passing a model string is perfectly fine, and the Models enum is unnecessary: the API will complain if an invalid string is passed, and that error can be handled ergonomically.
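A runtime check along these lines is all the enum was buying. A minimal sketch, assuming a hypothetical `complete` function and model list (neither is the library's real API):

```rust
// Illustrative only: accept any model string and let the API (simulated
// here by KNOWN_MODELS) reject unknown ones at runtime, instead of
// encoding every model as an enum variant at compile time.
fn complete(model: &str, prompt: &str) -> Result<String, String> {
    const KNOWN_MODELS: &[&str] = &["text-davinci-003", "text-curie-001"];
    if !KNOWN_MODELS.contains(&model) {
        // In the real library, this would be the API's error response.
        return Err(format!("unknown model: {model}"));
    }
    Ok(format!("completion for {prompt:?} from {model}"))
}

fn main() {
    // A plain &str works; no enum variant has to exist for new models.
    assert!(complete("text-davinci-003", "Say hi").is_ok());
    assert!(complete("not-a-model", "Say hi").is_err());
    println!("ok");
}
```

The upside is that newly released models work without a library update; the downside is that typos surface as runtime errors rather than compile errors, which is the trade-off being discussed here.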
from openai.
To be completely honest, it was a feature I worked on just because I didn't have experience with macros before and I thought it was cool, but it's causing more headaches than it's preventing, so you're right: it's got to go.
I got the idea to do something similar to generating `ModelID` with a procedural macro, except generating just about everything using the OpenAI API schema, which would not need an API key. So I'll be experimenting with that, probably in another branch. This would be somewhat of a rewrite, so this issue would be resolved in that branch.
Hey @ryanbliss, so the good news is that you aren't doing anything wrong, but the bad news is that it seems there was a pretty major oversight on my part. It looks like the library is looking for an `.env` file at its own root directory when it should be looking for it at the root of your dependent project. I find it hard to believe I managed to go so long without realizing this, so if anyone can confirm that it's not working for them too, that would be great.
I think I have a solution in the `generate-from-schema` branch; I'll have to copy it over to the `main` branch. Shouldn't take longer than a day for me to do.
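The root cause boils down to compile-time versus runtime environment lookup in Rust. A minimal sketch (the variable name matches this thread; the rest is illustrative, not the library's code):

```rust
use std::env;

fn main() {
    // Compile-time lookup: this is effectively what a proc macro does.
    // It fails the *build* if OPENAI_KEY is unset when `cargo build` runs,
    // in whatever directory the build happens to execute from:
    // let key = env!("OPENAI_KEY");

    // Runtime lookup: resolved when the dependent binary runs, so the key
    // can come from a `.env` file loaded from the user's own project root.
    match env::var("OPENAI_KEY") {
        Ok(key) => println!("key loaded ({} chars)", key.len()),
        Err(env::VarError::NotPresent) => println!("OPENAI_KEY not set"),
        Err(e) => println!("error: {e}"),
    }
}
```

This is why "setting the key at runtime" fixes the class of problem reported below: the check moves from the library's build environment to the user's run environment.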
Thanks for looking into it so quickly @valentinegb! I briefly looked over the `generate-from-schema` branch, which looks great. Setting the key at runtime is a lot more flexible. I'll keep an eye out for this change and then give it another go! 🥳
Should all be done in v1.0.0-alpha.7 🎉
I looked into your branch and I think it is a step in the right direction, but perhaps it could be even better?
I consider it unnecessary complexity that some structs are generated at build time based on the response of a network service. It basically means that different people can think they are using the same version of the library while the code they have is actually different.
My proposal is to define a cargo xtask, which you could run between releases, that would do what the proc macro does now.
Also, instead of the custom macro, you could use the OpenAI OpenAPI spec and OpenAPI Generator?
I think I could try playing with these if you were interested.
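For reference, the usual cargo xtask pattern is just a workspace binary plus a cargo alias; a minimal sketch, with illustrative package names and paths:

```toml
# Cargo.toml (workspace root): the generator lives in its own crate
[workspace]
members = [".", "xtask"]
```

```toml
# .cargo/config.toml: `cargo xtask` becomes shorthand for running it,
# e.g. `cargo xtask generate` before cutting a release
[alias]
xtask = "run --package xtask --"
```

The generated files would then be committed, so everyone using a given release sees identical code, which is the point being made above.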
I think you're mistaken on your first point; the `openai_proc_macros` package doesn't even have `reqwest` as a dependency, like the main `openai` package does. It uses the `openapi.yaml` file, which was downloaded manually by me from the OpenAI OpenAPI repository you linked.
It is my intention to continue manually updating this file between versions, so that certain versions of the library can be expected to match certain versions of the API, which, like you said, should prevent related confusion.
Ah, yes, you are entirely correct! It seems I mixed up two branches of the library in my head. Now I see that you even added a comment that explains why you aren't using an OpenAPI library.
Still, what do you think about using OpenAPI Generator?
Sorry for the delay in response. I read more about it and, yeah, I think it's worth giving OpenAPI Generator, which OpenAI actually seems to advocate for in their OpenAPI repository, a try. Looks like I'll be making another branch, haha.
If it ends up being just as hard or harder than what I'm working on in `generate-from-schema`, or if the end result isn't how I want it to be, then I'll go back to working on the `generate-from-schema` branch. Otherwise, thanks for the suggestion!
Looks like there are changes being worked on here, but I am having some issues getting this package to compile, and it seems semi-relevant. As far as I can tell, I'm following the samples pretty much exactly.
I have this as my `Cargo.toml`:
```toml
[package]
name = "mindflex_server"
version = "0.1.0"
edition = "2021"
publish = false

# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html

[dependencies]
tokio = { version = "1.26.0", features = ["full"] }
openai = "1.0.0-alpha.6"
dotenvy = "0.15.6"
```
and a `.env` file in my project root (`OPENAI_KEY=my_key`). My folder structure looks like this:
```
.
├── src
│   └── main.rs
├── target
│   └── ...
├── .gitignore
├── Cargo.lock
├── Cargo.toml
└── .env
```
The `main.rs` file is the same as your completions sample. The only difference I can grok is that it isn't referencing the package locally, since I'm not in your workspace. The `.env` file can be accessed properly at runtime (if I remove `openai` from the project) but not via the `openai` macro, thus triggering the panic:

```
help: message: environment variable `OPENAI_KEY` should be defined: NotPresent
```

I assume that this is a Rust beginner's mistake of some sort; if so, I apologize. I'm on Windows (not Linux subsystem), if that matters. Any help would be appreciated!
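For context, loading a `.env` file at runtime (which is what the dotenvy crate provides) amounts to parsing `KEY=value` lines before the key is looked up, so no environment variable is needed at compile time. A minimal, illustrative sketch in plain Rust of that parsing step (not the library's or dotenvy's actual code; real projects should just call dotenvy):

```rust
use std::{collections::HashMap, fs};

// Hypothetical helper: read KEY=value pairs from a .env-style file.
// dotenvy additionally exports them into the process environment.
fn load_env_file(path: &str) -> std::io::Result<HashMap<String, String>> {
    let mut vars = HashMap::new();
    for line in fs::read_to_string(path)?.lines() {
        let line = line.trim();
        // Skip blank lines and comments.
        if line.is_empty() || line.starts_with('#') {
            continue;
        }
        if let Some((key, value)) = line.split_once('=') {
            vars.insert(key.trim().to_string(), value.trim().to_string());
        }
    }
    Ok(vars)
}

fn main() -> std::io::Result<()> {
    // Write a throwaway .env so the example is self-contained.
    fs::write("example.env", "OPENAI_KEY=my_key\n")?;
    let vars = load_env_file("example.env")?;
    // A runtime lookup succeeds with no compile-time requirement.
    assert_eq!(vars["OPENAI_KEY"], "my_key");
    println!("OPENAI_KEY loaded at runtime");
    fs::remove_file("example.env")?;
    Ok(())
}
```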
Awesome, thank you! It's working great now 🥳