Comments (41)
@patrick-moore heads up that we will prob integrate azure-openai as first party in the repo
from ai.
Looked into this some more, and there is some funkiness in the Azure API. Sending a single message with `{role: "user", content: [{type: "text", text: string}]}` returns a 200. But once you add multiple messages, you get a misleading 500 error.
I was able to get chat streaming working by modifying the OpenAI provider to accept some Azure-specific values and to format user messages as `{role: "user", content: string}`. Once I get time to test the other functionality, I can push this up as a community provider.
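For illustration, the message reshaping described above might look something like this (the helper name and types here are hypothetical, not part of any provider package):

```ts
type TextPart = { type: "text"; text: string };
type ChatMessage = { role: string; content: string | TextPart[] };

// Collapse `[{ type: "text", text }]` content-part arrays into a plain string,
// which is the shape Azure accepted for multi-message requests at the time.
function flattenUserContent(message: ChatMessage): ChatMessage {
  if (typeof message.content === "string") return message;
  const text = message.content
    .filter((part): part is TextPart => part.type === "text")
    .map((part) => part.text)
    .join("");
  return { role: message.role, content: text };
}
```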
I'll have this up today for chat models and text input. Just trying to figure out some GitHub Actions stuff.
https://github.com/patrickrmoore/azure-openai-provider
This has had minimal testing and is very much use-at-your-own-risk at the moment.
I'm hoping to get something up later this week, and will update this thread when I do. Pretty sure I have everything working for text inputs with `ai/rsc`.
The other change I've needed so far is adjusting the `openaiChatChunkSchema` to allow an empty object when streaming. It seems like Azure will sometimes send an empty first chunk, which can be ignored as far as I can tell.
Azure OpenAI is not fully compatible with OpenAI and will need a separate provider
It works now. Thanks.
we also need this because our company is Azure only 🥲
Awesome, I think that makes the most sense. The changes were pretty minimal and my current package is not set up to make integrating upstream changes easy. Adding a few extension points to the existing OpenAI provider would solve that. I'll hold off on putting in a PR to list this as a community provider. Let me know if yall would like any help on this!
@patrick-moore thanks so much for your work! unblocked me today
+1 on the timeline! @lgrammel :) Goes without saying but stellar work in this SDK!!
@lgrammel I got it, yeah, that looks right and populates the endpoint in https://github.com/vercel/ai/blob/main/packages/azure/src/azure-openai-provider.ts#L64. But it still seems like something is missing; it doesn't seem to send anything out with `streamUI`. I am using this sample app for implementation: https://github.com/vercel/ai-chatbot/tree/main. Going to dig a bit more into it..
Works great!
@miguelvictor yes, that's a known azure issue
@miguelvictor not yet but I plan to implement #1893 which should help
I am also having trouble switching to the `ai/rsc` SDK (specifically `render` to `streamUI`) because I can't figure out how to port the `OpenAI` constructor over to `createOpenAI`. It seems like the provider code in `@ai-sdk/openai` is not generating the correct endpoint / parameters.
was:

```ts
import { render } from 'ai/rsc'
import OpenAI from 'openai'

export const openai = new OpenAI({
  apiKey: process.env.AZURE_OPEN_AI_KEY,
  baseURL: `${process.env.AZURE_OPEN_AI_ENDPOINT}/openai/deployments/${process.env.AZURE_OPEN_AI_DEPLOYMENT}`,
  defaultQuery: { 'api-version': process.env.AZURE_OPEN_AI_API_VERSION },
  defaultHeaders: { 'api-key': process.env.AZURE_OPEN_AI_KEY }
})

async function submitUserMessage(content: string) {
  'use server'

  const ui = render({
    model: 'gpt-3.5-turbo',
    provider: openai,
    ...
  })
}
```
now:

```tsx
import { streamUI } from 'ai/rsc';
import { createOpenAI } from '@ai-sdk/openai';

const openAI = createOpenAI({
  apiKey: process.env.AZURE_OPEN_AI_KEY,
  baseURL: `${process.env.AZURE_OPEN_AI_ENDPOINT}/openai/deployments/${process.env.AZURE_OPEN_AI_DEPLOYMENT}`,
  headers: {
    'api-version': process.env.AZURE_OPEN_AI_API_VERSION
  },
  compatibility: 'strict'
})

async function submitUserMessage(content: string) {
  'use server'

  const ui = await streamUI({
    model: openAI(process.env.AZURE_OPEN_AI_DEPLOYMENT),
    initial: <SpinnerMessage />,
    ...
  })
  ...
}
```
I was looking into this a little bit yesterday. I set the base URL to my Azure deployment, added the `api-key` header, and then modified the OpenAI provider endpoints to include an `api-version` query parameter. This worked for the first request, but not for subsequent ones that included chat history. I didn't have a chance to look in detail, but I believe it was related to the OpenAI provider including the image content type when I was using a model that doesn't support it.
I spent hours trying to get it to work with Azure OpenAI, but unfortunately, I couldn't make it happen. I'll just have to wait for the official Azure OpenAI provider now.
> @patrick-moore heads up that we will prob integrate azure-openai as first party in the repo

@lgrammel is there a timeline for the azure-openai first party provider? thanks
I'm blocked (do not have Azure OpenAI access) so I cannot provide a timeline
https://www.npmjs.com/package/@ai-sdk/azure
I tried it immediately, but it seems there's a problem:

```
Cannot find module '[hidden]/node_modules/.pnpm/@[email protected][email protected]/node_modules/@ai-sdk/openai/internal/dist/index.mjs' imported from [hidden]/node_modules/.pnpm/@[email protected][email protected]/node_modules/@ai-sdk/azure/dist/index.mjs
```

I've already upgraded `@ai-sdk/openai` to 0.0.25, but it doesn't export the `internal` module.
@wong2 thanks for reporting. #1912
> I tried it immediately, but it seems there's a problem.
> `Cannot find module '[hidden]/node_modules/.pnpm/@[email protected][email protected]/node_modules/@ai-sdk/openai/internal/dist/index.mjs' imported from [hidden]/node_modules/.pnpm/@[email protected][email protected]/node_modules/@ai-sdk/azure/dist/index.mjs`
> I've already upgraded `@ai-sdk/openai` to 0.0.25, but it doesn't export the `internal` module.

https://github.com/vercel/ai/releases/tag/%40ai-sdk/azure%400.0.2
New error:

```
Type validation failed: Value: {"choices":[{"content_filter_offsets":{"check_offset":159,"start_offset":159,"end_offset":300},"content_filter_results":{"hate":{"filtered":false,"severity":"safe"},"self_harm":{"filtered":false,"severity":"safe"},"sexual":{"filtered":false,"severity":"safe"},"violence":{"filtered":false,"severity":"safe"}},"finish_reason":null,"index":0}],"created":0,"id":"","model":"","object":""}.
Error message: [
  {
    "code": "invalid_union",
    "unionErrors": [
      {
        "issues": [
          {
            "code": "invalid_type",
            "expected": "object",
            "received": "undefined",
            "path": ["choices", 0, "delta"],
            "message": "Required"
          }
        ],
        "name": "ZodError"
      },
      {
        "issues": [
          {
            "code": "invalid_type",
            "expected": "object",
            "received": "undefined",
            "path": ["error"],
            "message": "Required"
          }
        ],
        "name": "ZodError"
      }
    ],
    "path": [],
    "message": "Invalid input"
  }
]
```
@wong2 what inputs / model are you using?
@lgrammel GPT-4o, the input is just `hi`
@wong2 i was not able to reproduce this with our deployments. However, I've prepared a PR that makes `delta` optional: #1915
@wong2 can you give https://github.com/vercel/ai/releases/tag/%40ai-sdk/azure%400.0.3 a try when you get a chance? i was not able to reproduce the issue, but this should add more robustness to the stream chunk validation
@lgrammel awesome, giving this a spin now for `streamUI`. However it's unclear how I pass in the four necessary configs:
```tsx
import { streamUI } from 'ai/rsc';
import { createAzure } from '@ai-sdk/azure';

const azure = createAzure({
  apiKey: process.env.AZURE_OPEN_AI_KEY,
  resourceName: process.env.AZURE_OPEN_AI_DEPLOYMENT
})(process.env.AZURE_OPEN_AI_API_VERSION)

async function submitUserMessage(content: string) {
  'use server'

  const ui = await streamUI({
    model: azure,
    initial: <SpinnerMessage />,
    ...
  })
  ...
}
```
It looks like the interface here only takes `resourceName` and `apiKey`, but I also need to set a `baseURL` and an API version, no?
```
AZURE_OPEN_AI_KEY=$$$$$$
AZURE_OPEN_AI_ENDPOINT=yyyyy.zzzz.openai.azure.com/
AZURE_OPEN_AI_DEPLOYMENT=xxxxx-xxxx-gpt-4o
AZURE_OPEN_AI_API_VERSION=2024-05-01-preview
```
example:

```ts
// ?
createAzure({
  apiKey: process.env.AZURE_OPEN_AI_KEY,
  resourceName: process.env.AZURE_OPEN_AI_DEPLOYMENT
})(process.env.AZURE_OPEN_AI_API_VERSION)

// working
createOpenAI({
  apiKey: process.env.OPEN_AI_KEY,
  baseURL: process.env.OPEN_AI_BASE_URL,
  organization: 'org-xxxxx',
  headers: { ... }
})('gpt-3.5-turbo')
```
let me know how I can use this!
You can use:

```ts
import { createAzure } from '@ai-sdk/azure';

const azure = createAzure({
  resourceName: 'your-resource-name', // Azure resource name
  apiKey: 'your-api-key',
});
```

and then create a model:

```ts
const model = azure('your-deployment-name');
```
here are the docs: https://sdk.vercel.ai/providers/ai-sdk-providers/azure
and if it helps, here is how the url is assembled internally: https://github.com/vercel/ai/blob/main/packages/azure/src/azure-openai-provider.ts#L64
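Based on that linked source, the assembled endpoint looks roughly like the sketch below (check the file for the authoritative version; the default `api-version` value here is just the one mentioned earlier in this thread, not a pinned SDK default):

```ts
// Sketch of how the Azure provider's chat completions URL is put together
// from resourceName + deployment, per the linked provider source.
function azureChatUrl(
  resourceName: string,
  deploymentId: string,
  apiVersion = "2024-05-01-preview",
): string {
  return `https://${resourceName}.openai.azure.com/openai/deployments/${deploymentId}/chat/completions?api-version=${apiVersion}`;
}
```

This is why `createAzure` only needs `resourceName` and `apiKey`: the deployment name is supplied when creating the model, and the rest of the URL is derived.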
I'm noticing that streaming with azure is clunkier than with openai, but maybe this is just an azure problem 🤔
I'm also noticing that the same tool is being called twice (at the same time, resulting in `Error: .update(): UI stream is already closed.` errors) when using the azure provider, but the problem goes away when I switch back to the openai provider.
@miguelvictor do you have a test case that i could use to reproduce this?
Sorry, it also happened with the openai provider. Is it possible to process only one tool per `streamUI` call?
Cool! Thank you so much!
Hi @lgrammel! I see you have already merged #1893 and you also added code for the azure provider. Did you perhaps forget to release the `@ai-sdk/azure` provider package?
@miguelvictor it is automatically released when the openai package changes: https://github.com/vercel/ai/releases/tag/%40ai-sdk%2Fazure%400.0.4 (should include that setting)
Sorry, bun, for some reason, didn't pick up the semver update 🥲
I installed it manually, but I'm getting another error: `AI_APICallError: Unknown parameter: 'parallel_tool_calls'.`
I'm using it like this:

```ts
import { createAzure } from "@ai-sdk/azure"

export const azure = createAzure({
  resourceName: env.AZURE_OPENAI_RESOURCE_NAME,
  apiKey: env.AZURE_OPENAI_KEY,
})(env.AZURE_OPENAI_DEPLOYMENT_CHAT, { parallelToolCalls: false })
```
@miguelvictor That means the Azure backend does not support it yet, I think
this works:

```ts
export const azure = createAzure({
  resourceName: env.AZURE_OPENAI_RESOURCE_NAME,
  apiKey: env.AZURE_OPENAI_KEY,
})
```

```ts
model: azure('_gpt4o', {
  parallelToolCalls: true,
}),
```