
Comments (41)

lgrammel avatar lgrammel commented on September 26, 2024 11

@patrick-moore heads up that we will prob integrate azure-openai as first party in the repo

from ai.

patrick-moore avatar patrick-moore commented on September 26, 2024 6

Looked into this some more, and there is some funkiness in the Azure API. Sending a single message shaped as {role: "user", content: [{type: "text", text: string}]} returns a 200:

[screenshot: CleanShot 2024-05-25 at 01 20 03@2x]

But, once you add multiple messages you get a misleading 500 error:

[screenshot: CleanShot 2024-05-25 at 01 21 10@2x]

I was able to get chat streaming working by modifying the OpenAI provider to accept some Azure-specific values and to format user messages as {role: "user", content: string}. Once I get time to test the other functionality, I can push this up as a community provider.
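
Roughly, the formatting change amounts to collapsing content parts into a plain string for user messages. A minimal sketch of the idea (the type and function names are illustrative, not the provider's actual code):

// Sketch: collapse content-parts user messages into the plain-string shape
// that Azure accepted in my testing. Names here are illustrative only.
type TextPart = { type: 'text'; text: string };

type UserMessage =
  | { role: 'user'; content: string }
  | { role: 'user'; content: TextPart[] };

function toAzureUserMessage(message: UserMessage): { role: 'user'; content: string } {
  const content =
    typeof message.content === 'string'
      ? message.content
      : message.content.map((part) => part.text).join('');
  return { role: 'user', content };
}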

from ai.

patrick-moore avatar patrick-moore commented on September 26, 2024 5

I'll have this up today for chat models and text input. Just trying to figure out some GitHub Actions stuff.

from ai.

patrick-moore avatar patrick-moore commented on September 26, 2024 5

https://github.com/patrickrmoore/azure-openai-provider

This has had minimal testing and is very much use-at-your-own-risk at the moment.

from ai.

lgrammel avatar lgrammel commented on September 26, 2024 5

#1901

from ai.

patrick-moore avatar patrick-moore commented on September 26, 2024 4

I'm hoping to get something up later this week, and will update this thread when I do. Pretty sure I have everything working for text inputs with ai/rsc.

The other change I've needed so far is adjusting the openaiChatChunkSchema to allow for an empty object property when streaming. It seems like Azure will sometimes send an empty first chunk, which can be ignored as far as I can tell.
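
Roughly the schema tweak I mean, sketched with zod (this stand-in is much smaller than the real openaiChatChunkSchema):

import { z } from 'zod';

// Stand-in for the real chunk schema; the real one has many more fields.
const chatChunkSchema = z.object({
  choices: z.array(
    z.object({
      delta: z.object({ content: z.string().nullish() }).optional(),
    }),
  ),
});

// Also accept the empty {} chunk Azure sometimes streams first; it can be ignored.
const azureTolerantChunkSchema = z.union([
  chatChunkSchema,
  z.object({}).strict(),
]);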

from ai.

lgrammel avatar lgrammel commented on September 26, 2024 2

Azure OpenAI is not fully compatible with OpenAI and will need a separate provider

from ai.

wong2 avatar wong2 commented on September 26, 2024 2

It works now. Thanks.

from ai.

miguelvictor avatar miguelvictor commented on September 26, 2024 1

we also need this because our company is Azure only 🥲

from ai.

patrick-moore avatar patrick-moore commented on September 26, 2024 1

Awesome, I think that makes the most sense. The changes were pretty minimal, and my current package is not set up to make integrating upstream changes easy. Adding a few extension points to the existing OpenAI provider would solve that. I'll hold off on putting in a PR to list this as a community provider. Let me know if y'all would like any help on this!

from ai.

yangcheng avatar yangcheng commented on September 26, 2024 1

@patrick-moore thanks so much for your work! It unblocked me today.

from ai.

simpel avatar simpel commented on September 26, 2024 1

+1 on the timeline! @lgrammel :) Goes without saying but stellar work in this SDK!!

from ai.

bneigher avatar bneigher commented on September 26, 2024 1

@lgrammel I got it, yeah that looks right and populates the endpoint in https://github.com/vercel/ai/blob/main/packages/azure/src/azure-openai-provider.ts#L64.

But it still seems like something is missing; it doesn't seem to send anything out with streamUI. I am using this sample app for the implementation: https://github.com/vercel/ai-chatbot/tree/main. Going to dig a bit more into it.

Works great!

from ai.

lgrammel avatar lgrammel commented on September 26, 2024 1

@miguelvictor yes, that's a known Azure issue

from ai.

lgrammel avatar lgrammel commented on September 26, 2024 1

@miguelvictor not yet, but I plan to implement #1893, which should help

from ai.

bneigher avatar bneigher commented on September 26, 2024

I am also having trouble switching to the ai/rsc SDK (specifically from render to streamUI) because I can't figure out how to port the OpenAI constructor over to createOpenAI. It seems like the @ai-sdk/openai provider code is not generating the correct endpoint / parameters.

was:

import {
  render
} from 'ai/rsc'
import OpenAI from 'openai'

export const openai = new OpenAI({
  apiKey: process.env.AZURE_OPEN_AI_KEY,
  baseURL: `${process.env.AZURE_OPEN_AI_ENDPOINT}/openai/deployments/${process.env.AZURE_OPEN_AI_DEPLOYMENT}`,
  defaultQuery: { 'api-version': process.env.AZURE_OPEN_AI_API_VERSION },
  defaultHeaders: { 'api-key': process.env.AZURE_OPEN_AI_KEY }
})

async function submitUserMessage(content: string) {
  'use server'

  const ui = render({
    model: 'gpt-3.5-turbo',
    provider: openai,
    ...
  })
}
    

now:

import {
  streamUI
} from 'ai/rsc';
import { createOpenAI } from '@ai-sdk/openai';

const openAI = createOpenAI({
  apiKey: process.env.AZURE_OPEN_AI_KEY,
  baseURL: `${process.env.AZURE_OPEN_AI_ENDPOINT}/openai/deployments/${process.env.AZURE_OPEN_AI_DEPLOYMENT}`,
  headers: {
    'api-version': process.env.AZURE_OPEN_AI_API_VERSION
  },
  compatibility: 'strict'
})

async function submitUserMessage(content: string) {
  'use server'

  const ui = await streamUI({
      model: openAI(process.env.AZURE_OPEN_AI_DEPLOYMENT),
      initial: <SpinnerMessage />,
      ...
  })
  ...
}

from ai.

patrick-moore avatar patrick-moore commented on September 26, 2024

I was looking into this a little bit yesterday. I set the base URL to my Azure deployment, added the api-key header, and then modified the OpenAI provider endpoints to include an api-version query parameter (rough sketch of the idea below). This worked for the first request, but not for subsequent ones that included chat history. I didn't have a chance to look at it in detail, but I believe it was related to the OpenAI provider including the image content type when I was using a model that doesn't support it.

[screenshot: CleanShot 2024-05-23 at 13 29 27@2x]
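
For the record, roughly what that setup amounted to, sketched here as a fetch wrapper rather than my actual provider edits (function and parameter names are illustrative):

// Sketch: wrap fetch so every request carries Azure's api-version query
// parameter and api-key header. This is a workaround sketch, not provider code.
const withAzureAuth =
  (apiKey: string, apiVersion: string): typeof fetch =>
  async (input, init) => {
    const url = new URL(
      typeof input === 'string' ? input : input instanceof URL ? input.href : input.url,
    );
    url.searchParams.set('api-version', apiVersion);

    const headers = new Headers(init?.headers);
    headers.set('api-key', apiKey);

    return fetch(url, { ...init, headers });
  };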

from ai.

microflyer avatar microflyer commented on September 26, 2024

I spent hours trying to get it to work with Azure OpenAI, but unfortunately, I couldn't make it happen. I'll just have to wait for the official Azure OpenAI provider now.

from ai.

yangcheng avatar yangcheng commented on September 26, 2024

@patrick-moore heads up that we will prob integrate azure-openai as first party in the repo

@lgrammel is there a timeline for azure-openai first party provider? thanks

from ai.

lgrammel avatar lgrammel commented on September 26, 2024

I'm blocked (I do not have Azure OpenAI access), so I cannot provide a timeline.

from ai.

lgrammel avatar lgrammel commented on September 26, 2024

https://www.npmjs.com/package/@ai-sdk/azure

from ai.

wong2 avatar wong2 commented on September 26, 2024

I tried it immediately, but it seems there's a problem.

Cannot find module '[hidden]/node_modules/.pnpm/@[email protected][email protected]/node_modules/@ai-sdk/openai/internal/dist/index.mjs' imported from [hidden]/node_modules/.pnpm/@[email protected][email protected]/node_modules/@ai-sdk/azure/dist/index.mjs

I've already upgraded @ai-sdk/openai to 0.0.25, but it doesn't export the internal module.

from ai.

lgrammel avatar lgrammel commented on September 26, 2024

@wong2 thanks for reporting. #1912

from ai.

lgrammel avatar lgrammel commented on September 26, 2024

I tried it immediately, but it seems there's a problem.

Cannot find module '[hidden]/node_modules/.pnpm/@[email protected][email protected]/node_modules/@ai-sdk/openai/internal/dist/index.mjs' imported from [hidden]/node_modules/.pnpm/@[email protected][email protected]/node_modules/@ai-sdk/azure/dist/index.mjs

I've already upgraded @ai-sdk/openai to 0.0.25, but it doesn't export the internal module.

https://github.com/vercel/ai/releases/tag/%40ai-sdk/azure%400.0.2

from ai.

wong2 avatar wong2 commented on September 26, 2024

New error:

Type validation failed: Value: {"choices":[{"content_filter_offsets":{"check_offset":159,"start_offset":159,"end_offset":300},"content_filter_results":{"hate":{"filtered":false,"severity":"safe"},"self_harm":{"filtered":false,"severity":"safe"},"sexual":{"filtered":false,"severity":"safe"},"violence":{"filtered":false,"severity":"safe"}},"finish_reason":null,"index":0}],"created":0,"id":"","model":"","object":""}.
Error message: [
  {
    "code": "invalid_union",
    "unionErrors": [
      {
        "issues": [
          {
            "code": "invalid_type",
            "expected": "object",
            "received": "undefined",
            "path": [
              "choices",
              0,
              "delta"
            ],
            "message": "Required"
          }
        ],
        "name": "ZodError"
      },
      {
        "issues": [
          {
            "code": "invalid_type",
            "expected": "object",
            "received": "undefined",
            "path": [
              "error"
            ],
            "message": "Required"
          }
        ],
        "name": "ZodError"
      }
    ],
    "path": [],
    "message": "Invalid input"
  }
]

from ai.

lgrammel avatar lgrammel commented on September 26, 2024

@wong2 what inputs / model are you using?

from ai.

wong2 avatar wong2 commented on September 26, 2024

@lgrammel GPT-4o, the input is just hi

from ai.

lgrammel avatar lgrammel commented on September 26, 2024

@wong2 I was not able to reproduce this with our deployments. However, I've prepared a PR that makes delta optional: #1915

from ai.

lgrammel avatar lgrammel commented on September 26, 2024

@wong2 can you give https://github.com/vercel/ai/releases/tag/%40ai-sdk/azure%400.0.3 a try when you get a chance? I was not able to reproduce the issue, but this should add more robustness to the stream chunk validation.
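
For context, the chunk in the error above only carries content_filter_offsets / content_filter_results and no delta, so the change is along these lines (a sketch of the idea, not the exact schema from the PR):

import { z } from 'zod';

// Sketch: Azure's content-filter-only chunks come without a delta,
// so delta has to be optional for stream chunk validation to pass.
const choiceSchema = z.object({
  delta: z
    .object({
      role: z.string().optional(),
      content: z.string().nullish(),
    })
    .optional(), // absent on content-filter-only chunks
  finish_reason: z.string().nullable(),
  index: z.number(),
});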

from ai.

bneigher avatar bneigher commented on September 26, 2024

@lgrammel awesome, giving this a spin now for streamUI. However, it's unclear how I pass in the four necessary config values:

import {
  streamUI
} from 'ai/rsc';
import { createAzure } from '@ai-sdk/azure';

const azure = createAzure({
    apiKey: process.env.AZURE_OPEN_AI_KEY,
    resourceName: process.env.AZURE_OPEN_AI_DEPLOYMENT
})(process.env.AZURE_OPEN_AI_API_VERSION)

async function submitUserMessage(content: string) {
  'use server'

  const ui = await streamUI({
      model: azure,
      initial: <SpinnerMessage />,
      ...
  })
  ...
}

It looks like the interface here only accepts resourceName and apiKey.

But I need to also set a baseURL and an API version, no?

AZURE_OPEN_AI_KEY=$$$$$$
AZURE_OPEN_AI_ENDPOINT=yyyyy.zzzz.openai.azure.com/
AZURE_OPEN_AI_DEPLOYMENT=xxxxx-xxxx-gpt-4o
AZURE_OPEN_AI_API_VERSION=2024-05-01-preview

example:

// ?
createAzure({
    apiKey: process.env.AZURE_OPEN_AI_KEY,
    resourceName: process.env.AZURE_OPEN_AI_DEPLOYMENT
})(process.env.AZURE_OPEN_AI_API_VERSION)

// working
createOpenAI({
    apiKey: process.env.OPEN_AI_KEY,
    baseURL: process.env.OPEN_AI_BASE_URL,
    organization: 'org-xxxxx',
    headers: { ... }
})('gpt-3.5-turbo')

let me know how I can use this!

from ai.

lgrammel avatar lgrammel commented on September 26, 2024

@bneigher

You can use:

import { createAzure } from '@ai-sdk/azure';

const azure = createAzure({
  resourceName: 'your-resource-name', // Azure resource name
  apiKey: 'your-api-key',
});

and then create a model:

const model = azure('your-deployment-name');

here are the docs: https://sdk.vercel.ai/providers/ai-sdk-providers/azure

and if it helps, here is how the url is assembled internally: https://github.com/vercel/ai/blob/main/packages/azure/src/azure-openai-provider.ts#L64
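
For a rough idea of what that assembles to (a sketch based on that line; the exact path and default api-version may differ between releases):

// Sketch: approximate shape of the request URL the Azure provider builds.
const resourceName = 'your-resource-name';
const deploymentId = 'your-deployment-name';
const apiVersion = '2024-05-01-preview'; // illustrative; use what your resource supports

const chatCompletionsUrl =
  `https://${resourceName}.openai.azure.com/openai/deployments/` +
  `${deploymentId}/chat/completions?api-version=${apiVersion}`;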

from ai.

miguelvictor avatar miguelvictor commented on September 26, 2024

I'm noticing that streaming with Azure is clunkier than with OpenAI, but maybe this is just an Azure problem 🤔

from ai.

miguelvictor avatar miguelvictor commented on September 26, 2024

I'm also noticing that the same tool is being called twice (at the same time, resulting in Error: .update(): UI stream is already closed. errors) when using the azure provider, but the problem goes away when I switch back to the openai provider.

from ai.

lgrammel avatar lgrammel commented on September 26, 2024

@miguelvictor do you have a test case that i could use to reproduce this?

from ai.

miguelvictor avatar miguelvictor commented on September 26, 2024

Sorry, it also happened with the openai provider. Is it possible to only process one tool per streamUI call?

from ai.

miguelvictor avatar miguelvictor commented on September 26, 2024

Cool! Thank you so much!

from ai.

miguelvictor avatar miguelvictor commented on September 26, 2024

Hi @lgrammel! I see you have already merged #1893 and you also added code for the azure provider. Did you perhaps forget to release the @ai-sdk/azure provider package?

from ai.

lgrammel avatar lgrammel commented on September 26, 2024

@miguelvictor it is automatically released when the openai package changes: https://github.com/vercel/ai/releases/tag/%40ai-sdk%2Fazure%400.0.4 (should include that setting)

from ai.

miguelvictor avatar miguelvictor commented on September 26, 2024

Sorry, Bun for some reason didn't pick up the semver update 🥲
I installed it manually, but I'm getting another error: AI_APICallError: Unknown parameter: 'parallel_tool_calls'.

I'm using it like this:

import { createAzure } from "@ai-sdk/azure"

export const azure = createAzure({
  resourceName: env.AZURE_OPENAI_RESOURCE_NAME,
  apiKey: env.AZURE_OPENAI_KEY,
})(env.AZURE_OPENAI_DEPLOYMENT_CHAT, { parallelToolCalls: false })

from ai.

lgrammel avatar lgrammel commented on September 26, 2024

@miguelvictor That means that the Azure backend does not support it yet I think

from ai.

Dinuda avatar Dinuda commented on September 26, 2024

This works:

export const azure = createAzure({
  resourceName: env.AZURE_OPENAI_RESOURCE_NAME,
  apiKey: env.AZURE_OPENAI_KEY,
})

// and then, as the model option:
model: azure('_gpt4o', {
  parallelToolCalls: true,
}),

from ai.
