hemulgm / delphiopenai
OpenAI API wrapper for Delphi. Use ChatGPT, DALL-E, Whisper and other products.
License: MIT License
The enum element "Answer" should be "Answers".
Regardless, "Answers", "Search", and "Classification" are all deprecated; see the transition guide: https://github.com/openai/openai-cookbook/tree/main/transition_guides_for_deprecated_API_endpoints
Hi,
When I try to use stream mode with the example in ChatGPT_Console.dproj, I get the error below:
[dcc32 Error] ChatGPT_Console.dpr(68): E2010 Incompatible types: 'TChatEvent' and 'Procedure'
OpenAI.Chat.CreateStream(
  procedure(Params: TChatParams)
  begin
    Params.Messages([TChatMessageBuild.User(Buf.Text)]);
    Params.MaxTokens(1024);
    Params.Stream;
  end,
  procedure(Chat: TChat; IsDone: Boolean; var Cancel: Boolean)
  begin
    if (not IsDone) and Assigned(Chat) then
      Writeln(Chat.Choices[0].Delta.Content)
    else if IsDone then
      Writeln('DONE!');
    Writeln('-------');
    Sleep(100);
  end);
Hi,
Today, when I try to send a prompt to ChatGPT-4, I always receive the following error:
Exception class ENetHTTPClientException with message 'Error receiving data: (12002) Operation timeout'
Why? I get this error when I use a long prompt.
Hi,
I want to mimic the "Regenerate" function in the ChatGPT interface.
for var Choice in Chat.Choices do
begin
  ChatHistory.New(TMessageRole.Assistant, Choice.Message.Content, '');
  txtChatReply.Lines.Add(Choice.Message.Content);
  txtChatReply.Lines.Add('---------------');
end;
I just wonder whether the regenerated reply is simply another Choice in the current Chat.Choices, or whether the original prompt is resent to get a new group of Choices (and the first one is used)?
Recently, when I call the DALL-E 3 model, I frequently get an OpenAI exception with the following error message:
"The input prompt cannot be handled by the engine."
I asked on the OpenAI community forum, and someone said it is related to the library, not OpenAI; see: https://community.openai.com/t/the-input-prompt-cannot-be-handled-by-the-engine/700460
Hi,
I try to use the following statement to send a prompt to OpenAI. However, it usually takes about 38s to get a response, and during this period the application is blocked. Is there a way to send the prompt asynchronously, so that the function returns immediately and a callback is notified when the response comes back?
var Chat := OpenAI.Chat.Create(
  procedure(Params: TChatParams)
  begin
    Params.Messages([TChatMessageBuild.Create(TMessageRole.User, Text)]);
    Params.MaxTokens(1024);
  end);
Thank you.
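For reference, a minimal sketch of one common approach: run the blocking call in a background task from the standard System.Threading unit and marshal the result back to the main thread with TThread.Queue. This assumes the call above is safe to run off the main thread; the variable names are taken from the snippet above.

```pascal
uses
  System.Classes, System.Threading;

// Start the request on a worker task; the method returns immediately.
TTask.Run(
  procedure
  var
    Chat: TChat;
  begin
    Chat := OpenAI.Chat.Create(
      procedure(Params: TChatParams)
      begin
        Params.Messages([TChatMessageBuild.Create(TMessageRole.User, Text)]);
        Params.MaxTokens(1024);
      end);
    // Hand the finished response back to the main (UI) thread.
    TThread.Queue(nil,
      procedure
      begin
        try
          // Consume the response here, e.g. Chat.Choices[0].Message.Content
        finally
          Chat.Free;
        end;
      end);
  end);
```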
Hello there!
First of all, thanks very much again for your work on this project. While trying to upgrade to version 1.2 in Delphi 10.4, I found the code below in the "OpenAI.FineTuning.pas" unit:
{ TFineTuningCreateParams }
function TFineTuningCreateParams.Hyperparameters(const NEpochs: Integer): TFineTuningCreateParams;
begin
  Result := TFineTuningCreateParams(Add('hyperparameters', TJSONObject.Create(TJSONPair.Create('n_epochs', NEpochs))));
end;
When compiling, I get the error below:
[dcc64 Error] OpenAI.FineTuning.pas(257): E2250 There is no overloaded version of 'Create' that can be called with these arguments
My possible solution is below: basically, use TJSONNumber.Create() to pass the "NEpochs" argument as a "TJSONValue", which is one of the argument types accepted by "TJSONPair.Create()".
{ TFineTuningCreateParams }
function TFineTuningCreateParams.Hyperparameters(const NEpochs: Integer): TFineTuningCreateParams;
begin
  Result := TFineTuningCreateParams(Add('hyperparameters',
    TJSONObject.Create(TJSONPair.Create('n_epochs', TJSONNumber.Create(NEpochs)))));
end;
The above change compiles as expected, but since I don't use this specific class (at least not directly), I am not sure this is the best possible solution (though I can compile and use the library without problems). Maybe we could use something like the below too:
{ TFineTuningCreateParams }
function TFineTuningCreateParams.Hyperparameters(const NEpochs: Integer): TFineTuningCreateParams;
begin
  Result := TFineTuningCreateParams(Add('hyperparameters',
    TJSONObject.Create(TJSONPair.Create('n_epochs', IntToStr(NEpochs)))));
end;
... since "TJSONPair.Create()" also accepts the argument as a string. Maybe this is not a bug or an error but something I simply don't understand properly, but with the original code I certainly cannot compile the unit, due to the error I mentioned above.
I hope this can help in some way!
When trying to post a large file, an error triggers: "Maximum content size limit (26214400) exceeded (26405554 bytes read)". Is there a way to send the file (and also receive the data back) asynchronously, to keep track of the uploaded/received data and not lock the thread while waiting for the response? Thanks
Hi,
After sending the prompt, I may get multiple Choices, as below:
for Choice in Chat.Choices do
begin
end;
Does the first Choice have the highest probability? How do I know which Choice is better? I only found a property, Delta, which seems related to this, but I cannot find documentation on it.
Thank you.
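For context, multiple choices come back when the n parameter is greater than 1; a minimal sketch using the N builder declared on TChatParams (Memo1 and the prompt text are placeholders):

```pascal
// Request three alternative completions in one call. Each element of
// Chat.Choices is an independent sample, distinguished by Choice.Index.
var Chat := OpenAI.Chat.Create(
  procedure(Params: TChatParams)
  begin
    Params.Messages([TChatMessageBuild.User('Summarize this in one line.')]);
    Params.N(3); // how many choices to generate
    Params.MaxTokens(256);
  end);
try
  for var Choice in Chat.Choices do
    Memo1.Lines.Add(Choice.Index.ToString + ': ' + Choice.Message.Content);
finally
  Chat.Free;
end;
```

As far as the public API documents, the choices are alternative samples and are not sorted by probability. Delta is only populated on streaming responses; for non-streaming calls, read Choice.Message instead.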
Hello,
The default completion model, text-davinci-003, no longer works. Your sample:
var Completions := OpenAI.Completion.Create(
  procedure(Params: TCompletionParams)
  begin
    Params.Prompt(MemoPrompt.Text);
    Params.MaxTokens(2048);
  end);
try
  for var Choice in Completions.Choices do
    MemoChat.Lines.Add(Choice.Index.ToString + ' ' + Choice.Text);
finally
  Completions.Free;
end;
Will return:
Exception OpenAIExceptionInvalidRequestError with the message 'The model text-davinci-003 has been deprecated, learn more here: https://platform.openai.com/docs/deprecations'.
This is because, by default, OpenAI.Completion has:
constructor TCompletionParams.Create;
begin
  inherited;
  Model('text-davinci-003');
  Temperature(0);
end;
Maybe it could be replaced with:
constructor TCompletionParams.Create;
begin
  inherited;
  Model('gpt-3.5-turbo-instruct');
  //Model('text-davinci-003');
  Temperature(0);
end;
Of course, overriding Params.Model can be a workaround:
var Completions := OpenAI.Completion.Create(
  procedure(Params: TCompletionParams)
  begin
    Params.Prompt(MemoPrompt.Text);
    Params.MaxTokens(2048);
    Params.Model('gpt-3.5-turbo-instruct'); // HERE
  end);
try
  for var Choice in Completions.Choices do
    MemoChat.Lines.Add(Choice.Index.ToString + ' ' + Choice.Text);
finally
  Completions.Free;
end;
Today, when I call the OpenAI APIs, I get the following exception:
Project Test.exe raised exception class OpenAIException with message 'Bad gateway.'.
Can I safely ignore the exception and resend the prompt to OpenAI? Basically, I modified the code as below:
while True do
try
  Chat := FOpenAI.Chat.Create(
    procedure(Params: TChatParams)
    begin
      Params.Messages(FChatHistory.ToArray());
      Params.Model(FModel);
    end);
  // If everything is OK, then we will exit
  Break;
except
  // Ignore exceptions and restart
  Temp := 'Exception';
end;
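A 'Bad gateway.' (HTTP 502) is generally a transient server-side error, so retrying is reasonable, but an unbounded while True loop will hammer the API if the outage persists. A hedged sketch of a bounded retry with a simple backoff, reusing the variable names from the snippet above:

```pascal
var
  Attempt: Integer;
begin
  Chat := nil;
  for Attempt := 1 to 5 do // give up after a few tries
  try
    Chat := FOpenAI.Chat.Create(
      procedure(Params: TChatParams)
      begin
        Params.Messages(FChatHistory.ToArray());
        Params.Model(FModel);
      end);
    Break; // success
  except
    on E: OpenAIException do
    begin
      if Attempt = 5 then
        raise; // still failing: surface the error instead of looping forever
      Sleep(1000 * Attempt); // simple linear backoff between attempts
    end;
  end;
end;
```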
Is it possible (or will it be possible in the future) to use OpenAI models deployed via Azure? Those models are also accessed with an API key, but through a different service from the OpenAI platform.
Thanks
Hi,
Currently I am using the following code to generate images:
var Images := OpenAI.Image.Create(
  procedure(Params: TImageCreateParams)
  begin
    Params.Prompt(MemoPrompt.Text);
    Params.ResponseFormat('url');
  end);
try
  for var Image in Images.Data do
    Image1.Bitmap.LoadFromUrl(Image.Url);
finally
  Images.Free;
end;
However, the generated image is not very good, so I am thinking of changing the model. Can I set the model in the image parameters?
Also, I checked the model list, and it seems only the following one is used for images:
dall-e-2
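If the installed version of the library exposes a model setter on TImageCreateParams, selecting DALL-E 3 would look roughly like the sketch below. Note this is an assumption: only Prompt and ResponseFormat appear in the snippet above, so check the unit for whether a Model builder exists in your version.

```pascal
// Hypothetical: assumes TImageCreateParams has a Model(...) builder,
// analogous to TChatParams.Model. 'dall-e-3' is the image model
// identifier on the OpenAI platform.
var Images := OpenAI.Image.Create(
  procedure(Params: TImageCreateParams)
  begin
    Params.Prompt(MemoPrompt.Text);
    Params.Model('dall-e-3'); // assumed API; not shown in the snippet above
    Params.ResponseFormat('url');
  end);
```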
Hello,
I'm using Delphi 10.4 CE and got two compile errors:
[dcc32 Error] OpenAI.Images.pas(511): E2003 Undeclared Identifier: 'GetTickCount64'
[dcc32 Error] OpenAI.Images.pas(526): E2003 Undeclared Identifier: 'GetTickCount64'
In OpenAI.Images.pas, in TImagesAzureRoute.Create, there are two occurrences of the call TThread.GetTickCount64. GetTickCount64 has been available since Delphi 11, if I remember correctly; in previous versions of Delphi only GetTickCount is available within TThread.
Maybe this call can be wrapped in compiler switches to keep backward compatibility. I corrected it for myself in the source code, so it is not a big problem for me, but other developers will run into this issue too.
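A minimal sketch of such a switch, assuming (as reported above) that TThread.GetTickCount64 first appeared in Delphi 11 Alexandria, whose CompilerVersion is 35:

```pascal
uses
  System.Classes;

// Returns a millisecond tick count on any supported Delphi version.
function MonotonicTicks: UInt64;
begin
{$IF CompilerVersion >= 35.0} // Delphi 11 Alexandria and later
  Result := TThread.GetTickCount64;
{$ELSE}
  Result := TThread.GetTickCount; // 32-bit; wraps after ~49.7 days
{$ENDIF}
end;
```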
Hi,
Does your code support the OpenAI text-to-speech model at https://platform.openai.com/docs/guides/text-to-speech? I cannot find any info in the documentation, but I did find some code that may be related to it, such as OpenAI.Component.Reg.pas.
Thank you.
Hi,
When I run this code in Delphi 11, I get this error: "Error sending data: (12030) The connection with the server ended abnormally.".
What's the problem?
Thanks.
procedure TForm1.CornerButton1Click(Sender: TObject);
var
  OpenAI: TOpenAI;
begin
  OpenAI := TOpenAI.Create(API_KEY2);
  try
    OpenAI.Organization := ORGANIZATION_ID;
    var Completions := OpenAI.Completion.Create(
      procedure(Params: TCompletionParams)
      begin
        Params.Prompt(Memo1.Text);
        Params.MaxTokens(2048);
      end);
    try
      for var Choice in Completions.Choices do
        Memo2.Lines.Add(Choice.Index.ToString + ' ' + Choice.Text);
    finally
      Completions.Free;
    end;
  finally
    OpenAI.Free;
  end;
end;
No destructor is defined to release Results: TArray, so Results leaks memory:
https://github.com/HemulGM/DelphiOpenAI/blob/main/OpenAI.Moderations.pas
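A hedged sketch of the kind of fix involved, modeled on how TChat.Destroy (shown later in this document) frees its Choices array. The class and field names (TModerations, FResults) are assumptions, since only the unit URL is given above:

```pascal
// Hypothetical destructor for the class holding Results: TArray<...>
// in OpenAI.Moderations: free each owned object; the dynamic array
// itself is released by compiler-managed cleanup.
destructor TModerations.Destroy;
begin
  for var Item in FResults do
    Item.Free;
  inherited;
end;
```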
Congratulations on the library!
Could you provide an example of how to upload a PDF to the OpenAI API and get a text summary of the content of the PDF?
Here is a modification of the OpenAI.Chat unit to handle function calling with the 0613 models.
unit OpenAI.Chat;
interface
uses
System.SysUtils, OpenAI.API.Params, OpenAI.API, System.Classes;
{$SCOPEDENUMS ON}
type
TMessageRole = (System, User, Assistant, Fonction);
TMessageRoleHelper = record helper for TMessageRole
function ToString: string;
class function FromString(const Value: string): TMessageRole; static;
end;
TChatMessageBuild = record
private
FRole: TMessageRole;
FContent: string;
FFunction_call: string;
FArguments: string;
FTag: string;
FName: string;
public
/// <summary>
/// The role of the author of this message. One of system, user, assistant, or function.
/// </summary>
property Role: TMessageRole read FRole write FRole;
/// <summary>
/// The contents of the message.
/// </summary>
property Content: string read FContent write FContent;
/// <summary>
/// The function call of the message.
/// </summary>
property function_call: string read FFunction_call write FFunction_call;
/// <summary>
/// The arguments of the function called.
/// </summary>
property Arguments: string read FArguments write FArguments;
/// <summary>
/// The name of this message. May contain a-z, A-Z, 0-9, and underscores, with a maximum length of 64 characters.
/// </summary>
property Name: string read FName write FName;
/// <summary>
/// Tag - custom field for convenience. Not used in requests!
/// </summary>
property Tag: string read FTag write FTag;
class function Create(Role: TMessageRole; const Content: string; const Name: string = ''): TChatMessageBuild; static;
class function User(const Content: string; const Name: string = ''): TChatMessageBuild; static;
class function System(const Content: string; const Name: string = ''): TChatMessageBuild; static;
class function Assistant(const Content: string; const Name: string = ''): TChatMessageBuild; static;
class function AssistantFunc(const Function_Name: string; const Arguments: string): TChatMessageBuild; static;
class function Fonction(const Content: string; const Name: string = ''): TChatMessageBuild; static;
end;
TChatFonctionBuild = record
private
FName: string;
FDescription: string;
FParameters: string;
public
/// <summary>
/// The name of this function. May contain a-z, A-Z, 0-9, and underscores, with a maximum length of 64 characters.
/// </summary>
property Name: string read FName write FName;
/// <summary>
/// The description of this function.
/// </summary>
property Description: string read FDescription write FDescription;
/// <summary>
/// The parameters of this function.
/// </summary>
property Parameters: string read FParameters write FParameters;
class function Fonction(const Name: string; const Description: string; const Arguments: string): TChatFonctionBuild; static;
end;
TChatParams = class(TJSONParam)
/// <summary>
/// ID of the model to use. Currently, gpt-3.5-turbo and gpt-4 are supported.
/// TODO: define all current models
/// </summary>
function Model(const Value: string): TChatParams;
/// <summary>
/// The messages to generate chat completions for, in the chat format.
/// </summary>
function Messages(const Value: TArray<TChatMessageBuild>): TChatParams; overload;
/// <summary>
/// The functions of chat completions for, in the chat format.
/// </summary>
function Fonctions(const Value: TArray<TChatFonctionBuild>): TChatParams;
/// <summary>
/// What sampling temperature to use, between 0 and 2. Higher values like 0.8 will make the output more random,
/// while lower values like 0.2 will make it more focused and deterministic.
/// We generally recommend altering this or top_p but not both.
/// </summary>
function Temperature(const Value: Single = 1): TChatParams;
/// <summary>
/// An alternative to sampling with temperature, called nucleus sampling, where the model considers the
/// results of the tokens with top_p probability mass. So 0.1 means only the tokens comprising the top 10%
/// probability mass are considered.
/// We generally recommend altering this or temperature but not both.
/// </summary>
function TopP(const Value: Single = 1): TChatParams;
/// <summary>
/// How many chat completion choices to generate for each input message.
/// </summary>
function N(const Value: Integer = 1): TChatParams;
/// <summary>
/// If set, partial message deltas will be sent, like in ChatGPT. Tokens will be sent as
/// data-only server-sent events as they become available, with the stream terminated by a data: [DONE] message.
/// </summary>
function Stream(const Value: Boolean = True): TChatParams;
/// <summary>
/// Up to 4 sequences where the API will stop generating further tokens.
/// </summary>
function Stop(const Value: string): TChatParams; overload;
/// <summary>
/// Up to 4 sequences where the API will stop generating further tokens.
/// </summary>
function Stop(const Value: TArray<string>): TChatParams; overload;
/// <summary>
/// The maximum number of tokens allowed for the generated answer. By default, the number of
/// tokens the model can return will be (4096 - prompt tokens).
/// </summary>
function MaxTokens(const Value: Integer = 16): TChatParams;
/// <summary>
/// Number between -2.0 and 2.0. Positive values penalize new tokens based on whether they appear in the text so far,
/// increasing the model's likelihood to talk about new topics.
/// </summary>
function PresencePenalty(const Value: Single = 0): TChatParams;
/// <summary>
/// Number between -2.0 and 2.0. Positive values penalize new tokens based on their existing frequency in the text so far,
/// decreasing the model's likelihood to repeat the same line verbatim.
/// </summary>
function FrequencyPenalty(const Value: Single = 0): TChatParams;
/// <summary>
/// A unique identifier representing your end-user, which can help OpenAI to monitor and detect abuse.
/// </summary>
function User(const Value: string): TChatParams;
constructor Create; override;
end;
TChatUsage = class
private
FCompletion_tokens: Int64;
FPrompt_tokens: Int64;
FTotal_tokens: Int64;
public
property CompletionTokens: Int64 read FCompletion_tokens write FCompletion_tokens;
property PromptTokens: Int64 read FPrompt_tokens write FPrompt_tokens;
property TotalTokens: Int64 read FTotal_tokens write FTotal_tokens;
end;
TFunction_call = class
private
FName: string;
FArguments: string;
public
property Name: string read FName write FName;
property Arguments: string read FArguments write FArguments;
end;
TChatMessage = class
private
FRole: string;
FContent: string;
FFunction_call: TFunction_call;
public
property Role: string read FRole write FRole;
property Content: string read FContent write FContent;
property Function_call: TFunction_call read FFunction_call write FFunction_call;
destructor Destroy; override;
end;
TChatChoices = class
private
FIndex: Int64;
FMessage: TChatMessage;
FFinish_reason: string;
FDelta: TChatMessage;
public
property Index: Int64 read FIndex write FIndex;
property Message: TChatMessage read FMessage write FMessage;
property Delta: TChatMessage read FDelta write FDelta;
/// <summary>
/// The possible values for finish_reason are:
/// stop: API returned complete model output
/// length: Incomplete model output due to max_tokens parameter or token limit
/// content_filter: Omitted content due to a flag from our content filters
/// function_call:
/// null: API response still in progress or incomplete
/// </summary>
property FinishReason: string read FFinish_reason write FFinish_reason;
destructor Destroy; override;
end;
TChat = class
private
FChoices: TArray<TChatChoices>;
FCreated: Int64;
FId: string;
FObject: string;
FUsage: TChatUsage;
public
property Id: string read FId write FId;
property &Object: string read FObject write FObject;
property Created: Int64 read FCreated write FCreated;
property Choices: TArray<TChatChoices> read FChoices write FChoices;
property Usage: TChatUsage read FUsage write FUsage;
destructor Destroy; override;
end;
TChatEvent = reference to procedure(Chat: TChat; IsDone: Boolean; var Cancel: Boolean);
/// <summary>
/// Given a chat conversation, the model will return a chat completion response.
/// </summary>
TChatRoute = class(TOpenAIAPIRoute)
public
/// <summary>
/// Creates a completion for the chat message
/// </summary>
function Create(ParamProc: TProc<TChatParams>): TChat;
/// <summary>
/// Creates a completion for the chat message
/// </summary>
function CreateStream(ParamProc: TProc<TChatParams>; Event: TChatEvent): Boolean;
end;
implementation
uses
System.JSON, Rest.Json;
{ TChatRoute }
function TChatRoute.Create(ParamProc: TProc<TChatParams>): TChat;
begin
Result := API.Post<TChat, TChatParams>('chat/completions', ParamProc);
end;
function TChatRoute.CreateStream(ParamProc: TProc<TChatParams>; Event: TChatEvent): Boolean;
var
Response: TStringStream;
RetPos: Integer;
begin
Response := TStringStream.Create('', TEncoding.UTF8);
try
RetPos := 0;
Result := API.Post<TChatParams>('chat/completions', ParamProc, Response,
procedure(const Sender: TObject; AContentLength: Int64; AReadCount: Int64; var AAbort: Boolean)
var
IsDone: Boolean;
Data: string;
Chat: TChat;
TextBuffer: string;
Line: string;
Ret: Integer;
begin
TextBuffer := Response.DataString;
repeat
Ret := TextBuffer.IndexOf(#10, RetPos);
if Ret >= 0 then
begin
Line := TextBuffer.Substring(RetPos, Ret - RetPos);
RetPos := Ret + 1;
if Line.IsEmpty or (Line.StartsWith(#10)) then
Continue;
Chat := nil;
Data := Line.Replace('data: ', '').Trim([' ', #13, #10]);
IsDone := Data = '[DONE]';
if not IsDone then
begin
try
Chat := TJson.JsonToObject<TChat>(Data);
except
Chat := nil;
end;
end;
try
Event(Chat, IsDone, AAbort);
finally
if Assigned(Chat) then
Chat.Free;
end;
end;
until Ret < 0;
end);
finally
Response.Free;
end;
end;
{ TChat }
destructor TChat.Destroy;
begin
if Assigned(FUsage) then
FUsage.Free;
for var Item in FChoices do
if Assigned(Item) then
Item.Free;
inherited;
end;
{ TChatParams }
constructor TChatParams.Create;
begin
inherited;
// Model('gpt-3.5-turbo');
Model('gpt-3.5-turbo-0613');
// Model('gpt-3.5-turbo-16k');
end;
function TChatParams.Fonctions(
const Value: TArray<TChatFonctionBuild>): TChatParams;
var
Item: TChatFonctionBuild;
JSON: TJSONObject;
Items: TJSONArray;
begin
Items := TJSONArray.Create;
for Item in Value do
begin
JSON := TJSONObject.Create;
JSON.AddPair('name', Item.Name);
JSON.AddPair('description', Item.Description);
JSON.AddPair('parameters', TJSONObject.ParseJSONValue(Item.Parameters));
Items.Add(JSON);
end;
Result := TChatParams(Add('functions', Items));
end;
function TChatParams.FrequencyPenalty(const Value: Single): TChatParams;
begin
Result := TChatParams(Add('frequency_penalty', Value));
end;
function TChatParams.MaxTokens(const Value: Integer): TChatParams;
begin
Result := TChatParams(Add('max_tokens', Value));
end;
function TChatParams.Model(const Value: string): TChatParams;
begin
Result := TChatParams(Add('model', Value));
end;
function TChatParams.N(const Value: Integer): TChatParams;
begin
Result := TChatParams(Add('n', Value));
end;
function TChatParams.PresencePenalty(const Value: Single): TChatParams;
begin
Result := TChatParams(Add('presence_penalty', Value));
end;
function TChatParams.Messages(const Value: TArray<TChatMessageBuild>): TChatParams;
var
Item: TChatMessageBuild;
JSON: TJSONObject;
Items: TJSONArray;
begin
Items := TJSONArray.Create;
for Item in Value do
begin
JSON := TJSONObject.Create;
JSON.AddPair('role', Item.Role.ToString);
if Item.Content <> 'null' then JSON.AddPair('content', Item.Content)
else begin
JSON.AddPair('content', TJSONNull.Create);
var LFonction := TJSONObject.Create(
TJSONPair.Create('name', Item.FFunction_call));
LFonction.AddPair('arguments', Format('{ %s}', [Item.Arguments]));
JSON.AddPair('function_call', LFonction);
end;
if not Item.Name.IsEmpty then
JSON.AddPair('name', Item.Name);
Items.Add(JSON);
end;
Result := TChatParams(Add('messages', Items));
end;
function TChatParams.Stop(const Value: TArray<string>): TChatParams;
begin
Result := TChatParams(Add('stop', Value));
end;
function TChatParams.Stop(const Value: string): TChatParams;
begin
Result := TChatParams(Add('stop', Value));
end;
function TChatParams.Stream(const Value: Boolean): TChatParams;
begin
Result := TChatParams(Add('stream', Value));
end;
function TChatParams.Temperature(const Value: Single): TChatParams;
begin
Result := TChatParams(Add('temperature', Value));
end;
function TChatParams.TopP(const Value: Single): TChatParams;
begin
Result := TChatParams(Add('top_p', Value));
end;
function TChatParams.User(const Value: string): TChatParams;
begin
Result := TChatParams(Add('user', Value));
end;
{ TChatMessageBuild }
class function TChatMessageBuild.Assistant(const Content: string; const Name: string = ''): TChatMessageBuild;
begin
Result.FRole := TMessageRole.Assistant;
Result.FContent := Content;
Result.FName := Name;
end;
class function TChatMessageBuild.AssistantFunc(const Function_Name: string;
const Arguments: string): TChatMessageBuild;
begin
Result.FRole := TMessageRole.Assistant;
Result.FContent := 'null';
Result.function_call := Function_Name;
Result.Arguments := Arguments;
end;
class function TChatMessageBuild.Create(Role: TMessageRole; const Content: string; const Name: string = ''): TChatMessageBuild;
begin
Result.FRole := Role;
Result.FContent := Content;
Result.FName := Name;
end;
class function TChatMessageBuild.Fonction(const Content,
Name: string): TChatMessageBuild;
begin
Result.FRole := TMessageRole.Fonction;
Result.FContent := Content;
Result.Name := Name;
end;
class function TChatMessageBuild.System(const Content: string; const Name: string = ''): TChatMessageBuild;
begin
Result.FRole := TMessageRole.System;
Result.FContent := Content;
Result.FName := Name;
end;
class function TChatMessageBuild.User(const Content: string; const Name: string = ''): TChatMessageBuild;
begin
Result.FRole := TMessageRole.User;
Result.FContent := Content;
Result.FName := Name;
end;
{ TMessageRoleHelper }
class function TMessageRoleHelper.FromString(const Value: string): TMessageRole;
begin
if Value = 'system' then
Exit(TMessageRole.System)
else if Value = 'user' then
Exit(TMessageRole.User)
else if Value = 'assistant' then
Exit(TMessageRole.Assistant)
else if Value = 'function' then
Exit(TMessageRole.Fonction)
else
Result := TMessageRole.User;
end;
function TMessageRoleHelper.ToString: string;
begin
case Self of
TMessageRole.System:
Result := 'system';
TMessageRole.User:
Result := 'user';
TMessageRole.Assistant:
Result := 'assistant';
TMessageRole.Fonction:
Result := 'function';
end;
end;
{ TChatChoices }
destructor TChatChoices.Destroy;
begin
if Assigned(FMessage) then
FMessage.Free;
if Assigned(FDelta) then
FDelta.Free;
inherited;
end;
{ TChatFonctionBuild }
class function TChatFonctionBuild.Fonction(const Name: string;
const Description: string; const Arguments: string): TChatFonctionBuild;
begin
Result.Name := Name;
Result.Description := Description;
Result.Parameters := Arguments;
end;
{ TChatMessage }
destructor TChatMessage.Destroy;
begin
if Assigned(FFunction_call) then
FFunction_call.Free;
inherited;
end;
end.
Usage example, based on https://openai.com/blog/function-calling-and-other-api-updates:
function Script: string;
begin
  with TStringWriter.Create do
  try
    WriteLine('{');
    WriteLine(' "type": "object",');
    WriteLine(' "properties": {');
    WriteLine(' "location": {');
    WriteLine(' "type": "string",');
    WriteLine(' "description": "La ville ou la région, par ex. Paris, Île-de-France"},');
    WriteLine(' "unit": {');
    WriteLine(' "type": "string",');
    WriteLine(' "enum": ["celsius", "fahrenheit"]}},');
    WriteLine(' "required": ["location"]');
    WriteLine('}');
    Result := ToString;
  finally
    Free;
  end;
end;
1--------------
var OpenAI: IOpenAI := TOpenAI.Create(KEY);
try
  var Chat := OpenAI.Chat.Create(
    procedure(Params: TChatParams)
    begin
      Params.Messages([
        TChatMessageBuild.System('L''heure actuelle est ' + TimeToStr(Now) + '.'),
        TChatMessageBuild.User(Memo1.Text)
        // TChatMessageBuild.AssistantFunc('get_current_weather', '"location": "Boston, MA"'),
        // TChatMessageBuild.Fonction('"temperature": "22", "unit": "celsius", "description": "Sunny"', 'get_current_weather')
      ]);
      Params.Fonctions([
        TChatFonctionBuild.Fonction('get_current_weather',
          'Obtenir la météo actuelle à un endroit donné', Script)
      ]);
      Params.MaxTokens(1024);
    end);
  try
    for var Choice in Chat.Choices do
    begin
      try
        Memo1.Lines.Add(Choice.Message.Function_call.Name);
        Memo1.Lines.Add(Choice.Message.Function_call.Arguments);
      except
      end;
      Memo1.Lines.Add(Choice.Message.Content);
    end;
  finally
    Chat.Free;
  end;
except
  Memo1.Text := 'error';
  raise;
end;
2------
var
  Function_call: string;
  Arguments: string;
begin
  var OpenAI: IOpenAI := TOpenAI.Create(KEY);
  var Chat := OpenAI.Chat.CreateStream(
    procedure(Params: TChatParams)
    begin
      Params.Messages([
        TChatMessageBuild.System('L''heure actuelle est ' + TimeToStr(Now)),
        TChatMessageBuild.User(Memo1.Text)
      ]);
      Params.Fonctions([
        TChatFonctionBuild.Fonction('get_current_weather',
          'Obtenir la météo actuelle à un endroit donné', Script)
      ]);
      Params.MaxTokens(1024);
      Params.Stream;
    end,
    procedure(Chat: TChat; IsDone: Boolean; var Cancel: Boolean)
    begin
      if (not IsDone) and Assigned(Chat) then
      begin
        Memo1.Text := Memo1.Text + Chat.Choices[0].Delta.Content;
        try
          if Function_call = EmptyStr then
            Function_call := Chat.Choices[0].Delta.Function_call.Name;
          Arguments := Arguments + Chat.Choices[0].Delta.Function_call.Arguments;
        except
        end;
        Application.ProcessMessages;
      end
      else if IsDone then
        Memo1.Text := Memo1.Text + #13;
      Sleep(100);
    end);
  if Function_call <> EmptyStr then Memo1.Lines.Add(Function_call);
  if Arguments <> EmptyStr then Memo1.Lines.Add(Arguments);
end;
It works perfectly with Alexandria (the latest version at the time) and probably Rio. Of course this is not mandatory, but if you want to cover all Delphi versions some refinements are needed.
I couldn't compile the library properly in some earlier Delphi versions. Here are the results using XE5 and Berlin:
1- System.JSON is not available in my XE5 (maybe because of an installation fault, I'm not sure), but I found this sentence in the Delphi Cookbook - Second Edition by Daniele Teti: "Since version 2009, Delphi provides built-in support for JSON."
2- The AddStream function is not available in the TMultipartFormData class used in the OpenAI.Files unit; tested with Berlin 10.1.
I don't see that "stream.id" is ever used or defined; so how can the chat/completion stream be continued?
The OpenAI.Assistants unit can only work if the headers include 'OpenAI-Beta', 'assistants=v1'.
In this case, you must modify the GetHeaders method of the TOpenAIAPI class as follows:
function TOpenAIAPI.GetHeaders: TNetHeaders;
begin
  // Additional headers are not required when using Azure
  if IsAzure then
    Exit;
  Result := [TNetHeader.Create('Authorization', 'Bearer ' + FToken)] + FCustomHeaders;
  if not FOrganization.IsEmpty then
    Result := Result + [TNetHeader.Create('OpenAI-Organization', FOrganization)];
  Result := Result + [TNetHeader.Create('OpenAI-Beta', 'assistants=v1')]; // Added line
end;
Furthermore, when using OpenAI.Assistants we encounter a memory leak: FAssistantsRoute is not freed in TOpenAI.Destroy.
destructor TOpenAI.Destroy;
begin
  FCompletionsRoute.Free;
  FEditsRoute.Free;
  FImagesRoute.Free;
  FImagesAzureRoute.Free;
  FModelsRoute.Free;
  FEmbeddingsRoute.Free;
  FModerationsRoute.Free;
  FEnginesRoute.Free;
  FFilesRoute.Free;
  FFineTunesRoute.Free;
  FFineTuningRoute.Free;
  FChatRoute.Free;
  FAudioRoute.Free;
  FAPI.Free;
  FAssistantsRoute.Free; // ADDED
  FThreadsRoute.Free; // ADDED
  FRunsRoute.Free; // ADDED
  FMessagesRoute.Free; // ADDED
  inherited;
end;
There are also several small things to factor in the OpenAI.Assistants unit:
When developing the OpenAI.Threads, OpenAI.Runs and OpenAI.Messages classes (the beta functions), we must reuse the classes named above.
Furthermore, the TAssistant class must be made generic, because it has to manage several different classes.
It would also be wise to rename the TAssistantListParams and TAssistants classes, because they will be reused in other situations; I renamed them TListedDataParams and TListedData<T: class>, respectively.
I suggest the OpenAI.Assistants.Tools unit below.
On the other hand, the URLs of the beta functions must take parameters into account, such as:
https://api.openai.com/v1/threads/{thread_id}/runs/{run_id}
So I simplified their use with the "OpenAI.PathParameters" unit, which will also simplify future modifications if OpenAI decides to change these paths in a later update.
So I'm sending you the changes for the OpenAI.Assistants unit. If you wish, I can also send you the OpenAI.Threads, OpenAI.Runs and OpenAI.Messages units.
unit OpenAI.PathParameters;
interface
uses
System.SysUtils;
type
TPathParameters = record
private
FPath: string;
public
/// <summary>
/// Path value: used to build the HTTP request path when calling the APIs.
/// </summary>
property Path: string read FPath write FPath;
/// <summary>
/// <para>
/// assistants
/// </para>
/// Built "path" string.
/// </summary>
/// <remarks> https://api.openai.com/v1/assistants </remarks>
class function Assistants: TPathParameters; overload; static;
/// <summary>
/// <para>
/// assistants/{assistant_id}
/// </para>
/// Built "path" string.
/// </summary>
/// <remarks> https://api.openai.com/v1/assistants/{assistant_id} </remarks>
class function Assistants(const AssistantId: string): TPathParameters; overload; static;
/// <summary>
/// <para>
/// assistants/{assistant_id}/files
/// </para>
/// Built "path" string.
/// </summary>
/// <remarks> https://api.openai.com/v1/assistants/{assistant_id}/files </remarks>
class function AssistantsFiles(const AssistantId: string): TPathParameters; overload; static;
/// <summary>
/// <para>
/// assistants/{assistant_id}/files/{file_id}
/// </para>
/// Built "path" string.
/// </summary>
/// <remarks> https://api.openai.com/v1/assistants/{assistant_id}/files/{file_id} </remarks>
class function AssistantsFiles(const AssistantId, FileId: string): TPathParameters; overload; static;
/// <summary>
/// <para>
/// threads/{thread_id}/messages
/// </para>
/// Built "path" string.
/// </summary>
/// <remarks> https://api.openai.com/v1/threads/{thread_id}/messages </remarks>
class function Messages(const ThreadId: string): TPathParameters; overload; static;
/// <summary>
/// <para>
/// threads/{thread_id}/messages/{message_id}
/// </para>
/// Built "path" string.
/// </summary>
/// <remarks> https://api.openai.com/v1/threads/{thread_id}/messages/{message_id} </remarks>
class function Messages(const ThreadId, MessageId: string): TPathParameters; overload; static;
/// <summary>
/// <para>
/// threads/{thread_id}/messages/{message_id}/files
/// </para>
/// Built "path" string.
/// </summary>
/// <remarks> https://api.openai.com/v1/threads/{thread_id}/messages/{message_id}/files </remarks>
class function MessagesFiles(const ThreadId, MessageId: string): TPathParameters; overload; static;
/// <summary>
/// <para>
/// threads/{thread_id}/messages/{message_id}/files/{file_id}
/// </para>
/// Built "path" string.
/// </summary>
/// <remarks> https://api.openai.com/v1/threads/{thread_id}/messages/{message_id}/files/{file_id} </remarks>
class function MessagesFiles(const ThreadId, MessageId, FileId: string): TPathParameters; overload; static;
/// <summary>
/// <para>
/// threads/runs
/// </para>
/// Built "path" string.
/// </summary>
/// <remarks> https://api.openai.com/v1/threads/runs </remarks>
class function Runs: TPathParameters; overload; static;
/// <summary>
/// <para>
/// threads/{thread_id}/runs
/// </para>
/// Built "path" string.
/// </summary>
/// <remarks> https://api.openai.com/v1/threads/{thread_id}/runs </remarks>
class function Runs(const ThreadId: string): TPathParameters; overload; static;
/// <summary>
/// <para>
/// threads/{thread_id}/runs/{run_id}
/// </para>
/// Built "path" string.
/// </summary>
/// <remarks> https://api.openai.com/v1/threads/{thread_id}/runs/{run_id} </remarks>
class function Runs(const ThreadId, RunId: string): TPathParameters; overload; static;
/// <summary>
/// <para>
/// threads/{thread_id}/runs/{run_id}/cancel
/// </para>
/// Built "path" string.
/// </summary>
/// <remarks> https://api.openai.com/v1/threads/{thread_id}/runs/{run_id}/cancel </remarks>
class function RunsCancel(const ThreadId, RunId: string): TPathParameters; static;
/// <summary>
/// <para>
/// threads/{thread_id}/runs/{run_id}/steps
/// </para>
/// Built "path" string.
/// </summary>
/// <remarks> https://api.openai.com/v1/threads/{thread_id}/runs/{run_id}/steps </remarks>
class function RunsSteps(const ThreadId, RunId: string): TPathParameters; overload; static;
/// <summary>
/// <para>
/// threads/{thread_id}/runs/{run_id}/steps/{step_id}
/// </para>
/// Built "path" string.
/// </summary>
/// <remarks> https://api.openai.com/v1/threads/{thread_id}/runs/{run_id}/steps/{step_id} </remarks>
class function RunsSteps(const ThreadId, RunId, StepId: string): TPathParameters; overload; static;
/// <summary>
/// <para>
/// threads/{thread_id}/runs/{run_id}/submit_tool_outputs
/// </para>
/// Built "path" string.
/// </summary>
/// <remarks> https://api.openai.com/v1/threads/{thread_id}/runs/{run_id}/submit_tool_outputs </remarks>
class function RunsToolOutput(const ThreadId, RunId: string): TPathParameters; static;
/// <summary>
/// <para>
/// threads
/// </para>
/// Built "path" string.
/// </summary>
/// <remarks> https://api.openai.com/v1/threads </remarks>
class function Threads: TPathParameters; overload; static;
/// <summary>
/// <para>
/// threads/{thread_id}
/// </para>
/// Built "path" string.
/// </summary>
/// <remarks> https://api.openai.com/v1/threads/{thread_id} </remarks>
class function Threads(const ThreadId: string): TPathParameters; overload; static;
end;
implementation
{ TPathParameters }
class function TPathParameters.Assistants(
const AssistantId: string): TPathParameters;
begin
Result.Path := Format('assistants/%s', [AssistantId]);
end;
class function TPathParameters.AssistantsFiles(
const AssistantId: string): TPathParameters;
begin
Result.Path := Format('assistants/%s/files', [AssistantId]);
end;
class function TPathParameters.Assistants: TPathParameters;
begin
Result.Path := 'assistants';
end;
class function TPathParameters.AssistantsFiles(const AssistantId,
FileId: string): TPathParameters;
begin
Result.Path := Format('assistants/%s/files/%s', [AssistantId, FileId]);
end;
class function TPathParameters.Messages(
const ThreadId: string): TPathParameters;
begin
Result.Path := Format('threads/%s/messages', [ThreadId]);
end;
class function TPathParameters.Messages(const ThreadId,
MessageId: string): TPathParameters;
begin
Result.Path := Format('threads/%s/messages/%s', [ThreadId, MessageId]);
end;
class function TPathParameters.MessagesFiles(const ThreadId, MessageId,
FileId: string): TPathParameters;
begin
Result.Path := Format('threads/%s/messages/%s/files/%s', [ThreadId, MessageId, FileId]);
end;
class function TPathParameters.MessagesFiles(const ThreadId,
MessageId: string): TPathParameters;
begin
Result.Path := Format('threads/%s/messages/%s/files', [ThreadId, MessageId]);
end;
class function TPathParameters.Runs: TPathParameters;
begin
Result.Path := 'threads/runs';
end;
class function TPathParameters.Runs(const ThreadId: string): TPathParameters;
begin
Result.Path := Format('threads/%s/runs', [ThreadId]);
end;
class function TPathParameters.Runs(const ThreadId,
RunId: string): TPathParameters;
begin
Result.Path := Format('threads/%s/runs/%s', [ThreadId, RunId]);
end;
class function TPathParameters.RunsCancel(const ThreadId,
RunId: string): TPathParameters;
begin
Result.Path := Format('threads/%s/runs/%s/cancel', [ThreadId, RunId]);
end;
class function TPathParameters.RunsSteps(const ThreadId, RunId,
StepId: string): TPathParameters;
begin
Result.Path := Format('threads/%s/runs/%s/steps/%s', [ThreadId, RunId, StepId]);
end;
class function TPathParameters.RunsToolOutput(const ThreadId,
RunId: string): TPathParameters;
begin
Result.Path := Format('threads/%s/runs/%s/submit_tool_outputs', [ThreadId, RunId]);
end;
class function TPathParameters.RunsSteps(const ThreadId,
RunId: string): TPathParameters;
begin
Result.Path := Format('threads/%s/runs/%s/steps', [ThreadId, RunId]);
end;
class function TPathParameters.Threads: TPathParameters;
begin
Result.Path := 'threads';
end;
class function TPathParameters.Threads(const ThreadId: string): TPathParameters;
begin
Result.Path := Format('threads/%s', [ThreadId]);
end;
end.
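For context, here is a minimal usage sketch of the record above. The IDs are placeholders; the resulting strings follow directly from the `Format` calls in the implementation.

```pascal
uses
  OpenAI.PathParameters;

var
  Path: string;
begin
  // 'assistants'
  Path := TPathParameters.Assistants.Path;
  // 'assistants/asst_abc123/files/file_xyz'
  Path := TPathParameters.AssistantsFiles('asst_abc123', 'file_xyz').Path;
  // 'threads/thread_1/runs/run_2/cancel'
  Path := TPathParameters.RunsCancel('thread_1', 'run_2').Path;
end.
```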
unit OpenAI.Assistants.Tools;
interface
uses
OpenAI.API, OpenAI.API.Params, System.JSON;
type
TAssistantTool = class abstract(TJSONParam)
//Code interpreter tool
//Retrieval tool
//Function tool
end;
TAssistantCodeInterpreterTool = class(TAssistantTool)
class function Build: TAssistantCodeInterpreterTool;
end;
TAssistantRetrievalTool = class(TAssistantTool)
class function Build: TAssistantRetrievalTool;
end;
TAssistantFunctionTool = class(TAssistantTool)
/// <summary>
/// A description of what the function does, used by the model to choose when and how to call the function.
/// </summary>
function Description(const Value: string): TAssistantFunctionTool;
/// <summary>
/// The name of the function to be called. Must be a-z, A-Z, 0-9,
/// or contain underscores and dashes, with a maximum length of 64.
/// </summary>
function Name(const Value: string): TAssistantFunctionTool;
/// <summary>
/// The parameters the function accepts, described as a JSON Schema object.
/// See the guide for examples, and the JSON Schema reference for documentation about the format.
/// <br>
/// To describe a function that accepts no parameters, provide the value {"type": "object", "properties": {}}.
/// </summary>
function Parameters(const Value: TJSONObject): TAssistantFunctionTool;
class function Build(const &function: TAssistantFunctionTool): TAssistantFunctionTool; overload;
class function Build(const Name, Description: string; const Parameters: TJSONObject): TAssistantFunctionTool; overload;
end;
TMetadata = class
end;
TListedDataParams = class(TJSONParam)
/// <summary>
/// A cursor for use in pagination. after is an object ID that defines your place in the list.
/// For instance, if you make a list request and receive 100 objects, ending with obj_foo,
/// your subsequent call can include after=obj_foo in order to fetch the next page of the list.
/// </summary>
function After(const Value: string): TListedDataParams;
/// <summary>
/// A cursor for use in pagination. before is an object ID that defines your place in the list.
/// For instance, if you make a list request and receive 100 objects, ending with obj_foo,
/// your subsequent call can include before=obj_foo in order to fetch the previous page of the list.
/// </summary>
function Before(const Value: string): TListedDataParams;
/// <summary>
/// A limit on the number of objects to be returned. Limit can range between 1 and 100, and the default is 20.
/// </summary>
function Limit(const Value: Integer): TListedDataParams;
/// <summary>
/// Sort order by the created_at timestamp of the objects. asc for ascending order and desc for descending order.
/// </summary>
function Order(const Value: string): TListedDataParams;
end;
TListedData<T: class> = class
private
FObject: string;
FData: TArray<T>;
FHas_more: Boolean;
FLast_id: string;
FFirst_id: string;
public
property &Object: string read FObject write FObject;
property Data: TArray<T> read FData write FData;
property HasMore: Boolean read FHas_more write FHas_more;
property FirstId: string read FFirst_id write FFirst_id;
property LastId: string read FLast_id write FLast_id;
destructor Destroy; override;
end;
implementation
{ TAssistantRetrievalTool }
class function TAssistantRetrievalTool.Build: TAssistantRetrievalTool;
begin
Result := TAssistantRetrievalTool.Create;
Result.Add('type', 'retrieval');
end;
{ TAssistantCodeInterpreterTool }
class function TAssistantCodeInterpreterTool.Build: TAssistantCodeInterpreterTool;
begin
Result := TAssistantCodeInterpreterTool.Create;
Result.Add('type', 'code_interpreter');
end;
{ TAssistantFunctionTool }
class function TAssistantFunctionTool.Build(
const &function: TAssistantFunctionTool): TAssistantFunctionTool;
begin
Result := TAssistantFunctionTool.Create;
Result := TAssistantFunctionTool(Result.Add('type', 'function'));
Result := TAssistantFunctionTool(Result.Add('function', &function));
end;
class function TAssistantFunctionTool.Build(const Name, Description: string;
const Parameters: TJSONObject): TAssistantFunctionTool;
begin
Result := TAssistantFunctionTool.Create;
Result := TAssistantFunctionTool(Result.Name(Name));
Result := TAssistantFunctionTool(Result.Description(Description));
Result := TAssistantFunctionTool(Result.Parameters(Parameters));
end;
function TAssistantFunctionTool.Description(
const Value: string): TAssistantFunctionTool;
begin
Result := TAssistantFunctionTool(Add('description', Value));
end;
function TAssistantFunctionTool.Name(
const Value: string): TAssistantFunctionTool;
begin
Result := TAssistantFunctionTool(Add('name', Value));
end;
function TAssistantFunctionTool.Parameters(
const Value: TJSONObject): TAssistantFunctionTool;
begin
Result := TAssistantFunctionTool(Add('parameters', Value));
end;
{ TListedData<T> }
destructor TListedData<T>.Destroy;
begin
for var Item in FData do
Item.Free;
inherited;
end;
{ TListedDataParams }
function TListedDataParams.After(const Value: string): TListedDataParams;
begin
Result := TListedDataParams(Add('after', Value));
end;
function TListedDataParams.Before(const Value: string): TListedDataParams;
begin
Result := TListedDataParams(Add('before', Value));
end;
function TListedDataParams.Limit(const Value: Integer): TListedDataParams;
begin
Result := TListedDataParams(Add('limit', Value));
end;
function TListedDataParams.Order(const Value: string): TListedDataParams;
begin
Result := TListedDataParams(Add('order', Value));
end;
end.
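A quick usage sketch for the tool builders above. The function name and JSON Schema here are hypothetical; note that a function tool ultimately needs the nested {"type": "function", "function": {...}} shape, which the one-argument Build overload produces around the inner object.

```pascal
uses
  System.JSON, OpenAI.Assistants.Tools;

var
  Schema: TJSONObject;
  Tool: TAssistantFunctionTool;
begin
  // Hypothetical JSON Schema describing a single string parameter
  Schema := TJSONObject(TJSONObject.ParseJSONValue(
    '{"type":"object","properties":{"city":{"type":"string"}}}'));
  // Build the inner function object, then wrap it as {"type":"function","function":{...}}
  Tool := TAssistantFunctionTool.Build(
    TAssistantFunctionTool.Build('get_weather',
      'Get the current weather for a city', Schema));
end.
```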
unit OpenAI.Assistants;
interface
uses
System.SysUtils, Rest.Json, Rest.Json.Types, OpenAI.API, OpenAI.API.Params,
OpenAI.Types, System.JSON, OpenAI.PathParameters, OpenAI.Assistants.Tools;
type
TAssistantParams = class(TJSONParam)
/// <summary>
/// ID of the model to use. You can use the List models API to see all of your available models,
/// or see our Model overview for descriptions of them.
/// </summary>
/// <remarks>
/// Required value
/// </remarks>
function Model(const Value: string): TAssistantParams;
/// <summary>
/// The name of the assistant. The maximum length is 256 characters.
/// </summary>
function Name(const Value: string): TAssistantParams; overload;
/// <summary>
/// The description of the assistant. The maximum length is 512 characters.
/// </summary>
function Description(const Value: string): TAssistantParams; overload;
/// <summary>
/// The system instructions that the assistant uses. The maximum length is 32768 characters.
/// </summary>
function Instructions(const Value: string): TAssistantParams; overload;
/// <summary>
/// A list of tools enabled on the assistant. There can be a maximum of 128 tools per assistant.
/// Tools can be of types code_interpreter, retrieval, or function.
/// </summary>
function Tools(const Value: TArray<TAssistantTool>): TAssistantParams; overload;
/// <summary>
/// A list of file IDs attached to this assistant. There can be a maximum of 20 files
/// attached to the assistant. Files are ordered by their creation date in ascending order.
/// </summary>
function FileIds(const Value: TArray<string>): TAssistantParams;
/// <summary>
/// Set of 16 key-value pairs that can be attached to an object. This can be useful
/// for storing additional information about the object in a structured format.
/// Keys can be a maximum of 64 characters long and values can be a maximum of 512 characters long.
/// </summary>
function Metadata(const Value: TJSONParam): TAssistantParams;
end;
TAssistantFileParams = class(TJSONParam)
/// <summary>
/// File ID (with purpose="assistants") that the assistant should use. Useful for tools like
/// retrieval and code_interpreter that can access files.
/// </summary>
/// <remarks>
/// Required value
/// </remarks>
function FileId(const Value: string): TAssistantFileParams;
end;
TFunctionParameters = class
end;
TToolFunction = class
private
[JsonNameAttribute('name')]
FName: string;
[JsonNameAttribute('description')]
FDescription: string;
[JsonNameAttribute('parameters')]
FParameters: TFunctionParameters;
public
/// <summary>
/// The name of the function to be called. Must be a-z, A-Z, 0-9, or contain underscores and dashes,
/// with a maximum length of 64.
/// </summary>
property Name: string read FName write FName;
/// <summary>
/// A description of what the function does, used by the model to choose when and how to call the function.
/// </summary>
property Description: string read FDescription write FDescription;
/// <summary>
/// The parameters the function accepts, described as a JSON Schema object. See the guide for examples,
/// and the JSON Schema reference for documentation about the format.
/// To describe a function that accepts no parameters, provide the value {"type": "object", "properties": {}}.
/// </summary>
property Parameters: TFunctionParameters read FParameters write FParameters;
destructor Destroy; override;
end;
/// <summary>
/// Tools can be of types code_interpreter, retrieval, or function.
/// </summary>
TTools = class
private
[JsonNameAttribute('type')]
FType: string;
[JsonNameAttribute('function')]
FFunction: TToolFunction;
public
/// <summary>
/// The type of the tool: code_interpreter, retrieval, or function.
/// </summary>
property &Type: string read FType write FType;
/// <summary>
/// "function" is defined by : a field for description, a field for name and a field for parameters
/// </summary>
property &Function: TToolFunction read FFunction write FFunction;
destructor Destroy; override;
end;
/// <summary>
/// Represents an assistant that can call the model and use tools.
/// </summary>
TAssistant = class
private
[JsonNameAttribute('id')]
FId: string;
[JsonNameAttribute('object')]
FObject: string;
[JsonNameAttribute('created_at')]
FCreatedAt: Int64;
[JsonNameAttribute('name')]
FName: string;
[JsonNameAttribute('description')]
FDescription: string;
[JsonNameAttribute('model')]
FModel: string;
[JsonNameAttribute('instructions')]
FInstructions: string;
[JsonNameAttribute('tools')]
FTools: TArray<TTools>;
[JsonNameAttribute('file_ids')]
FFileIds: TArray<string>;
[JsonNameAttribute('metadata')]
FMetadata: TMetadata;
public
/// <summary>
/// The identifier, which can be referenced in API endpoints.
/// </summary>
property Id: string read FId write FId;
/// <summary>
/// The object type, which is always assistant.
/// </summary>
property &Object: string read FObject write FObject;
/// <summary>
/// The Unix timestamp (in seconds) for when the assistant was created.
/// </summary>
property CreatedAt: Int64 read FCreatedAt write FCreatedAt;
/// <summary>
/// The name of the assistant. The maximum length is 256 characters.
/// </summary>
property Name: string read FName write FName;
/// <summary>
/// The description of the assistant. The maximum length is 512 characters.
/// </summary>
property Description: string read FDescription write FDescription;
/// <summary>
/// ID of the model to use. You can use the List models API to see all of your available models,
/// or see our Model overview for descriptions of them.
/// </summary>
property Model: string read FModel write FModel;
/// <summary>
/// The system instructions that the assistant uses. The maximum length is 32768 characters.
/// </summary>
property Instructions: string read FInstructions write FInstructions;
/// <summary>
/// A list of tools enabled on the assistant. There can be a maximum of 128 tools per assistant.
/// Tools can be of types code_interpreter, retrieval, or function.
/// </summary>
property Tools: TArray<TTools> read FTools write FTools;
/// <summary>
/// A list of file IDs attached to this assistant.
/// There can be a maximum of 20 files attached to the assistant.
/// Files are ordered by their creation date in ascending order.
/// </summary>
property FileIds: TArray<string> read FFileIds write FFileIds;
/// <summary>
/// Set of 16 key-value pairs that can be attached to an object.
/// This can be useful for storing additional information about the object in a structured format.
/// Keys can be a maximum of 64 characters long and values can be a maximum of 512 characters long.
/// </summary>
property Metadata: TMetadata read FMetadata write FMetadata;
destructor Destroy; override;
end;
TAssistantFiles = class
private
[JsonNameAttribute('id')]
FId: string;
[JsonNameAttribute('object')]
FObject: string;
[JsonNameAttribute('created_at')]
FCreatedAt: Int64;
[JsonNameAttribute('assistant_id')]
FAssistantId: string;
public
/// <summary>
/// The identifier, which can be referenced in API endpoints.
/// </summary>
property Id: string read FId write FId;
/// <summary>
/// The object type, which is always assistant.file.
/// </summary>
property &Object: string read FObject write FObject;
/// <summary>
/// The Unix timestamp (in seconds) for when the assistant file was created.
/// </summary>
property CreatedAt: Int64 read FCreatedAt write FCreatedAt;
/// <summary>
/// The assistant ID that the file is attached to.
/// </summary>
property AssistantId: string read FAssistantId write FAssistantId;
end;
TAssistantsFilesRoute = class(TOpenAIAPIRoute)
public
/// <summary>
/// Create an assistant file by attaching a File to an assistant.
/// </summary>
/// <param name="AssistantId: string"> The ID of the assistant for which to create a File. </param>
function Create(const AssistantId: string; ParamProc: TProc<TAssistantFileParams>): TAssistantFiles; overload;
/// <summary>
/// Create an assistant file by attaching a File to an assistant.
/// </summary>
/// <param name="AssistantId: string"> The ID of the assistant for which to create a File. </param>
/// <param name="FileId: string"> The ID of the attached File. </param>
function Create(const AssistantId, FileId: string): TAssistantFiles; overload;
/// <summary>
/// Delete an assistant file.
/// </summary>
/// <param name="AssistantId: string"> The ID of the assistant that the file belongs to. </param>
/// <param name="FileId: string"> The ID of the file to delete. </param>
function Delete(const AssistantId, FileId: string): TDeletionStatus;
/// <summary>
/// Returns a list of assistant files.
/// </summary>
/// <param name="AssistantId: string"> The ID of the assistant that the file belongs to. </param>
function List(const AssistantId: string; ParamProc: TProc<TListedDataParams> = nil): TListedData<TAssistantFiles>;
/// <summary>
/// Retrieves an AssistantFile.
/// </summary>
/// <param name="AssistantId: string"> The ID of the assistant who the file belongs to. </param>
/// <param name="FileId: string"> The ID of the file we're getting. </param>
function Retrieve(const AssistantId, FileId: string): TAssistantFiles;
end;
TAssistantsRoute = class(TOpenAIAPIRoute)
private
FFiles: TAssistantsFilesRoute;
public
constructor CreateRoute(AAPI: TOpenAIAPI); reintroduce;
destructor Destroy; override;
/// <summary>
/// Create an assistant with a model and instructions.
/// </summary>
function Create(ParamProc: TProc<TAssistantParams>): TAssistant;
/// <summary>
/// Retrieves an assistant.
/// </summary>
/// <param name="AssistantId: string">The ID of the assistant to retrieve.</param>
function Retrieve(const AssistantId: string): TAssistant;
/// <summary>
/// Modifies an assistant.
/// </summary>
/// <param name="AssistantId: string">The ID of the assistant to modify.</param>
function Modify(const AssistantId: string; ParamProc: TProc<TAssistantParams>): TAssistant;
/// <summary>
/// Delete an assistant.
/// </summary>
/// <param name="AssistantId: string">The ID of the assistant to delete.</param>
function Delete(const AssistantId: string): TDeletionStatus;
/// <summary>
/// Returns a list of assistants.
/// </summary>
function List(ParamProc: TProc<TListedDataParams> = nil): TListedData<TAssistant>;
/// <summary>
/// Access to Files of assistant.
/// </summary>
property &Files: TAssistantsFilesRoute read FFiles;
end;
implementation
{ TAssistant }
destructor TAssistant.Destroy;
begin
for var Item in FTools do
Item.Free;
FMetadata.Free;
inherited;
end;
{ TAssistantsRoute }
function TAssistantsRoute.Create(ParamProc: TProc<TAssistantParams>): TAssistant;
begin
Result := API.Post<TAssistant, TAssistantParams>(TPathParameters.Assistants.Path, ParamProc);
end;
constructor TAssistantsRoute.CreateRoute(AAPI: TOpenAIAPI);
begin
inherited CreateRoute(AAPI);
FFiles := TAssistantsFilesRoute.CreateRoute(AAPI);
end;
function TAssistantsRoute.Delete(const AssistantId: string): TDeletionStatus;
begin
Result := API.Delete<TDeletionStatus>(TPathParameters.Assistants(AssistantId).Path);
end;
destructor TAssistantsRoute.Destroy;
begin
FFiles.Free;
inherited;
end;
function TAssistantsRoute.List(ParamProc: TProc<TListedDataParams>): TListedData<TAssistant>;
begin
Result := API.Get<TListedData<TAssistant>, TListedDataParams>(TPathParameters.Assistants.Path, ParamProc);
end;
function TAssistantsRoute.Modify(const AssistantId: string; ParamProc: TProc<TAssistantParams>): TAssistant;
begin
Result := API.Post<TAssistant, TAssistantParams>(TPathParameters.Assistants(AssistantId).Path, ParamProc);
end;
function TAssistantsRoute.Retrieve(const AssistantId: string): TAssistant;
begin
Result := API.Get<TAssistant>(TPathParameters.Assistants(AssistantId).Path);
end;
{ TAssistantParams }
function TAssistantParams.Description(const Value: string): TAssistantParams;
begin
Result := TAssistantParams(Add('description', Value));
end;
function TAssistantParams.FileIds(const Value: TArray<string>): TAssistantParams;
begin
Result := TAssistantParams(Add('file_ids', Value));
end;
function TAssistantParams.Instructions(const Value: string): TAssistantParams;
begin
Result := TAssistantParams(Add('instructions', Value));
end;
function TAssistantParams.Metadata(const Value: TJSONParam): TAssistantParams;
begin
Result := TAssistantParams(Add('metadata', Value));
end;
function TAssistantParams.Model(const Value: string): TAssistantParams;
begin
Result := TAssistantParams(Add('model', Value));
end;
function TAssistantParams.Name(const Value: string): TAssistantParams;
begin
Result := TAssistantParams(Add('name', Value));
end;
function TAssistantParams.Tools(const Value: TArray<TAssistantTool>): TAssistantParams;
begin
Result := TAssistantParams(Add('tools', TArray<TJSONParam>(Value)));
end;
{ TToolFunction }
destructor TToolFunction.Destroy;
begin
FParameters.Free;
inherited;
end;
{ TTools }
destructor TTools.Destroy;
begin
FFunction.Free;
inherited;
end;
{ TAssistantsFilesRoute }
function TAssistantsFilesRoute.Create(const AssistantId: string;
ParamProc: TProc<TAssistantFileParams>): TAssistantFiles;
begin
var Path := TPathParameters.AssistantsFiles(AssistantId).Path;
Result := API.Post<TAssistantFiles, TAssistantFileParams>(Path, ParamProc);
end;
function TAssistantsFilesRoute.Create(const AssistantId,
FileId: string): TAssistantFiles;
begin
Result := Create(AssistantId,
procedure (Params: TAssistantFileParams)
begin
Params.FileId(FileId);
end
);
end;
function TAssistantsFilesRoute.Delete(const AssistantId,
FileId: string): TDeletionStatus;
begin
Result := API.Delete<TDeletionStatus>(TPathParameters.AssistantsFiles(AssistantId, FileId).Path);
end;
function TAssistantsFilesRoute.List(const AssistantId: string;
ParamProc: TProc<TListedDataParams>): TListedData<TAssistantFiles>;
begin
var Path := TPathParameters.AssistantsFiles(AssistantId).Path;
Result := API.Get<TListedData<TAssistantFiles>, TListedDataParams>(Path, ParamProc);
end;
function TAssistantsFilesRoute.Retrieve(const AssistantId,
FileId: string): TAssistantFiles;
begin
Result := API.Get<TAssistantFiles>(TPathParameters.AssistantsFiles(AssistantId, FileId).Path);
end;
{ TAssistantFileParams }
function TAssistantFileParams.FileId(const Value: string): TAssistantFileParams;
begin
Result := TAssistantFileParams(Add('file_id', Value));
end;
end.
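Assuming the route is exposed on the client as `Assistants` (hypothetical wiring; the property name may differ in your copy of the library), creating an assistant could look like this sketch:

```pascal
// Sketch: create an assistant with the code_interpreter tool enabled.
// Assumes OpenAI: IOpenAI is already configured with an API key and
// that TAssistantsRoute is exposed as OpenAI.Assistants.
var Assistant := OpenAI.Assistants.Create(
  procedure(Params: TAssistantParams)
  begin
    Params.Model('gpt-4-1106-preview');
    Params.Name('Math Tutor');
    Params.Instructions('You are a personal math tutor.');
    Params.Tools([TAssistantCodeInterpreterTool.Build]);
  end);
try
  Writeln(Assistant.Id);
finally
  Assistant.Free;
end;
```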
Unfortunately the latest changes for streaming are causing issues. I am seeing two things. (1) Code that previously worked with OpenAI.Chat.Create is now timing out. I have the OpenAI library from before the streaming update; my code works with that, but with the new streaming changes it no longer does. I can switch back and forth between the two versions of the library: the old one works fine, but the new one times out on the exact same simple question. (2) The second issue is with the streaming update itself. I can't explain why, but if I set a breakpoint in the results procedure, I get data; with no breakpoint, it times out. I have attached my source. Look at both the stream model section (line 45) and the chat model section (line 68). In the stream model, I never get 'In response loop' unless I am stepping through the code.
Chat_Sample.txt
When using the transcription functionality, requesting SRT format raises an OpenAIExceptionInvalidResponse exception even though valid data is returned. The problem is that the API returns plain-text SRT, not JSON.
procedure TMainUI.Button3Click(Sender: TObject);
var
  OpenAI: IOpenAI;
  AIR: TAudioText;
begin
  OpenAI := TOpenAI.Create('***');
  AIR := nil;  // created by CreateTranscription; freed below
  try
    try
      AIR := OpenAI.Audio.CreateTranscription(
        procedure(Params: TAudioTranscription)
        begin
          Params.&File('D:\test.mp4');
          Params.ResponseFormat('srt');
          Params.Language('en');
        end);
      Memo2.Lines.Add(AIR.Text);
    except
      on E: OpenAIExceptionInvalidResponse do
        Memo2.Lines.Add('OpenAI Error: ' + E.Message + ' - ' + E.Code.ToString);
    end;
  finally
    AIR.Free;
  end;
end;
The exception is triggered here, even with HTTP status 200 and valid SRT text in the response body:
function TOpenAIAPI.ParseResponse<T>(const Code: Int64; const ResponseText: string): T;
begin
case Code of
200..299:
try
Result := TJson.JsonToObject<T>(ResponseText);
except
Result := nil;
end;
else
ParseError(Code, ResponseText);
end;
if not Assigned(Result) then
raise OpenAIExceptionInvalidResponse.Create('Empty or invalid response:', '', '', Code);
end;
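One possible workaround (a sketch only, not the library's official fix) is to detect non-JSON bodies before deserializing and wrap the plain text in a JSON object so that the target object's "text" field is still populated:

```pascal
function TOpenAIAPI.ParseResponse<T>(const Code: Int64; const ResponseText: string): T;
var
  Wrapped: TJSONObject;
begin
  case Code of
    200..299:
      try
        if ResponseText.TrimLeft.StartsWith('{') then
          Result := TJson.JsonToObject<T>(ResponseText)
        else
        begin
          // Plain-text formats (srt, vtt, text) are not JSON; wrap them
          // so deserialization fills the object's "text" field.
          Wrapped := TJSONObject.Create;
          try
            Wrapped.AddPair('text', ResponseText);
            Result := TJson.JsonToObject<T>(Wrapped.ToJSON);
          finally
            Wrapped.Free;
          end;
        end;
      except
        Result := nil;
      end;
  else
    ParseError(Code, ResponseText);
  end;
  if not Assigned(Result) then
    raise OpenAIExceptionInvalidResponse.Create('Empty or invalid response:', '', '', Code);
end;
```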
Hello, do you have any plans to support Ollama?
This version does not take Vision into account. Here is some sample code to support this.
Add the OpenAI.Vision.Images unit to manage images to send to the model for analysis.
unit OpenAI.Vision.Images;
interface
uses
System.SysUtils, System.Classes;
type
/// <summary>
/// Specifies how the model accesses the image.
/// Warning: images can be passed in user and assistant messages.
/// Images in the first system message are currently not supported, but this may change in the future.
/// </summary>
TImageSourceType = (
/// <summary>
/// The model accesses the image through a link (URL) to the image.
/// </summary>
FromUrl,
/// <summary>
/// The model accesses the image through the base64-encoded image included directly in the request.
/// </summary>
FromBase64);
TImageFormatType = (jpeg, png);
TImageFormatTypehelper = record helper for TImageFormatType
function ToString: string;
end;
TImageDetail = (
/// <summary>
/// By default, the model will use the auto setting which will look at the image input size and
/// decide if it should use the low or high setting.
/// </summary>
auto,
/// <summary>
/// low will disable the “high res” model. The model will receive a low-res 512px x 512px version
/// of the image, and represent the image with a budget of 65 tokens. This allows the API to return
/// faster responses and consume fewer input tokens for use cases that do not require high detail.
/// </summary>
low,
/// <summary>
/// high will enable “high res” mode, which first allows the model to see the low res image and then
/// creates detailed crops of input images as 512px squares based on the input image size. Each of
/// the detailed crops uses twice the token budget (65 tokens) for a total of 129 tokens.
/// </summary>
high);
TImageDetailhelper = record helper for TImageDetail
function ToString: string;
end;
TImageSource = record
private
FType: TImageSourceType;
FFormat: TImageFormatType;
FDetail: TImageDetail;
FValue: string;
public
/// <summary>
/// By controlling the detail parameter, which has three options, low, high, or auto, you have control over
/// how the model processes the image and generates its textual understanding.
/// </summary>
property Detail: TImageDetail read FDetail write FDetail;
/// <summary>
/// The value field can be either a URL linking to the image to query, or the image in base64 format,
/// which will be transmitted with the request.
/// </summary>
property Value: string read FValue write FValue;
class function Create(const Url: string; const DetailValue: TImageDetail = auto): TImageSource; overload; static;
class function Create(const Format: TImageFormatType; const Base64Img: string;
const DetailValue: TImageDetail = auto): TImageSource; overload; static;
end;
TChatVisionBuild = record
private
FContent: string;
FImageSources: TArray<TImageSource>;
public
property Content: string read FContent write FContent;
property ImageSources: TArray<TImageSource> read FImageSources write FImageSources;
class function Create(const ContextText: string; Images: TArray<TImageSource>): TChatVisionBuild; static;
end;
implementation

{ TImageFormatTypehelper }

function TImageFormatTypehelper.ToString: string;
begin
  case Self of
    TImageFormatType.jpeg:
      Exit('data:image/jpeg;base64,%s');
    TImageFormatType.png:
      Exit('data:image/png;base64,%s');
  end;
end;

{ TImageSource }

class function TImageSource.Create(const Url: string;
  const DetailValue: TImageDetail): TImageSource;
begin
  Result.FType := TImageSourceType.FromUrl;
  Result.FDetail := DetailValue;
  Result.FValue := Url;
end;

class function TImageSource.Create(const Format: TImageFormatType;
  const Base64Img: string; const DetailValue: TImageDetail): TImageSource;
begin
  Result.FType := TImageSourceType.FromBase64;
  Result.FFormat := Format;
  Result.FDetail := DetailValue;
  Result.FValue := System.SysUtils.Format(Format.ToString, [Base64Img]);
end;

{ TChatVisionBuild }

class function TChatVisionBuild.Create(const ContextText: string;
  Images: TArray<TImageSource>): TChatVisionBuild;
begin
  Result.FContent := ContextText;
  Result.FImageSources := Images;
end;

{ TImageDetailhelper }

function TImageDetailhelper.ToString: string;
begin
  case Self of
    TImageDetail.auto:
      Exit('auto');
    TImageDetail.low:
      Exit('low');
    TImageDetail.high:
      Exit('high');
  end;
end;

end.
So I modified the OpenAI.Chat unit to support Vision:
unit OpenAI.Chat;

interface

uses
  System.SysUtils, OpenAI.API.Params, OpenAI.API, OpenAI.Chat.Functions,
  System.Classes, REST.JsonReflect, System.JSON,
  OpenAI.Vision.Images;

......

  TChatParams = class(TJSONParam)
    /// <summary>
    /// ID of the model to use. See the model endpoint compatibility table for details on which models work with the Chat API.
    /// </summary>
    /// <seealso>https://platform.openai.com/docs/models/model-endpoint-compatibility</seealso>
    function Model(const Value: string): TChatParams;
    .........
    /// <summary>
    /// GPT-4 with Vision, sometimes referred to as GPT-4V or gpt-4-vision-preview in the API, allows the model to take in images
    /// and answer questions about them.
    /// </summary>
    function Vision(const Context: string; Images: TArray<TImageSource>;
      const Role: TMessageRole = TMessageRole.User): TChatParams;
    constructor Create; override;
  end;

........
function TChatParams.Vision(const Context: string;
  Images: TArray<TImageSource>; const Role: TMessageRole): TChatParams;
var
  Item: TImageSource;
  JSON: TJSONObject;
  JSONImgObj: TJSONObject;
  Items: TJSONArray;
  ArrayMessage: TJSONArray;
begin
  case Role of
    TMessageRole.User, TMessageRole.Assistant: ;
  else
    raise Exception.CreateFmt('Inappropriate role (%s)', [Role.ToString]);
  end;
  Items := TJSONArray.Create;
  try
    // {"type": "text", "text": "What's in this image?"}
    JSON := TJSONObject.Create;
    JSON.AddPair('type', 'text');
    JSON.AddPair('text', Context);
    Items.Add(JSON);
    for Item in Images do
    begin
      // {"type": "image_url",
      //  "image_url": {"url": "URL or Base64 image content",
      //                "detail": "auto/low/high"}}
      JSON := TJSONObject.Create;
      JSON.AddPair('type', 'image_url');
      JSONImgObj := TJSONObject.Create;
      JSONImgObj.AddPair('url', Item.Value);
      JSONImgObj.AddPair('detail', Item.Detail.ToString);
      JSON.AddPair('image_url', JSONImgObj);
      Items.Add(JSON);
    end;
    // {"role": "user"/"assistant", "content": [...]}
    JSON := TJSONObject.Create;
    JSON.AddPair('role', Role.ToString);
    JSON.AddPair('content', Items);
    ArrayMessage := TJSONArray.Create;
    ArrayMessage.Add(JSON);
  except
    Items.Free;
    raise;
  end;
  Result := TChatParams(Add('messages', ArrayMessage));
end;
To use Vision:
Example 1: Using a link to the image to be processed
var Chat := OpenAI.Chat.Create(
  procedure(Params: TChatParams)
  begin
    Params.Model('gpt-4-vision-preview');
    Params.Vision(MemoQuery.Text,
      [TImageSource.Create('https://upload.wikimedia.org/wikipedia/commons/thumb/d/dd/Gfp-wisconsin-madison-the-nature-boardwalk.jpg/2560px-Gfp-wisconsin-madison-the-nature-boardwalk.jpg',
        TImageDetail.low)], TMessageRole.User);
    Params.MaxTokens(300);
  end);
try
  for var Choice in Chat.Choices do
    MemoResult.Lines.Add(Choice.Message.Content);
finally
  Chat.Free;
end;
Example 2: Uploading the image together with the query (note that the Vision method above accepts only the User or Assistant role, so TMessageRole.User is used here)
var Chat := OpenAI.Chat.Create(
  procedure(Params: TChatParams)
  begin
    Params.Model('gpt-4-vision-preview');
    Params.Vision(MemoQuery.Text,
      [TImageSource.Create(TImageFormatType.png, Image1.Picture.ToBase64,
        TImageDetail.low)], TMessageRole.User);
    Params.MaxTokens(300);
  end);
try
  for var Choice in Chat.Choices do
    MemoResult.Lines.Add(Choice.Message.Content);
finally
  Chat.Free;
end;
If Image1 is a VCL TImage component, a small class helper can convert its picture to Base64 (requires System.Classes, System.SysUtils and System.NetEncoding in the uses clause):
TPictureHelper = class helper for TPicture
  function ToBase64: string;
end;

{ TPictureHelper }

function TPictureHelper.ToBase64: string;
begin
  var Input := TMemoryStream.Create;
  var Output := TStringStream.Create(EmptyStr, TEncoding.UTF8);
  try
    Self.SaveToStream(Input);
    Input.Position := 0;
    TNetEncoding.Base64.Encode(Input, Output);
    Result := Output.DataString;
  finally
    Input.Free;
    Output.Free;
  end;
end;
Hi,
I try to set Params.N(Count) to generate Count images via DALL-E 3, but I get an error message saying this model does not support generating multiple images. Why?
Hi,
I try to use your class, as below:
const
  PromptTemplate4 = 'You are UIBot, a translator of GUI elements and controls for software.' +
    ' You translate English buttons and menu items to Japanese.' +
    ' Important: Keyboard shortcuts in English may exist and their operation must be preserved.' +
    ' If the translation will be only kana or kanji, you must extract the shortcut key and add it at the end.' +
    ' Example input: ''&Add-ons and features'' gives example output ''アドオンと機能 (&A)''';

function TMainForm.SendPrompt(Role: TMessageRole; Prompt: string): string;
begin
  var Chat := FOpenAI.Chat.Create(
    procedure(Params: TChatParams)
    begin
      Params.Messages([TChatMessageBuild.Create(Role, Prompt)]);
      Params.Model('gpt-4');
      //Params.MaxTokens(1024);
    end);
  try
    for var Choice in Chat.Choices do
    begin
      txtChatReply.Lines.Add(Choice.Message.Content);
      txtChatReply.Lines.Add('---------------');
    end;
  finally
    Chat.Free;
  end;
end;

procedure TMainForm.btnSendPromptClick(Sender: TObject);
begin
  SendPrompt(TMessageRole.System, PromptTemplate4);
  SendPrompt(TMessageRole.User, 'Translate button text: &Repair');
end;
The results are:
"Close &Without Saving"
---------------
&Reparar
---------------
It seems that the second time I call SendPrompt, it has forgotten its role of translating English to Japanese and translated "Repair" into Spanish ("Reparar") instead.
I have tried entering the same two prompts in your ChatGPT sample project, and there the second prompt returns the Japanese correctly. Why?
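A hedged guess at the cause, sketched below: each Chat.Create call is a stateless HTTP request, so the system prompt sent in the first call is not known to the second. Sending both messages in one Messages array (reusing the identifiers FOpenAI, PromptTemplate4 and txtChatReply from the post above) should keep the translator role in effect:

```pascal
// Sketch, not the library author's answer: send the system prompt and the
// user prompt together in a single request so the model sees both.
var Chat := FOpenAI.Chat.Create(
  procedure(Params: TChatParams)
  begin
    Params.Messages([
      TChatMessageBuild.Create(TMessageRole.System, PromptTemplate4),
      TChatMessageBuild.Create(TMessageRole.User, 'Translate button text: &Repair')]);
    Params.Model('gpt-4');
  end);
try
  for var Choice in Chat.Choices do
    txtChatReply.Lines.Add(Choice.Message.Content);
finally
  Chat.Free;
end;
```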
Just "uses OpenAI;" and nothing else!
Compiling with Delphi 10.2.3, unit OpenAI.Completions fails at "end);" with:
[dcc32 Error] OpenAI.Completions.pas(218): E2250 There is no overloaded version of 'TOpenAIAPI.Post<OpenAI.Completions.TCompletionParams>' that can be called with these arguments
P.S. I have a few versions from you. What is going on?
Thank you so much for this incredible library. I am trying to use it in a console-based streaming example. I can create a Chat and get all the data back in one return message. However, when I try to use streaming, I get an error. The following console code works fine: I submit my chat and get the entire answer back in one "event". I would like the same behavior as the ChatGPT website, where the tokens are displayed as they are generated. My code is as follows...
var Buf: TStringList;
begin
  ...
  var Chat := OpenAI.Chat.Create(
    procedure(Params: TChatParams)
    begin
      Params.Messages([TChatMessageBuild.Create(TMessageRole.User, Buf.Text)]);
      Params.MaxTokens(1024);
      // Params.Stream(True);
    end);
  try
    for var Choice in Chat.Choices do
    begin
      Buf.Add(Choice.Message.Content);
      Writeln(Choice.Message.Content);
    end;
  finally
    Chat.Free;
  end;
This code works. When I try to turn on streaming, I get the EConversionError 'The input value is not a valid Object', which causes ChatGPT to return 'Empty or Invalid Response'. Any ideas appreciated.
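A hedged sketch of the streaming variant, based on the CreateStream usage shown elsewhere in this thread (the exact callback signature of TChatEvent is assumed to be procedure(Chat: TChat; IsDone: Boolean; var Cancel: Boolean)). Streamed responses deliver partial tokens in Delta.Content rather than a complete Message:

```pascal
// Sketch: stream the reply token-by-token instead of collecting it
// in one response. Chat may be nil on keep-alive events, so it is
// checked before use.
OpenAI.Chat.CreateStream(
  procedure(Params: TChatParams)
  begin
    Params.Messages([TChatMessageBuild.Create(TMessageRole.User, Buf.Text)]);
    Params.MaxTokens(1024);
    Params.Stream;  // enables server-sent events
  end,
  procedure(Chat: TChat; IsDone: Boolean; var Cancel: Boolean)
  begin
    if (not IsDone) and Assigned(Chat) then
      Write(Chat.Choices[0].Delta.Content)  // partial token(s) as they arrive
    else if IsDone then
      Writeln(sLineBreak + 'DONE!');
  end);
```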
Hello to all!
First of all, thanks very much for your work on DelphiOpenAI; I find it really useful and very easy to implement. I have a question related to a possible "conversation" chat with OpenAI.
By a "conversation" (sorry for my English) I mean maintaining a chat like the one at https://chat.openai.com/chat
If we start a chat using the above URL and place the below question:
"Hello! My name is David, how are you?"
... we will get a response similar to the one below:
"Hello David! As an artificial intelligence language model, I don't have emotions, but I'm functioning optimally and ready to assist you with any questions or tasks you have. How can I assist you today?"
If, after that response, we send another question like:
"What is my name?"
... the OpenAI chat response is something like:
"Your name is David, as you mentioned in your previous message."
However, using DelphiOpenAI, if I reproduce the above "conversation", after the "What is my name?" question what I get is something similar to:
"As an AI language model, I don't have access to your personal information or your name. Can you please tell me your name?"
Maybe I am missing something? That's very probable, so sorry if that's the case! I am using your provided chat stream sample as is, and I can't find a way to reproduce such a conversation with DelphiOpenAI.
Thanks in advance for any help, and, again thanks again for your work!
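A hedged sketch of one way to get this behavior: the API itself is stateless, so "memory" has to be emulated by resending the whole message history with every request. The History list and the TChatMessageBuild.User/Assistant helpers below follow the usage seen elsewhere in this thread, but the exact helper names are an assumption:

```pascal
// Sketch: keep the conversation in a list that outlives each request
// (e.g. a form field) and send the full history every time.
var History: TList<TChatMessageBuild>;  // requires System.Generics.Collections
...
History.Add(TChatMessageBuild.User(UserInput));  // add the new question
var Chat := OpenAI.Chat.Create(
  procedure(Params: TChatParams)
  begin
    Params.Messages(History.ToArray);  // entire conversation so far
    Params.Model('gpt-3.5-turbo');
  end);
try
  var Reply := Chat.Choices[0].Message.Content;
  History.Add(TChatMessageBuild.Assistant(Reply));  // remember the answer too
finally
  Chat.Free;
end;
```

With this pattern the second request contains "My name is David...", the assistant's reply, and "What is my name?", so the model can answer from context.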
https://github.com/HemulGM/DelphiOpenAI/blob/main/OpenAI.Files.pas
Line 18,
TFileCreateParams inherits from TMultipartFormData, but TFileCreateParams does not implement its own constructor. Therefore,
TFileCreateParams.Create calls TObject.Create, skipping all the initialization code inside TMultipartFormData.Create(...).
This later results in invalid access to the Stream object inside TMultipartFormData.
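A hedged sketch of a possible fix for the report above: declare a constructor on TFileCreateParams that forwards to the inherited TMultipartFormData constructor, so the internal output stream is initialized. The member list is elided; only the constructor is shown:

```pascal
// Sketch, assuming the declaration in OpenAI.Files.pas looks like this.
type
  TFileCreateParams = class(TMultipartFormData)
  public
    constructor Create; reintroduce;
    // ... existing members ...
  end;

constructor TFileCreateParams.Create;
begin
  inherited Create;  // runs TMultipartFormData's initialization, not TObject's
end;
```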
I was using function calling and everything worked.
But I made changes in the code to support Tools (using the same functions as before), and I can't get it to work.
It's not calling the Tools (i.e. the function) and returns blank content.
OpenAI.Engines.pas, Line 13, FOwned_by should really be "FOwner"
unit OpenAI.Chat;

class function TFinishReasonHelper.Create(const Value: string): TFinishReason;
begin
  if Value = 'stop' then
    Exit(TFinishReason.Stop)
  else if Value = 'length' then
    Exit(TFinishReason.Length)
  else if Value = 'function_call' then
    Exit(TFinishReason.FunctionCall)
  else if Value = 'content_filter' then
    Exit(TFinishReason.ContentFilter)
  else if Value = 'null' then
    Exit(TFinishReason.ToolCalls)
  else if Value = 'tool_calls ' then
    Exit(TFinishReason.Null);
  Result := TFinishReason.Stop;
end;
It seems the 'null' and 'tool_calls' mappings are swapped, and 'tool_calls ' contains a trailing space, so it will never match.
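For reference, a sketch of what the corrected mapping would presumably look like, with the two Exit values swapped back and the trailing space removed:

```pascal
// Sketch of the corrected version: 'tool_calls' -> ToolCalls, 'null' -> Null.
class function TFinishReasonHelper.Create(const Value: string): TFinishReason;
begin
  if Value = 'stop' then
    Exit(TFinishReason.Stop)
  else if Value = 'length' then
    Exit(TFinishReason.Length)
  else if Value = 'function_call' then
    Exit(TFinishReason.FunctionCall)
  else if Value = 'content_filter' then
    Exit(TFinishReason.ContentFilter)
  else if Value = 'tool_calls' then  // no trailing space
    Exit(TFinishReason.ToolCalls)
  else if Value = 'null' then
    Exit(TFinishReason.Null);
  Result := TFinishReason.Stop;
end;
```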