AI Agent simplifies the implementation and use of generative AI with LangChain. It was inspired by the AutoGen project.
Use the package manager npm to install AI Agent:

```bash
npm install ai-agent
```
```javascript
const agent = new Agent({
  name: '<name>',
  systemMesssage: '<a message that will specialize your agent>',
  llmConfig: {
    type: '<cloud-provider-llm-service>', // Check availability at <link>
    model: '<llm-model>',
    instance: '<instance-name>', // Optional
    apiKey: '<key-your-llm-service>', // Optional
  },
  chatConfig: {
    temperature: 0,
  },
});

agent.on('onMessage', async (message) => {
  console.warn('MESSAGE:', message);
});

await agent.call({
  question: 'What is the best way to get started with Azure?',
  chatThreadID: '<chat-id>',
});
```
When using an LLM together with vector stores, the agent retrieves the documents most relevant to the input question and uses them as context for its answer.
```javascript
const agent = new Agent({
  name: '<name>',
  systemMesssage: '<a message that will specialize your agent>',
  chatConfig: {
    temperature: 0,
  },
  llmConfig: {
    type: '<cloud-provider-llm-service>', // Check availability at <link>
    model: '<llm-model>',
    instance: '<instance-name>', // Optional
    apiKey: '<key-your-llm-service>', // Optional
  },
  vectorStoreConfig: {
    type: '<cloud-provider-llm-service>', // Check availability at <link>
    apiKey: '<your-api-key>', // Optional
    indexes: ['<index-name>'], // Your index names. Optional
    vectorFieldName: '<vector-base-field>', // Optional
    name: '<vector-service-name>', // Optional
    apiVersion: '<api-version>', // Optional
    model: '<llm-model>', // Optional
    customFilters: '<custom-filter>', // Optional. Example: 'field-vector-store=(userSessionId)' check at <link>
  },
});

agent.on('onMessage', async (message) => {
  console.warn('MESSAGE:', message);
});

await agent.call({
  question: 'What is the best way to get started with Azure?',
  chatThreadID: '<chat-id>',
});
```
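The retrieval step can be illustrated with a toy cosine-similarity search. This is a hypothetical sketch of what a vector store does for the agent, not the package's internals: real stores use approximate nearest-neighbor indexes rather than brute force, and the embedding vectors here are made up for illustration.

```javascript
// Cosine similarity between two equal-length vectors.
function cosineSimilarity(a, b) {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Documents paired with made-up embedding vectors.
const documents = [
  { text: 'Getting started with Azure', vector: [0.9, 0.1, 0.0] },
  { text: 'Cooking pasta at home', vector: [0.0, 0.2, 0.9] },
  { text: 'Azure pricing overview', vector: [0.8, 0.3, 0.1] },
];

// Return the top-k documents most similar to the query vector;
// their text is what would become the agent's context.
function findRelevant(queryVector, docs, k = 2) {
  return docs
    .map((doc) => ({ ...doc, score: cosineSimilarity(queryVector, doc.vector) }))
    .sort((a, b) => b.score - a.score)
    .slice(0, k);
}

const context = findRelevant([1, 0, 0], documents);
console.log(context.map((d) => d.text));
```

With the query vector `[1, 0, 0]`, the two Azure documents score highest and would be passed to the LLM as context, while the unrelated document is filtered out.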
If you've ever wanted to contribute to open source and a great cause, now is your chance! See the contributing docs for more information.
JP. Nobrega 💬 📖 👀 📢