html-chat-ai-sdk's People

Contributors

rajeshdavidbabu

Forkers

einfachalf

html-chat-ai-sdk's Issues

Saving Chat History

Hey Rajesh! Great Project!!

I am building off of the Supabase clone of the Vercel AI chatbot and I am currently stuck on saving the chats to storage. I have gotten it to work, but the LangChain stream often seems to interrupt the upsert into Supabase. After a long day of back and forth I switched back to the approach you are taking, so my current implementation does not persist chat history.

This is my current code

import 'server-only';
import { getSession } from '@/app/supabase-server';
import { Database } from '@/lib/db_types';
import { templates } from '@/lib/template';
import { nanoid } from '@/lib/utils';
import { PineconeClient } from '@pinecone-database/pinecone';
import { createServerActionClient } from '@supabase/auth-helpers-nextjs';
import { LangChainStream, Message, StreamingTextResponse } from 'ai';
import { ConversationalRetrievalQAChain, LLMChain } from 'langchain/chains';
import { ChatOpenAI } from 'langchain/chat_models/openai';
import { OpenAIEmbeddings } from 'langchain/embeddings/openai';
import { BufferMemory } from 'langchain/memory';
import { PineconeStore } from 'langchain/vectorstores/pinecone';
import { cookies } from 'next/headers';
import { Configuration, OpenAIApi } from 'openai-edge';

export const runtime = 'nodejs';

const configuration = new Configuration({
  apiKey: process.env.OPENAI_API_KEY
});

const openai = new OpenAIApi(configuration);

const formatMessage = (message: Message) => {
  return `${message.role === 'user' ? 'Human' : 'Assistant'}: ${
    message.content
  }`;
};

export async function POST(req: Request) {
  const cookieStore = cookies();
  const supabase = createServerActionClient<Database>({
    cookies: () => cookieStore
  });
  const session = await getSession();
  const userId = session?.user.id;

  if (!userId) {
    return new Response('Unauthorized', {
      status: 401
    });
  }

  const streamingModel = new ChatOpenAI({
    modelName: 'gpt-3.5-turbo',
    streaming: true,
    verbose: true,
    temperature: 0
  });

  const nonStreamingModel = new ChatOpenAI({
    modelName: 'gpt-3.5-turbo',
    verbose: true,
    temperature: 0
  });
  const pinecone = new PineconeClient();
  await pinecone.init({
    environment: process.env.PINECONE_ENVIRONMENT ?? '',
    apiKey: process.env.PINECONE_API_KEY ?? ''
  });

  const pineconeIndex = pinecone.Index(process.env.PINECONE_INDEX_NAME!);

  const vectorStore = await PineconeStore.fromExistingIndex(
    new OpenAIEmbeddings(),
    { pineconeIndex }
  );

  const json = await req.json();
  const messages: Message[] = json.messages ?? [];
  const formattedPreviousMessages = messages.slice(0, -1).map(formatMessage);
  const question = messages[messages.length - 1].content;
  const sanitizedQuestion = question.trim().replaceAll('\n', ' ');
  const { stream, handlers } = LangChainStream();
  const chatHistory = formattedPreviousMessages.join('\n');

  const chain = ConversationalRetrievalQAChain.fromLLM(
    streamingModel,
    vectorStore.asRetriever(),
    {
      qaTemplate: templates.qaPrompt,
      questionGeneratorTemplate: templates.condensePrompt,
      returnSourceDocuments: true, // the retriever returns 4 source documents by default
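      // Note: chat_history is also passed in manually to chain.call further down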
      memory: new BufferMemory({
        memoryKey: 'chat_history',
        inputKey: 'question', // The key for the input to the chain
        outputKey: 'text', // The key for the final conversational output of the chain
        returnMessages: true // If using with a chat model (e.g. gpt-3.5 or gpt-4)
      }),
      questionGeneratorChainOptions: {
        llm: nonStreamingModel
      }
    }
  );

  // Question using chat-history
  // Reference https://js.langchain.com/docs/modules/chains/popular/chat_vector_db#externally-managed-memory
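  // Not awaited here: the LangChainStream handlers push tokens into the stream returned below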
  chain.call(
    {
      question: sanitizedQuestion,
      chat_history: chatHistory
    },
    [handlers]
  );

  return new StreamingTextResponse(stream);
}

And I need to figure out how to implement this logic, except without using the callback functionality, since it breaks the stream and won't save the data to the db:

const { stream, handlers } = LangChainStream({
  async onCompletion(completion) {
    const title = json.messages[0].content.substring(0, 100);
    const id = json.id ?? nanoid();
    const createdAt = Date.now();
    const path = `/chat/${id}`;
    const payload = {
      id,
      title,
      userId,
      createdAt,
      path,
      messages: [
        ...messages,
        {
          content: completion,
          role: 'assistant'
        }
      ]
    };

    await supabase.from('chats').upsert({ id, payload }).throwOnError();
  }
});
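
One idea I have been toying with, just a rough sketch rather than something I have verified, is to keep LangChainStream free of callbacks and instead do the upsert from the promise returned by chain.call, reading the answer from the `text` output key configured on the memory above:

chain
  .call(
    {
      question: sanitizedQuestion,
      chat_history: chatHistory
    },
    [handlers]
  )
  .then(async (result) => {
    // Persist the conversation only after the chain has fully resolved
    const title = json.messages[0].content.substring(0, 100);
    const id = json.id ?? nanoid();
    const createdAt = Date.now();
    const path = `/chat/${id}`;
    const payload = {
      id,
      title,
      userId,
      createdAt,
      path,
      messages: [
        ...messages,
        {
          content: result.text, // 'text' is the outputKey set on the BufferMemory
          role: 'assistant'
        }
      ]
    };

    await supabase.from('chats').upsert({ id, payload }).throwOnError();
  })
  .catch((error) => console.error('Failed to save chat', error));

return new StreamingTextResponse(stream);

That way nothing extra is registered on the stream, which is what seemed to break it, but I have not confirmed that the promise keeps running after the streaming response is returned.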

I would appreciate any help!
