Comments (11)

abidlabs avatar abidlabs commented on July 28, 2024 1

Yes, concurrency_limit=1 is just a default, because often a machine will only have the resources to support a single user for ML demos. But in many demos (including the LMSYS chat arena, for example), this is increased to support more users at a time, in which case you'll want to add a lock around the dataset to ensure there are no race conditions.
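
A minimal sketch of that lock idea, assuming the shared data is just a Python list held on the server (the names here are illustrative, not part of Gradio):

import threading

import gradio as gr

saved_prompts = []  # shared across workers, so guard every write
prompts_lock = threading.Lock()

def get_response(prompt, history):
    # serialize writes so concurrent workers can't interleave updates
    with prompts_lock:
        saved_prompts.append(prompt)
    return "..."  # placeholder response

demo = gr.ChatInterface(get_response, concurrency_limit=10)
demo.launch()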

natolambert avatar natolambert commented on July 28, 2024 1

Another solution idea that I have, given that I'm modifying the ChatInterface source, is to store all the conversations in an internal variable of the chat interface, and then save prompts every N seconds with another process.

Saving data outside of the predict() function seems best if I can pull it off, given we have GPUs to enable concurrency.
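
A rough sketch of that pattern, using a background thread rather than a separate process (all names and the save path are illustrative):

import json
import threading
import time

conversation_buffer = []  # appended to by the chat handler
buffer_lock = threading.Lock()

def flush_periodically(interval_seconds=60):
    # every N seconds, snapshot the buffer and write it to disk
    while True:
        time.sleep(interval_seconds)
        with buffer_lock:
            snapshot = list(conversation_buffer)
        with open("conversations.json", "w") as f:
            json.dump(snapshot, f, indent=4)

threading.Thread(target=flush_periodically, daemon=True).start()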

abidlabs avatar abidlabs commented on July 28, 2024 1

Yup, agreed. I'm going to close this issue since I think this should be handled user-side. @pngwn feel free to reopen if you disagree.

natolambert avatar natolambert commented on July 28, 2024

Ohhh HuggingFaceDatasetSaver is cool!!!

abidlabs avatar abidlabs commented on July 28, 2024

I kinda regret implementing HuggingFaceDatasetSaver, as it is a clunky and shallow abstraction. It would have been better to let users write an arbitrary callback function for when the Flag button was clicked. Anyway, in this case, it's pretty straightforward to save prompts as part of your ChatInterface function, something like:

from datasets import Dataset
import gradio as gr

# start with an empty dataset that has a single "text" column
dataset = Dataset.from_dict({"text": []})

def get_response(prompt, history):
    global dataset
    # append the incoming prompt and push the updated dataset to the Hub
    dataset = dataset.add_item({"text": prompt})
    dataset.push_to_hub('your_username/your_text_dataset')
    return ...

demo = gr.ChatInterface(
  get_response,
)

demo.launch()

I would suggest that we make an example demo and share it, rather than including a relatively shallow abstraction that we will have to maintain.

natolambert avatar natolambert commented on July 28, 2024

@abidlabs let me try this.
Will there be any issue with asynchronicity? I'm not sure how multiple queries to get_response get handled in the backend.

abidlabs avatar abidlabs commented on July 28, 2024

To prevent issues with concurrency, by default only 1 worker will be running get_response() at any given time (this can be changed by setting the concurrency_limit parameter of gr.ChatInterface(), documented at https://www.gradio.app/docs/gradio/chatinterface).

I.e. if one user is getting a response back, all other users will be waiting in the queue.
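
For example, something like this (a sketch; get_response is the handler from earlier in the thread):

import gradio as gr

demo = gr.ChatInterface(
    get_response,
    concurrency_limit=4,  # let up to 4 workers run get_response at once
)
demo.launch()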

natolambert avatar natolambert commented on July 28, 2024

This makes sense. There is maybe some flexibility here by using streaming (+ a vLLM backend enables async-ness). I was also wondering if vLLM implemented a saving method; both make sense on my side.

I think eventually we'll want to enable more than one worker; hopefully multiple people use our demos. I'll look.

natolambert avatar natolambert commented on July 28, 2024

Or, if I really want this, I should just save the prompts locally in the predict() function, then periodically upload the results.
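
A sketch of that approach, assuming predict() writes JSON files into a local user_data/ directory as in the code later in this thread; the repo id is a placeholder, and upload_folder is just one way to do the periodic upload:

import threading
import time

from huggingface_hub import HfApi

api = HfApi()

def upload_periodically(interval_seconds=300):
    # push whatever predict() has saved locally every few minutes
    while True:
        time.sleep(interval_seconds)
        api.upload_folder(
            folder_path="user_data",
            repo_id="your_username/your_prompt_dataset",
            repo_type="dataset",
        )

threading.Thread(target=upload_periodically, daemon=True).start()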

pngwn avatar pngwn commented on July 28, 2024

I agree that it should be handled in userland conceptually but I think we can make it easier somehow. I'll reopen if I can come up with a decent proposal.

natolambert avatar natolambert commented on July 28, 2024

We got this working pretty easily @pngwn / @abidlabs.

Added these methods (along with some other attributes; they assume os, json, and datetime are imported):

    # below added by nathanl@
    def _save_single_conversation(self, chat_history):
        timestamp = datetime.datetime.now().strftime("%Y-%m-%d_%H-%M-%S")
        debug_mode = self.model_client.debug

        file_suffix = "_debug" if debug_mode else ""
        directory = "user_data"
        os.makedirs(directory, exist_ok=True)  # Ensure directory exists
        file_path = f"{directory}/chat_history_{timestamp}{file_suffix}.json"

        data_to_save = {
            "model_name": self.model_client.model,
            "conversation": chat_history,
            "model_name_2": None,  # No second model in this function
            "conversation_2": [
                [],
            ],  # Making sure to add an empty list or lists for data compatibility
            "timestamp": timestamp,
            "debug": debug_mode,
            "metadata": {},  # TODO add safety metadata
        }

        with open(file_path, "w") as f:
            json.dump(data_to_save, f, indent=4)

        return "Conversation saved successfully!"

    def _save_dual_conversation(self, chat_history, chat_history_2):
        timestamp = datetime.datetime.now().strftime("%Y-%m-%d_%H-%M-%S")
        debug_mode = self.model_client.debug

        file_suffix = "_debug" if debug_mode else ""
        directory = "user_data"
        os.makedirs(directory, exist_ok=True)  # Ensure directory exists
        file_path = f"{directory}/chat_history_{timestamp}{file_suffix}.json"

        data_to_save = {
            "model_name": self.model_client.model,
            "conversation": chat_history,
            "model_name_2": self.model_client_2.model,
            "conversation_2": chat_history_2,
            "timestamp": timestamp,
            "debug": debug_mode,
            "metadata": {},  # TODO add safety metadata
        }

        with open(file_path, "w") as f:
            json.dump(data_to_save, f, indent=4)

        return "Conversation saved successfully!"

They're called after inference like:

                .then(
                    submit_fn,
                    [self.saved_input, self.chatbot_state] + self.additional_inputs,
                    [self.chatbot, self.chatbot_state],
                    show_api=False,
                    concurrency_limit=cast(Union[int, Literal["default"], None], self.concurrency_limit),
                    show_progress=cast(Literal["full", "minimal", "hidden"], self.show_progress),
                )
                .then(
                    self.safety_fn,
                    [self.saved_input, self.chatbot_state] + self.additional_inputs,
                    [self.safety_log, self.safe_response],
                    concurrency_limit=cast(Union[int, Literal["default"], None], self.concurrency_limit),
                )  # SAVING DATA BELOW
                .then(
                    self._save_single_conversation,
                    inputs=[self.chatbot_state],
                    outputs=[],
                    show_api=False,
                    concurrency_limit=cast(Union[int, Literal["default"], None], self.concurrency_limit),
                )

I'm hoping to open source the example :)
