Local LLMBot is a chat application built with Streamlit and the Ollama API for chatting with a locally hosted language model. The app provides an intuitive interface for managing multiple chat sessions and interacting with the model; a minimal sketch of this interaction follows the feature list below.
- Create and manage multiple chat sessions
- Generate responses using the Ollama model
- Intuitive user interface for displaying chat history
- Sidebar for chat session management
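The features above come down to one pattern: Streamlit renders the chat history and input box, and the Ollama client sends the conversation to the locally running model and displays the reply. The snippet below is a minimal sketch of that pattern, not the project's actual `app.py`; the `ollama` Python package, the model name `llama3`, and the single-session structure are assumptions based on the setup steps later in this README.

```python
# sketch_chat.py - illustrative sketch of the Streamlit + Ollama pattern,
# not the project's actual app.py.
import ollama
import streamlit as st

st.title("Local LLMBot (sketch)")

# Keep the conversation in session state so it survives Streamlit reruns.
if "messages" not in st.session_state:
    st.session_state.messages = []

# Replay the stored chat history on every rerun.
for msg in st.session_state.messages:
    with st.chat_message(msg["role"]):
        st.markdown(msg["content"])

# Read a new user prompt, send the whole history to the local model, show the reply.
prompt = st.chat_input("Ask the local model...")
if prompt:
    st.session_state.messages.append({"role": "user", "content": prompt})
    with st.chat_message("user"):
        st.markdown(prompt)

    response = ollama.chat(model="llama3", messages=st.session_state.messages)
    reply = response["message"]["content"]

    st.session_state.messages.append({"role": "assistant", "content": reply})
    with st.chat_message("assistant"):
        st.markdown(reply)
```

A file like this would be launched with `streamlit run sketch_chat.py` once Ollama is serving the model.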
- Python 3.7 or higher
- pip package manager
Before installing the required Python dependencies, ensure you have Ollama installed and properly configured.
- **Install Ollama**

  Follow the official instructions on the Ollama website to install Ollama for your platform.

- **Configure Ollama**

  Pull and run the `llama3` model:

  ```bash
  ollama run llama3
  ```
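Before continuing, you can optionally confirm that the Ollama server is reachable and that `llama3` has been pulled. The check below uses only the Python standard library and assumes Ollama's default local endpoint, `http://localhost:11434`; adjust it if you run Ollama elsewhere.

```python
# check_ollama.py - optional sanity check; assumes the default local Ollama endpoint.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"

try:
    # /api/tags lists the models the local Ollama server has available.
    with urllib.request.urlopen(f"{OLLAMA_URL}/api/tags", timeout=5) as resp:
        data = json.load(resp)
except OSError as exc:
    raise SystemExit(f"Could not reach Ollama at {OLLAMA_URL}: {exc}")

names = [model["name"] for model in data.get("models", [])]
print("Installed models:", names)

if not any(name.startswith("llama3") for name in names):
    print("llama3 is not installed yet - run `ollama pull llama3` first.")
```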
- **Clone the Repository**

  ```bash
  git clone https://github.com/Kevoyuan/LLAMA3.git
  cd LocalLLMBot
  ```

- **Create and Activate a Virtual Environment**

  ```bash
  python -m venv llama3
  .\llama3\Scripts\activate
  ```

- **Install Dependencies**

  ```bash
  pip install -r requirements.txt
  ```
- **Activate the Virtual Environment**

  Windows:

  ```bash
  .\llama3\Scripts\activate
  ```

  macOS/Linux:

  ```bash
  source llama3/bin/activate
  ```

- **Run the Streamlit App**

  ```bash
  streamlit run app.py
  ```

- **Access the App**

  Open your web browser and navigate to `http://localhost:8501`.
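Once the app is running, the sidebar lets you create new chats and switch between them. The project's exact implementation is not reproduced here; the sketch below only illustrates the common way this is done in Streamlit, with a per-session message list kept in `st.session_state` (all names are illustrative).

```python
# sketch_sessions.py - illustrative sidebar session management, not the project's code.
import streamlit as st

# One entry per chat session: {session name: list of messages}.
if "sessions" not in st.session_state:
    st.session_state.sessions = {"Chat 1": []}
if "active" not in st.session_state:
    st.session_state.active = "Chat 1"

with st.sidebar:
    st.header("Chat sessions")

    # Start a fresh, empty conversation.
    if st.button("New chat"):
        name = f"Chat {len(st.session_state.sessions) + 1}"
        st.session_state.sessions[name] = []
        st.session_state.active = name

    # Pick which conversation the main area should display.
    names = list(st.session_state.sessions)
    st.session_state.active = st.radio(
        "Select a chat", names, index=names.index(st.session_state.active)
    )

# The main area then renders st.session_state.sessions[st.session_state.active]
# exactly like the single-session sketch earlier in this README.
st.write(f"Active session: {st.session_state.active}")
```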
Contributions are welcome! Please fork the repository and create a pull request with your changes. Ensure that your code adheres to the project's coding standards and includes appropriate tests.