This project is an AI assistant chat application built with FastAPI, LlamaIndex, and ChromaDB. The assistant responds to user queries and performs tasks such as retrieving information and answering questions.
To install and run the AI assistant application locally, follow these steps:
- Clone the repository to your local machine: `git clone https://github.com/K4rlosReyes/ai-assistant.git`
- Navigate to the project directory: `cd ai-assistant`
- Install the dependencies using pip: `pip install -r requirements.txt`
- Start the FastAPI server: `python main.py`
- Access the AI assistant API at `http://localhost:8000`.
To run the RAG processing module (e.g. to index documents into ChromaDB), use: `python -m core.process.rag`
Once the application is running, you can interact with the AI assistant by sending HTTP requests to the API, and it will respond accordingly.
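As a sketch of such a request, the snippet below posts a chat message using only Python's standard library. The `/chat` path and the `message` field name are illustrative assumptions; the actual routes and request schema are defined in `main.py`.

```python
import json
import urllib.request

API_URL = "http://localhost:8000"

def build_chat_payload(message: str) -> bytes:
    # Encode the JSON request body; the "message" field name is an assumption.
    return json.dumps({"message": message}).encode("utf-8")

def ask_assistant(message: str, base_url: str = API_URL) -> dict:
    # POST the message to a hypothetical /chat endpoint and decode the JSON reply.
    req = urllib.request.Request(
        f"{base_url}/chat",
        data=build_chat_payload(message),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read().decode("utf-8"))
```

Since the server is built on FastAPI, the interactive docs at `http://localhost:8000/docs` show the real endpoints and schemas.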
This repository follows these development practices:
- Conventional Commits: Commits follow the conventional commits specification for clear and standardized commit messages.
- Code Style: Code adheres to the PEP 8 style guide for Python code.
- Documentation: Code is well-documented, and additional documentation is provided where necessary.
- Testing: Unit tests are included in the `tests` directory to ensure code reliability and maintainability.
- Continuous Integration: CI pipelines are set up to automatically run tests and checks on every push or pull request.
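A unit test in the `tests` directory might look like the following sketch. The helper function and its behavior are hypothetical, included only to illustrate the pytest style the project's tests would follow.

```python
# tests/test_example.py -- hypothetical pytest-style unit test sketch

def normalize_query(text: str) -> str:
    # Hypothetical helper: collapse runs of whitespace in a user query.
    return " ".join(text.split())

def test_normalize_query_collapses_whitespace():
    assert normalize_query("  what is   RAG? ") == "what is RAG?"

def test_normalize_query_empty_input():
    assert normalize_query("   ") == ""
```

Running `pytest tests/` from the project root discovers and runs such tests automatically.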
Contributions to the AI assistant chat application are welcome! If you'd like to contribute, please follow these steps:
- Fork the repository and create a new branch for your feature or fix.
- Make your changes and ensure that the code adheres to the project's development practices.
- Write tests for your changes to ensure code reliability.
- Submit a pull request detailing your changes and the problem they solve.
This project is licensed under the MIT License.