This project showcases a chat application built with Streamlit that integrates Llama3 models via the Groq platform. It delivers fast, real-time AI-driven chat, making the most of Llama3's capabilities within an interactive web interface.
- Llama3 Integration: Leverages Llama3 alongside other major models, such as Mixtral and Gemma, to provide intelligent, context-aware responses.
- Groq Platform: Utilizes Groq's powerful computation capabilities to ensure fast and efficient model responses.
- Streamlit Interface: Offers a user-friendly web interface that allows users to interact with the AI dynamically.
- Real-Time Responses: Engineered to handle user inputs and deliver AI responses in real time.
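As a hedged sketch of how such a chat request might be issued, the snippet below targets Groq's OpenAI-compatible chat completions endpoint using only the standard library. The model IDs and endpoint URL are assumptions based on Groq's public API, not taken from this project's code:

```python
import json
import urllib.request

# Friendly names mapped to Groq model IDs (assumed; check Groq's current model list).
MODEL_IDS = {
    "Llama3 8B": "llama3-8b-8192",
    "Llama3 70B": "llama3-70b-8192",
    "Mixtral 8x7B": "mixtral-8x7b-32768",
    "Gemma 7B": "gemma-7b-it",
}

def build_chat_request(api_key: str, prompt: str, model: str = "llama3-8b-8192"):
    """Build an HTTP request for Groq's OpenAI-compatible chat endpoint."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        "https://api.groq.com/openai/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )

# Sending the request requires a valid key and network access:
# with urllib.request.urlopen(build_chat_request(key, "Hello!")) as resp:
#     reply = json.load(resp)["choices"][0]["message"]["content"]
```

In practice the app would use the official `groq` Python client instead of raw HTTP; this sketch only shows the shape of the request.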
- Python 3.x
- Streamlit
- Groq API
- dotenv for environment management
Ensure you have Python 3.6 or higher installed on your system. Streamlit and other required packages will be installed via the requirements file.
- Clone the repository:
git clone https://github.com/Op27/llama3-groq-streamlit.git
- Navigate to the project directory:
cd llama3-groq-streamlit
- Install the required Python packages:
pip install -r requirements.txt
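Based on the dependencies listed above, the requirements file likely resembles the following (exact package names and pins are assumptions; defer to the repository's actual requirements.txt):

```
streamlit
groq
python-dotenv
```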
To use this application, you'll need an API key from Groq. Visit the Groq API documentation to learn how to obtain one.
Once you have your API key, you need to set it in your environment:
- Rename `.env.example` to `.env`.
- Open the `.env` file and replace `YOUR_API_KEY_HERE` with your Groq API key.
This step is crucial for the application to interact with Groq's services securely.
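What the dotenv step does can be illustrated with a minimal standard-library sketch: read `KEY=VALUE` lines from `.env` and export them as environment variables. The `python-dotenv` package does this more robustly, and the variable name `GROQ_API_KEY` is an assumption about this project's `.env.example`:

```python
import os

def load_dotenv_minimal(path: str = ".env") -> None:
    """Minimal illustration of python-dotenv's behavior: parse KEY=VALUE
    lines, skipping blanks and comments, and export them to the environment."""
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            os.environ.setdefault(key.strip(), value.strip())

# The app can then read the key at startup, e.g.:
# api_key = os.environ["GROQ_API_KEY"]  # variable name is an assumption
```

Keeping the key in `.env` (and out of version control) is what lets the app talk to Groq's services without hard-coding credentials.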
To run the application, use the following command:
streamlit run main.py
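Internally, a Streamlit chat app typically keeps the conversation in `st.session_state` and replays it on each rerun. That bookkeeping can be sketched as plain Python; the helper function and the history cap below are illustrative assumptions, not this project's code:

```python
def append_turn(history, role, content, max_turns=20):
    """Append one chat message and trim the history so the prompt sent to
    the model stays bounded. `history` mirrors st.session_state.messages,
    a list of {"role": ..., "content": ...} dicts."""
    history.append({"role": role, "content": content})
    # Keep only the most recent messages (2 entries per user/assistant turn).
    del history[: max(0, len(history) - 2 * max_turns)]
    return history

# In the Streamlit app this would be driven by the chat widgets, e.g.:
# if prompt := st.chat_input("Ask something"):
#     append_turn(st.session_state.messages, "user", prompt)
#     ...call the model, then append_turn(..., "assistant", reply)
```

Trimming the history keeps the request within the model's context window on long conversations.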
This project is licensed under the MIT License - see the LICENSE file for details.