jdicke/offline-web-scraper-gpt

A learning project: running a local LLM, adding a web scraper, and building a Streamlit UI.


# Local ChatGPT with URL Scraping

This project provides a local ChatGPT-like application that generates responses to user prompts. Optionally, the user can supply a URL; the page is scraped and the scraped content is used as context for generating the response.
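
As a rough illustration of the scraping step, the hypothetical snippet below fetches a page with `requests`, reduces it to visible text with `BeautifulSoup`, and prepends the result to the user's prompt. Function and variable names here are illustrative; they are not taken from `app.py`.

```python
# Hypothetical sketch of the scraping step: fetch a page and reduce it to
# plain text that can be prepended to the prompt as context.
import requests
from bs4 import BeautifulSoup

def scrape_url(url: str, max_chars: int = 4000) -> str:
    """Fetch a URL and return its visible text, truncated to max_chars."""
    response = requests.get(url, timeout=10)
    response.raise_for_status()
    soup = BeautifulSoup(response.text, "html.parser")
    for tag in soup(["script", "style"]):  # drop non-visible content
        tag.decompose()
    text = " ".join(soup.get_text(separator=" ").split())
    return text[:max_chars]

context = scrape_url("https://example.com")
prompt = f"Context:\n{context}\n\nQuestion: What is this page about?"
```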

## Features

- Generate responses based on user prompts
- Optionally scrape content from a provided URL to use as context for generating responses
- Simple and intuitive user interface built with Streamlit

## Prerequisites

- Python 3.7 or higher
- NVIDIA GPU (optional, for GPU acceleration)
- CUDA (if using GPU acceleration)

## Installation

1. Clone the repository:

   ```bash
   git clone https://github.com/your-username/local-chatgpt.git
   cd local-chatgpt
   ```

2. Install the required packages:

   ```bash
   pip install -r requirements.txt
   ```

3. (Optional) If you have an NVIDIA GPU and want to use GPU acceleration, install the appropriate CUDA version of PyTorch by following the instructions on the PyTorch website.
4. (Optional) Create a `.env` file in the project root directory and add any necessary API credentials or environment variables. For example:

   ```
   API_KEY=your_api_key
   ```
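
If `app.py` reads the `.env` file, it likely does so with something like `python-dotenv`; the snippet below is a guess at that pattern, combined with a standard check for CUDA availability. The `API_KEY` name comes from the example above; everything else is an assumption rather than confirmed project code.

```python
# Assumed pattern for reading .env values and picking a device; not taken
# from the project's actual app.py.
import os

import torch
from dotenv import load_dotenv  # pip install python-dotenv

load_dotenv()                   # reads .env from the current working directory
api_key = os.getenv("API_KEY")  # None if the variable is not set

# Use the GPU only when a CUDA-enabled PyTorch build and device are available.
device = "cuda" if torch.cuda.is_available() else "cpu"
print(f"API key present: {api_key is not None}; running on {device}")
```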

## Usage

1. Run the Streamlit app:

   ```bash
   streamlit run app.py
   ```

2. The app will open in your default web browser.
3. Enter your prompt in the text area.
4. (Optional) Enter a URL in the text input field if you want to scrape content from a website to use as context for generating the response.
5. Click the "Generate Response" button.
6. The generated response will be displayed below the input fields (a minimal UI sketch follows these steps).
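
The flow above maps onto a handful of standard Streamlit widgets. The sketch below is a minimal, self-contained reconstruction of that interface; the widget labels and the `generate_response` placeholder are assumptions, not the project's actual code.

```python
# Minimal Streamlit sketch of the described UI; run with `streamlit run sketch.py`.
import streamlit as st

def generate_response(prompt: str) -> str:
    # Placeholder: the real app would call the local language model here.
    return f"(model output for: {prompt[:60]}...)"

st.title("Local ChatGPT with URL Scraping")
prompt = st.text_area("Enter your prompt")
url = st.text_input("Optional URL to scrape for context")

if st.button("Generate Response"):
    # In the real app, the URL would be scraped here and prepended as context.
    full_prompt = f"[context from {url}]\n\n{prompt}" if url else prompt
    st.write(generate_response(full_prompt))
```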

## Contributing

Contributions are welcome! Please follow the standard GitHub workflow:

1. Fork the repository
2. Create a new branch (`git checkout -b feature/your-feature`)
3. Commit your changes (`git commit -am 'Add some feature'`)
4. Push to the branch (`git push origin feature/your-feature`)
5. Create a new Pull Request

## License

This project is licensed under the MIT License.

## Acknowledgments

- Hugging Face Transformers - the library used for loading the OPT-IML 1.3B model and generating responses (a generation sketch follows this list).
- Streamlit - the Python library used for creating the user interface.
- Requests and BeautifulSoup - the libraries used for web scraping.
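
For reference, generating text from an OPT-IML 1.3B checkpoint with Transformers usually looks roughly like the following. The `facebook/opt-iml-1.3b` model ID and the sampling settings are assumptions about what the app uses, not values taken from its source.

```python
# Rough sketch of loading OPT-IML 1.3B and generating a response; the model ID
# and sampling settings are assumptions, not the project's actual configuration.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "facebook/opt-iml-1.3b"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)
device = "cuda" if torch.cuda.is_available() else "cpu"
model.to(device)

def generate_response(prompt: str, max_new_tokens: int = 200) -> str:
    inputs = tokenizer(prompt, return_tensors="pt").to(device)
    with torch.no_grad():
        output_ids = model.generate(
            **inputs,
            max_new_tokens=max_new_tokens,
            do_sample=True,
            temperature=0.7,
        )
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)

print(generate_response("Summarize the scraped page in one sentence."))
```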
