To install the necessary dependencies, run the following commands:
```sh
pip install gradio
pip install ollama
conda install langchain langchain-community
conda install conda-forge::pypdf
```

- Install Ollama.
- Pull the `llama3` model: `ollama pull llama3`
- Serve requests with `ollama serve` (allow access from other IPs if you connect from another machine).
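Once the model is pulled and the server is running, you can optionally sanity-check the setup from Python before starting the app. The snippet below is only an illustration using the `ollama` client installed above; the prompt text is arbitrary and it is not part of the application itself.

```python
# Optional sanity check (illustrative): ask the local llama3 model for a short reply.
# Assumes `ollama serve` is running on the default host and port (11434).
import ollama

response = ollama.chat(
    model="llama3",
    messages=[{"role": "user", "content": "Say 'ready' if you can hear me."}],
)
print(response["message"]["content"])
```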
Set the Ollama host in the `.env` file:

```
HOST=http://localhost:11434
```
By default, Ollama runs on port 11434.
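As a rough sketch of how the `HOST` value can be consumed (the actual loading logic in `app.py` may differ; reading the variable with `os.getenv` here is an assumption), both the `ollama` client and LangChain's Ollama wrapper accept the host explicitly:

```python
# Illustrative sketch: point both clients at the host from the .env file.
# Reading HOST via os.getenv (rather than a dotenv loader) is an assumption
# made for this example.
import os

import ollama
from langchain_community.llms import Ollama

host = os.getenv("HOST", "http://localhost:11434")

client = ollama.Client(host=host)            # low-level Ollama API client
llm = Ollama(base_url=host, model="llama3")  # LangChain wrapper for the same server

print(llm.invoke("Say hello in one short sentence."))
```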
Start the application by running:
```sh
python app.py
```

Once the application is running, open your browser and navigate to the URL printed in the terminal (Gradio's default is http://127.0.0.1:7860).
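For orientation, here is a minimal sketch of how the Gradio and Ollama pieces fit together. It is not the actual `app.py`, which presumably also wires in LangChain and pypdf for document handling; treat it as an illustration of the chat loop only.

```python
# Minimal illustration of the Gradio + Ollama stack; not the actual app.py.
import gradio as gr
import ollama


def chat(message, history):
    # Send the user's message to the local llama3 model and return its reply.
    response = ollama.chat(
        model="llama3",
        messages=[{"role": "user", "content": message}],
    )
    return response["message"]["content"]


# ChatInterface provides a ready-made chat UI; launch() prints the URL to open.
gr.ChatInterface(chat).launch()
```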