nojux-official/RAG-prototype
RAG prototype

Installation

To install the necessary dependencies, run the following commands:

pip install gradio ollama
conda install -c conda-forge langchain langchain-community pypdf
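Once the commands above finish, you can sanity-check the environment with a short stdlib-only script (this checker is illustrative, not part of the project; it only probes importability, it does not verify versions):

```python
import importlib.util

# Top-level module names the README's install commands should provide
REQUIRED = ["gradio", "ollama", "langchain", "langchain_community", "pypdf"]

def missing_packages(names=REQUIRED):
    """Return the subset of names that cannot be imported in this environment."""
    return [n for n in names if importlib.util.find_spec(n) is None]

if __name__ == "__main__":
    missing = missing_packages()
    if missing:
        print("Missing packages:", ", ".join(missing))
    else:
        print("All dependencies are installed.")
```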

Setting Up Ollama

  1. Install Ollama.
  2. Pull the llama3 model:
    ollama pull llama3
  3. Start the server with ollama serve (allow access from other IPs if clients connect from another machine).
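The steps above can be run as follows; OLLAMA_HOST is Ollama's documented environment variable for changing the bind address (by default the server listens on 127.0.0.1:11434 only):

```shell
# Download the llama3 model weights (only needed once)
ollama pull llama3

# Start the server; binding to 0.0.0.0 allows access from other machines.
# Omit OLLAMA_HOST to keep the default, localhost-only binding.
OLLAMA_HOST=0.0.0.0 ollama serve
```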

Configuring Ollama Host

Set the Ollama host in the .env file:

HOST=http://localhost:11434

By default, Ollama runs on port 11434.
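If app.py reads this setting itself, a stdlib-only loader might look like the sketch below (the project may instead use a library such as python-dotenv; load_env here is a hypothetical helper, not the project's actual code):

```python
import os

def load_env(path=".env"):
    """Parse simple KEY=VALUE lines from a .env file; skip blanks and # comments."""
    values = {}
    try:
        with open(path) as f:
            for line in f:
                line = line.strip()
                if not line or line.startswith("#") or "=" not in line:
                    continue
                key, _, value = line.partition("=")
                values[key.strip()] = value.strip()
    except FileNotFoundError:
        pass  # no .env file: fall back to the process environment / default
    return values

# Prefer .env, then the process environment, then Ollama's default address
HOST = load_env().get("HOST") or os.environ.get("HOST", "http://localhost:11434")
```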

Running the Application

Start the application by running:

python app.py

Accessing the Application

Once the application is running, open your browser and navigate to the URL printed in the terminal (Gradio's default is http://127.0.0.1:7860).

About

A project for KTU Skilled AI
