This is an AI-powered assistant for a pizza restaurant that supports:

- 🔧 Tool-based interactions with LLM agents
- 🍕 Placing pizza orders by providing pizza type, size, quantity, and delivery address
- 📄 Querying indexed documents (e.g., reviews, orders) using semantic search
- 🧠 Persistent conversation history
- 💾 Vector store support with ChromaDB and Ollama embeddings
- 🌐 REST API and WebSocket support via FastAPI
- 💬 Conversational interaction via the command line
## Install and Set Up Ollama

First, install Ollama:

```bash
curl -fsSL https://ollama.com/install.sh | sh
```

Start the Ollama server in the background:

```bash
nohup ollama serve &
```

Download the required models:

```bash
ollama pull llama3.2
ollama pull mxbai-embed-large
ollama pull llama3.2-vision
```

Note: The Ollama server must be running for the assistant to function properly.
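Before launching the assistant, you can check that the Ollama server is reachable. A minimal sketch using only the standard library, assuming Ollama's default port 11434 (`/api/tags` is Ollama's endpoint for listing local models):

```python
import urllib.request
import urllib.error

def ollama_is_running(base_url: str = "http://localhost:11434") -> bool:
    """Return True if an Ollama server responds at base_url."""
    try:
        # /api/tags lists locally available models; any 200 response
        # means the server is up.
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=2) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False

if __name__ == "__main__":
    print("Ollama running:", ollama_is_running())
```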
## Clone the Repo

```bash
git clone https://github.com/patrickwide/PizzaAssist.git
cd PizzaAssist
```
## Create and Activate a Virtual Environment

```bash
python -m venv venv
source venv/bin/activate    # For Linux/macOS
.\venv\Scripts\activate     # For Windows
```
## Install Dependencies

```bash
pip install -r requirements.txt
```
## Initialize the Data Structure

```bash
./create_data_structure.sh
```

You should see the following output:

```
🔧 Creating base directory structure...
✅ Created directory: data/db
✅ Created directory: data/history
✅ Created directory: data/documents
📄 Creating placeholder files in data/documents...
✅ Created file: data/documents/orders.txt
✅ Created file: data/documents/realistic_restaurant_reviews.csv
📝 Creating system_message.md and welcome_message.md...
✅ Created: data/system_message.md
✅ Created: data/welcome_message.md
🎉 Directory and file structure successfully initialized under 'data/'
✅ Setup complete. You can start working with your project.
```
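If you can't run the shell script (for example, on Windows), the same layout can be recreated in Python. A minimal sketch based only on the paths shown in the output above:

```python
from pathlib import Path

# Paths taken from create_data_structure.sh's output.
DIRS = ["data/db", "data/history", "data/documents"]
FILES = [
    "data/documents/orders.txt",
    "data/documents/realistic_restaurant_reviews.csv",
    "data/system_message.md",
    "data/welcome_message.md",
]

def create_data_structure(root: str = ".") -> None:
    """Create the data/ directory tree and empty placeholder files."""
    for d in DIRS:
        Path(root, d).mkdir(parents=True, exist_ok=True)
    for f in FILES:
        Path(root, f).touch(exist_ok=True)
```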
## Try It in Google Colab

You can test the Pizza Restaurant AI Assistant in Google Colab without any local setup. The notebook provides an interactive environment to:

- Test the assistant's capabilities
- Place pizza orders
- Query documents
- Experiment with different prompts

Simply click the "Open in Colab" button above to get started!
## Run the CLI Assistant

This launches the assistant in the terminal with natural language support:

```bash
python cli.py
```

You'll see:

```
Welcome to the Pizza Restaurant Assistant!
Ask about reviews or place an order.
(e.g., 'How is the pepperoni pizza?', 'Tell me about the service',
'I want to order 1 large veggie pizza to 456 Oak Avenue', 'exit' to quit)
```
## Run the Web Server

This launches the FastAPI backend on port 8000:

```bash
python main.py
```

You'll see logs like:

```
🚀 Server starting on http://127.0.0.1:8000
🔌 WebSocket endpoint: ws://127.0.0.1:8000/ws/ai
❤️ Health check: http://127.0.0.1:8000/health
```
| Method | Endpoint  | Description           |
|--------|-----------|-----------------------|
| GET    | /health   | Health check          |
| WS     | /ws/ai    | WebSocket for AI chat |
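A minimal client sketch for the chat endpoint, assuming the third-party `websockets` package (`pip install websockets`) and that the endpoint accepts plain-text frames; the exact message format depends on the server implementation:

```python
import asyncio

async def chat(prompt: str, url: str = "ws://127.0.0.1:8000/ws/ai") -> str:
    """Send one prompt over the WebSocket and return the first reply frame."""
    # Requires: pip install websockets
    import websockets  # imported lazily so this module loads without it

    async with websockets.connect(url) as ws:
        await ws.send(prompt)   # payload format is an assumption here
        return await ws.recv()  # first reply frame from the server

# Usage (with the server from `python main.py` running):
#   reply = asyncio.run(chat("How is the pepperoni pizza?"))
```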
## Development Install

Install the package in development mode:

```bash
pip install -e .
```

Run the web server:

```bash
pizza-assist-server
```
## Build the Distribution Packages

```bash
python -m build
```

This will create both wheel (`.whl`) and source distribution (`.tar.gz`) files in the `dist/` directory.
## Install from a Distribution File

```bash
pip install dist/pizza_assist-0.1.0-py3-none-any.whl
```

## Run in Production

```bash
# Start the web server
pizza-assist-server --host 0.0.0.0 --port 8000
```
## Available Tools

- `place_pizza_order`
  - Required: `pizza_type`, `size`, `quantity`, `delivery_address`
  - Description: Places a pizza order and saves it
- `query_documents`
  - Required: `query`
  - Description: Searches and retrieves content from indexed documents
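These two tools map naturally onto the JSON-schema-style tool definitions used by OpenAI/Ollama-compatible chat APIs. A sketch in that convention — the field names below follow the API convention, not necessarily this repo's actual wiring:

```python
# Sketch of a tool definition in the OpenAI/Ollama function-calling style;
# the repo's actual tool registration may differ.
PLACE_PIZZA_ORDER_TOOL = {
    "type": "function",
    "function": {
        "name": "place_pizza_order",
        "description": "Places a pizza order and saves it",
        "parameters": {
            "type": "object",
            "properties": {
                "pizza_type": {"type": "string"},
                "size": {"type": "string"},
                "quantity": {"type": "integer"},
                "delivery_address": {"type": "string"},
            },
            "required": ["pizza_type", "size", "quantity", "delivery_address"],
        },
    },
}
```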
Logs are written to the `logs/` directory for debugging and tracing interactions.

- Conversation History: stored in `core/data/history/conversation_history.jsonl`
- Vector Store: ChromaDB-powered local vector database with Ollama embeddings
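Because the history is a JSON Lines file (one JSON object per line), it is easy to inspect or extend with a few lines of Python. A minimal sketch — the `role`/`content` fields here are an assumption; the assistant's actual record schema may differ:

```python
import json
from pathlib import Path

def append_message(path: str, role: str, content: str) -> None:
    """Append one record to a JSONL history file (fields are illustrative)."""
    with open(path, "a", encoding="utf-8") as f:
        f.write(json.dumps({"role": role, "content": content}) + "\n")

def load_history(path: str) -> list[dict]:
    """Read all records from a JSONL history file, skipping blank lines."""
    if not Path(path).exists():
        return []
    with open(path, encoding="utf-8") as f:
        return [json.loads(line) for line in f if line.strip()]
```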
MIT License
Made by patrickwide