This repository contains example code and resources for the blog post
“When LLMs Stop Talking and Start Doing: A Guide to LangChain Function Calling with Python and Azure AI Foundry.”
The project demonstrates how to use LangChain with an Azure OpenAI GPT deployment to extend LLMs beyond conversation and into actionable tasks.
- Python integration with LangChain and Azure OpenAI SDK
- Example function calling demos:
  - Fetching the latest news
  - Checking live weather
- Clear pattern to extend into enterprise use cases (e.g., HR portals, timesheet submissions, ERP workflows, IT helpdesk tasks)
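The function-calling pattern behind these demos can be sketched in plain Python before any LangChain wiring: the model is given JSON tool schemas, replies with a tool call (a tool name plus JSON-encoded arguments), and the application dispatches it. The schema and the stubbed weather function below are illustrative, not the repo's actual code:

```python
import json

# Illustrative tool schema in the OpenAI function-calling format.
TOOLS = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get the current weather for a city.",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    },
]

def get_weather(city: str) -> str:
    # Stub: a real implementation would call a weather API.
    return f"Sunny, 22°C in {city}"

# Maps the tool name the model returns to the Python function to run.
DISPATCH = {"get_weather": get_weather}

def run_tool_call(name: str, arguments: str) -> str:
    """Decode the model's JSON arguments and execute the requested tool."""
    args = json.loads(arguments)
    return DISPATCH[name](**args)

print(run_tool_call("get_weather", '{"city": "Paris"}'))
```

Extending to an enterprise use case (timesheets, helpdesk tickets, and so on) means adding a schema, a function, and a `DISPATCH` entry; LangChain automates the schema generation and the dispatch loop.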
Full setup instructions are included in the blog post.
- Create a virtual environment and install requirements (Windows PowerShell):

  ```powershell
  python -m venv .venv; .\.venv\Scripts\Activate.ps1; pip install -r requirements.txt
  ```
- Copy `.env.example` to `.env` and fill in your Azure OpenAI values.
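The variable names below are a guess at what `.env.example` might contain for an Azure OpenAI setup; use the key names in the actual file, as this project's may differ:

```ini
# Hypothetical .env values -- mirror the keys in .env.example
AZURE_OPENAI_ENDPOINT=https://<your-resource>.openai.azure.com/
AZURE_OPENAI_API_KEY=<your-key>
AZURE_OPENAI_DEPLOYMENT=<your-gpt-deployment-name>
AZURE_OPENAI_API_VERSION=<api-version>
```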
- Start the server with uvicorn (from the repo root):

  ```powershell
  uvicorn app.main:app --reload --host 127.0.0.1 --port 8000
  ```
- Health check: `GET http://127.0.0.1:8000/healthz`
- Chat endpoint example:

  ```http
  POST http://127.0.0.1:8000/chat
  Content-Type: application/json

  { "input": "Hello", "session_id": "default" }
  ```
Running in VS Code
- Select the Python interpreter for the project (pick the `.venv` you created) from the bottom-right status bar, or via `Ctrl+Shift+P` -> `Python: Select Interpreter`.
- Open the Run view (left sidebar), choose "Python: Uvicorn (app.main)", and press the green play button. This uses `.vscode/launch.json` and the project's `.env` file.
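If `.vscode/launch.json` is missing or needs adjusting, a debugpy launch configuration for the uvicorn command above generally looks like this sketch (the `envFile` entry is what loads `.env`; the exact contents of the repo's file may differ):

```json
{
  "version": "0.2.0",
  "configurations": [
    {
      "name": "Python: Uvicorn (app.main)",
      "type": "debugpy",
      "request": "launch",
      "module": "uvicorn",
      "args": ["app.main:app", "--reload", "--host", "127.0.0.1", "--port", "8000"],
      "envFile": "${workspaceFolder}/.env"
    }
  ]
}
```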