A demo application showcasing OpenAI Assistants API function call capabilities with ICTLife integration.
```
assistants-function-calling/
├── server/              # Node.js backend (plain JavaScript)
│   ├── server.js        # Express server with API routes
│   └── package.json
├── client/              # React frontend (TypeScript)
│   ├── src/
│   │   ├── components/  # React components
│   │   ├── pages/       # Page components
│   │   ├── utils/       # Utility functions
│   │   ├── hooks/       # Custom React hooks
│   │   ├── App.tsx      # Main app with routing
│   │   └── main.tsx
│   └── package.json
└── package.json         # Root package.json with scripts
```
- Node.js 18+ (for native fetch API support)
- Install all dependencies:
  ```bash
  npm run install-all
  ```
- Start both server and client:
  ```bash
  npm run dev
  ```
  Or start them separately:
  ```bash
  # Terminal 1 - Backend
  npm run server

  # Terminal 2 - Frontend
  npm run client
  ```
- Open the application in your browser (usually http://localhost:3000)
- Enter your API keys on the home page:
  - OpenAI API Key
  - ICTLife API Key
  - ICTLife User ID (numeric)
- Click "Continue" - keys are saved to localStorage and you're redirected to `/orgs`
- Browse organizations (groups) vertically with pagination
- Click an organization to view its assistants at `/org/:uuid/assistants`
- Select an assistant from the sidebar to view details and start chatting
- Chat messages persist in URL params for page reload support
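That URL-param persistence can be handled with React Router's search params. The sketch below is illustrative only; the hook name and the `thread` param name are assumptions, not the app's actual code.

```tsx
// Illustrative sketch: keep the active thread id in the URL so a reload
// restores the conversation. The "thread" param name is an assumption.
import { useSearchParams } from "react-router-dom";

export function useThreadParam() {
  const [params, setParams] = useSearchParams();

  const threadId = params.get("thread") ?? undefined;

  const setThreadId = (id: string) => {
    const next = new URLSearchParams(params);
    next.set("thread", id); // preserve other params (e.g. the selected assistant)
    setParams(next);
  };

  return { threadId, setThreadId };
}
```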
Client routes:

- `/` - Home page with API key form
- `/orgs` - Organizations list with pagination
- `/org/:uuid/assistants` - Assistants page with sidebar and chat
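These routes correspond to a React Router setup roughly like the sketch below; the page component names are illustrative, not necessarily the ones in `client/src/pages/`.

```tsx
// Rough sketch of client/src/App.tsx routing; component names are illustrative.
import { BrowserRouter, Routes, Route } from "react-router-dom";
import Home from "./pages/Home";
import Organizations from "./pages/Organizations";
import Assistants from "./pages/Assistants";

export default function App() {
  return (
    <BrowserRouter>
      <Routes>
        <Route path="/" element={<Home />} />
        <Route path="/orgs" element={<Organizations />} />
        <Route path="/org/:uuid/assistants" element={<Assistants />} />
      </Routes>
    </BrowserRouter>
  );
}
```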
Backend (http://localhost:3001)
All endpoints require API keys in request headers (a minimal request example follows the endpoint list):

- `X-OpenAI-Key` - OpenAI API key
- `X-ICTLife-Key` - ICTLife API key
- `X-ICTLife-User-Id` - ICTLife user ID

Endpoints:

- `GET /api/groups` - Fetch groups from the ICTLife API
- `GET /api/groups/:groupUuid/assistants` - Fetch assistants for a group
- `GET /api/assistants/:assistantId` - Fetch assistant details from OpenAI
- `POST /api/assistants/:assistantId/tools` - Add the `assign_chat_to_agent` function to an assistant (preserves existing tools)
- `GET /api/agents` - Fetch available agents (mock data: agent_id, agent_name, agent_role)
- `POST /api/functions/assign_chat_to_agent` - Execute assign_chat_to_agent (two-step: the first call returns agents, the second runs the assignment with selected_agent)
- `POST /api/threads` - Create a new conversation thread
- `GET /api/threads/:threadId/messages` - Get thread messages
- `POST /api/threads/:threadId/messages` - Add a message to a thread
- `POST /api/threads/:threadId/runs` - Run the assistant on a thread (adds `additional_instructions` when the assistant has `assign_chat_to_agent`)
- `GET /api/threads/:threadId/runs/:runId` - Get run status
- `POST /api/threads/:threadId/runs/:runId/submit-tool-outputs` - Submit tool outputs and poll until completion
- `GET /api/health` - Health check
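For example, a client-side call might look like the sketch below. Only the paths and header names come from the list above; the localStorage key names and request/response body shapes are assumptions.

```ts
// Minimal example of calling the backend with the required headers.
// localStorage key names and body shapes are assumptions for illustration.
const headers = {
  "X-OpenAI-Key": localStorage.getItem("openaiKey") ?? "",
  "X-ICTLife-Key": localStorage.getItem("ictlifeKey") ?? "",
  "X-ICTLife-User-Id": localStorage.getItem("ictlifeUserId") ?? "",
};

// List organizations (groups)
const groups = await fetch("/api/groups", { headers }).then((r) => r.json());

// Create a thread, then add a user message to it
const thread = await fetch("/api/threads", { method: "POST", headers }).then((r) => r.json());
await fetch(`/api/threads/${thread.id}/messages`, {
  method: "POST",
  headers: { ...headers, "Content-Type": "application/json" },
  body: JSON.stringify({ content: "Hello!" }),
});
```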
- ✅ Modern UI with centered form and clean design
- ✅ localStorage persistence for API keys
- ✅ React Router for navigation
- ✅ Vertical organizations list with pagination
- ✅ Sidebar layout for assistants
- ✅ Assistant details display from OpenAI
- ✅ Chat interface with thread management
- ✅ URL params for assistant/thread persistence
- ✅ Toast notifications for errors
- ✅ Error boundary for error handling
- ✅ Loading, empty, and error states
- ✅ Logout functionality
- ✅ Chat Assignment capability: an optional `assign_chat_to_agent` function can be added to an assistant from the details panel (see the sketch after this list). When enabled:
  - The "Enable Chat Assignment capability" button shows an accordion with the function schema and an "Add Function" action.
  - The function uses a two-step flow: (1) the Assistant calls it with user_message → the system returns the list of agents; (2) the Assistant selects the best match and calls it again with selected_agent → the server runs the assignment and returns a confirmation. The Assistant then informs the customer which agent was assigned.
  - Run creation automatically adds `additional_instructions` when the assistant has this function. The frontend handles `requires_action` by calling the function endpoint and submitting tool outputs.
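On the frontend, that `requires_action` handling might look roughly like the sketch below. The run object shape follows the OpenAI Assistants API; the request body shapes for the two backend endpoints are assumptions.

```ts
// Hedged sketch of the two-step tool-call handling; payload shapes are assumptions.
async function handleRequiresAction(run: any, threadId: string, headers: Record<string, string>) {
  const toolOutputs: { tool_call_id: string; output: string }[] = [];

  for (const call of run.required_action.submit_tool_outputs.tool_calls) {
    if (call.function.name !== "assign_chat_to_agent") continue;

    // The first call carries { user_message }; once the model has picked an
    // agent, the second call also carries { selected_agent }.
    const res = await fetch("/api/functions/assign_chat_to_agent", {
      method: "POST",
      headers: { ...headers, "Content-Type": "application/json" },
      body: call.function.arguments, // arguments arrive as a JSON string
    });
    toolOutputs.push({ tool_call_id: call.id, output: JSON.stringify(await res.json()) });
  }

  // The server polls the run until completion after receiving the outputs.
  await fetch(`/api/threads/${threadId}/runs/${run.id}/submit-tool-outputs`, {
    method: "POST",
    headers: { ...headers, "Content-Type": "application/json" },
    body: JSON.stringify({ tool_outputs: toolOutputs }),
  });
}
```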
The app is set up to deploy as a single Vercel project (client and server together).
- Root directory: Use the repo root (`./` or `assistants-function-calling`). Do not choose `client` or `server` alone.
- Build: The root `npm run build` builds the React client (Vite) into `client/dist`.
- Backend: On Vercel, the Express server in `server/` runs as a serverless function via `src/server.js`, which also serves the built client for `/` and SPA routes. API routes stay under `/api/*` (a rough sketch of this setup follows).
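As a rough illustration of that last point, the entry file could look like the sketch below. The real `src/server.js` is plain JavaScript and is not reproduced here; this is a TypeScript sketch of the pattern, not the actual implementation.

```ts
// Sketch: one Express app handles /api/* and serves the built client for SPA routes.
import express from "express";
import path from "path";

const app = express();

// API routes stay under /api/* (the real routes live in server/server.js)
app.get("/api/health", (_req, res) => res.json({ status: "ok" }));

// Serve the built client, falling back to index.html for SPA routes
const dist = path.join(process.cwd(), "client", "dist");
app.use(express.static(dist));
app.use((_req, res) => res.sendFile(path.join(dist, "index.html")));

// Vercel's Node runtime invokes the exported app as a serverless function;
// locally you would call app.listen(3001) instead.
export default app;
```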
In Build and Output Settings:
| Setting | Value |
|---|---|
| Framework Preset | Other |
| Root Directory | `./` (leave as the repo root) |
| Build Command | `npm run build` (or leave the default; the root package.json has a `build` script) |
| Output Directory | `client/dist` (set in `vercel.json`). Vercel expects a build output; the client build produces this. The Express app in `src/server.js` runs as the serverless function and also serves these assets. |
| Install Command | `npm run install-all` (so root, server, and client dependencies are installed) |
If the UI offers "Override" toggles, you can set:

- Install Command: `npm run install-all`
- Build Command: `npm run build`
- Push the repo to GitHub and import it as a new Vercel project.
- Set Root Directory to the repo root (e.g. `assistants-function-calling` if that's the root).
- Set Install Command to `npm run install-all` and Build Command to `npm run build`.
- (Optional) Add any environment variables your server or client needs under Environment Variables (API keys are still sent from the client via headers).
- Deploy. The same URL serves the React app and `/api/*` (e.g. https://your-project.vercel.app and https://your-project.vercel.app/api/health).
- Local: `npm run dev` runs the Vite dev server (port 3000) and the Express server (port 3001), with the proxy configured in `client/vite.config.ts` (see the sketch below).
- Vercel: One serverless function runs the Express app; it serves the built client from `client/dist` and handles `/api/*`. No separate "output directory" is used for the frontend; `vercel.json` sets `outputDirectory` to `client/dist` so the build step passes, and the Express app serves those assets.
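The dev proxy mentioned above would look roughly like this in `client/vite.config.ts`; the actual file may differ.

```ts
// Sketch of client/vite.config.ts: proxy /api/* to the Express server in dev.
import { defineConfig } from "vite";
import react from "@vitejs/plugin-react";

export default defineConfig({
  plugins: [react()],
  server: {
    port: 3000,
    proxy: {
      // Requests to /api/* from the Vite dev server are forwarded to Express on 3001
      "/api": "http://localhost:3001",
    },
  },
});
```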
- API keys are stored in browser localStorage (for demo purposes only)
- Keys are sent in request headers to the backend
- In production, implement proper authentication and secure storage