Ruslando/GameInteractionAgent

GameInteractionAgent Server

FastAPI server for streamed, tag-based NPC turns, driven by a Godot game client over a WebSocket connection.

What This Server Does

  • accepts a WebSocket connection from Godot
  • stores per-agent streamed-turn history
  • normalizes incoming game events
  • requests the latest runtime agent spec from Godot before each turn
  • builds the active-turn prompt for the agent
  • streams XML-tag output from the LLM
  • forwards turn lifecycle and tag stream messages back to Godot
  • accepts the Godot-confirmed agent_turn_result buffer for history/logging
  • writes conversation and LLM-input logs on disconnect
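
The per-agent history store above can be sketched as follows. This is a simplified stand-in for the session state kept in src/session.py; the class and method names here are illustrative assumptions, not the actual API.

```python
from collections import defaultdict


class TurnHistory:
    """Per-agent history of Godot-confirmed turns (illustrative sketch;
    the real state lives in src/session.py)."""

    def __init__(self) -> None:
        self._turns: dict[str, list[str]] = defaultdict(list)

    def append(self, agent_id: str, turn_text: str) -> None:
        # Called when Godot returns a confirmed agent_turn_result buffer.
        self._turns[agent_id].append(turn_text)

    def for_prompt(self, agent_id: str) -> list[str]:
        # History included when building the agent's next active-turn prompt.
        return list(self._turns[agent_id])
```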

Runtime Flow

  1. Godot connects to /ws.
  2. Godot sends interaction_event when an agent should react.
  3. The server requests get_agent_state to fetch the latest prompt parts.
  4. The server starts one streamed LLM turn for that event.
  5. The server sends, in order:
       • agent_turn_started
       • agent_tag_open, agent_tag_chunk, and agent_tag_close for each streamed tag
       • agent_turn_interrupted when a newer event cancels the stream, or
       • agent_turn_finished when the streamed turn ends normally
  6. Godot uses those explicit turn lifecycle messages to manage the local turn buffer.
  7. Godot assembles and returns agent_turn_result.
  8. The server appends that confirmed AI turn to history for later prompts and logs.
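
The lifecycle ordering in steps 5–8 can be sketched as a generator that emits one turn's messages. The message types come from the list above; the payload fields, the (tag, text) input shape, and the single-chunk-per-tag simplification are assumptions for illustration.

```python
from typing import Iterable, Iterator


def turn_messages(
    tags: Iterable[tuple[str, str]],
    interrupted: bool = False,
) -> Iterator[dict]:
    """Yield the lifecycle messages for one streamed turn.

    `tags` stands in for the XML tags parsed out of the LLM stream; a real
    stream would emit many agent_tag_chunk messages per tag.
    """
    yield {"type": "agent_turn_started"}
    for name, text in tags:
        yield {"type": "agent_tag_open", "tag": name}
        yield {"type": "agent_tag_chunk", "tag": name, "text": text}
        yield {"type": "agent_tag_close", "tag": name}
    # A newer event cancels the stream; otherwise the turn ends normally.
    yield {"type": "agent_turn_interrupted" if interrupted else "agent_turn_finished"}
```

Godot watches for the final agent_turn_interrupted or agent_turn_finished message to decide whether to commit or discard its local turn buffer.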

Main Files

  • src/server.py: WebSocket entrypoint, runtime message routing, and active-turn orchestration.
  • src/active_stream_controller.py: server-side streamed-turn controller that runs the LLM and parses XML tags.
  • src/agent_spec.py: runtime agent spec model plus character dossier builder.
  • src/session.py: per-agent session state for the active runtime.
  • src/protocol.py: payload normalization for agent state and incoming events.
  • src/conversation_log.py: conversation and LLM-input logging.
  • src/config.py: LLM configuration and model-factory helpers.
  • src/logging_utils.py: structured server-logging helpers.

WebSocket Messages

Client -> Server

  • ping
  • interaction_event
  • notification_event
  • agent_state_snapshot
  • agent_turn_result
  • tool_results

Server -> Client

  • pong
  • get_agent_state
  • agent_turn_started
  • agent_turn_interrupted
  • agent_turn_finished
  • agent_tag_open
  • agent_tag_chunk
  • agent_tag_close
  • error
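
A minimal routing check for client-to-server frames might look like the sketch below. The message types are taken from the list above; the assumption that each frame is a JSON object with a top-level "type" field is illustrative — the server's actual schemas live in src/protocol.py.

```python
import json

# Client -> server message types the server routes (from the list above).
CLIENT_MESSAGES = {
    "ping",
    "interaction_event",
    "notification_event",
    "agent_state_snapshot",
    "agent_turn_result",
    "tool_results",
}


def route(raw: str) -> str:
    """Return the message type for a raw client frame.

    Raises ValueError for types the server does not handle, which would
    map to an `error` message back to the client.
    """
    msg = json.loads(raw)
    kind = msg.get("type")
    if kind not in CLIENT_MESSAGES:
        raise ValueError(f"unknown client message: {kind!r}")
    return kind
```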

Logging

Structured server logs are emitted through src/logging_utils.py.

  • default log level: INFO
  • override with: GAME_AGENT_LOG_LEVEL=DEBUG

Conversation logs are written to logs/ on disconnect.
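
For example, to run the server with debug logging enabled (combining the override above with the start command below):

```shell
GAME_AGENT_LOG_LEVEL=DEBUG uvicorn src.server:app --host 0.0.0.0 --port 8000
```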

Requirements

  • Python 3.11+
  • OpenAI-compatible chat endpoint

Setup

python -m venv venv
source venv/bin/activate
pip install -r requirements.txt

Example environment:

OPENAI_API_KEY=your-api-key
LLM_MODEL=anthropic/claude-3.5-sonnet
OPENAI_API_BASE=https://openrouter.ai/api/v1
ACTIVE_STREAM_LLM_MODEL=

Start

uvicorn src.server:app --host 0.0.0.0 --port 8000
