A computer-use agent you control via Telegram. Give it tasks, approve commands, and let it work on your machine.
- Computer Control: Execute bash commands, take screenshots, manage files — all from Telegram
- Visual Understanding: Bot can view and analyze screenshots to understand what's on your screen
- Multi-LLM Support: Powered by Google Gemini, OpenAI, Groq, or DeepSeek
- Safe Execution: All bash commands require your approval via inline buttons
- MCP Integration: Extend capabilities with Model Context Protocol servers
- Persistent Memory: Remembers conversation context across sessions
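The persistent-memory feature can be pictured as a small SQLite-backed store. The sketch below is purely illustrative: the class name, schema, and methods are hypothetical and may differ from the project's actual `memory/sqlite.py` interface.

```python
import sqlite3


class SqliteMemory:
    """Hypothetical sketch of a persistent conversation store."""

    def __init__(self, path: str = ":memory:"):
        # A file path (e.g. "eliza.db") survives restarts; ":memory:" does not
        self.conn = sqlite3.connect(path)
        self.conn.execute(
            "CREATE TABLE IF NOT EXISTS messages (role TEXT, content TEXT)"
        )

    def add(self, role: str, content: str) -> None:
        self.conn.execute(
            "INSERT INTO messages VALUES (?, ?)", (role, content)
        )
        self.conn.commit()

    def history(self) -> list[tuple[str, str]]:
        # Rows come back in insertion order for this simple schema
        return list(self.conn.execute("SELECT role, content FROM messages"))
```

With a real file path, each bot session reopens the same database and continues from the stored history.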
| Tool | Description |
|---|---|
| bash | Execute shell commands with user approval via inline buttons |
| web_search | Search the web using DuckDuckGo |
| send_file | Send files/images to the user via Telegram |
| view_image | Load and analyze images from disk (screenshots, etc.) |
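The approval gate on the `bash` tool can be thought of as a callback that must consent before anything executes. The sketch below is a simplified stand-in, not the project's actual API: in the real bot the callback is answered by a Telegram inline button, while here it is any callable taking the command and returning a bool.

```python
import subprocess


def run_with_approval(command: str, approve) -> str:
    """Run a shell command only if the approval callback consents."""
    if not approve(command):
        return "Command rejected by user."
    result = subprocess.run(
        command, shell=True, capture_output=True, text=True, timeout=60
    )
    return result.stdout + result.stderr


# Auto-approve for demonstration; a real gate would wait for a button press
print(run_with_approval("echo hello", lambda cmd: True))
```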
Requires Python 3.13+ and uv.
```bash
# Clone the repository
git clone https://github.com/bnovik0v/eliza.git
cd eliza

# Install dependencies
uv sync

# Copy and configure environment
cp .env.example .env
# Edit .env with your API keys
```

Create a `.env` file with the following variables:
```bash
# Required
TELEGRAM_TOKEN=your_telegram_bot_token
OWNER_USER_ID=your_telegram_user_id

# LLM Provider (choose one or more)
GOOGLE_API_KEY=your_google_api_key
OPENAI_API_KEY=your_openai_api_key
GROQ_API_KEY=your_groq_api_key
DEEPSEEK_API_KEY=your_deepseek_api_key

# Optional
DEFAULT_PROVIDER=google  # google, openai, groq, deepseek
GOOGLE_MODEL=gemini-3-flash-preview
```

Run the bot:

```bash
uv run eliza
```

Bot commands:

- `/start` - Show bot info and available tools
- `/clear` - Clear conversation history
- `/provider <name>` - Switch LLM provider (google, openai, groq, deepseek)
- `/tools` - List available MCP tools
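Handling a command like `/provider <name>` mostly comes down to parsing and validating the argument. A minimal sketch, with a hypothetical function name that is not taken from the project's source:

```python
SUPPORTED_PROVIDERS = {"google", "openai", "groq", "deepseek"}


def parse_provider_command(text: str) -> str:
    """Parse '/provider <name>' and return the validated provider name."""
    parts = text.split()
    if len(parts) != 2 or parts[0] != "/provider":
        raise ValueError("usage: /provider <name>")
    name = parts[1].lower()
    if name not in SUPPORTED_PROVIDERS:
        raise ValueError(f"unknown provider: {name}")
    return name
```

A handler would call this on the incoming message text and report the `ValueError` message back to the chat on bad input.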
- See your screen: "Take a screenshot and tell me what's happening"
- Web search: "What's the current Bitcoin price?"
- System tasks: "Check disk usage and clean up temp files"
- Development: "Run the tests and fix any failures"
- File management: "Find large files in my Downloads folder"
Configure MCP servers in `mcp_servers.json`:

```json
{
  "servers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@anthropic/mcp-filesystem", "/home/user"]
    }
  }
}
```

Project layout:

```
src/eliza/
├── bot/
│   ├── telegram_bot.py   # Main bot implementation
│   ├── bash_tool.py      # Bash execution with approval
│   └── security.py       # Owner-only decorator
├── llm/
│   ├── base.py           # Abstract LLM provider
│   ├── google.py         # Google Gemini provider
│   ├── openai.py         # OpenAI provider
│   ├── groq.py           # Groq provider
│   ├── deepseek.py       # DeepSeek provider
│   └── factory.py        # Provider factory
├── mcp/
│   ├── client.py         # MCP client manager
│   └── config.py         # MCP configuration
├── memory/
│   ├── base.py           # Memory interface
│   └── sqlite.py         # SQLite implementation
├── config.py             # Settings management
└── main.py               # Entry point
```
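The split between `llm/base.py` and `llm/factory.py` suggests the usual abstract-provider-plus-factory pattern: one interface, one concrete class per vendor, and a lookup from provider name to class. A minimal sketch under that assumption (class and function names are illustrative, not the project's actual code):

```python
from abc import ABC, abstractmethod


class LLMProvider(ABC):
    """Stand-in for the abstract provider in llm/base.py."""

    @abstractmethod
    def complete(self, prompt: str) -> str: ...


class GoogleProvider(LLMProvider):
    def complete(self, prompt: str) -> str:
        return f"[gemini] {prompt}"  # real class would call the Gemini API


class OpenAIProvider(LLMProvider):
    def complete(self, prompt: str) -> str:
        return f"[openai] {prompt}"  # real class would call the OpenAI API


_PROVIDERS = {"google": GoogleProvider, "openai": OpenAIProvider}


def create_provider(name: str) -> LLMProvider:
    """Map a provider name (as used by /provider) to an instance."""
    try:
        return _PROVIDERS[name]()
    except KeyError:
        raise ValueError(f"unknown provider: {name}") from None
```

This keeps the bot code provider-agnostic: `/provider <name>` only has to swap the object returned by the factory.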
MIT