Download the GLM-5.1 Lightweight Installer. Run the powerful 744B open-source AI locally on your GPU via llama.cpp or use our free Cloud-Hybrid API. Zero setup. - GLM-5-1/GLM-5.1
Updated Apr 18, 2026 · Python
Kairo: Securely orchestrate multiple AI providers via Anthropic. Hardened with X25519 (age) encryption, integrated audit trails, and a terminal-native interface.
Sidecar plugin that pipes Z.AI's GLM-5.1 / GLM-4.5-Air models into Claude Code for token-efficient code generation, review, and design consultation. Per-mode model + sampling tuning, machine-readable XML envelope, hardened cancellation.
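The "machine-readable XML envelope" mentioned above refers to wrapping model replies in structured XML so the host tool can parse them reliably. A minimal sketch of the idea, assuming a hypothetical schema (`<envelope>`, `<mode>`, `<content>` are illustrative tag names, not the plugin's actual format):

```python
# Hypothetical sketch of a machine-readable XML envelope for model replies.
# Tag names are assumptions for illustration, not the plugin's real schema.
import xml.etree.ElementTree as ET

def wrap(mode: str, content: str) -> str:
    """Serialize a reply into an XML envelope string."""
    root = ET.Element("envelope")
    ET.SubElement(root, "mode").text = mode
    ET.SubElement(root, "content").text = content
    return ET.tostring(root, encoding="unicode")

def unwrap(xml_text: str) -> dict:
    """Parse an envelope back into a dict; raises ParseError on malformed XML."""
    root = ET.fromstring(xml_text)
    return {"mode": root.findtext("mode"),
            "content": root.findtext("content")}

envelope = wrap("review", "Consider extracting this loop into a helper.")
print(unwrap(envelope)["mode"])  # → review
```

An envelope like this lets the consuming tool distinguish a code-review reply from a design-consultation reply without fragile text matching.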