# llm-d
Here are 10 public repositories matching this topic...
Accelerate reproducible inference experiments for large language models with LLM-D! This lab automates the setup of a complete evaluation environment on OpenShift/OKD: GPU worker pools, core operators, observability, traffic control, and ready-to-run example workloads.
Updated Dec 23, 2025 - Python
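To make the kind of "evaluation environment readiness" check described above more concrete, here is a minimal Python sketch using the official `kubernetes` client. It is an illustrative assumption of what such a pre-flight check could look like, not code from the repository; the node label selector, namespace, and Prometheus pod selector are hypothetical placeholders.

```python
# Minimal sketch: check that an OpenShift/OKD cluster looks ready for LLM
# inference experiments. Label selectors, namespaces, and component names
# below are illustrative assumptions, not values taken from the repository.
from kubernetes import client, config


def gpu_nodes_ready(api: client.CoreV1Api,
                    selector: str = "node-role.kubernetes.io/gpu") -> bool:
    """Return True if at least one schedulable node advertises NVIDIA GPUs."""
    nodes = api.list_node(label_selector=selector).items
    return any(
        int(n.status.allocatable.get("nvidia.com/gpu", "0")) > 0
        and not n.spec.unschedulable
        for n in nodes
    )


def pods_running(api: client.CoreV1Api, namespace: str, selector: str) -> bool:
    """Return True if every pod matched by the selector is in the Running phase."""
    pods = api.list_namespaced_pod(namespace, label_selector=selector).items
    return bool(pods) and all(p.status.phase == "Running" for p in pods)


if __name__ == "__main__":
    config.load_kube_config()  # or config.load_incluster_config() inside the cluster
    core = client.CoreV1Api()

    checks = {
        "GPU worker pool": gpu_nodes_ready(core),
        # Hypothetical observability stack assumed to live in openshift-monitoring.
        "observability": pods_running(
            core, "openshift-monitoring", "app.kubernetes.io/name=prometheus"
        ),
    }
    for name, ok in checks.items():
        print(f"{name}: {'ready' if ok else 'NOT ready'}")
```

A script along these lines could run before each experiment so that results are only collected on a cluster whose GPU pool and observability components are actually up, which is what makes the experiments reproducible.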
Open-source software for PCIe card-based hardware AI accelerators, covering both inference and training use cases
cuda distro k8s visualize pcie mpi4py exo k3s kestra cxl runai cxl-mem photonics-computing mpio llamacpp vllm llm-d paxos-cluster opentelemetry-ebpf-profiler onnxoptimizer
Updated Jan 5, 2026 - Python