

Repositories

Showing 3 of 3 repositories
  • exllamav3 (Public)

    An optimized quantization and inference library for running LLMs locally on modern consumer-class GPUs

    Python · 652 stars · 69 forks · 59 open issues · 8 open pull requests · MIT license · Updated Mar 7, 2026
  • exllamav2 (Public)

    A fast inference library for running LLMs locally on modern consumer-class GPUs

    Python · 4,453 stars · 327 forks · 136 open issues · 22 open pull requests · MIT license · Updated Mar 4, 2026
  • exui (Public)

    Web UI for ExLlamaV2

    JavaScript · 511 stars · 46 forks · 34 open issues · 3 open pull requests · MIT license · Updated Feb 5, 2025
