Repositories

Showing 3 of 3 repositories

  1. exllamav2 (Public)

     A fast inference library for running LLMs locally on modern consumer-class GPUs

     Python · 4,341 stars · 323 forks · MIT license · Updated Aug 16, 2025

  2. exllamav3 (Public)

     An optimized quantization and inference library for running LLMs locally on modern consumer-class GPUs

     Python · 531 stars · 47 forks · MIT license · Updated Oct 13, 2025

  3. exui (Public)

     Web UI for ExLlamaV2

     JavaScript · 510 stars · 47 forks · MIT license · Updated Feb 5, 2025
