Compatibility with CUDA 12.8 and PyTorch nightly? #1011

Answered by yzh119
RodriMora asked this question in Q&A

Hi @RodriMora, the JIT version of flashinfer does not depend on any particular CUDA/PyTorch version: https://pypi.org/project/flashinfer-python/
You should be able to use it out of the box with CUDA 12.8 + torch 2.8.
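A minimal install sketch based on the PyPI link above (the exact `python -c` check is illustrative, assuming the package exposes a top-level `flashinfer` module):

```shell
# Install the version-agnostic JIT wheel from PyPI;
# kernels are compiled at first use against whatever CUDA/torch is present.
pip install flashinfer-python

# Sanity check: confirm the module imports under the local torch install.
python -c "import flashinfer; print(flashinfer.__version__)"
```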

Once PyTorch makes the final release of torch 2.8 + cu128, we will quickly follow up and release a corresponding AOT version.

Answer selected by RodriMora