-
Hi! It seems like there is only compatibility up to CUDA 12.6 here: https://flashinfer.ai/whl/ Is there any plan for compatibility with CUDA 12.8 and the PyTorch nightly? (I believe that is 2.8.0.) Thanks for all the hard work!
-
Hi @RodriMora, the JIT version of flashinfer should not rely on any specific CUDA/torch version: https://pypi.org/project/flashinfer-python/
You should be able to use it out of the box with CUDA 12.8 + torch 2.8.
Once PyTorch makes the final release of torch 2.8 + cu128, we will quickly follow up and release a corresponding AOT version.
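For reference, a minimal sketch of checking that the JIT package works against a locally installed CUDA/torch stack, assuming it was installed with `pip install flashinfer-python` (the torch version attributes are standard; `flashinfer.__version__` is an assumption here, so it is read defensively):

```python
# Sketch: confirm the JIT build of flashinfer-python loads against
# whatever CUDA/torch combination is installed locally.
# Assumed install step, per the PyPI link above:
#   pip install flashinfer-python

import torch
import flashinfer  # JIT package: kernels are compiled on first use

print("torch:", torch.__version__)                 # e.g. 2.8.0
print("CUDA (torch build):", torch.version.cuda)   # e.g. 12.8
# __version__ may not be exposed by every build, hence the getattr fallback
print("flashinfer:", getattr(flashinfer, "__version__", "unknown"))
```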