Question: how to use with vLLM? #62

Open
test3211234 opened this issue Jan 10, 2025 · 1 comment
Labels: question (Further information is requested)

Comments

@test3211234

When I try to pip install vllm on Windows, I get an error saying vLLM doesn't support ROCm on Windows. I think I could use WSL, but I don't know how, and I also don't understand the distinction between ZLUDA and ROCm or how to use ZLUDA.

@lshqqytiger
Owner

You'll install the vllm package built for CUDA 11.8. Then, if needed, replace the DLL files inside vllm (or torch? I don't know).
However, if vLLM depends on libraries that are not supported/implemented, such as cuDNN, and it does not allow disabling them, it won't work.
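For reference, a minimal sketch of the DLL patching described above, assuming a CUDA 11.8 torch wheel and an extracted ZLUDA release. The ZLUDA install path, the file names, and the ZLUDA-to-CUDA name mapping below are assumptions based on typical ZLUDA setups, not something confirmed in this thread; verify them against your own ZLUDA release.

```python
# Hypothetical sketch: overwrite the CUDA DLLs bundled with a cu118 torch
# wheel using ZLUDA's replacements. Paths and the name mapping are assumptions.
import shutil
from pathlib import Path

import torch

ZLUDA_DIR = Path(r"C:\zluda")  # assumed location of an extracted ZLUDA release

# Assumed mapping from ZLUDA's DLL names to the CUDA 11.8 names torch loads;
# check the file names shipped in your ZLUDA release before running this.
DLL_MAP = {
    "cublas.dll": "cublas64_11.dll",
    "cusparse.dll": "cusparse64_11.dll",
    "nvrtc.dll": "nvrtc64_112_0.dll",
}

torch_lib = Path(torch.__file__).parent / "lib"  # torch's bundled CUDA runtime

for src_name, dst_name in DLL_MAP.items():
    src = ZLUDA_DIR / src_name
    dst = torch_lib / dst_name
    if src.exists():
        shutil.copy2(src, dst)  # overwrite torch's CUDA DLL with ZLUDA's
        print(f"patched {dst}")
    else:
        print(f"missing {src}, skipped")
```

The same copy step would apply to any DLLs bundled inside the vllm package itself, if it ships its own; and as noted above, none of this helps if vLLM hard-requires a library ZLUDA doesn't implement, such as cuDNN.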

@lshqqytiger added the question label on Jan 10, 2025