I ran the inference model on a single RTX 3090 with 24 GB of available VRAM. The model reportedly needs about 23.5 GB, so it should fit, but I still get an out-of-memory error. Does this inference model support multi-GPU (model-parallel) execution?
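In case it helps frame the question, below is a minimal sketch of how a model could be sharded across two GPUs using Hugging Face `transformers` with `accelerate` (`device_map="auto"`). This assumes the checkpoint is loadable via `AutoModelForCausalLM` and that the path `path/to/model` is a placeholder; I don't know whether this repo's loader exposes an equivalent option.

```python
# Hypothetical sketch: shard an inference model across all visible GPUs.
# Assumes a transformers-compatible checkpoint and that `accelerate` is installed.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_path = "path/to/model"  # placeholder, not the actual checkpoint name

tokenizer = AutoTokenizer.from_pretrained(model_path)

# device_map="auto" lets accelerate split layers across GPUs, so two 24 GB
# cards can hold a model that does not fit on one. max_memory leaves some
# headroom per card for activations and the CUDA context.
model = AutoModelForCausalLM.from_pretrained(
    model_path,
    torch_dtype=torch.float16,            # half precision to reduce memory
    device_map="auto",
    max_memory={0: "22GiB", 1: "22GiB"},  # adjust to the cards actually present
)

prompt = "Hello"
inputs = tokenizer(prompt, return_tensors="pt").to("cuda:0")
with torch.no_grad():
    output = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

If the project uses its own loading code rather than `transformers`, the equivalent question is whether it exposes any option to place different layers on different devices.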