I am trying to run llamafile on Windows. llamafile uses both GPUs and limits the usable VRAM to that of the weaker one. Is there a way to manually select which GPU to run on? I get incompatibility errors when it tries to use both GPUs.
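One possible workaround (a sketch, not official llamafile guidance): the underlying GPU runtimes honor device-visibility environment variables — `CUDA_VISIBLE_DEVICES` for NVIDIA and `HIP_VISIBLE_DEVICES` for AMD ROCm — so you can hide the weaker card before launching, and llamafile also accepts a `--gpu` backend selector (e.g. `NVIDIA`, `AMD`, `DISABLE`). The llamafile filename below is a placeholder:

```shell
# Expose only GPU index 0 to the CUDA runtime (use HIP_VISIBLE_DEVICES for AMD).
# Windows cmd:       set CUDA_VISIBLE_DEVICES=0
# Windows PowerShell: $env:CUDA_VISIBLE_DEVICES = "0"
export CUDA_VISIBLE_DEVICES=0

# Then launch with all layers offloaded to the (single) visible GPU.
# Placeholder filename; --gpu NVIDIA pins the backend choice.
# ./model.llamafile -ngl 999 --gpu NVIDIA

echo "$CUDA_VISIBLE_DEVICES"
```

Check which index corresponds to which card first (e.g. with `nvidia-smi` on NVIDIA); the indices the driver reports are the ones these variables filter on.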
TheAmpPlayer changed the title from "Running llamafile on 2 GPU's" to "Running llamafile on 2 GPU's instead of 1" on Apr 28, 2024.