How to get lm-studio to work on 9070 xt??
Posted by Mobile-Site-800@reddit | LocalLLaMA | 4 comments
It keeps saying it can't detect a GPU with CUDA cores. I expected this after switching to AMD, but I thought there was a way around it. I have yet to find one.
Fearless-Lime-5384@reddit
Use Vulkan.
Which OS are you using?
Okay, Vulkan should be fine on both Windows and Linux.
Shap6@reddit
https://www.amd.com/en/blogs/2024/how-to-run-a-large-language-model-llm-on-your-am.html
Mobile-Site-800@reddit (OP)
It seems only a small number of models are supported. Does this mean Gemma won't work?
HopePupal@reddit
Find the runtime section of the app's preferences and switch the GGUF runtime to ROCm or Vulkan.
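Before switching the runtime, it can help to confirm the OS actually exposes the card through Vulkan. A minimal sketch (not from the thread; assumes the `vulkan-tools` package, which provides `vulkaninfo`, is or can be installed):

```shell
# Check whether a Vulkan driver can see the GPU.
# On Linux with an RX 9070 XT this typically goes through the Mesa RADV driver.
if command -v vulkaninfo >/dev/null 2>&1; then
    # Print the device names the Vulkan loader reports.
    vulkaninfo --summary | grep -i deviceName
else
    echo "vulkaninfo not found: install the vulkan-tools package first"
fi
```

If the card shows up here but LM Studio still falls back to CPU, the runtime selection in the app's preferences is the next thing to check.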