Ollama now supports AMD graphics cards (ollama.com)
Posted by turkishdelight@lemmy.ml to LocalLLaMA@sh.itjust.works · English · 1 year ago
But in all fairness, it’s really llama.cpp that supports AMD. Now looking forward to the Vulkan support!
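For anyone who wants to confirm the AMD path end to end, here is a minimal sketch that queries a local Ollama server over its REST API. It assumes the default port 11434 and an already-pulled model (called llama2 here as a placeholder); the client side looks the same whether the backend is running on ROCm, CUDA, or plain CPU.

```python
# Minimal sketch: send one prompt to a local Ollama server via its REST API.
# Assumes Ollama is running on the default port 11434 and that a model
# named "llama2" has already been pulled with `ollama pull`.
import json
import urllib.request

payload = json.dumps({
    "model": "llama2",                      # placeholder; use whichever model you pulled
    "prompt": "Say hello from an AMD GPU.",
    "stream": False,                        # return one JSON object instead of a stream
}).encode("utf-8")

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=payload,
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    body = json.load(resp)

# With stream disabled, the generated text comes back in the "response" field.
print(body["response"])
```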
Alex@lemmy.ml · 1 year ago
I was sadly stymied by the fact that the ROCm driver install is very much x86-only.
turkishdelight@lemmy.ml (OP) · 1 year ago
It's improving very fast. Give it a little time.