Howdy!

(moved this comment from the noob question thread because it got no replies)

I’m not a total noob when it comes to general compute and AI. I’ve been using online models for some time, but I’ve never tried to run one locally.

I’m thinking about buying a new computer for gaming and for running/testing/developing LLMs (not training, only inference and in-context learning). My understanding is that ROCm is becoming decent (and I also hate Nvidia), so I’m thinking a Radeon RX 7900 XTX might be a good start. If I buy the right motherboard, I should be able to add a second XTX later, if I use watercooling.

So first, what do you think about this? Are the 24 gigs of VRAM worth the extra bucks? Or should I just go for a mid-range GPU like the Arc B580?

I’m also curious about experimenting with a no-GPU setup, i.e. CPU + lots of RAM. What kind of models do you think I’d be able to run, with decent performance, on something like a Ryzen 7 9800X3D and 128/256 GB of DDR5? How does that compare to the Radeon RX 7900 XTX? Is it possible to use both CPU and GPU when running inference with a single model, or is it either/or?
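Edit: from what I can tell, partial offload is a standard feature in runners like llama.cpp: you pick how many transformer layers go to the GPU, and the remaining layers run on the CPU out of system RAM. A rough sketch (the model filename is just a placeholder, and this assumes a ROCm/HIP build of llama.cpp):

```shell
# Offload 32 of the model's layers to the GPU;
# whatever doesn't fit in VRAM stays on the CPU in system RAM.
llama-cli -m ./model-q4_k_m.gguf \
  --n-gpu-layers 32 \
  -c 8192 \
  -p "Hello"
```

So apparently it’s not either/or; you tune the layer split to whatever fits in VRAM.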

Also… wouldn’t it be better if noobs posted questions in the main thread? They’d probably reach more people that way. It’s not like there’s a ton of activity here…

  • will_a113@lemmy.ml · 16 days ago

    24GB VRAM will easily let you run medium-sized models with good context length, and if you’re a gamer the XTX is a beast for raster performance and has good price/performance.

    If you want to get serious about LLMs, also keep in mind that most models and tools scale well across multiple GPUs, so you might buy one today (even a lesser one with “only” 16 or 12GB) and add another later. Just make sure your motherboard can fit two, and that your CPU, RAM and power supply can handle it.
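    In llama.cpp, for example, the multi-GPU part is usually just a flag. A rough sketch, assuming a ROCm build and two cards (the model filename is a placeholder):

    ```shell
    # Offload as many layers as possible (999 = effectively "all"),
    # splitting the model's tensors evenly across GPU 0 and GPU 1.
    llama-cli -m ./big-model-q5_k_m.gguf \
      --n-gpu-layers 999 \
      --tensor-split 1,1
    ```

    The ratio in --tensor-split doesn’t have to be even, which is handy if the two cards have different amounts of VRAM.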

    Here’s a good example from a guy who glued two much more modest cards together with decent results: https://adamniederer.com/blog/rocm-cross-arch.html