this post was submitted on 06 Sep 2023
26 points (93.3% liked)

LocalLLaMA


Community to discuss LLaMA, the large language model created by Meta AI.

This is intended to be a replacement for r/LocalLLaMA on Reddit.

How usable are AMD GPUs? (lemmy.dbzer0.com)
submitted 1 year ago* (last edited 1 year ago) by [email protected] to c/[email protected]
 

Heyho, I'm currently on an RTX 3070 but want to upgrade to an RX 7900 XT.

I see that AMD installers exist, but is it all smooth sailing? How well do AMD cards compare to Nvidia in terms of performance?

I'd mainly use oobabooga but would also love to try some other backends.

Anyone here with one of the newer AMD cards that could talk about their experience?

EDIT: To clear things up a little bit: I am on Linux, and I'd say I'm quite experienced with it. I know how to handle a card swap and I know where to get my drivers from. I'm aware of the gaming performance difference between Nvidia and AMD; those are the main reasons I want to switch to AMD. Now I just want to hear from someone who ALSO runs Linux + AMD what their experience with Oobabooga and Automatic1111 is when using ROCm, for example.

[–] [email protected] 3 points 1 year ago (2 children)

If you don't need CUDA or AI, the 7900 is great.

[–] [email protected] 3 points 1 year ago (1 children)

You can run CUDA apps on ROCm via HIP. It's easy.
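
For the PyTorch-based tools discussed here, a minimal sketch of what that looks like (assuming a ROCm build of PyTorch, e.g. installed from PyTorch's ROCm wheel index; the exact ROCm version varies): CUDA-targeted code keeps working because the HIP device is exposed through the usual torch.cuda API.

```python
# Minimal sketch: on a ROCm build of PyTorch, the AMD GPU is exposed
# through the regular torch.cuda API, so CUDA-targeted code runs as-is.
import torch

print(torch.version.hip)           # set on ROCm builds, None otherwise
print(torch.cuda.is_available())   # True if a supported AMD GPU is visible

x = torch.randn(1024, 1024, device="cuda")  # "cuda" here maps to the HIP device
print((x @ x).device)                       # cuda:0
```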

[–] [email protected] 2 points 1 year ago

Whoa, new to me. I'll have to dig into that.

[–] [email protected] 2 points 1 year ago (1 children)

Well that's the question...

What do you mean by "not needing AI"? I mean, Oobabooga and Stable Diffusion have AMD installers, and that's exactly what I am asking about. That's why I posted in this community...

To find out how well those AIs run on AMD.

[–] [email protected] 3 points 1 year ago (1 children)

Oops. I wasn't looking at the community, just my main feed. OK, so from what I understand, the AMD installer is a bit of a pain on Linux. If you're on Windows it's probably a different story.

[–] [email protected] 1 points 1 year ago (1 children)

I am on Linux, but I can live with a painful install. I wanted to hear whether it performs on par with Nvidia.

[–] [email protected] 1 points 1 year ago* (last edited 1 year ago) (2 children)

Again, apologies for the confusion. I had thought my initial comment was in a gaming community. Here are Puget Systems' benchmarks, and they don't look great: https://www.pugetsystems.com/labs/articles/stable-diffusion-performance-nvidia-geforce-vs-amd-radeon/#Automatic_1111

"Although this is our first look at Stable Diffusion performance, what is most striking is the disparity in performance between various implementations of Stable Diffusion: up to 11 times the iterations per second for some GPUs. NVIDIA offered the highest performance on Automatic 1111, while AMD had the best results on SHARK, and the highest-end GPU on their respective implementations had relatively similar performance."

[–] [email protected] 2 points 1 year ago (2 children)

Sorry, not trying to come at you, I'm just trying to provide a bit of fact-checking. In that link they tested on Windows, which would have to be using DirectML, which is super slow. Did Linus Tech Tips do this too? Anyway, the cool kids use ROCm on Linux. Much, much faster.
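
If you do go the ROCm route, here is a rough sanity-check sketch (assuming a ROCm build of PyTorch from the official ROCm wheel index; the ROCm version, matrix sizes, and iteration count are just illustrative) to confirm the AMD GPU is actually being used and to get a crude iterations-per-second number:

```python
# Crude sanity check: confirm PyTorch is a ROCm (HIP) build and the AMD GPU
# is actually doing the work, then time a simple matmul loop for a rough it/s.
import time
import torch

assert torch.cuda.is_available(), "no GPU visible to PyTorch"
print("HIP version:", torch.version.hip)         # None would mean a non-ROCm build
print("Device:", torch.cuda.get_device_name(0))  # e.g. the RX 7900 XT

x = torch.randn(4096, 4096, device="cuda")       # "cuda" maps to the HIP device
torch.cuda.synchronize()

iters = 50
start = time.time()
for _ in range(iters):
    y = x @ x
torch.cuda.synchronize()
print(f"{iters / (time.time() - start):.1f} it/s")
```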

[–] [email protected] 1 points 1 year ago

Haha, you're not; I definitely stumbled into this. These guys mainly build edit systems for post-production companies, so they stick to Windows. Good to know about ROCm, got something to read up on.

[–] [email protected] 1 points 1 year ago

Yeah, that's what I was worried about after reading the article; I've heard about the different backends...

Do you have AMD + Linux + Automatic1111 / Oobabooga? Can you give me some real-life feedback? :D

[–] [email protected] 1 points 1 year ago

No worries

Interesting article. I'd never heard of SHARK; seems interesting, then.