this post was submitted on 06 Sep 2023
26 points (93.3% liked)
LocalLLaMA
2249 readers
Community to discuss about LLaMA, the large language model created by Meta AI.
This is intended to be a replacement for r/LocalLLaMA on Reddit.
founded 1 year ago
If you don't need CUDA or AI, the 7900 is great.
You can run CUDA apps on ROCm HIP. It’s easy.
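To illustrate why that works: HIP's runtime API mirrors CUDA's almost name-for-name, which is why ROCm's hipify tools (hipify-perl, hipify-clang) can port most CUDA source mechanically. Here is a toy sketch of the idea — the real tools handle far more than simple renames, and the mapping table below covers only a few illustrative calls:

```python
# Toy illustration of what ROCm's hipify tools do: source-to-source
# renaming of CUDA API calls to their HIP equivalents. The real
# hipify-perl / hipify-clang handle many more cases (kernel launch
# syntax, headers, types); this only shows the 1:1 naming idea.
CUDA_TO_HIP = {
    "cudaMalloc": "hipMalloc",
    "cudaMemcpy": "hipMemcpy",
    "cudaFree": "hipFree",
    "cudaDeviceSynchronize": "hipDeviceSynchronize",
}

def hipify(source: str) -> str:
    """Rename the CUDA API calls in `source` to their HIP counterparts."""
    for cuda_name, hip_name in CUDA_TO_HIP.items():
        source = source.replace(cuda_name, hip_name)
    return source

print(hipify("cudaMalloc(&ptr, n); cudaMemcpy(ptr, h, n, cudaMemcpyHostToDevice);"))
# -> hipMalloc(&ptr, n); hipMemcpy(ptr, h, n, hipMemcpyHostToDevice);
```

Note that even the enum `cudaMemcpyHostToDevice` maps cleanly to `hipMemcpyHostToDevice` — the APIs were designed to correspond, which is what makes "CUDA apps on HIP" mostly painless.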
Whoa, new to me. I'll have to dig in on that.
Well that's the question...
What do you mean by "not needing AI"? I mean, oobabooga and Stable Diffusion have AMD installers, and that's exactly what I'm asking about. That's why I posted in this community: to find out how well those AIs run on AMD.
Oops. I wasn't looking at the community, just my main feed. OK, so from what I understand, the AMD installer is a bit of a pain on Linux. If you're on Windows it's probably a different story.
I am on Linux, but I can live with a painful install. I wanted to hear whether it performs on par with Nvidia.
Again, apologies for the confusion. I had thought my initial comment was on a gaming community. Here are Puget Systems' benchmarks, and they don't look great: https://www.pugetsystems.com/labs/articles/stable-diffusion-performance-nvidia-geforce-vs-amd-radeon/#Automatic_1111
"Although this is our first look at Stable Diffusion performance, what is most striking is the disparity in performance between various implementations of Stable Diffusion: up to 11 times the iterations per second for some GPUs. NVIDIA offered the highest performance on Automatic 1111, while AMD had the best results on SHARK, and the highest-end GPU on their respective implementations had relatively similar performance."
Sorry, not trying to come at you, just trying to provide a bit of fact checking. In that link they tested on Windows, which means using DirectML, and DirectML is super slow. Did Linus Tech Tips do this? Anyway, the cool kids use ROCm on Linux. Much, much faster.
Haha, you're not, I definitely stumbled into this. These guys mainly build edit systems for post companies, so they stick to windows. Good to know about ROCm, got something to read up on.
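For anyone reading up on the ROCm-on-Linux route: ROCm's packages install under /opt/rocm (often versioned, e.g. /opt/rocm-6.x) by default, so a crude first sanity check before chasing framework errors is just to look for that prefix. This sketch assumes the default packaging layout; a custom install prefix won't be found this way:

```python
from pathlib import Path

def rocm_install_paths(root: str = "/opt") -> list[str]:
    """List ROCm install directories under `root`.

    ROCm installs to /opt/rocm* by default on Linux (the directory is
    often versioned, e.g. /opt/rocm-6.0.0). Custom prefixes won't show
    up here; this is only a quick default-layout check.
    """
    base = Path(root)
    if not base.is_dir():
        return []
    return sorted(str(p) for p in base.glob("rocm*") if p.is_dir())

if __name__ == "__main__":
    found = rocm_install_paths()
    if found:
        print("ROCm found at:", ", ".join(found))
    else:
        print("No ROCm install detected under /opt")
```

If this finds nothing, installing the ROCm stack comes before any Automatic1111 or oobabooga troubleshooting.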
Yeah, that was what I was worried about after reading the article; I've heard about the different backends...
Do you have AMD + Linux + Auto1111 / Oobabooga? Can you give me some real-life feedback? :D
No worries
Interesting article. Never heard of SHARK before, seems interesting then.