this post was submitted on 06 Sep 2023
LocalLLaMA
Community to discuss LLaMA, the large language model created by Meta AI.
This is intended to be a replacement for r/LocalLLaMA on Reddit.
I'm not asking about drivers or anything like that. I'm asking specifically about performance in Oobabooga and/or StableDiffusion.
Have you done anything with those?
I used Stable Diffusion and it took about 6 seconds to generate a 512x512px image.
That's sadly not very descriptive, since it depends on the number of iterations. Can you tell me how many it/s you get and which sampler you use? That would make it much easier for me to compare.
I made this video a while ago. It should contain all the relevant info.
Thanks
I was hoping to see console output that shows the iterations per second for each specific sampler, but I guess that suffices.
Thanks again!
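
(For anyone wanting to reproduce those per-sampler it/s numbers outside the webui, here is a rough sketch using the Hugging Face diffusers library. The model ID, prompt, sampler choices, and step count are placeholders, not details taken from this thread.)

```python
# Rough per-sampler it/s benchmark with diffusers (placeholder model/prompt/steps).
import time

import torch
from diffusers import (
    DPMSolverMultistepScheduler,
    EulerDiscreteScheduler,
    StableDiffusionPipeline,
)

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

samplers = {
    "Euler": EulerDiscreteScheduler,
    "DPM++ 2M": DPMSolverMultistepScheduler,
}
steps = 20

for name, scheduler_cls in samplers.items():
    # Swap in the sampler while reusing the pipeline's scheduler config.
    pipe.scheduler = scheduler_cls.from_config(pipe.scheduler.config)
    torch.cuda.synchronize()
    start = time.time()
    pipe("a photo of a cat", num_inference_steps=steps, height=512, width=512)
    torch.cuda.synchronize()
    elapsed = time.time() - start
    # Times the whole pipeline call, so it slightly understates pure sampling it/s.
    print(f"{name}: {steps / elapsed:.2f} it/s ({elapsed:.1f}s per 512x512 image)")
```

(The webui already prints an it/s figure next to its progress bar in the terminal, so a snippet like this is only needed for a standalone comparison.)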
I can send you a video of my terminal once I get home.
That'd be awesome. No hurry, though.
I'm getting around 4.3 iterations per second
https://files.catbox.moe/53v8ex.mp4
Perfect, thanks!