I don't really agree.
Recent AI innovations are pretty modest on their own and mostly lean on raw fucking compute power to achieve their results.
GPT-4 reportedly uses around 230B parameters, whereas running even a 7B LLM already needs about 16 GB of VRAM, and on top of that transformer attention scales as O(n²) in context length; I'll let you do the maths.
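To make the "do the maths" bit concrete, here's a rough sketch (my own illustrative arithmetic, not official figures): just holding the weights in memory takes roughly parameter count × bytes per parameter, before you even count activations or KV cache.

```python
# Rough VRAM needed just to hold the weights (illustrative only;
# real usage adds activations, KV cache and framework overhead).

def weight_vram_gb(n_params: float, bytes_per_param: float = 2.0) -> float:
    """Approximate gigabytes for the weights at a given precision (fp16 = 2 bytes)."""
    return n_params * bytes_per_param / 1e9

for name, params in [("7B", 7e9), ("70B", 70e9), ("230B", 230e9)]:
    print(f"{name}: ~{weight_vram_gb(params):.0f} GB at fp16")

# 7B   -> ~14 GB  (matches the ~16 GB of VRAM once overhead is added)
# 230B -> ~460 GB (nowhere near a single consumer GPU)
```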
Stable Diffusion (latent diffusion, to be more precise) is in the same boat: the initial training burned billions of teraflops of compute, and while that was relatively cheap (on the order of $100k), it still rides entirely on modern GPU technology.
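And to show where a "billions of teraflops" figure comes from, a back-of-the-envelope sketch where every input is a made-up but plausible assumption (GPU count, sustained throughput and training time are not the actual Stable Diffusion training setup):

```python
# Back-of-the-envelope training compute; every input here is an assumption,
# not the real Stable Diffusion training configuration.

gpus = 256              # assumed number of accelerators
tflops_per_gpu = 100    # assumed sustained TFLOP/s per GPU (mixed precision)
days = 25               # assumed wall-clock training time

seconds = days * 24 * 3600
total_tflops = gpus * tflops_per_gpu * seconds  # TFLOP/s x seconds = TFLOPs of work
print(f"~{total_tflops:.1e} teraflops of total compute")  # ~5.5e10, i.e. tens of billions
```

Whatever the exact numbers, the point stands: the trick is mostly scale.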