Illusion: Why do we keep believing that AI will solve the climate crisis (which it is facilitating), get rid of poverty (on which it is heavily relying), and unleash the full potential of human creativity (which it is undermining)?

[email protected] 2 points 4 months ago

We've had the tech to drastically cut power consumption for a few years now; it's just a matter of adapting existing hardware to include it.

There's a company, MythicAI, which found that using analog computers (ones built specifically to sift through .CKPT models, for example) drastically cuts energy usage while staying consistently 98-99% accurate. It works by taking a digital request, converting it to an analog signal, processing that signal in analog, then converting the result back to a digital signal and sending it to the computer to finish the task.
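
To put a rough number on that trade-off, here's a toy Python/NumPy sketch (my own illustration, not Mythic's actual pipeline): the inputs get quantized to 8 bits on the way "in", the multiply-accumulate happens at that reduced precision, and the result is converted back to float on the way "out". The relative error it prints usually lands around 1-2%, which is where a 98-99% accuracy figure can come from.

```python
# Toy sketch of an analog-style low-precision step, NOT Mythic's actual stack:
# quantize on the "DAC" side, multiply-accumulate at 8 bits, rescale on the "ADC" side.
import numpy as np

rng = np.random.default_rng(0)
W = rng.standard_normal((512, 512)).astype(np.float32)   # stand-in for one model layer
x = rng.standard_normal(512).astype(np.float32)          # stand-in for an activation vector

def quantize(a, bits=8):
    """Map a float array onto a signed integer grid, like a DAC would."""
    scale = np.max(np.abs(a)) / (2 ** (bits - 1) - 1)
    return np.round(a / scale).astype(np.int32), scale

Wq, w_scale = quantize(W)
xq, x_scale = quantize(x)

y_analog = (Wq @ xq) * (w_scale * x_scale)   # low-precision MAC, rescaled back to float
y_exact = W @ x                              # full-precision digital reference

rel_err = np.linalg.norm(y_analog - y_exact) / np.linalg.norm(y_exact)
print(f"relative error: {rel_err:.4f}")      # typically ~0.01-0.02, i.e. roughly 98-99% agreement
```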

In my experience, AI only draws 350+ watts while it is actually sifting through the model. Power ramps up and down consistently with when the GPU is using its CUDA cores and VRAM, which is whenever the program is processing an image or a text response (Stable Diffusion and KoboldAI, in my case). Outside of that, you can keep Stable Diffusion open and idle all day and power draw is only marginally higher, if it even is.
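
You can watch that ramping behaviour yourself with NVIDIA's NVML Python bindings (pynvml); this little loop just polls the card's reported power draw once a second while you kick off and then stop a generation:

```python
# Poll GPU power draw once a second via NVML; run a Stable Diffusion / KoboldAI
# generation partway through and watch the wattage spike, then fall back at idle.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)     # first GPU

try:
    for _ in range(60):                           # sample for one minute
        watts = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0  # NVML reports milliwatts
        print(f"{time.strftime('%H:%M:%S')}  {watts:6.1f} W")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()
```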

So according to MythicAI, the groundwork is there. Computers just need an analog computer attachment that removes that workload from the GPU.
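
Software-side, I'd imagine the integration looking roughly like the sketch below. `AnalogAccelerator` and `OffloadedLinear` are made-up names purely for illustration; the point is just the shape of it: the heavy matrix multiplies get handed off to the attachment while everything else stays where it is.

```python
# Purely illustrative: AnalogAccelerator is a made-up stand-in for whatever
# driver/API an analog inference card would actually expose.
import numpy as np

class AnalogAccelerator:
    """Hypothetical analog card: weights are programmed once, then it only does matmuls."""
    def __init__(self, weights: np.ndarray):
        self.weights = weights            # on real hardware this would be flashed into analog cells

    def matmul(self, x: np.ndarray) -> np.ndarray:
        return self.weights @ x           # here just a CPU matmul; on the card it'd be an analog MAC

class OffloadedLinear:
    """Drop-in replacement for a linear layer that sends its matmul to the attachment."""
    def __init__(self, weights: np.ndarray, bias: np.ndarray):
        self.card = AnalogAccelerator(weights)
        self.bias = bias

    def __call__(self, x: np.ndarray) -> np.ndarray:
        return self.card.matmul(x) + self.bias   # activations, bias, etc. stay on the host

layer = OffloadedLinear(np.eye(4, dtype=np.float32), np.zeros(4, dtype=np.float32))
print(layer(np.array([1.0, 2.0, 3.0, 4.0], dtype=np.float32)))   # [1. 2. 3. 4.]
```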

The thing is... I'm not sure how popular it will become. 1) These aren't widely available; you have to order them from the company and get a quote, and who knows if you can even order just one. 2) If you do get one, it's likely not just going to pop into a basic user's Windows install running Stable Diffusion; it probably expects server-grade hardware (which is where the majority of the power consumption comes from, so good for business, but consumer availability would be nice). And, most importantly, 3) NVIDIA has sunk so much money into GPU-powered AI. If throwing 1,000 watts at CUDA doesn't keep making strides, they may try to obfuscate this competition. NVIDIA has a lot of money riding on the AI wave, and that's threatened if word gets out that some other company can cut development costs, both the cost of the hardware and the cost of running it, removing the need for multiple 4090s or whatever is best while getting more efficiency in accuracy per watt.

Oh, and 4) MythicAI is specifically geared towards real-time camera AI tracking, so they're likely an evil surveillance company, and the hardware itself isn't geared towards general-purpose AI either, but built with specific models in mind. That isn't inherently an issue; it just circles back to point 2), where it's not just the hardware that will be a hassle, but the models themselves too.