TechTakes

HN reacts to a New Yorker piece on the "obscene energy demands of AI" with exactly the same arguments coiners use when confronted with the energy cost of blockchain - the product is valuable in and of itself, demands for more energy will spur investment in energy generation, and what about the energy costs of painting in oils on canvas, hmmmmmm??????

Maybe it's just that I'm new here and my antennae need calibrating, but I do feel the extreme energy requirements for what's arguably just a frivolous toy are gonna cause AI boosters big problems, especially as energy demands ramp up in the US in the warmer months. Expect the narrative to adjust to counter it.

[–] [email protected] 13 points 9 months ago
> 1. It will get better, and in the case of language models, that could have profound impacts on society

why is that a given?

> the materials research DeepMind published

these results were extremely flawed and disappointing, in a way that’s highly reminiscent of the Bell Labs replication crisis

> 1. There are other things being researched that are equally important but get little daylight, such as robotics and agentic AIs

these get brought up a lot in marketing, but the academic results of attempting to apply LLMs and generative AI to these fields have also been extremely disappointing

if you’re here seeking nuance, I encourage you to learn more about the history of academic fraud that occurred during the first AI boom and led directly to the AI winter. the tragedy of AI as a field is that all of the obvious fraud is and was treated with the same respect as the occasional truly useful computational technique

I also encourage you to learn more about the Rationalist cult that steers a lot of decisions around AI research (especially research with AGI as its end goal). the communities on this instance have a long history of sneering at the Rationalists who would (years later) go on to become key researchers at essentially every large AI company, and that history has shaped the language we use. the podcast Behind the Bastards has a couple of episodes about the Rationalist cult and its relationship with AI research, and Robert Evans definitely does a better job describing it than I can