this post was submitted on 23 Dec 2023
178 points (94.9% liked)

Firm predicts it will cost $28 billion to build a 2nm fab and $30,000 per wafer, a 50 percent increase in chipmaking costs as complexity rises::As wafer fab tools get more expensive, so do fabs and, ultimately, chips. A new report claims that

[–] [email protected] 108 points 11 months ago (5 children)

Or, and hear me out, we could just write less shitty software...

[–] [email protected] 42 points 11 months ago

You're right. This is the biggest issue facing computing currently.

[–] [email protected] 17 points 10 months ago (1 children)

The ratio of people who are capable of writing less-shitty software to the number of things we want to do with software ensures this problem will not get solved anytime soon.

[–] [email protected] 9 points 10 months ago

> The ratio of people who are capable of writing less-shitty software to the number of things we want to do with software ensures this problem will not get solved anytime soon.

Eh, I disagree. Every software engineer I've ever worked with knows how to make some optimizations to their code base, but it's literally never prioritized by the business. I suspect this will shift as IaaS takes over, since it becomes a lot easier to generate graphs showing that your product's stability has been maintained while the resources it consumes have been reduced.
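
To be concrete, the optimizations I mean are usually mundane. A minimal, hypothetical sketch (the workload and sizes are made up) of the kind of change most engineers already know how to make but rarely get time for, replacing a repeated list scan with a set lookup:

```python
import time

def find_overlap_slow(a, b):
    # O(n*m): scans list b once for every element of a
    return [x for x in a if x in b]

def find_overlap_fast(a, b):
    # O(n+m): builds a hash set once, then does constant-time lookups
    b_set = set(b)
    return [x for x in a if x in b_set]

# Made-up workload, just to show the difference in growth rates
a = list(range(10_000))
b = list(range(5_000, 15_000))

start = time.perf_counter()
find_overlap_slow(a, b)
print(f"list scan:  {time.perf_counter() - start:.3f}s")

start = time.perf_counter()
find_overlap_fast(a, b)
print(f"set lookup: {time.perf_counter() - start:.3f}s")
```

The second version avoids the quadratic scan entirely. Wins like this are well understood; they just rarely make it onto the sprint board.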

[–] [email protected] 16 points 10 months ago

But what if I want to do all my work inside a JavaScript "application" inside a web browser inside a desktop?

(We really do have so much CPU power these days that we're inventing new ways to waste it...)

[–] [email protected] 13 points 11 months ago* (last edited 11 months ago)

But where's the fun in that?

[–] [email protected] -3 points 11 months ago* (last edited 11 months ago) (1 children)

As long as humans have some hand in writing and designing software we'll always have shitty software.

[–] [email protected] 6 points 10 months ago (2 children)

While I agree with the cynical view of humans and shortcuts, I think it's actually the "automated" part of the process that's to blame. If you develop an app from scratch, there's only so much you can code. However, if you start with a framework, you've automated part of your job for huge efficiency gains, but you're also starting off with a much bigger app and likely lots of functionality you aren't really using.

[–] [email protected] 2 points 10 months ago

I was more getting at the fact that with software development it's never just the developers making all of the decisions. There are always stakeholders who pull time and attention toward other things and set unrealistic deadlines, while most software developers I know would love to be able to take the time to do everything the right way first.

I also agree with the example you provided. Back when I worked on more personal projects, I loved finding a good minimal framework that let you expand it as needed, so you rarely ended up with unused bloat.

[–] [email protected] 1 points 10 months ago* (last edited 10 months ago)

If you're not using the functionality, it's probably not contributing significantly to the required CPU/GPU cycles. Though I would welcome a counterexample.
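
One way to sanity-check that claim is to look at where unused code actually shows up: as memory (and startup time) rather than as time on the hot path. A minimal, hypothetical Python sketch (the "unused features" and the numbers are made up):

```python
import time
import tracemalloc

tracemalloc.start()

# Stand-in for framework functionality the app never calls:
# it occupies memory once it's loaded...
unused_features = {f"feature_{i}": (lambda: None) for i in range(100_000)}
current_kib = tracemalloc.get_traced_memory()[0] / 1024
print(f"memory held by never-called objects: {current_kib:.0f} KiB")

# ...but the hot path below never touches it, so it adds no cycles here.
def hot_path(n):
    return sum(i * i for i in range(n))

start = time.perf_counter()
hot_path(1_000_000)
print(f"hot path time: {time.perf_counter() - start:.3f}s")
```

The usual counterexamples would be frameworks that execute nominally unused code anyway, e.g. at startup or per request, in which case it does cost cycles even though you never call it yourself.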