this post was submitted on 20 Oct 2023
1413 points (98.3% liked)
Performance per watt is a CPU metric I've never seen anyone use before lol
It's wild what Apple marketing comes up with
Yes, measuring the consumption of electricity for a given performance benchmark is totally irrelevant to datacenter providers, who get their electricity for free from the electricity fairy, and thus can harvest pure profit without operational costs.
It is also totally irrelevant for portable devices, because batteries last forever, and every smartphone has a huge fan inside to dump all the waste heat via dual XTREME exhausts.
Performance per watt has always been a major concern for chip manufacturers. Sure, they have increased the number of watts you can throw at a chip without melting it (the 8086 used 1-2 W), but the most significant improvements have come from making fewer watts deliver more performance.
https://en.wikipedia.org/w/index.php?title=Performance_per_watt&action=history&offset=&limit=500
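To make the metric concrete, here's a minimal sketch of how performance per watt is calculated: a benchmark score divided by the average power drawn while running it. The chip names and numbers below are made-up placeholders, not real measurements.

```python
# Hypothetical chips: benchmark score and average power draw while
# running the benchmark. These values are illustrative only.
chips = {
    "Chip A (desktop)": {"benchmark_score": 24_000, "avg_watts": 120.0},
    "Chip B (laptop)":  {"benchmark_score": 15_000, "avg_watts": 30.0},
}

for name, data in chips.items():
    # Performance per watt = work done per unit of energy drawn.
    perf_per_watt = data["benchmark_score"] / data["avg_watts"]
    print(f"{name}: {perf_per_watt:.1f} points/W")
```

In this toy example the laptop chip scores lower in absolute terms but wins big on efficiency, and that trade-off is exactly what matters for battery life and datacenter power bills.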
I can appreciate not liking Apple; there are plenty of valid reasons. This just isn't one of them. Their CPUs are state of the art and caught both Intel and AMD with their pants down.