Steam Deck
A place to discuss and support all things Steam Deck.
Replacement for r/steamdeck_linux.
As Lemmy doesn't have flairs yet, you can use these prefixes to indicate what type of post you have made, eg:
[Flair] My post title
The following is a list of suggested flairs:
[Discussion] - General discussion.
[Help] - A request for help or support.
[News] - News about the deck.
[PSA] - Sharing important information.
[Game] - News / info about a game on the deck.
[Update] - An update to a previous post.
[Meta] - Discussion about this community.
Some more Steam Deck specific flairs:
[Boot Screen] - Custom boot screens/videos.
[Selling] - If you are selling your deck.
These are not enforced, but they are encouraged.
Rules:
- Follow the rules of Sopuli
- Posts must be related to the Steam Deck in an obvious way.
- No piracy, there are other communities for that.
- Discussion of emulators is allowed, but no discussion of how to illegally acquire ROMs.
- This is a place of civil discussion, no trolling.
- Have fun.
They wouldn't use Nvidia because, driver issues aside, Nvidia doesn't have an x86 license, nor does it do semi-custom designs for clients.
Valve's only other option was basically Intel, which at the time didn't put enough emphasis on iGPU performance to give Valve a decent value/performance ratio.
Intel graphics has improved by leaps and bounds, but it's still problematic and more poorly supported than AMD's.
I imagine part of it (beyond general stuff like Intel trailing AMD in efficiency, both on the CPU and GPU side, as well as the die size being far larger for the same performance, meaning more expensive) is that Valve really didn't want Intel graphics issues being reported in reviews and forums as being Proton/Linux issues.
On top of that, Intel straight up doesn't have a custom semiconductor division. AMD does (predominantly for Xbox/PS, but they're not the only ones).
Intel would either have had to set up an entirely new working group for Valve (expensive! Something Valve would've wanted to avoid, considering they had no idea whether the Deck would be a hit) or Valve would have had to go with an off-the-shelf Intel CPU.
It mostly improved after Tiger Lake, but at the time the Steam Deck was taping out its design, Intel was still far behind and realistically wasn't an option. It may be one down the line, given that the AI boom has essentially made the iGPU a very important piece of hardware, but it wasn't when the original Deck was being designed on paper.
Unless Intel was going to give Valve a really good deal on Tiger Lake CPUs back in 2020, it wasn't going to happen.
The "AI boom" means that Intel is going to take die space from the GPU and give it to an NPU. That's how you get Windows 11®️ CoPilot™️ cetified.
Isn’t the Tegra X1 on the Switch modified for Nintendo?
No, because the Tegra X1 was a processor originally designed for the Nvidia Shield TV and Jetson developer boards. Companies like Nintendo (for the Switch) and Google (for the Pixel C tablet) used the Tegra X1 as an off-the-shelf chip, which is why all of those devices are susceptible to the RCM exploit: they're the same chip.
Semi-custom means key functionality is added on top of the off-the-shelf designs that fundamentally makes the chip different. E.g. Valve got a Zen 2 CPU + RDNA 2 iGPU instead of the off-the-shelf Zen 3 + RDNA 2 option. Sony, for example, has a memory accelerator on the PS5 that gives it faster data-streaming capability than standard designs, and supposedly a compute block on the PS5 Pro for better resolution scaling and ray tracing than standard AMD designs.
Nvidia not doing semi-custom is the main reason Apple stopped using Nvidia after the GTX 670 in their iMac Pro lines in favor of AMD, and, for example, why Nvidia is very strict about the form factors their GPUs come in (e.g. there's a reason a smaller eGPU doesn't really exist for Nvidia GPUs, while the AMD option is more common, despite AMD selling fewer GPUs to consumers).
Thanks for the info and examples!