this post was submitted on 20 May 2024
149 points (100.0% liked)

PC Gaming

all 41 comments
[–] [email protected] 24 points 5 months ago* (last edited 5 months ago) (2 children)

I feel like I've been hearing about AMD's "next" CPU having dozens of cores on a bunch of chiplets for the last few generations, and then the main gaming consumer parts end up with 6 or 8 cores or something.

[–] [email protected] 32 points 5 months ago* (last edited 5 months ago) (2 children)

The 7950X has 16 cores. I think what the article is suggesting is that the very top of the line in the next gen could potentially double that, up to 32. I would imagine that if that happened, the more midline parts would still be in the 12-16 core range. I guess we'll see when they come out though.

[–] [email protected] 13 points 5 months ago (1 children)

Yeah, here's hoping. I'm skipping the 7000-series parts and sticking with my 5800X3D; I really want a higher-core-count part that still has all the single-CCD X3D advantages, since I game and do CPU-heavy work on the same rig.

[–] [email protected] 5 points 5 months ago (1 children)

Same here. 5800x3d is great, and I'd rather not buy a new motherboard and things just yet.

[–] [email protected] 2 points 5 months ago

Yes, no desire for all the things that will have to come with this upgrade. I want a huge boost, so sitting out this first wave.

[–] [email protected] 4 points 5 months ago

The x950 parts have had 16 cores going back to the Threadripper 1950X.

[–] [email protected] 6 points 5 months ago

Most games can't take advantage of more than a couple of cores anyway, and the high-core-count CPUs often sacrifice a little clock speed.

The optimal gaming CPU is like 4-8 cores but with a high clock speed. The 32+ core machines are for compute heavy tasks like CAD or running simulations. Sometimes compilers.

[–] [email protected] 11 points 5 months ago (1 children)

I thought they were already up there on Threadrippers, or am I misunderstanding and that's either not counting as a CPU or not a single die?

[–] [email protected] 17 points 5 months ago

Threadripper 7000 went up to 64 cores across 8 compute dies (excluding the IO die), so 8 cores per die.
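The core-per-die arithmetic above, as a quick sketch (the 64-core / 8-die figures are taken from this comment, not re-checked against AMD's spec sheets):

```python
# Core counts quoted in the comment above for Threadripper 7000:
# 64 cores total, spread over 8 compute dies (the separate IO die
# carries no CPU cores, so it is excluded from the division).
total_cores = 64
compute_dies = 8

cores_per_die = total_cores // compute_dies
print(cores_per_die)  # 8 cores per compute die
```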

[–] [email protected] 11 points 5 months ago (3 children)

I'd kill for a single-CCD 16-core X3D part. The 7950X3D is tempting with its 3D V-Cache CCD and its high-clock-speed CCD, but since not every game/program knows how to use them properly, you end up with hit-or-miss performance.

[–] ChairmanMeow 15 points 5 months ago (1 children)

Honestly with the 7950x3D being so powerful, you rarely notice it if a game isn't fully utilizing it. I have one and I'm very pleased with it!

[–] [email protected] 5 points 5 months ago (1 children)

My biggest concern from what I've seen is that the weird hack AMD uses to get programs to run on one set of cores vs the other wasn't exactly great last I looked and can cause issues when a game tries to move off of one CCD onto the other. That said I haven't looked into this ever since the CPU first came out so hopefully things are better now.

How observant are you of micro-stutters in a game? That was the biggest reason I got the 5800X3D in the first place, but now that I have a better GPU I can tell that thing struggles. And from what I remember, most of the issues you'd have moving from CCD to CCD were micro-stutters rather than normal frame-rate dips or just lower average frame rates.

[–] ChairmanMeow 4 points 5 months ago

I only really notice stutters in heavily modded Minecraft, where it's clearly linked to the garbage collector. In more demanding games I don't notice any stuttering really. Or at least, none that I can't easily link to something triggering in the game that is likely causing it.

Sure, perhaps I have a slightly lower average FPS compared to a 7800X3D, but I also use this PC for productivity, and there the extra oomph really does help. Still, 97% of a frame rate that's already way above what my 144Hz monitors support is still well above what my monitors support. I don't think the performance difference is really noticeable, other than in certain benchmarks or if you try really hard to see it.

It's considerably faster than a 5800x3D though.

[–] [email protected] 2 points 5 months ago (1 children)

I'm also wondering why there is even a difference in FPS in higher class CPUs - shouldn't it be the GPU bottlenecking, especially in 4k high settings?

[–] [email protected] 2 points 5 months ago

1% and 0.1% lows will almost always be CPU-bound as the game streams more in, assuming it's not VRAM limiting you. Games are pretty CPU-intensive these days since the PS5 and Xbox no longer have potato CPUs. At 120+ fps I regularly see >50% CPU usage in most games, and that's with nothing running in the background. In the real world you have a ton of background tasks, YouTube videos, Discord, etc. eating your CPU.

Also the 4090 is an absolute beast. My 5800X3D absolutely holds my 4090 back pretty often honestly.

[–] [email protected] 6 points 5 months ago (2 children)

Doesn't c stand for e-cores? Packing up to 32 e-cores must be easier than with normal cores.

Also, I kinda wish they went the other direction a little: cut core counts and put more cache across all levels on some cores instead, for better single-thread performance; a 'very big' core, so to say. Intel's cache sizes have been larger than AMD's since Alder Lake, and they stayed competitive despite their process-node disadvantage.

[–] [email protected] 4 points 5 months ago* (last edited 5 months ago)

Not quite an e-core but the goal is the same: Make more efficient use of the available die space by packing in more, slower cores.

The difference is that Intel's e-cores achieve this with a different architecture and support fewer features than their p-cores; e-cores, for example, do not support hyper-threading. An e-core is about 1/4 the size of a p-core.

AMD's 4c cores support the same features and have the same IPC as full zen 4 cores but operate at a lower clock speed. This reduces thermal output of the core, allowing them to pack in the circuitry much more densely.

Undoubtedly Intel's e-cores take advantage of this effect as well, and they are in fact quite a bit smaller than 4c: a 4c core is about 1/2 the size of a zen 4 core. The advantage of AMD's approach is that having the cores be identical simplifies the software side of things.
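A tiny sketch of the rough size ratios mentioned above (the 1/4 and 1/2 figures are the approximations quoted in these comments, not measured die areas):

```python
# Approximate relative die areas from the comments above,
# normalized so a full-size core = 1.0.
relative_area = {
    "intel_p_core": 1.0,
    "intel_e_core": 0.25,  # "about 1/4 the size of a p-core"
    "zen4_core": 1.0,
    "zen4c_core": 0.5,     # "about 1/2 the size of a zen 4 core"
}

# How many small cores fit in the footprint of one big core:
print(relative_area["intel_p_core"] / relative_area["intel_e_core"])  # 4.0
print(relative_area["zen4_core"] / relative_area["zen4c_core"])       # 2.0
```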

[–] [email protected] 3 points 5 months ago* (last edited 5 months ago)

AMD's c cores aren't quite the same as Intel's e-cores. Intel's e-cores are about 1/4 the size of their p-cores, while AMD's c cores are about the same size as their standard cores, just a bit more square geometrically.

Intel's e cores are completely different architectures from their p cores, while the only difference between AMD's cores are a bit less cache and a bit lower frequency.

Intel's approach is like comparing a Raspberry Pi core to a full x86 core, while AMD's is like a lower-binned regular core.

AMD has "big" cores, too. Their 3d vcache models trade multithreaded performance for more cache. Their "3 core tiers" approach is very obvious in their server line up:

https://www.servethehome.com/amd-epyc-bergamo-epyc-9754-cloud-native-sp5/

[–] [email protected] 4 points 5 months ago (1 children)

Is there really a need for them?

[–] [email protected] 1 points 5 months ago

The c variants of Zen are aimed at cloud workloads and are more compact versions of the full Zen 5 cores; cloud customers generally want as many cores in as compact a footprint as possible.

We might see 5c show up in SoCs (like the chip in a hypothetical steam deck 2) as well because they want their chips to be as small as possible so they can price their devices as competitively as possible. I don't think we will see those go up to 32 cores however as there is indeed no need for that many cores on consumer chips.

[–] [email protected] 2 points 5 months ago (1 children)

Cool. When's the ARM chip coming out?

[–] [email protected] 4 points 5 months ago (2 children)

Arm is dead. The future is RISC-V

[–] [email protected] 5 points 5 months ago

It should. An open technology standard should gain traction over closed proprietary ones.

[–] [email protected] 1 points 5 months ago (1 children)
[–] [email protected] -4 points 5 months ago (2 children)
[–] [email protected] 3 points 5 months ago (1 children)

Did you really copyright your comment 😭

[–] [email protected] -1 points 5 months ago* (last edited 5 months ago)

Did you really copyright your comment 😭

No, I licensed my content with a limited license that does not allow for commercial usage.

~Anti~ ~Commercial-AI~ ~license~ ~(CC~ ~BY-NC-SA~ ~4.0)~

[–] [email protected] 2 points 5 months ago (2 children)

Probably negatively, but also likely not enough to matter. CPUs these days run pretty cool.

We're a long way from the days of an idle Pentium 4 at 75C.

[–] [email protected] 2 points 5 months ago

We're in the days of Intel's top chips degrading themselves in a matter of weeks because their thermals are simply unmanageable at stock settings under anything less than a beefy 360mm AIO or custom-loop cooling.

[–] [email protected] 0 points 5 months ago (1 children)

CPUs these days run pretty cool.

Thought the AMD CPU ran around 90 celsius?

~Anti~ ~Commercial-AI~ ~license~ ~(CC~ ~BY-NC-SA~ ~4.0)~

[–] [email protected] 2 points 5 months ago (2 children)

My Ryzen 3900X idles at around 50C, although that's a few generations ago now

[–] [email protected] 2 points 5 months ago

My 3900X idles at 35 and hits 65 when it's 100% all cores. With a decent cooler modern AMD runs pretty chill

[–] [email protected] -1 points 5 months ago

My Ryzen 3900X idles at around 50C, although that’s a few generations ago now

There seems to be a big difference between older CPUs and the newer ones, where the newer ones are running a lot hotter now under load.

I personally use a 5800X and it gets to 90C often.

~Anti~ ~Commercial-AI~ ~license~ ~(CC~ ~BY-NC-SA~ ~4.0)~