[–] [email protected] 35 points 9 months ago (3 children)

It is my opinion that we reached peak graphics 6 or 7 years ago, when the GTX 1080 was king. Why?

  1. Games from that era look gorgeous (e.g. Shadow of the Tomb Raider), yet were well optimized enough to run high/ultra at FHD on an RX 570.
  2. We didn't need to rely on fakery like DLSS and frame generation to get playable frame rates. If anything, people used to supersample for the ultimate picture quality. Even upping the rendering scale to 1.25 made everything so crisp (see the little sketch after this list).
  3. MSAA and SMAA anti-aliasing look better, but somehow even the TAA from that era doesn't seem as blurry as today's. At this point you might as well use FXAA.
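
To illustrate point 2, here's a minimal supersampling sketch of my own (a toy C++ program, not engine code; a procedural disc stands in for real shading, and I use a 2x factor instead of 1.25 for simplicity):

```cpp
// Toy ordered-grid supersampling (SSAA): shade several sub-samples per
// pixel, then box-filter them down. Purely illustrative, not engine code.
#include <cmath>
#include <cstdio>
#include <vector>

// Stand-in "shader": a disc, 1.0 inside, 0.0 outside, in UV space.
static double shade(double u, double v) {
    double du = u - 0.5, dv = v - 0.5;
    return (std::sqrt(du * du + dv * dv) < 0.35) ? 1.0 : 0.0;
}

int main() {
    const int w = 12, h = 12; // output ("native") resolution
    const int ss = 2;         // 2x2 sub-samples per pixel (4x the shading work)

    std::vector<double> out(w * h);
    for (int y = 0; y < h; ++y) {
        for (int x = 0; x < w; ++x) {
            double sum = 0.0;
            for (int sy = 0; sy < ss; ++sy)   // shade each sub-sample...
                for (int sx = 0; sx < ss; ++sx)
                    sum += shade((x + (sx + 0.5) / ss) / w,
                                 (y + (sy + 0.5) / ss) / h);
            out[y * w + x] = sum / (ss * ss); // ...and average (box filter)
        }
    }

    // Pixels straddling the disc's edge come out grey instead of a hard
    // jagged step: that's the extra crispness supersampling buys.
    for (int y = 0; y < h; ++y) {
        for (int x = 0; x < w; ++x)
            std::printf("%c", " .:-=+*#%@"[(int)(out[y * w + x] * 9.99)]);
        std::printf("\n");
    }
    return 0;
}
```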

Graphics today seem ass-backward to me: render at 60...70% scale to get good framerates, with FX often rendered at even lower resolution; slap on overly blurry TAA to hide the jaggies, then use some upsampling trickery to get back to native resolution. It's still blurry, so squirt some sharpening and noise on top to create an illusion of detail. And it still runs like crap, so throw in frame interpolation for the illusion of a higher frame rate.
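
To put rough numbers on that pipeline (a back-of-the-envelope sketch using the percentages above; the 4K target and half-res FX figures are my assumptions, not measurements from any game):

```cpp
// Back-of-the-envelope shaded-pixel counts for the pipeline described
// above. Assumptions: 4K target, 67% render scale, half-res FX.
#include <cstdio>

int main() {
    const double nativeW = 3840.0, nativeH = 2160.0; // 4K output
    const double native  = nativeW * nativeH;

    const double scale    = 0.67;                    // internal render scale
    const double mainPass = native * scale * scale;  // pixels actually shaded
    const double fxPass   = mainPass * 0.25;         // half-res FX = 1/4 pixels

    std::printf("native pixels   : %10.0f\n", native);
    std::printf("main pass @ 67%% : %10.0f (%.0f%% of native)\n",
                mainPass, 100.0 * mainPass / native);
    std::printf("half-res FX     : %10.0f (%.0f%% of native)\n",
                fxPass, 100.0 * fxPass / native);
    // Less than half the output pixels ever get shaded; the rest is
    // reconstructed by the upscaler.
    return 0;
}
```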

I think it's high time we were able to run non-raytraced graphics at 4K native, and raytraced graphics at 2.5K native, on 500€ MSRP GPUs with no trickery involved.

[–] [email protected] 10 points 9 months ago

We peaked when we had Full HD. After all, what could top full high definition... fuller high definition? That would just be silly.

[–] [email protected] 7 points 9 months ago (1 children)

GPUs are getting better, but demand from the crypto and ML/AI markets means they can just jack up the price of every new card higher than the last, so prices have stopped dropping with each new generation.

[–] [email protected] 2 points 9 months ago* (last edited 9 months ago)

Intel is saving us with their GPU prices; too bad they haven't made good drivers YET.

[–] UndercoverUlrikHD 4 points 9 months ago (2 children)
> We didn't need to rely on fakery like DLSS and frame generation to get playable frame rates.

If you truly believe what you wrote, then you should never look into the details of how a game world is rendered. It's fakery stacked upon fakery that somehow looks great. If anything, the current move to ray tracing with upscaling is less fakery than what came before.

[–] [email protected] 5 points 9 months ago (1 children)

There's a saying in computer graphics: if it looks right, it is right. Meaning you shouldn't worry about whether the technique makes a mockery of how light actually works, as long as the viewer won't notice.

[–] UndercoverUlrikHD 1 points 9 months ago

That's the point

[–] [email protected] 1 points 9 months ago (1 children)

Sure, all graphics is about creating an illusion.

But there's a stark difference between optimizations like culling, occlusion planes, LODs and half-res rendering of costly FX (like AO), and using a crutch like lowering the rendering resolution of the whole frame to make up for bad optimization or crap hardware. DLSS has its place for 150...200€ entry-level GPUs trying to drive a 2.5K monitor, not 700€ "midrange" cards.

[–] UndercoverUlrikHD 1 points 9 months ago

> But there's a stark difference between optimizations like culling, occlusion planes, LODs and half-res rendering of costly FX (like AO), ~~and using a crutch like lowering the rendering resolution of the whole frame to make up for bad optimization or crap hardware.~~

There is no stark difference if you describe the techniques objectively instead of twisting them into what you feel they're like.

There are so many steps in the render pipeline where native resolution isn't used, yet I don't hear the crowd complaining about shadow map sizes or reflections being rendered at half res. Upscaling is just another tool that lets us create better-looking frames at playable refresh rates. Compare Alan Wake or Avatar running with DLSS against any other game without it, and they will still come out on top.
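
As a rough illustration, here's a toy tally of how much of a frame already runs below native resolution; the passes and scale factors are illustrative numbers of the kind commonly discussed, not pulled from any specific engine:

```cpp
// Illustrative (typical, but made-up) buffer sizes in a deferred
// pipeline, showing how many passes already run below native resolution.
#include <cstdio>

struct Pass { const char* name; double scale; }; // fraction of native per axis

int main() {
    const double nativeW = 2560.0, nativeH = 1440.0; // 2.5K native target
    const Pass passes[] = {
        {"main geometry/G-buffer", 1.00},
        {"SSR (reflections)",      0.50}, // commonly half res
        {"SSAO",                   0.50},
        {"volumetrics",            0.25},
    };
    for (const Pass& p : passes) {
        std::printf("%-24s %4.0f x %4.0f (%5.1f%% of native pixels)\n",
                    p.name, nativeW * p.scale, nativeH * p.scale,
                    100.0 * p.scale * p.scale);
    }
    // Shadow maps aren't tied to screen resolution at all; they're
    // light-space textures (e.g. 2048x2048) regardless of output size.
    return 0;
}
```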

> DLSS has its place for 150...200€ entry-level GPUs trying to drive a 2.5K monitor, not 700€ "midrange" cards.

Just because you're unhappy with Nvidia's pricing strategy doesn't mean you should slander new render techniques. You're mixing two different topics.