Someone mentioned Neural Radiance Caching to me recently, which Nvidia has been working on for a while. They presented it at an event in 2023 (disclaimer: the talk is account-gated and I haven't watched it, but a 6-minute "teaser" is available on YouTube).
I don't really understand how it works after skimming some material about it, but it sounds like it could be one of several ways to address this specific problem?
The Nvidia site at least mentions it as part of RTX Global Illumination, and I've heard rumors about Cyberpunk getting it, but I'm not sure whether it's in shipping tech yet. I think I heard it mentioned in a graphics review of some game.
As someone who grew up when you could physically count the bloody pixels on the screen, this whining about a miserable invisible minuscule artifact makes me facepalm very hard.
Also as a person who grew up when game consoles connected to the TV via an RF switch, the image-damaging effects of Temporal Anti-Aliasing (TAA) smearing are extremely visible, and NOT a "miserable invisible minuscule artifact." They're massive on the screen. The examples shown in this video don't demonstrate it particularly well because the video focuses only on raytracing, but the effects of TAA are still visible, because turning on raytracing almost always forces TAA on, since low-resolution raytracing benefits from the smearing TAA causes.
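To be concrete about why the two are coupled: both TAA and real-time raytracing denoisers lean on the same basic trick of blending each new frame into a running history buffer. A deliberately minimal sketch of that accumulation (no reprojection, no neighborhood clamping, not any engine's actual code):

```cpp
// Minimal sketch of temporal accumulation: each pixel of the new frame is
// blended with the accumulated history. The same averaging that smooths away
// raytracing noise is what smears detail when the history is stale.
#include <cstddef>
#include <vector>

struct Color { float r, g, b; };

// `alpha` is the weight of the new frame; typical TAA-style values are small
// (around 0.05-0.1), so ~90-95% of each pixel comes from previous frames.
void temporal_accumulate(std::vector<Color>& history,
                         const std::vector<Color>& current,
                         float alpha)
{
    for (std::size_t i = 0; i < history.size(); ++i) {
        history[i].r = alpha * current[i].r + (1.0f - alpha) * history[i].r;
        history[i].g = alpha * current[i].g + (1.0f - alpha) * history[i].g;
        history[i].b = alpha * current[i].b + (1.0f - alpha) * history[i].b;
    }
}
```

Real implementations also reproject the history with motion vectors and clamp it against the current frame's neighborhood; when that fails (fast motion, disocclusion, transparency), the stale history bleeds through, which is the ghosting and smearing people are complaining about.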
I focus on the playability and addictiveness of the game; you focus on the immersiveness and photo-realism of the experience.
That does not make either of us right or wrong; we're just weighing different aspects differently.
No, I don't focus on realism over the playability of the game. The last "photorealistic" game I played was Ready or Not. But I have recently been enjoying Vintage Story, The Legend of Dragoon, Koudelka, and other games with "bad" graphics. Aside from Vintage Story, it should be noted that these games were considered to have "cutting edge" graphics for their time, but they are by no means photorealistic.
My issue is that TAA (among other things, such as UE5's Nanite and Lumen tech when used incorrectly) typically ruins the games it's used in, from both an image-quality and a performance perspective. I wish developers would stop using the default or current implementations of TAA so that better algorithms, ones that are both faster and free of smearing, can naturally emerge. Really, these are mostly problems that have already been solved but are ignored because big game studios operate via "Checkbox Development." Rather than spending the time and money to implement the better solutions, they just check the default box for the default effect because it is faster and costs them less money.
Gamers massively overstate minor inconveniences. TAA smearing, upscaling artifacts, VA panel ghosting and blooming, ray tracing noise: you get used to all of it in a day or two and never notice it again, but leave it to gamers to go on a crusade as if their lives and the world are doomed because of it.
Ah yes, wanting better-looking games that also perform better is a "crusade." Of course. How bigoted of me.
Different people also have different sensitivities to different types of artifacts. No doubt a degree of the complaining is overblown due to a bit of tribal/mob mentality, but a few of the people complaining might just be more sensitive to it.
With TAA specifically there are probably also implementation differences going on, where someone has a bad experience with it once or twice and then generalizes that experience to all implementations of it.
M8, yer old; the kids have better acuity, so they notice it. Maybe they are right on this one.