And even if such filters are somehow so intensive, they should be turned off by default. I am playing Cities Skylines 2 and there is so much unnecessary eye candy that is turned on even with a mid-range graphics card. The game runs fine for me at 1440p when I turn off the really intensive post processing.
They screwed up the defaults. When I first loaded it was set to a resolution with a 24Hz refresh rate. Also the game looks bad with the default SMAA, it looks great if you change the advanced graphics settings to use TAA. I'm on a 3080 at 4K and after fiddling with settings it looks wonderful and is very playable. Unbelievable that they hosed the first impressions so much.
But no TV maxes out at 24Hz; that's a lower bound meant to support things like Blu-ray players for judder-free movie playback.
60Hz has always been the bare minimum any display supports, so if a game is picking something below that, something has gone horribly wrong. But really, games should default to the current desktop refresh rate: you know it's supported, and it makes things like alt-tabbing considerably faster even in exclusive fullscreen.
Agreed. First time I launched it on my lowly 6600XT it defaulted to high settings at 4k, completely unplayable (like 3fps).
After reading Reddit, I went down to 1080p at medium-ish settings with depth of field turned off and it's fine, 30fps. It's a very fun game!
I have absolutely no idea why they didn't turn the defaults down. Surely it would have been a five-minute config change, and first impressions of the game would have been vastly better.
It does look terrible for me though, despite being fun. The shadows are terrible and there are loads of rendering glitches.
I'm also a bit concerned, now that my city is getting bigger, at how poor the traffic pathfinding is. It seems to be as dumb as CS1 (with no mods), which is really bad, with the added nuisance that the traffic cycle is basically zero cars until rush hour, then a massive flood of traffic.
It's actually quite ironic: all I really wanted from CS2 was faster performance in bigger cities (huge fail here), better traffic simulation (the jury is out, but it isn't looking great), and better road tools (this is what CS2 is great at).
I do not understand motion blur. I am paying top dollar for these pixels, let me see them all. That it is a negative performance impact makes my decision all the easier.
Apparently it makes the game feel "faster" in racing games because it adds to the illusion of movement. I don't know why you would use it in other games, though.
It was used to try and make 30fps console racing games look less juddery, but didn't really work. Racing games are a genre that really does need 60fps+ to feel good.
Even in the 90s, the developers of 3D arcade racing games were well aware of this, ensuring that the original arcade versions of games like Ridge Racer, Daytona, and Sega Rally ran at 60fps, rather than sacrificing that smoothness to add more detail. And those games looked spectacularly good for their time.
In real life your eyes do the blurring on fast-moving objects. Real life is nothing like a game without motion blur. Swipe your hand quickly in front of your face. Do you really see it sharply through the whole movement, or just at the beginning and the end?
At 24fps, motion blur in games or 3D movies is bad. But even at 120fps or more, motion will never look right without it.
Nowhere are they replacing models, replacing textures, or anything of the sort. I suspect the issue here is that a box blur with 3x3 sampling costs roughly 4x as much at 4K as at 1080p: the cost scales linearly with pixel count, 4K has four times the pixels, and each one needs nine sample lookups.
What's more, such simple post-processing filters should not tank FPS so dramatically, even with a severe CPU bottleneck.
That alone is not just unoptimized; it's a severe bug.
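As a back-of-the-envelope sketch (in Python, with the kernel size and resolutions as the only inputs; the function name is made up for illustration), the lookup count of a naive full-screen box blur scales linearly with pixel count, so 4K costs about 4x what 1080p does:

```python
# Cost model for a naive full-screen box blur: every output pixel
# reads kernel*kernel input samples, so total texture lookups scale
# linearly with resolution.

def blur_taps(width: int, height: int, kernel: int = 3) -> int:
    """Total texture lookups for one full-screen box-blur pass."""
    return width * height * kernel * kernel

taps_1080p = blur_taps(1920, 1080)  # 2,073,600 px * 9 taps
taps_4k = blur_taps(3840, 2160)     # 8,294,400 px * 9 taps

print(taps_4k / taps_1080p)  # 4.0 — linear in pixel count, not exponential
```

Even the 4K case is only ~75 million lookups per frame, which is trivial for a modern GPU — which is why a simple filter tanking FPS points at a bug rather than the filter's inherent cost.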