11 points

In a similar vein, Arkham Knight (and in some cases Arkham City) looked worse in cutscenes if you maxed out the graphics settings. Obviously not if you ran it on a potato, but the games are reasonably well optimized these days*.

*At launch, Arkham Knight was an unoptimized, buggy mess. It has since gotten much better.

3 points

Wait, you mean the gameplay looks better than the actual cutscenes in the game?

But how? Does the game use FMV for the cutscenes or something?

5 points

The cutscenes were rendered using certain graphics settings that you could exceed if you maxed out your own. Plus, because they were pre-rendered video, there must have been some compression, as you could just tell when you were in a cutscene: it was grainier and there was a smidge of artifacting. Don't quote me on this, but I believe the cutscenes were rendered at, like, 1080p, and if you were playing at 4K it would be a very noticeable downgrade. (Note that I did not and still do not have a 4K monitor.)

Although, thinking about it again, I do vividly remember some in-engine cutscenes in Arkham Knight. I'll have to replay that game sometime to jog my memory.

4 points

I'm playing through Rise of the Tomb Raider in 4K and having a similar experience. I think the cutscenes are in 1080p.

3 points

On PS5, Hogwarts Legacy runs at 60 fps, but the cutscenes are 30 fps.
