If we run a game at some resolution X, where X is lower than the monitor's native resolution, in fullscreen mode, we get a fairly blurry image, and the amount of blur depends on how much lower X is than the native resolution.
But I wonder: can we keep the original, unblurred image if we run the game at X < native monitor resolution, but in windowed mode?
Theoretically, no pixels in that window would have to be scaled up, so none would be represented as 2x2, 3x3, or NxN blocks of pixels interpolated with neighboring pixels' colors (as I understand how it all works).
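To illustrate what blur-free upscaling would look like, here is a minimal sketch of integer ("nearest-neighbor") scaling: each source pixel simply becomes an N x N block of identical pixels, with no interpolation against neighbors. The image data and scale factor are made-up example values, not anything from a real driver.

```python
import numpy as np

def integer_scale(image: np.ndarray, factor: int) -> np.ndarray:
    """Scale an image by an integer factor with no interpolation.

    Each row is repeated 'factor' times, then each column, so every
    source pixel turns into a solid factor x factor block.
    """
    return np.repeat(np.repeat(image, factor, axis=0), factor, axis=1)

# Example: a 2x2 grayscale "image" scaled 3x becomes 6x6.
src = np.array([[10, 20],
                [30, 40]], dtype=np.uint8)
dst = integer_scale(src, 3)
print(dst.shape)       # (6, 6)
print(dst[0:3, 0:3])   # a solid 3x3 block of the value 10
```

Because every output pixel is an exact copy of one source pixel, the result stays perfectly sharp, which is exactly what you get for free when the window is displayed 1:1 without any scaling at all.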
Some information I found says that NVIDIA added support for a proper scaling algorithm ("integer scaling" is probably the right name for it) on RTX and GTX 16xx series cards, starting with the 436.02 drivers.
I'm on a GTX 1060 and don't know whether my card supports that feature.
Has anyone already tried this? I just don't have access to any 4K monitors for testing.