An x86-64 PC runs Windows (10 x64, if that matters) and drives a 1920×1080 SDR monitor with a true 8-bit panel (8 bits per channel). There are many videos in 3840×2160 YUV 4:2:0 10-bit format (either HDR or SDR), usually encoded as H.265 (if that matters). A GUI video player typically decodes them with the GPU's hardware decoder, since the CPU is not fast enough for real-time H.265 decoding. This implies the existence of an intermediate uncompressed 3840×2160 YUV 4:2:0 10-bit stream.
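For scale, here is a rough per-frame size of that intermediate stream, assuming the hardware decoder hands the 10-bit 4:2:0 data off in 16-bit containers (P010-style layouts are common on Windows, but this is only an illustrative estimate):

```python
# Hypothetical size of one uncompressed 3840x2160 YUV 4:2:0 10-bit frame,
# assuming each 10-bit sample occupies a 16-bit word; real layouts may differ.
width, height = 3840, 2160
luma_samples = width * height                       # one Y sample per pixel
chroma_samples = 2 * (width // 2) * (height // 2)   # U and V, 1920x1080 each
bytes_per_sample = 2                                # 10 bits stored in 16 bits
frame_bytes = (luma_samples + chroma_samples) * bytes_per_sample
print(f"{frame_bytes / 2**20:.1f} MiB per frame")   # ~23.7 MiB
```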

During playback, which GUI video players, and with what settings, linearly convert the uncompressed 3840×2160 YUV 4:2:0 10-bit stream to a 1920×1080 YUV 4:4:4 or RGB SDR 8-bit stream without going through an intermediate 1920×1080 YUV 4:2:0 stage? The pipeline could be either
3840×2160 YUV 4:2:0 → 3840×2160 Y + 1920×1080 UV → 1920×1080 Y + 1920×1080 UV → 1920×1080 YUV 4:4:4 → 1920×1080 RGB
or
3840×2160 YUV 4:2:0 → 3840×2160 RGB → 1920×1080 RGB
By linear conversion I mean that each 1920×1080 pixel equals the average of the corresponding 2×2 square of 3840×2160 pixels; whether gamma correction is applied does not matter. It also doesn't matter at which stage the YUV 4:4:4 → RGB conversion happens, nor at which stage the 10-bit values are reduced to SDR 8-bit.
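For concreteness, here is a minimal NumPy sketch of the first pipeline under simplifying assumptions (SDR input, BT.709 limited range, plain 2×2 averaging in the coded domain). It only illustrates what I mean by the conversion; it is not any player's actual code:

```python
# Sketch of the first pipeline, assuming the decoded 10-bit planes are NumPy
# uint16 arrays: Y has shape (2160, 3840), U and V have shape (1080, 1920),
# which is exactly what 4:2:0 already stores.
import numpy as np

def box_downscale_2x2(plane: np.ndarray) -> np.ndarray:
    """Each output pixel is the plain average of a 2x2 block of input pixels."""
    h, w = plane.shape
    return plane.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

def pipeline_a(y10: np.ndarray, u10: np.ndarray, v10: np.ndarray) -> np.ndarray:
    """2160p 4:2:0 10-bit -> 1080p 4:4:4 -> 1080p RGB 8-bit (BT.709 assumed)."""
    y = box_downscale_2x2(y10.astype(np.float64))   # 2160p luma -> 1080p luma
    u = u10.astype(np.float64)                      # chroma is already 1920x1080
    v = v10.astype(np.float64)
    # Normalize limited-range 10-bit values (Y: 64..940, C: 64..960 around 512).
    yn = (y - 64.0) / 876.0
    un = (u - 512.0) / 896.0
    vn = (v - 512.0) / 896.0
    # BT.709 YCbCr -> RGB matrix.
    r = yn + 1.5748 * vn
    g = yn - 0.1873 * un - 0.4681 * vn
    b = yn + 1.8556 * un
    rgb = np.stack([r, g, b], axis=-1)
    return np.clip(np.round(rgb * 255.0), 0, 255).astype(np.uint8)
```

Note that in this pipeline the 2160p chroma planes already have 1920×1080 resolution, so only the luma plane actually needs the 2×2 averaging; the second pipeline (convert to 2160p RGB first, then downscale) averages all three channels at full resolution instead.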

Perhaps the most popular players (VLC, MPC, PotPlayer, KMPlayer, GOM Player) already do this by default? I could not find any explicit information about it.