User interfaces in 2020 need to consider many more details than those of previous generations. One of the biggest changes of the last decade is certainly the introduction of HiDPI displays: displays with much higher resolutions, together with software that uses more pixels per GUI element. Win32, with its pixel-based approach to user interface design, may look totally inappropriate here, and the many examples of blurry Win32 applications on high-resolution displays seem to confirm that. However, that appearance is deceiving, and this article will show you why.
Back in the days of Windows 95 to XP, things were still easy: The operating system used a default setting of “96 DPI” (dots-per-inch) for all GUI elements, meaning that a 12-point font was rendered on screen by allotting 12 pt ÷ 72 pt/inch × 96 px/inch = 16 pixels for its height. As all computer monitors of that time had roughly the same pixel size (between 250 and 300 µm), there was hardly ever a need to deviate from this default. While Windows already came with an additional “120 DPI” option and also allowed for a custom DPI setting, any non-default value was clearly a second-class citizen. Just look at the following screenshots of Windows XP scaled at 100% (96 DPI) and 200% (192 DPI):