Windows has had DPI scaling for years, but it has mostly sucked and had bad software support. With the latest Windows 8.1 iteration, is it functionally identical to how Macs use high-DPI displays?
Content (movies, photos, etc.) is rendered 1:1, with the UI scaled up to be usable on high-resolution displays.
Am I missing something that makes the Mac implementation better?
Yes. Retina is completely different from Windows scaling.
OS X/Retina starts with the "looks like" resolution (2560x1440 by default, but it can be higher, such as 3200x1800, depending on which scaling you select). It tells all software that the screen is this size, so everything works unmodified. But it creates a video buffer exactly twice that size in each direction (5120x2880 or 6400x3600 in the examples above), and renders text at twice the size into that higher-resolution buffer without telling the application it has done so. The application's measurements and built-in images are likewise doubled and rendered onto the higher-resolution buffer.
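To make the arithmetic concrete, here's a minimal sketch of the relationship between the "looks like" size and the backing buffer (the names are illustrative, not Apple's API; the 2x factor is the Retina backing scale):

```swift
// Illustrative arithmetic only: the "looks like" size is what apps see,
// the backing buffer is what the compositor actually renders into.
struct DisplayMode {
    let looksLikeWidth: Int   // resolution reported to applications
    let looksLikeHeight: Int
    let scaleFactor = 2       // Retina backing scale

    var bufferWidth: Int  { looksLikeWidth * scaleFactor }
    var bufferHeight: Int { looksLikeHeight * scaleFactor }
}

let defaultMode = DisplayMode(looksLikeWidth: 2560, looksLikeHeight: 1440)
print(defaultMode.bufferWidth, defaultMode.bufferHeight)   // 5120 2880

let moreSpace = DisplayMode(looksLikeWidth: 3200, looksLikeHeight: 1800)
print(moreSpace.bufferWidth, moreSpace.bufferHeight)       // 6400 3600
```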
If the application is retina-aware, it is allowed to serve up double-resolution images, which are put into the buffer instead of the non-retina images.
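In effect, a retina-aware app does something like the following when choosing its bitmaps (the asset names here are made up; in a real app NSImage and asset catalogs resolve the @2x variant for you):

```swift
// Hypothetical asset picker: real apps let NSImage / asset catalogs
// choose the @2x variant automatically, this just shows the idea.
func assetName(base: String, backingScaleFactor: Double) -> String {
    // On a Retina buffer the scale factor is 2.0, so the app should
    // hand over the double-resolution bitmap instead of the 1x one.
    return backingScaleFactor >= 2.0 ? "\(base)@2x.png" : "\(base).png"
}

print(assetName(base: "toolbar-icon", backingScaleFactor: 2.0)) // toolbar-icon@2x.png
print(assetName(base: "toolbar-icon", backingScaleFactor: 1.0)) // toolbar-icon.png
```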
Once rendering is complete, the buffer is scaled down to the display's physical resolution (or simply shown as-is, if the buffer already matches the panel, as in the 5120x2880 example above).
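For the higher "looks like" setting, the final downscale works out like this (assuming the same 5120x2880 panel as in the example above):

```swift
// Scaling the oversized buffer down to the physical panel.
// 6400x3600 buffer -> 5120x2880 panel: the buffer is filtered down
// by a factor of 0.8, so text is smoothed rather than pixel-doubled.
let buffer = (width: 6400.0, height: 3600.0)
let panel  = (width: 5120.0, height: 2880.0)

let downscale = panel.width / buffer.width   // 0.8
let pixelsPerPoint = 2.0 * downscale         // each UI point covers 1.6 physical pixels
print(downscale, pixelsPerPoint)             // 0.8 1.6
```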
In this way, all existing software works exactly as before, except that text is super-sharp. Retina-aware applications get to use the full resolution available. And even at the higher "looks like" settings the text stays sharp, because the buffer is rendered oversized and then down-scaled rather than being pixel-doubled.
*This* is what Retina is. Apple haters will tell you it just means high-resolution screens, but only because they don't understand what the difference is.