This seems like a Rogue Amoeba promotional post. Why scale up an image simply to scale it back down? It's pointless and may be there just to bulk out an already short article.
It could have been summarized with one sentence: "Scaling up old interfaces for modern displays looks better with nearest-neighbor interpolation than other options." Which is also not necessarily true, as the retro gaming crowd would attest.
Ah, but they didn't say they used nearest-neighbor when scaling it back down. This gives a (theoretically) better result when you want to scale up by a non-integer factor. (If it ends up just being an integer factor then there's no need to scale things back down.)
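If you're curious, the trick is easy to sketch in Python with Pillow (my own toy version, not the author's code; the "round up to the next integer multiple" choice is my assumption):

```python
# Toy sketch of nearest-neighbor upscale followed by a smooth downscale,
# for non-integer scale factors. Assumes Pillow; names are mine.
import math
from PIL import Image

def scale_retro(img: Image.Image, factor: float) -> Image.Image:
    k = math.ceil(factor)  # next integer multiple, e.g. 2.5x -> 3x
    # Integer nearest-neighbor upscale keeps every source pixel crisp
    # and uniformly sized.
    big = img.resize((img.width * k, img.height * k), Image.NEAREST)
    # Smooth downscale to the true target size antialiases the pixel edges.
    target = (round(img.width * factor), round(img.height * factor))
    return big.resize(target, Image.LANCZOS)
```

For an exact integer factor this degenerates to plain nearest-neighbor, matching the parenthetical above.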
This feels similar to how you can generate high-quality square waves at frequencies that don't evenly divide the sampling rate: you generate the wave at a higher sampling rate, band-limit it to the Nyquist frequency of the destination, then downsample. Otherwise you get aliasing, which produces unwanted audible artifacts. For nearest-neighbor resized images, I suppose the equivalent artifact would be pixels of varying sizes, which is not nice to look at. Scaling the image down antialiases the pixels so they optically look the same size.
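Roughly, the audio version of the trick looks like this (a hand-rolled sketch, not from the article; the 8x oversampling factor is arbitrary):

```python
# Sketch of the oversample-then-band-limit-then-decimate approach for
# square waves at frequencies that don't divide the sample rate.
# Assumes numpy/scipy; parameter values are illustrative.
import numpy as np
from scipy import signal

def bandlimited_square(freq: float, rate: int = 48_000,
                       oversample: int = 8, seconds: float = 1.0):
    hi_rate = rate * oversample
    t = np.arange(int(seconds * hi_rate)) / hi_rate
    naive = np.sign(np.sin(2 * np.pi * freq * t))  # naive (aliased) square
    # decimate() low-pass filters below the destination Nyquist before
    # downsampling, which suppresses the audible aliasing artifacts.
    return signal.decimate(naive, oversample, ftype="fir")
```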
I'm seeing a lot of negative comments about this post. I was a little underwhelmed, but I think that was mostly the title setting higher expectations than the author intended to deliver. That's on me, though; the title is accurate. I was just hoping for some neat script/tool/repo/gadget that recreated the feel of ancient interfaces on modern hardware in a more general sense. The author went for something different, and I found it interesting nonetheless.
Maybe it was that I just landed on the "Moonshot" article[0], but as I was looking at the screenshots on a 4K 15" ASUS OLED display, I knew the images were also being fscked with a little by the OS's attempts to embiggen everything for this Hi-DPI display (it didn't enlarge the smaller screenshot, and even the middle one is kind of small, but it's also a little uneven, which smells like shoddy scaling happening somewhere other than "on the author's end").
In all seriousness, though, Windows 11 does a terrible job of upscaling things for Hi-DPI in really unforgivable ways[1]. I can see a future where some of that upscaling is done via AI and all of that work is undone by accident.
Gotta love all of the weird unintended side-effects as things move forward :)
[0] Samsung using AI to enhance images of the moon when it's detected in scenes, and the more comical/unfortunate results and moral panic thereof.
[1] I know, "you can't put signal back in," and I'm ranting about AI trying to do exactly that while complaining about how terrible it is today. The unforgivable part is that the signal (WinForms fonts/layout/etc.) was lost to the point that the way scaling is done for these apps is the "content-unaware myopia upscaling algorithm" (so named because it makes you feel like you need glasses while already wearing your glasses).
This is the weakest HN post I've ever seen.