Hacker News | adornKey's comments

Microsoft did a lot of great work on Fonts in the past. Recently it looked like they abandoned per monitor subpixel-rendering?! In which direction are they heading?

Pixel density continues to rise, but Microsoft might be engaged in… premature de-optimization?

It’s duals/mirrors all the way down. Or up.


But the angular resolution of the eye doesn't rise. For a desktop monitor, 100 ppi has practically already reached the limit. Anything beyond that is just an additional burden for the GPU and a waste of bandwidth. Sure, you can increase resolution just to make font rendering easier, but you also pay the price in energy consumption or speed, without any visible improvement.

At the traditional 96 dpi, you have to be 3 ft away to exceed the retinal density. Personally, I sit at half that distance, so something around 200 would be more ideal. With laptops you might sit even closer.

Mobile devices, unless you get really close to the screen, have matched the retinal density for a while. Most people hold the device at about 8 inches, so 450 dpi is the value to hit.

Edit: These measurements assume 20/20 vision, which is the average. Many people exceed that, so you'd need slightly higher values if you're feeling pedantic.
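The arithmetic behind these figures is easy to check. A back-of-envelope sketch, assuming 20/20 vision resolves about one arcminute (the distances and the `required_ppi` helper are illustrative, not from the original comment):

```python
import math

def required_ppi(viewing_distance_in: float, arcmin: float = 1.0) -> float:
    """PPI needed so one pixel subtends `arcmin` arcminutes
    at the given viewing distance (in inches)."""
    pixel_pitch = viewing_distance_in * math.tan(math.radians(arcmin / 60))
    return 1 / pixel_pitch

# At 3 ft (36 in) ~95 ppi suffices, close to the traditional 96 dpi;
# at 18 in you need ~191; at a phone held at ~8 in, ~430.
for d in (36, 18, 8):
    print(f"{d:2d} in -> {required_ppi(d):.0f} ppi")
```

The numbers line up with the comment's figures: 96 dpi at 3 ft, roughly 200 at half that, and mid-400s for a phone.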


Keeping the focal point up close for long periods isn't good for the eyes, so sitting closer than arm's length to a desk monitor isn't an idea that ages well.

100 dpi with subpixel rendering already maxes out the horizontal angular resolution. It doesn't max out everything (retinal resolution), so you still see some artifacts, but in practice that's not very relevant. The price in energy and bandwidth rises quadratically, for very little gain.

To get the equivalent of 4K at 100 ppi, at 200 ppi you have to put the burden of 8K onto the GPU... For now that's simply not worth it. High ppi is fine for small monitors and handheld devices, but for a decent desk with several good monitors, GPUs just aren't ready yet.
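The quadratic cost is easy to make concrete. A rough sketch assuming an uncompressed framebuffer at 60 Hz and 4 bytes per pixel (the function and figures are illustrative; real display links compress and real workloads differ):

```python
def raw_bandwidth_gbs(width: int, height: int,
                      hz: int = 60, bytes_per_px: int = 4) -> float:
    """Uncompressed framebuffer traffic in GB/s (decimal GB)."""
    return width * height * hz * bytes_per_px / 1e9

# Doubling ppi at a fixed physical size quadruples the pixel count:
uhd = raw_bandwidth_gbs(3840, 2160)    # "4K", ~100 ppi on a large desk monitor
uhd8k = raw_bandwidth_gbs(7680, 4320)  # same screen area at ~200 ppi
print(f"4K: {uhd:.2f} GB/s, 8K: {uhd8k:.2f} GB/s ({uhd8k / uhd:.0f}x)")
```

Roughly 2 GB/s per 4K monitor versus 8 GB/s per 8K one, times several monitors on a desk.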


The difference between my 27" 4k and 1440p screens is still quite obvious and I don't consider myself particularly sensitive to these things.

For rendering text and video even an underpowered integrated GPU handles it fine; the only issue is using a fair bit more RAM.

For reference, my very underpowered desktop AMD iGPU, on a three-generations-old architecture (2 CUs of RDNA 2), only struggles with the occasional overly heavy browser animation.


A few years ago, I replaced my 24" 1080p monitors (~96 ppi) with 27" 4k monitors (~157 ppi), and the increased pixel density was very noticeable; I'd probably notice an increase over that, too. I sit about 3 feet away from them.

300 ppi matches printed books, which looks nice. On notebook computers, though, a 3840x2160 panel might not be worth the reduced battery life.

I hate subpixel rendering. It's impossible to turn it off for displays that don't need it. It looks absolutely awful. I wish it was never invented.

Hate seems a bit strong for an increase in perceived horizontal resolution on low DPI displays, but to each their own. That said, I'm not sure what you mean by it being impossible to turn off. On Windows you can just disable ClearType per monitor, and on Linux it's configurable either through your DE, fontconfig, or sometimes at the application level.

MacOS went the other direction and removed subpixel rendering entirely, which is partly why low DPI external displays tend to look worse there.
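On the Linux side, the fontconfig route mentioned above can be sketched as a minimal config fragment (illustrative; desktop environments may override it, and the file path `~/.config/fontconfig/fonts.conf` is the usual but not only location):

```xml
<?xml version="1.0"?>
<!DOCTYPE fontconfig SYSTEM "fonts.dtd">
<fontconfig>
  <!-- Disable subpixel rendering: render with plain grayscale AA -->
  <match target="font">
    <edit name="rgba" mode="assign"><const>none</const></edit>
  </match>
</fontconfig>
```

Setting `rgba` to `none` tells fontconfig-aware renderers to skip the RGB/BGR subpixel filter entirely.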


> That said, I'm not sure what you mean by it being impossible to turn off.

You can try to configure it to be off, and while that almost works, many applications will still simply not respect the setting. This is particularly apparent (and infuriating) with apps that don't render in high-resolution mode, because their rendering then no longer has anything to do with actual subpixels.

I imagine this behavior came from ClearType having been a special case, and therefore non-native widget toolkits getting explicitly programmed to render with it on Windows, forgetting that the user should be able to turn it off!!

> MacOS went the other direction and removed subpixel rendering entirely, which is partly why low DPI external displays tend to look worse there.

Subpixel antialiasing is a compromise. Once every Mac shipped with a Retina display, there was no need to retain that compromise, because you already get high resolution so you may as well get color accuracy too.

I will note macOS still enables by default a feature called "stem darkening" (incorrectly called "font smoothing" in macOS Settings) that also looks fairly awful to my eye, and seems itself a legacy from the low-DPI days.


> I imagine this behavior came from ClearType having been a special case, and therefore non-native widget toolkits getting explicitly programmed to render with it on Windows, forgetting that the user should be able to turn it off!!

I see, that is indeed frustrating.

> Once every Mac shipped with a Retina display, there was no need to retain that compromise, because you already get high resolution so you may as well get color accuracy too.

I believe that is Apple's position, and it may be valid for their own high-DPI displays. However, it overlooks the fact that most external monitors, especially typical office displays, are still far from retina pixel densities. Even on a relatively good 27" 4K panel, text on macOS looks noticeably worse than on Windows or Linux. Then again, that's likely compounded by the lack of fractional scaling. Unless you're using a 5-6K external display, you aren't hitting 250+ PPI to get crisp text at all.

> I will note macOS still enables by default a feature called "stem darkening" (incorrectly called "font smoothing" in macOS Settings) that also looks fairly awful to my eye, and seems itself a legacy from the low-DPI days.

Yeah, I've seen quite a range of stem darkening implementations. Skipping gamma-correct blending, as many do, doesn't help.

The really annoying thing nowadays is renderers attempting to apply subpixel rendering to panels that aren't even RGB/BGR in the first place.


Apple's laptop displays are 220 PPI, not anywhere near 250+. None of their other Mac displays are 250+ PPI.

> The really annoying thing nowadays is renderers attempting to apply subpixel rendering to panels that aren't even RGB/BGR in the first place.

Oh yes, I know a Bayer advocate. And things like WOLED are also a thing.


The explanation seems to be that it looked good in some old fonts, but I think it was always a kind of abuse. On old typewriters, the accent keys were used for actual accents (é, è). They didn't advance the carriage, so using them for apostrophes wasn't comfortable and interrupted the flow of writing. Accent + space looks a bit like a quotation mark, but the right place for an accent is on top of a letter.

A more general approach is encyclopedias of integer sequences. I think that works better than focusing on single numbers. Hm, how many numbers are there that are interesting but not part of any sequence?


A few years ago I liked Ars Technica, but at some point the quality went down the drain. Did something happen to them a few years ago? If they get rid of the crazy reporters and go AI-only, maybe the quality will improve back to a readable level.


"Der Ätsch-Browser".


Relevant cultural context from Germany:

In Germany, there exists a popular children's game named "Das verrückte Labyrinth" (in English it's simply named "Labyrinth": https://en.wikipedia.org/wiki/Labyrinth_(board_game) ).

When you get a corner-shaped card (in German: "Eckkarte")

> https://boardgamegeek.com/image/155268/labyrinth

from the previous player's turn, your intended move will typically be harder to visualize (at least for children; that's what the game is about), so children tended to call an "Eckkarte" an "Ätschkarte".


Maybe that game exists, but it's an old word, and you can find references to it that are hundreds of years old. Its meaning fits the browser. Your claim that it's just a reference to some particular game is not correct.


Once everybody has a decent amount of VRAM, they can just run local AIs, and the need to mess with ad-laden search results will fizzle. So of course they are desperate to grab a new monopoly. People haven't realised yet that local AIs are fast and produce good results on pretty average hardware. If they don't manage to grab a new monopoly, Google will be history.

But it doesn't really need a nefarious plot to explain the price spikes. There is a serious lack of VRAM deployed out there, and filling that gap will take quite some time. Add the nefarious plot on top, and the situation will most likely get even worse...


LLM inference is mostly read-only, so high-bandwidth flash looks like it could provide huge cost savings over VRAM. It's not yet in commercial products, but there are working prototypes already. Previous HN discussion:

https://news.ycombinator.com/item?id=46700384
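Why memory bandwidth is the lever here can be sketched with a rough decode-speed ceiling: with batch size 1, each generated token has to stream the full weight set once, so tokens/s is at most bandwidth divided by model size. The model size and bandwidth figures below are illustrative assumptions, not numbers from the linked discussion:

```python
def tokens_per_sec(weight_bytes: float, bandwidth_bytes_s: float) -> float:
    """Rough upper bound on decode speed: every generated token
    reads all model weights once (batch size 1, ignoring KV cache)."""
    return bandwidth_bytes_s / weight_bytes

weights = 4e9  # e.g. an 8B-parameter model at 4-bit quantization
print(tokens_per_sec(weights, 400e9))  # GDDR-class VRAM, ~400 GB/s
print(tokens_per_sec(weights, 7e9))    # fast NVMe-class flash, ~7 GB/s
```

The gap between those two numbers is exactly what a purpose-built high-bandwidth flash part would have to close to compete with VRAM on inference.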


Are you saying that Intel's Optane product was just ahead of its time? Is Optane the answer to LLMs' ever-increasing appetite?


Oh oh... Time to say goodbye to Greenland. Let's see what's going to happen to LEGO... Freedom Bricks?


Why do you think there's a connection between the Danish government and LEGO?


Trump has already started talking about taking over Iceland. Where's next?

https://drive.google.com/file/d/1yZA7A1fy8yelNvDK2aVesx24jak...


Do the guys that buy out the market have real use for all the hardware, or is it just hype? A solution against investors trying to corner the market would be to sell virtual hardware. Let them buy as many options on virtual "to be delivered" hardware as they want. We also need an options market for virtual LLM tokens, where the investors can put all their money without affecting real people.


Sam Altman (just yesterday): Codex weekly users have more than tripled since the beginning of the year!

https://twitter.com/sama/status/2023233085509410833

Funny how they all meet at the "You'll own nothing" Swiss club.


One agreement could be that Eisenstein integers are more beautiful...


Currently Debian wants to deprecate GTK2, so even the people interested in stability might start to see problems with Debian. The key problem of Linux is that it doesn't have a stable API to write long-lived GUI software against. So far Debian was the way to go; maybe recommending Debian will become even less popular soon.

