Fringing exists because the fonts are rendered as multi-colored glyphs. That is, instead of one sample per pixel, the glyph is rendered as a 3x1 (or 1x3 for vertical) 'greyscale' image, and then just striped across the subpixels, using the subpixels for their spatial locations.
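A minimal sketch of that striping (Python, with made-up names; it assumes the glyph coverage has already been rasterized at 3x horizontal resolution, and real rasterizers like FreeType also run an LCD filter over the samples to soften the fringes):

    def stripe_subpixels(coverage_3x, width, height):
        # coverage_3x: greyscale coverage samples in [0.0, 1.0],
        # rasterized at 3x horizontal resolution (width * 3 per row).
        out = []
        for y in range(height):
            row = []
            for x in range(width):
                base = (y * width + x) * 3
                # Each subpixel takes the sample at its own spatial location:
                # R gets the left third of the pixel, G the middle, B the right.
                r = coverage_3x[base]
                g = coverage_3x[base + 1]
                b = coverage_3x[base + 2]
                row.append((r, g, b))
            out.append(row)
        return out

The color fringes you see are just neighboring subpixels getting different coverage values at a glyph's edge.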
With DirectWrite and FreeType, subpixel rendering isn't done when rendering in color. On OSX it isn't done at all, since Core Text dropped subpixel rendering.
I suspect what you're really asking is "why does high color contrast look weird at the edges?" Because some monitors are "exceptionally clear and sharp", and people have been selecting monitors for this trait for over a decade.
On LCDs, "good" polarizers make it hard to make out individual subpixels (which also makes subpixel rendering kinda moot on them; you'll see the fringing, but the text won't look any sharper than greyscale, and noticeably less sharp than aliased); instead of "clear and sharp" they're more "natural".
OLEDs and MicroLEDs do not have polarizers, and they're the sharpest monitors I've ever seen. However, good news (at least for me): I can see subpixels on a 1080p 24" during high color contrast (i.e., fringing fonts), but I _cannot_ see them on a 4k 24".
Even if I integer-scaled 1080p to 4k, I would be using an array that looks like...
R G B R G B
R G B R G B
...to represent pixels. I can't see subpixels in situations like this. So, the only way to avoid the problem is to just use HiDPI monitors. 32" 4k seems to be a very common size for Mac users; it causes macOS to do Retina scaling correctly, while also being about 150% the DPI of a 24" 1080p, or about 125% the DPI of a 27" 1440p (the two most common sizes).
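The DPI comparison is just diagonal math, if you want to check it:

    import math

    def ppi(w_px, h_px, diagonal_in):
        # Pixels per inch from native resolution and diagonal size.
        return math.hypot(w_px, h_px) / diagonal_in

    print(ppi(1920, 1080, 24))  # ~92 PPI (24" 1080p)
    print(ppi(2560, 1440, 27))  # ~109 PPI (27" 1440p)
    print(ppi(3840, 2160, 32))  # ~138 PPI (32" 4k)
    # 138/92 ~= 1.50 and 138/109 ~= 1.27 -- the ~150% and ~125% above.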
I'd also recommend never using anything below 4k on OSX. OSX's handling of sub-4k monitors is broken, and usually leads to in-compositor scaling instead of letting apps render natively at LoDPI.
...yet at 4K native on macOS (OS X) I could see fringing. And it was worse than using a slightly lower resolution, scaled up by the OS.
And it's particularly bad on solid color lines and high contrast borders (not fonts). So... that doesn't work for me. Which was the point of the post; I don't like how this particular subpixel pattern OLED monitor looks and it's not for me.
Numbers.app, Autodesk Fusion, Adobe Illustrator, and Terminal.app were the first places I noticed it. And in Fusion and Illustrator it's not text that's the issue but lines/graphics.
And high contrast edges in photos in Apple Photos looked wonky.
Oof, at least two of those apps should not do that. I wonder how Fusion and Illustrator do lines, because last time I touched Illustrator (CS6 era), its line drawing was pretty good.
I'd like to see screenshots of these showing off the weirdness, if you don't mind.
I wanted to see what it looked like on my 24" 1080p IPS monitors (two Dell U2414H IPS, and a rebranded LG FastIPS from Monoprice). I don't own a Mac, so I can't replicate it.
All of those share similar traits: the lines are excessively soft in many cases. They're rendered in linear space and then baked to the target gamma ramp, instead of being rendered in sigmoidal space (or with some other pseudo-sharpening/pixel-aligning-without-sharpening methodology).
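A toy illustration of the difference (Python, illustrative numbers only, not how any of these apps actually render): a 70%-covered edge pixel of a black line on white comes out noticeably lighter after linear blending plus sRGB encoding than a gamma-space blend would give, and a sigmoidal remap of coverage is one way to push it back:

    import math

    def linear_to_srgb(c):
        # Standard sRGB encoding of a linear-light value in [0, 1].
        return 12.92 * c if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

    def sigmoidal(a, k=8.0):
        # Hypothetical contrast curve: pushes coverage toward 0 or 1,
        # pseudo-sharpening an edge before it's blended.
        return 1.0 / (1.0 + math.exp(-k * (a - 0.5)))

    a = 0.7  # coverage of an edge pixel: black line over white, 70% covered
    print(linear_to_srgb(1 - a))             # ~0.58: lighter/softer than the 0.3
                                             # a gamma-space blend would give
    print(linear_to_srgb(1 - sigmoidal(a)))  # ~0.45: the sigmoid re-darkens the edge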
The font in Numbers is extremely misrendered; that's soft even by Core Text standards. It's as if it had absolutely no hinting applied, instead of the kind they use that approximates FreeType's "light" hinting.
The Terminal.app one is okay, but not amazing, as you can see slightly misshapen stems, such as in the M of Makefile.
So, I'm not sure the monitor was at fault, but given the "clear and sharp" nature of OLEDs, it certainly magnified the effect.
Yeah, and I'm not terribly interested in getting into the details of how everything renders... I just want a display that works and doesn't make my eyes feel funny.
The PA27JCV (which I don't expect to have back from warranty repair for 3+ weeks) looked fine, and I'm now at day 5 of using the U3223QE and it's fine. So this is my solution to the problem I guess.
From what I can tell from photos of your new monitor's pixels, it has a polarizer of similar taste to my Dell U2414Hs', just much newer. It aims for natural reproduction, which means pixels aren't sharply defined from their neighbors, and subpixels blend together.
I prefer monitors like these, so I can't really argue with your choice. Sadly, because a lot of younger kids were raised on phones (which have exceptionally sharp screens), modern high-end screens keep being pushed towards sharp and clear to a fault.
Apple refuses to adjust its rendering, since Apple's own taste in screens prefers natural over sharp. Even their OLEDs clearly have a film on them to hide the subpixel misalignment; the side effect is that brightness and contrast are lower, but eye fatigue is also lower.
Unfortunately, this is why I won't take OSX seriously: I bought an MBPr many years ago, and I really tried to like OSX and to understand why people like it, but ultimately it's death by a thousand cuts, and entirely Apple's fault.
That MBPr ran OSX for a year, then Windows 10 a bit, and then Linux until it died. The text rendering was only fatigue-free on Windows and Linux, OSX had always been too fuzzy, especially with dark themes.
If you're not willing to break up with Apple, yeah, you're stuck just buying Apple-friendly monitors. A lot of OLEDs are just too clear and sharp and I don't disagree with you on sending it back.
With the advent of new RGB-stripe (three columns, like most LCDs) OLEDs, I wonder if Apple's next high-end display is going to use one. It'd be a whole bunch of things aligning for a good ecosystem.
And I know this is a whole lot of personal preference, but I like macOS. It works well for me. It's a good UNIX(-like?) with professional-level apps.
I support/maintain/use Windows systems for a living so I'm comfortable there as well, and I'd be mostly fine on a Linux but the lack of pro-level apps for some of my hobbies (namely, map making) and sufficiently-user-friendly equivalents for a few other apps (eg: rubiTrack, Hazel, Photos.app) is a problem.
(A bunch of years back I made a conscious choice to do less sysadmin-ing at home, even if I have to pay a bit more. It's freed up mental capacity for using computers as a means to an end vs. an end itself. And it means I don't have the flexibility of Linux or other OSS things at times, but I've been able to work within that. But I'm getting way off topic here...)