Display text scaling absurdity with a 4K 27" display — and a solution

There are many rants like this, but this one is mine.

I recently started a new gig (I’m a professor now, which is fun, and challenging). Among numerous setup issues, I had to get the university to buy a new display. I didn’t want to burn through my professional development budget by getting a Studio Display (yes, I am jealous of yours). So, I did a bunch of research and got a 27" LG 4K display. This one. It was even on sale. Great!

It arrived, and I set it up, plugged in the single-USB-C cable, felt magical, etc.

Then I looked at some text. Woof!

I ended up spending an unspeakable number of minutes over the succeeding week changing display settings this way and that, reading about HiDPI and scaling and 1:1 ratios and the display sweet spot. I tried four different resolutions within the sweet spot. I entered defaults write -g CGFontRenderingFontSmoothingDisabled -bool NO into the terminal, because apparently “Apple has disabled Subpixel antialiasing for text in macOS Mojave.” Later, I deleted those same font smoothing settings, because apparently they can be at fault. A half-dozen times I considered returning the display and waiting to spend $2000 CAD on a Studio Display in the new fiscal year.
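For anyone who wants to retrace those steps, here are the two font-smoothing commands in question, the write quoted above and the delete that undoes it. These are macOS-only configuration commands; changes generally take effect after you quit and relaunch apps (or log out and back in):

```shell
# Force font smoothing back on globally (the command quoted above);
# -g targets the global (all-apps) defaults domain.
defaults write -g CGFontRenderingFontSmoothingDisabled -bool NO

# Later, to undo the override entirely and return to stock behaviour,
# delete the key (this is what "deleted those settings" amounts to).
defaults delete -g CGFontRenderingFontSmoothingDisabled
```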

Finally, finally, I found an advanced setting hidden in the app Better Display:

I enabled those toggles, applied them, and voila. Everything looks … like a good display!

I am discombobulated by how confusing this is. I can’t imagine it’s a good thing for the average user to buy an average display, hook it up to their better-than-average new Mac, and see extremely disappointing image quality.

Surely I am not the only person who thinks it is a bit odd that Apple’s solution is to buy a display that costs as much as your laptop … especially when a software solution exists!

I don’t know what the lesson learned is here, but I hope Better Display and those settings above will help someone else in the future. (Let’s be real: that person is probably me in a few years after I repress these memories and buy another 4K display.)


I’ve got a 27" iMac (5K display) alongside a 27" LG 4K display. The iMac display runs at the default 2560x1440 resolution, which is exactly half the panel resolution in each dimension, so the scaling is perfect. The problem is, of course, the adjacent 4K display. But I just run it at the same 2560x1440 resolution so as not to go crazy moving windows back and forth.

In the case of the 4K display, the scale factor is non-integer, so averaging has to occur: basically what “font smoothing” did, but for the entire screen.

To my eyes I can’t see the pixels on either, so they are both “Retina” and at that point the actual resolution doesn’t matter. There are performance arguments against the non-integer scaling, but computers are fast these days and I don’t game.


What resolution did you end up running it at? 2560 x 1440, with this smoothing turned on?

I have exactly the same display and just set my display settings to 2560x1440 and do not use any other tools. Text is sharp for my eyes.

I have the LG HDR 4K model. Comparable specs to what is posted here it seems. I am driving it from a 16in MBP (Intel processor). I have found that the quality of the USB-C cable matters, as does the Picture Mode setting. Not to mention using “computer distance” glasses. I’ve not had to struggle with reading text even at my standard scaled 3360 x 1890 resolution setting (versus the 3840 x 2160 that is the monitor’s default).


You’re showing an Ultrafine there, which costs about an iPhone more than the display I’m using. (You have a 5K resolution whereas this one is 4K.)

I’ve wondered that. I’m using the bundled cable but have thought about buying something “better” third-party.

No it’s not; it’s exactly the same 4K monitor, but it uses HiDPI by default. It seems your monitor is not recognised in the same way as mine is.

By default it uses a 1920x1080 resolution (50% of 4K), but if I switch to 2560x1440 it automatically uses a HiDPI resolution. I also have a 27UL500, which does exactly the same.

See the screenshot below of when I switch to the default 1920x1080 resolution:

Fascinating. I was able to get a similar output in System Information using similar settings, but the text looked … scratchy, to me. Maybe I am just being picky — or imagining things.

For what it’s worth, I use an app called Display Menu (I’m sure there are others) to make sure that I’m using the Retina mode for my scaled resolutions. I have a pair of 4K displays that I scale to 2560x1440 and they look really very good. The better of the two looks very similar to a nanotexture Studio Display, but not as sharp as the non-nano version.


I have read some articles in the past where the authors also had the issue that macOS did not recognise the display correctly and did some weird text rendering.

Or maybe you are indeed more picky than I am 🙂


I’ve had issues with my LG 27UN83a-W. It’s 4K, but here’s where it gets a little weird:

In the system report it tells me it’s both 4K and 5K (it isn’t), and it gives me a resolution that isn’t even available in Display settings: 5120x2880.

It’s made me think that Apple doesn’t really care whether it works if it’s not an Apple display.

This is because of how macOS scales when it is in HiDPI mode. If you have a 4K display and use a “looks like” 2560x1440 resolution, macOS renders everything to a 5120x2880 backing store and then downsamples that to the panel’s native 3840x2160. This is also the reason there is a (very small) performance penalty, and why the quality is not exactly the same as on a true 5K display, where the 5120x2880 backing store maps one-to-one onto the panel.
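The arithmetic behind that scaling can be sketched in a few lines of shell. This is a back-of-the-envelope illustration, assuming a 4K panel set to a “looks like” 2560x1440 mode with macOS’s usual 2x backing store:

```shell
# "Looks like" resolution chosen in Display settings:
looks_w=2560; looks_h=1440
# Native pixels of a 4K panel:
panel_w=3840; panel_h=2160

# macOS renders the HiDPI backing store at 2x the "looks like" size.
back_w=$((looks_w * 2)); back_h=$((looks_h * 2))
echo "backing store: ${back_w}x${back_h}"   # 5120x2880

# Downsampling ratio: rendered pixels per native panel pixel along one
# axis. 1.00 would be a perfect integer mapping (as on a 5K panel);
# here it is fractional, hence the averaging/softness discussed above.
awk -v b="$back_w" -v p="$panel_w" 'BEGIN { printf "ratio: %.2f\n", b / p }'
```

On a 5K (5120x2880) panel the same calculation gives a ratio of exactly 1, which is why the 5K display sidesteps the problem entirely.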