4K on the desktop might sound like overkill to some computer users. It's not. Here's why.
To anyone who thinks that anything beyond HD resolution is a waste of time, I suggest going through the experience I've just had. Let me explain.
At home, I use a year-old MacBook Pro 13" computer. I also have a 32" Iiyama 4K monitor, which I plug into the MacBook, and it works very well indeed. It gives me the equivalent of four 16" 1080p monitors arranged in a 2 x 2 grid. Adding the MacBook's own screen gives me effectively five monitors powered from one small laptop.
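For the arithmetic behind that claim, here's a quick back-of-the-envelope sketch in Python (assuming a standard 16:9, 3840 x 2160 UHD panel):

```python
# A 32" UHD screen split into a 2 x 2 grid: each quadrant is a Full HD
# panel with half the diagonal, i.e. effectively a 16" 1080p "monitor".
uhd_width, uhd_height, uhd_diagonal = 3840, 2160, 32.0

quadrant = (uhd_width // 2, uhd_height // 2)   # (1920, 1080)
quadrant_diagonal = uhd_diagonal / 2           # 16.0 inches (same aspect ratio)

print(quadrant, quadrant_diagonal)             # -> (1920, 1080) 16.0
```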
Is 4K on a desktop excessive? Absolutely not. It gives me the screen real estate I need as a writer, with multiple documents open all the time and the ability to cut and paste easily between them. By the time you have Skype, Slack, Google Docs, Evernote and four or five Chrome windows open, each with at least 20 tabs, you'd struggle with a single laptop screen.
It is also fantastic for watching videos. Obviously, you sit pretty close to the screen, which means you see the difference between HD and 4K videos very starkly. Some of them are an absolute joy, including - perhaps surprisingly - simple YouTube channels where you might have someone talking about specialist subjects like synthesisers or how to fix iMacs.
My Iiyama monitor is good enough for most "ordinary" video, but not for HDR. For that, I have a second 32" monitor, an ASUS PA32UCX. It's several times the price of the Iiyama and worth every penny. Obviously, it's not for general computer use at that price: it's a specialist monitor. It's meant for colour grading and other critical work and has much greater contrast, a locally dimmed backlight, 10-bit video and most current forms of HDR.
And, yes, I'm fortunate to view text and video with such clarity and precision. And as you'd expect, I'm absolutely conditioned to working (that is, writing) with such high-resolution monitors. And long may it stay that way.
Back to 1080p
Except that for the past two weeks, it hasn't. I've had to work away from home, and I couldn't take those monitors with me. Instead, what I do have is a couple of pretty average 24" 1080p monitors. I won't mention the manufacturer because it doesn't matter. They're OK, but they're certainly not capable of HDR, and you honestly wouldn't choose them as your main monitors if you had anything more than the most basic budget.
But again, for text, they're perfectly OK.
At least that's what I thought until I started using them. Which is when I realised that I've been spoiled by working in 4K.
So while I was away from home, I still needed an extra monitor - a 13" MacBook screen was never going to be enough for my work. I plugged one of the 24" monitors into the MacBook and thought something was wrong. The text was blurry and blocky. I tried every setting on the Mac and the monitor, but there was no improvement.
And then I realised. Full HD on a 24" monitor is nothing like 4K on a 32" monitor. With significantly fewer pixels per inch, the difference was obvious and not at all comfortable to look at. In fact, I've ended up preferring the screen on the Mac, which is smaller but has a much higher resolution than HD, although still not 4K. So it seems that Apple's "Retina" description of its screens actually does mean something. The text looks small but crisp and clear, with no sign of pixels.
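To put rough numbers on "significantly fewer pixels per inch", here's a quick sketch of the pixel densities involved. The MacBook figures assume the standard 13.3", 2560 x 1600 Retina panel; the others are the sizes and resolutions quoted above.

```python
from math import hypot

# Pixels per inch = diagonal resolution (in pixels) / diagonal size (in inches)
screens = {
    '24" Full HD':        (1920, 1080, 24.0),
    '32" 4K (UHD)':       (3840, 2160, 32.0),
    '13" MacBook Retina': (2560, 1600, 13.3),
}

for name, (w, h, diagonal) in screens.items():
    ppi = hypot(w, h) / diagonal
    print(f"{name:>20}: {ppi:4.0f} ppi")
# -> roughly 92, 138 and 227 ppi: the 24" panel has by far the coarsest pixels
```

At around 92 ppi you can pick out individual pixels from normal desk distances; at well over 200 ppi you generally can't, which is essentially what Apple's "Retina" label is getting at.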
How it relates to video
You'd be perfectly entitled to say at this stage that this doesn't have anything to do with watching video, and you'd be right up to a point. Looking closely at a screen full of static text is an entirely different process to watching moving video. Video is subject to the same rules of resolution - the more pixels, the better - but there's also temporal resolution, which is the reason why a terrible video format like VHS was watchable at all.
If you've ever tried grabbing a frame from a VHS tape, it will - I can say this with some certainty - look terrible. But when you show these terrible frames one after another 25 or 30 times per second, they look, well, less terrible. That's because multiple frames contain more information than a single frame and can largely "average out" the shortcomings of a still frame.
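That "averaging out" is easy to demonstrate. Here's a minimal sketch - not a model of how VHS or the eye actually works, just the underlying statistics - showing that if each frame carries the same image plus its own independent noise, averaging 25 of them cuts the error by roughly a factor of five.

```python
import numpy as np

rng = np.random.default_rng(0)
clean = rng.random((240, 320))                      # stand-in "true" image
frames = [clean + rng.normal(0, 0.2, clean.shape)   # 25 noisy "frames"
          for _ in range(25)]

single_error = np.abs(frames[0] - clean).mean()
averaged_error = np.abs(np.mean(frames, axis=0) - clean).mean()
print(f"single frame error: {single_error:.3f}")
print(f"25-frame average:   {averaged_error:.3f}")  # roughly 5x smaller
```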
So I'm definitely not saying that HD is no good for video, especially on a small monitor. But I am saying that a higher pixel count is noticeable, and significantly so.
The fact is that we don't watch TV on a small screen across the living room anymore. If we're serious about it, we either watch TV on a huge screen across the room or - increasingly - we watch it on a computer screen, sitting between 18 inches and two feet away.
In either case, the resolution - and mostly this comes down to the difference between HD and 4K - is unmistakable. I have a 65" TV, and I can immediately tell whether I'm watching 4K or HD. So much so that I seek out 4K material, and where there's a choice between HD and 4K, I'll pay extra (within reason!) to watch the higher quality stream.
The rest of the time, I watch on my 32" monitors, sitting close up and, again, very aware of whether I'm seeing HD or 4K.
And as I've often said, it's not just a question of sharpness. That feeling of disappointment when I see HD text looking blocky and indistinct on my monitor applies to video too. It's not that you want to see every individual blade of grass; it's as much about smoothness and subtlety as it is about pin-sharp detail. It brings us closer to nature.
As you'll see in a subsequent article, the whole point of digital video, as it improves to previously unimaginable levels of quality, is to look like analogue again - not like an analogue camera, but like nature itself, which is naturally analogue. One simple step is to buy a 4K computer monitor. You don't have to spend a lot, and you definitely won't regret it.