Wow, when I asked Is 4K BS? three days before Christmas, I didn’t expect much of a reaction, but is that little piece ever popular. A bunch of useful follow-ons appeared in the comments and on G+ and Twitter, so here they are.
Never mind 4K; lots of 1080p screens are already being wasted because of overaggressive or poorly implemented upstream compression by the broadcasters.
I really notice this on live sports. Some Sunday, when there are 3 or 4 different NFL games on, switch between them; if your sources are like mine, some will have way better pictures than others. And the NHL is particularly bad; sometimes hockey games look all creamy and impressionist, like one of my pictures in Lightroom when I crank the noise reduction a little too hard.
Whether or not 4K makes sense for TV, it probably makes excellent sense for computer monitors. The 30" Dell I’m typing this on has big fat ugly pixels and I’d like not to see them.
Also, if 4K screens become ubiquitous, the manufacturing cost may get driven down to the point where it doesn’t make sense not to buy 4K.
Also, 4K is well on its way to being the preferred format for professional video capture. Among other things, it lets you do huge video walls and so on.
Adrian Cockcroft pointed out that “The ultraHD 4k standard includes more pixels, increases color depth to 10 or 12 bits per pixel from 8, and doubles the frame rate to 60Hz. The thing that is most noticeable to the eye is the 60Hz refresh.” Also, he noted that it doesn’t come with a storage format like DVD or Blu-Ray; the assumption is, it’s all about streaming.
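Some back-of-the-envelope arithmetic, using the figures in Adrian's quote (the 10-bit RGB framing is my assumption), suggests why a disc format barely makes sense:

```python
# Raw UltraHD data rate, from the numbers in Adrian's quote; 10 bits per
# channel over three channels is an assumption for the sake of arithmetic.
pixels_per_frame = 3840 * 2160        # UltraHD frame
bits_per_pixel = 10 * 3               # 10 bits/channel, three channels
fps = 60                              # the doubled frame rate
raw_bps = pixels_per_frame * bits_per_pixel * fps
print(f"raw rate: {raw_bps / 1e9:.1f} Gbit/s")        # ~14.9 Gbit/s
seconds_on_bluray = 50 * 8 / (raw_bps / 1e9)          # 50 GB dual-layer disc
print(f"a 50 GB Blu-ray holds {seconds_on_bluray:.0f} s uncompressed")  # ~27 s
```

Uncompressed, a dual-layer Blu-ray holds about half a minute of UltraHD; the whole format only works with heavy compression, so streaming is the natural delivery channel.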
The high-def pixel-count race is amazingly like the camera-megapixel race, which is mostly over, thank goodness. For most practical purposes, we have plenty enough megapixels, and camera makers are turning their attention to more important things like speed, ergonomics, lenses, and sensitivity. Time for screen-builders to do that too.
Chris Swan linked to 4K Resolution Is Visible vs 1080p on 55″ TV from 9′ Viewing Distance in HDTVtest.co.uk; I was actually fairly unimpressed by the quality of their research, but there were fascinating notes about what really matters in video quality (tl;dr: Blacker blacks).
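For context on that headline, here's the arithmetic the skeptics lean on, with assumed figures: a 55-inch 16:9 panel, a 9-foot viewing distance, and the usual one-arcminute estimate of normal visual acuity:

```python
import math

# Angular size of one pixel for the setup in the headline (assumed figures:
# 55-inch 16:9 panel, 9-foot viewing distance, ~1 arcminute acuity).
diag_in = 55.0
width_in = diag_in * 16 / math.hypot(16, 9)   # ~47.9 inches across
distance_in = 9 * 12                          # 9 feet

for name, cols in (("1080p", 1920), ("4K", 3840)):
    pitch = width_in / cols                                   # pixel pitch, inches
    arcmin = math.degrees(math.atan2(pitch, distance_in)) * 60
    print(f"{name}: one pixel subtends {arcmin:.2f} arcminutes")
# 1080p: ~0.79' (right around the acuity limit); 4K: ~0.40' (well below it),
# which is why a visible difference at that distance is a surprising claim.
```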
People would like to see an experiment where you draw a non-anti-aliased diagonal line and look for visible squiggles. My feeling is that the result should be what the math predicts, but it’d be fun to try.
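A minimal sketch of that experiment, assuming Python with the Pillow imaging library; render at your panel's native resolution and view the file at 1:1:

```python
from PIL import Image, ImageDraw

# Draw a single-pixel diagonal line with no anti-aliasing. A shallow slope
# makes each stair-step as long, and as visible, as possible.
W, H = 1920, 1080                     # set to your panel's native resolution
img = Image.new("L", (W, H), 255)     # grayscale, white background
draw = ImageDraw.Draw(img)
draw.line((0, H // 2, W - 1, H // 2 + 40), fill=0, width=1)  # Pillow's line() is not anti-aliased
img.save("jaggy_line.png")
# View full-screen from your normal distance; if the arithmetic above holds,
# the steps should be near the threshold of visibility on 1080p.
```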
And everyone agrees: The biggest problem with TV isn’t the pictures, it’s the shows. Which is a little weird, because we are also generally agreed to be in TV’s golden age. But still, lots of times when I feel wiped and want video wallpaper, nothing’s on.
From: J. King (Dec 23 2013, at 12:34)
Tim, I'd love to see sources on agreement that we live in TV's golden age. If one agrees that TV's twilight is of a particularly lovely shade of gold, then sure, but it would seem to me that television, as a medium, is about on equal footing with radio: ubiquitous and taken as a given, but increasingly irrelevant.
If we're talking about television as a format i.e. an hour (usually minus fifteen minutes or so) or half-hour of video of usually scripted programming irrespective of delivery timing or mechanism, that's another argument, but if we're all in agreement there's nothing worth watching (I certainly fall into that camp: I watch one show, and cry because it's only slightly better than bad) how can we be in a golden age? I'm baffled how anyone could make the argument.
Or am I just being dense?
From: Dave Walker (Dec 23 2013, at 13:36)
Adrian makes a very interesting observation about there being no recording / playback medium for 4k.
This makes me wonder 2 things:
1. Will Joe Kane Productions produce a 4k version of their long-running "Video Essentials" test and calibration content?
2. Will it be made available purely as a download (and, as a streaming test, might Adrian get it uploaded to Netflix)?
;-)
From: Bob Monsour (Dec 23 2013, at 14:15)
Here's one of the best ads for a 4K TV that I have seen so far.
http://www.youtube.com/watch?v=hCLYh4JSus8
I agree that 4K is coming, whether we want/need it or not. There are certainly great applications for it, but I agree that you likely need a pretty large TV at some distance to make it a great viewing experience.
From: Mike Kozlowski (Dec 23 2013, at 15:28)
To do that experiment on your phone/tablet: Go to Chrome's URL bar and type "www". I find it very easy to see the jaggies on the "w" at normal reading distance on a 300-ish DPI device (720p phone, 2013 Nexus 7, Nexus 10). I don't see them on 1080p phones (Nexus 5).
And this isn't at all surprising -- everyone knows that 300 dpi laser printers were full of jaggies (which is why 600 dpi laser printers were such a big deal), so 300 dpi unaliased pixels should ALSO be full of jaggies.
Any math that believes 300 dpi unaliased black-and-white text is jaggie-free is just really obviously at odds with reality.
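Mike's numbers are easy to check with the same arcminute arithmetic as above, assuming a 12-inch reading distance:

```python
import math

# 300 dpi at a 12-inch reading distance (assumed), same arithmetic as above.
pitch = 1 / 300.0                                   # inches per pixel
arcmin = math.degrees(math.atan2(pitch, 12)) * 60
print(f"{arcmin:.2f} arcminutes")                   # ~0.95', right at the acuity limit
# An un-anti-aliased step is a high-contrast edge, and the eye localizes
# edges (vernier acuity) more finely than the ~1' point-resolution figure,
# so visible jaggies at 300 dpi are consistent with the math after all.
```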
From: Gordon Haff (Dec 23 2013, at 17:25)
@J King
I'm obviously not going to try to argue that you're in any way wrong about your TV preferences, but there are IMO a lot of good shows on, now and over the past decade. Many are on cable (or even Netflix) rather than network. And I don't watch anything in real time, partly for that reason. I probably only regularly watch 5 or so shows at a given time, which gives me a lot of latitude to ignore the not-so-good stuff, of which there has always been a lot.
Genres do come and go though. There's not much in the way of good SF at the moment, which is a pity.
From: Bob (Dec 26 2013, at 03:53)
Comfortable viewing distance is influenced by how screen images are composed. Movies used to assume large-screen presentation. Much of the image was often occupied by sets and ancillary atmospheric content; primary actors or content only occupied a fraction of the image area. TV image composition assumed small-screen presentation: e.g., closeups of faces took up the entire image area.
Today, many (all?) movies use shots that are composed for small-screen viewing (either home TV or tiny shoebox-16 multiplex movie screens). Allowing primary action to occupy a huge amount of the frame translates into an uncomfortable viewing experience if that frame subtends a large angle at the eye (e.g., on-screen facial features loom over the viewer).
If consumers are going to be given 4k or 8k, then to "see" that resolution they need to have the frame subtend a large angle at the eye. Either they sit very close to the screen or they have very large screens. Viewers will resist this so long as images continue to be composed for small-screen viewing.
Digital bigots have long contended that 4k is an adequate replacement for 35mm film. If we are now able to deliver 4k into both theaters and residences, perhaps we can prevail on the directors to resume creating content with images composed for "large screen" viewing. Without this change viewers won't feel comfortable (either with big screens or sitting close enough to see 4k on a small screen), and consumer 4k adoption will flounder.
From: Daniel Black (Dec 26 2013, at 13:11)
Curious that it's generally agreed we're in TV's golden age, unless we're including streaming (and specifically the likes of Netflix's and Amazon's productions). I kind of feel like the opposite: that people are tired of TV as a confluence of horrible, horrible reality shows, underwhelming or outright unpalatably vapid news programming, and a handful of decent scripted material amid the predictable dregs of bad plots and bad acting. I would've taken the 90s to be the golden age, given the David E. Kelley stuff, Friends and NBC's other Must See TV stuff of the time, and fairly credible news programming.
From: Nik (Jan 10 2014, at 07:20)
It seems like 4K will be the standard because manufacturing costs will drop to the point where it doesn't make sense to produce HD (remember 720p and 1080i?).
I'd seriously question the research that came up with the human eye resolution angle.
Except it doesn't really matter, because this format will be streamed and will be limited by the bit rate and encoding much more than by the number of pixels.
I am getting English Premier League soccer in HD and the bit rate and/or encoding are not up to it. When they show a player who is not moving too much, it's a brilliantly clear high-res picture.
If the same player is in full action on the ball, moving as fast as he can, his face turns into a skin-colored blur. Most details get lost. Yet those are clearly the more interesting scenes...
I would imagine some of this is due to the encoders - encoding at high quality isn't easy and they have to do it in (near) real time. And some of it is due to an actually too-low bitrate.