I hear that Sony & friends are going to start telling us that our HDTVs aren’t good enough and we all need to upgrade to 4K (which is twice the dimensions and 4 times the pixels of 1080p). NBC News says the experts are unconvinced, and quotes one of them: retina scientist, photographer, and blogger Bryan Jones. I thought I’d do the numbers and yeah, I think it’s probably BS.
[Update: Wow, this piece touched a nerve and generated a ton of interesting follow-ons, which I wrote up in More Things About TV.]
In Jones’ widely-quoted piece Apple Retina Display, he argues that the literature shows the human eye has an angular resolution of about an arcminute (1/60 degree). So, sitting in front of an NFL game, it occurred to me to wonder how far apart, in arcminutes, the pixels in my TV are. Let’s call Arc-Minutes per Pixel AMpP for short; if it’s less than 1, that should mean that our eyes can’t distinguish pixels, which is the goal.
I’ve got a 42" screen and sit 7½ feet from it. Per this TV Size Chart, its height is 20.6 inches, and the total angle subtended, assuming my eye is somewhere near the vertical middle of the screen, is (in radians):
angle = 2 × arctan((height/2) / distance)
Then of course the AMpP is:
(angle / resolution) / (π / (60 × 180))
Here π / (60 × 180) is just one arcminute expressed in radians.
Which I think the following chunk of Ruby does.
# Height of a 16:9 screen as a fraction of its diagonal: 9 / sqrt(16**2 + 9**2)
def tv_height(diagonal)
  diagonal * 0.490261259680549
end

# One arcminute, in radians
ARC_MINUTE = Math::PI / (60 * 180)

# Arcminutes subtended by a single pixel, given the screen diagonal in inches,
# the viewing distance in feet, and the vertical resolution in lines.
def pixel_arcminutes(diagonal, distance, resolution)
  height = tv_height(diagonal)
  tv_angle = 2 * Math.atan2(height / 2, distance * 12.0) # distance in inches
  (tv_angle / resolution) / ARC_MINUTE
end

diagonal_in_inches = ARGV[0].to_f
distance_in_feet = ARGV[1].to_f
resolution_in_lines = ARGV[2] ? ARGV[2].to_f : 1080.0
p_in_a = pixel_arcminutes(diagonal_in_inches,
                          distance_in_feet,
                          resolution_in_lines)
puts "1 pixel in arcminutes: #{p_in_a}"
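As a sanity check, here’s a standalone re-run of the same arithmetic for my own setup (42", 1080p, 7½ feet); the inline constants just mirror the script above:

```ruby
# Standalone sanity check: 42" 1080p screen viewed from 7.5 feet.
height = 42 * 0.490261259680549                # 16:9 height in inches
angle  = 2 * Math.atan2(height / 2, 7.5 * 12)  # total vertical angle, radians
ampp   = (angle / 1080) / (Math::PI / (60 * 180))
puts ampp.round(2)                             # prints 0.73
```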
If I have the math right, then from where I sit, on my TV, my AMpP is 0.73. Yay!
But my 42" set, as big as we can fit in our video cave, is hardly state-of-the-art. There’s a suggested-TV-size calculator at HDTVtest.co.uk, which suggests that a real video weenie ought to have a 55" screen at 7½ feet, which gives you around 0.95 AMpP. Hmm, I bet the math behind that site is designed to make the AMpP come out around 1.0.
At my distance, I’d need a 60" set before the AMpP hits 1.0 (at which point I’d be worrying about brain damage). So my take-away is that the 1080p TV standard is a pretty good match for our eyes.
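To see where the crossover lands, here’s a small sweep over screen sizes at my 7½-foot distance; it’s a sketch reusing the formula above, and the `ampp` helper is just a condensed version of `pixel_arcminutes`:

```ruby
# AMpP for a range of 16:9 1080p screens, all viewed from 7.5 feet.
def ampp(diagonal, feet, lines = 1080.0)
  height = diagonal * 0.490261259680549           # 16:9 height in inches
  angle  = 2 * Math.atan2(height / 2, feet * 12.0) # distance in inches
  (angle / lines) / (Math::PI / (60 * 180))
end

(40..65).step(5) { |d| printf("%d\": %.2f AMpP\n", d, ampp(d, 7.5)) }
```

At this distance the sweep shows AMpP crossing 1.0 somewhere between 55" and 60".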
The experts in the NBC article at the top suggest TV designers should worry more about color accuracy and dynamic range. I think a much bigger problem is that there’s often nothing on worth watching.
[Thanks to Bryan Jones for glancing at my math. He made a bunch of good points that deserve space of their own; I hope he writes them down.]
Comment feed for ongoing:
From: Andrew Ducker (Dec 22 2013, at 15:16)
The chart on this page is pretty useful:
http://carltonbale.com/1080p-does-matter/
Given the size of your TV and distance you are from it, it'll tell you the maximum resolution you can see.
At 7.5ft you need to have a 60" TV to even begin to see a difference. Which, I think, would give me a headache...
[link]
From: Johnny (Dec 22 2013, at 17:46)
It all depends on screen size and viewing distance (unsurprisingly). In the projector/screen world, screen sizes of 100"+ diagonal viewed from 15-20ft away aren't that uncommon, so 4K is justified to exist, at least for these.
[link]
From: Tim (Dec 22 2013, at 18:55)
Sorry, Johnny, a 100" screen at 15' produces an AMpP of 0.86.
[link]
From: Anthony (Dec 22 2013, at 19:47)
It's very true that for a home television, 4k is akin to 3D;
However, in video production for live events 4k _is_ the future. Now it means we can drive an 80' 6mm LED wall from a single 4k output.
Usually something that big is projected, but LED wall configurations are getting more and more challenging these days.
[link]
From: Chris Swan (Dec 22 2013, at 22:20)
Some discussion of side by side field testing - http://www.hdtvtest.co.uk/news/4k-resolution-201312153517.htm (complete with comments that other factors beside resolution might influence perception of 'best').
PS I've been running this math for over 20 years since first encountering it on a 2D DSP course, which is why I'm looking forward to 4k on my desk more than in my living room.
[link]
From: Gavin B (Dec 23 2013, at 00:03)
Tim, you need to take into account how small the fovea is and how much the eye moves while viewing. So in the end, absolute acuity is no longer much of a guide to perceptual quality, except if you fancy watching NBA with a bite-bar[1] in your mouth (i.e. no popcorn!)
[1] http://link.springer.com/static-content/lookinside/379/art%253A10.3758%252FBF03210613/000.png
[link]
From: Jay Carlson (Dec 23 2013, at 05:12)
Greater-than-retina resolution can still be useful: you can trade spatial resolution for bit depth. Many LCDs have traditionally traded temporal resolution for bit depth. (This was acceptable on a Dell laptop with an 18bpp panel, since the replacement shrunk the screen from 1920x1200 to 1080; death to 16:9!) You're sick of hearing about this genre, but a lot of anime has banding problems. 10-bit-per-component is the obvious solution, but existing BRs seem happy to use up a huge amount of bandwidth coding spatial&temporal noise to get dither.
There are a lot of trades which make sense for "desktop" monitors which are bad for motion pictures and vice versa.
[link]
From: Scott Laird (Dec 23 2013, at 06:02)
I suspect that 4K is a bigger deal for monitors than for normal TVs. I bought a Seiki 39" 4K TV for use as a monitor a couple weeks ago and *WOW*, it's been an awesome improvement over the dual 24" monitors I used to use. At $550, it was also cheaper than each of the 24s were when I bought them. So, at the moment, I'm all in favor of 4K TV--it'll make high-res monitors cheaper.
[link]
From: Adrian Cockcroft (Dec 23 2013, at 07:55)
The ultraHD 4k standard includes more pixels, increases color depth to 10 or 12 bits per pixel from 8, and doubles the frame rate to 60Hz. The thing that is most noticeable to the eye is the 60Hz refresh.
The most interesting thing about 4k is that it is the first video standard designed to be delivered over the Internet. There is no blu-ray like disc for it. So internet video streaming will be the highest quality available, rather than the low quality option it has been historically.
[link]
From: John Roth (Dec 23 2013, at 09:17)
The human visual system does an incredible amount of processing between the eye and what we "see." A lot of those image processing tricks you see in cameras or in image processing software got their inspiration from studies of our own visual system.
The only real test is whether you can see a difference at your normal viewing distance with programming specifically designed to take advantage of the screen resolution, color depth and refresh rate.
If the average person can, or can be persuaded that they can, see a difference, those things are going to sell. Otherwise the market will start punishing them.
[link]
From: Johnny (Dec 23 2013, at 10:50)
Hi Tim, I did say 100"+ ;) At 100" diagonal projection people may want to be closer to the screen, maybe even at 10'. No universal consensus here, but consistent with the last two recommendations here (diagonal x1.2 - diagonal x1.6):
http://en.wikipedia.org/wiki/Optimum_HDTV_viewing_distance#Fixed_distance
So:
$ ruby pixel_angle.rb 100 10 1080
1 pixel in arcminutes: 1.28280947370954
For my setup I was pretty sure my 6 year old 720p projector is in acute need of an upgrade even without checking the numbers, but did it now just in case:
$ ruby pixel_angle.rb 160 15 720
1 pixel in arcminutes: 2.04871126809546
$ ruby pixel_angle.rb 160 15 1080
1 pixel in arcminutes: 1.36580751206364
So even 1080p would be sub-optimal (at least in theory) for my setup, but probably acceptable and a noticeable improvement over my current 720p.
[link]
From: Bob H (Dec 24 2013, at 07:11)
The test in the UK is interesting because it shows that in a blind test consumers could see the difference. My problem with this is that we don't entirely know what part the quality of the content played.
I work in the industry and I can tell you *most* consumers can't tell SD from HD, let alone '4k'. One of the major quality factors is the encoding. Netflix have announced a 4k service, but considering quite a few movies are actually 2k this is quite strange. Also, they say they will transmit at 10-16Mbit; my experience is that because of bandwidth costs, companies quickly need to reduce the bitrate. Look at broadcast television: the bitrates there are a race to zero.
People will be sold 4k using fancy in-store demos at super high bitrate and low movement, but the experience at home will be very different.
Finally, many experts that I am speaking to, especially at BBC R&D, are saying that the framerates aren't sufficient yet. Temporal resolution is a significant factor that no one seems to want to address, because the plasma companies have already over-sold that promise. There are dissenters in the CE and broadcast industries insisting we need genuine 150-300fps, not just on 4k but also on HD. When we have this we can truly experience quality, and at almost no extra overhead (higher framerate does not equal higher bandwidth in compressed systems above 50Hz).
Bob
[link]
From: Petri (Dec 26 2013, at 05:28)
I had the pleasure of testing Sony's 55-inch UHDTV for a few days. To see how people perceive the difference between 4K and FHD I set up a test session with 5 friends. The test was somewhat flawed in that I used the same TV for both FHD and 4K content. Even though I disabled all fancy post-processing for FHD content, I'm sure the results would have been different had I used native FHD and 4K displays for the comparison.
Test set-up: trailer for movie "Flight" in FHD from Blu-ray and in native 4K from a REDRAY player. Subjects waited in another room until set-up was completed, then asked to come in one at a time and seated at a precise distance. The trailer was then played and the subject was asked "Was this FHD or native 4K?"
At a distance of 1.5 meters, all subjects replied correctly. At 2 meters I noticed some hesitation and uncertainty, but all replies were correct. At 2.5 meters there was much more hesitation, and even though 80% of replies were correct, some admitted afterwards they hadn't been certain and had made a guess. At 3 meters everyone agreed they simply couldn't say.
In hindsight I should have set the test up using two displays. I should also have set a rule of "no guessing" or introduced a metric for certainty. (Live and learn, eh?)
Personally I feel a 55" UHDTV is a silly proposition. Sony themselves recommend a viewing distance of 1.5x image height for 4K, which for a 55" TV is something like 3.5 feet. That's a set-up for single person viewing. Cram in two people side-by-side and both have to either sit slightly sideways or angle their head. Try three people, and two of them are staring straight at the speakers.
Higher resolution is great and all that, but I feel it goes to waste at small image sizes as regular consumers refuse to move much closer to their displays. I'd love a 39" 4K display for my PC but as far as my home theaters go, I'll skip 4K TVs and go directly to 4K video projectors.
[link]
From: Mark Alexander (Jan 06 2014, at 10:56)
I decided I would never need a huge TV when I realized that watching movies on a laptop (an old ThinkPad with a 1600x1200 display) in bed provided the same or even better quality. Using your handy Ruby code, I get a value of .64 for this setup, which is pretty good.
[link]
From: Anand Mani (Jan 06 2014, at 15:07)
I like the idea of a large (>55") 4k, much more than 3d for reasons I will note below. I realise that most people sit at a fixed distance and view TV passively but I find that I can walk up to the set and see greater detail. Much as in life—if I want to examine something, I bring it closer. This is a huge benefit when looking at artwork etc.
As for 3d, I believe that the film industry blew it when they licensed 3d for home viewing. By limiting the display to cinemas, watching the movie becomes an event, guaranteeing that people will fill seats at the theatre. It also pretty well eliminates cam piracy as the copy is at half the brightness (assuming the video camera has a polarising filter). I have seen only one show that truly benefitted from 3d—a documentary-type production on insects. They were stunningly brought to life by the medium.
[link]
From: Marcus (Jan 23 2014, at 13:47)
I think we should ignore Sony anyway; they are out to sell a product at any cost, even if they have to resort to propaganda. Although, on the other hand, Sony's new TVs are really cool. They make the movie experience way better...lol
[link]