Bayer Interpolation — Does Your Camera Really Have The Quoted Resolution?

What if someone told you that your 10 megapixel camera has an effective resolution of only about 3 megapixels? It sounds like a scam, except that the cause is technical in nature, and in the end only those with a keen eye for detail notice the difference. I'm over-simplifying, but the topic warrants it.

Resolution is measured by the number of pixels on the screen. This can be the screen of a TV, a computer monitor or the camera sensor. Each pixel contains three sub-pixel components that handle the three primary colors: red, green and blue. So if you use a magnifier to examine your computer's LCD monitor, you'll notice the three sections: red, green and blue. The resulting color of the pixel is determined by the combined intensity of these three components. For example, when the red and green components are illuminated, the resulting color is yellow. When all three are illuminated, you get white.
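If you like to think in code, here is a toy Python snippet of that additive mixing; the mix() helper is purely for illustration, not part of any real imaging API:

```python
# A toy illustration of additive RGB mixing; mix() is a hypothetical helper.
# Each channel intensity runs from 0 (off) to 255 (fully on).

def mix(red=0, green=0, blue=0):
    """Return an (R, G, B) pixel from per-channel intensities (0-255)."""
    return (red, green, blue)

print(mix(red=255, green=255))            # (255, 255, 0) -> yellow
print(mix(red=255, green=255, blue=255))  # (255, 255, 255) -> white
```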

However, camera manufacturers play it smart. They quote the total resolution counting these sub-component pixels individually. So where there should be one pixel (with its three sub-components red, green and blue), they quote three pixels. In effect, the resolution quoted on the spec sheet has been inflated by a factor of three, and only a third of it is the actual effective resolution.
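In code, that arithmetic looks like this; the function name is illustrative, not a real API:

```python
# The article's claim as a one-liner: if the quoted count includes each of
# the three color components as a separate pixel, the effective full-color
# resolution is a third of the quote.

def effective_megapixels(quoted_mp, components_per_pixel=3):
    return quoted_mp / components_per_pixel

print(effective_megapixels(10))  # 3.33... -> a "10 MP" quote is ~3.3 MP effective
```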

How then do you still get images at the quoted resolution? This is where Bayer interpolation comes into play: an algorithm artificially recreates the missing pixel values to produce an image at the quoted resolution. This is one reason why images shrunk by a factor of three look more pleasing and natural to the human eye.
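For the curious, here is a rough Python sketch of the simplest such scheme, bilinear demosaicing, assuming an RGGB filter layout; actual cameras use more sophisticated, often proprietary algorithms, so take this only as an illustration of the idea:

```python
# A rough sketch of bilinear Bayer demosaicing, assuming an RGGB layout.
# It illustrates how the two missing color values at each photosite are
# estimated from the nearest sites of the corresponding color.
import numpy as np
from scipy.signal import convolve2d

def demosaic_bilinear(mosaic):
    """mosaic: 2-D float array of raw sensor values in an RGGB pattern."""
    h, w = mosaic.shape
    r_mask = np.zeros((h, w))
    r_mask[0::2, 0::2] = 1.0          # red photosites
    b_mask = np.zeros((h, w))
    b_mask[1::2, 1::2] = 1.0          # blue photosites
    g_mask = 1.0 - r_mask - b_mask    # green photosites (checkerboard)

    # Kernels that average the nearest same-color neighbors.
    k_g  = np.array([[0, 1, 0], [1, 4, 1], [0, 1, 0]]) / 4.0
    k_rb = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]]) / 4.0

    r = convolve2d(mosaic * r_mask, k_rb, mode="same")
    g = convolve2d(mosaic * g_mask, k_g,  mode="same")
    b = convolve2d(mosaic * b_mask, k_rb, mode="same")
    return np.dstack([r, g, b])       # one full RGB triple per photosite
```

At each photosite the kernels leave the measured value untouched and fill in the two missing channels with a weighted average of the nearest sites of that color.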

Bayer interpolation is something digital cameras can't do without, at least until manufacturers stop quoting the multiplied resolution and quote the actual resolution the way computer LCD manufacturers do. So we are down to the moral of the story: shoot at the camera's maximum resolution. When you are done, reduce the image to a smaller size. This discards the artifacts and the synthetic pixels while raising the overall pixel density of the picture. But all of this only helps if you keep in mind that something called Bayer interpolation is happening and that you are losing image quality (more on the factors affecting image quality here). That said, the algorithm is still smart enough to trick the naked human eye, unless of course you are viewing the picture at 100%.
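As a concrete sketch of that workflow with Pillow (the filenames are placeholders, and Lanczos is just one reasonable downscaling filter):

```python
# "Shoot big, downsize later" in a few lines of Pillow; the factor of 3
# follows the article's reasoning, and the filenames are placeholders.
from PIL import Image

img = Image.open("full_resolution_shot.jpg")
small = img.resize((img.width // 3, img.height // 3), Image.LANCZOS)
small.save("downsized_shot.jpg", quality=95)
```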


7 comments
  • Allan Angus Apr 3, 2012 @ 1:26

    Well, I think that Mama & O are neglecting the optical low pass filter (OLPF) that is normally used in conjunction with the Bayer filter in most DSLRs. (The Nikon D800E is a notable exception, as are some medium format cameras, but ignore those for the moment.) The function of the OLPF is to spread out incoming light over the span of the Bayer filter or “color filter array” (CFA). In this way, light that might otherwise be focused to a point the size of a sensor well is spread out over 4 sensor wells. So, in one way, it’s not the CFA itself, or its interpolation, that reduces resolution; but the OLPF that usually comes along with it does.

    One natural way to think about this is in terms of the circle of confusion (CoC) of the sensor and its pre-filter stack overall. One very standard way to define the CoC for a 35mm SLR, whether digital or film, is based on normal human visual acuity and the enlargement of the captured image up to 8″x10″; this model yields a CoC of 29 µm on the sensor. But if you take a 12Mp D700, for example, the sensor well dimension is about 8.5 µm across; and the OLPF spreads out any highly focused point over a circle of diameter closer to 21 µm or more.

    Resolution is usually specified in terms of line pairs per millimeter (lp/mm). If one ignored the OLPF in the D700, you might expect a resolution of 1 lp per 17 µm, or about 60 lp/mm. Another way to get at this number is from the sensor size and resolution of the camera. The sensor is 36x24mm and its resolution is 4256×2832 (for 12Mp). So, you could just take 4256/36mm divided by 2 and get the same answer of about 60 lp/mm. But the OLPF cuts that down by a factor of 2 or so, and you’re left with more like 30 lp/mm.

    In practice, it’s common to make this more objective by using the idea of a modulation transfer function (MTF) to account for the fact that as the number of line pairs is increased, the optical system of a camera will gradually blend the bright and dark lines into a gray blur. You can look up MTF yourself, but the idea is that if the lines are perfectly intact, the modulation is 100%. If they’re completely blurred out, the modulation is 0%. If they’re half-way smeared together, the modulation is 50%, and so on. Although it doesn’t tell the whole story, it’s common to quote resolution for a lens or camera as the number of line pairs per mm at an MTF of 50%. With very high quality lenses, the D700 will typically score an MTF of over 90% at 30 lp/mm quite handily all across the sensor surface.

    The new D800 and D800E have sensor well dimensions of about 5 µm on a 35mm format. Using pixel dimensions and computing a CoC from the size of the OLPF and CFA yields a diameter of just over 12 µm. This would suggest that the D800 could do around 100 lp/mm. However, the D800E lacks an OLPF and there is much speculation that it could do better as a consequence. While this is arguably true in some cases, pushing the D800E past the limits of its CFA could introduce false color patterns and moiré that simply shift the burden for correction into post processing, thereby reducing the ultimate image resolution anyway.

    Anyway, suppose you had a 35mm camera (36x24mm sensor) that did 30 lp/mm; in terms of pixels, you’d quote 2160×1440, or about 3Mp. That says that for the D700, the OLPF reduces resolution by about a factor of 4 from what the raw pixel count of the sensor gives you. Is this bad? Not really; obviously a D700 is a fine camera, and the D3 sensor is identical, another fine camera.

    Is the D800 better in terms of resolution? In some cases. However, for any lens stopped down past about f/11, the practical limit to resolution for the D800 or D800E is diffraction; and many lenses on the market will be worse than their diffraction limit. At f/11, the full diameter of the Airy disk of a diffraction-limited lens is about 15 µm, meaning that no lens can focus a spot to a narrower width than this. That’s equivalent to a resolution of around 30 lp/mm, so you might as well be using a D700 or D3. This is not to say that the diffraction-limited resolution at 30 lp/mm and f/11 is bad; in fact, the calculations give an MTF of nearly 80% at this point. I hope this helps. I’m planning to put a technical review of some of these matters on my own blog soon.
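For anyone who wants to check the back-of-the-envelope numbers in the comment above, here is a short Python sketch; the constants are the ones the comment uses, plus a common 550 nm assumption for the wavelength of green light:

```python
# Reproducing two of the comment's estimates. Constants come from the
# comment itself (D700 sensor width and pixel count) or are common
# textbook assumptions (550 nm green light), not manufacturer data.

SENSOR_WIDTH_MM = 36.0
D700_PIXELS_WIDE = 4256

# Pixel-limited resolution: one line pair spans two pixels.
lp_per_mm = D700_PIXELS_WIDE / SENSOR_WIDTH_MM / 2
print(f"D700 pixel-limited resolution: {lp_per_mm:.0f} lp/mm")  # ~59 lp/mm

# Airy disk diameter for a diffraction-limited lens: d = 2.44 * wavelength * N
wavelength_um = 0.55   # green light, ~550 nm
f_number = 11
airy_um = 2.44 * wavelength_um * f_number
print(f"Airy disk at f/{f_number}: {airy_um:.1f} um")  # ~14.8 um, i.e. ~15 um
```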

  • O Apr 10, 2012 @ 22:16

    Allan, thank you for your very scientific approach to resolution! Mama & I are not neglecting any part of the camera but merely addressing the mathematical error in the article that made it look like true camera resolution is only 10% of advertised.
    Now, if you put your long, very detailed comment up just to promote your blog – I can understand… but if you agree with the article that real resolution is 1/9th of the resulting .jpg, then you are also incorrect.
