With no colour filter array, the 18-million-pixel sensor in Leica’s new M Monochrom rangefinder captures highly detailed black & white images. Richard Sibley considers the advantages of using the Monochrom and finds out if it really is like shooting on film.

Product Overview

Product: Leica M Monochrom
Manufacturer: Leica
Price as reviewed: £6,200.00
Professor Bob Newman explains… The Leica M Monochrom Sensor

Colour in a digital camera is inherently a compromise. Silicon image sensors start out as monochrome devices, and are only made to see colour by the addition of a colour filter array, which filters two thirds of the light from each pixel, leaving only a single coloured pass band for each. If the need for colour can be done away with, sensors can be made to perform much better, both in terms of noise and also with respect to resolution, for the same level of silicon technology.

Doing without colour
Let us reprise how a monochrome sensor is made to ‘see’ in colour. The standard pattern for colour filters is the Bayer array, which splits the sensor into four interleaved colour channels: one red, one blue and two green. The reason that the green filters are doubled up is that in a Bayer filter green doubles as the ‘luminance’ channel. The eye can see more detail in monochrome because there are more monochromatic detectors (rods) than there are colour ones (cones) – the rods also have a peak response in the green area of the spectrum, so by providing twice as many green receptors the Bayer arrangement provides extra resolution where it is needed.

Effectively, the two green channels act as a single sampling grid that samples every other pixel, but rotated by 45°, which means that in the vertical and horizontal directions the green or luminance channel is sampled every 1.414 (square root of two) pixel spaces. Meanwhile, the red and blue arrays sample every other pixel in both the vertical and horizontal directions, so those channels are sampled every two pixel spaces.
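To make the sampling pattern concrete, here is a short Python sketch – purely illustrative, and assuming one common GRBG tile layout rather than anything specific to the Leica sensor – that tiles a small patch with the Bayer pattern and counts how often each channel is sampled:

import numpy as np

# One 2x2 Bayer tile (GRBG variant, assumed for illustration) repeated over a 4x4 patch
tile = np.array([['G', 'R'],
                 ['B', 'G']])
patch = np.tile(tile, (2, 2))

for channel in 'RGB':
    count = np.count_nonzero(patch == channel)
    print(channel, 'sampled at', count, 'of', patch.size, 'pixels')

# Green occupies half the pixels, arranged on a 45-degree grid, so its effective
# horizontal/vertical sample spacing is sqrt(2) = 1.414 pixels; red and blue sit on
# grids with a spacing of two pixels in each direction.

The counts – eight green, four red and four blue in every sixteen pixels – are exactly the ratios behind the light-loss estimate that follows.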

The reduced sampling rates also affect the amount of light collected. Since only one in four pixels samples red and blue, three quarters of the red and blue light is thrown away, along with half the green light, since that is sampled twice as often. In fact, the light loss is worse than that, as the colour filters do not have perfect transmission within their pass band, as a typical RGB filter response curve shows.
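As a back-of-the-envelope check on those figures – assuming perfect in-band transmission and a scene whose light is split evenly between the red, green and blue bands, which is of course a simplification – an ideal Bayer array passes roughly a third of the light falling on the sensor:

# Fraction of each band's light that reaches a photosite with a matching filter,
# assuming an ideal Bayer array and equal light in the red, green and blue bands.
red_kept, green_kept, blue_kept = 0.25, 0.50, 0.25   # 1 in 4, 2 in 4 and 1 in 4 pixels

total_kept = (red_kept + green_kept + blue_kept) / 3
print(f'{total_kept:.2f} of the incoming light survives the ideal filter array')
# ~0.33 - and real filters transmit less than 100% even within their pass band,
# so the true figure is lower still.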

Sensor advantage
Now consider the case if we eliminate the colour filter array. Obviously, the ability to sense colour is lost, but in return the sensor gains the ability to sense all the visible light projected onto it at every position. Thus, not only is the amount of light increased, but so is the number of samples taken in the luminance channel (now the only channel), which is the one that provides the detail in the image. The gain is 1.414 times as many samples in each direction, totalling twice as many samples. This means that an 18-million-pixel monochrome sensor will produce the same amount of luminance information (otherwise known as detail) as a 36-million-pixel Bayer sensor.
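In round numbers, and taking the √2-per-axis figure above at face value, the equivalence works out as follows (an illustrative calculation only):

import math

mono_pixels = 18e6                      # Monochrom-style sensor: every pixel is a luminance sample
gain_per_axis = math.sqrt(2)            # 1.414x more luminance samples in each direction than Bayer
bayer_equivalent = mono_pixels * gain_per_axis ** 2
print(f'{bayer_equivalent / 1e6:.0f} million-pixel Bayer sensor for the same luminance detail')
# 36 million pixels - the resolution class of the Nikon D800/D800E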

It is a little more difficult to quantify the light gain, since the Bayer array blocks light selectively according to colour, meaning that the light loss depends on the colour of the scene. To a fair approximation, though, for monochrome work the green/luminance channel stands in quite well for the whole visible spectrum. Looking at the shape of the luminosity curve – the overall colour response of the eye – it can be seen that the red and blue parts of the spectrum contribute relatively little compared with the green part. So, a starting point for comparing the light-gathering ability of a monochrome and Bayer sensor would be to observe that the Bayer sensor samples the green/luminance channel over only half its area, compared with the full sensor for the monochrome – therefore, we might expect the monochrome sensor to have more than a stop advantage in light gathering over a similar sensor equipped with a Bayer filter.
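Expressed in stops – and again leaning on the simplified model in which the green channel stands in for the whole luminance signal – the half-area figure alone gives a one-stop advantage, before filter transmission losses are even considered:

import math

green_area_fraction = 0.5                        # Bayer: luminance (green) gathered over half the sensor area
advantage_stops = math.log2(1 / green_area_fraction)
print(f'{advantage_stops:.1f} stop advantage from sampling area alone')
# Imperfect in-band transmission of real colour filters pushes this to 'more than a stop'.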

In the case of the ‘old-fashioned’ CCD sensor in the Monochrom, this gives it a working advantage over even the very best CMOS sensors. For instance, the D800 sensor is nearly twice as efficient as that of the standard M9 – 86% more efficient, to be precise. However, used for monochrome work, it captures luminance over only half the sensor, so compared with the Monochrom, which has no Bayer array, it actually captures around 7% less light on the simplified model discussed above, and probably less still in practice.
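The 7% figure follows directly from that simplified model – the 86% efficiency figure is the one quoted above, and the halving reflects the green/luminance sampling area:

d800_relative_efficiency = 1.86    # D800 quoted as 86% more efficient per unit area than the M9-era CCD
luminance_area_fraction = 0.5      # but it gathers the green/luminance channel over only half its area

relative_light = d800_relative_efficiency * luminance_area_fraction
print(f'D800 captures {relative_light:.2f}x the luminance light of the Monochrom')
# 0.93x, i.e. about 7% less - and probably less still once real filter losses are included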

As discussed earlier, the Monochrom’s 18-million-pixel sensor samples luminance information at the same rate as the 36-million-pixel Bayer sensor in the D800, so we might expect the Monochrom, which has no anti-aliasing filter, to deliver a level of detail very similar to that of the D800E, with its anti-aliasing filter rendered inactive, despite having half as many pixels.

There is one aspect of the M9’s sensor performance that the omission of the Bayer filter cannot completely make up for – the electronic or ‘read’ noise. The M9’s sensor produces about three times the read noise per pixel of the D800E. However, since the D800E has two pixels contributing noise for each of the Monochrom’s, the difference is not as pronounced as it might be, levelling out to about twice the noise per unit area, or about a stop worse in the deep shadows. Despite that, the Monochrom can be expected to be a far better low-light camera than the original M9.
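The ‘about twice the noise per area’ figure comes from adding the read noise of the D800E’s two smaller pixels in quadrature – a standard assumption for uncorrelated noise, and a rough model rather than a measurement:

import math

monochrom_read_noise_ratio = 3.0        # Monochrom read noise per pixel vs the D800E (approximate)
d800e_pixels_per_monochrom_pixel = 2.0  # two D800E pixels cover the area of one Monochrom pixel

# Uncorrelated noise adds in quadrature, so two D800E pixels together contribute sqrt(2)x one pixel's noise
per_area_ratio = monochrom_read_noise_ratio / math.sqrt(d800e_pixels_per_monochrom_pixel)
print(f'Monochrom read noise per unit area is about {per_area_ratio:.1f}x that of the D800E')
# ~2.1x, or roughly a stop worse in the deep shadows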

Looking at the advantages of a camera dedicated to monochrome work, it is surprising that one has not been offered as an option by any of the mainstream manufacturers. If Canon has found it worthwhile to produce the EOS 60Da for astrophotography, then surely a monochrome option is a possibility?
