In the latest of a string of potential disruptions to the traditional digital-camera imaging model (the last came from Light), InVisage Technologies is gearing up to ship its first quantum-dot image sensor, which the company hopes will displace the more common silicon-based CMOS sensors.
Unlike traditional silicon-based sensors, whose photosensitive layer consists of “buckets” that collect the electrons created when photons hit a silicon layer, QuantumFilm replaces the buckets with nanoparticles (the quantum dots) suspended in a substrate, much the way silver halide grains in film are suspended in gelatin. When a photon hits a dot, it releases an electron and a positively charged hole. The positive and negative charges flow through the QuantumFilm toward the electrodes that sandwich it, which stream them out to an analog-to-digital converter just as in a silicon sensor. The capture process and the properties of the quantum dots are what differentiate QuantumFilm from a typical CMOS sensor.
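To make that chain concrete, here is a toy model of one pixel's capture path: photons free charge, the charge accumulates until the pixel saturates, and an ADC turns it into a digital count. The function name and every number (quantum efficiency, well depth, ADC width) are illustrative assumptions of mine, not InVisage specifications.

```python
def capture_pixel(photons: int, quantum_efficiency: float = 0.5,
                  full_well: int = 5000, adc_bits: int = 10) -> int:
    """Toy conversion of a photon count at one pixel into a digital number."""
    # Each absorbed photon frees one electron-hole pair; quantum
    # efficiency models the fraction of incident photons absorbed.
    electrons = int(photons * quantum_efficiency)
    # The pixel saturates once it holds a full well of charge.
    electrons = min(electrons, full_well)
    # The ADC maps 0..full_well linearly onto 0..2^bits - 1.
    return round(electrons / full_well * (2 ** adc_bits - 1))

print(capture_pixel(1000))    # a midtone
print(capture_pixel(20000))   # an overexposed highlight clips at 1023
```

The hard clip in the last line is the classic blown-highlight behavior the samples below are judged against.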
The Quantum13 is a sensor with 1.1-micron pixels that fits in an 8.5mm-square, 4mm-deep module. The company expects to be able to ship it to phone manufacturers by the end of this year.
This technology does enable a couple of important improvements. Because it dumps an entire frame of image data at once (unlike conventional sensors, which read it out one line at a time), it can potentially eradicate rolling-shutter wobble, one of the ugliest artifacts in phone video.
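A tiny sketch of why frame-at-once readout matters. The `edge_position` helper and its numbers are hypothetical, chosen only to show the geometry: when each row is read at a successively later time, a moving vertical edge lands in a different column on every row, and that skew is what becomes wobble when the camera shakes.

```python
def edge_position(row: int, readout_per_row: float, speed: float,
                  global_shutter: bool) -> float:
    """Column where a vertical edge moving right is captured on a given row."""
    # Rolling shutter: row r is sampled at time r * readout_per_row.
    # Global shutter: every row is sampled at the same instant (t = 0).
    t = 0.0 if global_shutter else row * readout_per_row
    return speed * t  # edge starts at column 0 and moves right

rolling = [edge_position(r, readout_per_row=1.0, speed=2.0,
                         global_shutter=False) for r in range(4)]
global_ = [edge_position(r, readout_per_row=1.0, speed=2.0,
                         global_shutter=True) for r in range(4)]
print(rolling)  # [0.0, 2.0, 4.0, 6.0] -- the edge skews across rows
print(global_)  # [0.0, 0.0, 0.0, 0.0] -- the edge stays straight
```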
It can also potentially eliminate the color filter array (CFA) at the front of the sensor, which is how the sensor captures color information. Dropping the CFA would let more light through, which I think would unambiguously improve low-light photo quality. However, although InVisage raised this possibility when it started up five years ago, the Quantum13 sensor still has a CFA.
Based on these samples, the sensor seems to do a reasonable job with highlights that usually get blown out. (The center is Kodak film.)
I’m not overly impressed with the quality of the medium-light photos, and it looks like the images need more postprocessing: they don’t look nearly as sharp as the cell-phone photo, and the white balance looks a little off.
InVisage previously released some photo samples and a video from October shot with a prototype, and based on those I have mixed feelings. QuantumFilm’s light-response characteristics are a cross between film and silicon: in the bright areas it acts like film, gradually losing detail as brightness increases (a nonlinear response, the way your eye sees), but in the dark areas and midtones it responds like a silicon sensor, losing detail in direct proportion to the decrease in brightness (a linear response). Based on the samples, I’m impressed with the performance in the bright areas, but not so much with the overall photo quality.
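That hybrid response can be sketched with two toy tone curves. These are functions I made up to illustrate the contrast (the knee point and the exponential shoulder are assumptions, not measured QuantumFilm data): the silicon curve is linear until it clips hard, while the hybrid curve stays linear through the shadows and midtones and then compresses the highlights gradually instead of losing them outright.

```python
import math

def silicon_response(exposure: float) -> float:
    """Linear up to saturation, then a hard clip: highlights blow out."""
    return min(exposure, 1.0)

def quantumfilm_like_response(exposure: float, knee: float = 0.7) -> float:
    """Linear below the knee, then a soft exponential shoulder above it."""
    if exposure <= knee:
        return exposure
    # Above the knee, output approaches 1.0 asymptotically, so two
    # different highlight exposures still map to two different values.
    return knee + (1.0 - knee) * (1.0 - math.exp(-(exposure - knee) / (1.0 - knee)))

# Past saturation, silicon records the same value for both exposures;
# the film-like shoulder still separates them.
print(silicon_response(1.2), silicon_response(1.5))
print(quantumfilm_like_response(1.2), quantumfilm_like_response(1.5))
```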
That said, it looks like the lens used for these samples was terrible, and each manufacturer will be able to tweak and optimize the imaging pipeline and other hardware, so assuming InVisage doesn’t run into any implementation problems, I’m optimistic.