Editorial Feature

Digital Camera Sensor History and Developments


Image Credit: Alex Yeung/Shutterstock.com

The digital camera was imagined years before the hardware existed to make it a reality. In 1961, Eugene Lally at the Jet Propulsion Laboratory proposed using a mosaic photosensor to produce digital images of stars for spacecraft navigation, and even in 1972, when Willis Adcock at Texas Instruments filed a patent for a filmless camera, the technology still lagged some way behind the concept.

The first widely accepted digital camera was a prototype developed by Eastman Kodak in 1975. One of its engineers, Steven Sasson, built a working camera from Motorola parts, newly available Fairchild CCD (charge-coupled device) electronic sensors, and a film camera lens. The resulting device weighed over 4 kilograms, recorded monochrome pictures onto a digital cassette tape, and required a special projector to view the images. Sadly, Kodak did not pursue this avenue, seeing its core business as film, and the ongoing development of the digital camera was left to others.

It was Fairchild, developer of the CCD sensor used by Steven Sasson, that produced the first commercially available CCD camera in 1976, as a tool for inspecting Procter & Gamble products. Things started to move in 1981 with the introduction of the Sony Mavica, essentially an analog television camera that captured images onto floppy disk packs. These analog cameras never really developed technologically; they were expensive and their image quality was poor. The Canon Xapshot cost $499 in the US in 1988, but users also had to buy a $999 battery, a computer interface card with software, and floppy disks.

When it comes to a true digital camera, one that captured and stored digital images, it was a Fairchild CCD sensor dating from 1971 that sat at the heart of the All-Sky camera, developed in 1981 by the University of Calgary (Canada) ASI Science Team to photograph auroras.

Perhaps the most significant development in the history of digital cameras had little to do with sensors, lenses, or storage media. In 1990, just two years after the first JPEG and MPEG standards were set, and the same year the first true digital camera reached store shelves, Adobe Photoshop v1.0 was launched.

From 1990 onward there were numerous developments in digital cameras, centered mainly on ever-increasing resolution and more compact storage media: from the Apple QuickTake camera of 1994, with its 640x480-pixel images and storage for eight pictures, to today's 100+ megapixel cameras such as the medium-format Hasselblad H6D-100c. The heart and soul of a digital camera is its image sensor.

Until the mid-nineties, the only sensor available was the CCD, and it was a good solution, offering great dynamic range and low noise. However, a team at NASA's Jet Propulsion Laboratory took technology invented at Fairchild in the 1960s and developed the CMOS (complementary metal-oxide-semiconductor) active-pixel image sensor, a low-power sensor that reduced the weight of cameras in spacecraft. Professor Eric Fossum, who led the team, saw the potential of the cheaper sensor for a range of consumer cameras, and he and his colleagues started a spin-off company, Photobit, and licensed the technology from NASA. By 2005, Photobit (now known as Aptina) had sold over a billion sensors.

Despite CMOS initially producing inferior images to CCD, its cost-effectiveness led to its use in consumer cameras, and it is the technology you will find in your mobile phone camera. Today's CMOS sensors are claimed to almost match CCD for image quality, with the added advantages of power efficiency, burst-mode capture, and other functionality.

There are currently two types of CMOS image sensor: the Bayer (named after Bryce Bayer, a researcher at Kodak) and the less popular, conceptually different Foveon. A Bayer sensor covers its grid of pixels with color filters arranged in groups of four, each pixel sensitive to a single RGB primary: 25% of the pixels are blue, 25% are red, and 50% are green, because the human eye is most sensitive to green. In a Foveon sensor, every pixel location records all three RGB colors, so there is no need to organize pixels in groups of four or to apply anti-aliasing filters. There are more nuanced differences in the images each system produces, and each has its evangelists among users.
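The Bayer layout is simple enough to sketch in code. The snippet below is an illustrative sketch in Python with NumPy (not taken from any camera firmware): it samples a full-color image through an RGGB filter mosaic, so each photosite keeps only one primary, reproducing the 25% red / 50% green / 25% blue split described above.

```python
import numpy as np

def bayer_mosaic(rgb):
    """Sample a full-color image through an RGGB Bayer filter.

    In every 2x2 block the top-left photosite keeps red, the
    top-right and bottom-left keep green, and the bottom-right
    keeps blue -- giving the 25% R / 50% G / 25% B split.
    """
    h, w, _ = rgb.shape
    mosaic = np.zeros((h, w), dtype=rgb.dtype)
    mosaic[0::2, 0::2] = rgb[0::2, 0::2, 0]  # red
    mosaic[0::2, 1::2] = rgb[0::2, 1::2, 1]  # green
    mosaic[1::2, 0::2] = rgb[1::2, 0::2, 1]  # green
    mosaic[1::2, 1::2] = rgb[1::2, 1::2, 2]  # blue
    return mosaic
```

Recovering a full RGB value at every pixel from this single-channel mosaic is the job of the camera's demosaicing (debayering) step, which interpolates the two missing primaries at each photosite from neighboring pixels.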

There are numerous opinions about the "megapixel myth," with many assuming that more pixels are always better. However, the size of the sensor, the number of pixels on it, and the type of sensor all interact to determine image quality; the megapixel count claimed by the manufacturer rarely tells the whole story.

So where can we expect camera sensor technology to go in the future? Some believe sensors cannot get much better; there is no Moore's law for them. Perhaps the future lies in software and a camera's ability to deliver the image the user sees with their own eyes. Will that be delivered by a software company like Apple through computational imaging and multiple lenses, by Microsoft with curved sensors, or by traditional camera manufacturers like Nikon or Leica with image stabilization, higher ISO, and curved sensors? We'll have to wait and see... but not for long.
