The Miracle of Digital Imaging
Four decades ago, two physicists at Bell Laboratories devised a means of capturing light and turning it into data with a small piece of silicon, an invention that would revolutionize our daily lives.
At about five o’clock one morning this past October, the retired physicist Willard S. Boyle received a scientist’s ultimate wake-up call. At first he couldn’t bestir himself—who could be calling at this ungodly hour?—but the phone was insistent, so his wife dragged herself out of bed.
A couple of minutes later, she was shaking him awake. “Stockholm is calling.”
Somebody’s kidding, Boyle thought. But he went to the phone. “There was this woman with a magnificent Swedish accent,” he recalled in a recent telephone interview with the Nobelprize.org Web site. “And I suddenly thought, well, nobody is going to have gone to all that trouble just to fool us at five o’clock in the morning!”
No, it wasn’t a joke. The dulcet voice from Stockholm announced that the 2009 Nobel Prize in Physics would be shared by Willard S. Boyle and his former partner from Bell Labs, George E. Smith, for their invention of the charge-coupled device (CCD), and by Charles K. Kao for fiber-optic communications technology.
The Nobel Prize in Physics tends to be an esoteric matter for most people. Not only does it usually go to individuals unknown to the general public, but it recognizes accomplishments that only physicists can begin to comprehend—things like the discovery of “giant magnetoresistance” (2007) or “broken symmetry” (2008). Sometimes the prize is awarded for something that everyone has heard of but few people understand or use (the laser, 1964), or to a scientist everyone has heard of, but not for the thing for which he is best known (Albert Einstein for the photoelectric effect, 1921).
In 2009, however, it went to three men for achievements that have, without the slightest exaggeration, touched and changed the lives of everyone living in the modern world. You’re benefiting from Kao’s work every time you use the Internet, make a phone call, or watch cable TV, and, more likely than not, you’re actually carrying Boyle and Smith’s work in your pocket—at least if you’ve got a digital camera, video camcorder, or camera-equipped cell phone.
The heart of all those devices is the CCD, a stamp-sized square of silicon that captures light and, by turning it into data, makes possible all the miracles of digital imaging. Even if you don’t own any CCD-equipped devices, you encounter them every day in barcode readers at the store, in photocopiers and fax machines, and even in the security cameras that are a ubiquitous part of 21st-century American existence. And if you’ve ever admired the profound beauty of an image from the Hubble Space Telescope, or for that matter almost any astronomical picture produced by a ground- or space-based telescope in the past several decades, your awe springs from Boyle and Smith’s CCD.
This invention, which has so completely revolutionized photography, videography, and all forms of image creation, was created four decades ago for an entirely different purpose. Then as now, improving and enhancing the storage and manipulation of digital data in computers and other devices was a perennial quest, and Boyle and Smith were pursuing new approaches, at perhaps the best place in the world to do so.
From the day it was founded in 1925, Bell Laboratories has been at the vanguard of research and innovation in the fields of electronics and communications. It was here that Karl Jansky discovered radio waves from outer space in 1932; that John Bardeen, Walter Brattain, and William Shockley created the transistor in 1947; that the laser was first conceived in 1957; and that Arno Penzias and Robert Wilson serendipitously uncovered the microwave echoes of the big bang in 1965. “The atmosphere set up by the management was very conducive to people being creative,” remembers Boyle: a high-powered, competitive, yet freewheeling environment where no idea was too wild to be considered and no gadget was too bizarre to be built.
One autumn afternoon in 1969 at Bell’s headquarters in Murray Hill, New Jersey, physicist George E. Smith strolled into the office of his colleague, Boyle. These two experts in semiconductor physics had been working on a new technology involving metal-oxide semiconductors, but their boss, Jack Morton, head of advanced research at Bell, was pointedly challenging them to come up with something to compete with the other electronics research departments, where a concept called magnetic bubble memory was all the rage. Could the semiconductor boys somehow make that work? Or come up with some other even better idea? If not, Morton warned, he was going to have to cut the money for their projects and give it to the magnetic bubble guys.
So after Smith, who was a department head, had dealt with a few minor administrative matters with Boyle, executive director of the semiconductor device development division, they began brainstorming. “Let’s invent something,” Smith suggested. After spending some time scribbling equations and diagrams and schematics on Boyle’s blackboard, the two came up with one of those concepts that seemed so simple in retrospect that many others later wondered why they hadn’t thought of it first: storing and moving electrical charges around on a silicon chip.
“In a discussion lasting not more than an hour, the basic structure of the CCD was sketched out on the blackboard, the principles of operation defined, and some preliminary ideas concerning applications were developed,” Smith recalled in a recent interview. It didn’t take long before they realized that the hypothetical CCD could do more than simply store data: it could create images, thanks to the phenomenon called the photoelectric effect.
Contrary to popular belief, Albert Einstein won his Nobel Prize in 1921 not for his work on relativity but for his explanation of this manifestation of quantum theory. When photons hit the atoms of certain materials, each of these atoms gives off an electron, in essence generating an electric flow. The well-known effect had already been used for years in phototubes, “electric eyes,” and solar cells. “In the old days it was done with some kind of a vacuum tube. You might have some kind of special coating on a piece of metal and then a high voltage, and then light would hit it, and it would release electrons and you would collect it,” explains Tim Madden, an electronics engineer at Argonne National Laboratory. But unlike those familiar devices, the CCD doesn’t use the effect to produce current.
Instead, the silicon surface of the CCD is divided into a grid of millions of separate cells or pixels, each sensitive to light. The cells are essentially capacitors, holders of electrical charge. When photons strike the cells, electrons are released in the silicon in numbers proportional to photon input—the more photons, the more electrons set free. When a voltage is applied to a row of the CCD array, each cell’s “bucket” of charge is passed along to the next cell in line, like pails of water in a bucket brigade, until it reaches the edge of the CCD, where it is read out and the amount of charge in each cell is registered. “So each pixel is kind of like a bucket of charge, and it just gets transferred from pixel to pixel over to the edge,” says Madden. It’s this passing through or “coupling” of electrical charge among the cells that gives the CCD its name.
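The bucket-brigade readout described above can be sketched in a few lines of code. This is a toy model only, not device physics: the pixel values are hypothetical photon counts, and each clock cycle is modeled as a simple list shift.

```python
# Toy model of CCD "bucket brigade" readout for a single row of pixels.
# Each pixel holds a charge packet proportional to the photons it
# collected; on every clock cycle the packet at the edge is measured,
# and every remaining packet shifts one pixel closer to the edge.

def read_out_row(charges):
    """Clock a row of charge packets to the edge, one shift per cycle."""
    row = list(charges)              # don't disturb the caller's data
    readout = []
    while row:
        readout.append(row[0])       # edge packet is measured and digitized
        row = row[1:]                # remaining packets shift one pixel over
    return readout

row = [5, 0, 12, 3]                  # hypothetical photon counts, four pixels
print(read_out_row(row))             # the light pattern emerges in order
```

The point of the sketch is that the spatial pattern of light survives the readout intact: the packets arrive at the edge in exactly the order the pixels held them.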
With the charge of each pixel reduced to a string of digital data, the precise pattern of light captured by the entire CCD can be reconstructed almost perfectly. The efficiency of the charge transfer across the CCD is remarkably high, usually well over 99 percent, so few if any data are lost. The charge can be stored indefinitely, not to mention endlessly modified or enhanced by computer tweaking of the data. To register color images, filters placed in front of the CCD (and in line with the lens usually needed to focus light onto it) sort the incoming light by color, so that each pixel records the intensity of the color reaching it, information also recorded as part of the entire image data set.
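That near-perfect transfer efficiency matters because a charge packet may be shifted hundreds or thousands of times on its way off the chip, and the losses compound. A back-of-the-envelope calculation, with purely hypothetical numbers, shows the scale:

```python
# Illustrative arithmetic: even a tiny per-transfer loss compounds over
# the many shifts a charge packet makes while crossing the chip. The
# figures below are hypothetical, chosen only to show the scale.

efficiency_per_transfer = 0.99999   # 99.999% of the charge survives each shift
transfers = 1000                    # shifts needed to cross a 1,000-pixel row

fraction_retained = efficiency_per_transfer ** transfers
print(f"{fraction_retained:.3%}")   # about 99% of the original charge survives
```

Run the same arithmetic with an efficiency of only 99.9 percent per transfer and barely a third of the charge survives 1,000 shifts, which is why per-transfer efficiency has to be so extraordinarily high.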
Not long after that afternoon brainstorming session, Boyle and Smith had the technicians at Bell work up a prototype, which performed exactly as they had predicted. After they publicly announced their invention in 1970—demonstrating, among other things, the use of a CCD in a video camera—“all hell broke loose,” Boyle remembers. Soon CCDs became the hottest thing in the semiconductor world, with engineers and scientists at Bell and elsewhere rushing to develop and perfect the device for myriad applications. Fairchild Electronics became the first to enter the commercial market.
Astronomers immediately realized the enormous potential of digital imaging and enthusiastically embraced the CCD. With their science entirely dependent on collecting and analyzing light, the advent of a light-detecting device hundreds and eventually thousands of times more sensitive than the most advanced photographic film was nothing less than a revolution. Even the most sensitive chemical film emulsion might capture only a handful of the photons hurtling in from a distant galaxy, while a CCD could register 90 percent of them. “[CCDs] are so incredibly sensitive you can pick up individual photons,” says Madden. That precision gives astronomers the ability to obtain pictures of unprecedented clarity and quality without the hours-long exposures formerly necessary with photographic film.
The first astronomical images captured by CCDs, which were taken in 1974, hardly needed that kind of sensitivity: they were pictures of our familiar, nearby Moon. But CCD technology rapidly became the standard imaging tool in astronomy, not only because of its sensitivity but also because of its versatility. CCDs can “see” not only in visible light but in every other part of the electromagnetic spectrum. The power to digitally manipulate and analyze imagery has also proven invaluable. Although it’s true that some professional astronomers still operate telescopes inside darkened observatory domes on cold mountaintops, they’re no longer peering into eyepieces in metal cages high above the floor but sitting comfortably in warm offices, gazing at computer screens displaying images collected by the CCD arrays inside their telescopes.
Madden offers a telling example of the CCD’s exquisite sensitivity: “One of the CCD cameras that we built here we put in a dark room, all the lights out, all the doors shut, so if you went in there it’d be pitch black, you couldn’t see your hand in front of your face. We took a picture of this pitch-black room, and what we saw was sort of a grainy picture of the room. You can detect individual photons. If there were more light in there you would basically see a picture of the room like a regular photograph, but a CCD is so sensitive that single light photons are detectable. That’s what’s really remarkable about it.”
While astronomy might be the main beneficiary of such extreme light sensitivity, the CCD’s attributes make it indispensable to other applications as well. In medical imaging, it’s used in diagnostic radiology and to provide real-time pictures during surgical, endoscopic, and other procedures (many of which also use the optical fiber technology of 2009’s third physics Nobelist, Charles Kao). Biologists and chemists obtain data on individual cells and molecules with advanced imaging techniques such as X-ray crystallography.
And, of course, there are the everyday commercial applications: our cameras, camcorders, webcams, and iPhones. (Some consumer electronic devices, such as cell phones, now use a newer technology called CMOS, for complementary metal-oxide semiconductor. While cheaper to manufacture and much less energy intensive than CCDs, CMOS chips are also much less sensitive and produce lower-quality images.) Although photography has been generally available for about a century, it’s hardly a stretch to assert that the unprecedented versatility, flexibility, and ubiquity of digital imaging technology are continuing to change our world in countless ways, from YouTube videos and cell phone pictures to the ability to touch up, change, or completely alter images through software such as Photoshop. Commercial and personal photography has become almost wholly digital, as the technology’s ever-increasing sophistication has nullified practically all the lingering aesthetic and technical arguments in favor of film.
Because of the CCD, pictures have become commodified information to be stored, manipulated, and moved from place to place through the Internet or other communication technologies, created and distributed at the speed of light with the ease of a straying thought. The positive effects of this development (the ability of private citizens to capture newsworthy events on camera) and the negative (goofy YouTube videos and embarrassing cell-phone shots), utterly new in human history, are still being debated and determined. But digital images are not going to fade like Kodachrome slides—they’re here to stay.
The bubble of magnetic bubble memory burst not long after the invention of the CCD, supplanted by hard-disk and random-access technology and the other memory devices that have since been invented. Now 85, Boyle retired from Bell in 1979 and returned to his native Nova Scotia, while Smith, now 79, went on to sail the world for 17 years after his own retirement in 1986, finally returning to his home in New Jersey. Throughout the years they’ve received numerous awards and accolades for their work, but although their names had been mentioned more than once for an eventual Nobel (particularly given Bell Labs’ extraordinary score of seven such awards), that particular honor had always eluded them. As Boyle says, “Just because so many people around us had been winning Nobel Prizes, we sort of said, ‘Well, our stuff is reasonable. It’s possible, you know, we’ll get something here, I don’t know!’ We hoped for the best, but nothing had happened and so we’d pretty well dismissed it.”
That all changed forever on this past October morning. But even if the Swedish Academy had continued to overlook their achievement, the place of Willard Boyle and George Smith in history would be secure. The product of their skull session on that fall day in 1969 has fundamentally changed the way we see each other, our world, and the universe.