Half a universe away, nature screams. Even the death throes of countless suns, even the spiralling infernos of vast galaxies, pale in comparison. Nothing else in the universe comes close to the power of a quasar. Matter is being crushed by the cartload into the mouth of a vast black hole; radiation is flooding forth across the cosmos.
The torrent of light issuing forth from the quasar is bent into sinuous paths by the gravity of stars and galaxies. Its photons are scattered by tiny particles floating in the near-vacuum of space or absorbed by great clouds of interstellar gas. Five billion years after the quasar screams, a few of those photons finally arrive at Earth. Pulled and stretched, torn and pummelled, the light from the quasar is so weak that it is invisible to the naked eye. Only the most modern observatories can make out the dim spot of the quasar against the far more mundane stars, galaxies and nebulae of the night sky. Nature's scream has become a whisper.
No matter how strong a signal is, the universe eventually pulls it apart. Noise distorts it and time attenuates it. This is not just something that happens in the wilds of space where everything is prey to the untamed forces of nature. Despite the best efforts of the telecommunications crowd, signals get noisy even in the carefully moderated world of a transatlantic cable. The thermal motion of the molecules in the cable, the competing signals of other phone calls along the same line, the confusing electrical currents in the house which is receiving the phone call: all these forces make it difficult to understand an incoming signal.
Which is why you need people like David Thomson. He's a senior scientist at Bell Labs, the research institute that used to be owned by AT&T, and which has done more than any other commercial organisation to sort out signals and noises. Thomson's mathematical insights have helped AT&T do this, but more recently he has been turning to nature for assignments.
The same principles that govern phone calls apply to all sorts of weakened, attenuated signals: the slow groaning of continental plates; the rumblings of the sun's belly; the screams of quasars; the cycle of the seasons. They have allowed Thomson to upset the experts in field after field by looking at data that everyone had thought empty and saying, "You know, there's a signal here." Because he has a different way of looking, he has found things other people have not. And it is not just an academic matter of quasars and continental plates. Thomson has found signals in the global climate record that no one else had ever dreamed of - and that may change the way scientists think about global warming.
Not the information infrastructure
With his grey-and-white beard, round wire glasses, beige corduroy trousers, worn, brown button-down jumper and tartan flannel shirt, the 54-year-old scientist does not look like a revolutionary. He resembles a kindly Amish grandfather dressed for a night on the town. His office is small, cluttered, pale yellow and dominated by huge metal filing cabinets covered with papers. There is little room on his desk for much besides two telephones and two computer terminals. Dilbert and Far Side cartoons dot the walls. A penguin on a calendar keeps lonely watch below an ancient clock.

In the corner is a little plastic frame with a slice of steel tubing in it. It is one of the last remnants of the first problem Thomson attacked at Bell Labs - a problem that paved the way for all his future work on analysing signals. In 1966 it was quickly becoming obvious that the capacity of the coaxial cables which crisscrossed the United States would soon be insufficient to meet the ever-growing demand, even if the cable-layers worked 24 hours a day. AT&T needed a system that could carry more data.
It was a big project, and hugely important. A number of teams were set to work on different ideas for replacing the coaxial cables; the winning scheme was to become the backbone of the transcontinental data network. Thomson's team worked on a linear waveguide (essentially a thick metal tube encased in plastic) down which microwave signals could bounce with very little attenuation.
For such a waveguide to work properly, the metal tubes had to be machined with incredible precision, with tolerances so fine that they were nearly miraculous. "We were looking at mechanical tolerances that are about the same as people are achieving now with integrated circuits, except we were doing it on things which weighed several hundred pounds," he says with a chuckle.
And there was no "clean room" to do it in. "One of the problems with doing precision measurements in a factory was that there was a lot of dust and dirt and pigeons flying around," says Thomson. His only tool was a "mouse", a little mechanical gauge that measured the curvature and diameter of the pipe. As the waveguide emerged from the steel mill, he ran the mouse along its length and took readings to see whether the waveguide could do its job. But the signal was contaminated by noise: dust and droppings. So he had to find a way to "clean up" the signal, to ignore the noise.
To analyse a signal, you first have to cut it up; it's like an autopsy. The cutting tool is called a taper, and it is more like a pastry cutter than a delicate scalpel. There are hundreds of different tapers, many different shapes into which the pastry can be cut, but they all do damage to the signal. To clean up his data, Thomson needed to find the right sort of tapers, and more or less by chance he knew where to look.
In college, Thomson had had a job filing journals at the physics department. It was not a terribly onerous job. "You could take as long putting them on the shelves as you wanted, which meant you could thumb through them." He came across a paper written by David Slepian, Henry Landau and Henry Pollak in a technical journal. "I couldn't understand the proofs at that point, but I certainly remembered the results," says Thomson. "These magic functions had the best frequency resolution for a finite sample - they hit the Heisenberg uncertainty principle." Which means they provided a near perfect set of "pastry cutters".
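The "magic functions" in that paper are now known as the Slepian sequences, the basis of multitaper spectrum estimation. As a rough, hypothetical sketch of the idea (the signal and parameters here are mine, purely illustrative, not from Thomson's waveguide work), the Slepian "pastry cutters" can be tried out with SciPy's implementation:

```python
import numpy as np
from scipy.signal.windows import dpss

# Illustrative only: pick out a 10 Hz tone buried in noise using a
# multitaper (Slepian-taper) spectrum estimate.
fs, n = 100.0, 1024
t = np.arange(n) / fs
rng = np.random.default_rng(0)
x = np.sin(2 * np.pi * 10.0 * t) + rng.normal(0, 1, n)

# Slepian tapers: time-bandwidth product NW=4, keep 7 tapers.
tapers = dpss(n, NW=4, Kmax=7)              # shape (7, n)

# Average the tapered periodograms to form the multitaper estimate.
spectra = np.abs(np.fft.rfft(tapers * x, axis=1)) ** 2
mt_spectrum = spectra.mean(axis=0)

freqs = np.fft.rfftfreq(n, d=1 / fs)
peak_freq = freqs[np.argmax(mt_spectrum)]   # lands near 10 Hz
```

Each taper cuts the data slightly differently, and averaging the cuts trades a little frequency resolution for far less variance than any single cut gives.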
As the project shot ahead, Thomson's techniques for dealing with messy data improved further and further, and once the team was able to ensure that its waveguides were manufactured properly, it got results. "I'm kind of amazed that nowadays people are impressed when somebody or other gets a T1 rate. We started going for real in '68 at a T4 rate [274 megabits per second] on 60 parallel channels - and that was just the start-up speed," he boasts with a smile.
But results are never enough. "It was an interesting engineering education. We came in ahead of schedule, under cost, and had double the performance we originally expected, and we still didn't win." The winner had built very high performance repeaters for the existing coaxial systems; a cheaper, quicker fix. By the time the patched system began to show its age, the fibre people had overtaken the waveguide group. "At the time we started, a good piece of fibre could get you from here to the filing cabinet," explains Thomson, gesturing. "The attenuation was just ridiculous." But the attenuation was dealt with, and a technology that was less efficient than a tin-can-and-string phone eventually became the backbone of the global information network.
The little metal tubes went away; the Slepian tapers did not. In 1977 Bell was testing a mobile phone network. The initial results were not good. "We had an experimental system in Newark, which had half-a-dozen simple cell sites," says Thomson, but "the error rate was high enough that they were seriously wondering whether it was commercially viable."
The problem was noisy signals. "When you're looking at a cellular phone you're moving around and the signal randomly bounces off buildings, trees, traffic; the sequence you receive data in is not quite the same as it was transmitted." Thomson, however, was able to correct the errors.
Until then, choosing a taper was still more art than science. But as cellular phones were becoming a reality, Thomson finally cracked the taper problem. "It suddenly dawned on me where the missing functions were and why they should be there," says Thomson. The missing functions had been worrying him for a while. There seemed to be a whole lot of potential tapers he couldn't quite put his finger on - and then he realised he was thinking about them in the wrong way.
If you're walking across London Bridge and you see a person wearing a bowler hat and carrying an umbrella, you might think, "Aha ... a banker!" In the same way, when a mathematician sees an integral equation with a certain shape, he thinks, "Aha ... a convolution!" And mathematicians love convolutions because they have a large toolbox for tinkering with them. But Thomson decided not to treat the equations he was using as convolutions. His approach was more like seeing the bowler hat and saying, "Aha ... a man on his way to work!" It is an approach that tells you less to begin with, but if you follow it up it tells you more in the end. Thomson promptly followed his integral equation on its way to work and, in the process, he generated the missing functions. In 1983, he published his results. For the first time in his technological career, the scientists sat up and took notice.
The Scripps Institution of Oceanography invited Thomson to teach a course in spectrum estimation (one of the most important parts of signal analysis, and the area where the missing functions were most needed). Fortunately, it was at a time when the cell phone project was on bureaucratic hold, so Thomson could afford to take the trip from his New Jersey base to the sunny Californian shores of La Jolla.
Thomson's course at Scripps was a bit of a hit. Jeffrey Park, now a professor at Yale University, was one of the mathematically inclined attendees who found that it changed his life. He has since used the techniques he learned there to great effect in his seismological research and other work. Perhaps the most spectacular demonstration of their power came when he used them to discern the tides in the Amazon 1,000 miles from the ocean.
Thomson had an impact on Scripps and vice versa. The little redwood building where Thomson lectured sat on the edge of a cliff overlooking the Pacific Ocean. The institute faculty dressed in their "best formal beach wear" and after lunch would change into swimwear. Best of all, no one cared about cell phones and waveguides. They were interested in measuring the temperatures of ancient seas, in quantifying changing climates, and now and then in less academic problems such as detecting enemy submarines.
"It changed the course of how I do things," explains Thomson. "Before I got there, I was pretty much a conventional electrical engineer and statistician." Soon, though, he was studying ancient climates. Just like cell phone transmission, this depends on analysing garbled signals. But the signals are entirely different; ratios of oxygen isotopes in layers of sea floor mud. "Lots of fish shit," says Thomson.
By the time he got back to New Jersey he was no longer a straightforward engineer. And the change came just in the nick of time. Within a year, Thomson was kicked out of the cell phone business. "I became unpopular with management," says Thomson, grinning. "I told them that their billing algorithm for mobiles invited service fraud." Even though Bell was taking great pains to protect the serial numbers of cellular phones, at the beginning of a transmission the phones broadcast the serial number in plaintext. Thomson thought this was "kind of daft", as a clever eavesdropper would be able to steal the serial number and get free cell phone service. His prediction turned out to be entirely correct, but it didn't earn him the love of his superiors.
And he made a worse mistake. Some people in management were keen on an alternative version of cellular phones, using a different broadcast band. "I went to the library and xeroxed a page from a journal, from 1947 or thereabouts, which had the identical system," said Thomson, grinning. He sarcastically asked whether he should replace the vacuum tubes with solid-state transistors. He was forced to flee the cell phone group.
Luckily, there was still a home for him in Bell Labs: the research group. "The thing about research is that you pretty much choose your own projects ... you work on something that's probably useful to the company as a general rule, but not all the time."
The research group was always a very eclectic bunch. Thomson worked with statisticians, mathematicians, a quantum physicist, an expert on neutrinos and purifying materials and several space physicists. Members of the group were always dragging Thomson into their projects, but much of his work still dealt with communications. One afternoon he designed the system which detects tones for millions of touch-tone phones, while on another project he found that temperature sensitivity was wreaking havoc with the reliability of lasers in transatlantic cables. But he kept looking for new problems in the scientific world, too.
Soon he was an ad hoc astrophysicist. As such he was following in a fine tradition, as Bell Labs' foremost pure science achievement was the discovery, when trying to work out why a big antenna was so noisy, of the reverberating hum of microwaves left over by the big bang. One problem which had been bothering astronomers had to do with the light coming from a quasar: as Thomson puts it, "a black hole eating lunch." The quasar in question, QSO 0957+561, is gravitationally lensed - that is, the gravity from a large clump of matter between us and it distorts some of the quasar's light. The result is that two beams of quasar light reach the earth, twisted by the lens to a greater and lesser degree. The light coming down the shorter of these paths gets here faster.
Quasars twinkle. By matching up distinctive twinkles, it should be possible to tell how much longer the light taking the scenic route through the lens takes to reach the earth, but after coming halfway from the big bang, each beam is full of noise. Their unanalysed twinklings look nothing like each other when they reach the astronomer's telescope, so no one can tell how delayed the twinkles are; astronomers have come up with numbers anywhere between eleven days and nearly five years. Thomson burned through the noise. His estimate was that the light on the long route took about 420 extra days to go around the unknown mass.
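The delay-hunting problem can be caricatured with a simple cross-correlation. This toy sketch is emphatically not Thomson's analysis (his methods were far more sophisticated, and all the numbers here are invented); it only illustrates the general idea of recovering a lag from two noisy copies of one signal:

```python
import numpy as np

# Toy illustration, not Thomson's actual method: recover the delay
# between two noisy copies of the same "twinkling" light curve.
rng = np.random.default_rng(1)
n, true_delay = 2000, 420                      # samples; lag in "days"

def light_curve(t):
    # Two distinctive twinkles, modelled as Gaussian flares.
    return (np.exp(-((t - 300.0) / 20.0) ** 2)
            + 0.7 * np.exp(-((t - 900.0) / 15.0) ** 2))

t = np.arange(n, dtype=float)
beam_short = light_curve(t) + rng.normal(0, 0.1, n)              # arrives first
beam_long = light_curve(t - true_delay) + rng.normal(0, 0.1, n)  # delayed copy

# The lag at which the cross-correlation peaks estimates the delay.
xcorr = np.correlate(beam_long, beam_short, mode="full")
lags = np.arange(-(n - 1), n)
estimated_delay = lags[np.argmax(xcorr)]       # close to 420
```

With the noise Thomson faced, a naive peak like this would be swamped; the point of his taper-based methods is to make the matching robust when it is not.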
Astronomers soon divided into two camps. Some believed in the upstart engineer's time-delay estimate, others believed in a longer delay - about 540 days. Luckily, an unusual event is settling the debate. In early 1995, the quasar suddenly became more than 10% fainter. Edwin Turner, an astronomer based at Princeton, detected the drop and waited and observed. After 420 days, the intensity of the second image dropped.
"Most people seem to think that the data is confirming the 420-day delay," says Turner. Though astronomers can't be sure until the 540-day deadline passes, it appears that Thomson's signal methods have won the day.
There are also projects closer to home. The sun oscillates, like a ringing bell. For years, astronomers have mapped the pressure waves within it by looking at the pulsing of its surface. Another set of oscillations, the g-modes, have been much harder to see. In analysing data from the Ulysses space probe, Thomson found a signal just like the one expected from g-modes, not in the light from the sun, but in the solar wind of particles that blows off the sun's surface. No one expected the signal there, and not everyone is sure that Thomson is right. If he is, there's a lot of new theory to cook up.
Change of climate
Controversies about the sun and the quasars are one thing, but Thomson has also been working on one of the hottest of scientific problems: global warming. In 1990, a statistical journal had given Thomson a paper on carbon dioxide and global temperature to review. "I took the data and looked at it, and there was a flaw in the guy's argument," said Thomson. "But I'm a horrible pack rat, so I keep that sort of data around."

Out of curiosity, he started using his more advanced methods on this and other temperature data, including records from England that were centuries old. Much to his surprise, he noticed a "crazy drift" in the signal. The seasons were moving.
The reason for the drift, he thinks, lies with two different ways of defining a year. One way, the one the calendar is based upon, has to do with the tilt of the earth. Because the earth stays tilted in one direction as it orbits the sun, the northern hemisphere gets more light during one half of the year, while the southern hemisphere gets more light during the other half. On two days a year, the northern and southern hemispheres get the same amount of light. These are the equinoxes - one comes in the spring and one in the autumn. A "tropical year" is the time between spring equinoxes.
There is another sort of year, though. The earth's orbit is not quite a circle. It is an ellipse, so the earth gets closer to and further away from the sun as it moves in its orbit. The point where the earth is closest to the sun is called the perihelion of the orbit. An anomalistic year - the time between perihelia - is 25 minutes longer than a tropical year.
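The size of the resulting drift is a matter of simple arithmetic; this quick back-of-envelope check is mine, not the article's:

```python
# Back-of-envelope check: the anomalistic year differs from the
# tropical year by 25 minutes. Seasons locked to one year therefore
# drift against a calendar built on the other.
minutes_per_year = 25
drift_days_per_century = minutes_per_year * 100 / (60 * 24)
# about 1.7 days per century - roughly "two days a century"
```

Twenty-five minutes a year sounds like nothing, but over a century it accumulates to nearly two days' shift in the seasons.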
The relationship between the two years and the climate is complex, but in many places, Thomson found, the seasons follow the anomalistic year, and so they drift, getting earlier by about two days a century. The result came with a neat little control point for checking. In September 1752, there was a jump in temperature data from England; it was as if the seasons had shifted by three days in a single bound. In fact, they had. In 1752, the Julian calendar gave way to the Gregorian calendar and eleven days disappeared. The data is averaged weekly, so taking out eleven days had a net effect of stuffing three days into the averages. The fact that Thomson's analysis showed a single, sharp three-day jump that year shows just how sensitive it is.
But it is not the long term drift that excites Thomson, it is the recent drastic changes. Since 1940 or thereabouts, says Thomson, "Every phase curve you look at does something peculiar. Almost all of them do something crazy in the last half of the century - stranger than anything they've done in whatever amount of records you've been keeping." The seasons have begun to shift more and more quickly.
Most researchers examining the greenhouse effects of carbon dioxide concentrate on looking for changes in global mean temperature - a change in amplitude. What they see is still a matter of hot debate. There is a signal, but it is a weak one, and there is a body of opinion that says it has little to do with carbon dioxide.
By looking at the cycle of the seasons, rather than the average temperature, Thomson ended up looking for a shift in phase, not amplitude - something electronic engineers know is often much easier to find. He thinks that the reason he has found one is that when the northern half of the earth is tilted toward the sun, the greenhouse gases enhance the effect of direct sunlight, adding to the effect that the tilt has on the seasons. The tropical year starts beating the anomalistic year in the battle and the seasons begin starting earlier and earlier rather than later and later. The more carbon dioxide, he thinks, the more the seasons will drift.
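The phase-versus-amplitude point can be illustrated with a toy least-squares fit. This is my sketch of the general principle only, with invented numbers, not Thomson's actual technique:

```python
import numpy as np

# Toy illustration: estimate the phase of a seasonal cycle by fitting
# a sinusoid (plus a constant) to noisy monthly temperatures.
rng = np.random.default_rng(2)
years = 30
t = np.arange(years * 12) / 12.0                 # time in years
true_phase = 0.35                                # radians (invented)
temp = 10 * np.cos(2 * np.pi * t - true_phase) + rng.normal(0, 3, t.size)

# Fit temp ~ a + b*cos(2*pi*t) + c*sin(2*pi*t); phase = atan2(c, b).
X = np.column_stack([np.ones_like(t),
                     np.cos(2 * np.pi * t),
                     np.sin(2 * np.pi * t)])
a, b, c = np.linalg.lstsq(X, temp, rcond=None)[0]
estimated_phase = np.arctan2(c, b)               # close to true_phase
```

The annual cycle is enormous compared with any change in the mean, so its timing can be pinned down precisely even when the data are noisy - which is why a drift in phase can be easier to detect than a fraction-of-a-degree drift in amplitude.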
If Thomson is right, the early onset of winter may soon take a deadly toll on plants and animals. "There's a bird which flies from the Caribbean to Canada with the expectation that its favourite insect will be there to eat. If the insects hatch at the wrong time, then the birds will starve." Nobody had noticed this before Thomson started talking about it, publishing a paper on it in Science last year.
However, many climate watchers dispute Thomson's findings. Gordon Emslie, chairman of the physics department at the University of Alabama at Huntsville, is sceptical. "My first reaction was surprise. That seasonal variation depends on light from the sun less than distance seemed a little strange," he says. "Out of curiosity, I got the raw data and we agreed, to three decimal places, up to 1940. After 1940, it didn't agree. I had no idea why." And the abrupt change at 1940? "Data fluctuations ... they just happen," says Emslie. The one beginning at 1940 is no more significant than others in the data series, he believes.
Thomson disagrees. "It is larger than anything else in the record." Even the cataclysmic eruption at Krakatoa did not make as much of a fluctuation in temperatures. "As far as I can tell from the data, there is very little escaping the fact that we are making [extremely significant] changes in the climate," observes Thomson. "I'm not personally very happy with these conclusions myself. However, if we have to change the way we do things, there's no point in playing ostrich about it."
As yet, though, the climatologists, like the astrophysicists, are divided over Thomson's work. "Consensus? There is none," says Emslie. "If it proves what you wanted to show all the time, you're going to support it ... or if you're like me, you're objective."
It is not difficult to understand why astrophysicists and climate scientists are so sceptical about an engineer entering their field, solving their problems, and leaving. They don't think the same - which is why Thomson, whatever he works on, will never really be one of them. He is more interested in his tools and their applications than in the nature of the problems, or the body of work behind current attempts to understand them.
Thomson's work reflects his tools. It is by those tools that his work will stand or fall. Man gazed at the stars for millennia before Galileo's telescope ushered in modern astronomy. Many laughed at van Leeuwenhoek's wee beasties, but his microscope spawned a new science. David Thomson seems to have created a tool which, in his hands, is as powerful as the strongest radio telescope or the finest electron microscope. It may prove just as revolutionary - if he can get others to use it, too.
Charles Seife is a science journalist and quondam mathematician who lives in New Jersey.