Chicago Tribune

THE MENTAL picture most people have of high-speed, ultramodern supercomputers probably comes from the movies: impressive machines with flashing lights and tape reels whizzing and whirring as questions are fed into one end of a box and answers flow out the other.

As it turns out, real supercomputers don't look like that at all, and they don't work that way either.

There are no blinking lights or reels of tape, not even a box. The Cray X-MP/24 supercomputer churning away on the University of Illinois campus in Urbana is a bright red affair covered with vinyl at its base. It bears a striking resemblance to a sofa in an airport lobby.

You can even sit on it.

And the researchers don't use the supercomputer to produce actuarial tables, to figure out how much income tax they owe or to do other problems that have specific answers.

The supercomputer is a tool for glimpsing aspects of reality that cannot be seen otherwise, a window to worlds whose very existence may boggle the mind of the uninitiated.

There are insights and suggestions, but no pat answers to be had in these exercises.

For example, scientists feed reams of data into the computer to make graphic representations of turbulent activity around black holes, those collapsed stars where gravity is so strong that even light cannot escape.

Because black holes aren`t visible to telescopes, graphic representations are a useful tool for researchers studying how they behave.

Or the supercomputer can chart how electrons flow through proteins, the fundamental units of biology, or through semiconductors, the building blocks of computers. Such pictures may someday help scientists design new products ranging from weedkillers to computer chips.

Engineers can use the supercomputer to create three-dimensional models of groundwater aquifers and study how pollution is likely to move through sand and soil deep underground. From this, they can get insights into how to avoid fouling vital underground water supplies.

The supercomputer's unusual shape isn't a whimsical attempt to dispute the vision of fiction; rather, it is a way to minimize the amount of wire needed to connect the circuit boards.

By placing the power sources as close to computation and memory equipment as possible and arranging the wires in cone-like configurations, the amount of wire needed to transport electrons from one of the machine's 2,160 circuit boards to another is minimized.

In conventional computers, a few extra inches of wire here or there may not make much difference, but in a supercomputer, every nanosecond an electron spends in transit cuts down efficiency noticeably. Even so, the supercomputer contains more than 50 miles of wire.
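The stakes of those extra inches can be checked with back-of-the-envelope arithmetic. The sketch below is my own illustration, not figures from the article; it assumes a signal propagates through wire at roughly 70 percent of the speed of light, a common rule of thumb.

```python
# Illustrative signal-delay arithmetic (assumed propagation speed,
# not a figure from the article).
C = 299_792_458            # speed of light, meters per second
SIGNAL_SPEED = 0.7 * C     # assumed speed of a signal in wire

def transit_time_ns(wire_length_m: float) -> float:
    """Nanoseconds for a signal to traverse the given length of wire."""
    return wire_length_m / SIGNAL_SPEED * 1e9

# A few extra inches (about 0.1 meter) of wire costs roughly half a
# nanosecond per trip -- significant on a machine whose basic
# operations are measured in nanoseconds.
print(f"{transit_time_ns(0.1):.3f} ns")
```

At that scale, shaving inches off thousands of board-to-board connections is not cosmetic; it is where the machine's speed comes from.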

The Cray supercomputer was installed at the Urbana campus last fall after the U. of I. was designated by the National Science Foundation as one of five academic centers in the country where supercomputers would be made available to researchers doing basic science.

Supercomputers are super expensive. The Illinois machine cost $11 million and it is a used one, coming from Los Alamos National Laboratory in New Mexico, where it was used in weapons research.

Until now, the super-expensive supercomputers had been the domain of industrial researchers and those in government working on specific science projects, most of them in weapons development.

An infusion of more than $50 million of federal and state funds planned over the next five years in Illinois has attracted millions more in private gifts to the university, which will install another more powerful Cray computer this year.

Larry Smarr, a physics professor and director of the university's supercomputer applications center, said the Cray is a magnet attracting scientists to Illinois. While on campus, they may mingle with colleagues from other universities in an atmosphere that has some trappings of a computer shopping center.

Not only do they learn about the supercomputer, but they also can learn about the latest advances in personal computers. Major computer companies such as IBM and Apple have donated a wide variety of personal computer models for sampling on the Urbana campus.

Most researchers use the supercomputer as a back-up to standard personal computers.

In general, said Smarr, scientists are realists who attack only those problems that they have hopes of solving. Any problem a computer can address could, in principle, be worked out by a skilled person with pencil and paper. The difference is time.

The supercomputer takes problems that might tie up an ordinary computer for days and does them in an hour or two. Having a machine that can calculate so many things so quickly is motivating scientists to work on problems that would have been unthinkable a few years ago.

If it takes two years of working with a pencil to solve a problem, few researchers are likely to tackle it. But when a computer cuts that time to a few days, many may become interested.

An Illinois electrical engineer, Karl Hess, has written a program to simulate the motion of 50,000 electrons within crystals of gallium arsenide and aluminum arsenide, two substances that will likely play a major role in future computer chip manufacture.

It is possible to run the Hess program on a standard high-speed computer, but it takes about 100 hours. Unfortunately, it is difficult to run a computer for 100 hours without a breakdown of some kind, Hess noted. Thus, it could take weeks to get results.

The Cray can run the program in an hour.

Another user of the supercomputer is Peter Wolynes, a U. of I. chemistry professor, who harnesses its immense calculating powers to chart the probable paths an electron may take in traveling through the strands of a protein.

When an electron begins its journey, it can take any of infinite combinations of routes, with no route appearing more inviting than the others. This task is akin to mapping the probable route taken by a team of competitors in a scavenger hunt who must begin in New York City and end in Los Angeles, collecting their prizes from small towns in between. There is no correct answer, but some routes may be more promising than others.

In Wolynes' work, an infinite number of routes are available, but to simplify things, he asked the computer to look at only about 800,000 of them. As the Cray crunched numbers in the formulas with lightning speed, it became apparent that some routes are more probable than others.
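The sampling idea can be sketched in miniature. The toy model below is entirely hypothetical, not Wolynes's actual formulas: an "electron" hops through a short chain of sites, each hop weighted by an assumed ease of transfer, and tallying many sampled routes reveals which ones are most probable.

```python
# Toy route-sampling sketch (hypothetical weights and sites, not
# Wolynes's model): draw many random routes and rank them by frequency.
import random
from collections import Counter

# Assumed relative ease of hopping to each site at every step.
STEP_WEIGHTS = {"A": 0.5, "B": 0.3, "C": 0.2}

def sample_route(steps: int, rng: random.Random) -> tuple:
    """Draw one route: at each step, pick a site weighted by ease."""
    sites = list(STEP_WEIGHTS)
    weights = list(STEP_WEIGHTS.values())
    return tuple(rng.choices(sites, weights=weights, k=1)[0]
                 for _ in range(steps))

def most_probable_routes(n_samples: int, steps: int = 3, seed: int = 0):
    """Sample many routes; return the three that occur most often."""
    rng = random.Random(seed)
    counts = Counter(sample_route(steps, rng) for _ in range(n_samples))
    return counts.most_common(3)

for route, count in most_probable_routes(100_000):
    print("".join(route), count)
```

Even though no single route is "correct," the tallies make the landscape of probabilities visible, which is the point of the supercomputer runs.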

By simulating electron behavior and producing colorful graphics to illustrate that behavior, chemists hope to gain insight into fundamental processes such as photosynthesis. It is abstract at this point, very basic work that promises fruits whose nature researchers can only speculate about.

"This could give us ideas about how proteins could be redesigned," Wolynes said. Such proteins would be tailor-made to serve the ends of humans rather than the plants or animals in which they evolved.

Computer models may suggest directions for laboratory research that could result in drugs that are more powerful and specific–or herbicides that kill only a particular weed, with no other impact on the environment.

Albert Valocchi, a civil engineering professor at the U. of I., hopes to use the supercomputer to make three-dimensional models of how pollutants flow through soil when they leak from landfills.

The two-dimensional models now used for designing landfill liners and caps are limited by the power of conventional computers and woefully inadequate, Valocchi said, although they are the best tool available at present. So many factors must be simplified or ignored that the models are unlikely to predict pollutant flow accurately.

In the real world, even a relatively uniform soil such as one composed mostly of sand may have vastly different qualities of porosity at different locations, Valocchi said. An ideal model would include information about the quality of soil taken from several places, providing a cascade of numbers that would overwhelm a conventional computer.

"We've been using these two-dimensional computer models since the 1970s," said Valocchi, "and before that we did the work by hand. Using the supercomputer for three-dimensional modeling should give us a much better handle on what is really going on. We also want to look at complicated biological reactions with these models."

The Cray supercomputer achieves its speed and calculating power by using the most advanced hardware available, so that the time it takes for electrons to speed through its circuitry is kept to a minimum.

There is a new approach to attaining greater speed and computing power, an approach that promises to be less costly than pushing hardware technology to its limits, and Illinois researchers are in the forefront of that effort.

The new strategy is to modify the traditional way computers solve problems by first dividing up the work into several small components and then giving each component to a separate processor for calculation.

This parallel approach may be best understood by considering the problem that faces a grocery shopper with 100 items on his list. If he picks every item off the shelf and races to the checkout line himself, there are inherent limits on how fast he can shop: the distances between grocery aisles, the dexterity of the checkout clerk and the like.

One way to cut the shopping time would be to delegate the list of 100 items to 100 shoppers with each responsible for finding and buying only one item in a store with 100 checkout lanes ready to serve them. If the list of items can be delegated efficiently and the 100 items assembled quickly once purchased, this approach could sidestep many time delays associated with a single shopper doing the work.
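The delegate-and-reassemble pattern in the shopping analogy can be sketched in a few lines. The names here are illustrative only, not the Illinois team's software; threads stand in for the shoppers, where a real supercomputer would hand each piece of the work to a separate hardware processor.

```python
# Divide-and-delegate sketch of the shopping analogy (illustrative
# names only): split a 100-item list among workers, then reassemble
# the results in their original order.
from concurrent.futures import ThreadPoolExecutor

def fetch_item(item: str) -> str:
    # One unit of work: one shopper finding and buying one item.
    return f"bought {item}"

def shop_in_parallel(items, workers: int = 8):
    # Delegate the list across the pool of workers; map() returns
    # results in list order, like reassembling the full cart.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(fetch_item, items))

groceries = [f"item-{i}" for i in range(100)]
print(len(shop_in_parallel(groceries)))
```

The hard parts in practice are exactly the ones the analogy flags: splitting the list so no worker stands idle, and recombining the pieces without the bookkeeping swallowing the time saved.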

David Kuck, director of the Illinois center for supercomputing research and development, is using this second approach to computation, working on ways to use several processors to divide up complex problems and work on different aspects of them simultaneously.

Kuck predicts that by next year his team will have put together equipment that can match the Cray's performance at a fraction of the cost.

By October, Kuck expects to have 16 processors harnessed together and working on problems; by the end of this year, 32 processors will be working in tandem. Within a few years, he expects to have 128 processors working together.

A goal of Kuck's research is to produce a supercomputer that will provide high-speed processing on all kinds of computing tasks and do it at an affordable cost, he said.

As Kuck's work proceeds, some insights into more efficient computer programming may be incorporated into use of the Cray, said Smarr. The Cray in use has two processors, and the one to be installed later will have four.

By efficient programming, users can get maximum use of these processors, Smarr said, although there are no plans to take the Cray apart and tinker with its hardware, as Kuck is doing with conventional high-speed computers in his lab.