
A major overhaul of America's multibillion-dollar university-based research system is under way. And almost inevitably, regardless of November's election results, relationships forged at the end of World War II among research universities, the federal government and industry soon will change fundamentally.

The shift might harness U.S. brainpower to compete better in the global economy, as proponents intend. Or it might destroy the Nobel Prize-winning basic-research juggernaut that has dominated science worldwide for more than 40 years, as many scientists fear.

But while virtually all would-be architects of science's new order praise basic research and the value of studying things of no apparent practicality, they have a clearly stated goal: to erect a new research establishment that produces more readily useful discoveries to justify public expenditures on science.

The federal government's two major sources of research funding, the National Institutes of Health and the National Science Foundation, have launched drives focusing their missions on more result-oriented research. In Congress, House and Senate committees have issued reports suggesting research funding be tied more closely to goals that benefit taxpayers.

In the private sector, more voices join the chorus, the latest coming last week from the Carnegie Commission in a report titled “Enabling the Future: Linking Science and Technology to Societal Goals.”

The Cold War's end, coming in the midst of America's realization that international competitors threaten its living standard, has focused critical attention on a research infrastructure designed by Carnegie Institution President Vannevar Bush in 1945 at the request of President Franklin Roosevelt.

In proposing the National Science Foundation, Bush stressed the importance of fundamental research, which strives to expand knowledge, and the need to nurture it separately from science applied to defined goals.

In an era when Americans credited pure science with yielding the atomic bomb, which helped end World War II, Bush's suggestions were carried out.

The arrangement was successful, but things change. Frank Press, president of the National Academy of Sciences, said this spring that America is entering a “post-(Vannevar) Bush” era of science policy. Few in Washington disagree.

Many scientists, especially university-based researchers, fear they are being penalized for American industry's failure to develop and market new technologies. To them, things seem to be going from bad to worse.

Two years ago Leon Lederman, a Nobel laureate and now a physics professor at Illinois Institute of Technology, surveyed university researchers and found widespread unhappiness tied mostly to funding difficulties. While government research spending had increased steadily, Lederman noted, it hadn't kept pace with growth in the number of scientists and the rising sophistication and costs of their tools.

Rather than inspire policymakers to dig deeper into their pockets for research, Lederman's report caused many in government to regard professors as whiners and ingrates and to question whether the country has too many university-based scientists.

“The universities as a whole create clones of themselves,” said D. Allan Bromley, a Yale physicist who is President Bush's science adviser. “The educational focus is on graduate students who are encouraged to emulate their faculty advisers and become university researchers. They de-emphasize industrial science.”

An underlying problem, Bromley said, is that many of today's college professors were graduate students in the 1960s, a “golden age” of unprecedented prosperity, awash in federal research dollars flowing in response to Soviet space successes, starting with Sputnik.

“They think of the early '60s as the norm,” Bromley said. “That isn't the norm. We're closer now to the norm.”

University researchers will continue basic research, he said, “but they'll have to sensitize their students to real-world challenges, to the opportunities of working for industry.”

Rep. George E. Brown Jr. (D-Calif.), chairman of the House Committee on Science, Space and Technology, wrote a report suggesting that research should be tied to national goals and that projects that don't produce should lose funding.

“I'm a strong supporter of research purely for expanding human knowledge,” Brown said, “but the basic researchers who fail to recognize that support can only come from a healthy, growing society are doing themselves a disservice.”

Perceived changes in the nature of science, along with researchers' economic demands, are pushing policymakers toward this new attitude.

Walter Massey, the former University of Chicago vice president who heads the National Science Foundation, said that his agency's mission should be broadened because science is changing, and that economics isn't his sole consideration.

As technology gives scientists tools to take on more ambitious inquiries, the fundamental distinctions between science and technology, between basic and applied research, and even between physics and biology are blurring, he said.

“We find that science and technology are becoming more and more coupled,” Massey said. “When we support a project in surface chemistry, it's only possible to do because a new microscope has been built, and that microscope is only possible because of work that led to new chips in developing the microscope.”

Few oppose bettering American life through technology, but the likelihood of tight federal budgets leads many to expect that the new era will mean university researchers will get less money, or at least will lose some control over what they study.

“It's scary the way it's happening,” said Robert L. Park, executive secretary of the American Physical Society. “It appears they are looking to the NSF and NIH to make up for the failures of American enterprise.”

Park noted that Vannevar Bush had warned that demands of applied research tend to crowd out support for basic research. Park said Washington`s current inclination seems to be to let that happen.

What politicians don't seem to understand, say Park and other scientists, is that science doesn't usually respond well to crash programs such as building the atomic bomb or sending humans to the moon. Those projects worked because the foundation of basic knowledge was in place. When basic knowledge is lacking, crash programs fail.

Dr. Tom Feldbush, associate dean for research at Northwestern University's medical school, said recent developments remind him of the Nixon administration's ballyhooed war on cancer. Despite spending billions, cancer researchers achieved little, he said, because they lacked sufficient fundamental knowledge upon which to build.

When the government sets a specific goal and hands out money to scientists to work on attaining it, Feldbush said, “it forces a whole lot of people to work in areas in which they are ill-prepared and in which there isn't a lot of fundamental information available so you can make significant progress. It can actually result in funding second-rate research efforts.

“It creates unrealistic expectations. If you throw money at something, the expectation is you're going to solve it. But then 20 years later, you still haven't cured cancer, and people wonder why.”

Despite their misgivings, many university-based scientists realize they no longer can be aloof from industrial needs.

They cannot continue viewing themselves as “high priests of pure knowledge” while looking down on industrial researchers as “the devil at the door,” said James P. Womack, a scientist at Massachusetts Institute of Technology who studies global competition.

“Research universities are going to depend more and more upon industrial funding,” Womack said. “They are going to learn that those who are being paid don't get to lead the parade.”

With federal funding stagnant, many university scientists have embraced commercial enterprises to keep their research funded.

A decade ago Len Batterson, a Chicago-based venture capitalist, spent several months working with University of Chicago scientists who had developed herpes treatments that produced promising results in animal tests. But nothing came of his efforts to start a company to commercialize the work.

“They got cold feet,” he said. “It just wasn't something that university people were comfortable with then. They were afraid of their colleagues' reactions.”

Attitudes have changed, Batterson said, and he has helped start four Chicago-area companies based on research at universities and Argonne National Laboratory.

“In the last three or four years, things have changed substantially,” he said. “Now we're welcomed with open arms.”

By focusing on universities and their societal responsibilities, the current spate of reports and reappraisals risks overlooking a major culprit in America's failure to commercialize technology in recent decades: private industry's shortsightedness.

George Heilmeier, president and chief executive officer of Bellcore, the research consortium of the Bell operating companies, made that point last week when he accepted the 1992 Founders Award of the National Academy of Engineering.

Heilmeier was honored for inventing liquid crystal displays, or LCDs, which have become important parts of some laptop computers, small watches, calculators and other electronic products.

Although his staff produced prototype devices by 1968, his work hit a dead end when U.S. corporations couldn't see a commercial use for it. Japanese engineers picked up the technology and turned it into a commercial bonanza.

Heilmeier said no government strategies will help an industry become competitive unless managers of American enterprises are innovative and tenacious enough to take new technology as far as necessary to achieve business success.