The Creativity Conceit

[Image by Adam Larson]

This article is part of a package on creativity. For more, read “The Future of Creativity,” “Why Essays Are So Damn Boring,” “Bright Ideas from Baltimore’s Citizens,” “Art + Science = Inspiration,” and “Putting the Arts Back into the Arts.”

Almost everything the Apple computer company sells these days comes with the following statement of origin: “Designed by Apple in California, Assembled in China.”

The implication is obvious: A few brilliant, creative Americans did the real work, while low-skilled Chinese assembly workers, laboring in serflike conditions, did the rest. Citing Apple’s iPod at a Virginia trade conference last year, former U.S. Treasury secretary John Snow commented, “China gets to do what they do well: low-value manufacturing. America gets to do what we do well: return on intellectual capital. It’s good for both of us, but I would rather be on our end of that.”

Such talk panders to one of the most consequential illusions of contemporary American economic thought: that by dint of its unique creativity alone, the United States can count on remaining the world economy’s top dog. This assumption, shared by intellectuals on both sides of the U.S. political divide, goes a long way toward explaining the electorate’s relative apathy about the collapse of America’s manufacturing sector. As the Harvard-educated Japan historian Ivan P. Hall puts it, such thinking is just “smug ethnocentric American complacency–little more than whistling in the dark.”

Let’s first dispose of the misconception that America’s “culture of freedom” is a crucial advantage in innovation. Of course, absent a basic level of freedom, creativity does not flourish. But the bar is set quite low. None of the most inventive cultures of antiquity–China, Mesopotamia, or Egypt–counted as a civil liberties utopia. Nearer our own time, Nazi Germany, fascist-era Japan, and the old Soviet Union all displayed considerable inventiveness.

The lesson of history is that if America’s maximalist concept of individual freedom is a factor at all, it is hardly decisive. All the evidence shows that something else is much more important: money.

The wealthier a society is, the more inventive it tends to be. Just ask any of the thousands of brilliant Western European scientists and engineers who, in a phenomenon known as brain drain, began emigrating to the United States in the 1950s. They were not seeking freedom. They had that already. Rather, they wanted to work with the most advanced equipment and the largest research budgets. (Where relative economic laggards have sometimes punched above their weight–say, Japan in the 1930s or the Soviet Union in the 1950s–government leaders have gone out of their way to provide teams of handpicked scientists and engineers with massive support.)

Three centuries before Christ, the Chinese invented the magnetic compass. Contemporary Northern European hunter-gatherers could never have made such a breakthrough. They may have been equally brilliant, and they no doubt enjoyed greater liberty, but they lacked the advanced materials and knowledge already available to the much more affluent Chinese.

Similar factors explain the extraordinary inventiveness of the Muslim world during Europe’s Dark Ages. The Arabs were then one of the world’s richest peoples, and their craftsmen routinely worked with the rarest and most advanced materials. Their familiarity with glass-making techniques, for instance, helps explain why it was the Muslim polymath Abbas Ibn Firnas whom some credit with inventing eyeglasses in the ninth century.

It is hardly news that the United States has been in relative economic decline since the 1960s. What’s less obvious is that America has been losing relative position in inventiveness almost as fast. The correlation is not an accident. As other nations have prospered, they’ve spent more on educating scientists and engineers and put more of them to work on technology’s cutting edge.

For several years, Japan has dedicated more of its workforce and its gross domestic product to research and development than the United States. What’s more, while much of what passes for R&D in the United States now consists of lightweight activities such as website building and software customization, the Japanese focus their technological efforts on building a competitive advantage in export industries.

The Europeans have been leaping ahead in Big Science. The trend is expected to be highlighted this year with the opening of Europe’s $5 billion Large Hadron Collider. Located on the Swiss-French border, it will be by far the world’s largest and highest-energy particle accelerator. A proposed American response, the International Linear Collider, will be largely funded by Japan–so heavily that it may well be located on Japanese soil.

America’s era of greatest innovation was the 1930s through the 1960s. In the 1930s alone, American inventions included nylon, the helicopter, the electron microscope, and the automated teller machine. Then in the 1940s came the bazooka, the atomic bomb, the microwave oven, and the transistor. The 1950s brought the nuclear reactor, industrial diamonds, the computer hard drive, the integrated circuit, the videocassette recorder, and the communications satellite; in the 1960s the laser, the computer mouse, and light-emitting diodes followed.

Of course, the flow of significant American breakthroughs didn’t stop in 1970. American leadership has become increasingly attenuated, however. Although Americans played a key role in developing both personal computers and cell phones, for instance, these innovations were rather predictable refinements of earlier devices.

The story has been similar in liquid crystal displays. While scientists from the United States, Japan, Britain, and Switzerland have all made significant contributions, commercialization has been led by the Japanese. In a related development, the Japanese claim most of the credit for creating high-definition television, despite a much publicized but short-lived intervention by Zenith and General Instrument in the early 1990s.

If America’s declining technological prowess has been little publicized in the United States, the trade figures are indisputable. In a 2005 report to the U.S.-China Economic and Security Review Commission, technology-policy analysts Pat Choate and Edward Miller summed up the point in their definition of a China Sphere, a region encompassing not only mainland China but also the wider Confucian world from Vietnam to Japan. As of 2004, the China Sphere already enjoyed a $60 billion surplus in technological trade with the United States–a divide that grows with each passing year.

“The United States’ economy is so large and powerful, and its scientific and technological leadership has long been so overwhelming, that the nation could ignore potential technology-based flaws, traps, and dangers,” Miller and Choate commented. “But that era is quickly ending.”

Due to a legacy of isolationism that cut the region off from outside intellectual influences, East Asia was slow to enter the technology race. Now it’s becoming the world’s technological center of gravity. When the region began opening up, government leaders insisted that the first duty of leading scientists was not to win Nobel prizes but to build national economic muscle–and to do so mainly by overtaking the West in advanced manufacturing. Throughout the region, career incentives have been structured to ensure that the most brilliant scientists go into industry rather than universities or public research institutes.

This brings us back to the untold story behind Apple’s statement of origin. Although the company is correct in stating that its products are assembled in China, this sidesteps the real question: Where are the components made?

One key part of the iPod is a miniaturized hard drive made by Toshiba of Tokyo, and it constitutes a disproportionately large share of the total manufacturing cost. In terms of employment, the real winner has not been California, where Apple’s design department has created negligible employment opportunities. Nor has it been China, where assembly workers are paid a pittance. Rather, it has been Japan, whose highly capital-intensive manufacturing facilities pay factory workers some of the highest wages in world manufacturing–and earn their employers a healthy trade surplus.

Eamonn Fingleton is the author of In the Jaws of the Dragon: America’s Fate in the Coming Era of Chinese Hegemony (Thomas Dunne Books, 2008). This piece is excerpted from The American Conservative (Nov. 5, 2007), a journal of “old conservative” ideas, cofounded by Scott McConnell and Pat Buchanan in 2002. Subscriptions: $29.95/yr. (24 issues) from Box 9030, Maple Shade, NJ 08052.
