We’re so proud of our dog-eat-dog world that we fail to notice that it’s not working.
Nice guys finish last. Survival of the fittest. Eat or be eaten. For Americans, such catchphrases strike familiar chords. Stemming from an unwieldy synthesis of social Darwinism and (until recently) trendy Chicago School economics, this ethos claims that mercilessly competitive conditions weed out the weak while preserving and enhancing the strongest members of an institution, a market, or a civilization. Roughness and ruthlessness render us more competitive, thicker-skinned, and simply better than the rest of the pack.
When this belief system bleeds over into the realm of political discourse, it transmogrifies into a paradoxical badge of honor, a disposition toward sink-or-swim hard-heartedness. “The public be damned!” William H. Vanderbilt famously told a reporter who asked the 19th-century tycoon about social responsibility. His sentiments can still be heard today, couched in PR-friendly euphemisms or offered as hearty retorts to the soft communitarianism of Scandinavia, Continental Europe, and Canada.
Of course, there have been dissenting voices; the Progressive leaders who crafted the New Deal exemplified this dissenting spirit in the first half of the 20th century. But even dissenters have rarely questioned the premise of social Darwinism itself.
Even in the wake of the 2008 economic meltdown, opposition to the “cachet of the cutthroat” is generally confined to ethical qualms about the suffering and personal cost imposed on hard-pressed individuals and families—deploring the scale of the misery, rather than addressing its roots.
Across the ideological spectrum, prevailing wisdom holds that institutionalized harshness generates a more productive, adaptive, and wealthy society, with “liberalism” left to debate merely whether the resulting human collateral damage is an acceptable cost of doing business. Although moral objections are clearly relevant, the most devastating counterargument to the cachet of the cutthroat is that it is simply wrong.
If there is any lesson to be gleaned from the churn of society’s various “isms,” it is that both nature and human communities are too complex to reduce to a single, linear theory. So it is with the curious history of social Darwinism.
The work of the 19th-century naturalists Charles Darwin and Alfred Russel Wallace led to a coherent theory of evolution by natural selection, culminating in Darwin’s 1859 publication of On the Origin of Species. Neither cast his work as a recommendation for the complex, fast-changing domain of human societies, populated by intelligent members whose conduct was guided by ethical codes.
Others were not so circumspect. The term “survival of the fittest” was coined in 1864 by a contemporary of Darwin, British sociologist Herbert Spencer. Even before Darwin’s work, Spencer had ascribed a degree of moral rectitude to the callous workings of nature, which included, in his mind, the harsh inequalities of Victorian society.
In Darwin’s work, Spencer found a naturalistic scaffolding upon which to buttress his interpretations of social organization, and, in so doing, converted Darwin’s descriptive observations into prescriptive arguments about an ideal society.
Others followed, in no small part because social Darwinism would prove convenient for a number of vested interests. Many of the era’s industrial magnates, imperial officials, and landed aristocrats were more than willing to overlook its shortcomings. Vanderbilt was only the first; robber barons like Jay Gould and James Fisk followed his lead, while an army of subservient politicians, lawyers, and even preachers espoused social Darwinism across the country.
Substandard wages, child labor, monopoly capitalism—all were justified by an appeal to the “survival of the fittest.”
The push toward social Darwinism was momentarily blunted by late-19th- and early-20th-century reformers and human rights conventions. But as historical memory ebbed, so did the intellectual bulwarks against the predatory excesses that had brought on the Depression. Fewer and fewer in the postwar era questioned the traditional rendition of a cruel natural world underlying our own.
Following the collapse of the Soviet Union in 1991, the United States emerged as the world’s unquestioned economic and military hegemon. In this context, “Greed is good”—the narcissistic tagline uttered by corporate raider Gordon Gekko, played by Michael Douglas, in the 1987 film Wall Street—became a rallying cry.
By the mid-2000s, this philosophy had spilled over into other arenas: the office, the media, courtrooms, and athletic pursuits. If only the strongest would survive, then all was indeed fair in this most “competitive” of marketplaces. In September 2008, however, the collapse of Lehman Brothers ignited a maelstrom of financial turmoil, and the ensuing cascade of calamities stretched out over many painful months.
Under stress, the social Darwinist economic model had delivered a worst-of-all-worlds debacle: a system lacking in compassion during an arduous crisis, yet unable to demonstrate the resilient wealth creation it supposedly promised in return. Adam Smith himself warned of the effects of such an economy: “No society can surely be flourishing and happy, of which the far greater part of the members are poor and miserable.”
Yet even in post-crisis 2010, the forces of reform are weak, and we seem to be moving toward Smith’s dystopia. Rather than try, try again, we’d do well to consider the alternative: that social-Darwinist systems fail, and fail on their own terms.
Fundamentally, the cachet of the cutthroat fosters all the wrong incentives. Without the regulatory networks and transparency needed to manage such raw impulses—safeguards Adam Smith himself underscored—five deadly tendencies are reinforced. First, it rewards unscrupulous behavior. If quasi-legal bilking of customers can yield rapid profits, then more constructive paths are bypassed.
Second, it too easily equates profits with generated wealth, ignoring the fact that a company can rake in enormous profits without contributing real goods and services. Far too many health insurers, for example, make obscene profits by denying care rather than providing services. Third, social Darwinist systems stifle constructive criticism and creative thinking. In a cutthroat workplace, even those who calmly report design flaws are penalized, because they disrupt the quick rollout of short-term-profit-maximizing products.
Fourth, it makes human beings into commodities, with ruinous effects on morale. And finally, it promotes short-termism, the most pernicious incentive. Social Darwinism compels an obsession with easily quantifiable, immediate metrics of success. Clear-cutting an ancient forest (for development) would be rewarded in an “efficient market” for yielding quick profits, while ignoring less-quantifiable damages (to the local ecology) that would far outweigh the initial gains.
The ravages of short-termism were illustrated with sobering clarity in a late 2009 Harvard Business Review forum titled “Is the U.S. Killing Its Innovation Machine?” Featuring nearly two dozen accomplished panelists, the forum noted that U.S. high-tech companies—driven by unflinching demands for cost cutting—have outsourced even high-end processes, so much so that entire sectors of engineering and computer science effectively lack home-grown expertise. Maintaining leadership in technology and manufacturing requires precisely the sort of long-term investments—in basic R&D and in trained professionals—that cannot be quantified on a balance sheet.
Yet compassion and competitiveness can go hand in hand. Scandinavian countries—mostly resource-poor nations with high social investment—illustrate this most cogently, ranking highly in per-capita GDP, business competitiveness, and high-tech production. Likewise, Germany—with similar policies and resource scarcity—has retained its manufacturing strength and rivals China as the top exporter, despite having barely 6 percent of the latter’s population.
In these countries, carefully regulated capital markets and tax credits—which promote professional development, ecological sustainability, and high-tech job creation—provide incentives for scientific innovation, R&D, and their societies’ general fund of knowledge.
Ultimately, social Darwinism fails in practice because it never succeeded as a theory. When Darwin outlined a mechanism of natural selection so cold and selfish, he never meant it to extend beyond the natural world. He professed a Socratic ignorance about the nature of human morality and sympathy, which he recognized as seemingly outside the system of natural selection and yet somehow born of it.
In recent years, however, work in both the biological and social sciences has indicated that traits like compassion and empathy are elemental to the wiring of animal nervous systems, and thus to their behavior and evolutionary change. Dutch primatologist Frans de Waal, author of popular books such as The Age of Empathy, has outlined how “soft” characteristics augment evolutionary fitness in a population—as demonstrated in robust ape communities whose members diligently tend to their sick and wounded.
Meanwhile, industry-oriented literature—such as Kristin Tillquist’s Capitalizing on Kindness and William F. Baker’s Leading with Kindness—has extended the theme to human systems, by countering the business-as-war metaphor.
Most of this literature is recent and the case is far from settled, but a common thread runs throughout: As an association of individuals attains higher orders of complexity, so does the corresponding need to ensure trust and solidarity among its constituents. At those higher levels, the callous precepts of social Darwinism—which may be adaptive in simple competitive scenarios—become fundamentally maladaptive. Dog-eat-dog behavior increasingly poisons the more complex levels of organization.
These principles are of far more than mere academic interest; they are pivotal to guiding the real world of fiscal and public policy amid the tumultuous uncertainties of the 21st-century economy. Our system’s zero-sum adversarialism has reached a disastrous endpoint: suffocated by ideological polarization, fruitless partisan bickering, stifling parliamentary obstacles, and the iron grip of moneyed interests.
The result, whichever party grasps the reins, is a fractious, dysfunctional institutional paralysis in the United States, incapable of tackling the fine-grained nuances of 21st-century public policy without collapsing into ham-fisted ideological quarrels.
Subsidized university tuition and high-quality public schools, universal child and health care, job-creating public works projects and infrastructure spending, government-sponsored research and conservation efforts, carefully managed unemployment insurance and social safety nets—all of these are misleadingly cast as crude liberal-conservative battlegrounds, with any outcome interpreted as benefiting one coalition or another.
In reality, such policies foster the cohesion and resilience that enable a country to weather economic storms, free up its citizens’ talents for creative and entrepreneurial endeavors, and emerge as a more competitive and self-reliant entity. In the 20th century, it was a Republican president, Theodore Roosevelt, who first understood this paradoxical intricacy with his trust-busting, regulating, and conservation efforts, and his pragmatist successors—FDR, Truman, and Eisenhower—spanned both parties.
Today, it is our global peers that have best managed to reconcile the seeming contradiction of a competitive society steeped in compassion. Commentators across the U.S. political spectrum mistakenly regard Europe as a bastion of “leftism” or “socialism.” But there is a more fundamental dynamic to the Old World’s progressive social policies, poorly appreciated in the New World: They are motivated as much by a competitive drive as they are by a longing for social justice.
Japan also has recognized this juxtaposition, as has a resurging (and politically evolving) China. These ancient lands of West and East, buffeted by centuries of rebellions and revolutions, understand that a predatory aristocracy will inevitably devour itself amid the fury of a ravaged populace bereft of basic human dignities and the promise of social mobility. It is a lesson that we, as a dynamic nation that is still inexperienced in historical terms, would be wise to heed.
J. Wes Ulm is a physician-researcher from Harvard Medical School and author of the forthcoming novel The Leibniz Demon. Excerpted from Democracy (Spring 2010), a quarterly “journal of ideas” that helps build a vibrant progressivism for the 21st century. www.democracyjournal.org