What the modern world owes slavery
This article originally appeared at TomDispatch.
Many in the United States were outraged by the remarks of conservative evangelical preacher Pat Robertson, who blamed Haiti’s catastrophic 2010 earthquake on Haitians for selling their souls to Satan. Bodies were still being pulled from the rubble—as many as 300,000 died—when Robertson went on TV and gave his viewing audience a little history lesson: the Haitians had been "under the heel of the French" but they "got together and swore a pact to the devil. They said, 'We will serve you if you will get us free from the French.' True story. And so, the devil said, 'OK, it's a deal.'"
A supremely callous example of right-wing idiocy? Absolutely. Yet in his own kooky way, Robertson was also onto something. Haitians did, in fact, swear a pact with the devil for their freedom. Only Beelzebub arrived smelling not of sulfur, but of Parisian cologne.
Haitian slaves began to throw off the “heel of the French” in 1791, when they rose up and, after bitter years of fighting, eventually declared themselves free. Their French masters, however, refused to accept Haitian independence. The island, after all, had been an extremely profitable sugar producer, and so Paris offered Haiti a choice: compensate slave owners for lost property—their slaves (that is, themselves)—or face its imperial wrath. The fledgling nation was forced to finance this payout with usurious loans from French banks. As late as 1940, 80% of the government budget was still going to service this debt.
In the on-again, off-again debate that has taken place in the United States over the years about paying reparations for slavery, opponents of the idea insist that there is no precedent for such a proposal. But there is. It’s just that what was being paid was reparations-in-reverse, which has a venerable pedigree. After the War of 1812 between Great Britain and the U.S., London reimbursed southern planters more than a million dollars for having encouraged their slaves to run away in wartime. Within the United Kingdom, the British government also paid a small fortune to British slave owners, including the ancestors of Britain’s current Prime Minister, David Cameron, to compensate for abolition (which Adam Hochschild calculated in his 2005 book Bury the Chains to be “an amount equal to roughly 40% of the national budget then, and to about $2.2 billion today”).
Advocates of reparations—made to the descendants of enslaved peoples, not to their owners—tend to calculate the amount due based on the negative impact of slavery. They want to redress either unpaid wages during the slave period or injustices that took place after formal abolition (including debt servitude and exclusion from the benefits extended to the white working class by the New Deal). According to one estimate, for instance, 222,505,049 hours of forced labor were performed by slaves between 1619 and 1865, when slavery was ended.
Compounded at interest and calculated in today’s currency, this adds up to trillions of dollars.
But back pay is, in reality, the least of it. The modern world owes its very existence to slavery.
Voyage of the Blind
Consider, for example, the way the advancement of medical knowledge was paid for with the lives of slaves.
The death rate on the trans-Atlantic voyage to the New World was staggeringly high. Slave ships, however, were more than floating tombs. They were floating laboratories, offering researchers a chance to examine the course of diseases in fairly controlled, quarantined environments. Doctors and medical researchers could take advantage of high mortality rates to identify a bewildering number of symptoms, classify them into diseases, and hypothesize about their causes.
Corps of doctors tended to slave ports up and down the Atlantic seaboard. Some of them were committed to relieving suffering; others were simply looking for ways to make the slave system more profitable. In either case, they identified types of fevers, learned how to decrease mortality and increase fertility, experimented with how much water was needed for optimum numbers of slaves to survive on a diet of salted fish and beef jerky, and identified the best ratio of caloric intake to labor hours. Priceless epidemiological information on a range of diseases—malaria, smallpox, yellow fever, dysentery, typhoid, cholera, and so on—was gleaned from the bodies of the dying and the dead.
When slaves couldn’t be kept alive, their autopsied bodies still provided useful information. Of course, as the writer Harriet Washington has demonstrated in her stunning Medical Apartheid, such experimentation continued long after slavery ended: in the 1940s, one doctor said that the “future of the Negro lies more in the research laboratory than in the schools.” As late as the 1960s, another researcher, reminiscing in a speech given at Tulane Medical School, said that it was “cheaper to use Niggers than cats because they were everywhere and cheap experimental animals.”
Medical knowledge slowly filtered out of the slave industry into broader communities, since slavers made no proprietary claims on the techniques or data that came from treating their slaves. For instance, an epidemic of blindness that broke out in 1819 on the French slaver Rôdeur, which had sailed from Bonny Island in the Niger Delta with about 72 slaves on board, helped eye doctors identify the causes, patterns, and symptoms of what is today known as trachoma.
The disease first appeared on the Rôdeur not long after it set sail, initially in the hold among the slaves and then on deck. In the end, it blinded all the voyagers except one member of the crew. According to a passenger’s account, sightless sailors worked under the direction of that single man “like machines” tied to the captain with a thick rope. “We were blind—stone blind, drifting like a wreck upon the ocean,” he recalled. Some of the sailors went mad and tried to drink themselves to death. Others retired to their hammocks, immobilized. Each “lived in a little dark world of his own, peopled by shadows and phantasms. We did not see the ship, nor the heavens, nor the sea, nor the faces of our comrades.”
But they could still hear the cries of the blinded slaves in the hold.
This went on for 10 days, through storms and calms, until the voyagers heard the sound of another ship. The Spanish slaver San León had drifted alongside the Rôdeur. But the entire crew and all the slaves of that ship, too, had been blinded. When the sailors of each vessel realized this “horrible coincidence,” they fell into a silence “like that of death.” Eventually, the San León drifted away and was never heard from again.
The Rôdeur’s one seeing mate managed to pilot the ship to Guadeloupe, an island in the Caribbean. By now, a few of the crew, including the captain, had regained some of their vision. But 39 of the Africans hadn’t. So before entering the harbor the captain decided to drown them, tying weights to their legs and throwing them overboard. The ship was insured and their loss would be covered: the practice of insuring slaves and slave ships meant that slavers weighed the benefits of a dead slave versus living labor and acted accordingly.
Events on the Rôdeur caught the attention of Sébastien Guillié, chief of medicine at Paris’s Royal Institute for Blind Youth. He wrote up his findings—which included a discussion of the disease’s symptoms, the manner in which it spread, and best treatment options—and published them in Bibliothèque Ophtalmologique, which was then cited in other medical journals as well as in an 1846 U.S. textbook, A Manual of the Diseases of the Eye.
Slaves spurred forward medicine in other ways, too. Africans, for instance, were the primary victims of smallpox in the New World and were also indispensable to its eradication. In the early 1800s, Spain ordered that all its American subjects be vaccinated against the disease, but didn’t provide enough money to carry out such an ambitious campaign. So doctors turned to the one institution that already reached across the far-flung Spanish Empire: slavery. They transported the live smallpox vaccine in the arms of Africans being moved along slave routes as cargo from one city to another to be sold: doctors chose one slave from a consignment, made a small incision in his or her arm, and inserted the vaccine (a mixture of lymph and pus containing the cowpox virus). A few days after the slaves set out on their journey, pustules would appear in the arm where the incision had been made, providing the material to perform the procedure on yet another slave in the lot—and then another and another until the consignment reached its destination. Thus the smallpox vaccine was disseminated through Spanish America, saving countless lives.
Slavery’s Great Schism
In 1945, Allied troops marched into the first of the Nazi death camps. What they saw inside, many have remarked, forced a radical break in the West’s moral imagination. The Nazi genocide of Jews, one scholar has written, is history’s “black hole,” swallowing up all the theological, ethical, and philosophical certainties that had earlier existed.
Yet before there was the Holocaust, there was slavery, an institution that also transformed the West’s collective consciousness, as I’ve tried to show in my new book, The Empire of Necessity: Slavery, Freedom, and Deception in the New World.
Take, for example, the case of the Joaquín, a Portuguese frigate that left Mozambique in late 1803 with 301 enslaved East Africans. Nearly six months later, when a port surgeon opened the ship’s hatch in Montevideo, Uruguay, he was sickened by what he saw: only 31 bone-thin survivors in a foul, bare room, otherwise empty save for hundreds of unused shackles.
City officials convened a commission of inquiry to explain the deaths of the other 270 slaves, calling on the expertise of five surgeons—two British doctors, a Spaniard, a Swiss Italian, and one from the United States. The doctors testified that before boarding the Joaquín, the captives would have felt extreme anguish, having already been forced to survive on roots and bugs until arriving on the African coast emaciated and with their stomachs distended. Then, once on the ocean, crowded into a dark hold with no ventilation, they would have had nothing to do other than listen to the cries of their companions and the clanking of their chains. Many would have gone mad trying to make sense of their situation, trying to ponder “the imponderable.” The surgeons decided that the East Africans had died from dehydration and chronic diarrhea, aggravated by the physical and psychological hardships of slavery—from, that is, what they called “nostalgia,” “melancholia,” and “cisma,” a Spanish word that loosely means brooding or mourning.
The collective opinion of the five surgeons—who represented the state of medical knowledge in the U.S., Great Britain, and Spain—reveals the way slavery helped in what might be called the disenchanting of medicine. In it you can see how doctors dealing with the slave trade began taking concepts like melancholia out of the hands of priests, poets, and philosophers and giving them actual medical meaning.
Prior to the arrival of the Joaquín in Montevideo, for instance, the Royal Spanish Academy was still associating melancholia with actual nighttime demon possession. Cisma literally meant schism, a theological concept Spaniards used to refer to the spiritual split personality of fallen man. The doctors investigating the Joaquín, however, used these concepts in a decidedly secular, matter-of-fact manner and in ways that unmistakably affirmed the humanity of slaves. To diagnose enslaved Africans as suffering from nostalgia and melancholia was to acknowledge that they had selves that could be lost, inner lives that could suffer schism or alienation, and pasts over which they could mourn.
Two decades after the incident involving the Joaquín, the Spanish medical profession no longer thought melancholia to be caused by an incubus, but considered it a type of delirium, often related to seasickness. Medical dictionaries would later describe the condition in terms similar to those used by critics of the Middle Passage—as caused by rancid food, too close contact, extreme weather, and above all the “isolation” and “uniform and monotonous life” one experiences at sea. As to nostalgia, one Spanish dictionary came to define it as “a violent desire compelling those taken out of their country to return home.”
It was as if each time a doctor threw back a slave hatch to reveal the human-made horrors below, it became a little bit more difficult to blame mental illness on demons.
In the case of the Joaquín, however, the doctors didn’t extend the logic of their own reasoning to the slave trade and condemn it. Instead, they focused on the hardships of the Middle Passage as a technical concern. “It is in the interests of commerce and humanity,” said the Connecticut-born, Edinburgh-educated John Redhead, “to get slaves off their ships as soon as possible.”
Follow the Money
Slavery transformed other fields of knowledge as well. For instance, centuries of buying and selling human beings, of shipping them across oceans and continents, of defending, excoriating, or trying to reform the practice, revolutionized both Christianity and secular law, giving rise to what we think of as modern human rights law.
In the realm of economics, the importance of slaves went well beyond the wealth generated from their uncompensated labor. Slavery was the flywheel on which America’s market revolution turned—not just in the United States, but in all of the Americas.
Starting in the 1770s, Spain began to deregulate the slave trade, hoping to establish what merchants, not mincing any words, called a “free trade in blacks.” Decades before slavery exploded in the United States (following the War of 1812 with Great Britain), the slave population increased dramatically in Spanish America. Enslaved Africans and African Americans slaughtered cattle and sheared wool on the pampas of Argentina, spun cotton and wove clothing in textile workshops in Mexico City, and planted coffee in the mountains outside Bogotá. They fermented grapes for wine at the foot of the Andes and boiled Peruvian sugar to make candy. In Guayaquil, Ecuador, enslaved shipwrights built cargo vessels that were used for carrying more slaves from Africa to Montevideo. Throughout the thriving cities of mainland Spanish America, slaves worked, often for wages, as laborers, bakers, brick makers, liverymen, cobblers, carpenters, tanners, smiths, rag pickers, cooks, and servants.
It wasn’t just their labor that spurred the commercialization of society. The driving of more and more slaves inland and across the continent, the opening up of new slave routes and the expansion of old ones, tied hinterland markets together and created local circuits of finance and trade. Enslaved peoples were investments (purchased and then rented out as laborers), credit (used to secure loans), property, commodities, and capital, making them an odd mix of abstract and concrete value. Collateral for loans and items for speculation, slaves were also objects of nostalgia, mementos of a fading aristocratic world even as they served as the coin for the creation of a new commercialized one.
Slaves literally made money: working in Lima’s mint, they trampled quicksilver into ore with their bare feet, pressing toxic mercury into their bloodstream in order to amalgamate the silver used for coins. And they were money—at least in a way. It wasn’t that the value of individual slaves was standardized in relation to currency, but that slaves were quite literally the standard. When appraisers calculated the value of any given hacienda, or estate, slaves usually accounted for over half of its worth; they were, that is, much more valuable than inanimate capital goods like tools and millworks.
In the United States, scholars have demonstrated that profit wasn’t made just from southerners selling the cotton that slaves picked or the cane they cut. Slavery was central to the establishment of the industries that today dominate the U.S. economy: finance, insurance, and real estate. And historian Caitlin Rosenthal has shown how Caribbean slave plantations helped pioneer “accounting and management tools, including depreciation and standardized efficiency metrics, to manage their land and their slaves”—techniques that were then used in northern factories.
Slavery, as the historian Lorenzo Greene argued half a century ago, “formed the very basis of the economic life of New England: about it revolved, and on it depended, most of her other industries.” Fathers grew wealthy building slave ships or selling fish, clothing, and shoes to slave islands in the Caribbean; when they died, they left their money to sons who “built factories, chartered banks, incorporated canal and railroad enterprises, invested in government securities, and speculated in new financial instruments.” In due course, they donated to build libraries, lecture halls, botanical gardens, and universities, as Craig Steven Wilder has revealed in his new book, Ebony and Ivy.
In Great Britain, historians have demonstrated how the “reparations” paid to slave-owning families “fuelled industry and the development of merchant banks and marine insurance, and how it was used to build country houses and to amass art collections.”
Follow the money, as the saying goes, and you don’t even have to move very far along the financial trail to begin to see the wealth and knowledge amassed through slavery. To this day, it remains all around us, in our museums, courts, places of learning and worship, and doctors’ offices. Even the tony clothier Brooks Brothers (founded in New York in 1818) got its start selling coarse slave clothing to southern plantations. It now describes itself as an “institution that has shaped the American style of dress.”
Fever Dreams and the Bleached Bones of the Dead
In the United States, the reparations debate faded away with the 2008 election of Barack Obama—except as an idea that continues to haunt the fever dreams of the right-wing imagination. A significant part of the backlash against the president is driven by the fantasy that he is presiding over a radical redistribution of wealth—think of all those free cell phones that the Drudge Report says he’s handing out to African Americans!—part of a stealth plan to carry out reparations by any means possible.
“What they don't know,” said Rush Limbaugh shortly after Obama’s inauguration, “is that Obama's entire economic program is reparations.” The conservative National Legal and Policy Center recently raised the specter of “slavery reparations courts”—Black Jacobin tribunals presided over by the likes of Jesse Jackson, Louis Farrakhan, Al Sharpton, and Russell Simmons and empowered to levy a $50,000 tax on every white “man, woman, and child in this country.” It’s time to rescue the discussion of reparations from the swamp of talk radio and the comment sections of the conservative blogosphere.
The idea that slavery made the modern world is not new, though it seems that every generation has to rediscover that truth anew. Almost a century ago, in 1915, W.E.B. Du Bois wrote, “Raphael painted, Luther preached, Corneille wrote, and Milton sung; and through it all, for four hundred years, the dark captives wound to the sea amid the bleaching bones of the dead; for four hundred years the sharks followed the scurrying ships; for four hundred years America was strewn with the living and dying millions of a transplanted race; for four hundred years Ethiopia stretched forth her hands unto God.”
How would we calculate the value of what we today would call the intellectual property—in medicine and other fields—generated by slavery’s suffering? I’m not sure. But a revival of efforts to do so would be a step toward reckoning with slavery’s true legacy: our modern world.
TomDispatch regular Greg Grandin’s new book, The Empire of Necessity: Slavery, Freedom, and Deception in the New World, has just been published.
Copyright 2014 Greg Grandin
Anatomy of a thug state
Here, at least, is a place to start: intelligence officials have weighed in with an estimate of just how many secret files National Security Agency contractor Edward Snowden took with him when he headed for Hong Kong last June. Brace yourself: 1.7 million. At least they claim that as the number he or his web crawler accessed before he left town. Let’s assume for a moment that it’s accurate and add a caveat. Whatever he had with him on those thumb drives when he left the agency, Edward Snowden did not take all the NSA’s classified documents. Not by a long shot. He only downloaded a portion of them. We don’t have any idea what percentage, but presumably millions of NSA secret documents did not get the Snowden treatment.
Such figures should stagger us, and what he did take will undoubtedly occupy journalists for months or years more (and historians long after that). Keep this in mind, however: the NSA is only one of 17 intelligence outfits in what is called the U.S. Intelligence Community. Some of the others are as large and well funded, and all of them generate their own troves of secret documents, undoubtedly stretching into the many millions.
And keep something else in mind: that’s just intelligence agencies. If you’re thinking about the full sweep of our national security state (NSS), you also have to include places like the Department of Homeland Security, the Energy Department (responsible for the U.S. nuclear arsenal), and the Pentagon. In other words, we’re talking about the kind of secret documentation that an army of journalists, researchers, and historians wouldn’t have a hope of getting through, not in a century.
We do know that, in 2011, the whole government reportedly classified 92,064,862 documents. If accurate and reasonably typical, that means, in the twenty-first century, the NSS has already generated hundreds of millions of documents that could not be read by an American without a security clearance. Of those, thanks to one man (via various journalists), we have had access to a tiny percentage of perhaps 1.7 million of them. Or put another way, you, the voter, the taxpayer, the citizen—in what we still like to think of as a democracy—are automatically excluded from knowing or learning about most of what the national security state does in your name. That’s unless, of course, its officials decide to selectively cherry-pick information they feel you are capable of safely and securely absorbing, or an Edward Snowden releases documents to the world over the bitter protests, death threats, and teeth gnashing of Washington officialdom and retired versions of the same.
Summoned From the Id of the National Security State
So far, even among critics, the debate about what to make of Snowden’s act has generally focused on “balance”; that is, on what’s the right equilibrium between an obvious governmental need for secrecy, the security of the country, and an American urge for privacy, freedom, and transparency—for knowing, among other things, what your government is actually doing. Such a framework (“a meaningful balance between privacy and security”) has proven a relatively comfortable one for Washington, which doesn't mind focusing on the supposedly knotty question of how to define the “limits” of secrecy and whistle-blowing and what “reforms” are needed to bring the two into line. In the present context, however, such a debate seems laughable, if not absurd.
After all, it’s clear from the numbers alone that the urge to envelop the national security state in a blanket of secrecy, to shield its workings from the eyes of its citizens (as well as allies and enemies) has proven essentially boundless, as have the secret ambitions of those running that state. There is no way, at present, to limit the governmental urge for secrecy even in minimal ways, certainly not via secret courts or congressional committees implicated and entangled in the processes of a secret system.
In the face of such boundlessness, perhaps the words “whistleblower” and “leaker” —both traditionally referring to bounded and focused activities—are no longer useful. Though we may not yet have a word to describe what Chelsea (once Bradley) Manning, Julian Assange, and Edward Snowden have done, we should probably stop calling them whistleblowers. Perhaps they should instead be considered the creations of an overweening national security state, summoned by us from its id (so to speak) to act as a counterforce to its ambitions. Imagine them as representing the societal unconscious. Only in this way can we explain the boundlessness of their acts. After all, such massive document appropriations are inconceivable without a secret state endlessly in the process of documenting its own darkness.
One thing is for certain, though no one thinks to say it: despite their staggering releases of insider information, when it comes to the true nature and extent of the NSS, we surely remain in the dark. In the feeling that, thanks to Manning and Snowden, we now grasp the depths of that secret state, its secret acts, and the secret documentation that goes with it, we are undoubtedly deluded.
In a sense, valuable as they have been, Snowden’s revelations have helped promote this delusion. In a way that hasn’t happened since the Watergate era of the 1970s, they have given us the feeling that a curtain has finally, definitively been pulled back on the true nature of the Washington system. Behind that curtain, we have indeed glimpsed a global-surveillance-state-in-the-making of astounding scope, reach, and technological proficiency, whose ambitions (and successes), even when not always fully achieved, should take our breath away. And yet while this is accurate enough, it leads us to believe that we now know a great deal about the secret world of Washington. This is an illusion.
Even if we knew what was in all of those 1.7 million NSA documents, they are a drop in the bucket. As of now, we have the revelations of one (marginal) insider who stepped out of the shadows to tell us about part of what a single intelligence agency documented about its own activities. The resulting global debate, controversy, anger, and discussion, Snowden has said, represents “mission accomplished” for him. But it shouldn’t be considered mission accomplished for the rest of us.
In Praise of Darkness, the Dangers of Sunshine
To gain a reasonable picture of our national security state, five, 10, 20 Snowdens, each at a different agency or outfit, would have to step out of the shadows—and that would just be for starters. Then we would need a media that was ready to roll and a Congress not wrapped in “security” and “secrecy” but demanding answers, as the Church committee did in the Watergate era, with subpoenas in hand (and the threat of prison for no-shows and perjurers).
Yes, we may have access to basic information about what the NSA has been up to, but remind me: What exactly do you know about the doings of the Pentagon’s Defense Intelligence Agency, with its 16,500 employees, which has in recent years embarked on “an ambitious plan to assemble an espionage network that rivals the CIA in size”? How about the National Geospatial-Intelligence Agency, with its 16,000 employees, its post-9/11 headquarters (price tag: $1.8 billion) and its control over our system of spy satellites eternally prowling the planetary skies?
The answer is no more than you would have known about the NSA if Snowden hadn’t acted as he did. And by the way, what do you really know about the FBI, which now, among other things, issues thousands of national security letters a year (16,511 in 2011 alone), an unknown number of them for terror investigations? Since their recipients are muzzled from discussing them, we know next to nothing about them or what the Bureau is actually doing. And how’s your info on the CIA, which takes $4 billion more out of the intelligence “black budget” than the NSA, runs its own private wars, and has even organized its own privatized corps of spies as part of the general expansion of U.S. intelligence and espionage abroad? The answer on all of the above is—has to be—remarkably little.
Or take something basic like that old-fashioned, low-tech form of surveillance: government informers and agents provocateurs. They were commonplace in the 1960s and early 1970s within every oppositional movement. So many decades later, they are with us again. Thanks to the ACLU, which has mapped scattered reports on situations in which informers made it into at least the local news nationwide, we know that they became part of what anti-war movements existed, slipped into various aspects of the Occupy movement, and have run riot in local Muslim-American communities. We know as well that these informers come from a wide range of outfits, including the local police, the military, and the FBI. However, if we know a great deal about NSA snooping and surveillance, we have just about no inside information on the extent of old-style informing, surveilling, and provoking.
One thing couldn’t be clearer, though: the mania for secrecy has grown tremendously in the Obama years. On entering the Oval Office in 2009, Obama proclaimed a sunshine administration dedicated to “openness” and “transparency.” That announcement now drips with irony. If you want a measure of the kind of secrecy the NSS considers proper and the White House condones these days, check out a recent Los Angeles Times piece on the CIA’s drone assassination program (one of the more overt aspects of Washington’s covert world).
That paper recently reported that Chairman of the Senate Armed Services Committee Carl Levin held a “joint classified hearing” with the Senate Intelligence Committee on the CIA, the Pentagon, and their drone campaigns against terror suspects in the backlands of the planet. There was just one catch: CIA officials normally testify only before the House and Senate intelligence committees. In this case, the White House “refused to provide the necessary security clearances for members of the House and Senate armed services committees.” As a result, it would not let CIA witnesses appear before Levin. Officials, reported the Times, “had little appetite for briefing the 26 senators and 62 House members who sit on the armed services committees on the CIA's most sensitive operations.” Sunshine, in other words, is considered potentially dangerous, even in tiny doses, even in Congress.
A Cult of Government Secrecy
In evaluating what may lie behind the many curtains of Washington, history does offer us a small hand. Thanks to the revelations of the 1970s, including a Snowden-style break-in by antiwar activists at an FBI office in Media, Pennsylvania, in 1971, that opened a window into the Bureau’s acts of illegality, some now-famous reporting, and the thorough work of the Church committee in the Senate, we have a sense of the enormity of what the U.S. national security state was capable of once enveloped in a penumbra of secrecy (even if, in that era, the accompanying technology could do so much less). In the Johnson and Nixon years, as we now know, the FBI, the CIA, the NSA, and other acronymic outfits committed a staggering range of misdeeds, provocations, and crimes.
It’s easy to say that post-Watergate “reforms” made such acts a thing of the past. Unfortunately, there’s no reason to believe that. In fact, the nature of that era’s reforms should be reconsidered. After all, one particularly important Congressional response of that moment was to create the Foreign Intelligence Surveillance Court, essentially a judiciary for the secret world which would generate a significant body of law that no American outside the NSS could see.
The irony is again overwhelming. After the shocking headlines, the congressional inquiries, the impeachment proceedings, the ending of two presidencies—one by resignation—and everything else, including black bag jobs, break-ins, buggings, attempted beatings, blackmail, massive spying and surveillance, and provocations of every sort, the answer was a secret court. Its judges, appointed by the chief justice of the Supreme Court alone, are charged with ruling after hearing only one side of any case involving a governmental desire to snoop or pry or surveil. Unsurprisingly enough, over the three and a half decades of its existence, the court proved a willing rubber stamp for just about any urge of the national security state.
In retrospect, this remedy for widespread government illegality clearly was just another step in the institutionalization of a secret world that looks increasingly like an Orwellian nightmare. In creating the FISA court, Congress functionally took the seat-of-the-pants, extra-Constitutional, extra-legal acts of the Nixon era and put them under the rule of (secret) law.
Today, in the wake of, among other things, the rampant extra-legality of the Global War on Terror—including the setting up of a secret, extrajudicial global prison system of “black sites” where rampant torture and abuse were carried to the point of death, illegal kidnappings of terror suspects off global streets and their rendition to the prisons of torture regimes, and the assassination-by-drone of American citizens backed by Justice Department legalisms—it’s clear that NSS officials feel they have near total impunity when it comes to whatever they want to do. (Not that their secret acts often turn out as planned or particularly well in the real world.) They know that nothing they do, however egregious, will be brought before an open court of law and prosecuted. While the rest of us remain inside the legal system, they exist in “post-legal America.” Now, the president claims that he’s preparing a new set of “reforms” to bring this system under check and back in balance. Watch out!
If tomorrow a series of Edward Snowdens were to appear, each from a different intelligence agency or other outfit in the national security state, one thing would be guaranteed: the shock of the NSA revelations would be multiplied many times over. Protected from the law by a spreading cult of government secrecy, beyond the reach of the citizenry, Congress, or the aboveground judicial system, supported by the White House and a body of developing secret law, knowing that no act undertaken in the name of American “safety” and “security” will ever be prosecuted, the inhabitants of our secret state have been moving in dark and disturbing ways. What we know is already disturbing enough. What we don’t know would surely unnerve us far more.
Shadow government has conquered 21st-century Washington. We have the makings of a thug state of the first order.
Tom Engelhardt, a co-founder of the American Empire Project and author of The United States of Fear as well as a history of the Cold War, The End of Victory Culture, runs the Nation Institute's TomDispatch.com. His latest book, co-authored with Nick Turse, is Terminator Planet: The First History of Drone Warfare, 2001-2050.
Copyright 2014 Tom Engelhardt
Image by Trevor Paglen, licensed under Creative Commons.
Why are we arming Israel to the teeth?
This article originally appeared at TomDispatch.
We Americans have funny notions about foreign aid. Recent polls show that, on average, we believe 28 percent of the federal budget is eaten up by it, and that, in a time of austerity, this gigantic bite of the budget should be cut back to 10 percent. In actual fact, barely 1 percent of the federal budget goes to foreign aid of any kind.
In this case, however, truth is at least as strange as fiction. Consider that the top recipient of U.S. foreign aid over the past three decades isn’t some impoverished land filled with starving kids, but a wealthy nation with a per-head gross domestic product on par with the European Union average, and higher than that of Italy, Spain, or South Korea.
Consider also that this top recipient of such aid—nearly all of it military since 2008—has been busily engaged in what looks like a 19th-century-style colonization project. In the late 1940s, our beneficiary expelled some 700,000 indigenous people from the land it was claiming. In 1967, our client seized some contiguous pieces of real estate and ever since has been colonizing these territories with nearly 650,000 of its own people. It has divided the conquered lands with myriad checkpoints and roads accessible only to the colonizers and is building a 440-mile wall around (and cutting into) the conquered territory, creating a geography of control that violates international law.
“Ethnic cleansing” is a harsh term, but apt for a situation in which people are driven out of their homes and lands because they are not of the right tribe. Though many will balk at leveling this charge against Israel—for that country is, of course, the top recipient of American aid and especially military largesse—who would hesitate to use the term if, in a mirror-image world, all of this were being inflicted on Israeli Jews?
Military Aid to Israel
Arming and bankrolling a wealthy nation acting in this way may, on its face, seem like terrible policy. Yet American aid has been flowing to Israel in ever greater quantities. Over the past 60 years, in fact, Israel has absorbed close to a quarter-trillion dollars in such aid. Last year alone, Washington sent some $3.1 billion in military aid, supplemented by allocations for collaborative military research and joint training exercises.
Overall, the United States covers nearly one quarter of Israel’s defense budget—from tear gas canisters to F-16 fighter jets. In their 2008-2009 assault on Gaza, the Israeli Defense Forces made use of M-92 and M-84 “dumb bombs,” Paveway II and JDAM guided “smart bombs,” AH-64 Apache attack helicopters equipped with AGM-114 Hellfire guided missiles, M141 “bunker defeat” munitions, and special weapons like M825A1 155mm white phosphorous munitions—all supplied as American foreign aid. (Uniquely among Washington’s aid recipients, Israel is also permitted to spend 25 percent of the military funding from Washington on weapons made by its own weapons industry.)
Why is Washington doing this? The most common answer is the simplest: Israel is Washington’s “ally.” But the United States has dozens of allies around the world, none of which are subsidized in anything like this fashion by American taxpayer dollars. As there is no formal treaty alliance between the two nations and given the lopsided nature of the costs and benefits of this relationship, a far more accurate term for Israel’s tie to Washington might be “client state.”
And not a particularly loyal client either. If massive military aid is supposed to give Washington leverage over Israel (as it normally does in client-state relationships), it is difficult to detect. In case you hadn’t noticed, rare is the American diplomatic visit to Israel that is not greeted with an in-your-face announcement of intensified colonization of Palestinian territory, euphemistically called “settlement expansion.”
Washington also provides aid to the Palestinians, averaging $875 million annually during Obama’s first term (more than double what George W. Bush gave in his second term). That’s a little more than a quarter of what Israel gets. Much of it goes to projects of dubious net value like the development of irrigation networks at a moment when the Israelis are destroying Palestinian cisterns and wells elsewhere in the West Bank. Another significant part of that funding goes toward training the Palestinian security forces. Known as “Dayton forces” (after the American general, Keith Dayton, who led their training from 2005 to 2010), these troops have a grim human rights record that includes acts of torture, as Dayton himself has admitted. One former Dayton deputy, an American colonel, described these security forces to al-Jazeera as an outsourced "third Israeli security arm." According to Josh Ruebner, national advocacy director for the U.S. Campaign to End the Occupation and author of Shattered Hopes: Obama’s Failure to Broker Israeli-Palestinian Peace, American aid to the Palestinians serves mainly to entrench the Israeli occupation.
A Dishonest Broker
Nothing is equal when it comes to Israelis and Palestinians in the West Bank, East Jerusalem, and the Gaza Strip—and the numbers say it all. To offer just one example, the death toll from Operation Cast Lead, Israel’s 2008-2009 assault on the Gaza Strip, was 1,385 Palestinians (the majority of them civilians) and 13 Israelis, three of them civilians.
And yet mainstream opinion in the U.S. insists on seeing the two parties as essentially equal. Harold Koh, former dean of the Yale Law School and until recently the top lawyer at the State Department, has been typical in comparing Washington’s role to “adult supervision” of “a playground populated by warring switchblade gangs.” It was a particularly odd choice of metaphors, given that one side is equipped with small arms and rockets of varying sophistication, the other with nuclear weapons and a state-of-the-art modern military subsidized by the world’s only superpower.
Washington’s active role in all of this is not lost on anyone on the world stage—except Americans, who have declared themselves to be the even-handed arbiters of a conflict marked by their own endless failed efforts at brokering a “peace process.” Globally, fewer and fewer observers believe in this fiction of Washington as a benevolent bystander rather than a participant heavily implicated in a humanitarian crisis. In 2012, the widely respected International Crisis Group described the “peace process” as “a collective addiction that serves all manner of needs, reaching an agreement no longer being the main one.”
The contradiction between military and diplomatic support for one party in the conflict and the pretense of neutrality cannot be explained away. “Looked at objectively, it can be argued that American diplomatic efforts in the Middle East have, if anything, made achieving peace between Palestinians and Israelis more difficult,” writes Rashid Khalidi, a historian at Columbia University, and author of Brokers of Deceit: How the U.S. Has Undermined Peace in the Middle East.
American policy elites are unable or unwilling to talk about Washington’s destructive role in this situation. There is plenty of discussion about a one-state versus a two-state solution, constant disapproval of Palestinian violence, occasional mild criticism (“not helpful”) of the Israeli settlements, and lately, a lively debate about the global boycott, divestment, and sanction movement (BDS) led by Palestinian civil society to pressure Israel into a “just and lasting” peace. But when it comes to what Americans are most responsible for—all that lavish military aid and diplomatic cover for one side only—what you get is either euphemism or an evasive silence.
In general, the American media tends to treat our arming of Israel as part of the natural order of the universe, as beyond question as the force of gravity. Even the “quality” media shies away from any discussion of Washington’s real role in fueling the Israel-Palestine conflict. Last month, for instance, the New York Times ran an article about a prospective “post-American” Middle East without any mention of Washington’s aid to Israel, or for that matter to Egypt, or the Fifth Fleet parked in Bahrain.
You might think that the progressive hosts of MSNBC’s news programs would be all over the story of what American taxpayers are subsidizing, but the topic barely flickers across the chat shows of Rachel Maddow, Chris Hayes, and others. Given this across-the-board selective reticence, American coverage of Israel and Palestine, and particularly of American military aid to Israel, resembles the Agatha Christie novel in which the first-person narrator, observing and commenting on the action in calm semi-detachment, turns out to be the murderer.
Strategic Self-Interest and Unconditional Military Aid
On the activist front, American military patronage of Israel is not much discussed either, in large part because the aid package is so deeply entrenched that no attempt to cut it back could succeed in the near future. Hence, the global BDS campaign has focused on smaller, more achievable targets, though as Yousef Munayyer, executive director of the Jerusalem Fund, an advocacy group, told me, the BDS movement does envision an end to Washington’s military transfers in the long term. This makes tactical sense, and both the Jerusalem Fund and the U.S. Campaign to End the Israeli Occupation are engaged in ongoing campaigns to inform the public about American military aid to Israel.
Less understandable are the lobbying groups that advertise themselves as “pro-peace,” champions of “dialogue” and “conversation,” but share the same bottom line on military aid for Israel as their overtly hawkish counterparts. For instance, J Street (“pro-Israel and pro-peace”), a Washington-based nonprofit which bills itself as a moderate alternative to the powerhouse lobbying outfit, the American Israel Public Affairs Committee (AIPAC), supports both “robust” military aid and any supplemental disbursements on offer from Washington to the Israeli Defense Forces. Americans for Peace Now similarly takes the position that Washington should provide “robust assistance” to ensure Israel’s “qualitative military edge.” At the risk of sounding literal-minded, any group plumping for enormous military aid packages to a country acting as Israel has is emphatically not “pro-peace.” It’s almost as if the Central America solidarity groups from the 1980s had demanded peace, while lobbying Washington to keep funding the Contras and the Salvadoran military.
Outside the various factions of the Israel lobby, the landscape is just as flat. The Center for American Progress, a Washington think tank close to the Democratic Party, regularly issues pious statements about new hopes for the “peace process”—with never a mention of how our unconditional flow of advanced weaponry might be a disincentive to any just resolution of the situation.
There is, by the way, a similar dynamic at work when it comes to Washington’s second biggest recipient of foreign aid, Egypt. Washington’s expenditure of more than $60 billion over the past 30 years ensured both peace with Israel and Cold War loyalty, while propping up an authoritarian government with a ghastly human rights record. As the post-Mubarak military restores its grip on Egypt, official Washington is currently at work finding ways to keep the military aid flowing despite a congressional ban on arming regimes that overthrow elected governments. There is, however, at least some mainstream public debate in the U.S. about ending aid to the Egyptian generals who have violently reclaimed power. Investigative journalism nonprofit ProPublica has even drafted a handy “explainer” about U.S. military aid to Egypt—though they have not tried to explain aid to Israel.
Silence about U.S.-Israel relations is, to a large degree, hardwired into Beltway culture. As George Perkovich, director of the nuclear policy program at the Carnegie Endowment for International Peace told the Washington Post, “It’s like all things having to do with Israel and the United States. If you want to get ahead, you don’t talk about it; you don’t criticize Israel, you protect Israel.”
This is regrettable, as Washington’s politically invisible military aid to Israel is not just an impediment to lasting peace, but also a strategic and security liability. As General David Petraeus, then head of U.S. Central Command, testified to the Senate Armed Services Committee in 2010, the failure to reach a lasting resolution to the conflict between the Israelis and Palestinians makes Washington’s other foreign policy objectives in the region more difficult to achieve. It also, he pointed out, foments anti-American hatred and fuels al-Qaeda and other violent groups. Petraeus’s successor at CENTCOM, General James Mattis, echoed this list of liabilities in a public dialogue with Wolf Blitzer last July:
“I paid a military security price every day as a commander of CENTCOM because the Americans were seen as biased in support of Israel, and that [alienates] all the moderate Arabs who want to be with us because they can’t come out publicly in support of people who don’t show respect for the Arab Palestinians.”
Don’t believe the generals? Ask a terrorist. Khalid Sheikh Mohammed, mastermind of the 9/11 attacks now imprisoned at Guantanamo, told interrogators that he was motivated to attack the United States in large part because of Washington’s leading role in assisting Israel’s repeated invasions of Lebanon and the ongoing dispossession of Palestinians.
The Israel lobby wheels out a battery of arguments in favor of arming and funding Israel, including the assertion that a step back from such aid for Israel would signify a “retreat” into “isolationism.” But would the United States, a global hegemon busily engaged in nearly every aspect of world affairs, be "isolated" if it ceased giving lavish military aid to Israel? Was the United States "isolated" before 1967, when it expanded that aid in a major way? These questions answer themselves.
Sometimes the mere act of pointing out the degree of U.S. aid to Israel provokes accusations of having a special antipathy for Israel. This may work as emotional blackmail, but if someone proposed that Washington start shipping Armenia $3.1 billion worth of armaments annually so that it could begin the conquest of its ancestral province of Nagorno-Karabakh in neighboring Azerbaijan, the plan would be considered ludicrous—and not because of a visceral dislike for Armenians. Yet somehow the assumption that Washington is required to generously arm the Israeli military has become deeply institutionalized in this country.
Fake Peace Process, Real War Process
Today, Secretary of State John Kerry is leading a push for a renewed round of the interminable American-led peace process in the region that has been underway since the mid-1970s. It’s hardly a bold prediction to suggest that this round, too, will fail. The Israeli minister of defense, Moshe Ya’alon, has already publicly mocked Kerry in his quest for peace as “obsessive and messianic” and added that the newly proposed framework for this round of negotiations is “not worth the paper it’s printed on.” Other Israeli high officials blasted Kerry for his mere mention of the potential negative consequences to Israel of a global boycott if peace is not achieved.
But why shouldn’t Ya’alon and other Israeli officials tee off on the hapless Kerry? After all, the defense minister knows that Washington will wield no stick and that bushels of carrots are in the offing, whether Israel rolls back or redoubles its land seizures and colonization efforts. President Obama has boasted that the U.S. has never given so much military aid to Israel as under his presidency. On January 29th, the House Foreign Affairs Committee voted unanimously to upgrade Israel’s status to “major strategic partner.” With Congress and the president guaranteeing that unprecedented levels of military aid will continue to flow, Israel has no real incentive to change its behavior.
Usually such diplomatic impasses are blamed on the Palestinians, but given how little is left to squeeze out of them, doing so this time will test the creativity of official Washington. Whatever happens, in the post-mortems to come there will be no discussion in Washington about the role its own policies played in undermining a just and lasting agreement.
How much longer will this silence last? The arming and bankrolling of a wealthy nation committing ethnic cleansing has something to offend conservatives, progressives, and just about every other political grouping in America. After all, how often in foreign policy does strategic self-interest align so neatly with human rights and common decency?
Intelligent people can and do disagree about a one-state versus a two-state solution for Israel and Palestine. People of goodwill disagree about the global BDS campaign. But it is hard to imagine what kind of progress can ever be made toward a just and lasting settlement between Israel and Palestine until Washington quits arming one side to the teeth.
“If it weren’t for U.S. support for Israel, this conflict would have been resolved a long time ago,” says Josh Ruebner. Will we Americans ever acknowledge our government's active role in destroying the chances for a just and lasting peace between Palestine and Israel?
Chase Madar (@ChMadar) is a lawyer in New York, a TomDispatch regular, and the author of The Passion of [Chelsea] Manning: The Story behind the Wikileaks Whistleblower (Verso).
Copyright 2014 Chase Madar
Image by the White House (Public Domain).
Reading Melville in the Age of Terror
This article originally appeared at TomDispatch.
A captain ready to drive himself and all around him to ruin in the hunt for a white whale. It’s a well-known story, and over the years, mad Ahab in Herman Melville’s most famous novel, Moby-Dick, has been used as an exemplar of unhinged American power, most recently of George W. Bush’s disastrous invasion of Iraq.
But what’s really frightening isn't our Ahabs, the hawks who periodically want to bomb some poor country, be it Vietnam or Afghanistan, back to the Stone Age. The respectable types are the true “terror of our age,” as Noam Chomsky called them collectively nearly 50 years ago. The really scary characters are our soberest politicians, scholars, journalists, professionals, and managers, men and women (though mostly men) who imagine themselves as morally serious, and then enable the wars, devastate the planet, and rationalize the atrocities. They are a type that has been with us for a long time. More than a century and a half ago, Melville, who had a captain for every face of empire, found their perfect expression—for his moment and ours.
For the last six years, I’ve been researching the life of an American seal killer, a ship captain named Amasa Delano who, in the 1790s, was among the earliest New Englanders to sail into the South Pacific. Money was flush, seals were many, and Delano and his fellow ship captains established the first unofficial U.S. colonies on islands off the coast of Chile. They operated under an informal council of captains, divvied up territory, enforced debt contracts, celebrated the Fourth of July, and set up ad hoc courts of law. When no bible was available, the collected works of William Shakespeare, found in the libraries of most ships, were used to swear oaths.
From his first expedition, Delano took hundreds of thousands of sealskins to China, where he traded them for spices, ceramics, and tea to bring back to Boston. During a second, failed voyage, however, an event took place that would make Amasa notorious—at least among the readers of the fiction of Herman Melville.
Here’s what happened: One day in February 1805 in the South Pacific, Amasa Delano spent nearly a full day on board a battered Spanish slave ship, conversing with its captain, helping with repairs, and distributing food and water to its thirsty and starving voyagers, a handful of Spaniards and about 70 West African men and women he thought were slaves. They weren’t.
Those West Africans had rebelled weeks earlier, killing most of the Spanish crew, along with the slaver taking them to Peru to be sold, and demanded to be returned to Senegal. When they spotted Delano’s ship, they came up with a plan: let him board and act as if they were still slaves, buying time to seize the sealer’s vessel and supplies. Remarkably, for nine hours, Delano, an experienced mariner and distant relative of future president Franklin Delano Roosevelt, was convinced that he was on a distressed but otherwise normally functioning slave ship.
Having barely survived the encounter, he wrote about the experience in his memoir, which Melville read and turned into what many consider his “other” masterpiece. Published in 1855, on the eve of the Civil War, Benito Cereno is one of the darkest stories in American literature. It’s told from the perspective of Amasa Delano as he wanders lost through a shadow world of his own racial prejudices.
One of the things that attracted Melville to the historical Amasa was undoubtedly the juxtaposition between his cheerful self-regard—he considers himself a modern man, a liberal opposed to slavery—and his complete obliviousness to the social world around him. The real Amasa was well meaning, judicious, temperate, and modest.
In other words, he was no Ahab, whose vengeful pursuit of a metaphysical whale has been used as an allegory for every American excess, every catastrophic war, every disastrous environmental policy, from Vietnam and Iraq to the explosion of the BP oil rig in the Gulf of Mexico in 2010.
Ahab, whose peg-legged pacing of the quarterdeck of his doomed ship enters the dreams of his men sleeping below like the “crunching teeth of sharks.” Ahab, whose monomania is an extension of the individualism born out of American expansion and whose rage is that of an ego that refuses to be limited by nature’s frontier. “Our Ahab,” as a soldier in Oliver Stone’s movie Platoon calls a ruthless sergeant who senselessly murders innocent Vietnamese.
Ahab is certainly one face of American power. In the course of writing a book on the history that inspired Benito Cereno, though, I’ve come to think of it as neither the most frightening nor the most destructive of American faces. Consider Amasa.
Since the end of the Cold War, extractive capitalism has spread over our post-industrialized world with a predatory force that would shock even Karl Marx. From the mineral-rich Congo to the open-pit gold mines of Guatemala, from Chile’s until recently pristine Patagonia to the fracking fields of Pennsylvania and the melting Arctic north, there is no crevice where some useful rock, liquid, or gas can hide, no jungle forbidden enough to keep out the oil rigs and elephant killers, no citadel-like glacier, no hard-baked shale that can’t be cracked open, no ocean that can’t be poisoned.
And Amasa was there at the beginning. Seal fur may not have been the world’s first valuable natural resource, but sealing represented one of young America’s first experiences of boom-and-bust resource extraction beyond its borders.
With increasing frequency starting in the early 1790s and then in a mad rush beginning in 1798, ships left New Haven, Norwich, Stonington, New London, and Boston, heading for the great half-moon archipelago of remote islands running from Argentina in the Atlantic to Chile in the Pacific. They were on the hunt for the fur seal, which wears a layer of velvety down like an undergarment just below an outer coat of stiff gray-black hair.
In Moby-Dick, Melville portrayed whaling as the American industry. Brutal and bloody but also humanizing, work on a whale ship required intense coordination and camaraderie. Out of the gruesomeness of the hunt, the peeling of the whale’s skin from its carcass, and the hellish boil of the blubber or fat, something sublime emerged: human solidarity among the workers. And like the whale oil that lit the lamps of the world, divinity itself glowed from the labor: “Thou shalt see it shining in the arm that wields a pick or drives a spike; that democratic dignity which, on all hands, radiates without end from God.”
Sealing was something else entirely. It called to mind not industrial democracy but the isolation and violence of conquest, settler colonialism, and warfare. Whaling took place in a watery commons open to all. Sealing took place on land. Sealers seized territory, fought one another to keep it, and pulled out what wealth they could as fast as they could before abandoning their empty and wasted island claims. The process pitted desperate sailors against equally desperate officers in as all-or-nothing a system of labor relations as can be imagined.
In other words, whaling may have represented the promethean power of proto-industrialism, with all the good (solidarity, interconnectedness, and democracy) and bad (the exploitation of men and nature) that went with it, but sealing better predicted today’s postindustrial extracted, hunted, drilled, fracked, hot, and strip-mined world.
Seals were killed by the millions and with a shocking casualness. A group of sealers would get between the water and the rookeries and simply start clubbing. A single seal makes a noise like a cow or a dog, but tens of thousands of them together, so witnesses testified, sound like a Pacific cyclone. Once we “began the work of death,” one sealer remembered, “the battle caused me considerable terror.”
South Pacific beaches came to look like Dante’s Inferno. As the clubbing proceeded, mountains of skinned, reeking carcasses piled up and the sands ran red with torrents of blood. The killing was unceasing, continuing into the night by the light of bonfires kindled with the corpses of seals and penguins.
And keep in mind that this massive kill-off took place not for something like whale oil, used by all for light and fire. Seal fur was harvested to warm the wealthy and meet a demand created by a new phase of capitalism: conspicuous consumption. Pelts were used for ladies’ capes, coats, muffs, and mittens, and gentlemen’s waistcoats. The fur of baby pups wasn’t much valued, so some beaches were simply turned into seal orphanages, with thousands of newborns left to starve to death. In a pinch though, their downy fur, too, could be used—to make wallets.
Occasionally, elephant seals would be taken for their oil in an even more horrific manner: when they opened their mouths to bellow, their hunters would toss rocks in and then begin to stab them with long lances. Pierced in multiple places like Saint Sebastian, the animals’ high-pressured circulatory system gushed “fountains of blood, spouting to a considerable distance.”
At first the frenetic pace of the killing didn’t matter: there were so many seals. On one island alone, Amasa Delano estimated, there were “two to three millions of them” when New Englanders first arrived to make “a business of killing seals.”
“If many of them were killed in a night,” wrote one observer, “they would not be missed in the morning.” It did indeed seem as if you could kill every one in sight one day, then start afresh the next. Within just a few years, though, Amasa and his fellow sealers had taken so many seal skins to China that Canton’s warehouses couldn’t hold them. They began to pile up on the docks, rotting in the rain, and their market price crashed.
To make up the margin, sealers further accelerated the pace of the killing—until there was nothing left to kill. In this way, oversupply and extinction went hand in hand. In the process, cooperation among sealers gave way to bloody battles over thinning rookeries. Previously, it only took a few weeks and a handful of men to fill a ship’s hold with skins. As those rookeries began to disappear, however, more and more men were needed to find and kill the required number of seals and they were often left on desolate islands for two- or three-year stretches, living alone in miserable huts in dreary weather, wondering if their ships were ever going to return for them.
“On island after island, coast after coast,” one historian wrote, “the seals had been destroyed to the last available pup, on the supposition that if sealer Tom did not kill every seal in sight, sealer Dick or sealer Harry would not be so squeamish.” By 1804, on the very island where Amasa estimated that there had been millions of seals, there were more sailors than prey. Two years later, there were no seals at all.
The Machinery of Civilization
There exists a near-perfect inverse symmetry between the real Amasa and the fictional Ahab, with each representing a face of the American Empire. Amasa is virtuous, Ahab vengeful. Amasa seems trapped by the shallowness of his perception of the world. Ahab is profound; he peers into the depths. Amasa can’t see evil (especially his own). Ahab sees only nature’s “intangible malignity.”
Both are representatives of the most predatory industries of their day, their ships carrying what Delano once called the “machinery of civilization” to the Pacific, using steel, iron, and fire to kill animals and transform their corpses into value on the spot.
Yet Ahab is the exception, a rebel who hunts his white whale against all rational economic logic. He has hijacked the “machinery” that his ship represents and rioted against “civilization.” He pursues his quixotic chase in violation of the contract he has with his employers. When his first mate, Starbuck, insists that his obsession will hurt the profits of the ship’s owners, Ahab dismisses the concern: “Let the owners stand on Nantucket beach and outyell the Typhoons. What cares Ahab? Owners, Owners? Thou art always prating to me, Starbuck, about those miserly owners, as if the owners were my conscience.”
Insurgents like Ahab, however dangerous to the people around them, are not the primary drivers of destruction. They are not the ones who will hunt animals to near extinction—or who are today forcing the world to the brink. Those would be the men who never dissent, who, whether on the front lines of extraction or in the corporate backrooms, administer the destruction of the planet, day in, day out, inexorably, unsensationally, without notice, their actions controlled by an ever-greater series of financial abstractions and calculations made in the stock exchanges of New York, London, and Shanghai.
If Ahab is still the exception, Delano is still the rule. Throughout his long memoir, he reveals himself as ever faithful to the customs and institutions of maritime law, unwilling to take any action that would injure the interests of his investors and insurers. “All bad consequences,” he wrote, describing the importance of protecting property rights, “may be avoided by one who has a knowledge of his duty, and is disposed faithfully to obey its dictates.”
It is in Delano’s reaction to the West African rebels, once he finally realizes he has been the target of an elaborately staged con, that the distinction separating the sealer from the whaler becomes clear. The mesmeric Ahab—the “thunder-cloven old oak”—has been taken as a prototype of the twentieth-century totalitarian, a one-legged Hitler or Stalin who uses an emotional magnetism to convince his men to willingly follow him on his doomed hunt for Moby Dick.
Delano is not a demagogue. His authority is rooted in a much more common form of power: the control of labor and the conversion of diminishing natural resources into marketable items. As seals disappeared, however, so too did his authority. His men first began to grouse and then conspire. In turn, Delano had to rely ever more on physical punishment, on floggings even for the most minor of offences, to maintain control of his ship—until, that is, he came across the Spanish slaver. Delano might have been personally opposed to slavery, yet once he realized he had been played for a fool, he organized his men to retake the slave ship and violently pacify the rebels. In the process, they disemboweled some of the rebels and left them writhing in their viscera, using their sealing lances, which Delano described as “exceedingly sharp and as bright as a gentleman’s sword.”
Caught in the pincers of supply and demand, trapped in the vortex of ecological exhaustion, with no seals left to kill, no money to be made, and his own crew on the brink of mutiny, Delano rallied his men to the chase—not of a white whale but of black rebels. In the process, he reestablished his fraying authority. As for the surviving rebels, Delano re-enslaved them. Propriety, of course, meant returning them and the ship to their owners.
Our Amasas, Ourselves
With Ahab, Melville looked to the past, basing his obsessed captain on Lucifer, the fallen angel in revolt against the heavens, and associating him with America’s “manifest destiny,” with the nation’s restless drive beyond its borders. With Amasa, Melville glimpsed the future. Drawing on the memoirs of a real captain, he created a new literary archetype, a moral man sure of his righteousness yet unable to link cause to effect, oblivious to the consequences of his actions even as he careens toward catastrophe.
They are still with us, our Amasas. They have knowledge of their duty and are disposed faithfully to follow its dictates, even unto the ends of the Earth.
TomDispatch regular Greg Grandin’s new book, The Empire of Necessity: Slavery, Freedom, and Deception in the New World, has just been published.
Copyright 2014 Greg Grandin.
Image by an unknown photographer (Public Domain).
A glimpse inside the Zapatista Movement, 20 years after its birth.
This article originally appeared at TomDispatch.
Growing up in a well-heeled suburban community, I absorbed our society’s distaste for dissent long before I was old enough to grasp just what was being dismissed. My understanding of so many people and concepts was tainted by this environment and the education that went with it: Che Guevara and the Black Panthers and Oscar Wilde and Noam Chomsky and Venezuela and Malcolm X and the Service Employees International Union and so, so many more. All of this is why, until recently, I knew almost nothing about the Mexican Zapatista movement except that the excessive number of “a”s looked vaguely suspicious to me. It’s also why I felt compelled to travel thousands of miles to a Zapatista “organizing school” in the heart of the Lacandon jungle in southeastern Mexico to try to sort out just what I’d been missing all these years.
The fog is so thick that the revelers arrive like ghosts. Out of the mist they appear: men sporting wide-brimmed Zapata hats, women encased in the shaggy sheepskin skirts that are still common in the remote villages of Mexico. And then there are the outsiders like myself with our North Face jackets and camera bags, eyes wide with adventure. (“It’s like the Mexican Woodstock!” exclaims a student from the northern city of Tijuana.) The hill is lined with little restaurants selling tamales and arroz con leche and pozol, a ground-corn drink that can rip a foreigner’s stomach to shreds.
There is no alcohol in sight. Sipping coffee as sugary as Alabama sweet tea, I realize that tonight will be my first sober New Year’s Eve since December 31, 1999, when I climbed into bed with my parents to await the Y2K Millennium bug and mourned that the whole world was going to end before I had even kissed a boy.
Thousands are clustered in this muddy field to mark the 20-year anniversary of January 1, 1994, when an army of impoverished farmers surged out of the jungle and launched the first post-modern revolution. Those forces, known as the Zapatista Army of National Liberation, were the armed wing of a much larger movement of indigenous peoples in the southeastern Mexican state of Chiapas, who were demanding full autonomy from their government and global liberation for all people.
As the news swept across that emerging communication system known as the Internet, the world momentarily held its breath. A popular uprising against government-backed globalization led by an all but forgotten people: it was an event that seemed unthinkable. The Berlin Wall had fallen. The market had triumphed. The treaties had been signed. And yet surging out of the jungles came a movement of people with no market value and the audacity to refuse to disappear.
Now, 20 years later, villagers and sympathetic outsiders are pouring into one of the Zapatistas’ political centers, known as Oventic, to celebrate the fact that their rebellion has not been wiped out by the wind and exiled from the memory of men.
The plane tickets from New York City to southern Mexico were so expensive that we traveled by land. We E-ZPassed down the eastern seaboard, ate catfish sandwiches in Louisiana, barreled past the refineries of Texas, and then crossed the border. We pulled into Mexico City during the pre-Christmas festivities. The streets were clogged with parents eating tamales and children swinging at piñatas. By daybreak the next morning, we were heading south again. Speed bumps scraped the bottom of our Volvo the entire way from Mexico City to Chiapas, where the Zapatistas control wide swathes of territory. The road skinned the car alive. Later I realized that those speed bumps were, in a way, the consequences of dissent -- tiny traffic-controlling monuments to a culture far less resigned to following the rules. “Up north,” I’d later tell Mexican friends, “we don’t have as many speed bumps, but neither do we have as much social resistance.”
After five days of driving, we reached La Universidad de la Tierra, a free Zapatista-run school in the touristy town of San Cristóbal de Las Casas in Chiapas. Most of the year, people from surrounding rural communities arrive here to learn trades like electrical wiring, artisanal crafts, and farming practices. This week, thousands of foreigners had traveled to the town to learn about something much more basic: autonomy.
Our first “class” was in the back of a covered pickup truck careening through the Lacandon jungle with orange trees in full bloom. As we passed, men and women raised peace signs in salute. Spray-painted road signs read (in translation):
“You are now entering Zapatista territory. Here the people order and the government obeys.”
I grew nauseous from the exhaust and the dizzying mountain views, and after six hours in that pickup on this, my sixth day of travel, two things occurred to me: first, I realized that I had traveled “across” Chiapas in what was actually a giant circle; second, I began to suspect that there was no Zapatista organizing school at all, that the lesson I was supposed to absorb was simply that life is a matter of perpetual, cyclical motion. The movement’s main symbol, after all, is a snail’s shell.
Finally, though, we arrived in a village where the houses had thatched roofs and the children spoke only the pre-Hispanic language Ch’ol.
Over the centuries, the indigenous communities of Chiapas survived Spanish conquistadors, slavery, and plantation-style sugar cane fields; Mexican independence and mestizo landowners; racism, railroads, and neoliberal economic reforms. Each passing year seemed to bring more threats to their way of life. As the father of my host family explained to me, the community began to organize itself in the early 1990s because people felt that the government was slowly but surely exterminating them.
The government was chingando, he said, which translates roughly as deceiving, cheating, and otherwise screwing someone over. It was, he said, stealing their lands. It was extracting the region’s natural resources, forcing people from the countryside into the cities. It was disappearing the indigenous languages through its version of public education. It was signing free trade agreements that threatened to devastate the region’s corn market and the community’s main subsistence crop.
So on January 1, 1994, the day the North American Free Trade Agreement went into effect, some residents of this village -- along with those from hundreds of other villages -- seized control of major cities across the state and declared war on the Mexican government. Under the name of the Zapatista Army of National Liberation, they burned the army’s barracks and liberated the inmates in the prison at San Cristóbal de Las Casas.
In response, the Mexican army descended on Chiapas with such violence that the students of Mexico City rioted in the streets. In the end, the two sides sat down for peace talks that, to this day, have never been resolved.
The uprising itself lasted only 12 days; the response was a punishing decade of repression. First came the great betrayal. Mexican President Ernesto Zedillo, who, in the wake of the uprising, had promised to enact greater protections for indigenous peoples, instead sent thousands of troops into the Zapatistas’ territory in search of Subcomandante Marcos, the world-renowned spokesperson for the movement.
They didn’t find him. But the operation marked the beginning of a hush-hush war against the communities that supported the Zapatistas. The army, police, and hired thugs burned homes and fields and wrecked small, communally owned businesses. Some local leaders disappeared. Others were imprisoned. In one region of Chiapas, the entire population was displaced for so long that the Red Cross set up a refugee camp for them. (In the end, the community rejected the Red Cross aid, in the same way that it also rejects all government aid.)
Since 1994, the movement has largely worked without arms. Villagers resisted government attacks and encroachments with road blockades, silent marches, and even, in one famous case, an aerial attack composed entirely of paper airplanes.
The Boy Who Is Free
Fifteen years after the uprising, a child named Diego was born in Zapatista territory. He was the youngest member of the household where I was staying, and during my week with the family, he was always up to something. He agitated the chickens, peeked his head through the window to surprise his father at the breakfast table, and amused the family by telling me long stories in Ch’ol that I couldn’t possibly understand.
He also, unknowingly, defied the government’s claim that he does not exist.
Diego is part of the first generation of Zapatista children whose births are registered by one of the organization’s own civil judges. In the eyes of his father, he is one of the first fully independent human beings. He was born in Zapatista territory, attends a Zapatista school, lives on unregistered land, and his body is free of pesticides and genetically modified organisms. Adding to his autonomy is the fact that nothing about him -- not his name, weight, eye color, or birth date -- is officially registered with the Mexican government. His family does not receive a peso of government aid, nor does it pay a peso worth of taxes. Not even the name of Diego’s town appears on any official map.
By first-world standards, this autonomy comes at a steep price: some serious poverty. Diego’s home has electricity but no running water or indoor plumbing. The outhouse is a hole in the ground concealed by waist-high tarp walls. The bathtub is the small stream in the backyard. Their chickens often free-range it right through their one-room, dirt-floor house. Eating them is considered a luxury.
The population of the town is split between Zapatistas and government loyalists, whom the Zapatistas call “priistas” in reference to Mexico’s ruling political party, the PRI. To discern who is who, all you have to do is check whether or not a family’s roof sports a satellite dish.
Then again, the Zapatistas aren’t focused on accumulating wealth, but on living with dignity. Most of the movement’s work over the last two decades has involved patiently building autonomous structures for Diego and his generation. Today, children like him grow up in a community with its own Zapatista schools; communal businesses; banks; hospitals; clinics; judicial processes; birth, death, and marriage certificates; annual censuses; transportation systems; sports teams; musical bands; art collectives; and a three-tiered system of government. There are no prisons. Students learn both Spanish and their own indigenous language in school. An operation in the autonomous hospital can cost one-tenth that in an official hospital. Members of the Zapatista government, elected through town assemblies, serve without receiving any monetary compensation.
Economic independence is considered the cornerstone of autonomy -- especially for a movement that opposes the dominant global model of neoliberal capitalism. In Diego’s town, the Zapatista families have organized a handful of small collectives: a pig-raising operation, a bakery, a shared field for farming, and a chicken coop. The 20-odd chickens had all been sold just before Christmas, so the coop was empty when we visited. The three women who ran the collective explained, somewhat bashfully, that they would soon purchase more chicks to raise.
As they spoke in the outdoor chicken coop, there were squealing noises beneath a nearby table. A tangled cluster of four newborn puppies, eyes still crusted shut against the light, was squirming to stay warm. Their mother was nowhere in sight, and the whole world was new and cold, and everything was unknown. I watched them for a moment and thought about how, although it seemed impossible, they would undoubtedly survive and grow.
Unlike Diego, the majority of young children on the planet today are born into densely packed cities without access to land, animals, crops, or almost any of the natural resources that are required to sustain human life. Instead, we city dwellers often need a ridiculous amount of money simply to meet our basic needs. My first apartment in New York City, a studio smaller than my host family’s thatched-roof house, cost more per month than the family has likely spent in Diego’s entire lifetime.
As a result, many wonder if the example of the Zapatistas has anything to offer an urbanized planet in search of change. Then again, this movement resisted defeat by the military of a modern state and built its own school, medical, and governmental systems for the next generation without even having the convenience of running water. So perhaps a more appropriate question is: What’s the rest of the world waiting for?
Around six o’clock, when night falls in Oventic, the music for the celebration begins. On stage, a band of guitar-strumming men wear hats that look like lampshades with brightly colored tassels. Younger boys perform Spanish rap. Women, probably from the nearby state of Veracruz, play son jarocho, a type of folk music featuring miniature guitar-like instruments.
It’s raining gently in the open field. The mist clings to shawls and skirts and pasamontañas, the face-covering ski masks that have become iconic imagery for the Zapatistas. “We cover our faces so that you can see us” is a famous Zapatista saying. And it’s true: For a group of people often erased by politicians and exploited by global economies, the ski-masks have the curious effect of making previously invisible faces visible.
Still, there are many strategies to make dissent disappear, of which the least effective may be violence. The most ingenious is undoubtedly to make the rest of the world -- and even the dissenter herself -- dismissive of what’s being accomplished. Since curtailing its military offensive, the government has waged a propaganda war focused on convincing the rest of Mexico, the world, and even Zapatista communities themselves that the movement and its vision no longer exist.
But there are just as many strategies for keeping dissent and dissenters going. One way is certainly to invite thousands of outsiders to visit your communities and see firsthand that they are real, that in every way that matters they are thriving, and that they have something to teach the rest of us. As Diego’s father said in an uncharacteristic moment of boastfulness, “I think by now that the whole world has heard of our organization.”
Writing is another way to prevent an idea and a movement from disappearing, especially when one is hurtling down the highway in Texas headed back to New York City, already surrounded by a reality so different as to instantly make the Zapatistas hard to remember.
The most joyous way to assert one’s existence, however, is through celebration.
The New Year arrived early in Oventic. One of the subcomandantes had just read a communiqué issued by the organization's leadership, first in Spanish, then in the indigenous languages Tzotzil and Tzeltal. The latter translations took her nearly twice as long to deliver, as if to remind us of all the knowledge that was lost with the imposition of a colonial language centuries ago. Then, a low hiss like a cracked soda can, and two fireworks exploded into the air.
“Long live the insurgents!” a masked man on stage cried.
“Viva!” we shouted. The band burst into song, and two more fireworks shot into the sky, their explosions well-timed drumbeats of color and sound. The coordination was impeccable. As the chants continued, the air grew so smoky that we could barely see the fireworks exploding, but in that moment, I could still feel their brilliance and the illumination, 20 years old, of the movement releasing them.
TomDispatch regular Laura Gottesdiener is a journalist and the author of A Dream Foreclosed: Black America and the Fight for a Place to Call Home. She is an editor for Waging Nonviolence and has written for Playboy, Al Jazeera America, RollingStone.com, Ms., the Huffington Post and other publications.
Copyright 2014 Laura Gottesdiener
Image by Julia Stallabrass, licensed under Creative Commons.
Welcome to the NSA’s non-location.
This article originally appeared at Waging Nonviolence.
Mass surveillance has an image problem. The visual references commonly used to portray intelligence agencies—screens, servers and sleek glass buildings—don’t suggest an ethics or a rationale to their operations. They don’t suggest that there are even humans involved in collecting information about millions of other humans. In order to understand a world in which mass surveillance is increasingly deemed an unexceptional fact, it seems useful to face the apparatuses performing that surveillance in the most literal, prosaic way possible. For me, this meant driving to suburban Maryland.
In a small park next to the National Security Agency in Fort Meade, Md., signs explain rules about photographs one can take—and illustrate the kind of photo one isn’t allowed to take. Looking at the diagram of a restricted image reminded me of the ubiquitous stock photograph of the NSA, the one reminiscent of the Kaaba and among the few used by news outlets. The photograph’s ubiquity, along with its subject’s resemblance to another opaque monument, serves as shorthand for an institution that seeks to be perceived as beyond human comprehension or accountability.
I made my pilgrimage not to the NSA, however, but to adjacent temples of lesser gods. The National Business Park is an office complex located less than a mile away from the NSA headquarters. Its tenants are mostly intelligence and defense contractors. According to the writer Tim Shorrock, contracting makes up about 70 percent of the defense intelligence budget. While the U.S. intelligence community has been subject to public scrutiny following Edward Snowden’s revelations about the surveillance state, the private contractors crucial to maintaining it—by designing software and hardware, providing analysis and building physical structures for government intelligence agencies—continue to do their work beneath the radar. As hope emerges that court rulings and panel recommendations might reform the NSA, it remains unlikely that these highly instrumental agents of intelligence gathering will be held accountable for their role in crafting the surveillance state. It also remains unclear where private intelligence contractors might shift their focus if they lose their top client (though their other clients currently include city police departments, international governments and private corporations).
The landscape of the National Business Park offers little insight into the vast, byzantine protocols of the surveillance state. But plausible deniability is itself an architectural choice, one made manifest not only in procurements and mergers, not only in networks and protocols, but also in drywall and concrete and stock photography.
The National Business Park is one of many properties belonging to Corporate Office Properties Trust, or COPT, a publicly traded real estate investment trust based in Columbia, Md. COPT traces its origins to a Minneapolis firm, Royale Investments, founded in 1988. In 1998 the company merged with Constellation Real Estate Group, a fully owned subsidiary of Constellation Energy. This merger led to the acquisition of 1.6 million square feet of mid-Atlantic office property, a new name and a shift in business priorities.
COPT describes itself as “serving the specialized requirements of U.S. Government agencies and defense contractors engaged in defense information technology and national security-related activities.” While there are, of course, other real estate companies working in this niche, COPT is seemingly the only one explicitly and almost exclusively dedicating itself to it. While COPT’s properties are sometimes a supporting character in stories about surveillance, it has generally evaded the spotlight. Press releases about new buildings will mention “a strategic tenant” and nothing else. At times, COPT even claims not to know who its tenants are.
The National Business Park is located in Annapolis Junction, an unincorporated community in Howard County. “Community” is perhaps a generous description. Annapolis Junction itself is a cipher of a place, named in 1840 for a rail junction on the B&O Railroad. Most of its land was repurposed by the federal government in 1917 to create Fort Meade. Industrial facilities, offices, the CSX rail line and Fort Meade make up the majority of its “community.” To call it “liminal” would imply a potentiality that it does not have. It is merely there.
While the NSA is a notable neighbor, the National Business Park is also about three miles away from two prisons and the site of a former prison. The Maryland Correctional Institute for Women still operates nearby, as does the minimum-security Brockbridge Center. Next to these two prisons are the remains of the Maryland House of Corrections, also known as “The Cut,” notorious for its poor conditions and violent history. The Cut closed abruptly in 2007 and was torn down in 2012. Within the National Business Park, the U.S. Federal Bureau of Prisons’ mid-Atlantic regional office works in the same space as small-to-midlevel contractors like Scitor, G2, Invertix (recently rebranded as Altamira) and Ventura Solutions.
That the landscape of the intelligence-industrial complex overlaps with that of the prison-industrial complex reveals something, but what that is remains inescapably inchoate — an uneasy resonance even if not actual collusion. Good cages, apparently, make good neighbors.
Arriving at the National Business Park induced intense déjà vu for me. Before actually visiting the National Business Park, I went there remotely via Google Street View. The landscapes I’d paused and panned now appeared through the screen of a car window, which renders every view cinematic. In the distracted euphoria of finally seeing the office of intelligence contractor giant Booz Allen Hamilton, which I had previously glimpsed only on Street View, for real, I couldn’t bring myself to stop and park. I drove to the end of the parkway instead.
In the satellite view on Google Maps, the buildings at the edge of the National Business Park don’t exist. A real estate broker’s portfolio site indicates that 410 National Business Parkway is home to Lockheed Martin offices, which are reportedly completed and in use by the military contractor. The interior of 420 is still under construction. Apparently it is the future home of SGI Federal.
Walking along the parkway, I began to imagine that the banality of the landscape hid ancient cults that in fact ruled the office park. At the base of a hill, I approached a fetid pond and an abstract sculpture. Neither the altar nor the ducks in the pond offered insight into the secrets of these temples.
The higher the level of restriction at a site, apparently, the more likely that the tenant is the U.S. government. I didn’t even try to photograph the barricades and ID booth at Hercules Road, the path to NBP-1, a building rented by the NSA’s Technology and Systems Organization. Similarly, while Sentinel Drive was accessible, Sentinel Way required proper ID and credentials. Old press releases suggest that they are mostly government offices, including the Department of the Navy’s Center for Information Dominance.
I ventured to Technology Drive, a name for a street that no one would ever have a childhood on. Office parks are fond of buzzwords as addresses: Innovation Road, New Allegiance Drive, Commerce Drive. There were some strangely quaint names as well. The sculpture garden adjacent to the Lockheed Martin offices featured a plaque explaining that the land used to be the “Trusty Friend Farm”—farmland established by Amos Clark in 1829 and preserved until 1989.
At the 2701 Technology Drive compound, I entered a median courtyard with yet another sculpture, a circle formed by three curving stalks meandering toward the sky. I stood in its center and looked up.
The buildings were sleeping giants that should be approached with great caution. The buildings were bland temples to the many demigods of an infinitely complicated cult. The buildings were buildings, decked in surveillance cameras and filled with humans doing their jobs.
Outside one of Booz Allen’s offices at 304 Sentinel Drive, a man driving a pickup truck with the COPT logo told me I couldn’t take photos. I asked if I could at least photograph the sculptures. “As long as you can’t see the buildings,” he said. I was doubtful that I’d actually be stopped from photographing more, but I still went back to my rental car and, in a paranoid flourish, hid my camera’s memory card.
In part, it is this dynamic of uncertainty as to whether you are being observed, and whether you will be punished for your actions, that gives surveillance its power. Photographs, maps and descriptions of office buildings in Maryland aren’t especially valuable to those who oppose the rise of the surveillance state. The only reason to deny anyone the right to take photographs of the National Business Park or the NSA or any apparatus connected to the intelligence community is to maintain a blameless corporate mystique around it.
Maryland’s National Business Park is but one among many manicured landscapes of the intelligence industry. It presents a surface, smooth and menacing like a human face with no features. But someone had to build that surface, someone has to maintain it and someone has to tether the sad sapling trees to the medians. Through its banal, well-groomed foliage, its vacuous monuments and its insistence on image control, the National Business Park creates an environment that denies any rational past, a landscape that renders its corporeal corporate inhabitants beyond not only the politics of the day but also the accountability of history.
As I left the National Business Park, I thought of Robert Smithson, an artist uncannily attuned to unseen sites and “non-sites.” In his essay “Entropy and the New Monuments,” Smithson wrote, “It seems that beyond the barrier, there are only more barriers.” I’d gone to the outskirts of Crypto City and found only more ciphers.
All images by WNV/Ingrid Burrington, except the photo of NSA headquarters, which is by the NSA.
Hope, history, and unpredictability.
This article originally appeared at TomDispatch.
North American cicada nymphs live underground for 17 years before they emerge as adults. Many seeds stay dormant far longer than that before some disturbance makes them germinate. Some trees bear fruit long after the people who have planted them have died, and one Massachusetts pear tree, planted by a Puritan in 1630, is still bearing fruit far sweeter than most of what those fundamentalists brought to this continent. Sometimes cause and effect are centuries apart; sometimes Martin Luther King’s arc of the moral universe that bends toward justice is so long few see its curve; sometimes hope lies not in looking forward but backward to study the line of that arc.
Three years ago at this time, after a young Tunisian set himself on fire to protest injustice, the Arab Spring was on the cusp of erupting. An even younger man, a rapper who went by the name El Général, was on the verge of being arrested for “Rais Lebled” (a tweaked version of the phrase “head of state”), a song that would help launch the revolution in Tunisia.
Weeks before either the Tunisian or Egyptian revolutions erupted, no one imagined they were going to happen. No one foresaw them. No one was talking about the Arab world or northern Africa as places with a fierce appetite for justice and democracy. No one was saying much about unarmed popular power as a force in that corner of the world. No one knew that the seeds were germinating.
A small but striking aspect of the Arab Spring was the role of hip-hop in it. Though the U.S. government often exports repression—its billions in aid to the Egyptian military over the decades, for example—American culture can be something else altogether, and often has been.
Henry David Thoreau wrote books that not many people read when they were published. He famously said of his unsold copies, "I have now a library of nearly 900 volumes, over 700 of which I wrote myself.” But a South African lawyer of Indian descent named Mohandas Gandhi read Thoreau on civil disobedience and found ideas that helped him fight discrimination in Africa and then liberate his own country from British rule. Martin Luther King studied Thoreau and Gandhi and put their ideas to work in the United States, while in 1952 the African National Congress and the young Nelson Mandela were collaborating with the South African Indian Congress on civil disobedience campaigns. You wish you could write Thoreau a letter about all this. He had no way of knowing that what he planted would still be bearing fruit 151 years after his death. But the past doesn’t need us. The past guides us; the future needs us.
An influential comic book on civil disobedience and Martin Luther King published by the Fellowship of Reconciliation in the U.S. in 1957 was translated into Arabic and distributed in Egypt in 2009, four decades after King’s death. What its impact was cannot be measured, but it seems to have had one in the Egyptian uprising, which was a dizzying mix of social media, outside pressure, street fighting, and huge demonstrations.
The past explodes from time to time, and many events that once seemed to have achieved nothing turn out to do their work slowly. Much of what has been most beautifully transformative in recent years has also been branded a failure by people who want instant results guaranteed or your money back. The Arab Spring has just begun, and if some of the participant nations are going through their equivalent of the French Revolution, it’s worth remembering that France, despite the Terror and the Napoleonic era, never went back either to absolutist monarchy or the belief that such a condition could be legitimate. It was a mess, it was an improvement, it’s still not finished.
The same might be said of the South African upheaval Mandela catalyzed. It made things better; it has not made them good enough. It’s worth pointing out as well that what was liberated by the end of apartheid was not only the nonwhite population of one country, but a sense of power and possibility for so many globally who had participated in the boycotts and other campaigns to end apartheid in that miraculous era from 1989 to 1991 that also saw the collapse of the Soviet Union, successful revolutions across Eastern Europe, the student uprising in Beijing, and the beginning of the end of many authoritarian regimes in Latin America.
In the hopeful aftermath of that transformation, Mandela wrote, “The titanic effort that has brought liberation to South Africa and ensured the total liberation of Africa constitutes an act of redemption for the black people of the world. It is a gift of emancipation also to those who, because they were white, imposed on themselves the heavy burden of assuming the mantle of rulers of all humanity. It says to all who will listen and understand that, by ending the apartheid barbarity that was the offspring of European colonization, Africa has, once more, contributed to the advance of human civilization and further expanded the frontiers of liberty everywhere.”
The arc of justice is long. It travels through New Orleans, the city I’ve returned to again and again since Hurricane Katrina. It’s been my way of trying to understand not just disaster, but community, culture, and continuity, three things that city possesses as no place else in the nation. Hip-hop comes most directly from the South Bronx, but if you look at the 1970s founders of that genre of popular music, you see that some of the key figures were Caribbean, and if you look at their formative music, it included the ska and reggae that were infused with the influence of New Orleans. (In addition, that city’s native son and major jazz figure, Donald Harrison, Jr., was a mentor to seminal New York City rapper Notorious B.I.G.)
If you look at New Orleans, what you see is an astonishing example of the survival of culture—and of the culture of survival.
Maybe you’d have to do what I was doing in early 2011—poke around in the origins of American music in New Orleans—to be struck by the way so many essential parts of it came from Africa in the 18th and early 19th centuries, and some of it returned to that continent again in recent years. I was looking at maps, making maps, thinking about how to chart the unexpected ways immaterial things move through time and space.
The saddest map I have ever seen is the oft-published one of the triangle trade, a vicious circle that isn’t even a circle. It depicts the routes of the 18th and 19th century European traders who brought manufactured goods from their continent to West Africa to exchange for human beings who were then transported to the United States and the Caribbean to be exchanged for raw materials, especially sugar, rum, and tobacco. It’s a map that tells of people made into tools and commodities, but it tells us nothing of what the enslaved brought with them.
Stripped bare of all possessions and rights, they carried memory, culture, and resistance in their heads. New Orleans let those things flourish as nowhere else in the United States during the long, obscene era of slavery, while the biggest slave uprising in U.S. history took place nearby in 1811 (its participants including two young Asante warriors who had arrived in New Orleans on slave ships five years earlier). From the mid-18th century to the 1840s, the enslaved of New Orleans were permitted to gather on Sundays in the plaza on the edge of the old city known then and now as Congo Square.
"On sabbath evening," the visitor H.C. Knight famously wrote in 1819, "the African slaves meet on the green, by the swamp, and rock the city with their Congo dances." The great music historian Ned Sublette observes that this is the first use of rock as a verb about music, and in his marvelous book The World That Made New Orleans notes that what is arguably the first rock and roll record, Roy Brown’s 1947 “Good Rocking Tonight,” was recorded a block away.
In between, what Africans had brought with them continued its metamorphosis in the city: jazz famously arose from black culture near Congo Square, as did important rhythm and blues strains and influences, as well as performers, then funk, and eventually hip-hop. Funk arose in part from Afro-Cuban influences and from the African-American tradition of the Mardi Gras Indians—not Native Americans, but working-class African Americans. Their elaborate outfits and rites officially pay homage to the Native Americans who sheltered runaway slaves (and sometimes intermarried with them), but have a startling resemblance to African beaded costumes. The Mardi Gras Indians still parade on that day and other days, chanting and singing, challenging each other through song. One of the recurrent chants declares, “We won’t bow down.”
Though New Orleans is mainly famous for other things, it has also been a city of resistance—from the slave revolts of the late 18th and early 19th centuries to late-19th-century segregation-breaker Homer Plessy to Ruby Bridges, the six-year-old who in 1960 was the first black child to integrate a white school in the South. The span of time is not as long as you might think: Fats Domino, one of the founding fathers of rock and roll, is still alive and has a home in the Lower Ninth Ward. The midwife at his birth a few blocks away was his grandmother, who had been born into slavery.
New Orleanian Herreast Harrison, a woman in her 70s, mother of jazzman Donald Harrison Jr., widow of a Mardi Gras Indian chief, cultural preserver, and a dynamic force in the city, said to me of Mardi Gras Indian culture:
“But those groups remembered their cultural heritage and practiced it there, that memory, they had this overarching memory of their pasts. And when they were there, they were free. And their spirits soared to the high heavens. They were themselves. In spite of limitations in every aspect of their lives. Where they should have felt like, ‘we are nothing,’ because you get brainwashed constantly about the fact that you're a nobody... but they didn't, they brought back. And now it's part of the world, that music.”
And her son, Donald Harrison, Jr., added:
One other very important thing that Congo Square represented in the culture was that no matter what’s going on in life you transcend the culture and Congo Square helps you. It transcends and puts you into a transcendental state so that you are free at that moment. Even today, that’s the power of the music and that’s why it brings us together. You have a moment of freedom where you transcend everything that’s going on around you. Berthold Auerbach said it so eloquently: ‘Music washes away from the soul the dust of everyday life.’ At that moment you become free, which is why the music is part of the world now. Everybody wants a moment to transcend. It goes inside of you and you know where you can go to be free. No matter if you’re in Norway, South America, or Beijing, you know, ‘this music sets me free.’ So Congo Square set the world free, basically. It gives freedom to everyone around it.
In my latest project, Unfathomable City: A New Orleans Atlas, I tried to convey what New Orleans music gave the world in a map labeled “Repercussions: Rhythm and Resistance Across the Atlantic.” Those involuntary émigrés brought by slave ship were said to have nothing, but what they had still reaches and spreads and liberates.
What we call the Arab Spring was first of all the North African Spring—in Tunisia, Egypt, and Libya—and hip-hop was already there. It has, in fact, become a global means of dissent, from indigenous Oaxaca, Mexico, to Cairo, Egypt. Which does not mean that everything is fine (or that hip-hop can't also be used for consumerism or misogyny). It’s a reminder, however, that even in the most horrific of circumstances, something remarkable more than survived; it thrived and grew and eventually reached around the Earth.
Nearly three years after the first sparks of the Arab Spring began, it’s wiser to consider it, too, barely begun rather than ended in failure. More than two years after the first members of Occupy Wall Street began camping out in Zuccotti Park in lower Manhattan, that movement is not over either, though almost all the encampments have subsided and the engagement has new names: Occupy Sandy, Strike Debt, and more. That everything continues to metamorphose seems a better way to think of social upheavals than obituaries and epitaphs.
Maps of the Unpredictable
Whenever I look around me, I wonder what old things are about to bear fruit, what seemingly solid institutions might soon rupture, and what seeds we might now be planting whose harvest will come at some unpredictable moment in the future. The most magnificent person I met in 2013 quoted a line from Michel Foucault to me: “People know what they do; frequently they know why they do what they do; but what they don't know is what what they do does.” Someone saves a life or educates a person or tells her a story that upends everything she assumed. The transformation may be subtle or crucial or world changing, next year or in 100 years, or maybe in a millennium. You can’t always trace it but everything, everyone has a genealogy.
In her forthcoming book The Rise: Creativity, the Gift of Failure, and the Search for Mastery, Sarah Lewis tells how a white teenager in Austin, Texas, named Charles Black heard a black trumpet player in the 1930s who changed his thinking—and so our lives. He was riveted and transformed by the beauty of New Orleans jazzman Louis Armstrong’s music, so much so that he began to reconsider the segregated world he had grown up in. "It is impossible to overstate the significance of a 16-year-old Southern boy's seeing genius, for the first time, in a black," he recalled decades later. As a lawyer dedicated to racial equality and civil rights, he would in 1954 help overturn segregation nationwide, aiding the plaintiffs in Brown v. Board of Education, the landmark Supreme Court case ending segregation (and overturning Plessy v. Ferguson, the failed anti-segregation lawsuit launched in New Orleans 60 years earlier).
How do you explain what Louis Armstrong’s music does? Can you draw a map of the United States in which the sound of a trumpeter in 1930s Texas reaches back to moments of liberation created by slaves in Congo Square and forward to the Supreme Court of 1954?
Or how do you chart the way in which the capture of three young American hikers by Iranian border guards on the Iraq-Iran border in 2009 and their imprisonment—the men for 781 days—became the occasion for secret talks between the U.S. and Iran that led to the interim nuclear agreement signed last month? Can you draw a map of the world in which three idealistic young people out on a walk become prisoners and then catalysts?
Looking back, one of those three prisoners, Shane Bauer, wrote, "One of my fears in prison was that our detention was only going to fuel hostility between Iran and the U.S. It feels good to know that those two miserable years led to something that could lead to something better than what was before."
Bauer later added:
The reason our tragedy led to an opening between the United States and Iran was that many people were actively working to end our suffering. To do so, our friends and families had to strive to build a bridge between the U.S. and Iran when the two governments were refusing to do it themselves. Sarah [Shourd, the third prisoner] is not a politician and she has no desire to be, but when she was released a year before Josh and me, she made herself into a skilled and unrelenting diplomat, strengthening connections between Oman and the U.S. that ultimately led to these talks.
A decade ago I began writing about hope, an orientation that has nothing to do with optimism. Optimism says that everything will be fine no matter what, just as pessimism says that it will be dismal no matter what. Hope is a sense of the grand mystery of it all, the knowledge that we don’t know how it will turn out, that anything is possible. It means recognizing that the sound of a trumpet at a school dance in Austin, Texas, may resound in the Supreme Court 20 years later; that an unfortunate hike in the borderlands might help turn two countries away from war; that Edward Snowden, a young NSA contractor and the biggest surprise of this year, might revolt against that agency’s sinister invasions of privacy and be surprised himself by the vehemence of the global reaction to his leaked data; that culture which left Africa more than 200 years ago might return to that continent as a tool for liberation—that we don’t know what we do does.
That Massachusetts pear tree is still bearing fruit almost 400 years after it was planted. The planter of that tree also helped instigate the war against the Pequots, who were massacred in 1637. “The survivors were sold into slavery or given over to neighboring tribes. The colonists even barred the use of the Pequot name, ‘in order to cut off the remembrance of them from the earth,’ as the leader of the raiding party later wrote,” according to the New York Times.
For centuries thereafter, that Native American nation was described as extinct, erased, gone. It was written about in the past tense when mentioned at all. In the 1970s, however, the Pequots achieved federal recognition, entitling them to the rights that Native American tribes have as “subject sovereign nations”; in the 1980s, they opened a bingo hall on their reservation in Connecticut; in the 1990s, it became the biggest casino in the western world. (Just for the record, I’m not a fan of the gambling industry, but I am of unpredictable narratives.)
With the enormous income from that project, the tribe funded a Native American history museum that opened in 1998, also the biggest of its kind. The new empire of the Pequots has been on rocky ground since the financial meltdown of 2008, but the fact that it arose at all is astonishing more than 150 years after Herman Melville stuck a ship called the Pequod in the middle of his novel Moby Dick and mentioned that it was named after a people "now extinct as the ancient Medes.” Are there longer odds in New England than that a people long pronounced gone would end up profiting from the bad-math optimism of their neighbors?
Meanwhile, that pear tree continues to bear fruit; meanwhile, hip-hop continues to be a vehicle for political dissent from the Inuit far north to Latin America; meanwhile, diplomatic relations with Iran have had some surprising twists and turns, most recently away from war.
I see the fabric of my country’s rights and justices fraying and I see climate change advancing. There are terrible things about this moment and it’s clear that the consequences of climate change will get worse (though how much worse still depends on us). I also see that we never actually know how things will play out in the end, that the most unlikely events often occur, that we are a very innovative and resilient species, and that far more of us are idealists than is good for business and the status quo to acknowledge.
What I learned first in New Orleans after Hurricane Katrina was how calm, how resourceful, and how generous people could be in the worst times: the “Cajun Navy” that came in to rescue people by boat, the stranded themselves who formed communities of mutual aid, the hundreds of thousands of volunteers, from middle-aged Mennonites to young anarchists, who arrived afterward to help salvage a city that could have been left for dead.
I don’t know what’s coming. I do know that, whatever it is, some of it will be terrible, but some of it will be miraculous, that term we reserve for the utterly unanticipated, the seeds we didn’t know the soil held. And I know that we don’t know what we do does. As Shane Bauer points out, the doing is the crucial thing.
Rebecca Solnit co-directed Unfathomable City: A New Orleans Atlas, the sequel to her 2010 Infinite City: A San Francisco Atlas. A TomDispatch regular, she has written the final article of the year for that site for the last nine years.
Copyright 2013 Rebecca Solnit
Image by Derek Bridges, licensed under Creative Commons.