5/15/2013 11:44:33 AM
Originally intended to protect defendants from unnecessary indictments,
grand juries have recently been used to investigate and intimidate innocent activists.
New York City legal
activist Jerry Koch is only the latest victim.
It might seem ironic that the
only place you can't exercise your Fifth Amendment rights would be a federal
courtroom, considering it's just such a place the amendment was designed for. It
might seem ironic that a process designed to protect people accused of serious
crimes can be used to imprison people who have committed no crime for up to 18
months without bringing charges against them. It might, unless you know the
history of grand juries.
Grand juries are an old feature
of English common law, originally designed to ensure that
prosecutors couldn't bring serious criminal charges against people without
evidence. The grand jury determines, before the trial, whether the prosecuting
attorney has enough evidence to continue with the case. Since it is an
evidentiary hearing that could affect the outcome of the trial, the grand jury
is completely secret, usually with just the prosecutor, the jury, and the
witness giving testimony in the room.
But throughout the 20th century,
grand juries were used to bully political "enemies" of the state. From union
organizers to Communist Party members to Black Panthers to environmental
activists, federal grand juries have been used by the government as a tactic
of harassment and information gathering. Witnesses subpoenaed to the grand jury
cannot have their lawyer with them, and cannot refuse to testify. Despite Fifth
Amendment rights, refusing to speak to the grand jury can result in contempt of
court charges and the resister spending the length of the grand jury in jail,
which can be as long as 18 months.
Thus, by acting on one of your
most basic and core rights, in a room with no judge and no counsel present, you
can be de facto convicted of contempt (the prosecutor need only bring you
in front of a judge to rubber-stamp the contempt charge) and thrown in jail to
languish for the duration of the grand jury process.
Just such a prospect is facing a
New York City
legal activist and anarchist: Gerald Koch is being subpoenaed regarding a
2008 bombing that broke a window and hurt no one, one he was already
subpoenaed over once before, in 2009. The government is pursuing him not because
they suspect him of being involved, but because they think he may have overheard
information about it in a bar. As Jerry has put it in a public statement:
Given that I
publically made clear that I had no knowledge of this alleged event in 2009,
the fact that I am being subpoenaed once again suggests that the FBI does not
actually believe that I possess any information about the 2008 bombing, but
rather that they are engaged in a ‘fishing expedition’ to gain information
concerning my personal beliefs and political associations.
Last year, four anarchists in
the Pacific Northwest faced a similar grand jury over vandalism on May Day 2011: two
spent five months in jail, a third spent seven, all of them spending much of
that time in solitary confinement, despite the fact that they committed no
crime. Jerry faces a similar possibility of jail time. By refusing to speak to
a grand jury, they stand up for the safety of their friends and for their
rights, and they face serious consequences for doing so. What does it say about
our “free society” when it jails citizens for asserting their rights in a
completely closed process absent a judge or a lawyer?
If you're in the New York area, Jerry's
subpoena date is 10:00am on May 16th, and people are going to pack the
courtroom at 500 Pearl St.
You can learn more about Jerry’s case, and how you can support him, at Jerry Resists.
Image by Brian
Turner, licensed under Creative Commons.
5/2/2013 3:23:50 PM
As marketing to children intensifies, what can society do?
This article is adapted from Solutions Online and is licensed under Creative Commons 3.0.
A four-year-old arrives at school and starts crying when she realizes her lunch is packed in a generic plastic bag, not the usual Disney Princess lunchbox she so loves. A friend tells her she won’t be able to sit at the princess lunch table—it’s only for girls with princess lunchboxes.
A fourth grader arrives home from school all excited. He has a Book It certificate from Pizza Hut because his mother signed the form showing that he met the reading-at-home goal his teacher set for him. He pleads with his mother to take him to Pizza Hut for dinner that night.
Sixth graders are assigned the task of writing to their principal about something important that they would like to see happen at their school. They decide to ask for school vending machines that sell snack foods and drinks.
Marketing is a more powerful force in the lives of children growing up today than ever before, beginning from a very young age. The stories above provide but a few examples of how it can shape learning and behavior at home and in school. Marketing affects what children want to eat, wear, and play, and with whom they play. It also shapes what they learn, what they want to learn, and why they want to learn. And it primes them to be drawn into, exploited, and influenced by marketing efforts in schools.
What Can We Do?
Many feel that a complete ban on all marketing to children is an impossible dream. But several countries have done just that. Advertising to children is restricted in Great Britain, Belgium, Denmark, and Greece, and totally banned in Sweden and Norway. Studies have since shown that, as a result, children in Sweden want fewer toys, confirming what companies already know: advertising to children works. Why advertise if it's not effective? Similar efforts to restrict advertising were attempted in the United States in the 1970s but, unfortunately, failed to pass.
More restrictions might be on the way in countries like the UK, where a recent investigation into the causes of the 2011 looting found that a culture of consumption, fueled by marketers, played a role in the civil unrest. Early in 2012, the Riots, Communities, and Victims Panel, set up by Prime Minister David Cameron and deputy prime minister Nick Clegg, called for action against “aggressive advertising aimed at young people,” citing evidence that “rampant materialism was an underlying cause of last year’s lawlessness.”
The fact that marketing in schools is such an omnipresent and pernicious force in children’s lives makes finding solutions of utmost importance. It is unrealistic to expect that in the current economic times we can make marketing and the influence of marketing in schools go away. But, there is much we can and must do to reduce its harmful impact on children. No one effort can solve the problem; a multifaceted approach is needed. Here is what a comprehensive and meaningful response, directed at children, families, schools, communities, and the wider society, might be:
1. Educate parents, teachers and policymakers about the harm that marketing to children, especially in schools, can cause to children’s development, learning, and behavior. It is only through a change in public understanding of the dangers that we will be able to turn the tide.
2. Protect children as much as possible from exposure to commercial culture. Parents can use strategies at home that reduce children’s exposure to and focus on commercial culture and products, including less dependence on media that has advertising and multiple products associated with it. They can promote their children’s involvement with meaningful real world activities that do not focus on consumption or advertising. One school sent home a letter to parents with ideas for birthday parties that didn’t involve commercial themes like Disney Princesses or fast food chains’ packaged events.
Teachers and school administrators can work to reduce marketing in schools. For instance, they can limit the number of products with logos in school. This might involve setting up rules about what commercial products and logos children bring to school, and coming up with alternative, low-cost strategies to meet the same needs the banned product met. For instance, one early childhood program banned lunch boxes with logos, and sent home suggestions to parents about alternative, inexpensive containers they could use to pack their children’s lunches. One school board created a middle-school dress code that severely limited the size of logos that could appear on students’ clothing because so much bullying and teasing occurred against the children who didn’t have the “right,” clearly visible logos on their clothing.
3. Counteract the harmful lessons children learn from marketing both in and out of school. Teach children about the nature and impact of marketing and commercial culture in age-appropriate ways. Children are unduly influenced by ads and marketing practices directed at them because of how they think and also because of the unrelenting ways marketers capture their attention and loyalty. One teacher designed an activity based on the book Arthur’s TV Troubles by Marc Brown. The teacher asked students whether they had ever been disappointed with something they bought based on an ad. Every child had a story to share. When they wrote these stories down for homework, they produced their best writing of the year! In these days of No Child Left Behind pressures, which have forced educators to focus on the demands of the test rather than on the broad-based learning needs of children, we must convince educational policy makers that children will be more successful learners if they aren’t constantly being lured away from their lessons by marketing.1
Children need to feel safe talking to a trusted adult about what they see marketed at school and beyond and what they think about it, without being embarrassed, ridiculed, or punished. Only by having such conversations can we learn what children think and, in turn, influence their thinking. This does not mean lecturing about what is right and wrong or good and bad, or criticizing children for what they say and think. It means having give-and-take conversations that show we care about what they think and say, and hope they will care about and listen to what we have to say too. This is the key starting point for influencing the lessons that children are learning from marketing in schools.
4. Enact government regulations and policies that limit marketing in schools. Government and policy makers must play a role in limiting marketing to children, even in these harsh economic times. The best way to make this happen will be by providing adequate funding for schools, so that schools do not need to be so dependent on corporations. Great Britain provides the United States with a powerful example for what we can do: in 2006 it established a ban on junk food in school meals.2
One Organization, Making A Difference
Ten years ago, a Harvard academic, a child advocate, and a puppeteer launched an organization named the Campaign for a Commercial-Free Childhood (CCFC), and it has led a national effort to police advertising and marketing to children. Based out of Boston, its coalition of educators, health care providers, parents, academics, and advocacy groups takes on major corporations it finds marketing to children, from Pizza Hut and Sunny D to the coal industry and Disney. Started by Susan Linn, a professor, the organization has landed a number of recent victories, including persuading Disney to offer a refund to parents who bought Baby Einstein videos and pressuring Scholastic to stop taking money from the coal industry. Scholastic was forced to drop its curriculum for fourth graders after admitting it was paid for by the National Coal Foundation. The curriculum was, unsurprisingly, one-sided in its endorsement of coal, without any mention of the environmental repercussions or of alternative energies.
CCFC's current campaign includes a move to pressure PBS to drop its partnership with the fast-food company Chick-fil-A, in which the channel is paid to present commercials for fast food at the beginning and end of its shows. It also wants advertising to be removed from school buses. In an age when it seems even the most well-respected advocacy groups—from Sierra Club to Save the Children—have begun accepting corporate money, CCFC stands alone in refusing to be bought off.
Although there is tremendous work to be done, and the advertising and marketing industry is a financial behemoth to tackle, we believe that children deserve to grow up free of invasive and unrelenting marketing messages that peddle products known to be harmful to the health and well-being of young people. Children deserve the opportunity to explore their creativity without the interference of do-it-for-you toys. They must be able to develop the capacity to make independent decisions, and to enjoy life free from the insecurities and pressures inherent in marketing campaigns. We hope that the United States will follow the lead of other countries and recognize that restricting corporations’ ability to market to children is a healthy and necessary step.
- Defending the Early Years [online]. www.defendingtheearlyyearsproject.org.
- BBC News. Junk food banned in school meals [online] (May 19, 2006). news.bbc.co.uk/2/hi/uk_news/education/4995268.stm.
Photo by Labpluto123, licensed under Creative Commons.
4/18/2013 10:57:24 AM
From right-wing think tanks to Homeland
Security to the “drone lobby,” a lot’s riding on the constant threat of global terrorism. Here’s how it all started.
This article originally
appeared at TomDispatch.
The communist enemy, with the “world’s
fourth largest military," has been trundling missiles around and threatening the United States
with nuclear obliteration. Guam, Hawaii, Washington: all, it claims, are targetable. The coverage in
the media has been hair-raising. The U.S. is rushing an untested missile defense system to Guam,
deploying missile-interceptor ships off the South Korean coast, sending “nuclear capable” B-2 Stealth bombers thousands of
miles on mock bombing runs, pressuring China, and conducting large-scale war games with its South Korean ally.
Only one small problem: there is as yet little evidence that the enemy with a few nuclear weapons
facing off (rhetorically at least) against an American arsenal of 4,650 of them has the ability to miniaturize and mount even one on a missile, no
less deliver it accurately, nor does it have a missile capable of reaching
Hawaii or Washington, and I wouldn't count on Guam either.
It also happens to be a desperate country, one possibly without enough fuel to fly a modern air force, whose
people, on average, are inches shorter than their southern neighbors thanks to
decades of intermittent famine and malnutrition, and who are ruled by a bizarre
three-generational family cult. If that other communist, Karl Marx, hadn’t once
famously written that history repeats itself “first as tragedy, then as farce,”
we would have had to invent the phrase for this very moment.
In the previous century, there were two devastating
global wars, which left significant parts of the planet in ruins. There was
also a "cold war" between two superpowers locked in a system of mutual
assured destruction (aptly acronymed as MAD) whose nuclear arsenals were
capable of destroying the planet many times over. Had you woken up any morning
in the years between December
7, 1941, and December 26, 1991, and been told that the leading
international candidate for America's Public Enemy Number One was Kim Jong-un’s
ramshackle, comic-opera regime in North Korea, you might have gotten down on
your hands and knees and sent thanks to pagan gods.
The same would be true for the other candidates for that
number one position since September 11, 2001: the original al-Qaeda (largely
decimated), al-Qaeda in the Arabian Peninsula located in poverty-stricken areas
of poverty-stricken Yemen, the Taliban in poverty-stricken Afghanistan, unnamed
jihadis scattered across poverty-stricken areas of North Africa, or
Iran, another rickety regional power run by not particularly adept theocrats.
All these years, we’ve been launching wars and pursuing a
“global war on terror." We’ve poured money into national security as if
there were no tomorrow. From our police to our borders, we’ve up-armored everywhere. We constantly
hear about “threats” to us and to the “homeland.” And yet, when you knock on
the door marked “Enemy,” there’s seldom anyone home.
Few in this country have found this
striking. Few seem to notice any disjuncture between the enemy-ridden,
threatening, and deeply dangerous world we have been preparing ourselves for
(and fighting in) this last decade-plus and the world as it actually is, even
those who lived through significant parts of the last anxiety-producing, bloody
century.
You know that feeling when you wake up and realize you’ve
had the same recurrent nightmare yet again? Sometimes, there’s an equivalent in
waking life, and here’s mine: every now and then, as I read about the next move
in the spreading war on terror, the next drone assassination, the next
ratcheting up of the surveillance game, the next expansion of the secrecy that
envelops our government, the next set of expensive actions taken to guard us --
all of this justified by the enormous threats and dangers that we face -- I
think to myself: Where’s the enemy? And then I wonder: Just what kind of a
dream is this that we’re dreaming?
A Door Marked “Enemy” and No One Home
Let’s admit it: enemies can have their uses. And let’s
admit as well that it’s in the interest of some in our country that we be seen
as surrounded by constant and imminent dangers on an enemy-filled planet. Let’s
also admit that the world is and always will be a dangerous place in all sorts
of ways.
Still, in American terms, the bloodlettings, the
devastations of this new century and the last years of the previous one have
been remarkably minimal or distant; some of the worst, as in the multi-country
war over the Congo with its more than five million dead, have passed us by entirely;
some, even when we launched them, have essentially been imperial frontier
conflicts, as in Iraq and Afghanistan, or interventions of little cost (to us)
as in Libya, or frontier patrolling operations as in Pakistan, Yemen, Somalia,
and Northern Africa. (It was no mistake that, when Washington
launched its special operations raid on Abbottabad to
get Osama bin Laden, it was given the code name "Geronimo" and the message from the SEAL team
recording his death was "Geronimo-E KIA" or "enemy killed in action.")
And let’s admit as well that, in the wake of those wars
and operations, Americans now have more enemies, more angry, embittered people
who would like to do us harm than on September 10, 2001. Let’s accept that
somewhere out there are people who, as George W. Bush once liked to say, “hate us" and what we stand for. (I
leave just what we actually stand for to you, for the moment.)
So let’s consider those enemies briefly. Is there a major
state, for instance, that falls into this category, like any of the great
warring imperial European powers from the sixteenth century on, or Nazi Germany
and Imperial Japan in World War II, or the Soviet Union
of the Cold War era? Of course not.
There was admittedly a period when, in order to pump up
what we faced in the world, analogies to World War II and the Cold War were
rife. There was, for instance, George W. Bush’s famed rhetorical construct, the Axis of
Evil (Iraq, Iran, and North Korea), patterned by his
speechwriter on the German-Italian-Japanese “axis” of World War II. It was, of
course, a joke construct, if reality was your yardstick. Iraq and Iran were then enemies. (Only in
the wake of the U.S.
invasion and occupation of Iraq
have they become friends and allies.) And North Korea had nothing whatsoever
to do with either of them. Similarly, the American occupation of Iraq was once regularly compared to the
U.S. occupations of Germany and Japan, just as Saddam Hussein had long been presented as a modern Hitler.
In addition, al-Qaeda-style Islamists were regularly
referred to as Islamofascists, while certain military and neocon types
with a desire to turn the war on terror into a successor to the Cold War took
to calling it “the long war,” or even “World War IV.” But all of this was so wildly out of whack
that it simply faded away.
As for who’s behind that door marked “Enemy,” if you
opened it, what would you find? As a start, scattered hundreds or, as the years have
gone by, thousands of jihadis, mostly in the poorest backlands of the
planet and with little ability to do anything to the United States. Next, there were a
few minority insurgencies, including the Taliban and allied forces in Afghanistan and separate Sunni and Shia ones in Iraq. There
also have been tiny numbers of wannabe
Islamic terrorists in the U.S.
(once you take away the string of FBI
sting operations that have regularly turned hopeless slackers and lost
teenagers into the most dangerous of fantasy Muslim plotters). And then, of
course, there are those two relatively hapless regional powers, Iran and North Korea, whose bark far exceeds
their potential bite.
The Wizard of Oz on 9/11
The United States, in other words, is probably in less danger from external enemies than at any
moment in the last century. There is no other imperial power on the planet
capable of, or desirous of, taking on American power directly, including China. It’s
true that, on September 11, 2001, 19 hijackers with box cutters produced a
remarkable, apocalyptic, and devastating TV show in which almost 3,000 people died. When those giant towers in
downtown New York
collapsed, it certainly had the look of nuclear disaster (and in those first days, the
media was filled with nuclear-style references), but it wasn't actually an
apocalyptic event.
The enemy was still nearly nonexistent. The act cost
bin Laden only an estimated $400,000-$500,000, though it would lead to a series
of trillion-dollar wars. It was a nightmarish event that had a
malign Wizard of Oz quality to it: a tiny man producing giant
effects. It in no way endangered the state. In fact, it would actually
strengthen many of its powers. It put a hit on the economy, but a passing one.
It was a spectacular and spectacularly gruesome act of terror by a small,
murderous organization then capable of mounting a major operation somewhere on
Earth only once every couple of years. It was meant to spread fear, but nothing
more.
When the towers came down and you could suddenly see to
the horizon, it was still, in historical terms, remarkably enemy-less. And yet
9/11 was experienced here as a Pearl Harbor
moment -- a sneak attack by a terrifying enemy meant to disable the country.
The next day, newspaper headlines were filled with variations on “A Pearl Harbor of the Twenty-First Century.”
If it was a repeat of December 7, 1941, however, it lacked an imperial Japan or
any other state to declare war on, although one of the weakest partial states
on the planet, the Taliban's Afghanistan, would end up filling the bill
adequately enough for Americans.
To put this in perspective, consider two obvious major
dangers in U.S.
life: suicide by gun and death by car. In 2010, more than 19,000 Americans killed themselves using guns. (In the same
year, there were “only” 11,000 homicides nationwide.) In 2011, 32,000 Americans
died in traffic accidents (the lowest figure in 60 years, though it was again on the rise in the first six months of 2012). In other
words, Americans accept without blinking the equivalent yearly of more than six
9/11s in suicides-by-gun and more than 10 when it comes to vehicular deaths.
Similarly, had the underwear bomber, to take one post-9/11 example of
terrorism, succeeded in downing Flight 253 and murdering its 290 passengers, it would have
been a horrific act of terror; but he and his compatriots would have had to
bring down 65 planes to reach the annual level of weaponized suicides and more
than 110 planes for vehicular deaths.
And yet no one has declared war on either the car or the
gun (or the companies that make them or the people who sell them). No one has
built a massive, nearly trillion-dollar car-and-gun-security-complex to deal
with them. In the case of guns, quite the opposite is true, as the post-Newtown
debate over gun control has made all too clear. On both scores, Americans have
decided to live with perfectly real dangers and the staggering carnage that
accompanies them, constraining them on occasion or sometimes not at all.
Despite the carnage of 9/11, terrorism has been a small-scale American danger in the years since, worse than
shark attacks, but not much else. Like a wizard, however, what Osama bin Laden
and his suicide bombers did that day was create an instant sense of an enemy so
big, so powerful, that Americans found “war” a reasonable response; big enough
for those who wanted an international police action against al-Qaeda to be
laughed out of the room; big enough to launch an invasion of revenge against
Iraq, a country unrelated to al-Qaeda; big enough, in fact, to essentially
declare war on the world. It took next to no time for top administration
officials to begin talking about targeting 60
countries, and as journalist Ron Suskind has reported, within six days of the attack, the CIA had
topped that figure, presenting President Bush with a “Worldwide Attack Matrix,”
a plan that targeted terrorists in 80 countries.
What’s remarkable is how little the disjuncture between
the scope and scale of the global war that was almost instantly launched and
the actual enemy at hand was ever noted here. You could certainly make a
reasonable argument that, in these years, Washington has largely fought no one -- and
lost. Everywhere it went, it created enemies who had, previously, hardly
existed and the process is ongoing. Had you been able to time-travel back to the Cold
War era to inform Americans that, in the future, our major enemies would be in
Afghanistan, Yemen, Somalia, Mali, Libya, and so on, they would surely have
thought you mad (or lucky indeed).
Creating an Enemy-Industrial Complex
Without an enemy of commensurate size and threat, so much
that was done in Washington
in these years might have been unattainable. The vast national security building and spending spree -- stretching from the Virginia
suburbs of Washington, where the National Geospatial-Intelligence Agency
erected its new $1.8 billion headquarters, to Bluffdale, Utah, where the
National Security Agency is still constructing a $2 billion, one-million-square-foot data center for storing
the world’s intercepted communications -- would have been unlikely.
Without the fear of an enemy capable of doing anything,
money at ever escalating levels would never have poured into homeland security,
or the Pentagon, or a growing complex of crony corporations associated with our
weaponized safety. The exponential growth of the national security complex, as
well as of the powers of the executive branch when it comes to
national security matters, would have been far less likely.
Without 9/11 and the perpetual “wartime” that followed,
along with the heavily promoted threat of terrorists ready to strike and
potentially capable of wielding biological, chemical, or even nuclear weapons,
we would have no Department of Homeland Security nor the lucrative mini-homeland-security complex that surrounds it; the
17-outfit U.S. Intelligence Community with its massive $75 billion official budget would have been far less impressive;
our endless drone wars and the “drone lobby” that goes with them might never have
developed; and the U.S. military would not have an ever growing secret military, the Joint Special Operations Command,
gestating inside it -- effectively the president’s private army, air force, and
navy -- and already conducting largely secret operations across much of the
planet.
For all of this to happen, there had to be an
enemy-industrial complex as well, a network of crucial figures and institutions
ready to pump up the threat we faced and convince Americans that we were in a
world so dangerous that rights, liberty, and privacy were small things to
sacrifice for American safety. In short, any number of interests from Bush
administration figures eager to “sweep it all up” and do whatever they wanted in the world
to weapons makers, lobbyists, surveillance outfits, think tanks, military intellectuals, assorted pundits... well, the whole
national and homeland security racket and its various hangers-on had an
interest in beefing up the enemy. For them, it was important in the post-9/11
era that threats would never again lack a capital “T” or a hefty dollar sign.
And don’t forget a media that was ready to pound the
drums of war and emphasize what dangerous enemies lurked in our world with
remarkably few second thoughts. Post-9/11, major media outlets were generally
prepared to take the enemy-industrial complex’s word for it and play every new
terrorist incident as if it were potentially the end of the world. Increasingly
as the years went on, jobs, livelihoods, an expanding world of “security”
depended on the continuance of all this, depended, in short, on the injection
of regular doses of fear into the body politic.
That was the “favor” Osama bin Laden did for Washington’s national
security apparatus and the Bush administration on that fateful September
morning. He engraved an argument in the American brain that would live on
indelibly for years, possibly decades, calling for eternal vigilance at any
cost and on a previously unknown scale. As the Project for the New American
Century (PNAC), that neocon think-tank-cum-shadow-government, so fatefully put
it in "Rebuilding America's Defenses" a year before the 9/11 attacks:
“Further, the process of transformation [of the military], even if it brings
revolutionary change, is likely to be a long one, absent some catastrophic and
catalyzing event -- like a new Pearl Harbor.”
So when the new Pearl Harbor
arrived out of the blue, with many PNAC members (from Vice President Dick
Cheney on down) already in office, they naturally saw their chance. They
created an al-Qaeda on steroids and launched their “global war” to establish a Pax
Americana, in the Middle East and then perhaps globally. They were aware
that they lacked opponents of the stature of those of the previous century and,
in their documents, they made it clear that they were planning to
ensure no future great-power-style enemy or bloc of enemy-like nations would
ever arise.
For this, they needed an American public anxious,
frightened, and ready to pay. It was, in other words, in their interest to
manipulate us. And if that were all there were to it, our world would be a
grim, but simple enough place. As it happens, it’s not. Ruling elites, no
matter what power they have, don’t work that way. Before they manipulate us,
they almost invariably manipulate themselves.
I was convinced of this years ago by a friend who had
spent a lot of time reading early Cold War documents from the National Security
Council -- from, that is, a small group of powerful governmental figures
writing to and for each other in the utmost secrecy. As he told me then and
wrote in Washington’s China, the smart book he did on the
early U.S. response to the establishment of the People’s Republic of China,
what struck him in the documents was the crudely anti-communist language those
men used in private with each other. It was the sort of anti-communism you
might otherwise have assumed Washington’s
ruling elite would only have wielded to manipulate ordinary Americans with
fears of Communist subversion, the “enemy within,” and Soviet plans to take
over the world. (In fact, they and others like them would use just such
language to inject fear into the body politic in those early Cold War years,
that era of McCarthyism.)
They were indeed manipulative men, but before they
influenced other Americans they assumedly underwent something like a process of
collective auto-hypnotism in which they convinced one another of the dangers
they needed the American people to believe in. There is evidence that a similar
process took place in the aftermath of 9/11. From the flustered look on George
W. Bush’s face as his plane took him not toward but away from Washington on September 11, 2001, to the image of
Dick Cheney, in those early months, being chauffeured around Washington in an armored motorcade with
a “gas mask and a biochemical survival suit" in the backseat, you could
sense that the enemy loomed large and omnipresent for them. They were, that is,
genuinely scared, even if they were also ready to make use of that fear for
their own ends.
Or consider the issue of Saddam Hussein’s supposed
weapons of mass destruction, that excuse for the invasion of Iraq. Critics
of the invasion are generally quick to point out how that bogus issue was used
by the top officials of the Bush administration to gain public support for a
course that they had already chosen. After all, Cheney and his men cherry-picked the evidence to make their case, even formed their own secret intel outfit to give them what they
needed, and ignored facts at hand that brought their version of events into question.
They publicly claimed in an orchestrated way that Saddam had active nuclear and WMD
programs. They spoke in the most open ways of potential mushroom clouds from (nonexistent) Iraqi nuclear weapons
rising over American cities, or of those same cities being sprayed
with (nonexistent) chemical or biological weapons from (nonexistent) Iraqi
drones. They certainly had to know that some of this information was useful but
bogus. Still, they had clearly also convinced themselves that, on taking Iraq, they
would indeed find some Iraqi WMD to justify their claims.
In his soon-to-be-published book, Dirty Wars, Jeremy Scahill cites the conservative
journalist Rowan Scarborough on Secretary of Defense Donald Rumsfeld’s growing
post-invasion irritation over the search for Iraqi WMD sites. “Each morning,”
wrote Scarborough, “the crisis action team had
to report that another location was a bust. Rumsfeld grew angrier and angrier.
One officer quoted him as saying, ‘They must be there!’ At one briefing, he
picked up the briefing slides and tossed them back at the briefers.”
In other words, those top officials hustling us into
their global war and their long-desired invasion of Iraq had also hustled themselves
into the same world with a similar set of fears. This may seem odd, but given
the workings of the human mind, its ability to comfortably hold potentially
contradictory thoughts most of the time without disturbing itself greatly, it's not so surprising.
A similar phenomenon undoubtedly took place in the larger
national security establishment where self-interest combined easily enough with
fear. After all, in the post-9/11 era, they were promising us one thing:
something close to 100% “safety” when it came to one small danger in our world
-- terrorism. The fear that the next underwear bomber might get through surely
had the American public -- but also the American security state -- in its
grip. After all, who loses the most if another shoe bomber strikes, another
ambassador goes down, another 9/11 actually happens? Whose job, whose
world, will be at stake then?
They may indeed be a crew of Machiavellis, but they are
also acolytes in the cult of terror and global war. They live in the Cathedral
of the Enemy. They were the first believers and they will undoubtedly be the
last ones as well. They are invested in the importance of the enemy. It’s their
religion. They are, after all, the enemy-industrial complex and if we are in
their grip, so are they.
The comic strip character Pogo once famously declared: “We have met the enemy and he is us.”
How true. We just don’t know it yet.
Tom Engelhardt, co-founder of the American Empire Project and author of The United States of Fear as well as a history of the Cold War, The End of Victory Culture, runs the Nation Institute's TomDispatch.com. His latest book, co-authored with Nick Turse, is Terminator Planet: The First History of Drone Warfare, 2001-2050.
Follow TomDispatch on Twitter and join us on Facebook. Check
out the newest Dispatch book, Nick Turse’s The Changing Face of Empire: Special Ops, Drones, Proxy
Fighters, Secret Bases, and Cyberwarfare.
Copyright 2013 Tom Engelhardt
Image by ISAF Media, licensed under Creative Commons.
4/15/2013 10:27:11 AM
How corporate power is ruining your life, explained in animated GIFs
It started in 1970, when 20 million Americans hit the streets to celebrate the
first Earth Day:
And it wasn’t just a party.
People of all ages and political stripes were demanding regulations protecting earth,
air, water, and wildlife.
Not everyone was happy about it, though.
When Lewis Powell, a corporate lawyer from Richmond, Virginia, heard about Earth Day:
Why? Powell served on the board of directors of several international
corporations—corporations whose profitability would be hampered by all the new regulations.
When Powell thought of a way to stop them:
He schemed up
a memo—titled “Attack on American Free Enterprise System”—and presented it to
the U.S. Chamber of Commerce on August 23, 1971. In it, he laid out a broad
plan: corporate leaders would use an “activist-minded Supreme Court” to enact
“social, economic, and political change” in favor of corporate power.
What the rest of America knew:
The memo was secret, so barely anyone knew about Powell’s plot until it was leaked to the press in 1972.
Really, Powell’s idea wasn’t totally new. He had already sued the U.S.
government on behalf of
the cigarette industry, saying the
government’s assertion that cigarettes were dangerous was
controversial and that cigarette companies had a right under free speech to promote their product in whatever
way they liked. It worked. America’s
response was to keep ‘em lit:
And when President Nixon nominated Powell for the Supreme Court and the
Senate voted him in (less than six months after the Chamber read his memo):
With Powell on the Court, corporations got busy creating legal
foundations to fund lawsuits across the country. They introduced the idea that
corporations were “persons,” “speakers,” “voices,” and “protectors of our
freedoms.” They said that government regulations over pollution, wages, or
political spending made corporations feel like this:
Meanwhile, Americans were cleaning house. The Clean Water Act was
passed in 1972. After this came the Endangered Species Act (1973), the first fuel
economy standards for cars (1975), and the Toxic Substances Control Act (1976).
“Strength,” Powell had written in his memo, “lies in organization, in
careful long-range planning and implementation, in consistency of action over
an indefinite period of years.”
By 1978, Powell and his cronies were ready to take his plan to the next
level. A few corporations got together to challenge a Massachusetts law banning corporate spending
in referendum ballots. They wanted to use corporate funds to defeat a
progressive income tax vote later that year.
When they lost,
progressives were all:
But then they took their case to the Supreme Court, where Justice
Powell had been waiting for just such an opportunity:
Powell cast the deciding vote (5-4), declaring that
“corporations are persons” and corporate money is “speech” under the First
Amendment, ushering in the current era of corporate power.
Between 1978 and 1984, Justice Powell overrode laws citizens had agreed
on, in favor of legislation benefitting the pharmaceutical, energy, tobacco,
and banking industries. By the time he resigned in 1987, the corporate world
had made up its mind:
When the agribusiness industry spends $75-145 million a year lobbying to make sure America always has a good supply of junk food at its fingertips:
“The health of Americans is secondary to layers of taxpayer
subsidies and preferential treatment for corporate food giants and coal and
utility corporations, resulting in epidemic-level rates of obesity, asthma, and
type 2 diabetes,” writes
Jeffrey D. Clements for YES! Magazine. And this in spite of healthy profits for pharmaceutical and health care corporations (which spent over $2 billion lobbying the government between 1998 and 2010).
That’s not all. Between 1998 and 2010, military contractors spent over
$400 million and ExxonMobil spent $151 million lobbying. But “control of our energy
policy by global fossil fuel corporations and unregulated corporate lobbying,
even for weapons the Pentagon doesn’t want,” Clements writes, “leads to endless
war in the Middle East and uncontrolled climate change.”
It also means we continue
to drive everywhere. When we build roads for cars that pollute the air and
suburbs that destroy wilderness:
So, we pay with our health, with endless war, and destruction of the
environment—but there's more. Corporate rule is also why we’re broke.
Yep. Between 1998 and 2010 the Chamber of Commerce spent $739 million
lobbying in favor of big business. The results? “Corporate-friendly trade and tax policies
have moved jobs overseas, destroyed our manufacturing capacity, produced vast
wage and income inequality, and gutted local economies and communities,” writes
What that means for the rest of us:
Then, in 2010, the Supreme Court decision in Citizens United v. Federal Election Commission gave
corporations the go-ahead to spend as much as they wanted influencing elections.
Politicians who failed to do what corporate lobbies asked were punished
with negative ads funded by the corporate elite. So now when our elected
officials look at us:
And while corporations love consumers, this is what they say when we
try to act like citizens:
Case in point: Monsanto, when Vermonters tried to enact labeling laws
for recombinant bovine growth hormone (rBGH):
And when people try to get environmental protections, fair wages, or an end to military offensives, one corporation or another is always:
Nearly 80 percent of the
public opposes the Citizens United
decision. That it hasn’t been reversed goes to show how skewed the current
balance of power is. Many
representatives and citizens’ groups are calling for a constitutional
amendment to reverse it and end money's use as "speech" altogether. When that day comes, we may finally be able to slow climate change, end
war, get healthy, and get paid.
This article is based on Jeffrey D. Clements’s essay, “Rights are for Real People,” from the Spring 2012 issue of Yes! Magazine. Clements is the author of the book Corporations Are Not People.
4/9/2013 12:41:36 PM
Filmmaker Annie Leonard finds people want to be liberated from overconsumption.
Annie Leonard is one of the most articulate, effective champions of the commons today. Her webfilm The Story of Stuff has been viewed more than 15 million times. She also adapted it into a book.
Drawing on her experience investigating and organizing on environmental health and justice issues in more than 40 countries, Leonard says she’s “made it her life’s calling to blow the whistle on important issues plaguing our world.”
This article originally appeared in On the Commons.
On the Commons recently asked Leonard a few questions about the commons.
How did you first learn about the commons?
I first learned about the commons as a kid using parks and libraries. I didn’t assign the label “commons” to them, but I understood early on that some things belong to all of us and these shared assets enhance our lives and rely on our care.
Like many other college students, my first introduction to the word “commons” was sadly in conjunction with the words “sheep” and “tragedy.” That lousy resource management class tainted the word for me for years, until I heard Ralph Nader address a group of college students. He asked them to yell out a list of everything they own. This being the pre-i-gadget 1980s, the list included “Sony Walkman…boombox… books…bicycle…clothes…bank account.” When the lists started to peter out, Ralph asked about National Parks and public airwaves. A light went on in each of our heads, and a whole new list was shouted out: rivers, libraries, the Smithsonian, monuments. That’s when I realized that the commons isn’t an overgrazed pasture; it really is all that we share.
What do you see as the biggest obstacle to creating a commons-based society right now?
There are so many interrelated aspects of our current economic and social systems that undermine the commons. Some obstacles are structural, like government spending priorities that elevate military spending and oil company subsidies over maintenance of parks and libraries. Others are social, including the erosion of social fabric and community-based lifestyles. Actually, even those have structural drivers; for example, land-use planning that eliminates sidewalks and requires long commutes to work contributes to the breakdown of social commons by impeding social interactions. It’s all so interconnected!
A huge obstacle is the shift toward greater privatization and commodification of physical and social assets. Many things that used to be shared—from open spaces for recreation to support systems to help a neighbor in need—have been privatized and commodified; they’ve been moved out of the community into the marketplace. This triggers a downward spiral. Once things become privatized, or un-commoned, we no longer have access to them without paying a fee. We then have to work longer hours to pay for all these things that used to be freely available—everything from safe afterschool recreation for kids to clean water to swim in to someone to talk to when you’re feeling blue. And since we’re working longer hours and spending more time alone, we have less time to contribute to the commons to rebuild these assets: fewer volunteer hours, fewer beach-cleanup days, less time for civic engagement to advocate for policies that protect the commons, less time to invite a neighbor over for tea. And on it goes.
What is the greatest opportunity to strengthen and expand the commons right now?
In spite of real obstacles, we have a lot on our side as we advance a commons-based agenda. First, we have no choice. There’s a very real ecological imperative weighing down on us. Even if we wanted to continue this overconsumptive, hyper individualistic and vastly unequal way of living, we simply can’t. We have to learn to share more and waste less, to find joy and meaning in shared assets and experiences rather than in private accumulation, to work together for a better world, rather than to build bigger walls around those who can. And the good news is that these changes not only will enable us to continue to live on this planet, but they will result in a happier, healthier society overall.
There’s another shift emerging which offers some real opportunities for building support for the commons. People in the overconsuming parts of the world are getting fed up with the burden of trying to own everything individually. We used to own our stuff and increasingly our stuff owns us. We work extra hours to buy more stuff, we spend our weekends sorting our stuff. We’re constantly needing to upgrade, repair, untangle, recharge, even pay to store our stuff. It’s exhausting.
The shift I see emerging is from an acquisition-focused relationship to stuff, to an access-focused relationship. In the acquisition framework, the more stuff we had, the better, as captured in the 1990s bumpersticker “He Who Dies with the Most Toys Wins.” Having spent a couple of decades being slaves to our stuff, we are rethinking. Now it is “He Who Dies with the Most Toys Wasted His Life Working to Buy Them and Lived in a Cluttered House When He Could Have Been Investing in Community with Which to Share Toys.”
Increasingly, people want access to stuff, not all the burden that comes with ownership. Instead of owning a car and dealing with all that comes with it, we get one just when we want one through city car-share programs. Instead of hiring a plumber, we swap music lessons with one through skillsharing networks. Why buy something to own alone, when we can share it with others? Why sign up for an even more crushing mortgage for a house with a big back yard, when we can instead share public parks? From coast to coast, there’s a resurgence of sharing, so much that it even has a fancy new name: collaborative consumption. I’m really excited about this. A whole new generation of people is realizing that access to shared stuff is easier on one’s budget and on the planet than individual ownership. Now, that’s liberating.
Image: Annie Leonard by annainaustin, licensed under Creative Commons.
"Story of Change", Annie Leonard's "Story of Stuff" follow-up video.
4/5/2013 11:18:31 AM
With an increasingly small fraction of wealthy Americans
buying and selling elections, power has never been more unequal in Washington, says Lawrence Lessig in a new TED Talk. And the problem goes way beyond the 1 percent.
Everybody seems to agree that there’s too much money in
politics. According to a Demos poll during the last election cycle, more
than 80 percent of Americans agree that “corporate political spending
drowns out voices of average Americans,” and more than half would support a ban
on all corporate donations. What’s more, opposition to decisions like Citizens United is equally
strong among those on the left and the right.
But knowing that the system is rigged is different from
understanding exactly what’s behind it. With Super PACs and “independent
expenditures” veiled from public knowledge by Citizens United and other laws, how do we know what’s really going on?
For activist and academic Lawrence Lessig, it all comes down
to the Lesters. That is, the 144,000 or so Americans who are rigging the game
for the rest of us—roughly the same small number of Americans who are named Lester. These
are the guys making big donations to Super PACs and hiring high-powered
lobbyists. They’re also the guys members of Congress are trying very hard to impress—as
Lessig adds, federal politicians spend somewhere between 30 and 70 percent of
their time just trying to raise even more money from the Lesters.
But wait, it gets worse. The Lesters may have more political
influence than most of us can fathom, but they’re no match for the real movers
and shakers. The ones who are really in charge are the .000042 percent—that’s
exactly 132 Americans—who made 60 percent of Super PAC contributions in 2012. I’m
gonna let that sink in a little…
But don’t worry, there is hope. There are plenty of
proposals for fairer elections already on the table, and no shortage of public
support. The key, says Lessig, is to remember that the barriers to real change
are not insurmountable—just political.
To learn more, check out his TED Talk below:
Image by Kevin Dooley, licensed under Creative Commons.
4/3/2013 12:09:08 PM
If we want more students to succeed in college, we have to turn
full attention to the craft of university-level teaching. What’s at stake is
not only increasing graduation rates but providing a quality education for
those who, a generation or two ago, might not have seen college as possible.
originally appeared at the
Right after I gave
my opening lecture on Oedipus the King to the 30 employees of Los Angeles’s
criminal justice system, I handed out a few pages of notes I would have taken
if I were sitting in their seats listening to the likes of me.
They were taking my course, Introduction to Humanities, as
part of a special program leading to a college degree, and I knew from a survey I
gave them that many hadn’t been in a classroom in a long time – and some didn’t
get such great educations when they were. So we spent the last half hour of the
class comparing my notes with the ones they had just taken, talking about the
way I signaled that something was important, how they could separate out a big
idea from specific facts, how to ask a question without looking like a dummy.
I taught that humanities course more than 30 years
ago, but I was thinking about it as I read the new report from the National
Commission on Higher Education Attainment, “College Completion Must Be Our
Priority.” The report is a call to leaders in higher education to
increase graduation rates by scheduling courses and services to accommodate
working adults, developing more on-line learning, easing the ability of students
to transfer, and implementing a host of other sensible solutions to the many
barriers that are contributing to America’s
stagnating college graduation rates.
But if we want more students to succeed in
college, then colleges have to turn full attention to teaching.
To their credit, the authors of the college
completion report call for better professional development for college faculty;
however, most reports of this type have little to say about teaching, focusing
instead on structural and administrative reforms outside the classroom. It is a curious omission.
Perhaps the authors of these reports believe that
teaching is such an individual activity that not much can be done to affect it.
Another reason has to do with the way college
teaching gets defined in practice. Faculty become experts in a field, and then
they pass on their knowledge to others through college courses. Some teachers
get very good at this delivery – compelling lectures, creative demonstrations,
engaging discussions, and useful assignments. But professors don’t usually
think beyond their subjects to the general intellectual development of the
undergraduates before them, to enhancing the way they learn and make sense of new material.
Finally, I don’t see much evidence at the policy
level of a deep understanding of college-level teaching or a respect for its craft.
The problem starts in the graduate programs where
college instructors are minted. Students learn a great deal about, let’s say,
astrophysics or political science, but not how to teach it. They might assist
in courses and pay attention to how their professors teach, but none of this is
systematic or a focus of study or mentoring.
And there is rarely a place in the curriculum to
consider the difficulties students might have as they learn how to think like
an astrophysicist or political scientist. And then there are the reading and
writing difficulties that can emerge when encountering a discipline for the first time.
The majority of new college faculty want to teach
well – and many do. But they won’t find on most college campuses an
institutional culture that fosters teaching. To be sure, there are rewards for
good teaching – awards, the esteem of students – and most institutions, even
research universities, consider exemplary teaching as a factor in promotion.
And some campuses have programs that provide resources for instruction, but
they tend to be low-status and under-utilized operations.
Teaching has special meaning now, as the authors
of the report on student success point out, because close to half of American
undergraduates are a bit more like those students in my humanities class than
our image of the traditional college student fresh out of high school.
Particularly in the community colleges and state
colleges where the majority of Americans receive their higher education,
students are older, they work, and many have children. A significant percentage
are the first in their families to go to college; somewhere between 40 and 50
percent need to take one or more remedial courses in English or mathematics.
To do right by these students, we need to rethink
how to teach them. This does not mean rushing to electronic technology – a
common move these days. On-line instruction of any variety will only be as good
as the understanding of teaching and learning that underlies it.
We can begin by elevating the value of teaching
and creating more opportunities to get better at it. For those students who
need help with writing, mathematics, and study skills, there are tutoring
centers and other campus resources. Faculty should forge connections with these
resources but realize that they, too, can provide guidance and tricks of the
trade – like taking good notes – as well as an orientation to their field.
In my experience, students at flagship
universities and elite colleges could also benefit from this approach to
instruction. Just ask them.
Doing such things does not mean abandoning our
subject area but rather enhancing it and opening a door to it.
Working with those humanities students on their
notes helped them develop better note-taking techniques. But as we studied
technique, we also thought hard about how to determine what’s important – and
how to make someone else’s information your own. All this involved talking
further about Greek tragedy, about literary interpretation, and about what the
humanities can provide.
What’s at stake is not only increasing graduation
rates but also providing a quality education for those who, a generation or two
ago, might not have seen college as possible.
Mike Rose is a professor in the UCLA Graduate School of Education & Information Studies and author of Back to School: Why Everyone Deserves a Second Chance at Education.
Image by Alan Levine, licensed under Creative Commons.