Thursday, December 13, 2012 10:49 AM
This post originally appeared at On the Commons.
In the wake of superstorm Sandy and a presidential
election in which both candidates essentially ignored climate change, it’s time
that our schools began to play their part in creating climate literate citizens.
Hurricane Sandy, and the superstorms that will follow,
are not just acts of nature—they are products of a massive theft of the
atmospheric commons shared by all life on the planet. Every dollar of profit
made by fossil fuel companies relies on polluting our shared atmosphere with
harmful greenhouse gases, stealing what belongs to us all. But if we don’t
teach students the history of the commons, they’ll have a hard time recognizing
what—and who—is responsible for today’s climate crisis.
If the commons is taught at
all in history classes, it’s likely as a passing reference to English
enclosures—the process by which lands traditionally used in common by the poor
for growing food, grazing animals, collecting firewood, and hunting game were
fenced off and turned into private property. Some textbooks may mention the
peasant riots that were a frequent response to enclosures, or specific groups
like the Diggers that resisted enclosure by tearing down fences and
reestablishing common areas. But these mentions are buried in chapters that champion
industrial capitalism’s “progress” and “innovation.”
Some texts, like McDougal
Littell’s widely used Modern World History, skip the peasants’ resistance
entirely, choosing instead to sing the praises of enterprising wealthy
landowners: “In 1700, small farms covered England’s landscape. Wealthy
landowners, however, began buying up much of the land that village farmers had
once worked. The large landowners dramatically improved farming methods. These
innovations amounted to an agricultural revolution.”
This is a disturbing
narrative, as much for what it leaves out as for what it gets wrong. Students
could easily assume that enclosures involved a fair exchange between “wealthy
landowners” and “village farmers,” instead of the forced evictions that removed
peasants from land that their families had worked for generations. Take the
account of Betsy Mackay, 16, when the Duke of Sutherland evicted her family:
“Our family was very reluctant to leave and stayed for some time, but the
burning party came round and set fire to our house at both ends, reducing to
ashes whatever remained within the walls. The people had to escape for their
lives, some of them losing all their clothes except what they had on their
back. The people were told they could go where they liked, provided they did
not encumber the land that was by rights their own. The people were driven away […]”
The McDougal Littell
version of history silences the voices of the poor, who struggled for centuries
to maintain their traditional rights to subsist from common lands—rights
enshrined in 1217 in the Charter of the Forest,
the often-overlooked sister document to the Magna Carta.
Of course, this history is
not limited to land enclosures during the British agricultural revolution.
Around the world, European colonizers spent centuries violently “enclosing”
indigenous peoples’ land throughout the Americas,
India, Asia, and Africa. The Indian scholar and activist Vandana Shiva
explains why this process was a necessary aspect of colonialism:
The destruction of commons
was essential for the industrial revolution, to provide a supply of natural
resources for raw material to industry. A life-support system can be shared, it
cannot be owned as private property or exploited for private profit. The
commons, therefore, had to be privatized, and people’s sustenance base in these
commons had to be appropriated, to feed the engine of industrial progress and capital accumulation.
The enclosure of the
commons has been called the revolution of the rich against the poor.
In the same way that the world
history curriculum passes over the social and ecological consequences of land
enclosure, the current U.S.
history curriculum contributes to a larger ecological illiteracy by glossing
over the historical role of nature. When we’re not taught to understand the
intimate and fundamental connections between people and the environment in our
nation’s history, it should come as no surprise that we struggle to make these
same connections today.
One of the few places where
nature shows up in the U.S. history curriculum is with discussions of how
Native American and European concepts of landownership differed. Textbooks could
provide a valuable opportunity for students to analyze these differences.
Instead, they usually dismiss Native American notions of property as quaint and
in the end—just like the struggle of the Diggers—somewhat tragic in the grand
scheme of things.
Every textbook I’ve seen
presents the buying and selling of land as a normal—even inevitable—part of
human history. What’s missing from all accounts is the naked truth that land
inhabited and used in common by English peasants and Native Americans had to first
be stolen, before it could ever become the private property that can be bought
and sold today.
Instead, we have this
section of Prentice Hall’s America,
titled “Conflict with Native Americans”: “Although the Native Americans did
help the English through the difficult times, tensions persisted. Incidents of
violence occurred side by side with regular trade. Exchanges begun on both
sides with good intentions could become angry confrontations in a matter of
minutes through simple misunderstandings. Indeed, the failure of each group to
understand the culture of the other prevented any permanent cooperation between
the English and Native Americans.”
This is history of the
worst kind, in which a misguided attempt at “balance” results in a morally
ambiguous explanation for the dispossession and murder of millions of Native Americans.
In fact, the growth of
industrial capitalism has been predicated on the private enclosure of the
natural world. And these enclosures have always met with resistance. Students
need to learn this alternative narrative for at least two reasons. First, it
encourages critical conversation about how “economic growth” has been used to
justify the private seizure of the earth’s resources for the profits of a
few—while closing off those same resources, and decisions about how they should
be used, to the rest of us. Even more importantly, this conversation about
history can help us to see today’s environmental crises—from the loss of global
biodiversity to superstorm Sandy—for
what they really are: the culmination of hundreds of years of privatizing and
commodifying the natural world.
The private enclosure of
nature continues today; it’s just hard to see. Like the proverbial fish
surrounded by the water of the “free market,” it’s easy to assume that fossil
fuel companies have some god-given right to profit from polluting our
atmospheric commons. How are young people to recognize this atmospheric grab
when the school curriculum has erased all memory of our collective right to the commons?
Reclaiming these commons
means fueling students’ knowledge about a past that has conveniently
disappeared. Educators did not create the climate crisis, but they have a key
role to play in alerting students to its causes—and potential solutions.
Image by audio-luci-store.it, licensed under Creative Commons.
Wednesday, June 13, 2012 3:50 PM
Going to college is
getting complicated. With higher ed becoming both more essential and less of a
guarantee of future success, many young people today face a kind of impossible
choice. At the same time, for many kids higher education has become an
expectation, a social obligation, less an accomplishment than a prerequisite to
something more difficult and less certain. It’s also becoming much less affordable.
It wasn’t always like this.
For generations, college offered a romanticized leap into the middle class—practically
a guarantee of future prosperity, or at least, a good chance of avoiding the
job market’s worst pitfalls. All through American history, it’s impossible to
separate education from the idea of social mobility, and by extension, the
possibility of equality. Access to higher education formed a big part of the
early women’s rights movement, and later, antislavery and civil rights. Federal
aid to education was at the center of Washington’s
desegregation effort under the Johnson and Nixon administrations. From a
student’s perspective, higher education can instill self-direction and lifelong
learning, and can teach us to discover new ideas in a collaborative,
But more recently, when
people talk about education in America,
they mostly talk about crisis. And not just the student debt crisis that
movements like Occupy recently propelled into the national consciousness. And
not just the crisis
in higher education that’s impoverishing instructors and grad students and underfunding
the liberal arts. The other crisis—the one that’s been at the center of our
education policy for two administrations—is about competitiveness. It’s this
idea that has spurred the Gates Foundation to pump hundreds of millions into
struggling schools. It’s also the subject of a popular and fascinating 2010
documentary (Waiting for “Superman”) on why schools
must address changes in the global economy.
We’ve all heard the
argument. Students in Europe and Asia are
outperforming Americans on standardized tests, especially in math and science.
We have lower graduation rates than other countries (college and high school), and those who do
graduate don’t study the right things. On average, class sizes have gone down
and education spending has gone up since the 1960s, and we’re still at the back
of the rich country pack. And this is an information age, with an
information-based global marketplace, and we’d better be ready to compete with
the tech schools in China
that are churning out millions of engineers a year. The U.S. job market is still hurting,
but the real growth areas will be in—you guessed it—math and science. We need
more chemical engineers than novelists, more computer technicians than historians.
Beyond the anti-union
rhetoric and reductive view of social inequality that usually comes out of a
discussion like this, it’s a really interesting argument. After World War II,
higher education was available to more people than ever before through programs
like the GI Bill. That availability reshaped the American economy, supporting a
large middle class and low inequality. But today, the prevailing wisdom is that
schools must “provide an education that is relevant
to the needs of business,” in the words of Bill Gates, whose Gates
Foundation has become a major player in the ongoing debate. So rather than take
an active role in creating the society of the future, schools should instead
react to whatever incentives and deterrents already exist in the global
marketplace. Moreover, they should tailor the student experience to the future requirements
of private enterprise. “Computer science employment is growing by nearly
100,000 jobs annually,” Gates wrote in 2007. “But at the same time studies show
that there is a dramatic decline in the number of students graduating with
computer science degrees.” The key to our America’s future prosperity, Gates
says, is to correct this imbalance.
And that’s just what the
Gates Foundation has been up to for the past decade, says the American Prospect. While most states
and districts have recently been cutting back on education spending,
organizations like Gates, the Eli and Edythe Broad Foundation and the Walton
Family Foundation have swooped in with fast
cash to advance their vision of education reform. Mostly that means charters,
standardized tests, and lots of math and science. It also means quantifying
student and teacher performance, using cash to reward success and punish
failure, and battling with teachers unions.
It’s that vision that has
also been at the center
of federal reforms like No Child Left Behind and Race to the Top, which have
formed the core of our education policy for the past decade. Like Gates’
contributions, both programs rely on public-private partnerships, and use
financial aid as an incentive for cash-strapped states and districts to dramatically
change policy. Borrowing a line from the Gates playbook, Obama said in a
speech last year:
The reality is too many students are not prepared across our country. Too many
leave school without the skills they need to get a job that pays. […] The
quality of our math and science education lags behind many other nations. And America
has fallen to ninth in the proportion of young people with a college degree. We
used to be number one, and we're now number nine. That's not acceptable.
The administration’s reaction to these problems reflects the idea that public education
is inherently competitive, that its success is quantifiable, and that these
kinds of reforms should be a comprehensive, national project. Earlier this
year, Obama announced that he would soon make federal
money to public universities competitive, along the lines of Race to the Top.
From a student’s
perspective, the underlying assumption is that the most important thing a
college education can give you is a job. At the same time, getting that job is
a social good in itself. Now, to a liberal arts major, this all sounds a little
funny—and not just because of the “getting a job” part. It’s because higher ed can
mean so much more than that. For a lot of people, it’s more about how to ask
questions, how to accept paradox, how to express complex ideas, where to go for
specialized information. It’s about the value of collaboration and discussion,
and pushing yourself to your intellectual and creative limits. It’s about
tackling questions that maybe you’ve never considered before. Talking about job
prospects almost confuses the matter.
I know. It’s a pretty
privileged viewpoint. Most people don’t have the luxury of enjoying education
for education’s sake. Like it or not, the job market is crazy-competitive, and
it’s starry-eyed lib arts majors that end up serving lattes. But you could say
the same thing about the Gates argument. The innovation of the future may be
based on math and science, but it’s hard to believe that that’s where most of
the jobs will be. 100,000 new jobs a year is a lot for one industry, but retail
salespersons and cashiers are still the biggest jobs in the
U.S. right now, and will be for some time. The service economy, growing for
decades, now represents more than three-fourths of our GDP. And while information
technology was one of the few industries to actually create jobs during the
economic downturn, its growth was outpaced by other
sectors like hospitality (engineers weren’t so lucky—that industry shed
jobs). On top of that, the unemployment rate for recent engineering grads is
relatively low, but
it’s also higher than that of kids who studied education, agriculture, and (amazingly)
communications. Among those who’ve found work, the median income for an experienced
engineering major is $81,000—much
higher than other college grads ($51,000) and the country as a whole ($30,000).
Like it or not, the kinds of high-paying information-economy jobs reformers
like Gates talk about will always be relatively elite—whether or not more kids major
in math and science.
If we wanted an education
system really based on the needs of business, then we should own up to the fact
that a 30 percent college graduation rate is much too high. Millions of college
grads end up not using their degrees in any direct way, and some even struggle
with being “overqualified” for the jobs that are available. But it’s easy to
see how that misses the point. Reducing something like learning or personal achievement
to an economic measure only makes sense if you’re talking about vocational
schools or job training, and even then it seems impossibly narrow. In 1947, a
young Martin Luther King wrote that while education can foster intelligence and
help prepare students for successful careers, it should also do
much more. “The complete education
gives one not only power of concentration, but worthy objectives upon which to
concentrate,” he argued. “The broad education will, therefore, transmit to one
not only the accumulated knowledge of the race but also the accumulated
experience of social living.” College may no longer guarantee a good job, but
it can still guarantee a wider social lens, and a better understanding of the
complex world around us.
In a lot of ways, this
idea of education seems obvious, but it’s also easy to ignore. The most serious
problem facing public education today is probably this deep identity crisis. Should
public schools and universities work to instill obedience, patriotism, and
routine in future low-wage workers, as they did during the industrial
revolution? Should they instead serve as social levelers, weeding out
segregation and inequality, as New Deal liberals like Johnson and, yes, Nixon believed?
Or should they appeal to the privileged few math and science students that will
later become entrepreneurs and engineers in the new information economy, as
current reformers argue? This question about the purpose of public education
underlies much of the current debate, even if it’s not explicitly hammered out.
The fact is that it’s easy
to say everyone should go to college, but the barriers to making that a reality
are not going away any time soon. Failing universal (free) college, we should
remember that, while valuable, higher ed doesn’t have a monopoly on higher learning.
There are plenty of people that are able to achieve the same kind of
self-direction, and can challenge themselves and their communities in the same
vigorous ways, without ever attending a lecture. There are quite a few
brilliant and successful people who either dropped out of college or didn’t
bother going. But for the rest of us, both as community members and
individuals, the college experience is invaluable. For those of us facing a
tough job market, here’s the good news: the social good isn’t getting the job,
it’s getting the education.
Sources: American Prospect, FAIR, Boston Post, Bureau of Labor Statistics, New York Times, U.S. […], and the Martin Luther King Jr. Research and Education Institute.
Image by The Cleveland Kid, licensed under Creative Commons.
Monday, May 14, 2012 2:35 PM
Excerpted from an article that originally appeared in The Chronicle
of Higher Education.
Time: last year. Place: an
undergraduate classroom, in the airy, well-wired precincts of Silicon Valley University.
(Oops, I mean
Sun-Kissed-Google-Apps-University.) I am avoiding the pedagogical
business at hand—the class is my annual survey of 18th-century British
literature, and it's as rockin' and rollin' as you might imagine, given the
subject—in order to probe my students' reactions to a startling and (to me)
disturbing article I have just read in the Harvard alumni magazine. The piece,
by Craig Lambert, one of the magazine's editors, is entitled "Nonstop:
Today's Superhero Undergraduates Do '3000 Things at 150 Percent.'"
As the breaking-newsfeed title
suggests, the piece,
on the face of it, is anecdotal and seemingly light-hearted—a collegiate Ripley's Believe It or
Not! about the overscheduled lives of today's Harvard
undergraduates. More than ever before, it would appear, these poised,
high-achieving, fantastically disciplined students routinely juggle intense
academic studies with what can only seem (at least to an older generation) a
truly dizzy-making array of extracurricular activities: pre-professional
internships, world-class athletics, social and political advocacy, start-up
companies, volunteering for nonprofits, research assistantships, peer advising,
musical and dramatic performances, podcasts and video-making, and countless
other no doubt virtuous (and résumé-building) pursuits. The pace is so
relentless, students say, some plan their packed daily schedules down to the
minute—e.g., "shower: 7:15-7:20 a.m."; others confess to getting by
on two or three hours of sleep a night. Over the past decade, it seems, the
average Harvard undergraduate has morphed into a sort of lean, glossy,
turbocharged superhamster: Look in the cage and all you see, where the
treadmill should be, is a beautiful blur.
I am curious if my Stanford students'
lives are likewise chockablock. Heads nod yes; deep sighs are expelled; their
own lives are similarly crazy. They can barely keep up, they say—particularly
given all the texting and tweeting and cellphoning they have to do from hour to
hour too. Do they mind? Not hugely, it would seem. True, they are mildly
intrigued by Lambert's suggestion that the "explosion of busyness" is
a relatively recent historical phenomenon—and that, over the past 10 or 15
years, uncertain economic conditions, plus a new cultural emphasis on marketing
oneself to employers, have led to ever more extracurricular add-ons. Yes, they
allow: You do have to display your "well-roundedness" once you
graduate. Thus the supersize CV's. You'll need, after all, to advertise a
catalog of competencies: your diverse interests, original turn of mind, ability
to work alone or in a team, time-management skills, enthusiasm,
unflappability—not to mention your moral probity, generosity to those less
fortunate, lovable "meet cute" quirkiness, and pleasure in the simple
things of life, such as synchronized swimming, competitive dental flossing, and
Antarctic exploration. "Yes, it can often be frenetic and with an eye
toward résumés," one Harvard assistant dean of students observes,
"but learning outside the classroom through extracurricular opportunities
is a vital part of the undergraduate experience here."
Yet such references to the past—truly a
foreign country to my students—ultimately leave them unimpressed. They laugh
when I tell them that during my own somewhat damp Jurassic-era undergraduate
years—spent at a tiny, obscure, formerly Methodist school in the rainy Pacific
Northwest between 1971 and 1975—I never engaged in a single activity that might
be described as "extracurricular" in the contemporary sense, not,
that is, unless you count the little work-study job I had toiling away evenings
in the sleepy campus library. What was I doing all day? Studying and going to
class, to be sure. Reading books, listening to music, falling in love (or at
least imagining it). Eating ramen noodles with peanut butter. But also, I
confess, I did a lot of plain old sitting around—if not outright malingering.
I've got a box of musty journals to prove it. After all, nobody even exercised
in those days. Nor did polyester exist. Once you'd escaped high school and
obligatory PE classes—goodbye hirsute Miss Davis; goodbye, ugly cotton middy blouse and
gym shorts—you were done with that. We were all so countercultural back
then—especially in the Pacific Northwest,
where the early 1970s were still the late sixties. The 1860s.
The students now regard me with
curiosity and vague apprehension. What planet is she from?
But I have another question for them.
While Lambert, author of "Nonstop," admires the multitasking
undergraduates Harvard attracts, he also worries about the intellectual and
emotional costs of such all-consuming busyness. In a turn toward gravitas, he
quotes the French film director Jean Renoir's observation that "the
foundation of all civilization is loitering" and wonders aloud if
"unstructured chunks of time" aren't necessary for creative thinking.
And while careful to phrase his concerns ever so delicately—this is the Harvard
alumni magazine, after all—he seems afraid that one reason today's students are
so driven and compulsive is that they have been trained up to it since
babyhood: From preschool on, they are accustomed to their parents pushing them
ferociously to make use of every spare minute. Contemporary middle-class
parents—often themselves highly accomplished professionals—"groom their
children for high achievement," he suspects, "in ways that set in
motion the culture of scheduled lives and nonstop activity." He quotes a
former Harvard dean of student life:
This is the play-date generation. ...
There was a time when children came home from school and just played randomly
with their friends. Or hung around and got bored, and eventually that would
lead you on to something. Kids don't get to do that now. Busy parents book them
into things constantly—violin lessons, ballet lessons, swimming teams. The kids
get the idea that someone will always be structuring their time for them.
The current dean of freshmen concurs:
"Starting at an earlier age, students feel that their free time should be
taken up with purposeful activities. There is less stumbling on things you love
... and more being steered toward pursuits." Some of my students begin to
look downright uneasy; some are now listening hard.
Such parental involvement can be
distasteful, even queasy-making. "Now," writes Lambert, parents
"routinely 'help' with assignments, making teachers wonder whose work they
are really grading. ... Once, college applicants typically wrote their own
applications, including the essays; today, an army of high-paid consultants,
coaches, and editors is available to orchestrate and massage the admissions
effort." Nor do such parents give up their busybody ways, apparently, once
their offspring lands a prized berth at some desired institute of higher learning.
Parental engagement even in the lives
of college-age children has expanded in ways that would have seemed bizarre in
the recent past. (Some colleges have actually created a "dean of
parents" position—whether identified as such or not—to deal with them.)
The "helicopter parents" who hover over nearly every choice or action
of their offspring have given way to "snowplow parents" who
determinedly clear a path for their child and shove aside any obstacle they
perceive in the way.
Read the rest of this story at The Chronicle
of Higher Education.
Image by Bizzleboy, licensed under Creative Commons.
Tuesday, March 16, 2010 3:53 PM
Professors on film were once the bumbling experts or snobs, but now a certain malaise has set in and we see more of the gloomy, hapless types. In a recent article for The Chronicle of Higher Education, Jeffrey J. Williams traces the evolution of professors, as portrayed in films:
It seems as if professors have become depressed and downtrodden. For example, two well-regarded 2008 films, The Visitor and Smart People, center on aging, later-career professors who are disengaged from their work and exhibit obvious signs of depression. The Visitor depicts an economics professor, played by Richard Jenkins, who is going through the motions, teaching syllabi from years before and avoiding research.
The celebrity professor might seem to counter the image of the downtrodden professor, but he is merely the flip side of the coin. He represents the “winner take all” model that governs businesses and, progressively more so, professions. Like the CEO who receives 300 times what the person on the shop floor is paid, these professors reap the spoils…. The celebrity professor exemplifies the steep new tiers of academic life, in a pyramid rather than a horizontal community of scholars.
Source: The Chronicle of Higher Education (subscription required)
Friday, May 02, 2008 2:38 PM
Subsidize high school dropouts, advises Designer/builder in its March/April issue (article not available online). Our education system is hopelessly defunct, so we might as well reward those who realize it and strike out to learn on their own. A provocative argument, though I’m not convinced that every dropout with the Internet might be a George Washington or John D. Rockefeller, as Designer/builder suggests. Yet it is urgently necessary that we transform the American approach to education from a system in which “schools teach as if what is now thought true will always be true” into something that inspires “comprehensive self-initiation, management, and judgment of learning.”
Monday, April 28, 2008 12:50 PM
If you spend much time in office meetings or college classrooms, you’ve likely run into Gender Guy. He’s an alpha male and a liberal, and he likes to talk about gender issues—in the workplace, in society, in the book you’re reading, wherever. He pontificates and patronizes; he interrupts and shouts down. He makes the rest of the room endure his pissing matches with men less enlightened, or with those who share his general opinions but oblige his desire to quibble over details, loudly and at length.
Gender Guy’s assumed expertise might come from overly simplified connections he makes between gender and race, or class, or sexual identity, or religion. It might be based on the fact that, as an intelligent and well-spoken man, he’s by definition an expert on everything. Or perhaps he thinks he understands gender because the word—unlike, say, “women”—suggests a subject that deals not with one gender’s concrete realities so much as, more abstractly, with the relationship between two.
This last point in particular interests historian Alice Kessler-Harris. Writing in the Chronicle of Higher Education, Kessler-Harris considers the consequences for her own discipline when, starting in the early 1990s, gender history began to take over the ground previously held by women’s history (subscription required). She allows that “gender is a tempting and powerful framework”:
Far more inclusive than the category of women, [gender] raises questions not so much about what women did or did not do, but about how the organization of relationships between men and women established priorities and motivates social and political action. While the history of women can be accused of lacking objectivity—of having a feminist purpose—that of gender suggests a more distanced stance… The idea of “gender” frees young scholars (male and female) to seek out the ways that historical change is related to the shape and deployment of male/female relations.
And yet, something is lost:
Gender obscures as much as it reveals… [I suspect] that in seeing the experiences of men and women as relational, we overlook the particular ways in which women—immigrants, African-Americans, Asians, Chicanas—engaged their worlds… We lose the power of the individual to shed a different light—sometimes a liminal light—on historical processes.
In short, Kessler-Harris worries that abstracting “women” into “gender” can have the effect of silencing the voices of actual women—a danger not limited to the rarefied world of historians. The tension between analyzing gender relations and highlighting female voices is an old one, and it’s as broadly relevant as ever. While Gender Guy’s opinions may be impeccably feminist, how helpful is this if the abstraction “gender” gives him cover to go on and on, preventing the women in the room from getting a word in?
Monday, February 11, 2008 9:12 AM
A common criticism levied at the US educational system is that there isn’t enough time devoted to arts and crafts. “Our society devalues such handiwork,” Rabbi Danny Nevins writes for the Jewish website jspot.org, “but the Torah finds sanctity in sweat.” Students would do well to learn that “there are different types of wisdom,” according to Rabbi Nevins, and book learning is only one of them.
A similar point was made by Matthew B. Crawford in the New Atlantis, and written about in 2006 on Utne.com. Crawford writes that American society must reconsider its connection to manual labor. Learning and mastering a craft fosters self-reliance and challenges consumer dependency, but too many people still value “knowledge work” over shop class.