Friday, December 02, 2011 4:29 PM
Consider these page-turners for your next beach vacation: a transcription of all the weather reports from a radio news station recorded over one year, a re-typed issue of The New York Times, a chronicle of the utterances made by one person for an entire week, a similar account of every bodily gesture the same man made over the course of 13 hours, or 600 pages' worth of words with rhyming r-sounds. Understandably, you’re probably not tacking these texts onto the bottom of your holiday wish list, much less considering them notable for anything other than how boring they sound. But Kenneth Goldsmith—an avant-garde poet, experimental radio personality, and professor—considers these litanies to be poetry. In fact, the list is a small sampling of his published works.
“My books are better thought about than read,” Goldsmith said in an interview with The Believer. “They’re insanely dull and unreadable; I mean, do you really want to sit down and read a year’s worth of weather reports or a transcription of the 1010 WINS traffic reports ‘on the ones’ (every ten minutes) over the course of a twenty-four-hour period? I don’t. But they’re wonderful to talk about and think about.” The thing to think about, specifically, is whether this type of writing should be considered literature.
Although he’s more apt to call one of his volumes a reference book or a thought experiment, Goldsmith argues that there’s something experientially interesting that comes from reading an almanac-like text. “The moment we shake our addiction to narrative,” he says, again in The Believer,
and give up our strong-headed intent that language must say something ‘meaningful,’ we open ourselves up to different types of linguistic experience, which, as you say, could include sorting and structuring words in unconventional ways: by constraint, by sound, by the way words look, and so forth, rather than always feeling the need to coerce them toward meaning.
We live in an information-sodden age, one in which processing is sometimes mistaken for reflection.
“With an unprecedented amount of available text,” Goldsmith writes in The Chronicle Review, “our problem is not needing to write more of it; instead, we must learn to negotiate the vast quantity that exists.” When it comes to artistic content, we’ve made and eaten a Thanksgiving dinner’s worth of verbiage. To continue the metaphor, the challenge, then, is how to burn off all of the calories and look good at the office the following workweek. “How I make my way through this thicket of information,” he argues, “how I manage it, parse it, organize and distribute it—is what distinguishes my writing from yours.”
In addition to “writing” mind-numbing-slash-mind-exploding poetry, Goldsmith was a DJ at New Jersey’s WFMU radio station, teaches a class on “uncreative writing” at the University of Pennsylvania, and spearheads an avant-garde arts website, UbuWeb.
UbuWeb is a radical repository of donated and stolen art in various media, a “completely independent resource dedicated to all strains of the avant-garde, ethnopoetics, and outsider arts.” Much of the website’s content will appeal only to a small set of conceptually inclined art geeks. In an interview with BOMB (which, in a very Goldsmithian way, is reproduced verbatim on the magazine’s website), he explains UbuWeb as “a way of flaunting all the rules, somewhat safely.” The avant-garde, it seems, has been waiting its whole life for the Internet. He continues:
I’ve actually found a major loophole in copyright culture, literary culture, in distributive culture which happens to be, for lack of a better word, the avant-garde—which nobody can understand. It’s so hard for people to understand this stuff. And number two, it’s really got no commercial value whatsoever. It has great historical and intellectual value, but people lose money when they try to release this stuff so most of it goes unreleased. So it’s been this, kind of, really beautiful grey area where it’s all out in the open and it’s all in front but you get a pass on it in a way that legitimate economies don’t give you that latitude.
Goldsmith’s work has garnered him some acclaim of late: He was invited to recite some of his work at a White House poetry event in May. If you ask him if the work is making a difference in literature, though, he’d probably respond cryptically with a handwritten list of all the nutritional information in a supermarket aisle and call it “Serving Size.”
Sources: BOMB, The Believer, The Chronicle Review
Image by Meredith Waters, licensed under Creative Commons.
Thursday, January 06, 2011 12:53 PM
Is it possible to understand how an entire society thinks, to objectively examine the sum of a culture’s obsessions and anxieties, its fetishes and fascinations? And if so, could we extrapolate some deeper historical truth from the exercise, or just a mass of superficial conclusions? Cultural anthropologists write ethnographies, urban planners crunch demographic statistics, and media watchdogs sniff out trends and biases in mainstream media, all in the hope of gleaning some understanding of the zeitgeist, be it past or present. But these various fields of study, due to their inherent specificity, can’t help missing the bigger picture. Even the shrewd, data-driven analysis of the urban planner is imperfect; it misses the nebulous, unquantifiable nuances of human experience. How do you statistically account for a heightened fear of foreigners, or infatuation with celebrities, or changes in artistic aesthetics? Assuming that we even want to know the contours of our national culture from an outside perspective, we’ll need to form an uncommon alliance: between scholars in the humanities and the arbiters-of-all-knowledge Google.
One of Google’s latest gifts to the Ivory Tower is Ngrams, an easy-to-use interface that pulls word-frequency data from the company’s massive database of books and plots them against a timeline. By agglomerating the text of as many books as possible from every conceivable field of writing, the theory behind Ngrams goes, one can begin to form a more comprehensive idea of what our culture is (and has been) all about.
This type of broad, numbers-based study of texts (called corpus studies in academia) isn’t entirely new, but computer-accelerated applications like Ngrams lend the practice an unprecedented computational power. A recent article in The Chronicle Review guardedly appraises this new scholarly field of “culturomics.” (Culturomics is meant to rhyme with genomics and carries the same assumptions: that culture can be quantified and then decoded, just like the human genome.) The article’s author, linguist Geoffrey Nunberg, frets that anyone with an Internet connection can become an armchair-statistician-cum-cultural-critic. “I think that [Yale comparative literature scholar Katie] Trumpener is quite right to predict that second-rate scholars will use the Google Books corpus to churn out gigabytes of uninformative graphs and insignificant conclusions,” writes Nunberg. “But it isn’t as if those scholars would be doing more valuable work if they were approaching literature from some other point of view.”
People poking around on Ngrams will ultimately be beneficial to scholarship. “Whatever misgivings scholars may have about the larger enterprise, the data will be a lot of fun to play around with,” writes Nunberg. “And for some—especially students, I imagine—it will be a kind of gateway drug that leads to more-serious involvement in quantitative research.”
So here’s a bit of armchair scholarship. I plotted the use of two phrases (above) that mean a lot to us at Utne Reader—“alternative press” and “mainstream media”—from 1900 to 2000. Neither phrase comes into use until about 1970. Although “alternative press” enjoys more of a presence in written discourse for the following 15 years, “mainstream media” begins to skyrocket into our consciousness in 1985. What inferences can we draw? Perhaps the accelerated use of “mainstream media” is a symptom of expanding cable news networks or of growing academic interest in the subject. Might the stagnation of “alternative press” be indicative of suppression of fringe opinions? And should this inflate our underdog ego? Admittedly, it’s hard to conclude anything from these graphs. After all, I was just playing around on Google.
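The mechanics behind a tool like Ngrams are simple to sketch. Below is a minimal, hypothetical Python illustration (a made-up two-"book" corpus, not Google's actual data or API) showing how the relative frequency of a phrase per year could be counted:

```python
from collections import Counter

def ngram_counts(text, n=2):
    """Count case-insensitive word n-grams in a text."""
    words = text.lower().split()
    return Counter(tuple(words[i:i + n]) for i in range(len(words) - n + 1))

# Hypothetical mini-corpus: year -> concatenated text of that year's "books".
corpus = {
    1985: "the mainstream media and the alternative press grew apart",
    1995: "critics of the mainstream media cited the mainstream media often",
}

def phrase_frequency(phrase, year):
    """Relative frequency of a phrase among all same-length n-grams that year."""
    target = tuple(phrase.lower().split())
    counts = ngram_counts(corpus[year], n=len(target))
    total = sum(counts.values())
    return counts[target] / total if total else 0.0
```

Plotting `phrase_frequency` for each year in a corpus is, in essence, what the Ngrams interface does at a vastly larger scale, with smoothing and a corpus of millions of scanned books.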
Source: The Chronicle Review
Image by Carlos Luna, licensed under Creative Commons.
Thursday, September 02, 2010 12:09 PM
Whether you call it “post-postmodern,” “altermodern,” or “nonaesthetic,” contemporary art is more tenuous than ever. Disparate threads of post-colonialism, globalism, commercialism, and (insert preferred –ism, ad infinitum) work to intertwine the international community of artists and, at the same time, chip away at our notions of artistic discipline, medium, and purpose.
Writing on the future of art for The Chronicle Review’s “Defining Idea of the Next Decade” issue, James Elkins predicts that the next 10 years will bring dramatic change to art studies, splitting the study of art history from visual studies. Elkins writes:
In academe this will be played out in a collision of fields, as newer disciplines like postcolonial studies and visual studies collide with older disciplines like art history and art theory. Visual studies looks at popular culture, mass media, television, and advertising. Postcolonial studies considers art as an effect of class, ethnicity, socioeconomic conditions, and power relations. Art history has always cared about value—it matters that Michelangelo really is a good artist, and not just a symbol of Florentine or Roman identity—and so art history has difficulty with ways of understanding art that are based on economics, politics, and social functions. The two approaches, visual studies and art history, create a kind of unstable oil-and-water mixture in academic writing.
Perhaps the need for thoughtful art criticism and academic research is more pressing than ever. “As in all historical changes, much will be lost,” concludes Elkins. “My hope is that the celebratory mood of the new art and scholarship will not obscure the fact that the new art, which seems too much fun to resist, is deeply problematic. No one knows what contemporary international art expresses, or how best to interpret it.”
Source: The Chronicle Review
Image by See-ming Lee, licensed under Creative Commons.
Thursday, August 12, 2010 4:04 PM
Let’s put our ideological and spiritual differences aside for just a moment and, through reasoned argumentation, decide what happens to human beings after they die. Easier said than done. Should we approach the mystery from a high philosophical horse, or whittle it down with the empirical edge of the scientific method? And don’t forget: the cozy theologian will have something to add to the discussion as well. Even if we strip passion from our assumptions about the afterlife, we come no closer to understanding its feasibility.
After reading four recently published books regarding life after death, Jacques Berlinerblau is as clueless as he ever was. But what appears at first to be a run-of-the-mill book roundup in The Chronicle Review becomes a careful examination of the difficulties of talking about the afterlife in a useful, scholarly manner.
Berlinerblau first tackles books that try to prove the existence of an afterlife through modern science. One such book, Life After Death: The Evidence by Dinesh D'Souza, is a spirited read, Berlinerblau writes, but the alleged scientific accuracy of D’Souza’s claims is questionable, and far outside the realm of a layperson’s ability to second-guess. “[D’Souza] devotes great energy and imagination to popularizing complex scientific ideas for his readers,” says Berlinerblau. “Whether his distillation of those ideas is accurate is something that only physicists, neuroscientists, astronomers, and biochemists, among others, can answer.” Looking to the humanities is just as unsatisfying.
Theological and philosophical writing is infamous for its convoluted complexity. Berlinerblau tried, with marginal success, to unpack the metaphysical arguments for an afterlife in Princeton professor Mark Johnston’s Surviving Death. Things don’t start well: “From the outset, let me confess that Professor Johnston's argument went so far above my head that it jettisoned booster rockets into the poppling ocean of my incomprehension.” After numerous dense, jargon-y chapters, Berlinerblau concludes that “It would be pointless to try to summarize [Johnston’s] hypotheses.”
Berlinerblau speculates that rational conversation about the afterlife may be impossible and offers his own modest solution: “There is, of course, a counterpossibility: If we do in fact perdure, perhaps we transit into a realm beyond good and evil—a realm so radically other that science, theology, and philosophy cannot fathom its contours. That does not mean we should stop asking questions. But insofar as there are no answers, a recommended course of action might consist of living according to some minimal standard of decency and cherishing our bright moments.”
Source: The Chronicle Review
Image by Qole Pejorian, licensed under Creative Commons.
Thursday, March 18, 2010 4:19 PM
Depending on your degree of web-savvy, dealing with grief online can feel a bit awkward at times. That’s OK. We’re dealing with a “new, uncharted form of grieving,” Elizabeth Stone writes for The Chronicle Review. In a tender essay about the accidental death of one of her students, Casey, Stone shares what she’s learned about grief and digital culture. It’s a story of choices; a friend of the young woman who died, for example, spent all night on the phone to ensure that no one heard the news on Facebook. It’s also a story with a positive take-away, especially for anyone unsure about grieving in the public space of a social network. As Stone writes:
Traditional mourning is governed by conventions. But in the age of Facebook, with selfhood publicly represented via comments and uploaded photos, was it OK for her friends to display joy or exuberance online? Some weren’t sure.
Six weeks after Casey’s death, one student who had posted a shot of herself with Casey wondered aloud when it was all right to post a different photo. Was there a right time? There were no conventions to help her. And would she be judged if she removed her mourning photo before most others did?
As it turns out, Facebook has a “memorializing” policy in regard to the pages of those who have died. . . . As [employee Matt] Kelly wrote in a Facebook blog post last October, “When someone leaves us, they don’t leave our memories or our social network. To reflect that reality, we created the idea of ‘memorialized’ profiles as a place where people can save and share their memories of those who’ve passed.”
Casey’s Facebook page is now memorialized. Her own postings and lists of interests have been removed, and the page is visible only to her Facebook friends. . . . Eight months after her death, her friends are still posting on her wall, not to “share their memories” but to write to her, acknowledging her absence but maintaining their ties to her—exactly the stance that contemporary grief theorists recommend.
Source: The Chronicle Review
Tuesday, March 16, 2010 3:53 PM
Professors on film were once bumbling experts or snobs, but now a certain malaise has set in, and we see more of the gloomy, hapless type. In a recent article for The Chronicle of Higher Education, Jeffrey J. Williams traces the evolution of professors as portrayed in films:
It seems as if professors have become depressed and downtrodden. For example, two well-regarded 2008 films, The Visitor and Smart People, center on aging, later-career professors who are disengaged from their work and exhibit obvious signs of depression. The Visitor depicts an economics professor, played by Richard Jenkins, who is going through the motions, teaching syllabi from years before and avoiding research.
The celebrity professor might seem to counter the image of the downtrodden professor, but he is merely the flip side of the coin. He represents the “winner take all” model that governs businesses and, progressively more so, professions. Like the CEO who receives 300 times what the person on the shop floor is paid, these professors reap the spoils….The celebrity professor exemplifies the steep new tiers of academic life, in a pyramid rather than a horizontal community of scholars.
Source: The Chronicle of Higher Education (subscription required)
Tuesday, May 19, 2009 3:24 PM
If you love an old-fashioned grammarian throwdown, we've got a good one for you. This year marks the 50th anniversary of the publication of William Strunk and E.B. White's The Elements of Style, and Geoffrey K. Pullum isn't celebrating. In a delightfully vitriolic essay for The Chronicle Review, Pullum complains that the tiny guide "does not deserve the enormous esteem in which it is held by American college graduates. Its advice ranges from limp platitudes to inconsistent nonsense. Its enormous influence has not improved American students' grasp of English grammar; it has significantly degraded it."
Brutal. Then, in the very next breath, this: "The authors won't be hurt by these critical remarks. They are long dead." You get the distinct sense that Pullum would have been glad to see the book buried with the men who made it. "Both authors were grammatical incompetents," he writes. "Strunk had very little analytical understanding of syntax, White even less. Certainly White was a fine writer, but he was not qualified as a grammarian."
Pullum isn't pissed about the style advice, which he calls "mostly harmless"—all of his punches are aimed squarely at the grammar rules and the grammar itself:
"Write with nouns and verbs, not with adjectives and adverbs," they insist. (The motivation of this mysterious decree remains unclear to me.)
And then, in the very next sentence, comes a negative passive clause containing three adjectives: "The adjective hasn't been built that can pull a weak or inaccurate noun out of a tight place."
That's actually not just three strikes, it's four, because in addition to contravening "positive form" and "active voice" and "nouns and verbs," it has a relative clause ("that can pull") removed from what it belongs with (the adjective), which violates another edict: "Keep related words together."
"Keep related words together" is further explained in these terms: "The subject of a sentence and the principal verb should not, as a rule, be separated by a phrase or clause that can be transferred to the beginning." That is a negative passive, containing an adjective, with the subject separated from the principal verb by a phrase ("as a rule") that could easily have been transferred to the beginning. Another quadruple violation.
A 50th anniversary is a big deal, and Geoffrey K. Pullum knows how to party.
Source: The Chronicle Review
Friday, April 10, 2009 11:03 AM
Are young people in the digital age perpetually plugged-in drones, or tolerant, politically and socially shrewd citizens with untapped potential? There has always existed a culture gap between educators and their students, but technology seems to have widened it into a chasm. Given the alienation that many educators feel from their students today, the debate over the fate of so-called “Digital Natives” and how to teach them continues.
William Deresiewicz over at The Chronicle Review laments the loss of solitude for today’s youth. He worries for his students and the apparent nonstop nature of their connectedness, from Facebook to Twitter to text messaging.
“Technology is taking away our privacy and our concentration,” he writes, “but it is also taking away our ability to be alone.”
Deresiewicz then wonders what this loss portends: “And losing solitude, what have they lost? First, the propensity for introspection, that examination of the self that the Puritans, and the Romantics, and the modernists (and Socrates, for that matter) placed at the center of spiritual life – of wisdom, of conduct. Thoreau called it fishing ‘in the Walden Pond of [our] own natures’, ‘bait[ing our] hooks with darkness.’”
Barry Duncan and Carol Arcus take a less pessimistic stance at the Education Forum of Ontario Secondary School Teachers’ Federation. While acknowledging the concern for Digital Natives’ ability to think critically about the media they consume, Duncan and Arcus instead see an opportunity to “link this multi-sensory, multi-modal, multi-literate experience to new notions of literacy and identity.”
They suggest that “Net Geners” might be “smarter, quicker and more tolerant of diversity than their predecessors. They are more politically savvy, socially engaged and family-centered than society gives them credit for.”
And, they see in the conversation around teaching Digital Natives the possibility “to figure out and invent ways to include reflection and critical thinking in the learning...but still do it in the Digital Native language.”
Sources: The Chronicle Review, Ontario Secondary School Teachers’ Federation
Image by Bombardier, licensed under Creative Commons.
Thursday, December 18, 2008 12:33 PM
The Encyclopedia of Life, a website that launched earlier this year and crashed almost immediately from a flood of visitors, is a “project to organize and make available via the Internet virtually all information about life present on earth.” Cataloging approximately 1.8 million known species, the site is a great tool for scientists and students, and it grants open access to anyone who wants to brush up on their knowledge of earth’s creatures, from seals to viruses and everything in between.
Sounds great, right? Well, Randy Malamud, writing in The Chronicle Review (subscription or pass required), sees more going on here than an eight-eyed jumping spider, and asks if the digital nature of EOL will “encourage us to appreciate plants and animals more, or spend more time surfing online?” He suggests that the more we cite and arrange plants and animals the less we care about them in their environment—that taxonomy parallels destruction.
He also thinks the animals are out of their element, if you will, presented as a pin-up in each individual entry. All animals in life are affected by and play a role in their ecosystems. We should also consider their history of interaction with humans, and their importance of place. An example he gives is Australian Aborigines' use of the imperial blue butterfly. The butterfly's host plant, the acacia, provides the natives with seeds as food and gum as an adhesive for tool building. The butterfly, then, is an integral part of the Aborigines' culture, but none of this is referenced on the EOL website.
Who classifies the animals, Malamud argues, tells you more about the human environment than about the exotic outside world. "Structuring the natural world meshes with the structure of imperial power," he writes, and he quotes MIT historian Harriet Ritvo: "The classification of animals is apt to tell us as much about the classifiers as the classified."
Any ecologist will tell you that life on earth is about ecosystems. Malamud suggests one step in the right direction: “the EOL might take a cue from Facebook or Myspace for an enhanced sense of connectivity.”