Wednesday, May 25, 2016

Alive in the Age of Lovecraft

BY CARL H. SEDERHOLM
Professor of interdisciplinary humanities at Brigham Young University

Under the right circumstances, certain texts suggest a “weird realism”: moments (as described by Graham Harman) when language either struggles to describe the impossibly real or overflows with multiple possibilities. One of H. P. Lovecraft’s strengths as a writer lies in his constant insistence that there was always something just outside of human ken, something that might be understood by analogy or at the risk of one’s sanity. Human beings, as the opening of “The Call of Cthulhu” suggests, are simply not equipped to handle certain forms of knowledge.

For years, I introduced students to Lovecraft cautiously, as though we were all anxious about this evocative power. But things have changed dramatically over the last two decades. If we once approached the author reluctantly, we now embrace him as a cultural phenomenon. Students may not know his work any better, but they’re more likely to know something about his life, his work, and his monsters. Whereas I used to regale students with connections between Lovecraft and Edgar Allan Poe, Stephen King, or Robert Bloch, I now find ways of introducing him through references to films, internet memes, music, YouTube videos, board games, and the like. One semester, a student taught us all how to play the overly complex game Arkham Horror during finals week. Another student gave me a copy of Cthulhu Fluxx, a fun card game that I happily played with family and friends.

It’s rare for authors, especially those in the pulp tradition, to achieve such strange 21st century heights so quickly. Lovecraft, no longer an obscure pulp writer, is now a public figure, the creator not only of Cthulhu but also of a body of imaginative and fascinating tales that are enjoying a new generation of readers. Even though the notion of a unified “Cthulhu Mythos” is controversial, Lovecraft’s universe nevertheless remains one of the coolest and most expansive imaginary sandboxes out there. How did Lovecraft become so popular?

There’s no simple answer to the question, but much can be gained from exploring the range of connections and intersections that draw on Lovecraft in some way. I’ve already mentioned some of the ways popular culture appropriates Lovecraft. Another productive development comes from Lovecraft’s increasing prominence in philosophical discussions, especially those stemming from Harman’s aforementioned weird realism or the way a story like “Through the Gates of the Silver Key” factors into Gilles Deleuze and Félix Guattari’s work in A Thousand Plateaus. Moreover, Lovecraft’s cosmicism, especially the way he shifts attention decidedly away from the human, overlaps with recent work concerning posthumanism, animal studies, and deep time.

Lovecraft isn’t exactly a critical darling, but he’s certainly no longer the embarrassment to the literary establishment that he once was. In the last few years, new editions of his work have appeared, including those from Oxford University Press and the Library of America. He is also an acknowledged influence on graphic novels, films, songs, illustrations, sculptures, and more. In the early months of 2016, Matt Ruff (Lovecraft Country) and Victor LaValle (The Ballad of Black Tom) released novels featuring explicitly Lovecraftian themes and ideas. Jacqueline Baker’s 2015 novel (The Broken Hours) examines the last weeks of Lovecraft’s life.

Lovecraft’s afterlife is certainly impressive, but his fame also brings renewed controversy concerning his blatantly racist attitudes. Though some prefer to dismiss his racism as a thing of the past, or to somehow separate the man from his fiction, others want to understand how racism shapes Lovecraft’s writing, particularly his approach to core themes such as impurity, abjection, and hybridity.

In 2015, the public side of this controversy came to a head when the World Fantasy Convention determined to no longer give the “Howie” (a small bust of Lovecraft) to award winners. Critics of this change decried it as bowing to political correctness; others applauded the change as a sign of progress. W. Scott Poole’s forthcoming biography of Lovecraft argues not only for greater frankness concerning Lovecraft’s racism but also for a wider awareness of the ways racism structured political power in the United States, especially in the tumultuous aftermath of Reconstruction—and of how these problems continue to shape American public life in 2016.

Jeffrey Weinstock and I developed The Age of Lovecraft around two key questions:

Why Lovecraft?

Why now?

The answers are complex and paradoxical. Lovecraft is a controversial writer, but his newfound fame should lead to better research, criticism, and understanding. For me, much of the current appeal lies in the excitement and power of sharing an imaginative space, the kind Michael Saler describes in his insightful book As If. If that’s the positive reason, there’s also a negative one: a pervasive fear of death. As Lovecraft writes in “Supernatural Horror in Literature,” “the one test of the really weird is simply this—whether or not there be excited in the reader a profound sense of dread, and of contact with unknown spheres and powers; a subtle attitude of awed listening, as if for the beating of black wings or the scratching of outside shapes and entities on the known universe’s utmost rim.”


In Lovecraft, anything is possible but death remains inevitable—and is always lurking just outside the door.

-------

Carl H. Sederholm is professor of interdisciplinary humanities at Brigham Young University. Sederholm is co-editor, with Jeffrey Andrew Weinstock, of The Age of Lovecraft.

Thursday, May 19, 2016

Looking back: Breast cancer activist Barbara Brenner on cancer wristbands



Barbara Brenner, a key figure in North American breast cancer history, wrote the following piece in 2005 as a Perspective for the San Francisco public radio station KQED. Brenner died in 2013. So Much to Be Done, an anthology of her political and personal writings, has been published by University of Minnesota Press.

-------


Anyone who knows that I’m a breast cancer activist knows that you won’t find me wearing pink paraphernalia, let alone one of the Livestrong wristbands from Lance Armstrong. While I don’t wear one, it did strike me when the yellow wristbands first appeared that they are visual evidence of the number of people who are living with cancer in the United States.

But that’s not the visual effect we’re getting. Instead, we’re seeing a whole rainbow of wristbands, including the pink ones signifying—you guessed it—breast cancer. Dr. Susan Love is raising funds with one; Target is selling “Share Beauty, Spread Hope” bands; and the Komen Foundation offers its very own “Sharing the Promise” version.

As a breast cancer activist, I’m concerned that, once again, the breast cancer movement is separating itself from the rest of the cancer world. This might sound strange, coming from someone who works for a breast cancer organization and who’s been living with this disease for 12 years. But I hear from increasing numbers of people that breast cancer gets a disproportionate amount of attention, especially when the incidence of many other cancers is also on the rise. And I’m worried that things like these pink wristbands will only add to a growing sense of frustration that breast cancer advocates don’t see themselves as part of a larger cancer community. For once, can’t breast cancer advocates be trees living in the forest of folks living with cancer?

I think we do everyone a great disservice when we separate ourselves in unnecessary ways. Would there be some harm in people who care about breast cancer wearing a yellow wristband? Is there anything gained by separating ourselves with pink ones? Can’t we sometimes work with a bigger vision that sees what we have in common instead of what separates us?

Like many people, I’m inspired by Lance Armstrong. I’m also inspired by the many women who continue to live their lives despite breast cancer and other cancers. Instead of a colored wristband, I wear a button that says, “Cancer Sucks,” which speaks to everyone’s experience with the disease. The language isn’t pretty, but neither is cancer.

-------

Barbara Brenner was executive director of the nonprofit organization Breast Cancer Action, based in San Francisco. She died in 2013 at the age of sixty-one.

Several events are planned in the San Francisco area and elsewhere around the launch this month of So Much to Be Done, an anthology of Brenner's writings.

Thursday, May 12, 2016

Remembering the fierce thinker and jazz historian Albert Murray, who would have turned 100 today.



Albert Murray (1916–2013), renowned jazz historian, critic, writer, social and cultural theorist, and cofounder (with Wynton Marsalis) of Jazz at Lincoln Center, would have turned 100 years old today. We remember him with an edited excerpt from Murray Talks Music: Albert Murray on Jazz and Blues (May 2016).

-------

"In order to know what the statement is, you have to know what is involved in the processing"
Edited excerpt from an interview with Greg Thomas


Greg Thomas: Could you go a little deeper into the concepts of folk art, popular art, and fine art?

Albert Murray: The three levels of sophistication or technical mastery involved in the processing of raw experience into aesthetic statement. That's a whole encyclopedia right there. Art is a means by which raw experience is stylized—goes through a process by which we mean stylized—into aesthetic statement. The style is the statement. In order to know what the statement is, you have to know what is involved in the processing. Involved in that would be degrees of the control of the medium that you're working in. Some guy comes up with a poem—but they don't know grammar, they can't pronounce the words, they don't know syntax—that's going to be folk level, man. A good example would be, somebody says [sings in blues cadence]: You be my baby, and I'll be your man. Not "If you will be my baby." That's folk level, we can tell. It's pronounced on a folk level. It can be very moving, very authentic—but it's limited. It's an acquired taste for a more sophisticated person, like a cruder recipe. Now, you get a guy saying [singing]: Is you is or is you ain't my baby? [1944 Louis Jordan song] That's bad grammar, but it's pop. You  know that's not folk. The guy's kidding. "Are you or aren't you my baby?" That won't work. He wants to be very close to the earth. [singing] Is you is or is you ain't my baby? The way you acting lately makes me doubt you is still my baby, baby. The way you say "baby," that's some country shit. But you could do that in a fifteen dollar or twenty-five dollar cover charge place. These other guys out there strumming, that's another thing, they got a tin cup in the town square on Friday afternoon. Now, the ultimate extension, elaboration, and refinement would be: [hums Ellington's "Rocks in My Bed"] That's the blues on another level. Technically more refined. More complex, more difficult to play. More complete control over the means of expression.


GT: Some of what [Constance] Rourke was counterstating was some of [T. S.] Eliot's elitist conceptions or I guess maybe the stereotype of Matthew Arnold's conception of culture. They also had a conception of, say, "fine art." But it seems to me that Constance Rourke was trying to privilege and focus on the folk form and the popular form.

AM: It's a dynamic that you want to get that adds up to Constance Rourke. What she discovered, as I understand it, was a principle for the definition of culture that was derived from the German philosopher Herder. It gave her insight into the fact that cultures develop. They come from the ground up, not from on-high down. Most people were lamenting that there was no high culture. You forget, these were barbarians—Europe in the Dark Ages. When you come out of that, they've got an art form. They've got the gothic cathedrals, they've got these goddamn vitraux, the stained-glass windows. They've got scholarship, although it's on sacred texts and so forth. Then, when they get to the Renaissance period, they rediscover Rome and Greece. Then they have a broader context of what they're doing. These guys had been all the way from savagery all the way up to Praxiteles to the Parthenon to Sophocles and Euripides and Aeschylus, Aristophanes, Socrates, Plato, Aristotle—all these refinements. Then you had all these extensions of that because the Romans could reach over there and get it. The Greeks were still around, for them. Any great Roman family had a Greek master. And they went around acting like Greeks. Just like classy Americans acted British and would speak with a slightly British accent, like that Boston thing. Well, that's the way that I understand it—that educated Romans spoke like Greeks. Which makes all the sense in the world, doesn't it? One is able to look at it this way because of the dynamic that Constance Rourke revealed. Extension, elaboration, and refinement—it's not just bootlegging something in.


GT: Process, continuum.

AM: You can see it in Mark Twain! He's a half-assed newspaperman, he writes about what he knows about, he's writing a fairly simple report, but the storytelling thing takes over at a certain point—and he's into art! He made the steps. You can see it. Whitman!—you've gotta make it out of this and it's gotta be like this. So when you've got Moby-Dick—there ain't nothing over there like that. It's a novel, it's not The Iliad and The Odyssey. It's something else. It's a big, thick American book about process. When I was in high school there was nothing like football movies, nothing like college movies. This sweatshirt comes from the 1930s, man! You find that very pragmatic level of how things are done at a given point. Life on the Mississippi—how it is to be a riverboat captain. The romance of it. It's a very practical thing. What's a riverboat captain? But it's transmuted into poetry. What the hell do you get in the first 150 pages of The Seven League Boots? Life on the Mississippi! What you'd call the Life on the Mississippi dimension. Nothing can be more American than "How do they do what they do?"


-------

Albert Murray (1916–2013), author of thirteen books including Stomping the Blues, was a renowned jazz historian, novelist, and social and cultural theorist. He cofounded Jazz at Lincoln Center in 1987. His finest interviews and essays on music have been compiled into a volume, published this month: Murray Talks Music: Albert Murray on Jazz and Blues.

Friday, May 6, 2016

The politics behind the metabolic health crisis in the United States

BY ANTHONY RYAN HATCH
Assistant professor in the Science in Society Program at Wesleyan University



Our metabolic health crisis—as defined by the conjoined endemics of heart disease, diabetes, high cholesterol, and obesity—continues to surprise biomedical researchers, frustrate health experts, and disable and harm millions of people. This week, three news stories illuminate yet again how the most important challenge that the metabolic health crisis presents is not biomedical or scientific but political. A political framing of metabolism matters because how we frame and interpret unjust and harmful situations shapes our options for insurrection against those situations.

The first story comes from the intersection of biomedical science and reality television. The New York Times scooped a new study, to be published in the journal Obesity, reporting that contestants on NBC’s reality TV show The Biggest Loser regained the weight they lost after the show’s end because of metabolic changes induced by extreme dieting and exercise. As reported by Gina Kolata:

It has to do with resting metabolism, which determines how many calories a person burns when at rest. When the show began, the contestants, though hugely overweight, had normal metabolisms for their size, meaning they were burning a normal number of calories for people of their weight. When it ended, their metabolisms had slowed radically and their bodies were not burning enough calories to maintain their thinner sizes.

The extraordinary plasticity of a body’s metabolism was a surprise for these obesity researchers and it certainly raises questions about how overweight people should go about achieving what are thought to be “healthy weights.” Yet, the biomedical claim that an overweight body is always an unhealthy body is more contentious than productions like The Biggest Loser would suggest. Scholars have interpreted the obesity problem as a socially induced moral panic that pathologizes overweight bodies and targets them for constant biomedical intervention and ethical judgment. Treating metabolism as an individual biomedical problem makes it more difficult to diagnose the metabolic health crisis as a social and political problem that impacts entire populations of organisms in patterned ways.

This biomedical finding may also prove disheartening for millions of overweight people who struggle to lose weight and keep it off. As Kolata says, “Despite spending billions of dollars on weight-loss drugs and dieting programs, even the most motivated are working against their own biology.” In this conceptualization, the body’s metabolism acts as an agent, conspiring against us to produce embodiments that we don’t want. Equally disheartening, in my view, is the narrow way in which metabolism is constructed as a biomedical process found only in the body and its biochemicals (for example, Kolata features the hormone leptin). In contrast to interpreting metabolism at the level of an individual body, obesity researchers would be wise to incorporate a concept of social metabolism into their theoretical world. The bodies featured on The Biggest Loser don’t exist in a sociological vacuum—they exist within a corporate food regime that makes it next to impossible to eat well in order to be healthy and a corporate pharmaceutical regime that crowds out alternative modes of healing bodies. All too often, the companies that produce and regulate food and drugs in our society are seen as bit actors in the grand play of social metabolism rather than as occupying the leading roles.

This leads me to a second story, which digs into the ways that government regulatory practices shape population and ecological health. The Guardian reports that lobbyists for the United Egg Producers, the National Cattlemen’s Beef Association, and the National Pork Producers Council are urging Congress to soften Freedom of Information Act laws that force the government to report when the food industry lobbies the United States Department of Agriculture in the form of financing for “checkoff” programs—the sort that are responsible for marketing campaigns like “Beef. It’s What’s for Dinner.” The bill containing this regulatory change is merely proposed at this point, but it signals an ongoing dynamic in which food industries seek to obscure their involvements in shaping government policies that structure our food environment. These companies are able to produce low-cost, industrially manufactured animal products because of vast government subsidies given to other agribusinesses that produce corn and soy—two key ingredients in animal feed. These covert actions work to conceal the political problem posed by metabolism and make it hard to resist government regulatory practices that shape population and ecological health. As long as individual bodies remain the target of metabolic intervention (like the highly visible bodies on The Biggest Loser) and not corporations, we are in deep, deep trouble.

But there is room for hope. A third story cuts against the historical grain created by the biopolitics of metabolism, this one involving the colonial manufacture of sugar in Hawaii. As reported first in January and again in April, the Hawaiian Commercial & Sugar Company has planted its last sugar crop on the island of Maui. This 144-year-old vestige of the old colonial sugar production system has determined that it is no longer profitable to use its stolen land for sugar monoculture and instead intends to diversify farming operations into new commodities like sorghum, fruits, and biomass. Native Hawaiian activist Tiare Lawrence has called for the land to be returned to the people of Maui so that more sustainable agricultural practices, like family-scale organic farming and agroforestry, can be implemented. Given the role that sugar plays in the ongoing metabolic crises, both at the level of individual biology and social ecology, this development in Maui signals possibilities for what can take place when institutions that produce inequality are (at least potentially) dismantled and replaced by locally organized social systems that aim to benefit the common people.

These seemingly disparate stories are connected through profound political transformations that link biomedicine and agriculture together. These developments illuminate the biopolitics of metabolism, a term that encompasses the ideas, social practices, and institutional relationships that govern the metabolic health of individuals and groups. In the biopolitics of metabolism, we essentially have a political problem that gets dressed up as a scientific problem, but we have to recognize that the scientific is always a political problem. The convergence of the individualization of metabolism and the concealment of its social dimensions has created a context in which solutions to metabolic crises are increasingly understood as a matter of either more technoscientific medicine or more transparent government regulation. But these two pathways have always worked together as mechanisms of biosocial control. Perhaps we should start crafting new political stories about metabolism that help to break this pattern of understanding.

-------

Anthony Ryan Hatch is author of Blood Sugar: Racial Pharmacology and Food Justice in Black America. He is assistant professor in the Science in Society Program at Wesleyan University.

"Bearing personal witness from the frontiers of the quantified self, Anthony Ryan Hatch offers a reimagining of metabolism as a form of social knowledge. Blood Sugar makes a key contribution to our understanding of the evolution of racial health disparities."
—Alondra Nelson, author of
The Social Life of DNA and Body and Soul

Wednesday, April 27, 2016

Racial justice, American exceptionalism, and speculative fiction



BY ANDRÉ CARRINGTON
Assistant professor of English at Drexel University


In the 21st century, society has grown to rely on the axiom that “race” is a lie. For some people, out of paranoia or a desire to avoid conflict, touting the knowledge that race is socially constructed is a way of declaring that ignorance about what it means is willful. For the rest of us, knowing that the disparities causing us to live and die in painfully different ways stem from irrational pseudo-science is just an insult piled on top of injuries.

We all deal with the fictions on which white supremacy is founded and the fantasies that aim to rationalize the subordination of everyone else in different ways. Doctors and nurses convince themselves that Black people feel less pain or tolerate it more. White grade-school teachers commend the talents of Black girls and boys at lower rates than those of their white peers. Borrowing language from the role-playing game Dungeons & Dragons, Ta-Nehisi Coates describes the mass incarceration that consigns so many Black people to unfreedom in the post-Civil Rights era as The Gray Wastes.

And now things get queer: the critique of state-sanctioned racial violence meets the repertoire of fantasy and gaming. How can you write about life and death in the obscure rhetoric of a teenage diversion? How can a trivial hobby provide the words we need to shake the serious-minded know-it-alls wringing our hands about crime and the Black family out of our conventional wisdom? The Gray Wastes is a compelling topos within Dungeons & Dragons, according to a review that Coates cites, because its terror “erodes the sense of purpose that is the hallmark of an alignment-based philosophy.” The place where strongly focused evil resides is so thoroughly suffused with the meanings ascribed to the category of the unjust that this intangible moral quality becomes a spatial and temporal reality, precipitating down from the realm of abstraction to soak everything in a cold, aching despair. Coates finds this highly evocative metaphor powerful enough to describe what prison does to African American families. Our carceral society discolors your life even when you get back to living it. Legal discrimination against ex-offenders cuts off your access to a fulfilling livelihood and civic participation, and state-sanctioned exploitation strains every relationship you’d hope to maintain with lovers, family, and friends—none of whom will ever look at you the same way again. It’s a fate not unlike what Orlando Patterson termed Social Death—a state of “natal alienation” or displacement from the bonds of community, time, and space—which eerily intersects with the lack of a sense of futurity that animates (or paralyzes) some branches of queer theory.

Each of these conceits—the Gray Wastes, Social Death, antisocial politics—lends credence to a hypothesis that I call “the speculative fiction of Blackness”: the notion that Black people might populate discourses of impossibility, haunting, death-defying, and the otherworldly as a matter of course. The notion that the supernatural should come naturally to descendants of enslaved Africans in the Americas is a corollary to operations of white supremacy in culture that positions Black people as freaks of nature, not quite up to full participation in the Age of Reason. Alain Locke called it: “For generations in the mind of America, the Negro has been more of a formula than a human being.” Richard Wright called it: “The Negro is America’s metaphor.” Toni Morrison called it: in American literature, race has become “metaphorical—a way of referring to and disguising forces, events, classes, and expressions of social decay and economic division far more threatening to the body politic than biological ‘race’ ever was.” Hortense Spillers called it when she said that black women are “the beached whales of the sexual universe, unvoiced, misseen, not doing, awaiting their verb.” Speculation from the hollows where Black genius resides produces poignant reconstructions of the past like Julie Dash’s Daughters of the Dust as well as prophetic polemics like Public Enemy’s Fear of a Black Planet. By eschewing the codes of modern social scientific realism, imaginative cultural production allows Black thinkers and dreamers to lay claim to the speculative fiction of Blackness on their own terms.

In some crucial respects, the speculative fiction of Blackness takes exception to the richly allegorical gestures of fantasy, science fiction, roleplaying games, and horror. With the lines between good and evil drawn in such stark metaphorical terms, you might expect that the millions of white Americans who came of age playing games like D&D—the people for whom an allusion to “the Gray Wastes” is most intelligible—would become the staunchest allies in the fight against police brutality, prison-based gerrymandering, and other forms of institutional racism. But you know that did not happen. You might think that a society realizing the wildest dreams of our forebears, knowing that race is no biological reality but a social fact, would harness the power of the imagination to confront the most intractable problems we have ever faced in novel ways.

As a humanities scholar, I am concerned about ostensibly conscientious contributions to social and political thought in popular culture. When fictions of social transformation don’t defer to the vast body of antiracist knowledge in the modern world—or worse, when they diminish it, draw it in caricature, or reduce it to its image—extrapolations on the nature of racial conflict fail utterly at their social task. A similar pattern lays the groundwork for struggles over the meaning of gendered and sexual difference in the genre: compared with their feminist counterparts who devote their entire lives to understanding the complexities of patriarchy, gender, and sexuality, anti-feminist writers who don’t believe in or don’t understand the critique of patriarchy do a terrible job articulating what the far-fetched possibilities of their fictions mean for the respective roles of women and men. The problem is the same: when the metaphor eclipses its subtext, it mystifies rather than demystifies.

With few exceptions, the story SF tells about itself recapitulates conventional tendencies when it comes to race thinking, because it is coextensive with the structures and traditions of cultural production that characterize the society in which we live. White supremacy is among the most enduring of those traditions. My term for the default setting of the relationship between race and genre is “the whiteness of science fiction.” Struggle though we might to comprehend the alterity of the genre in countercultural terms, without adopting a critique of racism that actually attends to the priorities of antiracist intellectuals and the social formations we come from, SF writers don’t enjoy any special purchase on the repertoire of cultural practices that will lead us out of our present when it comes to the racial status quo. Where a transformative vision of racial justice or a resonant meditation on being brown, postcolonial, or Diasporic shows itself in literature and other media, you can trust that vision is indebted to the deep roots of speculation in communities of color.

-------

andré carrington is author of Speculative Blackness: The Future of Race in Science Fiction. He is assistant professor of English at Drexel University.

Monday, March 28, 2016

On freegans, pre-peeled oranges, and ethical consumer ‘Whack-A-Mole’

Photo courtesy of the author.

BY ALEX V. BARNARD
Food justice activist and doctoral candidate in sociology at the University of California, Berkeley


Whole Foods felt the wrath of the Twittersphere this month. The episode started with consumers questioning the company’s ethical bona fides but, in the end, cast into doubt the effectiveness of “ethical consumerism” itself. It’s another example of the unlikely lessons, recounted in my new book Freegans: Diving into the Wealth of Food Waste in America, we can learn from looking at the cast-offs of our food system—in this case, orange peels.

“OrangeGate” started when a shopper in a London Whole Foods tweeted a picture of a new product: pre-peeled, individually packaged oranges. She acerbically remarked, “If only nature would find a way to cover these oranges so we didn't need to waste so much plastic on them.” Online outrage ensued and the photo was retweeted 110,000 times. Whole Foods eventually declared that the innovation was, in fact, “a mistake” and promised that the oranges had been “pulled” from the shelves.

It was a modest win, but one worth celebrating. The problem is not so much the packaging the oranges were being sold in—plastic—as the one they were not—the peel. Stores like Whole Foods pre-cut, pre-peel, and pre-cook food in order to increase selling prices in a competitive market where the raw materials—that is, the food itself—are unprecedentedly cheap. These practices “add value” for the company, but they subtract from foods’ shelf life—contributing to the U.S.’s 50% increase in food waste per capita since the 1970s. The environmental impact of the water, fertilizer, and land area that goes into a piece of produce we don’t eat is often even worse than the more visible waste of excess packaging.

But environmental activists’ celebrations were cut short. Disability-rights advocates pointed out that, although certainly not the intention behind their introduction, Whole Foods’ pre-peeled oranges (and pre-cut foods in general) are a boon for people with limited hand dexterity or arthritis. That this had not been taken into consideration showed that “protesters prioritized the environment over the experiences of disabled people.” Some environmentalists conceded that these counter-critiques had a point.

But this leaves those of us still following the story at an impasse. What do we as consumers do if we care both about the environment and accessibility? The problem, I argue, is the assumption behind the question itself: that we should confront social problems as individual consumers.

As OrangeGate showed, ethical consumerism is a bit like Whack-A-Mole. As the one-time consumer activists I interviewed for my book realized, we often solve one problem in our lifestyle only to uncover three new ones. It’s the frustration that confronts vegans who discover that the production of vegetable crops on industrialized farms kills billions of small animals; conscientious shoppers who start buying from “natural” foods stores like Trader Joe’s or Whole Foods only to find out that their promises of perfect food often make them the most wasteful; people who patronize their local farmers’ market religiously until they learn that small producers, too, can have abusive labor practices.

These realizations can lead to disenchantment—or to a creative rethinking of what effective activism is all about. My book centers on one group of people who embarked as individuals on a search for ways to live both ethically and efficaciously and wound up adopting a new and poorly understood collective approach: “freeganism.” Freeganism, as one of my respondents told me, is a kind of “practicing anti-capitalism” in which participants refuse, as much as possible, to buy anything. Freeganism is founded on the idea that every product in an increasingly unregulated market economy—even those labeled “organic,” “fair trade,” or “humane slaughter”—can be traced back to abuses of one kind or another.

Instead, freegans recover the unused or wasted resources left behind by our capitalist economy and repurpose them toward putting in place alternative values of mutual aid and sustainability. Freegans are best known for “dumpster diving”: taking food discarded by supermarkets and redistributing it for free. Nonetheless, freeganism isn’t just about recovering wasted food: the group I studied in New York also ran a free bike workshop, organized “skill-shares” to train people to repair discarded textiles or forage for wild food in city parks, and helped set up “really really free markets” to circulate unneeded goods that might seem like “waste” to one person but were really useful “wealth” to others.

The practices of freeganism are, unsurprisingly, not exactly appealing to everyone (although resistance tends to soften as soon as one sees a dumpster full of one-day-expired bags of premium coffee or an entire trash bag filled with still-warm donuts). Nor is freeganism an accessible practice for a large part of the population. I outline in my book how the movement struggled to overcome barriers to participation related to class, race, and ability, as well as pushback from stores concerned about the negative publicity (so much so that some started bleaching their garbage and guarding their dumpsters!).

But the point of freeganism was never to convince everyone to jump into a dumpster. Instead, freegans use the provocation of uncovering waste—“Did a store really throw that out?”—to question the assumptions behind ethical consumption. As freegans constantly reminded me, money in a capitalist economy is fungible; even if you buy the oranges in the peel at Whole Foods, you’re still supporting a business model that produces pre-peeled oranges and sells them at double the price. Moreover, “free markets” are anything but efficient in translating consumer demands into concrete changes. The products we boycott are often still produced, but then thrown out—and the proof is in the dumpsters themselves (where those Whole Foods oranges “pulled” from the shelves—alongside the approximately 90% of new food products introduced each year which “fail”—probably wound up).

But perhaps, most importantly, freegans remind us that the question “How can I make the world a better place?” should never be reduced to “What should I buy?” Freegans avoid wasting food not by buying fancier packaging, but by sharing with one another when they have too much; when they can’t find the right ingredients, they don’t rush to the store, but they cook together. As soon as we look at our food system as something we are constantly producing as a group—whether through gardening, foraging, or gleaning—some of the problems confronting us when we see ourselves only as consumers, left to manage as best we can with whatever the market makes available to us, disappear. Not being able to peel oranges, after all, is only a problem when we expect people to shop, prepare, and eat their meals alone—rather than, as freegans did, treating feeding ourselves as a collective effort in which everyone has something (literally and figuratively) to bring to the table.

-------

Alex V. Barnard is author of Freegans: Diving into the Wealth of Food Waste in America. He is a doctoral candidate in sociology at the University of California, Berkeley, and a food justice activist.

Thursday, March 17, 2016

The 1939–40 New York World's Fair publicly launched the idea of television and showed what it could do.

This publicity photograph from RCA emphasizes the wealth and prestige of the first television viewers posed in front of the TRK-12 RCA receiver. Courtesy of the Hagley Museum and Library.


BY DANIELLE SHAPIRO


Today, we take television for granted. It is everywhere, in different sizes and shapes, in our pockets and on our living room walls. It is ubiquitous.

The idea of the television we know today was introduced to the American public in the 1920s and then more as a reality in the 1930s. John Vassos (1898-1985), an American artist and the Radio Corporation of America’s lead consultant industrial designer, played a critical role at the start of the television age, creating a shape for the first mass-produced televisions in America. RCA, a dominant force in radio production and broadcasting through its affiliate NBC, was a leading innovator in television technology and manufacturing. Vassos’s televisions were introduced in a big splashy presentation at the 1939-40 New York World’s Fair and at the Golden Gate International Exposition in San Francisco, also in 1939 and 1940. RCA’s broadcast of the opening of the World’s Fair marked the commencement of regular television broadcasts in North America and was the first opportunity for a large public to see television in operation. Vassos’s earliest television receivers used the then-futuristic idiom of streamlining to create a receiver design that became outdated before it hit the market.

The large TRK-12 receiver was named for its 12-inch screen. Vassos had been challenged to find an appropriate cabinet for RCA’s newest and most ambiguous technology. Indeed, the terminology “television” was still so new that the machine had not received its name and “receiver” more accurately described what the machine did: capture the transmission of television. RCA was unsure how to promote the new medium. Was it radio with pictures or something else?

The challenge of creating a shape for television forced Vassos to consider issues affecting design for the home for a truly new technology. The freestanding unit’s large mechanical parts posed a design difficulty. He chose to integrate some elements of radio in a design that is now considered a classic. The TRK-12’s importance to RCA cannot be overstated. It suggests an instance of a visionary design worthy of study despite its failure in the marketplace due to a disruption in production during the war and advances in television technology that soon replaced the cumbersome receiver.

RCA, a leader in television manufacturing, introduced its premier technology at the 1939-40 New York World’s Fair. The fair’s opening speech by Franklin Delano Roosevelt on April 30, 1939, marks the advent of regularly scheduled broadcast programming in the United States. It aired on W2XBS, NBC’s experimental television station in New York. The ceremony was watched by several hundred people on TV receivers inside the RCA Pavilion at the fairgrounds as well as on receivers installed on the 62nd floor of Radio City. Ten hours of programming, including shows from the NBC studio in Radio City, were played on the multiple receivers housed in the RCA Pavilion.


RCA - Harvey Gibson, "Miss Television," and James E. Robert stand with an earlier iteration of the television, c. 1935-45. Image from Manuscripts and Archives Division, New York Public Library.



This striking receiver was the first thing that visitors saw upon entering the RCA pavilion, with Stuart Davis’s wall mural as a backdrop, illuminated by natural light from a spectacular glass curtain. The Phantom TRK-12, the one-of-a-kind Lucite version of one of four mass-marketed televisions available in department stores, was mounted on a circular stage and surrounded by smooth curving metal bars. The object was like an exotic, beautiful, and perhaps dangerous animal held back by a cage. The effect on visitors is clear in the photo where they gaze at the receiver, not quite knowing what to make of it. Journalist Orrin E. Dunlap remarked on the rarity and cost of the object and the difficulty posed in mass producing it. He wrote:

by inspecting a special television set built in a glass cabinet visitors at Flushing have the opportunity to observe the complexity of the radio-sight chassis and why the machines are priced from $200 to $1,000. To see this gleaming glass encased instrument is to realize what a trick it is ahead to swing such an intricate outfit into mass production. It is evidence that the manufacturer as well as the showman has been tossed a challenge by the research experts who now anxiously watch to see what artistry can do with the new giant long-range eyes. (New York Times, May 7, 1939.)


An article shows curious spectators surrounding TRK-12 televisions at various locations in the New York City region. Broadcast News, July 1939. Courtesy of the Hagley Museum and Library.


RCA released the TRK-12 for sale soon after the start of the fair. Stores were mobbed with bystanders—an image of a televised crowd watching television that would be replicated again and again. In many ways, the World’s Fair was the ideal place to unveil a new technology. The exhibit, like the fair as a whole, was meant to express the values of freedom from scarcity and hope for a future saved by technology and administered by big business. The timing was right for a look at the future since the country had endured a decade of depression and was on the verge of joining a war that had already erupted in Europe. These fears and hopes were expressed succinctly by President Roosevelt's opening address at the fair: “The eyes of the United States are fixed on the future.” This was literally the case as thousands of people at the Fair and at department stores around New York City watched and listened to his address on the new medium of television and dreamed of a new tomorrow.

-------

Danielle Shapiro is author of John Vassos: Industrial Design for Modern Life. She is an independent scholar who has served as senior program officer in the Division of Public Programs at the National Endowment for the Humanities. She earned her PhD in art history and communications studies from McGill University.


"John Vassos is a complex portrait of an artist and designer whose early illustration work criticized the tempo and commercialism of modern life but whose later design work took for granted those same qualities and attempted to accommodate people to them."
—Jeffrey L. Meikle, University of Texas at Austin

Thursday, March 10, 2016

Disagreement abounds about the best way to serve deaf children.




BY LAURA MAULDIN
Assistant professor of human development/family studies and women's, gender and sexuality studies at the University of Connecticut



A common argument for using sign language with hearing babies is that it would have benefits that are practical (less fussing), emotional (creates a closer parent bond), and cognitive (boosts brain development). “Fewer tantrums and more fun!” claims the website Baby Sign Language. Using sign with babies has become popular in recent years, and many claim it facilitates communication—which is key for healthy development. There is a dearth of research to support these claims.

In perhaps the greatest irony for deaf children, signing is often discouraged in the United States—which also happens to be the world’s largest medical device market. As I outline in my book, Made to Hear, the cochlear implant (CI) has become more frequently used in deaf children, and clinics often give parents the opposite of the advice given to parents of hearing children: signing is risky and will impede your child’s speech development. But there is a dearth of research to support these claims, too, and there is even research suggesting that the opposite is true and that sign may actually aid the spoken language development of children with CIs. While my book outlines these patterns of advice to mothers, particularly when it comes to newborns and infants, I have since turned my focus to examining what becomes of deaf children who are denied sign language early in life. What about children who may in fact have benefited from access to sign? How do these clinical practices translate into educational practices, and what problems arise as a result?

On March 2, 2016, a coalition of supporters for a recent bill introduced in the House of Representatives, H.R. 3535, also known as the Macy-Cogswell Act, converged on Washington, DC. According to attendee Jeff Bravin, more than 100 people showed up to advocate for new laws that they believe would better support deaf, as well as blind, students. While some students are indeed deaf-blind (that is, have both visual and hearing impairments), the partnership between educators who teach deaf children and educators who teach blind children came out of necessity. On their own, a coalition of deaf education-related organizations was unable to garner the support needed for the bill. But once these organizations joined forces with stakeholders in education for blind or visually impaired children, the Macy-Cogswell Act was eventually introduced. The meetings that took place on March 2 resulted in a new co-sponsor of the bill, Representative Larson, D-Connecticut.

The Macy-Cogswell Act is described by the Conference of Educational Administrators of Schools and Programs for the Deaf (CEASD) as necessary for ensuring that deaf students get the specialized teaching they need in a timely manner. This organization is made up of those who educate children in deaf schools; that is, they work with deaf children in a sign language-friendly environment and tend to promote bilingualism in both English and American Sign Language (ASL). The push for “appropriate” assessment of children’s language or instructional needs partly comes from patterns like this: The average age of enrollment in the American School for the Deaf (ASD) is between 12 and 14 years. A portion of these students (including some with CIs) had been denied meaningful access to sign in their prior placement in an effort to focus on spoken language. Deaf schools are then enrolling children who are far behind, but who could have been given access to ASL earlier. We know that language deprivation has devastating effects on development, but without comprehensive, aggregate empirical data on how those children ended up being placed at ASD, where they were before, and what spoken language skills they brought with them, it is hard to make evidence-based claims about which practices are best. Nevertheless, those who are on the ground in deaf schools on a daily basis argue clearly that more sign language access is needed and that it is needed sooner. This group does not articulate an argument to exclude speech, only an argument to include sign. Would it be possible to add ASL as a viable tool in the clinical context or earliest educational placement, especially since there is no evidence to support the claims that sign impedes speech?

The Alexander Graham Bell Association (AGB) characterizes the legislative efforts regarding Macy-Cogswell as “well intentioned,” but critiques such laws because they “frequently have language that focuses on access and rights to sign language which is not relevant to the majority of students who are deaf and hard of hearing in the public schools.” It goes on to note that 52% of deaf children are pursuing a spoken language approach. AGB is the largest organization in the US that promotes a “Listening and Spoken Language” approach and even has certificate programs that train teachers in such approaches. It suggests a number of provisions that should be included in the laws instead, including setting its certificate in listening and spoken language as the standard for providers who work with deaf children. In its view, ASL is irrelevant.


These two groups could not be more different. This pattern of viewpoints is nothing new when it comes to the question of what we should do about deaf children. Do they need to overcome their deafness through medical and educational intervention that could give them a better ability to hear and speak? Or should their deafness be accepted and addressed through a visual language like American Sign Language (ASL), with all other developmental and educational milestones achieved through this method? There never seems to be agreement over which is the ‘right’ answer, and there is no formula for knowing which is the best route for all deaf children. But what if we stopped assuming it was an either/or question? What if the CI and ASL were standard and utilized in tandem? It seems the biggest barrier to combining efforts to serve deaf children is professionals’ refusal to hear each other.

-------

Laura Mauldin is assistant professor of human development/family studies and women's, gender, and sexuality studies at the University of Connecticut. She is author of Made to Hear: Cochlear Implants and Raising Deaf Children.