Wednesday, April 27, 2016

Racial justice, American exceptionalism, and speculative fiction

Assistant professor of English at Drexel University

In the 21st century, society has grown to rely on the axiom that “race” is a lie. For some people, out of paranoia or a desire to avoid conflict, touting the knowledge that race is socially constructed is a way of declaring that ignorance about what it means is willful. For the rest of us, knowing that the disparities causing us to live and die in painfully different ways stem from irrational pseudo-science is just an insult piled on top of injuries.

We all deal with the fictions on which white supremacy is founded and the fantasies that aim to rationalize the subordination of everyone else in different ways. Doctors and nurses convince themselves that Black people feel less pain or tolerate it more. White grade-school teachers commend the talents of Black girls and boys at lower rates than those of their white peers. Borrowing language from the role-playing game Dungeons & Dragons, Ta-Nehisi Coates describes the mass incarceration that consigns so many Black people to unfreedom in the post-Civil Rights era as The Gray Wastes.

And now things get queer: the critique of state-sanctioned racial violence meets the repertoire of fantasy and gaming. How can you write about life and death in the obscure rhetoric of a teenage diversion? How can a trivial hobby provide the words we need to shake the serious-minded know-it-alls wringing their hands about crime and the Black family out of our conventional wisdom? The Gray Wastes is a compelling topos within Dungeons & Dragons, according to a review that Coates cites, because its terror “erodes the sense of purpose that is the hallmark of an alignment-based philosophy.” The place where strongly focused evil resides is so thoroughly suffused with the meanings ascribed to the category of the unjust that this intangible moral quality becomes a spatial and temporal reality, precipitating down from the realm of abstraction to soak everything in a cold, aching despair. Coates finds this highly evocative metaphor powerful enough to describe what prison does to African American families. Our carceral society discolors your life even when you get back to living it. Legal discrimination against ex-offenders cuts off your access to a fulfilling livelihood and civic participation, and state-sanctioned exploitation strains every relationship you’d hope to maintain with lovers, family, and friends—none of whom will ever look at you the same way again. It’s a fate not unlike what Orlando Patterson termed Social Death—a state of “natal alienation” or displacement from the bonds of community, time, and space—which eerily intersects with the lack of a sense of futurity that animates (or paralyzes) some branches of queer theory.

Each of these conceits—the Gray Wastes, Social Death, antisocial politics—lends credence to a hypothesis that I call “the speculative fiction of Blackness”: the notion that Black people might populate discourses of impossibility, haunting, death-defying, and the otherworldly as a matter of course. The notion that the supernatural should come naturally to descendants of enslaved Africans in the Americas is a corollary to operations of white supremacy in culture that position Black people as freaks of nature, not quite up to full participation in the Age of Reason. Alain Locke called it: “For generations in the mind of America, the Negro has been more of a formula than a human being.” Richard Wright called it: “The Negro is America’s metaphor.” Toni Morrison called it: in American literature, race has become “metaphorical—a way of referring to and disguising forces, events, classes, and expressions of social decay and economic division far more threatening to the body politic than biological ‘race’ ever was.” Hortense Spillers called it when she said that black women are “the beached whales of the sexual universe, unvoiced, misseen, not doing, awaiting their verb.” Speculation from the hollows where Black genius resides produces poignant reconstructions of the past like Julie Dash’s Daughters of the Dust as well as prophetic polemics like Public Enemy’s Fear of a Black Planet. By eschewing the codes of modern social scientific realism, imaginative cultural production allows Black thinkers and dreamers to lay claim to the speculative fiction of Blackness on their own terms.

In some crucial respects, the speculative fiction of Blackness takes exception to the richly allegorical gestures of fantasy, science fiction, roleplaying games, and horror. With the lines between good and evil drawn in such stark metaphorical terms, you might expect that the millions of white Americans who came of age playing games like D&D—the people for whom an allusion to “the Gray Wastes” is most intelligible—would become the staunchest allies in the fight against police brutality, prison-based gerrymandering, and other forms of institutional racism. You might think that a society realizing the wildest dreams of our forebears, knowing that race is no biological reality but a social fact, would harness the power of the imagination to confront the most intractable problems we have ever faced in novel ways. But you know that did not happen.

As a humanities scholar, I am concerned about ostensibly conscientious contributions to social and political thought in popular culture. When fictions of social transformation don’t defer to the vast body of antiracist knowledge in the modern world—or worse, when they diminish it, draw it in caricature, or reduce it to its image—extrapolations on the nature of racial conflict fail utterly at their social task. A similar pattern lays the groundwork for struggles over the meaning of gendered and sexual difference in the genre: compared with their feminist counterparts who devote their entire lives to understanding the complexities of patriarchy, gender, and sexuality, anti-feminist writers who don’t believe in or don’t understand the critique of patriarchy do a terrible job articulating what the far-fetched possibilities of their fictions mean for the respective roles of women and men. The problem is the same: when the metaphor eclipses its subtext, it mystifies rather than demystifying.

With few exceptions, the story SF tells about itself recapitulates conventional tendencies when it comes to race thinking, because it is coextensive with the structures and traditions of cultural production that characterize the society in which we live. White supremacy is among the most enduring of those traditions. My term for the default setting of the relationship between race and genre is “the whiteness of science fiction.” Struggle though we might to comprehend the alterity of the genre in countercultural terms, without adopting a critique of racism that actually attends to the priorities of antiracist intellectuals and the social formations we come from, SF writers don’t enjoy any special purchase on the repertoire of cultural practices that will lead us out of our present when it comes to the racial status quo. Where a transformative vision of racial justice or a resonant meditation on being brown, postcolonial, or Diasporic shows itself in literature and other media, you can trust that vision is indebted to the deep roots of speculation in communities of color.


andré carrington is author of Speculative Blackness: The Future of Race in Science Fiction. He is assistant professor of English at Drexel University.

Monday, March 28, 2016

On freegans, pre-peeled oranges, and ethical consumer ‘Whack-A-Mole’

Photo courtesy of the author.

Food justice activist and doctoral candidate in sociology at the University of California, Berkeley

Whole Foods felt the wrath of the Twitter-sphere this month. The episode started with consumers questioning the company’s ethical bona fides but, in the end, cast into doubt the effectiveness of “ethical consumerism” itself. It’s another example of the unlikely lessons, recounted in my new book Freegans: Diving Into the Wealth of Food Waste in America, that we can learn from looking at the cast-offs of our food system—in this case, orange peels.

“OrangeGate” started when a shopper in a London Whole Foods tweeted a picture of a new product: pre-peeled, individually packaged oranges. She acerbically remarked, “If only nature would find a way to cover these oranges so we didn't need to waste so much plastic on them.” Online outrage ensued and the photo was retweeted 110,000 times. Whole Foods eventually declared that the innovation was, in fact, “a mistake” and promised that the oranges had been “pulled” from the shelves.

It was a modest win, but one worth celebrating. The problem is not so much the packaging the oranges were being sold in—plastic—as the one they were not—the peel. Stores like Whole Foods pre-cut, pre-peel, and pre-cook food in order to increase selling prices in a competitive market where the raw materials—that is, the food itself—are unprecedentedly cheap. These practices “add value” for the company, but they subtract from foods’ shelf life—contributing to the U.S.’s 50% increase in food waste per capita since the 1970s. The environmental impact of the water, fertilizer, and land area that goes into a piece of produce we don’t eat is often even worse than the more visible waste of excess packaging.

But environmental activists’ celebrations were cut short. Disability-rights advocates pointed out that, although certainly not the intention behind their introduction, Whole Foods’ pre-peeled oranges (and pre-cut foods in general) are a boon for people with limited hand dexterity or arthritis. That this had not been taken into consideration showed that “protesters prioritized the environment over the experiences of disabled people.” Some environmentalists conceded that these counter-critiques had a point.

But this leaves those of us still following the story at an impasse. What do we as consumers do if we care both about the environment and accessibility? The problem, I argue, is the assumption behind the question itself: that we should confront social problems as individual consumers.

As OrangeGate showed, ethical consumerism is a bit like Whack-A-Mole. As the one-time consumer activists I interviewed for my book realized, we often solve one problem in our lifestyle only to uncover three new ones. It’s the frustration that confronts vegans who discover that the production of vegetable crops on industrialized farms kills billions of small animals; conscientious shoppers who start buying from “natural” foods stores like Trader Joe’s or Whole Foods only to find out their promises of perfect food often make them the most wasteful; people who patronize their local farmers’ market religiously until they learn that small producers, too, can have abusive labor practices.

These realizations can lead to disenchantment—or to a creative rethinking of what effective activism is all about. My book centers on one group of people who embarked as individuals on a search for ways to live both ethically and efficaciously and wound up adopting a new and poorly understood collective approach: “freeganism.” Freeganism, as one of my respondents told me, is a kind of “practicing anti-capitalism” in which participants refuse, as much as possible, to buy anything. Freeganism is founded on the idea that every product in an increasingly unregulated market economy—even those labeled “organic,” “fair trade,” or “humane slaughter”—can be traced back to abuses of one kind or another.

Instead, freegans recover the unused or wasted resources left behind by our capitalist economy and repurpose them toward putting in place alternative values of mutual aid and sustainability. Freegans are best known for “dumpster diving”: taking food discarded by supermarkets and redistributing it for free. Nonetheless, freeganism isn’t just about recovering wasted food: the group I studied in New York also ran a free bike workshop, organized “skill-shares” to train people to repair discarded textiles or forage for wild food in city parks, and helped set up “really really free markets” to circulate unneeded goods that, for one person, might seem like “waste” but, for another, were really useful “wealth.”

The practices of freeganism are, unsurprisingly, not exactly appealing to everyone (although resistance tends to soften as soon as one sees a dumpster full of one-day-expired bags of premium coffee or an entire trash bag filled with still-warm donuts). Nor is freeganism an accessible practice to a large part of the population. I outline in my book how the movement struggled to overcome barriers to participation related to class, race, and ability, as well as pushback from stores concerned about the negative publicity (so much so that some started bleaching their garbage and guarding their dumpsters!).

But the point of freeganism was never to convince everyone to jump into a dumpster. Instead, freegans use the provocation of uncovering waste—“Did a store really throw that out?”—to question the assumptions behind ethical consumption. As freegans constantly reminded me, money in a capitalist economy is fungible; even if you buy the oranges in the peel at Whole Foods, you’re still supporting a business model that produces pre-peeled oranges and sells them at double the price. Moreover, “free markets” are anything but efficient in translating consumer demands into concrete changes. The products we boycott are often still produced, but then thrown out—and the proof is in the dumpsters themselves (where those Whole Foods oranges “pulled” from the shelves—alongside the approximately 90% of new food products introduced each year which “fail”—probably wound up).

But perhaps, most importantly, freegans remind us that the question “How can I make the world a better place?” should never be reduced to “What should I buy?” Freegans avoid wasting food not by buying fancier packaging, but by sharing with one another when they have too much; when they can’t find the right ingredients, they don’t rush to the store, but they cook together. As soon as we look at our food system as something we are constantly producing as a group—whether through gardening, foraging, or gleaning—some of the problems confronting us when we see ourselves only as consumers, left to manage as best we can with whatever the market makes available to us, disappear. Not being able to peel oranges, after all, is only a problem when we expect people to shop, prepare, and eat their meals alone—rather than, as freegans did, treating feeding ourselves as a collective effort in which everyone has something (literally and figuratively) to bring to the table.


Alex V. Barnard is author of Freegans: Diving into the Wealth of Food Waste in America. He is a doctoral candidate in sociology at the University of California, Berkeley, and a food justice activist.

Thursday, March 17, 2016

The 1939–40 New York World's Fair introduced the American public to television and what it could do.

This publicity photograph from RCA emphasizes the wealth
and prestige of the first television viewers posed in front of the
TRK-12 RCA receiver.
Courtesy of the Hagley Museum and Library. 


Today, we take television for granted. It is everywhere, in different sizes and shapes, in our pockets and on our living room walls. It is ubiquitous.

The idea of the television we know today was introduced to the American public in the 1920s and then more as a reality in the 1930s. John Vassos (1898-1985), an American artist and the Radio Corporation of America’s lead consultant industrial designer, played a critical role at the start of the television age, creating a shape for the first mass-produced televisions in America. RCA, a dominant force in radio production and broadcasting through its affiliate NBC, was a leading innovator in television technology and manufacturing. Vassos’s televisions were introduced in a big splashy presentation at the 1939-40 New York World’s Fair and at the Golden Gate International Exposition in San Francisco, also in 1939 and 1940. RCA’s broadcast of the opening of the World’s Fair marked the commencement of regular television broadcasts in North America and was the first opportunity for a large public to see television in operation. Vassos’s earliest television receivers used the then-futuristic idiom of streamlining to create a receiver design that became outdated before it hit the market.

The large TRK-12 receiver was named for its 12-inch screen. Vassos had been challenged to find an appropriate cabinet for RCA’s newest and most ambiguous technology. Indeed, the terminology “television” was still so new that the machine had not received its name and “receiver” more accurately described what the machine did: capture the transmission of television. RCA was unsure how to promote the new medium. Was it radio with pictures or something else?

The challenge of creating a shape for television forced Vassos to consider issues affecting design for the home for a truly new technology. The freestanding unit’s large mechanical parts posed a design difficulty. He chose to integrate some elements of radio in a design that is now considered a classic. The TRK-12’s importance to RCA cannot be overestimated. It suggests an instance of a visionary design worthy of study despite its failure in the marketplace due to a disruption in production during the war and advances in television technology that soon replaced the cumbersome receiver.

RCA, a leader in television manufacturing, introduced its premier technology at the 1939-40 New York World’s Fair. The fair’s opening speech by Franklin Delano Roosevelt on April 30, 1939, marks the advent of regularly scheduled broadcast programming in the United States. It aired on W2XBS, NBC’s experimental station in New York. The ceremony was watched by several hundred viewers on TV receivers inside the RCA Pavilion at the fairgrounds as well as on receivers installed on the 62nd floor of Radio City. Ten hours of programming, including shows from the NBC studio in Radio City, were played on the multiple receivers housed in the RCA Pavilion.

RCA - Harvey Gibson, "Miss Television," and James E. Robert
stand with an earlier iteration of the television, c. 1935-45.
Image from Manuscripts and Archives Division, New York Public Library.

This striking receiver was the first thing that visitors saw upon entering the RCA pavilion along with Stuart Davis’s wall mural as a backdrop and illuminated by natural light from a spectacular glass curtain. The Phantom TRK-12, the one-of-a-kind Lucite version of one of four mass-marketed televisions available in department stores, was mounted on a circular stage and surrounded by smooth curving metal bars. The object was like an exotic, beautiful, and perhaps dangerous animal held back by a cage. The effect on visitors is clear in the photo where they gaze at the receiver, not quite knowing what to make of it. Journalist Orrin E. Dunlap remarked on the rarity and cost of the object and the difficulty posed in mass producing it. He wrote:

by inspecting a special television set built in a glass cabinet visitors at Flushing have the opportunity to observe the complexity of the radio-sight chassis and why the machines are priced from $200 to $1,000. To see this gleaming glass encased instrument is to realize what a trick it is ahead to swing such an intricate outfit into mass production. It is evidence that the manufacturer as well as the showman has been tossed a challenge by the research experts who now anxiously watch to see what artistry can do with the new giant long-range eyes. (New York Times, May 7, 1939.)

An article shows curious spectators surrounding
TRK-12 televisions at various locations in the
New York City region.
Broadcast News, July 1939.
Courtesy of the Hagley Museum and Library.

RCA released the TRK-12 for sale soon after the start of the fair. Stores were mobbed with bystanders—an image of a televised crowd watching television that would be replicated again and again. In many ways, the World’s Fair was the ideal place to unveil a new technology. The exhibit, like the fair as a whole, was meant to express the values of freedom from scarcity and hope for a future saved by technology and administered by big business. The timing was right for a look at the future since the country had endured a decade of depression and was on the verge of joining a war that had already erupted in Europe. These fears and hopes were expressed succinctly by President Roosevelt's opening address at the fair: “The eyes of the United States are fixed on the future.” This was literally the case as thousands of people at the Fair and at department stores around New York City watched and listened to his address on the new medium of television and dreamed of a new tomorrow.


Danielle Shapiro is author of John Vassos: Industrial Design for Modern Life. She is an independent scholar who has served as senior program officer in the Division of Public Programs at the National Endowment for the Humanities. She earned her PhD in art history and communications studies from McGill University.

"John Vassos is a complex portrait of an artist and designer whose early illustration work criticized the tempo and commercialism of modern life but whose later design work took for granted those same qualities and attempted to accommodate people to them."
—Jeffrey L. Meikle, University of Texas at Austin

Thursday, March 10, 2016

Disagreement abounds about the best way to serve deaf children.

Assistant professor of human development/family studies and women's, gender and sexuality studies at the University of Connecticut

A common argument for using sign language with hearing babies is that it has benefits that are practical (less fussing), emotional (a closer parental bond), and cognitive (a boost to brain development). “Fewer tantrums and more fun!” claims the website Baby Sign Language. Using sign with babies has become popular in recent years, and many claim it facilitates communication—which is key for healthy development. Yet there is a dearth of research to support these claims.

In perhaps the greatest irony for deaf children, signing is often discouraged in the United States—which also happens to be the world’s largest medical device market. As I outline in my book, Made to Hear, the cochlear implant (CI) has become more frequently used in deaf children, and clinics often give parents the opposite of the advice they give parents of hearing children: signing is risky and will impede your child’s speech development. But there is a dearth of research to support these claims, too, and there is even research to suggest that the opposite is true and that sign may actually aid in the spoken language development of children with CIs. While my book outlines these patterns of advice to mothers, particularly when it comes to newborns and infants, I have since turned my focus to examining what becomes of deaf children who are denied sign language early in life. What about children who may in fact have benefited from access to sign? How do these clinical practices translate into educational practices, and what problems arise because of them?

On March 2, 2016, a coalition of supporters for a recent bill introduced in the House of Representatives, H.R. 3535, also known as the Macy-Cogswell Act, converged on Washington, DC. According to attendee Jeff Bravin, more than 100 people showed up to advocate for new laws that they believe would better support deaf, as well as blind, students. While some students are indeed deaf-blind (that is, have both visual and hearing impairments), the partnership between educators who teach deaf children and educators who teach blind children came out of necessity. On their own, a coalition of deaf education-related organizations was unable to garner the support needed for the bill. But in joining forces with stakeholders in education for blind or visually impaired children, the Macy-Cogswell Act was eventually introduced. The meetings that took place on March 2 resulted in a new co-sponsor of the bill, Representative Larson, D-Connecticut.

The Macy-Cogswell Act is described by the Conference of Educational Administrators of Schools and Programs for the Deaf (CEASD) as necessary for ensuring that deaf students get the specialized teaching they need in a timely manner. This organization is made up of those who educate children in deaf schools; that is, they work with deaf children in a sign language-friendly environment and tend to promote bilingualism in both English and American Sign Language (ASL). The push for “appropriate” assessment of children’s language or instructional needs partly comes from patterns like this: The average age of enrollment in the American School for the Deaf (ASD) is between 12 and 14 years of age. A portion of these students (including some with CIs) had been denied meaningful access to sign in their prior placement in an effort to focus on spoken language. Deaf schools are then enrolling children who are far behind, but could have been given access to ASL earlier. We know that language deprivation has devastating effects on development, but without comprehensive, aggregate empirical data on how those children ended up being placed at ASD, where they were before, and what spoken language skills they brought with them, it is hard to make evidence-based claims about which practices are best. Nevertheless, those who are on the ground in deaf schools on a daily basis clearly argue that more sign language access is needed, and that it is needed sooner. This group does not articulate an argument to exclude speech, only an argument to include sign. Would it be possible to add ASL as a viable tool in the clinical context or earliest educational placement, especially since there is no evidence to support the claims that sign impedes speech?

The Alexander Graham Bell Association (AGB) characterizes the legislative efforts regarding Macy-Cogswell as “well intentioned,” but complains that such laws “frequently have language that focuses on access and rights to sign language which is not relevant to the majority of students who are deaf and hard of hearing in the public schools.” It goes on to cite that 52% of deaf children are pursuing a spoken language approach. AGB is the largest organization in the US that promotes a “Listening and Spoken Language” approach and even has certificate programs that train teachers in such approaches. It suggests a number of aspects that should be included in the laws instead, including setting its certificate in listening and spoken language as the standard for providers who work with deaf children. In its view, ASL is irrelevant.

These two groups could not be more different. This pattern of viewpoints is nothing new when it comes to the question of what we should do about deaf children. Do they need to overcome their deafness through medical and educational intervention that could give them a better ability to hear and speak? Or should their deafness be accepted, dealt with by using a visual language like American Sign Language, and all other developmental and educational milestones achieved through this method? There never seems to be agreement over which is the ‘right’ answer and there is no formula for knowing which is the best route for all deaf children. But what if we stopped assuming it was an either/or question? What if the CI and ASL were standard and utilized in tandem? It seems the biggest barrier to combining efforts to serve deaf children is professionals’ refusal to hear each other.


Laura Mauldin is assistant professor of human development/family studies and women's, gender, and sexuality studies at the University of Connecticut. She is author of Made to Hear: Cochlear Implants and Raising Deaf Children.

Wednesday, March 2, 2016

The Internet of Things and the rise of planetary computerization: How environmental sensing technologies multiply rather than consolidate versions of the planet.

Reader in sociology at Goldsmiths, University of London

Planetary computerization—and the making of a computational planet—are terms and concepts that now occupy considerable attention in media studies and environmental theory and practice. Yet these developments have been underway since at least the postwar period, since renderings of the planet as expressed through communication technologies can be found in works ranging from the writings of Arthur Clarke to Marshall McLuhan’s observations about the birth of ecology with the launch of Sputnik to Barbara Ward’s discussions of Spaceship Earth emerging through telecommunication technologies—as well as Félix Guattari’s mapping of the possibilities of “planetary computerization.” More contemporary works continue to revisit these themes, including the Haus der Kulturen der Welt’s (HKW) exhibition, The Whole Earth, which considers how particular cultural practices and environmentalisms emerge by revisiting the often communication-based imaginings of the Earth from the Apollo missions onward. In these diverse approaches, the earth appears as a highly interconnected techno-political artefact that is nevertheless under stress.

Why do I begin with this discussion of multiple versions of computational planets? Because while the planetary is often a focus exactly because the Earth is seen to be under considerable environmental stress, one recurring response to the planet in crisis is to propose that more monitoring and more data, particularly through environmental sensor networks, will help to address environmental problems and make the planet more sustainable. In Program Earth, I take up considerations about the planetary and its environments by addressing the rise of ubiquitous environmental computing. In current imaginings of ubiquitous environmental sensing, technology companies often put forward a vision of the Earth as brimming with sensors, where every environmental process and activity will be monitored for ideal performance and responsiveness. Tens of billions, if not hundreds of billions, of sensors are proposed to be deployed in order to ensure earthly systems are optimized. At multiple levels, sensors are presented as a solution to the problem of the planet in crisis, from monitoring global systems to enabling citizens to become more effective sensors and participating nodes in these systems.

At multiple levels, sensors are presented as a solution to the problem
of the planet in crisis.
Images from IBM's Internet of Things videos, Part 1 and Part 2.

“The Internet of Things,” one of many promotional videos developed by IBM to convey the technological revolution that ubiquitous environmental computing is meant to usher in, presents a version of interconnectivity where the planet has effectively “grown a central nervous system” through the rise of environmental sensors, to the point where there are more things than people connected to the Internet. Here is an intelligent planet that can coordinate the flow of traffic, facilitate the timing of commutes, report blockages in water mains, and balance energy grids. As the video narrator notes, in this coordinated vision of the Internet of Things, “you could look at the planet as an information creation and transmission system. The universe was hearing its information, but we weren’t, but increasingly now we can.” Sensors are meant to allow us to tap into planetary intelligence, and to augment and intervene within these systems in order to realize new efficiencies and insights.

Perhaps in contrast to the usual visual representations of the planet as a fragile object viewed from the distance of outer space, these computational articulations of the Earth are instead focused on connecting up processes and events in order to maximize earthly operations. Here is a planet—a programmed earth—where data and networks that have always already existed in a seemingly natural way can be better understood and harnessed through the unique insights provided by environmental sensors and actuators.

Often these imaginings of environmental sensors present a unified planet operating as one intelligent uber-organism. But rather than argue that new “whole” or unified earths are emerging through these computational technologies, I instead demonstrate how a proliferation of sensing technologies, datasets, networks, and practices might attempt to realize new types of interoperability while multiplying rather than consolidating versions of the planet. The “program” of Program Earth is then not a singular script or code executing a command-and-control logic on environmental systems. Instead, Program Earth asks how specific sensor occasions demonstrate the splintering and multiple ways in which these environmental computation technologies take hold.

How do sensing technologies connect humans and nonhumans,
along with their environments?

Within this focus on particular modes of sensing, the question also emerges as to how sensing technologies connect humans and more-than-humans, along with their environments. Here, citizen sensing is a key way in which this question is taken up. How do “citizens” and citizen-sensing practices become configured along with sensor technologies and processes? There is no shortage of examples of citizen-sensing projects underway that take up low-cost environmental sensing technologies to create evidence about air and water pollution, as well as document the activity of organisms and intervene within urban ecologies. Along with the proliferation of sensor technologies that are remaking versions of the “planetary,” here is a new set of practices for monitoring environments and generating evidence, and for engaging with environmental and political matters as data-based problems.

This is an area of ongoing research: I am conducting a project, “Citizen Sense,” which focuses more centrally on the citizen-sensing practices that emerge with the rise of low-cost computational sensing technologies. The project asks: What new political practices do these technologies enable? And how might they limit environmental engagement to data-focused modalities?

Environmental sensing technologies, and the intensification of planetary computerization, generate particular ways of encountering and relating to the Earth as under stress and in crisis. It is the specific environments and entities that materialize in the process of this planetary computerization to which Program Earth attends, while also asking how these new technological arrangements might be reworked and rerouted toward less deterministic and more open-ended engagements.


Jennifer Gabrys is a reader in sociology at Goldsmiths, University of London. She is author of Program Earth: Environmental Sensing Technology and the Making of a Computational Planet (out this month) and Digital Rubbish: A Natural History of Electronics.

"Program Earth is a tantalizing account of digital, citizen-sensing worlds in the making."
—Kevin McHugh, Arizona State University

"Impressive and original, Program Earth is not just concerned with the collection and dissemination of data, but also—and more crucially—with the transformation of these data and with their effects."
—Steven Shaviro, author of The Universe of Things

Wednesday, February 24, 2016

The boombox on the bus: Erik Satie's furniture music in 2016

Postdoctoral fellow in global media and film studies at Brown University

2016 marks the 150th birth anniversary of the French composer Erik Satie (1866–1925). As far as musical ideas go, Satie is best known for his notion of “furniture music” (musique d’ameublement), first introduced nearly 100 years ago in 1917 and later popularized by American composer John Cage. Furniture music aspires to “make a contribution to life in the same way as a private conversation, a painting in a gallery, or the chair in which you may or may not be seated.” The music would “fill up those heavy silences that sometimes fall between friends dining together. It would spare them the trouble of paying attention to their own banal remarks. And at the same time it would neutralize the street noises which so indiscreetly enter into the play of conversation. To make such a noise,” Satie says, “would respond to a need.” (John Cage, Silence (1961)).

Nearly a century later, background music to support and augment everyday activities is commonplace. The Muzak tradition of sonic productivity enhancement carries on, though the stimulus progressions are now ever more individually tailored for specific times and tastes. On my phone, Apple Music offers streaming playlists based on time of day, profile data, and past clicks: “Experimental Music for Studying,” for example, or genre-appropriate ways to “Wake Up Gently” in the morning and “Tune Out Your Boss” in the afternoon.

To consider background music only as a practical physical support for the isolated individual, however, risks losing the inherently interpersonal aspect of Satie’s musical furnishings. This is music not just to keep the body in tune, but to soften the spaces between individuals, helping to make the awkward pauses that interrupt even friendly conversation a little less awkward.

Looking back over what we know of Satie’s life—he was a brilliant but idiosyncratic loner par excellence—it isn’t hard to imagine that the “need” he sought to fill wasn’t for greater productivity and efficiency, but rather a way to use music to feel more at home in what otherwise might have been a world of fraught social relations.

Flash forward to 1980. Satie’s gentle piano music is growing popular among a new generation of listeners. The recently released Sony Walkman and Brian Eno’s Ambient 1: Music for Airports are pushing furniture music in more personal and autonomous directions, combining the Taylorist push for utilitarian efficiency with more flexible and open-ended musical formats.

Meanwhile back in France, Gilles Deleuze and Félix Guattari publish their reading of the musical “refrain” (ritournelle) in A Thousand Plateaus. In contrast to the atomized sounds of personal audio technologies, their approach, much like Satie’s, locates music as a territorial device, helping a person (or other animal) establish a place in the surrounding world. Walking home at night, they write, a whistling child “ventures home on the thread of a tune.”

Jump again to the present. If we think of background music as just a tool for the privatized individual, we risk missing the central role of musical furnishings in negotiating interpersonal and environmental space. Rather than algorithmic soundtracks ever more finely tuned to a person’s biodata, a more fundamental role of furnishing music in 2016 might be much the same as it was a century before: a tool to fill up those heavier silences, particularly for those who might not otherwise feel at home.

Consider the mildly defiant act of boarding public transit with music leaking out of your backpack on small speakers (perhaps from a phone or boombox). Eschewing the privacy of headphones, the music immediately establishes a relational space, defining the vehicle not just for the would-be DJ but for everyone else, too.

Background music chosen by a co-passenger on public transit
brings to mind the central role of music in negotiating interpersonal space.
Image: San Francisco's F Line. Photographer: Momoko Shimizu.

A few months ago I was riding the F Line down Market Street in San Francisco when a woman boarded, sat in the middle of the back seat of the crowded train, and after about five minutes switched on M.I.A.’s “Paper Planes” at a volume just loud enough for everyone to hear. Suddenly we were in it together, with the lyrics about “sitting on trains” and the lolling rhythm of the beat perfectly matched to the rocking of the light rail. The cozy atmosphere turned confrontational during each chorus, however, when three gunshot sound effects rang out and M.I.A. told us all she wanted to do was “take your money.” After a few minutes the song ended and we went back to our private thoughts. The woman alighted at the next stop, having spoken to no one but having marked out a territory all her own.

Other silences: an Uber ride late one recent weekday night. The driver, after talking with me about his recent experience immigrating from Eastern Europe to the US, suddenly shifted to proclaim his love for the free Pandora music streaming service—how essential it was for him to get through his long hours at the wheel. Driving all night and with plenty of time to think, he had refined it down to a simple equation: “gasoline is what fuels the car, Pandora is what fuels me.” It would be fair to call this stimulus progression, perhaps, but this doesn’t capture the whole situation. Well aware of his vulnerability as an independent contractor at the mercy of Uber’s latest app update, the low-volume electronic music streaming endlessly from the dash provided some solace and some energy, a way to be at home while driving forward.

Rather than think of background music as merely a matter of consumer choice or utilitarian design, Satie’s legacy pushes us to understand and recognize music as a way of negotiating shared space. Musical furnishings can powerfully realign the social division of comfort and discomfort, the emotional economies of awkwardness and groove. Sometimes background music becomes a way for individuals who might not otherwise have a way to fit—for any multitude of reasons—to push back and carve out a little territory of their own.

To make such a noise would respond to a need.


Paul Roquet is author of Ambient Media: Japanese Atmospheres of Self. He is a postdoctoral fellow in global media and film studies at Brown University.

“Through a series of probing interventions, Paul Roquet generates a new environment for Japan studies—one that takes into account the faint, ambient, receding, and ubiquitous immaterialities that fill Japan's ether. This is a work worth noticing.” —Akira Mizuta Lippit, University of Southern California

“Paul Roquet smartly cuts through multiple strata from music to experimental performance to design, offering a fresh and novel perspective on the atmospheres of ambient media.” —Marc Steinberg, Concordia University

Thursday, February 18, 2016

Reparative therapies remain alive and well in some US states—Texas and Oklahoma included.

This billboard appeared in Dallas, Texas, in 2015. Despite widespread condemnation,
reparative (also known as "ex-gay" or "reorientation") therapies still exist in some states.

Assistant professor of sociology at Temple University

I recently traveled to Texas to talk about my new book The Straight Line: How the Fringe Science of Ex-Gay Therapy Reoriented Sexuality. While there, I learned a bit more about skirmishes in that region over the past year—many of which continue—regarding conversion therapies for homosexuality.

Responding to recent bans on reorientation therapy for minors in some states, the current Texas Republican Party platform maintains, “We recognize the legitimacy and efficacy of counseling, which offers reparative therapy and treatment for those patients seeking healing and wholeness from their homosexual lifestyle.” Oklahoma State Rep. Sally Kern filed a bill to prevent such bans, protecting the right of counselors in that state to offer conversion therapy in all circumstances, though the bill has not yet passed. (This was one among many anti-gay bills she has filed, some since withdrawn, including one granting businesses a right to discriminate and another banning gay-affirmative therapy for minors.)

Last year in Dallas, a billboard promoting a reparative therapy clinic was put up and, after protests, taken down. The clinic has since opened the “Children’s Center for Healthy Gender & Sexuality” explicitly to conduct reparative therapy with minors. Despite an attempt to ban these practices for minors in Texas, and despite condemnation from every major professional mental health association in the US, conversion therapies remain alive and well in these areas—though they are likely practiced by a small minority of practitioners, given the general opposition to them within the mental health professions.

San Francisco Pride, 2013.
Image via Creative Commons.

The gay community has also responded to the threat of conversion therapy. On my trip I dropped into a gay bar in Austin where a large flag hung over a pool table. Rainbow letters on a black background shouted "BORN THIS WAY" in all caps. The flag has been popular at pride parades this past year, including at the one in my hometown of Philadelphia, transforming the title of the Lady Gaga anthem into a full-force political slogan.

This pattern of viewpoints is quite common in debates over homosexuality: anti-gay folks often believe homosexuality is a choice that can be unchosen, while supporters of gays believe in innate and immutable sexual orientations from birth. Considered simultaneously, the two views clearly cannot be reconciled. Each is rooted in a version of “essentialism”—an idea about the underlying nature of human sexuality, including the idea that sexual desire exists independent of culture. These ideas exist in a kind of symbiosis—the more one group promotes one of these views, the more opponents assert the opposite. But we know from decades of research on sexuality around the world that culture plays a crucial role in what forms sexualities can take—biology does not establish sexuality any more than it establishes food preferences.

While these essentialist viewpoints seem irreconcilable, it is important to note that both have the capacity to reinforce the idea that same-sex sexuality is something shameful. For reorientation proponents, homosexual desire is to be eradicated and is a symptom of excessive shame. For promoters of “born this way,” same-sex sexual desire must be tolerated because it cannot be helped, and this position does nothing to suggest that a gay and lesbian life might be a desirable outcome. Making pro-gay arguments from the position of biology alone misses the cultural work that needs to be done if same-sex sexual desire and behavior are to be fully acceptable. The missing but perhaps necessary argument is that there is nothing inherently shameful about consensual sex and relationships between members of any sex, including sex between men, and these relationships may actually enhance human lives. With this position, the questions of the cause of homosexuality and concerns over malleability become moot points. In efforts to create tolerance, the gay rights movement has ceded the idea of sexual fluidity, fighting for the recognition of clearly delineated types of persons, which science seeks to reinforce. Yet establishing general sexual freedom for all was much more the position of the gay liberationists who participated in the Stonewall Rebellion.

Source for these axes of comparison: Dawne Moon, 2005.
"Discourse, Interaction, and Testimony: The Making of Selves
in the US Protestant Dispute over Homosexuality."
Theory & Society 551-577.

New research is beginning to show ways that the “born this way” message may even be inappropriate for advancing gay rights. Consider how new generations are changing views on these issues: a recent poll in the UK found that 50% of people aged 18–24 reported some same-sex sexual attractions, and most think sexual orientation is on a continuum. These trends may be similar in the United States, as researchers continue to uncover evidence of expressed sexual fluidity for men and women. With taboos on homosexuality and gender fluidity lessening, the old category systems of gay/straight and male/female are no longer exhaustive. Even more problematic for the “born this way” position, new research by University of Tennessee, Knoxville, psychology professor Patrick Grzanka and his research team shows that a substantial group of anti-gay people believe that people are born gay.

So what to do about the continued presence of conversion therapies in Texas, Oklahoma, and elsewhere? Recognizing the kinds of harm that these therapies can cause is certainly central, but so is the cultural message that there is nothing inherently wrong with same-sex sexual expression. Meanwhile, some scholars have argued that a better model for gay rights that sidesteps the issues of mutability and cause would be analogous to rights to freedom of religion. In that case, a state cannot force a person to change their religion in order to get rights because religion is crucially important to a person’s sense of self. Perhaps this kind of argument rooted in American ideals of freedom could be a better basis for arguing against therapies that treat same-sex attraction as something rooted in shame, and could be a more open-ended basis for protecting freedoms of a broader range of sexual and gender expressions.


Tom Waidzunas is author of The Straight Line: How the Fringe Science of Ex-Gay Therapy Reoriented Sexuality. He is assistant professor of sociology at Temple University.

"Finally we have a book that takes a deep, inside look at sexual reorientation therapies and their far-reaching cultural effects. In a provocative turn, The Straight Line not only interrogates the fringe science of sexual reorientation, but it shows us how these efforts to reorient gays and lesbians have shaped—and been shaped by—more liberal ideas about sexuality."
—Jane Ward, author of
Not Gay: Sex Between Straight White Men

Thursday, February 11, 2016

On Climate Change War Games and "environmentality."

The military's seizure of climate change and other environmental issues
is not as radically new as one might suppose.
Image via Creative Commons.

Associate professor of environmental and postcolonial studies, Purdue University

In 1947, George F. Kennan, writing under the pseudonym “Mr. X,” published “The Sources of Soviet Conduct” in Foreign Affairs. The article had considerable impact. Advocating a global strategy of Communist “containment,” it influenced the Truman administration’s shift to an anti-Soviet policy and served as a road map for what the journalist Walter Lippmann would soon criticize (and popularize) as the Cold War. Kennan’s 1946 “Long Telegram” from Moscow, and the thinking he would distill in the “X” article, prompted President Truman in March of 1947 to unveil his Truman Doctrine, which led to the National Security Act of 1947, the Central Intelligence Agency, the National Security Council, and NATO. For the next forty years, the US would harden its defense-oriented position, freighting the western world with the mutually assured destruction of nuclear arms buildup, the witch hunts of McCarthyism, and a domino-effect rhetoric that would bolster conflicts in Greece, Turkey, Korea, Vietnam, and elsewhere. Global geopolitics would never be the same.

That admix of madness and hubris, which makes up any grand narrative of power, may have become antiquated after the collapse of communism. But the attempt to contrive a narrative of pandemic proportions is in the works again. The new enemy to the free world? Climate change.

In 2012, US Navy Captain Wayne Porter and US Marine Corps Colonel Mark Mykleby, writing as “Mr. Y,” published “A National Strategic Narrative.” Backed by former Defense Secretary Robert Gates and hailed as a “grand strategy for the 21st century,” the narrative commandeers the immediate and future safety of the earth’s ecosystems—mainlining climate change with the venom of national security.

The war machine’s seizure of climate change and other environmental issues is not as radically new as one might suppose. The Pentagon has been taking climate change seriously, and growing more public about it, since the publication of the Intergovernmental Panel on Climate Change’s (IPCC) 2007 “Fourth Assessment Report.” But long before the turn of our century, Cold War military officials had their eyes on the environment. The CIA and other agencies headhunted scientists to learn just how much could be done to manipulate natural environments. As far back as 1949, NATO investigated and even tested various potentials for transforming enemy environments on a planetary scale, going so far as to suggest the use of nuclear bombs to reconfigure the sea floor and change the course of ocean currents.

Global warming, however, is a game changer. Since the 1980s we’ve been hearing and ignoring the warnings from climate scientists: Species extinction (as much as a 50 percent die-off in the next century). Mass migration due to water and food scarcity (as many as 250 million on the move by 2050). Drought. Storms. Heat waves. Temperature increase. Sea level rise. An interminable roster of calamitous transformations continues to unfold almost daily. So, one might ask, given the absence of any serious green political movement in the US, having the military on board can only be a godsend, right?

Yes and no. Militarized engagements with the environment might give the gravity of the situation more attention, and soldiers indeed might be on the ground to help climate refugees (though one wonders, given the reaction to Syrian refugees, how much help will ultimately be given). But this “environmentality,” as I term it, brings with it an aggressive approach that can emphasize a nation’s insecurities at the expense of discovering alternatives to humanity’s destructive relation to nature. In July 2008, for instance, the Center for a New American Security, with the help of scientists, the Department of Defense, various private funding organizations, and ABC News, held a “revolutionary” military exercise: the Climate Change War Game. During the initial stages of the game players focused on sustainability, but they were steered away from this approach and came to reject it as a worrisome distraction from the central issue of global security: “a focus on cutting greenhouse gas emissions runs the risk of crowding out full consideration of adaptation challenges.” In the final analysis, confronting the complexities of climate change—questioning the gamut of human environmental abuses, changing patterns of overconsumption and waste, preserving biodiversity—involved too much risk.

We’ll be hearing more arguments, I suspect, in favor of viewpoints like that of Mr. Y, who redefines Cold War scenarios of containment with a “strategic ecology” of US-directed “sustainment.” We should pay heed, however, to these maneuvers to strategically capture climate change for defense purposes. George Kennan certainly did. As he came to see his policies turn more aggressive and militaristic, he began to abhor the impact of his X-article. He yearned for a more positive dialogue with the Soviet Union. One can only hope, in the wake of December’s Climate Summit in Paris, that the grand strategies for confronting global warming will be more solicitous, regenerative, and temperate.


Robert P. Marzec is author of Militarizing the Environment: Climate Change and the Security State. He is associate professor of ecocriticism and postcolonialism in the department of English at Purdue University and associate editor of MFS: Modern Fiction Studies. He is also author of An Ecological and Postcolonial Study of Literature and the editor of Postcolonial Literary Studies: The First 30 Years.

"Militarizing the Environment: Climate Change and the Security State offers an illuminating, perturbing account of the greening of military discourse and strategy amidst an era of advancing climate change. Robert P. Marzec brilliantly details the neoliberal assault—at once militaristic, economic and discursive—on the commons and its most vulnerable inhabitants. His book is essential reading for anyone committed to understanding the new imperialism and its cynical, sinister appropriation of critical environmental ideas like resilience, adaptation, and sustainability."
—Rob Nixon, author of Slow Violence and the Environmentalism of the Poor

Thursday, February 4, 2016

Shipwreck narratives are central to the Age of Discovery.

Shipwreck narratives, writes Steve Mentz, portray humanity caught
between divine fiat and the insufficient promise of human agency.
The Storm on the Sea of Galilee, Rembrandt, 1633.
Public domain image via Wikimedia Commons.

St. John's University

Humans love to tell stories that put humans at the center of things. In these fantasies, the Renaissance celebrates the rebirth of human knowledge, the Enlightenment shines its light on human realizations, and the postmodern era fractures human ideals. More recently, the Anthropocene shoulders its way into view with the power of Old Man Anthropos, the all-powerful Man who ruins everything.

These anthropocentric visions paper over the disturbing truth that human history overflows with unexpected turns. We seldom end up where we thought we were going. Stories about transformation and tragedy err when they claim more certainty about their destinations than they really have.

To put it more directly: the Age of Discovery was an Age of Shipwreck.

Modernity remains a contested term in literary and cultural scholarship, and controversies about the meaning of “early modernity” capture the unsettled nature of thinking about historical change and continuity. Ideas about transformation have long dominated scholarship of the literature and culture of the sixteenth and seventeenth centuries, whether that transformation appears as a “Renaissance” of classical forms, a “Reformation” of Christian cultures, or through more particular discourses such as skepticism, political republicanism, or the rise of empirical science. My book Shipwreck Modernity supplements these human-centered visions with disaster. Adding the modifier “shipwreck” to the modernity of European culture’s first age of globalization minimizes human control and relocates unplanned errancy at the center of world history.

Shipwreck modernity describes an understanding of historical change that is impersonal rather than humanized, material as well as ideological, and driven by random catastrophes more than singular acts of vice or virtue. Turning to shipwreck follows the offshore trajectory of recent scholarship in the oceanic or “blue” humanities that treats the sea as a corrective to pastoral dreams of harmony. This saltwater approach to human cultural history places the encounter with what oceanographers call the World Ocean at the center of the global movement of European culture. Ocean currents and prevailing winds drove European ships around the globe, carrying and encountering viruses, plants, animals, languages, cultures, and catastrophes. The disasters and narratives of hybridity that emerged comprise a global shipwreck. The rapid integration of the ecologies of Afro-Eurasia with the Americas created disruption and change on a massive scale that continue to resonate today. Shipwreck modernity brought smallpox to the Americas and the potato to Ireland, while disrupting local ecologies around the globe.

No trope in the oceanic archive resonates more than shipwreck, an ancient story of disorientation and disruption that punctuates Western literary culture from Odysseus and Jonah to Prospero and Robinson Crusoe. Especially during periods of maritime expansion, shipwreck narratives portray humanity caught between divine fiat and the insufficient promise of human agency. The technical labors of mariners in crisis, as portrayed by canonical authors such as Shakespeare and Defoe as well as common sailors and others, create allegories of humans struggling to endure nonhuman environments.

Representations of shipwreck in and beyond the early modern period suggest three subcategories or interpretive clusters for human-ocean encounters: wet globalization, blue ecocriticism, and shipwreck modernity. Each of these phrases identifies a trajectory for blue humanities scholarship.

Wet Globalization: Twenty-first century responses to globalization sometimes fly above the earth in passenger planes. The blue humanities recall that historically and still today, the global economy floats on ocean currents.

Blue Ecocriticism: The sea’s overwhelming physical presence in the natural environment emphasizes that this element, long marginalized by green eco-thinking, can revolutionize ecological thought in a post-sustainability context.

Shipwreck Modernity: From an oceanic perspective, the story of emerging modernity resembles a catastrophe-ridden epic of ocean-fueled expansion and its attendant disasters.

Responding to the alienating pressure of the ocean on human bodies and institutions makes the blue humanities a form of post-human investigation. With cognates in post-sustainability ecocriticism, cyborg studies, catastrophe studies, and other discourses that separate humans from the spaces that comfort them, the oceanic turn in humanities scholarship combines ancient narratives that remain vibrant in contemporary culture with a new emphasis on dynamism in the relationship between humans and their environments.

Shipwreck Modernity refuses sentimental consolations such as green sustainability or political utopianism. But it does not sink into the depths without hope. The shock of immersion has positive lessons as well as critical ones. The book ends in the “bright light of shipwreck,” alongside the hybrid vision it names the Bookfish, with “Seven Shipwrecked Ecological Truths.” Seeing catastrophes as opportunities means seeking an ecological future with wet swimmers rather than dry sailors, in an oceanic world in which survival, while only temporary, gives pleasure. This wet and disorienting vision shines a light on early modern ecological globalization that resonates with our post-climate change present.


Steve Mentz is author of Shipwreck Modernity: Ecologies of Globalization, 1550–1719. He is professor of English at St. John's University in New York City.

"A compelling, provocative, even lyrical piece of scholarship that will undoubtedly inaugurate new critical discussions in the fields of maritime humanities, eco-criticism, early modern English literature, and shipwreck studies."
—Josiah Blackmore, Harvard University