Friday, February 1, 2019

On humanity, videogames, and resisting operationalized logic.



BY TIMOTHY WELSH
Loyola University, New Orleans



In October 2018, just prior to the November midterm elections, Twitter banned close to 1,500 accounts that all featured the same gray, expressionless cartoon avatar. “NPC,” a version of the Wojak or Feels Guy reaction image, is a meme generated by the right-wing internet as a representation of liberals. Accounts featuring NPC as an avatar, Twitter alleged, had violated the platform’s terms of service when, while mockingly impersonating liberals, they spread “intentionally misleading election-related content,” such as an incorrect election date.

The name of this icon—“NPC”—comes from videogame culture. Shorthand for “non-playable character” or “non-player character,” an “NPC” is any character a human user does not control. Without a user to direct their movements, dialogue, and interactions, NPCs are directed by the game’s programming. Depending on the specific character and the type of game, an NPC can be as simple as a still image next to text-bubble dialogue or as complex as a photorealistic, voice-acted, three-dimensional model guided by adaptive artificial intelligence that reads, reacts to, and learns from the player’s interactions with the gamespace. Some NPCs have well-drawn personalities, and players can get to know them over the course of multiple hours of gameplay. Most, though, are the equivalent of movie extras. They mill around town in a simulation of daily life to make the virtual world seem vibrant. They fall in with the attacking enemy horde only to be dispatched unceremoniously by the heroic player character.

When NPCs aren’t important to the story, they usually aren’t designed for sustained player interaction and attention. Their animations run on short loops, their movements follow set, predetermined paths, and their dialogue is scripted and repeats frequently. If the player lingers with them for a moment, the veil of immersion breaks down and their computer-driven behavior reveals itself.

It’s this quality of NPCs—their pre-programming—that became the seed of the right-wing meme. The insult, as summarized by Kevin Roose of the New York Times, is that liberals are “programmed” to repeat the same limited dialogue and action scripts, like NPCs. They are thus unresponsive to the supposedly sound, logical arguments of the right and therefore incapable of independent thought.




Beyond characterizing liberals as poor interlocutors, however, the NPC insult is dehumanizing and potentially threatening, as commentators like Cecilia D'Anastasio of the videogame culture blog Kotaku have observed. Given the virtual suffering and death that some videogames allow players to inflict dispassionately on characters with no human behind them, coupled with the right wing’s tendency to dehumanize political rivals and minorities and to suggest violence against them, the potential harm of the meme goes beyond spreading misinformation on social media.

At the same time, there are a few fundamental ironies at play in the NPC meme. I want to set aside for now the most obvious irony of using a meme to criticize opponents for repeating received arguments, but that’s part of this phenomenon as well. Instead, I’m interested in how the meme misunderstands our relationship with digital characters and environments. In brief, the NPC meme describes computer-driven, non-playable characters as incapable of responding to logical arguments, when in reality the computer code comprising an NPC is itself nothing but a logical argument parsed by a computer.

At a material level, NPCs are a bundle of scripts called to perform when the game state meets certain conditions. The scripts may not be very complicated, depending on how important the character is to the game. Nonetheless, they are basically just a string of “if this, then that” functions. For example, if the player selects dialogue option “A,” then the NPC plays dialogue line “X,” which has been designated as the response to option “A”: if input A, play line X. That means the only time NPCs literally change their behavior is when they are presented with a new logical argument, such as when a programmer or modder injects new bits of code.
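To make this concrete, here is a minimal, hypothetical sketch in Python of the kind of dialogue script described above. None of the names or lines come from an actual game; the point is only that the NPC’s entire repertoire is a fixed table of if-this-then-that pairings, and its behavior changes only if someone rewrites that table.

# A toy NPC dialogue script: the character's "behavior" is nothing
# but a lookup of pre-scripted responses (if input A, play line X).
DIALOGUE_SCRIPT = {
    "A": "Welcome to the village, traveler.",
    "B": "Rumor has it there are wolves in the hills.",
    "C": "Safe travels.",
}

def npc_respond(player_choice):
    """Return the pre-scripted line for the player's dialogue option."""
    # Inputs the script doesn't cover get a stock fallback;
    # the NPC cannot improvise a genuinely new response.
    return DIALOGUE_SCRIPT.get(player_choice, "...")

print(npc_respond("A"))  # this output changes only if a programmer
                         # or modder edits DIALOGUE_SCRIPT itself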

It is not just the NPC who is limited by programming. The playable character similarly has only a particular set of available animations, dialogue, and abilities. It receives input from the player and so can engage the virtual environment as the player chooses, but only through the range of affordances defined by the game’s if-then statements. For example, no matter how much a player of Grand Theft Auto: San Andreas might want the protagonist CJ to go around hugging pedestrians, it simply isn’t an available in-game action. There is no changing the player character’s mind, perspective, or responses either. He—and it is almost always “he”—may have more compelling reactions to the player’s input than the NPCs do, but, without rewriting the game’s code, he too can only express what he was programmed to express.
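The player character’s situation can be sketched the same way. In this hypothetical input handler, again not drawn from any actual game, the player “chooses” freely, but only among the verbs the developers wired in; a command like “hug” that falls outside those affordances simply does nothing.

# A toy input handler for a player character: agency, but only within
# the set of actions the game's code defines.
AVAILABLE_ACTIONS = {
    "jump": "plays the jump animation",
    "punch": "plays the punch animation",
    "drive": "enters the nearest vehicle",
}

def handle_input(command):
    # Anything outside the scripted affordances does not exist
    # as far as the game is concerned.
    return AVAILABLE_ACTIONS.get(command, "nothing happens")

print(handle_input("jump"))  # plays the jump animation
print(handle_input("hug"))   # nothing happens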

The NPC and PC both exist within a protocological environment and can only ever enact the environment’s programmed logics. Down to the level of computer programming, protocol allows for creative operation within its functions, but not for breaking protocol. As Alexander Galloway explains in Gaming: Essays on Algorithmic Culture, it is this flexibility that enables the digital system to maintain stability. In this sense, both PC and NPC function within the same operational logic, neither capable of a differentiated response to player inputs, regardless of who pushes the buttons or their political and philosophical leanings.


Of course, these right-wing memesters are not considering how videogames participate in the functioning of power in a control society. The NPC insult is simply supposed to delegitimize and dehumanize people who are seen to be uncritically repeating arguments and buzzwords, ironically by reproducing and spreading a meme. But this irony is precisely the trick of protocol that makes it so effective. It offers an agency that looks like self-determination but operates within constrained, consistent, repeatable parameters, much like memes themselves.

Logical argumentation is presumed to be an alternative to NPC-hood, when the logic of protocol is what makes possible the NPC in the first place. Videogames invite players to take on an objectivized logic—everything is permitted, all is expendable, as long as the mission is completed—which in its very design devalues not only the NPCs but also the player’s own interiority in order to function universally. We have to ask, then, is this kind of cold rationality the avenue to human dignity and independence as the meme supposes or a means to its suppression?

The same week these stories about NPCs came out, the President of the United States equivocated over the killing of a journalist, allegedly by Saudi Arabia. He argued that the life of one non-US citizen is not as valuable as maintaining a lucrative arms deal. The logic of this argument relies on the dehumanizing assumption that human life can be weighed against a trade agreement, that it has a calculable, comparable exchange value.




This gross utilitarianism in international politics is not new, of course. The so-called “virtuous war” military strategy defines success as reducing the number of casualties suffered by one's own side. While reducing casualties is an admirable goal, it also treats human life as a benchmark statistic, sets parameters for “acceptable” losses and “high value” targets, and removes the human cost from combat. It can even justify more warfare and more casualties. After all, if the lives of one’s own soldiers aren’t at risk, there may seem to be less to lose from a military response.

The virtuous war is exemplified by the expansion of war-at-a-distance technologies like drones, which allow the US to strike targets without exposing soldiers to the battlefield. A 2010 United Nations report on targeted killings, however, warned against pilots potentially assuming a “playstation mentality.” The suggestion characterizes piloting a drone as akin to playing a videogame. The two technologies—drones and videogames—share a development lineage (some drones are even piloted using controllers similar to those sold with the Xbox 360). The concern raised by the UN is that the drone pilot might assume the same attitude toward the silhouettes on their targeting display that they might take toward NPCs in a typical videogame.

In Mixed Realism: Videogames and the Violence of Fiction, I explain that NPCs are so expendable that games often have to inject them with human value simply so that the plot doesn’t fall apart and fictional events happening to virtual characters retain a sense of meaning, gravity, and consequence. The Call of Duty series, for instance, which just saw the release of its fifteenth installment, often does this by having players look an NPC in the eyes. Call of Duty: Modern Warfare 2 uses this technique numerous times, even having otherwise anonymous NPCs gaze directly into the first-person camera just before they are dispatched. Despite the fact that the player will shoot thousands of identical NPCs over the course of the game, looking one in the eyes as it dies lends the moment a degree of humanity.

I optimistically suggest that our capacity to see human value in NPCs, to feel anything when we dispatch an arrangement of pixels, is an opportunity to resist the operationalized logic inherent in digital games. Doing so requires a different mentality toward play, though, one that sets aside the strategic efficiency implied by game designs and incentives in order to pursue other values such as sympathy and community, ambiguity and complexity, or revelation and awe.

The NPC meme may stick around for a while or it may go the way of so much other internet detritus. Regardless, the dehumanizing calculus at its core endures. The contemporary manifestation has found verdant soil in digitally mediated cultures, spreading on social media sites and in videogame communities. Responding to it will likely require not more logical arguments, but the ability to recognize the human in mere digital representations.


-------

Timothy J. Welsh is author of Mixed Realism: Videogames and the Violence of Fiction. Welsh is assistant professor of English at Loyola University, New Orleans.

"In Mixed Realism, Timothy J. Welsh proposes a fresh approach to understanding digital games and contemporary literature that is essential, relevant, and engaging."
—Zach Whalen, University of Mary Washington


-------

REFERENCES:

Crogan, Patrick. Gameplay Mode. Minneapolis: University of Minnesota Press, 2011. http://www.upress.umn.edu/book-division/books/gameplay-mode.

D’Anastasio, Cecilia. “How The ‘NPC’ Meme Tries To Dehumanize ‘SJWs.’” Kotaku (blog), October 5, 2018. https://kotaku.com/how-the-npc-meme-tries-to-dehumanize-sjws-1829552261.

Der Derian, James. Virtuous War: Mapping the Military-Industrial-Media-Entertainment Network. Boulder, CO: Westview Press, 2001.

Galloway, Alexander R. Gaming: Essays on Algorithmic Culture. Minneapolis: University of Minnesota Press, 2006.

Infinity Ward. Call of Duty: Modern Warfare 2. Activision, 2009.

Rockstar North. Grand Theft Auto: San Andreas. Rockstar Games, 2006.

Roose, Kevin. “What Is NPC, the Pro-Trump Internet’s New Favorite Insult?” The New York Times, October 19, 2018, sec. U.S. https://www.nytimes.com/2018/10/16/us/politics/npc-twitter-ban.html.

Triple Zed and Y.F. “Wojak / Feels Guy.” Know Your Meme. Accessed January 21, 2019. https://knowyourmeme.com/memes/wojak-feels-guy.

Wagner, John. “Trump Says Curbing Arms Sales to Saudi Arabia in Response to Missing Journalist Is ‘Not Acceptable.’” Washington Post, October 11, 2018, sec. Politics. https://www.washingtonpost.com/politics/trump-reluctant-to-curb-arm-sales-to-saudi-arabia-in-response-to-missing-journalist/2018/10/11/85c71212-cd4a-11e8-a360-85875bac0b1f_story.html.

Welsh, Timothy J. Mixed Realism: Videogames and the Violence of Fiction. Minneapolis: University of Minnesota Press, 2016.

Friday, January 25, 2019

Fashioning Feminism: On Bodies of Information.




BY ELIZABETH LOSH
William & Mary


What does a bulletproof dress prototype have to do with the digital humanities?

A lot actually, according to artist micha cárdenas. Such a garment, which was crafted from Kevlar airbags scavenged from a junkyard, could be capable of stopping a 9mm bullet. It’s one of the objects featured in the latest addition to the Debates in the Digital Humanities series, Bodies of Information: Intersectional Feminism and Digital Humanities.

As a piece of apparel, the dress dramatizes the higher risk of mortality that people of color face in confrontations with law enforcement. Of course, communities allied with #BlackLivesMatter are also deploying statistics, metadata (like hashtags), and even information visualizations to quantify how the inequities of state power do violence to black and brown bodies, as well as how activists can mobilize in response. Nonetheless, the metallic clothing created by cárdenas represents a critical kind of “embodied gesture” that she argues is as essential as big data number crunching, if not more so.

Others in this new collection, such as Marcia Chatelain – creator of the #Ferguson Syllabus – and Beth Coleman of the City as Platform Lab, similarly make the argument that #BLM should present central rather than peripheral concerns for digital humanities practitioners in the academy.

Furthermore, digital humanities scholars “can extend their work to be more accessible to low-income people,” cárdenas writes, “and to considerations of nondigital technologies, by abstracting the concept of algorithms to include recipes and rituals.”

Bringing DIY craftivism to the digital humanities is a commitment for Kim Brillante Knight as well. This scholar of “viral media” uses an unusual form of data visualization to depict the frequency of the use of the #prolife hashtag on Twitter. Rather than show a word cloud or network graph, Knight uses five pink LEDs as a meter to measure the occurrences of the relevant tweets.

“The medium for the visualization is a black T-shirt,” Knight explains, “onto which I have hand embroidered reproductive organs: a uterus, fallopian tubes, cervix, and part of a vagina.” The project also uses microcontroller technology and conductive thread.

Other evocative objects – such as yearbook photos – become artifacts of critical reflection in the new volume. Texas A&M professor Amy Earhart describes the unintended consequences of digital humanities projects that reveal sites of institutional shame. For example, she includes images from a project digitizing college memorabilia that reveal photos of student organizations with members proudly “wearing their Klan robes, with typical cross insignia, hoods, and brandishing swords.”

Rather than merely digitizing archives without reflecting on their design – who is included, what is excluded, and why some histories are deemed not worth preserving – this collection encourages digital humanities researchers to question what gets privileged in a library of rare materials and how digital archives can foster different perceptions of the historical record.

Brandeis medievalist Dorothy Kim, who has been a lightning rod for alt-right abuse online, invites us to consider what gets lost when we only experience the digital copy of a text. Kim notes that only its visual elements are captured; its other sensory features are lost. “Medieval reading practices were not linear,” Kim asserts, “often required vocality to read out loud or sing out loud, ideally required slow and repetitive rereading, were emotive, and involved sound, smell, touch, taste, visual, and even bodily calisthenics.”

The epistemological rethinking that digital technologies make possible is highlighted in many of the groundbreaking essays in the volume, including “Toward a Queer Digital Humanities” by Bonnie Ruberg, Jason Boyd, and James Howe.

Ruberg, Boyd, and Howe articulate basic principles: “If queer knowledge always resists completion, it becomes clear that queering metadata means more than adding new vocabulary to existing taxonomical systems. Queerness also points toward a shift in the very methodologies of metadata collection. To queer metadata, queer thinking must be brought to bear on the conceptual models and tools of object description as well as its content.”

The collection even includes Deb Verhoeven’s “Be More Than Binary” challenge to the international digital humanities community, as well as a number of essays that question what it means to speak of “community” in the digital humanities at all.

In emphasizing the importance of feminist digital humanities, this collection does much more than merely highlight digital archives that commemorate the previously hidden accomplishments of women. In addition to acknowledging transgender and nonbinary forms of digital humanities, these essays consider what is feminized as well as what is female. For example, Sharon Leon acknowledges the many professional roles that disprove the “Great Man” myth. And Julia Flanders encourages her audience to interrogate assumptions about all technical systems of knowledge production as they think about both print and digital publication processes. Flanders reflects upon how her own Women Writers Project “mirrored a shift in feminist theory from a second-wave attention to the visibility and rights of women . . . to a third-wave focus on how the structure of discourse enacts and reinforces cultural power dynamics of gender, race, class, coloniality, and other differentials.” There is also a wonderful essay by Susan Brown, who celebrated the 20th anniversary of Orlando recently, that deconstructs aversions to tropes of delivery and service associated with the “handmaiden” position in the digital humanities with an incisive reading of Atwood’s The Handmaid’s Tale.

As we performed our own informational labor as the editors/handmaids of this book – collating comments from the peer-to-peer review process or indexing the key terms in the volume – we found ourselves marveling at the sophistication of the feminist thinking modeled in this collection and the fundamental questions that it explored. Sadly a single blog post can’t do justice to the dazzling array of ideas in a table of contents that concludes with two essays about why videogame design and analysis of its player community practices might rightly belong with the growing corpus of digital humanities scholarship.

Readers are likely to appreciate how this book challenges existing attitudes and stereotypes about a rapidly expanding field. As an added benefit, with its affordable cover price and open access launch in a few months, Bodies of Information also offers a rich set of resources for students who are interested in exploring how digital technologies can promote activist scholarship, community alliances, and public engagement in the academy.


-------

Elizabeth Losh is associate professor of English and American studies at The College of William & Mary with a specialization in new media ecologies. She is coeditor, with Jacqueline Wernimont, of Bodies of Information: Intersectional Feminism and Digital Humanities; author of Virtualpolitik and The War on Learning: Gaining Ground in the Digital University; and coauthor of Understanding Rhetoric: A Graphic Guide to Writing.

Friday, January 18, 2019

Frankenstein and anonymous authorship in eighteenth-century Britain.




BY MARK VARESCHI
University of Wisconsin–Madison



Having celebrated its 200th anniversary in 2018, Mary Shelley’s Frankenstein is perhaps one of the best-known novels of the early nineteenth century. While many are familiar with Shelley’s classic novel and can immediately picture some version of the work’s iconic monster, few are aware that when Frankenstein was first published in 1818 it was an anonymous novel. Nowhere on the title page does the name “Mary Shelley” appear. The novel that began its life as an exercise in writing a ghost story during the particularly cold and wet summer of 1816 at Lord Byron’s Villa Diodati in Geneva was conveyed to the British reading public with no indication of its author’s name.

Indeed, when in 1817 Mary Shelley’s husband Percy tried to help her sell the completed novel to his publisher, Charles Ollier, he did so without disclosing the author’s name. Percy Shelley wrote to Ollier: “I send you with this letter a manuscript which has been consigned to my care by a friend in whom I feel considerable interest.” The manuscript was rejected. Acting again as Mary Shelley’s agent, Percy Shelley would eventually find a publisher for the novel in Lackington and Co., but Mary Shelley’s name was withheld. Percy Shelley referred to the novel in a letter to the publisher as “not [his] own production, but that of a friend…”




Shortly after the novel was published in January 1818, with a print run of 500 copies, reviews of the novel began appearing in periodicals. Some reviewers, noting the novel’s anonymity, hazarded an attribution. Walter Scott in Blackwood’s Edinburgh Magazine wrote: “It is said to be written by Mr Percy Bysshe Shelley, who, if we are rightly informed, is son-in-law to Mr Godwin; and it is inscribed to that ingenious author.” (Mary Shelley would later write to Scott to correct this error.) An anonymous reviewer in The Literary Panorama, and National Register reported in its review of Frankenstein: “We have heard that this work is written by Mr. Shelley; but should be disposed to attribute it to even a less experienced writer than he is. In fact we have some idea that it is the production of a daughter of a celebrated living novelist.” The British Critic was even crueler in its dismissal:

The writer of it is, we understand, a female; this is an aggravation of that which is the prevailing fault of the novel; but if our authoress can forget the gentleness of her sex, it is no reason why we should; and we shall therefore dismiss the novel without further comment.

Given the harsh reviews Frankenstein endured from many, though not all, critics, and the obvious antipathy some of them held toward women writers, we might not be surprised that the novel was published anonymously. Indeed, we may wish to attribute some causal relationship between the expected reception of Frankenstein and its woman author and the decision to publish the novel anonymously. As Susan Eilenberg notes, however, “there was nothing peculiarly feminine about anonymity, nor anything very uncommon about it, either.”

Indeed, the novel in English emerges over the course of the long eighteenth century as a largely anonymous form. As James Raven asserts, “it is clear that the overwhelming majority of the English novels of the eighteenth and early nineteenth centuries were published without attribution of authorship on the title page or within the preface or elsewhere in the text.” In 1818, the year Frankenstein appeared, 62 new novels were published in Britain and Ireland; 41 of them (66%) were published anonymously. Of the 21 novels that appeared with their authors’ names attached, 5 were attributed to male authors and 16 to female authors. These figures upend assumptions we might make about gendered authorship as well as about how atypical authorial anonymity was for the novel.




In Everywhere and Nowhere: Anonymity and Mediation in Eighteenth-Century Britain, I argue that because anonymity was typical of texts published (and performed) in the long eighteenth century, we must rethink both how we approach anonymous texts and how we attribute motives to authors to account for that anonymity. I suggest we move from approaching anonymity as a product of an individual author’s choice to understanding it as an aspect of textual production. We tend to assume that anonymity is a choice made by an author and that named authorship is the default state. Publication history, however, suggests otherwise – anonymity was the default state, particularly for new novels like Frankenstein. We might understand the anonymity of a novel like Frankenstein along the lines that we understand Lackington and Co. issuing the novel in three volumes – the typical physical form of novels in the late eighteenth and early nineteenth centuries. That is, the anonymity of Frankenstein (and of the other 40 novels published anonymously in 1818), while informed by the author’s choice and individual motives, is as much a product of the forces of generic expectation, publication practices, and the collective actions that bring a literary text into being in the world.


-------

Mark Vareschi is author of Everywhere and Nowhere: Anonymity and Mediation in Eighteenth-Century Britain. Vareschi is assistant professor of English at the University of Wisconsin–Madison.

"This is fresh, compelling, detail-rich scholarship and essential reading."
—Brad Pasanek, author of Metaphors of Mind: An Eighteenth-Century Dictionary

"Everywhere and Nowhere is that rare thing: a genuinely interdisciplinary study, capacious and illuminating, of how anonymous authorship impacts meaning across genres and media. In Mark Vareschi’s hands, anonymity is transformed into a lens for reexamining the most fundamental literary concepts (authorship and intention, medium, textuality) and renovating them—not just in the domain of print but across the rich media ecologies of the eighteenth century."
—Michael Gamer, University of Pennsylvania


-------

References:
-Percy Bysshe Shelley, The Letters of Percy Bysshe Shelley, ed. Frederick L. Jones, 2 vols. (Oxford: Clarendon Press, 1964), Vol. I, p. 549; p. 553.
-Blackwood's Edinburgh Magazine 2 (March 1818): 613-620.
-The British Critic, N.S., 9 (April 1818): 432-38.
-Susan Eilenberg, “Nothing’s Nameless: Mary Shelley’s Frankenstein,” in The Faces of Anonymity, ed. Robert J. Griffin (New York: Palgrave, 2003), 171.
-James Raven, “The Anonymous Novel in Britain and Ireland, 1750-1830,” in The Faces of Anonymity, ed. Robert J. Griffin (New York: Palgrave, 2003), 143; 164.

Friday, December 7, 2018

The Most Dangerous Book in the World




BY SETH PERLOW
Georgetown University


One book, written by a computer, could have killed us all.

What do you do when you’re the only country in the world with atomic bombs? You make them much, much bigger. That was the US strategy right after World War II. The Cold War was beginning, and by 1952 the US would have a weapon 690 times as powerful as the one dropped on Hiroshima. To make such a gigantic explosion, the scientists at Los Alamos first needed to create a very strange book, one that would become an important artifact in the history of computing.

The book is called A Million Random Digits with 100,000 Normal Deviates (1947), and it contains precisely that: a huge table of random digits. It’s about the size of a phonebook. Los Alamos scientists used it to do the calculations necessary for designing thermonuclear weapons. As you might have learned in high school, the motion of subatomic particles is chaotic, so the bomb’s designers had to account for randomness in their calculations. They soon discovered that their random numbers were—well, not random enough. Apparently, picking numbers out of a hat just wasn’t scientific enough. So in 1947 they asked the RAND Corporation, a military think-tank, to produce a very large table of very random digits.





The so-called RAND book illustrates a familiar theme: many of today’s electronics emerged from military research. For example, Norbert Wiener formalized cybernetics based on his efforts to automate antiaircraft guns. Alan Turing developed key principles for computer science, including the Turing test of artificial intelligence, after breaking Nazi codes for the British intelligence service. One scientist involved with A Million Random Digits, John von Neumann, also devised the von Neumann architecture, an influential blueprint for computer hardware. The Internet itself began as ARPANET, a project by the Department of Defense group now known as DARPA. (In 1977 the entire Internet looked like this; note the names of universities, corporations, and military bases.) This group also claims to have developed the computer mouse—only partly true—and they’re responsible for those creepy walking robots you might have seen online.

By 1945, academic and corporate research had become integrated with the war effort. The University of California managed Los Alamos, but until the end of the war only one UC official knew its purpose or even which state it was in. The RAND Corporation, short for Research ANd Development, started as a collaboration between the Douglas Aircraft company and the US Air Force; its first publication was the prescient Preliminary Design of an Experimental World-Circling Spaceship (1946). Similar groups included the Institute for Advanced Study, home of Albert Einstein, and AT&T’s Bell Labs, which developed the solid-state transistor during the same year work on the RAND book began. Decades before Google became known for its casual work environments, these groups recognized that a little disorder in the workplace fosters innovation. Researchers were rarely held to specific performance standards and were encouraged to collaborate across disciplinary boundaries. If your company has a foosball table or free beer on Fridays, you might have the military-industrial complex to thank.

A big supply of random digits made it possible to do a new kind of math, called the Monte Carlo method, which simulates the movements of particles in a nuclear reaction. Monte Carlo math uses random sampling to make calculations. Here’s an example to illustrate how it works: if I draw two squiggly shapes on the ground, I can compare their areas by sprinkling grains of rice over them and counting how many grains fall inside each shape. Sprinkling more rice yields more accurate results but requires more tedious calculations, more counting. The scientist Stanislaw Ulam supposedly came up with the Monte Carlo method while sick in bed, playing solitaire. He realized he could figure out the probability of winning a solitaire game by dealing lots of sample games and checking how many were winnable. He purportedly named the method after his uncle, who liked to gamble. Ulam shared his ideas with his Los Alamos colleagues, including John von Neumann and Nicholas Metropolis. Together they formalized the technique. Like von Neumann, Metropolis made other contributions to computer science too, designing and naming the MANIAC computers in the 1950s. The task of computing randomness helped bring them together.
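The rice-sprinkling comparison translates almost directly into code. The sketch below is my own illustration rather than anything RAND or Los Alamos ran: it estimates a region’s area by scattering random points over a bounding square and counting the fraction that land inside, and, just as with the rice, more samples mean a better estimate at the cost of more counting.

import random

def monte_carlo_area(inside, box_size, samples):
    """Estimate a region's area by random sampling ("sprinkling rice").

    `inside(x, y)` says whether a point falls within the region;
    `box_size` is the side length of the square we sprinkle over.
    """
    hits = 0
    for _ in range(samples):
        x = random.uniform(0, box_size)
        y = random.uniform(0, box_size)
        if inside(x, y):
            hits += 1
    # Fraction of grains that landed inside, times the square's area.
    return (hits / samples) * box_size ** 2

# Example: a quarter circle of radius 1 has area pi/4, about 0.785.
print(monte_carlo_area(lambda x, y: x * x + y * y <= 1.0, 1.0, 100_000))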

Generating random digits is surprisingly difficult. Computers cannot produce randomness on their own because their design is based on strict logic. Asking a deterministic machine to pick a random number is like asking your microwave to have a favorite color: it just doesn’t compute. Meanwhile any manual process, like flipping a coin or drawing numbers from a hat, would take too long and might not produce truly random results. To solve this problem, scientists designed an electronic “roulette wheel,” which was basically a virtual model of a wheel with a slot for each digit from 0 to 9. They set this wheel to “spin” at 3,000 times per second and then connected it to a random-frequency pulse. With each pulse, the machine would record the position of the wheel at that moment, and they’d have one randomly selected digit.
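The hardware details are only partly documented, but the logic of the scheme can be illustrated with a toy simulation, offered here purely as an analogy rather than a reconstruction of the actual device: a counter cycles through the digits 0 to 9 far faster than the timing of the pulses, and each irregularly timed pulse freezes the “wheel” and reads off whichever digit happens to sit under the pointer.

import random  # stands in here for the physical noise source

DIGITS_PER_SECOND = 3_000 * 10  # 3,000 "revolutions" per second, 10 slots each

def digit_under_pointer(elapsed_seconds):
    """Which digit the fast-spinning wheel shows at a given instant."""
    return int(elapsed_seconds * DIGITS_PER_SECOND) % 10

# Pulses arrive at irregular intervals (faked with an exponential wait,
# standing in for the vacuum-tube noise source); each pulse records
# whatever digit the wheel happens to be showing at that moment.
t, digits = 0.0, []
for _ in range(20):
    t += random.expovariate(1.0)           # wait a random slice of time
    digits.append(digit_under_pointer(t))  # freeze the wheel and read it
print(digits)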

But where did this random-frequency pulse come from? No one is certain. Given that people at Los Alamos were fiddling with radioactive elements, some have speculated the random pulse came from a Geiger counter pointed at a piece of uranium. Such elements have a steady rate of decay (the half-life), but they emit particles at random intervals, hence the Geiger counter’s weird clicking. It would be quite elegant if a Geiger counter were used for the random pulse. This would mean that the unpredictable subatomic motion the scientists needed random digits to simulate was the very same unpredictable motion scientists used to generate the random digits.

Unfortunately, it probably isn’t true. Because of the bomb, radioactive elements had become precious and would not just be lying around for odd jobs. More likely a kind of vacuum tube provided the random pulses that told the machine when to stop the wheel. The whole apparatus was hooked up to an IBM punch card device and left running. Ironically, the machine had to be reset at least once because it was breaking down. But in this case, “breaking down” means it was becoming too systematic, not random enough. Likewise, the tables were printed directly from the computer printouts because it was feared a human transcriber would introduce errors into this untainted sea of randomness. A newsletter at Los Alamos joked that librarians would shelve the book under “abnormal psychology.” Today the book’s Amazon page offers other hilarious reviews, one of which calls the randomization “sloppy” because “at the lower left and lower right of alternate pages, the number is found to increment directly.”

These days when a computer needs a random number, there are two common possible sources. It can select from a limited table of random digits stored in its memory—a table sometimes copied from A Million Random Digits, which is available gratis online. Or else the computer uses a formula to generate a “pseudorandom” number, one that’s close enough to random for most reasonable purposes but not random enough for advanced applications like designing thermonuclear weapons. New techniques for generating random numbers continue to emerge, some of which look to the natural world for a source of randomness, as the RAND scientists seem to have done. One recent project, called Lavarand, trained digital cameras on a bank of lava lamps and derived random numbers according to the random shapes they make. The tech firm Cloudflare apparently still uses this technique to encrypt a significant portion of internet traffic.
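For a sense of what such a formula looks like, here is one classic recipe, a linear congruential generator, shown purely as an illustration and not as what any particular machine actually uses: each value is a fixed arithmetic function of the one before it, so the output looks scattered but is entirely determined by its starting seed.

def lcg_digits(seed, count, a=1664525, c=1013904223, m=2**32):
    """A toy pseudorandom generator: deterministic, but "random-looking".

    Feed it the same seed twice and you get exactly the same sequence,
    which is why the numbers are only "pseudo" random.
    """
    x = seed
    out = []
    for _ in range(count):
        x = (a * x + c) % m
        out.append(x % 10)  # reduce each value to a single decimal digit
    return out

print(lcg_digits(seed=2019, count=25))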

The RAND book represents one big step in a long history of doing math with randomness. The book of digits and the Monte Carlo method have found uses in a range of fields, from thermodynamics and environmental engineering to statistics and finance. A related method, known as the “random walk,” lent its name to a popular book about investing, Burton Malkiel’s 1973 bestseller A Random Walk Down Wall Street. Randomness remains important in a variety of computer applications too. Weather models use randomness to simulate turbulence in the atmosphere. Video games use random numbers to make computerized enemies behave more naturally, less predictably. In fact, that’s how I first heard of A Million Random Digits, in a footnote about randomization in Ian Bogost and Nick Montfort’s excellent Racing the Beam: The Atari Video Computer System.

The RAND book has also interested at least one experimental poet, Jackson Mac Low, who used it to randomize his writing process. Like many other experimental writers, Mac Low employed procedures known as “chance operations,” strategies to make writing a bit more chaotic. Through chance operations, writers hope to minimize the role of personal choice in their work. When the words in a poem appear by chance, not by choice, then perhaps the poem reflects something other than the author’s personal biases. Writers doing chance operations typically use household equipment like dice, a deck of cards, or even words pulled out of a hat. So it’s strange that Mac Low often drew numbers from A Million Random Digits to perform his chance operations. If the point is to disrupt the influence of social and historical contexts, then why choose a piece of equipment with such a grim origin story? Mac Low used the RAND book for a variety of projects during his long career, but it was especially important for his rewritings of texts by the modernist writer Gertrude Stein. As I argue in the second chapter of The Poem Electric, there are surprising resonances between the RAND book and Stein’s work. By using A Million Random Digits to make poetry, however, Mac Low also hoped to redeem the creative energies of the talented scientists who first made this book for such dark purposes.


-------

Seth Perlow is assistant teaching professor of English at Georgetown University. He is author of The Poem Electric: Technology and the American Lyric and edited Gertrude Stein’s Tender Buttons: The Corrected Centennial Edition, which earned a Seal of Approval from the MLA Committee on Scholarly Editions.

"The Poem Electric is a highly original investigation of how ‘electronics enable poets and their readers to animate and rework, rather than reject and surpass, familiar lyric norms.’"
—Marjorie Perloff, author of Radical Artifice and Unoriginal Genius

"Seth Perlow presents a magnificent challenge to the current fashion of ‘big data’ and mathematized literary analysis. The Poem Electric shows how qualitative, lyric intensities embody dispositions that are of indispensable value to us, and which are in productive tension with the world of screens and memes that we inhabit. It represents a wonderful challenge to so many of our assumptions about the value of technology to the humanities and the place of the lyric in our technologized lifeworlds."
—Joel Nickels, author of World Literature and the Geographies of Resistance

Wednesday, November 28, 2018

Can apps care for healthcare?




BY ALISON KENNER, PhD
Drexel University



When Cheryl Lansing discovered that her asthma care app had disappeared from her smartphone, she was unsettled, to say the least. Cheryl had used the app, recommended by her health insurance company, several times a week for about three months before it faded into the sea of apps that had accumulated on her phone. She had not missed the app, not until I asked her for a follow-up interview to talk about its features. Part of my research for Breathtaking focused on the rise of mobile health apps, and Cheryl was the one and only person I had interviewed who mentioned using a health app. When she realized it was gone, however, a whole new set of questions emerged: Who had access to Cheryl’s data? What was the data being used for? And how could she recover her records?

Together, our quick and cursory search – coordinated over the course of thirty minutes via email – revealed that Cheryl’s asthma care app was not (as she believed) owned by her insurance company, but rather by a third party company. In the time since Cheryl began using the app, the insurance and health technology companies had parted ways. As a result, the app was no longer available online and there was no further information about the relationship between the two organizations. Cheryl was especially concerned because she had used her insurance ID to create an account in the app, and she had provided information about her pharmaceutical prescriptions and medication regimes. The feature that Cheryl used the most, however, was the daily “How are you breathing?” logbook, which allowed Cheryl to index her asthma based on her sense of breathing. There were other features that Cheryl could use, too – she could enter her peak flow reading and track when she used her rescue inhaler, for example. But Cheryl did not measure peak flow readings at home and she avoided using her inhaler unless she was having a full-blown asthma attack. In fact, Cheryl was pretty healthy during the period when she used the asthma care app.

The daily breathing prompt was most useful for Cheryl because it helped her to gauge, from day to day, what her breathing felt like. Otherwise she only noticed if her breathing was restricted on a particular day, or for a sequence of days; this was quite common among asthma sufferers I interviewed. The asthma care app, however, got Cheryl to check in with her breath intentionally, and track how her airways felt from morning to morning. The ritual of daily tracking, of course, is exactly what app designers want to encourage in users. Yet study after study has shown that health apps don’t stick long-term: After initial, enthusiastic adoption – often fueled by a desire to change behaviors – most health and wellness apps go unused. This was the case for Cheryl, who after using the asthma care app for a few months, went back to only noticing her breathing when something was wrong.

Like those of many people living with asthma, Cheryl’s symptoms were intermittent and seasonal. Her asthma always spiked in the springtime because of her pollen allergies, and in the summer on bad air quality days. She also had to be careful in the winter months, when cold air triggered her asthma. Because Cheryl had lived with asthma for most of her life – since elementary school – she felt that she knew how to care for her disease; she could sense symptoms emerging, and she knew when environmental conditions would make it difficult for her to breathe. This meant that, much like her asthma, Cheryl’s care practices had become normalized, so that she rarely even thought of herself as having a disease. This is precisely what many asthma care apps want to change about asthma care: They want to remind app users of their asthma (which may be intermittent and seasonal) in order to regiment care and keep medical costs down. The problem is, as my study showed, that many asthma care apps are just as fleeting and uncertain as the disease itself.

In recent years, health organizations including insurance companies, hospitals, general practitioners, university research centers, and pharmaceutical companies have rushed to the mobile app marketplace, anxious to launch platforms that will help patients, research subjects, and customers maximize preventative healthcare. It is a fine move. Many asthma care apps, for example, ask users to track peak flow readings (which gauge airflow restriction, and can be taken as an indicator of uncontrolled asthma), daily medication use, symptoms, and exposure to asthma triggers. Tracking, it is believed, can increase medical adherence and reduce the costs of emergency care. And having a record of symptom events and care practices that spans years may help app users see trends that they otherwise might only intuit. There is great potential for asthma care apps to document a chronic, often lifelong condition that may be fleeting and varied from year to year.

What I puzzle through in chapter four of Breathtaking is the relationship that emerging apps have to existing healthcare infrastructure. More specifically, I ask how these apps are situated in a system that does not always give patients enough information, where medication is prohibitively expensive, and care needs to be a continued conversation beyond a fifteen-minute clinical appointment. Does existing healthcare infrastructure have the ability to care for emerging asthma care apps? If not, what would it take to make it so? To ensure that apps with our health data do not just disappear on us; that we understand relationships between healthcare organizations and how they use our data; that apps are offered to us with robust explanations of how they can support and enhance our existing care practices?

Healthcare apps have great potential to fill existing gaps in infrastructure, but they cannot be expected to fix a broken system.


-------

Alison Kenner is author of Breathtaking: Asthma Care in a Time of Climate Change. Kenner is assistant professor in the department of politics and the Center for Science, Technology, and Society at Drexel University.

"Breathtaking is social science at its best: experiential, explanatory, critical, and providing ways forward. Alison Kenner herself is an active participant as community social-scientist and as partner to someone who suffers disordered breathing. She guides us vividly across scales and registers."
—Michael M.J. Fischer, author of Anthropology in the Meantime

"Breathtaking is a sweeping ethnographic account of asthma and its treatments that expertly traverses questions of lived experience, medical technology, and critical ecology as they bear on the epidemic of disordered breathing. Beautifully written and poignant, this book makes a robust contribution to our understanding of the health effects of environmental degradation and climate change, deepens the critiques of biomedicalization, and heralds the promise of complementary and alternative medicine."
—Anthony Ryan Hatch, author of Blood Sugar

Monday, November 19, 2018

Migration and global justice: North American economic migrants in Latin America




BY MATTHEW HAYES
St. Thomas University, New Brunswick



What happens when North American retirement ideals of adventure and personal growth collide with the material realities of a Latin American city undergoing rapid urban growth spurred by rural-to-urban migration?

This is a question I tried to answer in Gringolandia: Lifestyle Migration under Late Capitalism. North American retirees are not only developing new ways of aging and experiencing retirement, they are doing so in conditions of economic uncertainty and financial precarity unlike those of recent generations. The financial crisis of 2008 left many without the savings for the retirement they hoped for.

But thanks to a sharply unequal global economy, thousands of self-proclaimed ‘economic refugees’ have managed to rescue the retirement they dreamed of by offshoring it to Cuenca, Ecuador. Cuenca is a city of about 400,000 people located at 2,500m in the Andean Sierra, designated a World Heritage Site by UNESCO in 1999. Since the 2008 crisis, perhaps as many as 10,000 US Americans and Canadians have relocated there, drawn by rose-tinted depictions of its colonial-style charms in online marketing. American-run companies speculating in transnational real estate have been using internet algorithms to promote offshore retirement destinations to later-life workers unable to afford a middle-class North American retirement.

Not all those who retire to Cuenca are poor—though certainly a few would fall below the low-income cut-off in the US or Canada. They simply could not afford to age in place without a working income, especially in desirable but rapidly gentrifying urban spaces in cities like Houston, San Francisco, Portland and Toronto. In Cuenca, they can live easily in a city where the average income is about $700 per month, but where many live on much less.

North American migrants’ higher incomes are a major source of economic demand in a city now undergoing a makeover. The Ecuadorian government is promoting a One for One tourism campaign, designed to draw one tourist for each resident in a bid to make the sector the most important source of foreign revenue outside the oil sector. In 2018, tourism receipts are expected to rise to almost $1.8 billion US. Many of these tourism arrivals are lifestyle migrants settling permanently in a country whose constitution recognizes universal citizenship rights for resident foreigners. Many others visit on tours operated by international lifestyle marketers, trying on different destinations to see if they feel they can live in a Latin American setting.


Latin American Heritage Urbanism


Since the 2000s, foreign development agencies, the Inter-American Development Bank, and UNESCO have sought to promote Cuenca’s “Heritage Urban Landscape” and have facilitated access to loans for urban upgrading, in particular through the IADB’s “Emerging Sustainable Cities Program.” Their interventions position the city within the tourist gaze and appeal to North American settlers, and aim to increase land values and ground rents. While there are jobs for some, lower-income uses of El Centro are marginalized, and low-skilled workers and informal vendors are increasingly removed or prevented from accessing El Centro.

In the leisure space that is being built in El Centro, later-life North American migrants take advantage of opportunities to get outside their comfort zone and experience a new culture, broadening horizons that many say they felt collapsing around them as they aged into poverty at home.

They participate in new activities in their new home, where they can afford to go out frequently and have social lives that sometimes remind them of their college days. Their higher incomes enable them to occupy positions as patrons and benefactors of neighbours and young people that they could not afford at home. They set up charities aimed at the deserving poor—especially women and orphans—and position themselves as helpers in ways that shape their experience of aging successfully in Ecuador.

The colonial-style built environment that houses their new lifestyle experiments reflects unequal and unjust colonial social relations. Cuenca was built by a landowning elite, whose wealth came from the exploitation of a landless and often racialized peasantry—a significant part of which worked in conditions of indenture until the late 1960s. Cuenca, the seat of urban, European power, presided for centuries over a mestizo and indigenous countryside of small tenant farmers and indentured servants, who would come into the city to sell small agricultural or manufacturing surpluses to commercial and landowning elites. These latter recouped their money by selling manufactured items to their rural workers.

El Centro was abandoned in the 1950s and 1960s, as wealthier Cuencanos relocated to American-style suburbs. Its population dropped, but was sustained by an influx of rural-to-urban migrants from a countryside going through a rapid process of social transformation in the late 1960s and early 1970s. As Ecuador initiated a series of land reforms designed to redistribute land to indentured labourers, elites in the region around Cuenca sought to avoid redistribution by forcing previously indentured workers to accept title to small properties while retaining the most productive agricultural lands for themselves.


To the Victors, the Spoils


The preservation of El Centro now reproduces the injustices of the past. El Centro’s UNESCO “Heritage Urban Landscape” enshrines the tastes and property of European-oriented landowning elites, but marginalizes popular and rural traditions, particularly the informal vending practices and uses of public space in El Centro.

Restoration projects prioritize the tourism uses of public space, and rising rents push lower-income households toward self-built suburbs on the city’s edges. A new tram project is set to open later this year or early next, but its cost will be recouped in part through fare increases, and the lack of bus transportation across El Centro from east to west leaves many Cuencanos wondering whether the hundreds of millions of dollars were well spent.

The successes of Cuenca as a lifestyle migrant destination are not shared equally. The Ecuadorians who benefit most from the building boom that has accompanied the North American migration are the landowning elites, self-described as ‘nobles’ and Spanish-descendant. One wealthy family that has diversified into construction owns large tracts of the neighbourhood now called Gringolandia—the North American ethnic-ghetto which serves as a gateway for many prospective lifestyle migrants to the rest of the city and region.

Some middle-class Cuencanos with experience studying or working in the United States benefit from new business opportunities and a wider range of restaurants. But a third of the Cuencano workforce is dependent on informal labour, such as street vending. While some no doubt benefit, urban interventions create higher-income spaces and marginalize lower-income people. While informal vendors and lower-income workers struggle to remain in place, the city around them is changing under forces that are not within their control, and that draw in financial interests quite far removed from the Andean Sierra.


Migration and Global Inequality

The ease with which North Americans can relocate their lives to Cuenca and displace lower-income workers smacks of a sort of colonialism most lifestyle migrants eschew and seek to mitigate, escape, or resist. Despite attempts to make amends for their whiteness (almost all are white) and their privilege, they identify with all the advantages of having higher incomes and higher status in a lower-cost and lower-income community.

Their experiences as migrants differ completely from those of lower-income workers trying to find work in the United States or Canada. While the latter are met with an increasingly restrictive and militarized border regime, lifestyle migrants fill out paperwork and are welcomed with full citizenship rights denied to Latin American migrants moving north. Among the benefits they enjoy is access to Ecuador’s public medical system for operations and check-ups, a service Ecuadorians spend a lifetime paying into but that foreign residents access for only a small fee (about $70 per month at the time of writing).

Cuenca and similar heritage cities in Latin America offer a picture of the world we are entering, one where inherited inequalities are multiplied in perverse forms, and the benefits of transnational mobility and facility of telecommunications are shared unequally and unjustly. For some, Cuenca has become a safe haven from precarity. For others, it is a home that is being taken away and transformed.


-------

Matthew Hayes is author of Gringolandia: Lifestyle Migration under Late Capitalism. Hayes is the Canada Research Chair in Global and International Studies at St. Thomas University in Fredericton, New Brunswick.

"Matthew Hayes provides a vivid sociological portrayal of North Americans living in Ecuador alongside a theoretically sophisticated analysis of the global inequalities that shape growing north-south migration. Gringolandia is a must-read for students and scholars interested in a complex understanding of transnational migration in the context of 21st century globalization."
—Sheila Croucher, author of The Other Side of the Fence: American Migrants in Mexico

"Gringolandia offers a refreshing and powerful new perspective on lifestyle migration that demonstrates how it is caught up in the production of global inequalities informed by colonial legacies, the structures and practice of planetary gentrification, and the local class struggles this portends. Through his up-close ethnographic observations of the lives and motivations of North Americans living in Ecuador, Matthew Hayes presents a timely and sorely needed intervention that straddles the sociology of migration and urban studies, woven together through a deep concern with decoloniality."
—Michaela Benson, Goldsmiths, University of London

Monday, November 12, 2018

#UPWeek | #ReadUP | University Press Week: Adrienne Kennedy inducted into the 2018 Theater Hall of Fame for Lifetime Achievement





People will be reading
Adrienne Kennedy's works
for centuries to come.

—Henry Louis Gates, Jr.

***

Adrienne Kennedy has been a force in American theatre since the early 1960s, influencing generations of playwrights with her hauntingly fragmentary lyrical dramas. Kennedy is a three-time Obie Award-winning American playwright whose works have been widely anthologized and performed around the world. Among her many honors are a Guggenheim fellowship and the American Academy of Arts and Letters award. In 2018, The New York Times called her "one of the American theater’s greatest and least compromising experimentalists." In 1995, critic Michael Feingold of the Village Voice wrote, "with [Samuel] Beckett gone, Adrienne Kennedy is probably the boldest artist now writing for the theater." On this day, Adrienne Kennedy will be inducted into the 2018 Theater Hall of Fame for Lifetime Achievement at the Gershwin Theatre in New York City.

To mark this tremendous honor, we are posting here an excerpt from The Adrienne Kennedy Reader (2001), the first comprehensive collection of her most important works, including the Obie-winning Funnyhouse of a Negro (1964).


***

On the Writing of Funnyhouse of a Negro

Funnyhouse of a Negro was completed in Rome, Italy, the week before our second son Adam was born in Salvator Mundi hospital. I was twenty-nine. And I believed if I didn't complete this play before my child's birth and before my thirtieth birthday I would never finish it.

My son Joe Jr. and I lived in a beautiful tranquil apartment about fifteen minutes from Piazza di Spagna. Hall steps led to a miniature living room that opened onto a terrace that overlooked Rome. I sat at the dark desk in the cool miniature room with pages I had started in Ghana on the campus of Legon (Achimota Guest House). They seemed a disjointed raging mass of paragraphs typed on thin transparent typing paper I had bought at the campus of Legon's bookstore. The entire month of July each morning when my son Joe went to Fregene with a play group of children run by an American couple, I tried to put the pages in order. 

Ten months earlier at the end of September 1960 my husband Joe and I left New York on the Queen Elizabeth. It was my first sight of Europe and Africa. We stopped in London, Paris, Madrid, Casablanca and lived in Monrovia, Liberia before we settled in Accra, Ghana.

The imagery in Funnyhouse of a Negro was born by seeing those places: Queen Victoria, the statue in front of Buckingham Palace, Patrice Lumumba on posters and small cards all over Ghana, murdered just after we arrived in Ghana, fall 1960; the savannahs in Ghana, the white frankopenny trees; the birth of Ghana newly freed from England, scenes of Nkrumah on cloth murals and posters. And this was the first time in my life that it was impossible to keep my hair straightened. In Ghana and for the rest of the thirteen-month trip I stopped straightening my hair.

After Ghana in February 1961 I had chosen Rome to wait for my husband to finish his work in Nigeria. Rome was the land my high school Latin teacher had sung of: the Forum, the Tiber, the Palatine, Caesar. When my son Joe was at the Parioli Day School I walked in the Forum for hours that spring of 1961. I rode the bus on the Appian Way, the rhythms of my teacher speaking out loud in my mind. Wandering through Rome while Joe was at school I was more alone than I had ever been. At noon I returned to the Pensioni Sabrina for lunch, often a pasta soup made of star-shaped pasta, then went into our room while waiting for my son to return on the bus at the American Embassy and stared at the pages. There were paragraphs about Patrice Lumumba and Queen Victoria. I had always liked the Duchess of Hapsburg since I'd seen the Chapultapec Palace in Mexico. There were lines about her. But the main character talked in monologues about her hair and savannahs in Africa. At that moment Funnyhouse of a Negro and The Owl Answers were all a part of one work. It wasn't until late July and the impetus of my son's impending birth that the two works split apart and my character Sarah (with her selves Queen Victoria, Patrice Lumumba, Duchess of Hapsburg and Jesus) was born.

In May, two months earlier, my mother had written me that my father had left Cleveland and returned to Georgia to live after thirty-five years. I cried when I read the letter, walking from American Express up the Piazza di Spagna steps. So Jesus (who I had always mixed with my social worker father) and the landscape and memories of Georgia and my grandparents became intertwined with the paragraphs on the Ghanian savannahs and Lumumba and his murder.

So trying (for the first time in my life) to comb my unstraightened hair, trying to outrace the birth of my child, rereading the divorce news letters from my mother . . . in the July Italian summer mornings, alone in the miniature room, near the Roman Forum, I finished Funnyhouse of a Negro the last week of July 1961. Our son Adam was born August 1.


***


Also published by University of Minnesota Press:
In One Act by Adrienne Kennedy
Deadly Triplets by Adrienne Kennedy


***


The University Press Week blog tour begins today and continues throughout the week. Today, Duke University Press writes about its partnerships with museums. Athabasca University Press offers a playlist by author Mark A. McCutcheon. Rutgers University Press dedicates a post to Junctures in Women's Leadership: The Arts by Judith Brodsky and Ferris Olin. Over at Yale University Press, check out a post by author Dominic Bradbury about how immigrants enrich a country's art and architecture. Please enjoy all of these great #TurnItUP posts!

Happy #UPWeek and remember to #ReadUP.

Thursday, November 8, 2018

Sonic Science Fiction: Programming the Thought Synthesizer




BY TRACE REDDELL
University of Denver


One of the challenges I faced while researching and writing The Sound of Things to Come: An Audible History of the Science Fiction Film concerned the terminology of the “new” and the role of “futurity.” Early drafts of the project organized thematic clusters that brought together films from very different eras in order to emphasize several tonal centers. I have been working now with these in more performative contexts to explore the ways in which individual films might constitute the components of a larger modular thought synthesizer. Could the disruptive cuts of Godard’s Alphaville (1965), for instance, function as a step-sequencing module to control the theremin-drenched soundscapes of Kurt Neumann’s Rocketship X-M (1950) in order to produce an acoustic ecology in which cosmic situations resonate with Cold War dread by offering a scalar attunement to an atomized post-linguistic? Or, can the cosmic engine of Sun Ra’s Moog outbursts in John Coney’s Space Is the Place (1974) introduce the blackness of the AfroStrange as a frequency modulator to attenuate the Wagnerian whiteness of Lucas’s Star Wars (1977) or Spielberg’s Close Encounters of the Third Kind (1977)? These are still open questions and experiments-in-progress as I regard my book less as the documentation of concluded research than as a composition handbook, a score or schemata for new directions and, yes, sounds of things to come.

Conceiving of science fiction (SF) film history not as a timeline of works by composers, musicians, and technicians who build on each other’s work but rather as a proliferation of strategies for “making different,” I have come, through this project, to reject the terminology of innovation and instead to promote estrangements at once technical, material, narrative, cognitive, and speculative. The “audible history” that I have ended up with presents a chronology, with each chapter covering about a decade of SF film history from the early 1920s to the end of the 1980s. But I hope this chronology modulates itself over time by activating three compositional modes—the ambient glide, the shimmering fringe, and the xenomorphic—which repeatedly push time out of joint and liquefy historical reference points into a flux state. Not components of the book as modular thought synthesizer but rather techniques for assembling and methods of playing it, these three modes share a propensity toward sonic destabilization. That is, they both work against time and attenuate space while never disavowing the apparent inescapability, if not absolute necessity, of time and space as constituents of what we call sound. I will briefly consider each and how readers might expect them to resound with their experience of The Sound of Things to Come.


Ambient glide

SF sounds are ontologically unstable, neither here nor there but always shifting and drifting across categories of place. The ambient glide of sonic science fiction is initiated by the push-pull of the theremin’s siren call in Rocketship X-M. As the sound of Martian psychogeography, the destabilized tonalities of the theremin call the American expedition to Mars. The instrument is barely audible during liftoff but becomes increasingly loud in the score as the rocket is knocked off its original course to the Moon and tugged with increasing volume and volatility of wavering sound toward Mars. The theremin is recorded in an orchestral context, as part of the film’s non-diegetic score, but its unfixed and wobbling wolf tone not only unsettles the sounds of the strings with which it mixes, it contaminates them with its radiant waves. It also suggests diegetic sound. The theremin sonifies the Martian landscape in the same way that the film stock switches from black-and-white to sepia tints during the Mars sequences.

Gliding sonorous events like those of the theremin, Louis and Bebe Barron’s electronic tonalities in Forbidden Planet (1956), or years later the long descending tones of Vangelis’s synthesizers and siren wails heard in Ridley Scott’s Blade Runner (1982), won’t stay in their place and open up strange new domains of diegetic experience. Much of my film sound analysis maps out a sub-diegetic dimension that plays out along an alien psychological substratum of cinematic phenomena that is also, at the same time, a techno-diegetic realm. Here, the technological apparatus of film sound carries on an almost independent transaction among machinic, electric, and otherwise material speculations. These come together in the form of sonic psychotechnologies through which the SF film imbricates and entangles psychic and cosmic indices. In its gliding mode, this sonic psytech emphasizes a mobility that makes thought travel but never arrives fully formed and is perpetually seeking its place.


Shimmering fringe

The shimmering fringe is first heard in Leith Stevens’s score for George Pal’s Destination Moon (1950). A series of sustained overtones and polytonal harmonics orchestrally suspend time to lend depth to a brilliant star field. These sounds recede from audition, implying depth through a physiognomic imperceptibility. Likewise, in the same film, the use of an early effects processor known as the Sonovox technologically attenuates orchestral sounds as we observe a lunar panorama, a matte painting by the so-called father of modern space art, Chesley Bonestell. We can never hear the moon, but we can hear our devices hearing the moon, as it were. Sounds that blur or play around the edges of other sounds make peripheral spaces key to our experience of the SF film and are the basis for any understanding of sonic psychotechnologies. Sonic psytech filters the sonorous event, objectifies it within discrete modular devices, but also gives the audible a withdrawn materiality that eludes comprehension and creates tonal apprehension (in both senses of the word).

In Blade Runner, the pitched shimmer of the ventilation units in the apartment of Deckard (Harrison Ford) or the steady buzz of the hovering police vehicles, the spinners, above crowded street scenes attains a fractal density that seeps away from the ear if we try to concentrate on it, like a star that is seen more brightly at the edges of perception but fades if we turn to view it directly. At the same time, such sounds reveal themselves as artificial sonic props for a manufactured reality and are meant to reinforce the programming of implanted memories. As an auditory fringe beyond the flat affective encounter with the SF landscape, the warbling destabilization of the Sonovox or synthesizer suggests that our encounters with the alien diegetic ambience are experiences with and at the very limits of our perceptual apparatuses and the technologies of sense. The fuzzy edges of synthetic tonalities, then, provide access points for an ambient attunement to an affective nonplace.


The xenomorphic

The xenomorphic mode is first encountered in the electronic tonalities of Forbidden Planet. The Barrons would program sonic patch boards, burn them out by overdriving them as they recorded the sounds on magnetic tape, and then reanimate them through a form of tape music that resembles nothing so much as an alien autopsy. This is not hyperbole. Consistently, the Barrons characterize their work as the torture of living sound circuits, a form of biomedia. In the film, these sound beings morph across diegetic layers to express the film’s narrative concern with alien psychotechnological events: an invisible but audible creature manifested from the Id of Dr. Morbius (Walter Pidgeon). The xenomorph invariably points to an extracinematic location, a zone of machinic materiality that is also transformed in the service of the speculative imagination in which an ethicoaesthetic dilemma transpires. In Forbidden Planet, this is initiated by the Barrons’ abdication of any responsibility they might have to communicate with and nurture the alien biocomputers engineered in their little kitchen laboratory in Greenwich Village.

In Ridley Scott’s Alien (1979) and John Carpenter’s The Thing (1982), the sonic xenomorph thrives on an expanded auditory terrain made possible by Dolby Surround Sound, manipulating the vast sonic field to amplify tensions around the unpredictability of emerging alien threats to the listening body. In these films, the unseen becomes emblematic of the sonic xenomorph and stages alien encounter as a form of sensory deficit paradoxically dependent on existential high fidelity. The Dolby System in fact always thrived on aggression toward the listener, originating in theaters with the debut of Stanley Kubrick’s A Clockwork Orange (1971). As Michael Geselowitz, senior director of the IEEE History Center, has pointed out, most innovations in sound technology happen “while our backs are turned” (2016). As embodied experience of the non-local, these films map the primary body sounds of pumping blood, breathing, the high-pitched whine of the nervous system, and even tinnitus. The xenomorphic sonorous hyperobject cannot be perceived as more than the traces of a thing, not the thing itself, that manifest as a byproduct of high-fidelity auditory hallucination and uncanny precognitive paranoia.

The title of this project is not meant ironically even as it works around and against notions of newness and futurism to embrace instead estrangement and alterity. As I write in my introduction, I hope that readers will accept that by the book’s conclusion they know less than they did when starting out. This is not to empty out the book of meaning nor to make ineffectual the strategies, techniques, and modalities that it encourages readers to adopt as ways of listening to the science fiction film as a sonic art form in its own right. Rather, this is because the work aims for an incommensurable “next thing,” an unavoidable other estrangement. This is the strangeness, for instance, of the widespread digitization of SF film sound in the 1990s, and the pursuit of broader frequency ranges and greater volumes of sound in 21st-century cinema. It also resonates toward different forms as sonic science fiction escapes film and ends up in the music videos of Björk, Grace Jones, and Janelle Monáe, for example, or in the live cinema projects of Evelina Domnitch and Dmitry Gelfand, Android Jones, and NoiseFold. Whenever it may come from, the future of sonic science fiction is elsewhere, making the new strange again.


-------


Trace Reddell is author of The Sound of Things to Come: An Audible History of the Science Fiction Film and associate professor of emergent digital practices at the University of Denver.

"A lively, endlessly inventive exploration of the sonic worlds of science fiction cinema (beginning even before the advent of synchronized sound). The breadth and subtlety of Trace Reddell’s interdisciplinary scholarship is impressive, and his book is an ongoing homage to the valuable conceptual and cognitive challenges upon which effective science fiction depends."
—Scott Bukatman, Stanford University

"Building on the highly original concept of the sonic novum, Trace Reddell has written the first comprehensive theoretical approach to musical science fiction. The Sound of Things to Come is an alternative history of science fiction cinema, a handbook of sophisticated close analyses of many important films, and a re-envisioning of the role of sound technology in modernist aesthetics."
—Istvan Csicsery-Ronay, author of The Seven Beauties of Science Fiction

Friday, November 2, 2018

On Jeff VanderMeer and material monsters: Did we ever know anything about the world at all?




BY BENJAMIN J. ROBERTSON
University of Colorado Boulder


In None of This Is Normal: The Fiction of Jeff VanderMeer, I focus on the fantastic materialities VanderMeer creates in his major fiction: the Veniss milieu, in which a good portion of his early fiction takes place; the city of Ambergris, which takes shape in City of Saints and Madmen (2001–04), Shriek: An Afterword (2006), and Finch (2009); Area X, the motivating force behind the Southern Reach trilogy (2014); and the Earth of Borne (2017) and The Strange Bird (2017). These materialities are impossible according to the norms we take for granted in our own world. In other words, they are fantastic, created, fictional. VanderMeer’s materialities can, of course, help us understand our own. They are products of a writer working in a specific place (the United States) at a specific moment (the early twenty-first century). Close attention to the historical situation in which these materialities emerge no doubt reveals something about that historical situation and the manner in which it determines what we think and how we act. For example, Area X can be productively read in the context of climate change and the Anthropocene. In such a reading, this alien place suggests a return of the repressed, the revenge of nature upon a humanity that has ignored and exploited it for far too long.

However, I find that these materialities can do more than represent our world. They can intervene in it when we understand that they have a force of their own, a force particular to fiction. In my reading, Area X cannot stand in for climate change or the Anthropocene because these human terms suggest an attempt to draw a boundary around an object that cannot be delimited by human knowledge practices. Such practices seek to create an other opposed to the self, each bound to its opposite by way of a universalizing liberalism that guarantees that the unknown can be known, that the different can become the same. However, Area X escapes every attempt to draw it into human knowledge practices because it exists at scales that cannot be indexed to such practices. It is, in my terms, abdifferent—not a thing whose difference could give way to sameness, but a thing that flees from all difference and the knowledge practices that produce it. Area X suggests a reading practice appropriate for VanderMeer’s fantastic materialities. This reading practice does not require that every fiction reference our own world. It allows fictions to be fictions, the fantastic to be fantastic. To engage with such a practice, we do not need to stop caring about our own world. Rather we must understand how fictions participate in our world, that they can do more than simply reflect it back to us. As I write, this reading practice “involves imagining conditions that afford new ways of thinking and that do not assume a stable, grounding reality. To fantasize, or fictionalize, materiality does not mean to abandon oneself to fantasy but to abandon the fantasy that we always already are able to know and are able to question such knowing.”



VanderMeer's short story “This World is Full of Monsters” exhibits many of the concerns that VanderMeer’s readers will recognize from his previous fiction: the necessity of transformation, the relationship of writer to world, the end of human civilization, the failure of human knowledge techniques, and so on. However, more so than any of VanderMeer’s fiction to date, “This World is Full of Monsters” offers a materiality in which stories are more than stories, more than representations: they are living things, they are forces of transformation, they are monsters. Horror reveals to us how our knowledge of the world and the stories we tell ourselves about our places in the world will always fail because the world is not a story, because materiality is not amenable to our knowledge or narratives. To this end, horror deploys monsters that demonstrate (these two terms are etymologically related to one another). The monsters of traditional horror reveal to us what we don’t know despite all of our science, what we cannot know precisely because our science has limited the scope of knowing itself. Thus when the werewolf returns from our animal past, or the vampire appears as a reminder of a dead aristocracy that continues to threaten the bourgeois order, or zombies manifest out of the remnants of a failing consumer society, no story about what they are or what they mean will save us. If our knowledge could not account for them before they (re)appeared, what chance does it have now?

Here we discover the limitations of such monstrosity. These monsters, despite their impossibility, each represent some aspect of the world as we know it. We know there are no werewolves, but we accept the presence of the werewolf in horror insofar as it might represent something about our own world to us, insofar as it suggests our relationship to a pre-modern past we might otherwise wish to forget. Is such a fiction the best vehicle for such a suggestion? Is such a fiction an adequate representation of this relationship? Is this relationship even real, or is it a function of the fiction itself? When we ask horror fictions, or any fictions, to refer to the world in a meaningful way, or when we ask monsters to show us how our world works, we quickly and invariably run into questions about whether we ever knew anything about the world at all, whether we ever knew it in and of itself or whether what we know of it only comes to us through our representations of it. This issue becomes all the more urgent in a moment when the greatest crisis facing humanity’s continued existence on this planet, the forces unleashed by the Anthropocene, escape our every effort to represent them to a human-scaled subject that takes itself as the measure of all things.

In contrast to the traditional monsters of horror, the story-creature at the center of “This World is Full of Monsters” does not represent anything. It is not “about” anything. Rather, it is an active force that drives the transformation of the narrator-writer and creates for him a position in a world where he no longer fits. “Monsters” begins when the story-creature appears on the doorstep of the narrator-writer: “The story that meant the end arrived late one night. A tiny story, covered in green fur or lichen, shaky on its legs. It fit in the palm of my hand. I stared at the story for a long time, trying to understand. The story had large eyes that could see in the dark, and sharp teeth. It purred, and the purr grew louder and louder: a beautiful flower bud opening and opening until I was filled up. I heard the thrush and pull of the darkness, grown so mighty inside my head.” If we understand that the story-creature does not represent anything, we can immediately grasp the strangeness of the first sentence. The rest of this passage makes clear that “story,” in this context, does not refer to a fictional representation of the real or even to the creation of a narrative. However, the first sentence is even more revealing when we understand that “meant” does not involve any latent content, any hidden message that must be interpreted to be revealed. “Meant” does not refer to the possibility of knowing something outside of what has been written here. Instead, it refers to what the story will cause, what the story will do.




The story invades the body and mind of the narrator-writer, eventually causing him to sleep for one hundred years. When he wakes up, he does so to a transformed world in which he no longer has a place. Without a place, without a meaning, he seeks to end his existence. “This World is Full of Monsters” becomes a meditation on the problem of memory, but not in any conventional sense. The problem of memory here has little to do with the adequacy of memory to actual events. Rather, it has to do with how memory prevents us from adjusting to new situations, how memory creates meanings at odds with material facts. Late in “Monsters,” the narrator-writer confronts a strange being in this transformed world: “He communicated to me that the world had been remade against my image and that my form, even much reduced, was the rebellion of the old world against the new, and that this made no sense because the new world embraced the old; that my very presence made the old world manifest, no matter the form, so why was the form important? Why did I hold onto the form?” In one sense, the narrator-writer clings to his embodied form and thus refuses a physical transformation that would better afford his continued existence in the new world, a world no longer amenable to human being or meaning. In another sense, however, the narrator-writer clings to the form known as story, the form through which human beings make meaning out of materiality by representing it this or that way—sometimes in ways that obscure the very materiality they seek to understand. If it appears that VanderMeer himself still clings to this form, to the story, such is only the case because we insist on reading “This World is Full of Monsters,” or any of his fictions, as attempts to adequately capture some aspect of our own materiality. Such is only the case because we fail to understand how these stories might instead have some material effect on the world itself.

-------

Benjamin J. Robertson is assistant professor of English at the University of Colorado Boulder. Robertson is author of None of This Is Normal: The Fiction of Jeff VanderMeer and coeditor of The Johns Hopkins Guide to Digital Media.

"None of This Is Normal is the first book-length study of the weird fiction of Jeff VanderMeer. Benjamin J. Robertson not only highlights the beauty and power of VanderMeer's fiction, but also shows how this writing is central to any attempt to think through the plight of humanity in what has come to be called the Anthropocene."
—Steven Shaviro, author of The Universe of Things: On Speculative Realism

"This spirited book disturbs the new normal of the Anthropocene by way of the ‘New Weird’ in Jeff VanderMeer's fiction. At once a meditation on fantastic materiality and a step toward life after aftermath, this first dedicated study of VanderMeer tells a new story about humans and nonhumans both."
—Wai Chee Dimock, Yale University