Thursday, July 23, 2015

Examining America's rhetoric of postracial progress.

Recent events in America, including the #BlackLivesMatter movement, are forcing white Americans to look at race in a way that's uncomfortable—but also much more realistic.
Image taken in November 2014 of a demonstration in New York City. Credit: Flickr.

Assistant professor of English at University of Nevada, Las Vegas

According to a recent poll, nearly 60 percent of Americans believe race relations are “generally bad.” It’s not hard to see why. From Ferguson to Baltimore to Charleston, racial unrest and violence seem to be getting worse, not better. Pollsters noted that the last time black Americans felt this negatively about race relations was in 1992, when four white police officers were acquitted in the beating of Rodney King, setting off riots in Los Angeles and across the country. The massacre of nine church members in Charleston in June reminded many of the 1963 bombing of a Birmingham church that killed four black girls. And the 2014 shooting death of twelve-year-old Tamir Rice bore troubling similarities to the lynching of Emmett Till in 1955—the boys even bore a striking physical resemblance.

But perhaps we should go back even further, to the early twentieth century, to what historian Rayford Logan called “the nadir of race relations in the United States.” By then, African Americans in the South had been systematically disenfranchised through poll taxes and literacy tests, terrorized by vigilante groups like the Ku Klux Klan, and straitened by legalized segregation. The influx of Jewish and Catholic immigrants, along with the Great Migration of African Americans from the rural South to the urban North and Midwest, stirred great fears. The nation was becoming less white, less Protestant, more urban. In his 1903 masterwork, The Souls of Black Folk, W. E. B. Du Bois declared, “The problem of the Twentieth Century is the problem of the color line.”

Against this inauspicious backdrop, the metaphor of the melting pot entered general usage as a way to describe this new America. Here was an optimistic, feel-good way of looking at the nation, one that supposed that disparate groups could successfully be integrated, or melded, into white American society. The reality, of course, was that many people resisted this ideal. By the 1920s, nativists had passed anti-immigration laws limiting the number of “undesirable” immigrants. Between May and October of 1919, a series of race riots swept the country, leading James Weldon Johnson, field secretary for the NAACP, to call it “Red Summer.” The Ku Klux Klan entered a period of renascence, advocating a policy of “one-hundred-percent Americanism,” where “American” meant one-hundred-percent white (of old-immigrant stock) and Protestant.

I’ve been thinking of this as we enter another period of racial turmoil, nearly one hundred years later. The melting pot has been replaced by the fantasy of a postracial America, another optimistic, feel-good way of looking at the nation, one that supposes disparate groups can successfully transcend race altogether and become “colorblind.” Yet the Tea Party is the new voice of nativism, Donald Trump calls Mexican immigrants criminals and rapists, and black lives continue to be lost. Americans have lost the idealism they felt seven years ago, in the aftermath of Obama’s election. Back then, 67 percent believed black-white relations would “eventually be worked out” and 70 percent thought race relations would improve.

In some ways, though, I take heart in the loss of feel-good optimism. Only recently has white pessimism about race relations caught up with black pessimism, and that’s a good thing. Earlier this year, 58 percent of black Americans believed race relations were bad, while only 35 percent of white Americans thought the same. By May, however, 65 percent of black Americans and 62 percent of white Americans thought race relations were bad—an astonishing increase, especially among white respondents. To be sure, it took something like the Baltimore uprising following the funeral of Freddie Gray to shake white Americans out of a kind of willful complacency, with the mainstream media guilty of sensationalizing the protests and fanning white anxiety. But the protests starkly demonstrated that something was wrong—that the rhetoric of postracial progress was papering over deep fissures in American society.

The postracial fantasy, like the melting pot, is a white fantasy. It imagines a smoothing away of difference, a whitewashing of history. As Anna Holmes recently wrote in the New York Times, “Sometimes it seems as if the desire for a ‘postracial’ America is an attempt by white people to liberate themselves from the burden of having to deal with that legacy.” I couldn’t help but think of Ben Affleck’s misguided desire to conceal his slaveholder ancestry from the public—a “whitewashing” of his family history and the history of this nation. In the aftermath of this scandal, Affleck posted an “apology,” writing: “We deserve neither credit nor blame for our ancestors and the degree of interest in this story suggests that we are, as a nation, still grappling with the terrible legacy of slavery.” No kidding, I thought. But it’s easy for people like Affleck to remain naively idealistic when they don’t have to imagine—or live—life from the point of view of a black American. Black Americans, on the other hand, have always had to imagine life from the point of view of white Americans. They have always possessed what W. E. B. Du Bois called the “second sight” of double-consciousness—an ability to see complexity and perspective in a way that many white Americans don’t have to. White Americans can be “colorblind”—or just plain blind. Black Americans must have colorvision.

The Black Lives Matter movement, the shootings in Charleston, even the widespread dismay that Atticus Finch is a racist—all of these force white Americans to see race in a way that’s uncomfortable and confusing, but also much more realistic. Jim Crow hasn’t been abolished—it’s gone undercover. Its legacy remains in all aspects of American society, covert but no less insidious, painted over but still there. Getting rid of the Confederate flag in South Carolina or forcing police officers to wear body cameras are ultimately superficial solutions to structural problems. The color line still exists, most starkly in black and white attitudes toward policing, with black Americans more than twice as likely to express anxiety regarding the police in their community. Acknowledging that this is the case is a small but necessary first step to improving race relations and ensuring that “the Problem of the Twentieth Century” does not become the problem of the twenty-first.


Julia Lee, assistant professor of English at University of Nevada, Las Vegas, is author of Our Gang: A Racial History of The Little Rascals (forthcoming this fall from University of Minnesota Press) and The American Slave Narrative and the Victorian Novel. She was named a 2014 Emerging Scholar by the magazine Diverse: Issues in Higher Education. She tweets @profjulialee.

Thursday, July 16, 2015

On 'big data' and the ways we evaluate women's lives on a global scale

No Ceilings uses data sets to tell stories about gender inequality worldwide.
What are the stories behind the data?
Image: Screenshot,

Assistant professor of political science and ethnic studies at the University of Nebraska–Lincoln

On March 9, 2015, one day after International Women’s Day, former Secretary of State Hillary Clinton, Gates Foundation co-chair Melinda Gates, and Clinton Foundation vice-chair Chelsea Clinton celebrated the official release of No Ceilings: The Full Participation Report in New York City.

The report and companion website seek to measure the progress women have made since Clinton’s 1995 declaration that “women’s rights are human rights” at the Fourth World Conference on Women in Beijing. The report finds that “there has never been a better time to be born female,” but “major gaps remain.”

The Clinton Foundation and Bill and Melinda Gates Foundation are not the first to use big data to compare and analyze levels of women’s empowerment around the world. In the 1990s, the United Nations Development Program created the Gender-related Development Index (GDI), the Gender Empowerment Measure (GEM), and more recently, the Gender Inequality Index (GII). The Organisation for Economic Co-operation and Development, an international organization composed of wealthy democracies, has its own Social Institutions and Gender Index (SIGI).

If these measures should appeal to anyone, they should appeal to me, a social scientist who studies women in politics across countries and continues to use large data sets. But I also have questions and reservations about global data sets that compare women’s lives, some of which grew out of the interviews and daily interactions I had while working on my book in the Republic of Niger.

1. Is one approach to marriage necessarily better than another?

The World Bank has, at least since the late 1980s, been interested in improving women’s lives by encouraging governments to change laws concerning the family. When I first visited Niger, I wanted to understand why the country’s leaders had not reformed its collection of family laws per the World Bank’s wishes. Later, I decided to investigate the history of another non-reform, Niger’s non-ratification of a regional African Union treaty on women’s rights. One issue in both controversies was family property; could, for instance, a daughter inherit the same amount as a son?

According to many projects that compare women’s statuses globally, the prevalence of unequal inheritance practices or laws lowers a country’s international ranking. SIGI, mentioned earlier, examines a large basket of items and takes into account the inheritance rights of daughters vis-à-vis sons.

Something I was unaware of at the beginning of my fieldwork was the difference between community-property and separate-property marital regimes. In California, where I spent most of my upbringing, community property is the default law. In what I believe to be a large percentage of households in Niger, the default regime is not community but separate property. That is, a wife maintains her finances separate from her husband’s, and vice versa. People with whom I spoke said they did not know how much money their spouses made per month or year or how much wealth their spouses possessed.

Under the ideal form of separate-property ownership, a wife uses her finances for her own purposes, whereas the expectation is that a husband uses his finances to provide food, shelter, and other amenities to the entire family. If men are expected to provide for the entire household and women are expected to keep their own property, then it is fair, for many of the men and women I talked with, for sons to inherit twice as much as daughters.

Things become complicated when, in reality, wives use their resources to also provide for the family. School fees, children’s clothes, medicines, small gifts, and money for children to spend on food while at school may become the purview of women or women and men.

It is also possible that separate-property regimes better position women (and men) to leave a marriage. I wonder whether the divorce rate among certain communities in Niger, which is comparable to that of the U.S., is linked to this.

What I came to realize is that there are alternatives to community-property marriage that are widely practiced, and it is important to consider the advantages and disadvantages of marriages where finances are kept separate.

2. Are global rankings of women’s statuses a modern reincarnation of the distinction between “civilized” and “uncivilized”?

By no means do I believe that those compiling global comparisons of women’s statuses want to portray the Global South as “uncivilized.” What concerns me, however, is the idea that may emerge from these rankings: that “other” countries and peoples need help but not “my” country or people. This problem is akin to the use of #firstworldproblems, which may have started with good intentions but ends up denying that there are problems of access to quality education, malnutrition, poverty, and violence in wealthy countries, and that people outside the first world can have so-called petty problems.

The discourse of “these women” being better off than “those women,” I find, can unintentionally perpetuate misunderstanding and be used as part of a backlash against women’s movements. As I learned from scholars such as political scientist Abdourahmane Idrissa, activists opposing family law reform and the African Union treaty used nationalist rhetoric to combat proposed changes more than I had anticipated. Nigérienne women’s activists were painted as “foreign,” even though most Nigériens could be just as well traveled.

3. How do “non-experts” define inequality differently than “experts”?

One of the most unsettling but illuminating moments in my fieldwork was when a female religious leader asked me why so many foreigners were coming to her, asking her about the lack of family law reform in Niger. “Isn’t the real problem poverty?” she asked, turning the tables on me.

Anthropologist Sally Engle Merry points out that indicators, while they can be used for good or for ill, “tend to consolidate power in the hands of those with expert knowledge” and that “an increasing reliance on indicators tends to locate decision making in the global North, where indicators are typically designed and labeled.”

I wonder what the No Ceilings report or the SIGI index would look like if they were directed by women and men in the Global South, or by women and men of different socioeconomic classes in the Global North and in the Global South.

I laud former Secretary of State and now presidential candidate Hillary Clinton for her deep commitment to improving women’s lives around the globe, including in the United States. The No Ceilings report was well conceived and carefully written, and, in my opinion, better attuned to the issues I raised above than other reports. I just hope that if new “experts” and ordinary people get on board with Clinton’s fight for women’s equality, they will consider who is deciding who is “better off.”


Alice J. Kang is assistant professor of political science and ethnic studies at the University of Nebraska–Lincoln. She is author of Bargaining for Women's Rights: Activism in an Aspiring Muslim Democracy.

"Alice J. Kang compellingly argues that governments are more likely to adopt women's rights reforms when local activists mobilize for them, that opposing activists must also be considered, and that political context is essential for understanding outcomes around women's rights."
—Gretchen Bauer, University of Delaware

"Bargaining for Women’s Rights is a refreshing approach to thinking about women's rights in majority Muslim countries that captures how civil society groups mobilize and how multiple components of 'the state' actually debate women's rights legislation."
—Barbara Cooper, Rutgers University

Wednesday, July 8, 2015

Grace Lee Boggs on biracialism, social movements, and hope for America

Grace Lee Boggs, pictured here in 2012, was born on June 27, 1915, in Providence,
Rhode Island. She currently lives in Detroit.

On June 27, 2015, Grace Lee Boggs turned 100 years old. Boggs is a Chinese-American writer, philosopher, and social activist, and the author of several books. Her autobiography, Living for Change, was published by University of Minnesota Press in 1998. Boggs has been actively involved in historic social movements, including the civil rights and Black Power movements. Her passionate belief in a better society remains undiminished, and the words she published almost 20 years ago resonate powerfully with questions of race and activism in the United States today.


Excerpts from Living for Change: An Autobiography by Grace Lee Boggs.

[On growing up in New York] We were the only Chinese in our neighborhood, and everyone we met or had anything to do with—our neighbors, classmates, and teachers—was Caucasian, a good many of them immigrants from Europe or their children. During this period it used to infuriate me when not only my peers but teachers and other adults would ask me, "What is your nationality?" I would reply patiently, as if giving them a civics lesson, that my nationality was American because I was born in the United States but that my parents were Chinese. But no matter how often or how carefully I explained, I would be asked the question again and again, as if to say that I could not be Chinese and American at the same time. Often the questioner, having heard my explanation, would go on to say, "But you speak English so well." It was said sweetly, as if I were being paid a compliment. But the message behind the sweetness was that being Chinese and speaking English well were just as incompatible as being Chinese and American.


When I was in the class of 1935 at Barnard the only people of color on campus were Louise Chin and I, and a Japanese woman, Grace Ijima, of the class of 1934. In the spring of 1995 Louise and I attended our alumnae reunion. It was my sixtieth—and also my first. One of the reasons I decided to attend was that a special alumnae of color reception was on the program. At the reception I learned that today more than 25 percent of Barnard students are Asian and that there are similar percentages at many other colleges and universities. Asian students are now the largest ethnic minority on the University of Michigan campus, and the West Lounge in the South Quad Residence Hall has been renamed in honor of Yuri Kochiyama, the Japanese-American human rights activist who cradled Malcolm's head in her lap as he lay dying on the stage at the Audubon Ballroom in February 1965. Equally interesting, an estimated 50 percent of Asian young people now marry non-Asians, mostly Caucasian but sometimes African American or Hispanic. What that means for the future of this country I cannot begin to imagine. But one thing is for sure: whoever still believes that East is East and West is West and never the twain shall meet is not ready for the twenty-first century.


I had participated in enough movements to know that no one can tell in advance what form a movement will take. Movements are not initiated by revolutionaries. They begin when large numbers of people, having reached the point where they can't take the way things are anymore, see some hope of improving their daily lives and begin to move on their own. I have also learned that if you want to know what a movement is going to be about, you should keep your ears close to the grassroots to hear the "why" questions that people are asking. For example, during and after World War II when black folks had acquired a new self-confidence from working in the plant and fighting overseas, they began asking, "Why do white folks treat us this way?" with a new urgency, and so the civil rights movement was born. In the 1960s, when white flight to the suburbs made blacks the majority or near-majority in cities like Detroit, people began asking, "Why are all the political leaders in our city still white?" giving rise to the Black Power movement.


At the 1992 futuring conference [at the University of Michigan] I created a vision of Detroit Youth in the year 2032. A record-breaking snow storm had occurred on the eve of the celebration of Martin Luther King's 103rd birthday, I wrote, but people had no trouble getting to the celebration because young people, organized in Youth Block Clubs, had assumed the right and responsibility to keep the streets clean and safe for the community, especially elders. The vision goes on to describe how community work had been incorporated into the school curriculum, so that elementary schoolchildren working with elders were growing most of the food for the city while middle and high school students were doing most of the work of preparing and serving food in the community, and so on. Having that vision in my head and heart since the futuring conference has helped me time and again to project youth activities that transform young people at the same time that they improve the community.


Grace Lee Boggs is a first-generation Chinese American who has been a speaker, writer, and movement activist in the African American community for more than 70 years. Her autobiography, Living for Change, was published by University of Minnesota Press in 1998.

Friday, June 26, 2015

In 1971, a wedding heard 'round the world. #LoveWins

It is so ordered.

Today's momentous United States Supreme Court decision striking down all bans on same-sex marriage means a lot of things to a lot of people. For Michael McConnell and Jack Baker of Minneapolis, it is another historic landmark in a life full of historic landmarks. In 1971, McConnell and Baker became the first same-sex couple known to apply for a marriage license. Their first attempt, at Minneapolis's Hennepin County Courthouse, did not go through; their second, however, did.

Find a timeline of key events in their lives here; look for their memoir, The Wedding Heard 'Round the World: America's First Gay Marriage, in January 2016.

Friday, June 19, 2015

Catherine Madison: From the front lines of a Korean War prison camp, 65 years ago.

Sixty-five years ago, on June 25, 1950, North Korea invaded South Korea, initiating the Korean War. The U.S. and sixteen other nations joined forces to repel the invaders.

About three weeks later, in July 1950, a young captain in the U.S. Army Medical Corps was captured on the front lines and held in brutal prison camps for more than three years. "Doc" Boysen would survive unbelievable hardships, return home, and live for almost fifty more years.

This fall, the University of Minnesota Press is publishing his story as told by his daughter, the writer Catherine Madison. Here is an excerpt from her forthcoming book, The War Came Home with Him: A Daughter's Memoir.


SEOUL, KOREA—July 1950

More than two hundred men were quartered in a two-story schoolhouse on the northern outskirts of Seoul. North Korean officers visited them to deliver lectures on the evils of capitalism and assure them that they would be treated well. The Koreans also announced that because Gen. Douglas MacArthur had insisted that captured Americans receive their customary three meals a day, the prisoners would be fed three times, which simply meant that their current rations of unseasoned rice balls, watery cabbage soup, and an occasional piece of melon were divided into three portions instead of two.

The men spent several days housed in the crowded school. Occasionally guards would take a prisoner or two away, ostensibly to make political broadcasts; those men were not seen again. Among the troops themselves, no one seemed to be in charge. One soldier informed Doc that, as a captain, he outranked others and was supposed to be the acting CO (commanding officer), but Doc protested, insisting that a medical officer does not command infantry troops.

Physically, he was suffering. His feet were bruised and swollen, and it was all he could do to walk to the latrine. Mentally, the games had set in, his suspicions repeating in an unforgiving loop. Why didn't the army keep its promise to send me home after ninety days? Am I being punished for refusing to give sleeping pills to that surly officer? Did I do something else wrong? Or fail to follow orders? Why didn't I receive any letters from my wife while I was in Japan? Was the army holding them back? Did she even write? Am I paying for my past sins? Back home I hit a chicken with the car. And I passed that extra copy of the med school test to my frat brothers. But didn't I already get punished for those things?

Slowly, as he began to feel better physically, the mental torture eased. His thoughts turned to survival, and he focused on the present moment and whatever he might do to make sure those moments kept coming, for him and for those around him. He asked to assist with sick call, but the Koreans refused. As near as Doc could tell, they had little to work with: shoddy equipment and meager pharmacy supplies. Once they invited him to join them, but when he showed up at the "clinic," he was asked to pose for a propaganda picture. He refused.

At one point, all the prisoners were escorted into the school auditorium and told to sit. Stiff and stilted, select American officers and GIs read prepared statements asking the men to sign a petition demanding an end to the war. After the prisoners signed, the readers explained, the paper would be sent to the United Nations. The Koreans circulated the petition, a blank piece of paper, and insisted the men sign, which they did, of course, thinking it might help them survive. (Several of the men wrote the same names, like Mickey Mouse and Donald Duck, but no one seemed to notice.)

One afternoon, the guards summoned the men to the courtyard for roll call. "Come with, come with," the Koreans shouted. The men followed orders, not realizing that they would not be allowed to return to the schoolhouse, where they had stowed what few possessions they had left—tattered Bibles, rosary beads, pictures, whatever extra clothing they had managed to hold on to. As they were marched off to a train yard, they vowed they wouldn't make the same mistake again. From now on, they'd keep any and all possessions with them at all times.

Doc had already lost plenty: his thick glasses, his St. Christopher medal, his shoes. But he also gained much of substance: a new acquaintance named Peppe, who would become a trusted confidant and lifelong friend, and other friends, like Shorty Estabrook, a nineteen-year-old spitfire who made everyone laugh, and Eli Culbertson, to whom he'd been tied with telephone wire that bloodied their wrists. He also gained a new, or perhaps renewed, belief in the existence of a supreme being, whatever its name.

It's something that makes you believe that your strength is part of a plan devised by someone more powerful than you. It's there like a huge wave just before it crests, powerful and never ending in its beauty as it just keeps rolling along, silent in all its majesty but ever present.

It is the faith and hope that sustains you; something you accept and admit you do not understand. Prayer becomes a constant, not a once-a-night event—and not always in words, perhaps, but surely in thoughts.

How else can you explain the fact that you survive?


Journalist Catherine Madison was editor-in-chief of Utne Reader, senior editor at Adweek and Creativity Magazine, founding editor of American Advertising, and editor-in-chief of Format Magazine. She has written articles for many publications, including the Chicago Tribune, Star Tribune, and Minnesota Monthly.

The University of Minnesota Press is giving away 10 advance reading copies of The War Came Home with Him. To enter, send an e-mail with your preferred mailing address to, subject line: Catherine Madison giveaway. Deadline to enter is July 10th; winners will be notified within one week. All submitted mailing addresses will be used for the purpose of the contest only.

"I loved this book, not only for the knowledge gained concerning a war I knew so little about, but for Catherine Madison’s skill in relating both sides of this complex and difficult story. She is truly a reliable narrator, and her interweaving of her father’s ordeal as a prisoner of war with her own growing up in a household with a broken and damaged man is honest and generous and truly moving." —Judith Guest, author of Ordinary People

Wednesday, June 10, 2015

What is "Malian music"?

Assistant professor of ethnomusicology at The Ohio State University

For many, to think of a place called “Mali” is to hear, first and foremost, its music. Mali may be a poor, landlocked, and sunbaked country in the West African Sahel, but its widely acclaimed music culture—with its bluesy resonances, danceable rhythms, and haunting melodies—has a way of mitigating, even beautifying such realities.

For this reason, when things fell apart in March 2012—when a subaltern mutiny became a full-blown coup d’état, and a secessionist movement in the North added an Islamist insurgency to its ranks—many in the media spoke of “the death of music in Mali.” The fate of Mali and its music, it seemed, went hand in hand.

These reports tended to assume an uncomplicated relationship between a country, its people, and its music, threatened in the present by bad politics, domestic disputes, and foreign incursions.

Such problems are, of course, real (and ongoing), but what makes the music we hear (and hear about) “Malian” is, in fact, a significantly complicated affair.

So, what is Malian music?

What follows is a set of provisional answers from my experiences as an observer of Mali and student of its music over the past two decades. These answers are neither exhaustive nor mutually exclusive, but they do give a sense of the crucial complexity that Malian artists playfully, critically, and artfully negotiate when they make (and we hear) their music—what I call in my new book, Bamako Sounds, “the Afropolitan ethics of Malian music.”

Malian music is…

Mande music.

I first encountered the music of Mali through the modern echoes of its imperial past. In this sense, the word “Mali” refers to the eponymous Empire, which reigned over vast swathes of western Africa from the 13th to 16th centuries. Living and studying with a family of kora (21-stringed harp) players in Bamako, the Malian capital, I heard the praise songs, instrumental melodies, and characteristic rhythms of a medieval court music repurposed for the life and times of a postcolonial city.

World music.

Before I ever traveled to Mali, its music came to me in small-town Minnesota, on a compact disc that a friend had purchased after a semester abroad in Madagascar. Malian music moves, through the commercial circuits of the global culture industry and within the communities of a Malian diaspora with roots on every continent. Some of its itinerant purveyors are well-known worldwide: Ali Farka Touré, Amadou & Mariam, Salif Keita, Oumou Sangaré, Toumani Diabaté, and Rokia Traoré. Still others are on the rise: Fatoumata Diawara, Sidiki Diabaté, Amkoullel, and Vieux Farka Touré. Just to name a few.

National music.

When I began my doctoral research on the postcolonial music culture of Mali, I found an archive rich with the sounds of nation building and statecraft. In this sense, “Mali” refers to the contemporary West African nation-state, which will celebrate 55 years of independence from colonial rule in September 2015. In the early 1960s, the newly minted Republic of Mali created a national ensemble, made up of traditional instrumentalists and vocalists from throughout the country, and an orchestra, a dance band with a drum kit, congas, electric guitars, and horns. Their job was simple, if abstract: to perform the nation, through the country’s varied traditions and nascent modernity.

Pirated music.

I arrived in Mali ten years ago to begin long-term fieldwork on Bamako’s urban music culture. I quickly encountered two things: a thriving informal marketplace, full of copied and counterfeit goods; and a diverse cohort of artists, who regularly bought and sold in this market but were adamant in protesting what they called “the scourge of music piracy.” One thing was clear: Malian music maintained an active and ambivalent relationship to intellectual property.

Urban youth music.

From the bals poussières (dust parties) of the 1950s and 60s to the balanin dance parties of the present, the music of a demographically young Malian populace has frequently taken to the streets. There, you will find posses huddled around stereos, discussing the nuanced history of global hip-hop over afternoon tea. And there you will find vendors, crouched in front of laptops, filling old cellphones with the latest hits from Bamako, New York, and Paris.

Islamic music.

I wrote a dissertation about the politics and economy of an apparently secular urban music culture. While most of my musician friends and interlocutors were Muslim, Islam did not substantively factor into my analysis of their work. Then, four years ago, when I was asked to contribute a paper to a conference on Qur’anic knowledge in sub-Saharan Africa, I listened again to my field recordings with ears tuned for religion. In this Malian music, I heard the vocal melismas of prayer calls, the precise diction of sacred recitation, benedictions, praises to the Prophet, and citations from the Qur’an, woven into the fabric of an apparently secular urban music culture.

Not Malian music.

In April 2012, when the Malian state had all but collapsed and a motley crew of Tuareg separatists declared an independent homeland (Azawad) in the North, the idea of “Malian music” became the object of an increasingly urgent ethnic identity politics. Some globetrotting groups, like the Saharan blues troupe Tinariwen, used their international profile to contest the Malian state and what they viewed as a long history of military aggression against a sovereign people in the North. Later, others came together to affirm Malian solidarity across ethnic boundaries, though the lines dividing what was and was not “Malian music” had now been drawn, quite literally, in the sand.

An Afropolitan ethics.

What is Malian music? It is the sonic convergence of these (and many other) social positions—ethnic, religious, urban, economic, political, transnational, and historical—within a rooted and routed African world.

And it is the existential art of working with and through such multiple modes of being to claim a personal stake in what is (and is not) “Malian music.”

It is this artful process of social articulation and cultural experimentation in contemporary Africa that I call an “Afropolitan ethics.”


Ryan Thomas Skinner is author of Bamako Sounds: The Afropolitan Ethics of Malian Music. He is also the author and illustrator of a children's book, Sidikiba's Kora Lesson. He is assistant professor of ethnomusicology at The Ohio State University, and an accomplished kora player.

"Accessible and heartfelt, Bamako Sounds is itself largely musical in its interweaving of inventive musical criticism, scholarly analysis, and the author's work as a musician."
-AbdouMaliq Simone, Goldsmiths, University of London

Wednesday, June 3, 2015

What do ellipses do for us?

Beyoncé (feat. Jay-Z) "Drunk in Love" Unofficial Emoji Video from JESSE HILL on Vimeo.


Assistant professor of cinema studies at Purchase College, State University of New York

Last year, Austin-based video producer Jesse Hill made a video for Beyoncé and Jay Z’s “Drunk in Love”—to impress his girlfriend—that transcribes the entire song’s lyrics into a series of emojis. The video went viral and even left the shrewd pop diva herself quite impressed: she endorsed it on her Facebook page and designed two t-shirts for her songs with emojis on the sleeves, available for purchase on her website.

The video’s popularity—and its clever appropriation of the eyeball, snowman, eggplant, and clock emojis, among many others—immediately makes clear that emojis circulate across popular culture to signify more than just their intended meanings. A less flashy but more telling emoji in this video is the speech balloon, used to stand in for the repeated line “I’ve been thinking” (the line appears only twice, at the beginning of the song—by the end, the lovers are presumably too drunk to think anymore). Though there is technically a separate “thought balloon” emoji, in which the bubble is empty, the use of the speech balloon with its three dots to represent “I’ve been thinking” suggests that the symbol registers abstraction, illustrating a tendency to associate punctuation with thought itself.

In its 2011 iOS 5 update, Apple introduced this speech balloon, a text bubble with three dots in it, to devices’ emoji keyboard. Known as a “typing awareness indicator” in chat services, this symbol’s introduction as an emoji signals the extent to which the ellipsis has become a familiar image we see moving through our digital communication streams.

The typing awareness indicator is a default feature of most popular chat services: it shows when the person on the other end is typing, aiding in conversational turn-taking. The idea is that if I see this ellipsis, I know the other person is typing, so I wait to see what that person writes rather than typing simultaneously and haphazardly pulling our conversation in different directions at once.

Ellipses in the digital age

The ellipsis thus appears to solve a problem posed by the distinct features of communication in the digital age. One of the most fundamental affordances of computing technologies is that they allow physically separated people to have real-time textual chat. Unlike in face-to-face or telephone conversations, though, one doesn’t see or hear one’s interlocutor, and one doesn’t know whether the person on the other end is actively involved in the conversation. With networked media and the expectation that we are multitasking and, as danah boyd puts it, “participating in the always-on lifestyle,” we might be accessing multiple services, online with multiple screens open, and we might leave them open while we are on the phone with someone else, working in another room, out of the office, etc. [1] As cinema and media scholar Anne Friedberg observed in The Virtual Window, “Multiple-frame images are a readable new visual syntax, a key feature in the contemporary remaking of a visual vernacular.” [2] Friedberg’s work is important because it turned our attention to how digital screen displays were forming a significant break with the centuries-long regime of perspective in visual culture, whereby the composition and framing of images oriented viewing practices around a single, centralized point.

The multiple points of the ellipsis thus cue us in this postperspectival vernacular that our conversation partner is actually electronically present with us, engaged on the other end. In this way, while the cultural logic of these digital dots represents a break with one centuries-long tradition in art history, they are continuous with another centuries-long history of punctuation in writing. Punctuation marks were symbols invented to resolve ambiguity and to help facilitate the efficiency of communication, just like digital ellipses do. But digital chats’ reappropriated dots resolve an ambiguity specific to networked media: turn-taking when the conversation partner’s presence is otherwise impossible to assure.

The shift here is one from the ambiguity of how to read words within textual space to an existential ambiguity of whether or not anyone is even listening to us. Think of all the ways contemporary digital selves constantly need assurance that they are being listened to: by “liking,” retweeting, favoriting, or hashtagging.

Friday, May 29, 2015

Meditations of an Infomaniac, Part 2 of 2


In early 2012 I thought I had discovered the perfect title for my new project, which was to be a diary of my information habits. I wrote up several pages of notes under the title “Confessions of an Infomaniac.” Several weeks later I Googled the phrase, and to my chagrin found the title had already been taken by Elizabeth M. Ferrarini in 1984.

I instantly ordered a cheap copy of the book, and was pleasantly surprised when it arrived. I was so taken by the cover that I posted a Photoshopped version of it on Facebook with my own name in place of hers. It was well liked by my friends, but I felt some remorse at having erased the name of an author, even if only in the spirit of an informal repurposing.

Ferrarini published one more book in 1985, Infomania: The Guide to Essential Services. Infomania, according to Ferrarini, is a “state of mind” characterized by “an inordinately intense desire for the most up-to-date information available to computer users in the age of the electronic database.” Already in the auspicious year 1984, it was possible to describe infomania as a condition related to networked personal computers. Ferrarini’s vision is sublime and all-encompassing: “the world is full of infomaniacs. Each day, in homes, huts, castles, caves, corporations, and penthouses throughout the world, tens of thousands of computers access electronic services. The reasons for these accesses are as varied as the personalities behind the computers. But one thing is clear about all of us who are accessing these electronic services—we have infomania.”

The ingenious innovation of Confessions of an Infomaniac is its combination of a harlequin plot and a technical manual. Ferrarini uses her “electronic mail” to find just the right “electronic male.” The terms “email” or “e-mail” do not appear in the book; instead Ferrarini uses the term “electronic letter.”

Ferrarini’s work is ahead of its time, and not far off in its hyperbole. Her books presage the collapse of boundaries between the home and the workplace, and between the computer and the bedroom—and they foretell a new era of e-romance. Despite her alarmist asides about suffering from too much information, Ferrarini is enthusiastic about the endless romantic potential afforded by online dating services. Ferrarini even registered a trademark for the word “infomania” in 1985, presumably intending to start some kind of consulting business, but the trademark was cancelled in 1992 after going unused.

In late 2012 I tried to find Ferrarini by Googling her; I found a brief obituary—she had died in October.


Metaphors for information overload tend to fall into two categories: those that suggest addiction or lack of self-control, such as infomania, datamania, infobesity, databesity, dataholism, infostress, dataddiction, infovorism, datadithering, data dread, infoxication; and those that suggest natural disaster: datanami, datageddon, dataclypse, data deluge, data smog, infoglut, information saturation, data swamp, drowning in data.

As someone who’s spent years poring over every book on information overload and information diets that I can find, I’m skeptical that effortful self-control will do much to address the sense of being overwhelmed by information. I’m also skeptical about too much information bringing about the apocalypse.

A larger problem is time poverty. A data diet is a luxury most knowledge workers can’t afford. As I detail in my book, Americans have been worried about the adverse effects of new communications technologies since the late nineteenth century.

At the other extreme is data poverty: 2.2 million incarcerated Americans are on forced data diets. A 2011 UN report declared Internet access a fundamental human right that should be accorded to all, including prisoners.

Everything we do as humans involves taking in data. I understand information overload broadly as a range of phenomena relating to the limits of cognition, perception, and memory (both personal and collective), typically associated with technological change. Many of the major aesthetic debates of the twentieth century—over, for instance, perception, style, technological reproducibility, cultural memory, and canonicity—take on new valences in the context of information overload.

As I write in my book:


While poetry may seem the most non-technological of literary genres, I show that poets were often obsessed by the changing nature of information and its dissemination in the twentieth century. The news that there is more news than we can process is not so new: while avant-garde poetry may not figure prominently in the global information glut, the global information glut figures prominently in avant-garde poetry. However marginal it may seem, poetry will long outlast our current media platforms. To quote William Carlos Williams:

                              Look at
               what passes for the new.
You will not find it there but in
               despised poems.
                              It is difficult
               to get the news from poems
                              yet men die miserably every day
                                             for lack
of what is found there.
                Hear me out . . .


Meditations of an Infomaniac, Part 1


Paul Stephens is author of The Poetics of Information Overload: From Gertrude Stein to Conceptual Writing. He has taught at Bard College, Emory University, and Columbia University. He edits the journal Convolution and lives in New York City.

Thursday, May 28, 2015

Meditations of an Infomaniac, Part 1 of 2


My iPhone slips from my hand and lands on the subway tracks. I glance down the tunnel and don’t see a train. I’m carrying a heavy bag containing a MacBook Pro, an iPad, and a dozen or so books. The digital signboard says the next Manhattan-bound train will arrive in one minute. I put my bag down on the platform and hop onto the tracks. Within ten seconds I’ve grabbed the phone and am back up on the platform. I’m wearing white pants (I never wear white pants), which are now covered in grime.

Everyone on the platform is staring at me.

A guy walks by and says, “I never would have done that.”

In two minutes or so, I board the next train. Fifty-five people were killed on NYC subway tracks last year. I’ve seen this statistic dozens of times on my commute.

Clearly, I need my data fix. I’m an infomaniac. Just about everyone is. After finishing an academic book on information overload, I was an expert on the subject. But I had avoided getting too personal. What does infomania mean to me, an infomaniac among infomaniacs?


Was Adam in Eden the first failed data dieter? His boss requested only one thing: that he not access the remote server. Given only one task not to carry out, Adam brought multitasking and all our woe into the world.

My own experience with data dieting was hardly less fraught. In honor of National Screen-Free Week, I went offline from May 4 to May 10.

Or I did my best to stay offline, which wasn’t easy.

National Screen-Free Week was started in 1994 by the organization TV-Free America, and was originally called "TV-Turnoff Week." Aimed primarily at children and promoted by the Campaign for a Commercial-Free Childhood and Adbusters, the week seems simple enough in concept.

I began to panic almost as soon as I powered down my smartphone, tablet, and laptop. I exaggerate, of course. In fact, I started to absorb myself in the luxury of physical media almost immediately, replacing my screen intake with books and records.

I had intended to keep a detailed diary of my analog information habits, but found the task not only insurmountably time-consuming, but also problematic when it came to my privacy. One’s data intake is a very intimate form of biography. In his Soliloquy, Kenneth Goldsmith (whom I write about in my book) did something like the inverse: he recorded and transcribed every word he spoke for a week, without preserving the words of anyone else. The resulting book runs to nearly 500 pages. In his book Bib, Tan Lin (whom I also write about) attempted to record metadata information about every single thing he read over the course of two years. Simply the titles and URLs run to 150 large-format pages.

I was able to reduce my screen time drastically over the course of the week, and I wrote profusely in my notebook. I started to feel a bit like Karl Ove Knausgaard, given how minutely I was recording details, and that made me uncomfortable. I didn’t have much luck with the typewriter I borrowed. I often asked my wife to Google simple information like directions. Waking up in the morning, I found myself reflexively reaching for my iPad. Given several imminent deadlines, I did have to send a number of emails over the course of the week (in advance of my diet, I decided I didn’t want to adversely affect anyone else by being offline).

One clear takeaway from the diet was that going offline is a luxury. Information is unequally distributed, as is privacy. But there’s no putting the data apple back on the tree. I’m tempted here to make a pun on the name of the world’s largest corporation, but I won’t. Instead, I’ll move on to another task and another of my open windows.


Look for Stephens's essay "Meditations of an Infomaniac, Part 2" tomorrow.


Paul Stephens is author of The Poetics of Information Overload: From Gertrude Stein to Conceptual Writing. He has taught at Bard College, Emory University, and Columbia University. He edits the journal Convolution and lives in New York City.