My mom died on July 18, 2013, of pancreatic cancer, a subtle blade that slips into the host so imperceptibly that by the time a presence is felt, it is almost always too late. Living about 16 months after her diagnosis, she was “lucky,” at least by the new standards of the parallel universe of cancer world. We were all lucky and unlucky in this way. Having time to watch a loved one die is a gift that takes more than it gives.
Psychologists call this drawn-out period “anticipatory grief.” Anticipating a loved one’s death is considered normal and healthy, but realistically, the only way to prepare for a death is to imagine it. I could not stop imagining it. I spent a year and a half writing my mother a goodbye letter in my head, where, in the private theater of my thoughts, she died a hundred times. In buses and movie theaters, on Connecticut Avenue and 5th Avenue, on crosswalks and sidewalks, on the DC metro and New York subway, I lost her, again and again. To suffer a loved one’s long death is not to experience a single traumatic blow, but to suffer a thousand little deaths, tiny pinpricks, each a shot of grief you hope will inoculate against the real thing.
A boundless black terror is how I imagined life without my mom. The history of grief, or what we know of it, is written by its greatest sufferers and riddled with horror stories, lugubrious poetry, and downward-spiraling memoirs plunged in sadness. For some people, the death of a loved one is truly life-stopping, and I worried it would stop mine.
On New Year’s Day in 1986, a group of 20 prisoners bearing shanks stormed the dining hall at the state penitentiary in Moundsville, West Virginia, where inmates had just been seated for dinner, and took hostage the correctional officers on duty. After ordering the stunned diners to “leave the fucking food alone,” the instigators ranged through the prison’s South Hall, freeing whole cellblocks and eventually capturing 16 staff members. They stripped the hostages to their underwear, dressed them up as inmates, blindfolded them, and placed them in separate rooms around the facility, to give themselves enough time to kill at least some of them in the event of a rescue attempt. The rioters were motivated by the inhumane living conditions and sanitary standards inside the prison. “You quit treating us like dogs,” one rioter screamed, “and this wouldn’t happen … We don’t want this any more than you do!” During the harrowing 52-hour standoff that ensued, things grew grisly: the rioters not only threatened violence against their hostages but also murdered three inmates thought to be either informants or the authors of especially repugnant crimes. One hostage, forced to watch the killing of a prisoner—Jeff Atkinson, a suspected informant who had been convicted of murdering a pregnant woman—reported that an inmate cut out Atkinson’s heart and said to a friend, “It’s amazing how this little thing will keep a fellow alive.”
Responding to the crisis, the governor of West Virginia cut short a vacation in Florida and hastened to the prison. He and his staff eventually negotiated an end to the standoff; 13 participants, including two leaders of the riot, were transferred to the maximum-security wing, and the rest were returned to their cells. Most of the hostages went back to work after their release, but the riot had taken its toll. “My nerves are shot,” one officer said. “I can’t even write without shaking. I quit smoking 15 years ago and started again five minutes after I was released.”
In its length and brutality, the West Virginia riot was no anomaly. In 1980, a two-day riot in New Mexico had killed 33 people, and in 1971, the infamous four-day riot in Attica, New York, had killed 43. But here’s what’s significant about the West Virginia riot: it was among the last of its kind in this country. Sustained prison uprisings simply do not happen here anymore. In 1973, we had 93 riots for every 1 million prisoners; in 2003, we had fewer than three. Prison violence as a whole, in fact, is down dramatically. In 1973, we had 63 homicides per 100,000 prisoners; in 2000, we had fewer than five. Inmate assaults on staff dropped similarly over roughly the same period.
These are eye-opening statistics—especially given that the incarceration rate in this country has quintupled since 1970, and a remarkable 3 percent of American adults are now under the supervision of the correctional system.
[Image: West Virginia Division of Corrections]
In 2003, thanks to Michael Lewis and his best seller Moneyball, the general manager of the Oakland A’s, Billy Beane, became a star. The previous year, Beane had turned his back on his scouts and had instead entrusted player-acquisition decisions to mathematical models developed by a young, Harvard-trained statistical wizard on his staff. What happened next has become baseball lore. The A’s, a small-market team with a paltry budget, ripped off the longest winning streak in American League history and rolled up 103 wins for the season. Only the mighty Yankees, who had spent three times as much on player salaries, won as many games. The team’s success, in turn, launched a revolution. In the years that followed, team after team began to use detailed predictive models to assess players’ potential and monetary value, and the early adopters, by and large, gained a measurable competitive edge over their more hidebound peers.
That’s the story as most of us know it. But it is incomplete. What would seem at first glance to be nothing but a memorable tale about baseball may turn out to be the opening chapter of a much larger story about jobs. Predictive statistical analysis, harnessed to big data, appears poised to alter the way millions of people are hired and assessed.
Yes, unavoidably, big data. As a piece of business jargon, and even more so as an invocation of coming disruption, the term has quickly grown tiresome. But there is no denying the vast increase in the range and depth of information that’s routinely captured about how we behave, and the new kinds of analysis that this enables. By one estimate, more than 98 percent of the world’s information is now stored digitally, and the volume of that data has quadrupled since 2007. Ordinary people at work and at home generate much of this data, by sending e-mails, browsing the Internet, using social media, working on crowd-sourced projects, and more—and in doing so they have unwittingly helped launch a grand new societal project. “We are in the midst of a great infrastructure project that in some ways rivals those of the past, from Roman aqueducts to the Enlightenment’s Encyclopédie,” write Viktor Mayer-Schönberger and Kenneth Cukier in their recent book, Big Data: A Revolution That Will Transform How We Live, Work, and Think. “The project is datafication. Like those other infrastructural advances, it will bring about fundamental changes to society.”
[Image: Peter Yang]
Two months into their relationship, Chris’s boyfriend José pushed him to the ground in a fit of anger and ripped the clothes off his body. “We had gone out dancing, and when we got home, I was changing in front of him,” said Chris, 34.
“I had on my favorite pair of underwear; it was the pair I had worn the first time we went out. He saw the underwear, and just flew into a rage, saying, ‘How dare you wear those! Those are for me!’”
José threw him on the floor of their bedroom closet, and smashed the only light bulb in the room, leaving them in darkness. He loomed above Chris on the floor as he tore the underwear away. That was the first time things had ever turned violent between the two.
“I was in such a state of shock,” Chris recounted seven years later, his fingers tapping at a wine glass stem and his brown eyes drifting. “I thought, ‘Oh, he’s just jealous; it’s the drinking,’ and I let it go. There was a lot of drinking in this relationship. No drugs, but lots of drinking.”
The second time was worse. “He was angry at something—I can’t remember what—and I was laughing,” said Chris. José again became incensed, strode into the kitchen and grabbed a butcher knife. “He pulled me by my hair, had me on my knees and had the butcher knife at my neck.”
Chris says he didn’t react. At the time, his sister was pregnant, and he wanted to live to see his niece. “I talked him down, told him to give me the knife. I put my hand on his, and we put the knife back in place together,” said Chris, demonstrating by holding his two hands together.
That night, José locked their bedroom door for fear that Chris would escape and tell someone. The next morning, he told Chris, “You know I didn’t mean it, right?”
“That was his way of apologizing to me,” Chris scoffed. The relationship lasted nine months, but continued to affect Chris for years after it ended.
At the edge of the ancient Gálgahraun lava field, about a 10-minute drive outside Iceland’s capital city of Reykjavík, a small group of local environmentalists has made camp among the gnarled volcanic rock, wild moss, and browning grass to protest a new road development that will slice the bucolic landscape into four sections and place a traffic circle in its core. The project, led by the Icelandic Road and Coastal Administration and the nearby municipality of Garðabær, will provide a more direct route to and from the tip of the Álftanes peninsula, where the rustic, red-tiled compound of the country’s president and an eponymous hamlet of 2,600 people stand.
The Hraunavinir, or “Friends of the Lava,” believe that any benefits from a project that snakes through Gálgahraun are canceled out by its cultural and environmental costs. According to protester Ragnhildur Jónsdóttir, the thoroughfares would destroy some of the “amazingly beautiful lava formations” and spoil a habitat where birds flock and small plants flourish. One of Iceland’s most famous painters, Jóhannes Sveinsson Kjarval, once worked on his canvases there, perhaps magnetized by the charm of the terrain’s craggy natural relics.
[Image: Bob Strong/Reuters]
When the balding Australian first stepped off the riverboat and into the isolated pocket of northeastern Peru’s Amazon jungle in 2010, he had what seemed like a noble, if quixotic, business plan.
An ambitious real estate developer, David Nilsson hoped to ink joint venture agreements with the regional government of Loreto province and the leaders of the indigenous Matses community to preserve vast thickets of the tribe’s remote rainforest. Under a global carbon-trading program, he wished to sell shares of the forest’s carbon credits to businesses that hope to mitigate, or offset, their air pollution.
Located a six-day riverboat ride from the frontier city of Iquitos, the jungle stores an immense amount of carbon dioxide in its vegetation, soils, and looming trees—roughly one ton, the equivalent of one UN-backed carbon credit, per tree.
In an ideal scenario, this is how it’s supposed to work: A community in a developing country works with an NGO or developer to design a plan to protect a large swathe of forest and thus prevent the release of the harmful chemical compound into the atmosphere, in accordance with the United Nations’ program called REDD (Reducing Emissions from Deforestation and forest Degradation). Then, it can get the emissions reductions certified by a third-party auditor and sell the resulting carbon credits to corporations in developed countries interested in reducing their own carbon footprints. (Deforestation accounts for roughly 17 percent of all global greenhouse gas emissions.)
Nilsson’s Hong Kong-based company, Sustainable Carbon Resources Limited, planned to help the indigenous community set up the Peruvian carbon credit project in exchange for a share of the profits once the credits were sold. If Nilsson’s plan worked, in theory the forest would be spared from loggers, his company would net some profit, and the indigenous community would receive millions of dollars in funding for education and medical care from investors and corporations interested in expanding sustainability and social responsibility efforts.
Nilsson recruited Dan Pantone, an Iquitos-based American ecologist with close contacts in the Matses community, as a guide to show him around the jungle, and, more importantly, introduce him to the right decision-makers.
“[Nilsson] told me, ‘You’re going to be a millionaire in a year,’” Pantone said of their earliest phone conversation. “He said he was going to help the indigenous people.”
Early on, Nilsson didn’t seem particularly interested in hammering out the details of a potential forest project.
I am sitting in a comfortable gold folding chair inside one of the many ballrooms at the Georgia International Convention Center. The atmosphere is festive, with a three-course dinner being served and children playing a big-band number. The kids are students at a KIPP academy in Atlanta, and they are serenading future teachers on the first night of a four-day-long series of workshops that will introduce us to the complicated language, rituals, and doctrines we will need to adopt as Teach for America “Corps Members.”
The phrase “closing the achievement gap” is the cornerstone of TFA’s general philosophy, public-relations messaging, and training sessions. As a member of the 2011 corps, I was told immediately and often that 1) the achievement gap is a pervasive example of inequality in America, and 2) it is our personal responsibility to close the achievement gap within our classrooms, which are microcosms of America’s educational inequality.
These are laudable goals. According to the National Center for Education Statistics, white fourth-graders performed better than their black peers on 2007 standardized mathematics exams in all 46 states where results were available. In 2004, there was a 23-point gap in mathematics scale scores between white and black 9-year-olds, with the gap growing to 26 points for 13-year-olds.
But between these two messages lies the unspoken logic that current, non-TFA teachers and schools are failing at the task of closing the achievement gap, through some combination of apathy or incompetence. Although TFA seminars and presentations never explicitly accuse educators of either, the implication is strong within the program’s very structure: recruit high-achieving college students, train them over the summer, and send them into America’s lowest-performing schools to make things right. The subtext is clear: Only you can fix what others have screwed up. I noticed this implication in an e-mail I received during Institute, the five-week training program, which referred to “a system of students who have simply not been taught.” The e-mail explained, “That’s really what the achievement gap is—for all of the external factors that may or may not add challenges to our students’ lives—mostly it is that they really and truly have not been taught and are therefore years behind where they need to be.”
I later asked a TFA spokesperson if this e-mail reflects the organization’s official views on traditionally trained teachers. He denied that TFA believes “the shortcomings of public education” to be “the fault of teachers. If anything,” he added, “teachers are victims of more-structural problems: inequitable funding; inadequate systems of training and supporting teachers; the absence of strong school and district leadership.” Nonetheless, at the time, the dramatic indictment of America’s non-TFA teachers would stay with me as I headed into the scandal-ridden Atlanta Public Schools system.
In the weeks between accepting the offer to join TFA and the start of our training, I was told by e-mail that “as a 2011 corps member and leader, you have a deep personal and collective responsibility to ground everything you do in your belief that the educational inequality that persists along socioeconomic and racial lines is both our nation’s most fundamental injustice and a solvable problem. This mindset,” I was reminded, “is at the core of our Teach For America—Metro Atlanta Community.”
[Image: Teach for America Delta Institute, Julia Sweeney, HO/AP Photo]
We don’t know much about Meredith Hunter other than that he killed the American Hippie. We know that his friends called him Murdock, and that he was 18, and that there were three weeks until the last day of the 1960s. Some 300,000 people had gathered at the Altamont Raceway Park near San Francisco for Woodstock’s Pacific reincarnation, but of the increasingly violent masses, he was the only one who stormed the stage with a gun, and the only one who was stabbed to death by a Hell’s Angel.
Today, we know Hunter mostly in the context of his death, but even there he’s just a metaphor. In the rise-and-fall narrative of hippie culture, he is simply the Altamont tragedy, and Altamont is known as the day the music died.
In his reflections on the recent anniversary of the September 11th attacks, John Cassidy discusses the human “saliency bias”—our habit of forming memories around jarring events rather than, say, a series of minor incidents whose cumulative impact is about the same. This mechanism explains how and why history can link a generation’s implosion to one day at the end of the decade. For both sides of the culture, the tragedy’s gruesome rawness gave legitimacy to the concern that peace and love were quite literally killing the country.
Consider Olivia Rotondo, whose by-all-accounts-normal life suggests that her death could have happened to anyone. Four hours after tweeting her excitement about the Electric Zoo Festival on New York City’s Randall’s Island, she collapsed in front of a paramedic, saying the seven words that in the weeks since have become a macabre Exhibit A in the campaign against the drug that is said to have killed her.
“I just took six hits of Molly.”
She died that night. Jeffrey Russ, a 23-year-old also believed to have taken MDMA (the drug’s proper name), had passed away 18 hours earlier. The following day—what would have been the grand finale to the three-day gyration of 100,000 neon-clad ravers—Randall’s Island was deserted and silent.
Since it first plugged in its equipment five summers ago, Electric Zoo has marked the end of the annual electronic festival season in the United States, the centerpiece each year of one of the country’s most mainstream and lucrative new artistic industries. In 2012, electronic dance music (EDM) spawned eleven platinum hits and increased the population of Miami by one quarter for one of the biggest American musical events since Woodstock. The industry has repackaged and commoditized the two-decade-old EDM mantra of “Peace, Love, Unity, and Respect” (usually abbreviated to “PLUR”) that apparently captures what this whole vision, with its bass drops and Day-Glo campiness, and a certain synthetic chemical stimulant, has always been about.
It’s too soon to tell how the Electric Zoo tragedies will influence the cachet of either the music or MDMA use in America, though many believe they go hand-in-hand, to such an extent that it’s hard to determine exactly which came first.
“If you look at electronic dance music culture, it seems to be more diverse, more accepting of the ‘other’, more welcoming of gay people—a counter-ethos of ‘we’re in it together,’” Dr. Rick Doblin, founder of the Multidisciplinary Association for Psychedelic Studies (MAPS), told me. “There’s a spiritual aspect to it. For many, the drug serves that function. There’s something fundamentally wholesome about these communal dance parties.”
[Image: David McNew/Reuters]
Last year was a busy one for public giveaways to the National Football League. In Virginia, Republican Governor Bob McDonnell, who styles himself as a budget-slashing conservative crusader, took $4 million from taxpayers’ pockets and handed the money to the Washington Redskins, for the team to upgrade a workout facility. Hoping to avoid scrutiny, McDonnell approved the gift while the state legislature was out of session. The Redskins’ owner, Dan Snyder, has a net worth estimated by Forbes at $1 billion. But even billionaires like to receive expensive gifts.
Taxpayers in Hamilton County, Ohio, which includes Cincinnati, were hit with a bill for $26 million in debt service for the stadiums where the NFL’s Bengals and Major League Baseball’s Reds play, plus another $7 million to cover the direct operating costs for the Bengals’ field. Pro-sports subsidies exceeded the $23.6 million that the county cut from health-and-human-services spending in the current two-year budget (and represent a sizable chunk of the $119 million cut from Hamilton County schools). Press materials distributed by the Bengals declare that the team gives back about $1 million annually to Ohio community groups. Sound generous? That’s about 4 percent of the public subsidy the Bengals receive annually from Ohio taxpayers.
In Minnesota, the Vikings wanted a new stadium, and were vaguely threatening to decamp to another state if they didn’t get it. The Minnesota legislature, facing a $1.1 billion budget deficit, extracted $506 million from taxpayers as a gift to the team, covering roughly half the cost of the new facility. Some legislators argued that the Vikings should reveal their finances: privately held, the team is not required to disclose operating data, despite the public subsidies it receives. In the end, the Minnesota legislature folded, giving away public money without the Vikings’ disclosing information in return. The team’s principal owner, Zygmunt Wilf, had a 2011 net worth estimated at $322 million; with the new stadium deal, the Vikings’ value rose about $200 million, by Forbes’s estimate, further enriching Wilf and his family. They will make a token annual payment of $13 million to use the stadium, keeping the lion’s share of all NFL ticket, concession, parking, and, most important, television revenues.
After approving the $506 million handout, Minnesota Governor Mark Dayton said, “I’m not one to defend the economics of professional sports … Any deal you make in that world doesn’t make sense from the way the rest of us look at it.” Even by the standards of political pandering, Dayton’s irresponsibility was breathtaking.
In California, the City of Santa Clara broke ground on a $1.3 billion stadium for the 49ers. Officially, the deal includes $116 million in public funding, with private capital making up the rest. At least, that’s the way the deal was announced. A new government entity, the Santa Clara Stadium Authority, is borrowing $950 million, largely from a consortium led by Goldman Sachs, to provide the majority of the “private” financing. Who are the board members of the Santa Clara Stadium Authority? The members of the Santa Clara City Council. In effect, the city of Santa Clara is providing most of the “private” funding. Should something go wrong, taxpayers will likely take the hit.
The 49ers will pay Santa Clara $24.5 million annually in rent for four decades, which makes the deal, from the team’s standpoint, a 40-year loan amortized at less than 1 percent interest. At the time of the agreement, 30-year Treasury bonds were yielding 3 percent, meaning the Santa Clara contract values the NFL as a better risk than the United States government.
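The less-than-1-percent figure follows from the numbers already on the table: $24.5 million a year for 40 years repays only about $980 million against roughly $950 million borrowed. A short computation makes the implied rate explicit (a sketch only; the function names are ours, and the round figures come from the deal as described above):

```python
def annuity_pv(payment, rate, years):
    """Present value of a level annual payment stream at a given discount rate."""
    if rate == 0:
        return payment * years
    return payment * (1 - (1 + rate) ** -years) / rate

def implied_rate(payment, principal, years, lo=0.0, hi=0.10):
    """Bisect for the interest rate at which the payments exactly repay the principal."""
    for _ in range(200):
        mid = (lo + hi) / 2
        if annuity_pv(payment, mid, years) > principal:
            lo = mid  # payments are worth more than the loan, so the rate is higher
        else:
            hi = mid
    return (lo + hi) / 2

# Santa Clara deal: ~$950 million borrowed, $24.5 million in rent per year for 40 years
rate = implied_rate(payment=24.5e6, principal=950e6, years=40)
print(f"implied annual rate: {rate:.2%}")  # well under 1 percent
```

Comparable 40-year borrowing at the Treasury’s 3 percent would demand annual payments of roughly $41 million, not $24.5 million, which is the sense in which the contract treats the NFL as a better risk than the federal government.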
Although most of the capital for the new stadium is being underwritten by the public, most football revenue generated within the facility will be pocketed by Denise DeBartolo York, whose net worth is estimated at $1.1 billion, and members of her family. York took control of the team in 2000 from her brother, Edward DeBartolo Jr., after he pleaded guilty to concealing an extortion plot by a former governor of Louisiana. Brother and sister inherited their money from their father, Edward DeBartolo Sr., a shopping-mall developer who became one of the nation’s richest men before his death in 1994. A generation ago, the DeBartolos made their money the old-fashioned way, by hard work in the free market. Today, the family’s wealth rests on political influence and California tax subsidies. Nearly all NFL franchises are family-owned, converting public subsidies and tax favors into high living for a modern-day feudal elite.
[Image: Matt Lehman]
Gao, the largest city in northern Mali, is a place of extremes. It’s a sprawl of one- and two-story mud-brick houses that lack power lines and running water, but it’s also home to the garish, McMansion-style estates of Cocainebougou, or “Cocaine Town,” a deserted neighborhood that once belonged to Arab drug lords who controlled the region’s smuggling routes for hashish and cocaine but fled, fearing reprisals from local citizens who blamed them for the Islamist invasion. The city has few high schools and no universities, but many of Mali’s leading guitarists and percussionists learned their craft in Gao’s decades-old youth orchestras; it is a proudly secular city that also houses the Tomb of Askia, one of the oldest mosques in Africa, built in the 15th century to honor a regional ruler. Gao was for centuries best known as the capital of the ancient Songhai Empire, which once controlled a region larger than present-day Mali. In the summer of last year, an al‑Qaeda affiliate known as AQIM, for “al-Qaeda in the Islamic Maghreb,” took over Gao and made it the capital of the rump state the group created after forcing the Malian army out of the north. Months earlier, the Tuareg, a separatist minority long bent on independence, had laid the groundwork for AQIM and its Islamist allies when they captured the city. When I visited northern Mali in March of this year, a black-metal billboard the extremists had erected on the main road leading into the city was still welcoming visitors to the “Islamic City of Gao.”
French air and ground forces reconquered the north this past January, bringing the region back under the nominal control of Mali’s fragile central government. Camouflage pickup trucks full of Malian soldiers now rumble down Gao’s otherwise empty streets, and a handful of small bars and restaurants have reopened. Castel and other Malian beers, strictly forbidden under the Islamists, are freely available, though they’re usually served warm because of the city’s frequent power outages. I walked through the main bazaar one afternoon with Baba Douglass, an affable, rotund man who works as a top adviser to Gao’s mayor, Sadou Diallo. Teenagers hawked Nokia cellphones and women in brightly colored blue dresses and head scarves peddled warm bread and cake, calling out prices as we passed. Douglass pointed to a pair of canary-yellow bulldozers looming over a fenced-off expanse of dirt and stone. “That’s where the new central market building is going,” he told me. “If things stay quiet, it will be open by the end of the year.”
That’s a big if. Mali’s central government now runs Gao, but many locals believe that the jihadists who controlled the city last year have melted away into the surrounding countryside, where they are waiting out the French. France launched its military campaign on January 11 with a series of air strikes on insurgent targets. Thousands of French ground troops poured into the country later that month and began pushing north. At the peak of the campaign, more than 4,000 French soldiers were in Mali, but the French military has announced plans to withdraw about 3,000 of them by the end of the year. Paris will pull out the remaining troops next year, leaving behind an unspecified number of special forces and trainers to mentor the Malian security forces, and will also support a new United Nations peacekeeping force of 12,600 troops drawn from other African countries. But many ordinary Malians still fear that their country’s armed forces won’t be able to fill the void.
After saying goodbye to Douglass, I made my way through the remains of a walled compound that once housed the mayor’s offices. About a dozen militants had snuck in days before and lobbed grenades at a convoy of passing Malian military vehicles, kicking off a fierce gun battle that raged for more than seven hours. French forces relieved the overmatched Malian soldiers and eventually killed all the attackers, but the fighting left the compound in ruins, two of its yellow walls reduced to piles of scorched concrete and rebar. The ground was littered with spent cartridges, scraps of clothing, and razor-sharp shrapnel. The compound’s custodian, Hasan Haidara, led me into a garage and pointed to a splotch on the floor that looked like brown paint. “Blood from one of the jihadis,” he told me. Haidara, who’d been trapped in the compound during the attack, said several of the fighters were Arabs. “They were not from Mali,” he said emphatically. “They were not from here.”
I heard a similar refrain from an array of Malian and American security officials. Gao’s central jail is housed in a defunct two-story health clinic a short drive from the mayor’s compound. When I visited, the warden, Captain Ballo Banfa, told me that many of his prisoners had come from Algeria, Tunisia, Nigeria, and other neighboring African countries. Captain Ibrahim Sanogo, an intelligence officer at a nearby Malian military base, told me that he’d listened in on radio conversations between rebels speaking English, Fulani, and Hausa, three of the primary languages of neighboring Nigeria, and personally interrogated captured fighters from Burkina Faso and Chad. France captured two of its own citizens allegedly fighting alongside the Islamists in northern Mali and is holding them on terrorism charges. U.S. officials say foreign fighters from across Africa have been flowing into Mali to earn their jihadist bona fides and gain tactical experience battling a well-armed Western military. “Northern Mali has become a jihad front,” said a U.S. official familiar with the region, who spoke on condition of anonymity. “People think of northern Mali like they thought of Chechnya in the late ’90s—as someplace where you can go and do your part to restore the caliphate.”
The foreign militants battling Malian and French troops across northern Mali are part of a little-noticed but hugely important shift.
[Image: Issouf Sanogo/AFP/Getty]