When LeBron James and Chris Bosh came to the Miami Heat in 2010, the expectation was clear. They would win championships, plural—not one, not two, not three, not four, but some other, embarrassingly high number they would surely regret ever mentioning in their premature victory parade.
The assumption was that James—and, to a lesser extent, power forward Bosh—would provide value to the team not just through their own productivity, but through the effect they would have on those around them. A good player, after all, makes his teammates better—the rare sportswriter cliché often backed up by the numbers. In the NBA, where superstars are rare, the surest way to improve an organization tends not to be patiently gathering valuable, mid-tier employees; it’s luring a star to your city and letting him transform the organization from the inside.
The belief in the transformative power of stars goes well beyond the NBA; it’s reflected in the outrageous salaries for CEOs and the scramble for talent in Silicon Valley. “Someone who is exceptional in their role is not just a little better than someone who is pretty good,” Mark Zuckerberg told The New York Times in 2011. “They are 100 times better.”
In the early 2000s, Dan Pallotta was the head of a humming 300-employee business working out of a 47,000-square-foot office in Los Angeles. The company produced charity events—“AIDS Rides” and “Breast Cancer 3-Days” that Pallotta invented and marketed, raising $305...
Read through enough social studies experiments and your view of humanity inevitably grows more jaundiced. We all know our species has its foibles and shortcomings, but modern researchers have found ways to tease apart each shameful emotion—homing in on the specific biological mechanisms that make us jerks, cataloguing and quantifying our base nature. A tumor is awful from a distance, as a generality. Zoom in closer, perform a few studies, and you confront the cells in all their uniquely horrific, malignant glory.
Take flattery. Academics define it as “lavish praise that is offered in an ingratiation setting”—a nice compliment when the person giving it may have something to gain. A closer look, though, reveals that a complex series of emotions, most of them unsightly, bubble to the surface whenever someone says a kind word about another human being.
For all of humanity’s collective awfulness—our vindictiveness and stupid hats, humblebrags and genocides—there have nonetheless been moments in which we’ve proven capable of genuine decency and compassion. We can be good. In fact, on multiple, documented occasions, one human has helped another without expecting anything in return.
This is altruism, a quirk of evolution that leads one human being to give a coconut to a sick neighbour, never mind the cost to himself. It’s one of the traits that has helped us thrive, allowing us to cooperate with people well outside our immediate families or tribes to build the kind of large-scale societies that go far beyond what other creatures have managed. It’s that sense of common humanity that makes us not only successful, but tolerable, and sometimes even sympathetic as a species.
Consider a situation: a number of people walk into a restaurant at the same time. The server attends to all the white customers. The last customer served happens to be the only person of colour. Is this racism?
What about something more general: how would you describe the portrayal of African Americans in the US entertainment media? Is there racism there?
It’s long been understood that how you respond to these kinds of questions is heavily influenced by the colour of your skin. Numerous studies have shown that people from the dominant racial group perceive far less racism in mainstream society than people from subordinate racial groups. The real question is: who is seeing things as they really are?
Early on the morning of February 20, 1976, two police officers approached a green Camaro parked at a rest stop near Pompano Beach, Florida. Inside, Jesse Tafero and his partner Sonia Jacobs were sleeping, along with Jacobs’ two young children. Tafero’s friend, Walter Rhodes, was asleep in the driver’s seat.
Exactly what happened next remains unclear. What’s certain is that one of the patrolmen asked the two men to step out of the vehicle, shots were fired, and both officers were killed. Tafero, Rhodes and the family tore down the interstate in the stolen police car. When they were stopped at a roadblock, the gun used in the shooting, which was registered to Jacobs, was in Tafero’s waistband.
Rhodes immediately struck a deal. In exchange for a lighter sentence, he testified that Jacobs and Tafero were solely responsible for the shooting. The couple, meanwhile, insisted that Rhodes, who was on probation, had panicked, shot the officers, and then given Tafero the gun while he drove away. Someone was lying, but with little physical evidence, it was difficult to know which story was true and which was false.
It is a sad paradox of modern existence that on a planet thick with humans—a place chock-full of them—so many are so desperately alone. A recent survey found that more than a third of Americans over 44 are lonely, and almost half of them have felt that way for more than six years. Here we are, desperate mariners floating through a sea of humanity—people everywhere but not a one to have a casual drink with on a Thursday evening while chatting about the latest episode of True Detective. What are we doing wrong?
There are plenty of potential reasons for this state of affairs, enough theories to fill sociological textbooks and fuel a thousand think-pieces. Is the big anonymous city isolating us? Whatever happened to bowling and community? Is it Facebook’s fault? One common explanation, the scapegoat in plenty of vaguely countercultural movies and high-school pot-smoking bullshit sessions, is materialism. Call it the Fight Club Thesis: our love of objects is making us sad.
Why do rich people work so much?
The quick, glib answer is that hard work is what made them rich in the first place. But then, why keep grinding away? At what point does it make sense to stop accumulating riches and start enjoying them? Are we even capable of knowing when enough is enough?
My grandmother dated a lot. This wasn’t out of some essential wantonness or a particular desire to play the field: it was a matter of circumstance. If you were the only available woman where you lived, you’d get asked out a lot, too.
In 1921, Toronto had 1,947 Chinese men and just 88 women. That ratio became even more skewed after the Exclusion Act, a piece of legislation inspired by fear of the “Yellow Peril” that banned all Chinese immigrants from the country. By the time my grandmother snuck through in 1937, using her cultural visa as an opera singer to escape her home country after Japan sacked Nanjing, she was one of just a handful of Chinese women of marrying age in the city. My grandfather—a nice guy with strong English skills and good prospects—got lucky.
For most other men, though, mid-century Chinatowns across the continent were lonely places.
If the Grammys are good for anything—and this remains an open question—it’s the sociological pleasure of watching today’s pop stars fake-smile, cheer, and side-eye their way through a three-hour ceremony in a room full of their closest frenemies. Thanks to its long-running habit of matching incongruous artists—flamboyant Elton John with noted homophobe Eminem!—the show creates a particularly rich stew of faux-friendliness seasoned with animosity. Macklemore wins best rap album and, in an amazingly self-aggrandizing act of public humility, publishes the apology text he wrote his bud Kendrick Lamar telling him that he was robbed. Taylor Swift dances in her chair, snaps a photo with Lorde, and tweets the results to cement their new BFF-itude: “And you know… We’re on each other’s team. #LORDE #CLEANINGUP #GRAMMYs.”
The feeling at the Grammys was that affability was the way to get ahead, that the best way to rise in the ranks in pop music—as in a newsroom or government bureaucracy or high school—is to make as many friends as possible, network like crazy, then reap the rewards. Acting aggressive, meanwhile, is seen as antisocial, irrational behaviour. Kanye refuses to let Taylor Swift finish and is immediately castigated. Aggressive parties lash out at someone in reaction to a perceived slight, then watch their social standing plummet.
But, in “Aggression, Exclusivity, and Status Attainment in Interpersonal Networks,” a study published in the journal Social Forces, sociologist Robert Faris argues that, contrary to popular opinion, acting like an asshole isn’t a problem in need of a solution—it’s a strategy that can help you clamber into the elite.
Humans are terrible narcissists. We’re forever gazing out into the world searching for our reflected image, unilaterally assigning human charms and foibles to creatures and objects that were doing perfectly fine on their own. We give human names to gerbils and hurricanes. We make cancer “vengeful” and turn love into some chubby baby with a bow and arrow. Our affections are so malleable, our brains so eager to embrace the merest insinuation of humanity, that you can stick some googly eyes on a rock and it will immediately earn a place in our hearts. It’s kind of weird, but it’s who we are.
Our tendency to anthropomorphize everything from pets to geometric shapes to abstract concepts helps us make these things worthy of our care and sympathy. Children, who have little experience with non-human entities, will see the sun as smiley or the clouds as frowny because human motivations are what make sense to them. Anthropomorphizing can be a way of understanding something, of creating empathy.
The thing about being born with a human brain is that it’s easy to take it for granted. This is, after all, an instrument that can not only memorize African capitals and multiply small numbers but also create elaborate erotic scenarios between Rashida Jones and a young Marlon Brando while riding the bus. Yet, without a keynote presentation by some turtle-necked figure on a darkened stage enumerating its many features and applications, the brain can feel mundane, workaday.
One of the more overlooked features of the human mind is the ability to time travel. Close your eyes and you can feel yourself in your ninth-grade classroom or your childhood kitchen, reliving past triumphs and humiliations from the comfort of your living room. Episodic memory, as opposed to semantic memory, is the ability to instantaneously immerse yourself in a personal past, rather than the ability to recall that Abuja is the capital of Nigeria. You are inserting a past experience into the stream of your present consciousness. It is, when you think about it, a wondrous thing.
Last week, Pitbull’s “Timber” reached number one on the Billboard Hot 100. The song, which features Ke$ha singing “It’s going down, I’m yelling timber” over a faux-country harmonica hook, also topped the digital charts, downloaded by 301,000 people. The video of Ke$ha line dancing in a country bar and Pitbull petting sharks, swimming while wearing a suit, standing on a beach near a beautiful woman, and generally just having a good time being Pitbull has been seen over 57 million times on YouTube. “We about to clown. Why?” the Miami rapper rhetorically asks a crowd of people convinced and utterly delighted by the answer: “‘Cause it’s about to go down!”
The track’s success brings up an important question: why do people listen to this?
I didn’t realize I was going to my first nostalgia concert until I was right in the middle of it, standing in the grass on a warm Toronto evening last summer listening to Broken Social Scene play 2002’s You Forgot It In People front-to-back under the shadow of the Gardiner Expressway.
A decade isn’t such a long time, but it’s enough to get a sense of distance. Ten years in, memories have been fermenting just long enough to take on that potent “back-then” quality that is the prerequisite for nostalgia.
We live in the age of the public apology. Turn on your TV or open a magazine and, chances are, you’ll find somebody begging for your forgiveness, promising to be a better person.
What was once a sign of weakness has become a badge of moral strength. Corporations release official apologies when their factories in Bangladesh collapse. Celebrities embark on carefully organized, Oprah-approved contrition tours—shedding a tear, getting a stern talking-to from someone on the Today Show, then huddling with their publicists to monitor their Q scores. Politicians offer a heartfelt mea culpa whenever a new dick pic pops up on an intern’s cell phone and are more than willing to liberally spread around the official apologies for past atrocities committed by their forefathers (Japanese-Canadians thrown into internment camps, residential school victims, Chinese immigrants who paid the head tax: we offer to you a belated but entirely heartfelt “I’m sorry”).
When Canada’s pandas arrived last year, they were met on the airport tarmac by Prime Minister Stephen Harper, various dignitaries, and an orchestra of Canadian children playing sweet music in their honour. The bears reacted exactly the way a panda reacts to most anything that happens in its vicinity—by sitting, blinking, chewing on bamboo, and looking blankly into the middle-distance. People whooped, cameras flashed. This is how celebrity works: your every snack and bewildered yawn becomes noteworthy.
The arrival was the culmination of years of political haggling—the panda dreams of prime ministers from Trudeau onwards at last made flesh. According to a recent study in the journal Environmental Practice by three researchers from the University of Oxford, it also marked the beginning of the newest phase in China’s “panda diplomacy,” the term used to describe the country’s practice of gifting adorable bears to countries in order to build strategic friendships.
Over the last few weeks, Canadians who pay even the vaguest attention to politics have been forced into an extended meditation on the concept of shame. How does it feel, really? Can we catalogue its infinite shades and varieties? And, most urgently, what happens in its total absence?
It seems impossible, but does Stephen Harper really not feel a twinge of embarrassment standing in front of parliament, furiously avoiding questions from MPs while trotting out hapless mopes like Paul Calandra? How does someone like Mike Duffy, who seems to have tried to charge taxpayers for meals he ate at home, manage to summon so much righteous indignation? And then, of course, there’s Rob Ford, who, during the council meeting that stripped him of some of his mayoral powers, mimed drinking and driving, bowled over a female colleague, and ventured over to the gallery to mock the taxpayers he claims to love so dearly. “Shame! Shame!” the gallery shouted. Ford didn’t respond, as if the word were a foreign concept in a language he couldn’t possibly understand.
Edward Furlong wasn’t looking for fame. In 1990, the 12-year-old was hanging out on the steps of the Boys & Girls Club in Pasadena when a casting director approached him. The kid had the right face. Would he think about auditioning for a role? The movie was Terminator 2: Judgment Day, and the unknown Furlong was cast as John Connor—the adorable scamp and future saviour of the world who teaches Arnold Schwarzenegger’s roided-up robot some questionable slang and, also, exactly why humans cry.
Furlong had never been in a blockbuster before. He’d never acted at all. Now, suddenly, he was everywhere—the heartthrob on the cover of teen magazines, a model for Calvin Klein and the Gap. He was big in Japan, singing a schlocky cover of a Doors song. He was in the video for Aerosmith’s “Livin’ on the Edge” as a teenage badass—crashing a stolen car and contemptuously throwing away the condom his dad had just given him. He was very, very famous.
Language is kind of like the economy; even though humans supposedly invented it, we don’t really get how it works. One idea that’s been around for a long time (since Ferdinand de Saussure, one of the fathers of semiotics) is that words are essentially arbitrary—the word “table” has nothing to do with the object it describes. It’s not that all things made of wood start with the letter t, or that the cross on the t is supposed to look like a tabletop. Even in languages that do have a comparatively high degree of iconicity—American Sign Language, Chinese—some words look like the thing they describe, but many do not.
Even before the current crisis, Egypt had a problem: it has long been a populous nation with an iffy economy. Mubarak was all about cajoling Egyptians into having fewer children; in the 1990s, state television ran ads with slogans like “Before you have another baby, secure its needs.” And since the 1960s, Egypt’s birth rates have declined, aside from a spike in 2012.
A new study in the journal Demography, conducted by a team of researchers from Atlanta’s Emory University, the University of Chicago, and Johns Hopkins University, examines childrearing from an “investment” perspective. The research for the study was completed before the 2012 spike in Egypt’s birth rate, but the question the authors ask is still relevant: What makes people put resources into their children, and will they get the return on this investment that they expect?