Let’s say you’re an utterly humourless individual with a zeal for self-improvement. Not to worry! A quick Google search for “how to have a sense of humour” leads you to a wikiHow page that helpfully lays out how to do exactly that in just six steps. When is a good time to laugh? As someone new to laughter, will I find some things funnier than other things? Should I “take in a funny movie or YouTube clip on occasion”? All questions are answered. “Having a sense of humor is one of the greatest assets a person can have,” the guide asserts. “If you don’t have a sense of humor, you have a lot to learn!”
Written like an instruction manual for being a functional human, the guide is as bizarre and useless as it sounds, with suggestions like “making bad puns such as those in Airplane (‘I’m not kidding, and don’t call me Shirley’) can be used anywhere.” What’s interesting is how quickly the how-to guide bumps into serious complications in defining what we usually assume is a basic, easily understood concept.
In a Super Bowl commercial from 2012, an adorably shabby rescue dog is trained to retrieve Bud Lights on command. He brings his owner a beer. Then he drags over a couple of cold ones for the owner’s friends. Some pretty girls arrive, and the dog is sent to the fridge to beer them, too. At the end the dog does a keg roll, adorably. Everyone enjoys a Bud Light. Scene.
Ask me how I feel about the ad and I might say that it’s kind of weird to celebrate the enslavement of a rescue dog—should pets be forced to become our bartenders?—and then maybe I’d add something sniffy about light beer drinkers. But my brain would tell the truth. And the truth, according to a soon-to-be-published study in Nature Communications, is that my neurological reaction is likely the same as everyone else’s—a clear sign that the ad is bound for glory.
The bodies of the passengers of Malaysia Airlines Flight 17 began arriving at Eindhoven airport in the Netherlands yesterday—40 simple wooden coffins unloaded from military transport planes while a single bugle played on the tarmac.
The ceremony marked the end of a trip that had been, to that point, significantly less dignified. Shot out of the sky, the bodies had been left for days in a Ukrainian wheat field while the sun beat down and untrained volunteers, townspeople, and coal miners picked through their belongings. They’d been squabbled over by rebels, packed into black plastic bags, stacked onto refrigerated trains that gave off the powerful stench of decomposition.
For the relatives of the passengers, the delay has been excruciating. “If I have to wait five months for identification, I can do it,” said Silene Fredriksz-Hoogzand, the mother of one of the victims. “Waiting while the bodies were in the field and in the train was a nightmare.”
Last week, Islamic State militant leader Abu Bakr al-Baghdadi was caught on tape railing against western decadence while seemingly wearing a luxury wristwatch (his supporters quickly countered that the watch was, in fact, a cheaper Saudi Arabian make). That same week, the Wisconsin Republican party attacked the Democratic gubernatorial candidate for calling for an end to out-of-state campaign donations while accepting a million bucks herself, the British Education Secretary was criticized for demanding low-cost schools and then approving a fancy new headquarters, and Ottawa sex workers briefly considered outing a specific group of clients—Conservative MPs currently pushing through a harsh anti-prostitution bill.
Throughout all of this, Rob Ford continued to live and breathe, speaking words and performing actions in perfect opposition to one another, rumbling through an election campaign as if propelled by the electromagnetic force of his perpetual hypocrisy.
In 1991, the FBI ran a background check on Steve Jobs, who was being considered for a position on President George H. W. Bush’s Export Council. The more than 29 people the FBI interviewed all agreed that Jobs, then the president of NeXT and the CEO of Pixar, would be excellent in the position. He was a tech industry visionary, an incredibly creative individual with foresight and charisma.
There were, however, concerns about his integrity. As two former Apple employees said: “Jobs possesses integrity as long as he gets his way.” One mentioned the daughter he had denied was his own until a paternity test proved otherwise. According to the report: “Several sources questioned Mr. Jobs’s honesty stating that Mr. Jobs will twist the truth and distort reality in order to achieve his goals.”
Why does Luis Suarez bite? It is a bottomless mystery, a puzzle that inspires sportswriters to take on quests to find the Uruguayan’s childhood foes and tempts pundits into misguided bouts of psychoanalysis. “Perhaps his biting started in childhood and was triggered by something, perhaps he was bitten in turn,” The Telegraph mused. “Human bites were the third most-treated kind of mammal bites in the emergency room,” Motherboard helpfully reported. Is Suarez football’s answer to the fable of the scorpion and the frog? Does he lash out, destroying himself in the process, because it is simply in his nature?
Yesterday, FIFA provided an answer to a more pragmatic question: what do you do with a biter like Luis Suarez? It announced that the striker has been suspended for nine matches and will be banned from participating in “soccer-related activity” for the next four months.
One of the more predictable pleasures of the World Cup—as comforting in its inevitability as the sight of the first Portuguese flop or the final Englishman in tears, pink face crumpled in bitter disappointment—is watching as a certain segment of the American pundit class works itself into a fit over all of this so-called “football.”
The anti-soccer diatribe is a venerable American tradition, a roomy enough genre to include the casually homophobic comments of sportscaster Jim Rome (“My son is not playing soccer. I will hand him ice skates and a shimmering sequined blouse before I hand him a soccer ball”) as well as the unhinged xenophobia of G. Gordon Liddy: “This game, I think, originated with the South American Indians, and instead of a ball they used to use the head, the decapitated head, of an enemy warrior.”
Imagine that you’re standing on a footbridge overlooking a train track. Beneath you, a small train is hurtling towards five unsuspecting people. The only way to save them is if a heavy object blocks the train’s path. As luck would have it, you are standing next to an extremely fat man. Do you push the man, killing him, but saving the other five? Or do you do nothing, spare the overweight innocent, but let the five others get mangled by the oncoming locomotive?
The dilemma is a variation of the “trolley problem,” a classic ethics thought experiment that has been used for decades, with various tweaks. What’s going on in your brain when you make this decision? You’re relying on some sort of moral code, but how exactly are you weighing your options?
In a study published last month in PLOS ONE, Albert Costa of the Pompeu Fabra University and University of Chicago psychologist Boaz Keysar argue that your response to these kinds of moral dilemmas may not be based on well-considered logic or deeply held values, but could be influenced by something as seemingly frivolous as whether or not you’re confronting the problem in a foreign language.
Here’s a puzzle: take a book of matches, a candle, and a box of thumbtacks. Now, figure out how to attach the candle to the wall so that it doesn’t drip all over the floor. Do you melt the side a little and attempt to stick it in place? Do you try to pin it to the wall? Maybe create a makeshift holder out of matchsticks and some wax?
The Duncker Candle Problem, as it’s called, is a cognitive behaviour test that’s been around since 1945. It’s used to measure your capacity for creative thinking, and, according to a 2013 study, your ability to solve it may be related to how racist you are.
An inability to MacGyver a candle holder out of a few odds and ends is not obviously related to casual racism, but Carmit Tadmore of Tel Aviv University and her colleagues believe they share a common mechanism: a reliance on rigid, categorical thinking. In a study published in the journal Psychological Science, the researchers conducted a series of experiments to test whether views about race affected people’s mental flexibility. Does a belief in racial essentialism make you less creative?
Adolescence is basically a morass of self-consciousness. It threatens to swallow every school dance, every class presentation, every public dinner with your increasingly embarrassing parents.
A “fun” class activity in which everyone’s invited to bring in a favourite song quickly becomes an anxiety-inducing referendum on taste—an instant assessment of teenage cultural capital. A simple walk through a high-school hallway filled with older, cooler people can suddenly be hijacked by a pang of acute self-awareness—a hypersensitivity to the fundamental mechanics of your body, as basic functions usually handled by some autopilot section of your brain suddenly demand careful monitoring. Is this how you’ve always walked, with this weird strut? What are you doing with your lips? Is this creepy half-smile supposed to communicate that you’re lost in reverie, reflecting on some charming anecdote from your past? Wait, are you whistling now?
Do I remember exactly what was flashing through my mind when I first saw the video of that tiny hamster eating its tiny burritos, each minuscule, lovingly crafted snack gripped between the creature’s front paws before being tidily devoured? I do not. It passed in a blur. I can only say that I clicked the link, saw exactly what had been promised (a tiny-ish hamster eating a truly tiny burrito), and immediately showed it to my girlfriend.
Obviously, I wasn’t the only one. The YouTube clip already has more than six million hits. It’s gotten Twitter endorsements from Jimmy Kimmel and Ellen. It’s been posted and reposted on Gawker, HuffPo and the rest of those aggregation sites that promote the same viral content with either an ironic smirk or a burst of enthusiasm, depending on the audience.
The video is a viral hit, sure, but why exactly did it catch on?
Lies stick around in politics. Run an implausible attack ad, call your opponent a monster, and even after you’ve been thoroughly debunked, the negative feelings will linger. Fact-checkers and reporters can busy themselves correcting the record and calling out exaggerations all they want, but it’s impossible to fully undo the effects of a well-placed fib.
There have been plenty of studies about the way erroneous negative information about a politician creates feelings that last well after that information has been discredited. In one 2007 study by political scientist John Bullock, test subjects read about a fictional candidate with unpopular views on the environment and education. Later, they were told that the information was false—that the experimenter had simply made it up. Despite this, the participants still maintained a disapproving attitude towards the candidate. It didn’t matter that the facts had changed; the emotions remained.
Here’s a scenario: imagine you’re given a pie and told to share it with a stranger. The knife is in your hands, you can divvy it up any way you choose, but with one caveat: if the stranger doesn’t agree to the split, the pie will be taken away and you’ll both be left with nothing. What’s your offer? How equal a share do you propose? And, if you’re the stranger, how do you respond if someone offers you, say, a quarter of the pie? What about a single measly slice?
The pie quandary is the essence of something called the “ultimatum game,” a common tool in economics experiments. In a perfectly rational world in which each player is out to get the biggest reward, the best move for the responder is to accept any offer. Assuming she will play this way, the best move for the knife-wielder is to make the smallest offer possible and take home the biggest slice. Even a sliver of pie, the thinking goes, is better than no pie at all.
Studies have long found, however, that this is not how people actually play. The person in charge of the split often offers far more than necessary. And responders frequently reject offers, even though it means leaving empty-handed.
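The game’s payoff logic is simple enough to sketch in a few lines of code. This is only an illustration of the rules described above, not anything from the studies themselves; the “fairness threshold” is a hypothetical parameter standing in for a responder’s minimum acceptable share.

```python
# Payoff logic of the ultimatum game (illustrative sketch).
# PIE is the total number of slices; "threshold" is a hypothetical
# stand-in for the smallest offer a responder will accept.

PIE = 10

def play(offer, threshold):
    """Return (proposer's share, responder's share).

    If the responder accepts (offer >= threshold), the pie is split;
    if she rejects, the pie is taken away and both get nothing.
    """
    if offer >= threshold:
        return PIE - offer, offer
    return 0, 0

# A perfectly "rational" responder accepts any positive offer:
print(play(1, threshold=1))   # (9, 1): even a sliver beats nothing
# Real players, though, often punish lowball offers at their own expense:
print(play(1, threshold=4))   # (0, 0): both walk away empty-handed
```

The second call is the behaviour the studies keep finding: rejecting an unfair split costs the responder her slice, yet people do it anyway.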
When LeBron James and Chris Bosh came to the Miami Heat in 2010, the expectation was clear. They would win championships, plural—not one, not two, not three, not four, but some other, embarrassingly high number they would surely regret ever mentioning in their premature victory parade.
The assumption was that James—and, to a lesser extent, power forward Bosh—would provide value to the team not just through their own productivity, but through the effect they would have on those around them. A good player, after all, makes his teammates better—the rare sportswriter cliché often backed up by the numbers. In the NBA, where superstars are rare, the surest way to improve an organization tends not to be patiently gathering valuable, mid-tier employees; it’s luring a star to your city and letting him transform the organization from the inside.
The belief in the transformative power of stars goes well beyond the NBA; it’s reflected in the outrageous salaries for CEOs and the scramble for talent in Silicon Valley. “Someone who is exceptional in their role is not just a little better than someone who is pretty good,” Mark Zuckerberg told The New York Times in 2011. “They are 100 times better.”
In the early 2000s, Dan Pallotta was the head of a humming 300-employee business working out of a 47,000-square-foot office in Los Angeles. The company produced charity events—“AIDS Rides” and “Breast Cancer 3-Days” that Pallotta invented and marketed, raising $305...
Read through enough social science experiments and your view of humanity inevitably grows more jaundiced. We all know our species has its foibles and shortcomings, but modern researchers have found ways to tease apart each shameful emotion—homing in on the specific biological mechanisms that make us jerks, cataloguing and quantifying our base nature. A tumor is awful from a distance, as a generality. Zoom in closer, perform a few studies, and you confront the cells in all their uniquely horrific, malignant glory.
Take flattery. Academics define it as “lavish praise that is offered in an ingratiation setting”—a nice compliment when the person giving it may have something to gain. A closer look, though, reveals that a complex series of emotions, most of them unsightly, bubble to the surface whenever someone says a kind word about another human being.
For all of humanity’s collective awfulness—our vindictiveness and stupid hats, humblebrags and genocides—there have nonetheless been moments in which we’ve proven capable of genuine decency and compassion. We can be good. In fact, on multiple, documented occasions, one human has helped another without expecting anything in return.
This is altruism, a quirk of evolution that leads one human being to give a coconut to a sick neighbour, never mind the cost to himself. It’s one of the traits that has helped us thrive, allowing us to cooperate with people well outside our immediate families or tribes to build the kind of large-scale societies that go far beyond what other creatures have managed. It’s that sense of common humanity that makes us not only successful, but tolerable, and sometimes even sympathetic as a species.
Consider a situation: a number of people walk into a restaurant at the same time. The server attends to all the white customers. The last customer served happens to be the only person of colour. Is this racism?
What about something more general: how would you describe the portrayal of African Americans in the US entertainment media? Is there racism there?
It’s long been understood that how you respond to these kinds of questions is heavily influenced by the colour of your skin. Numerous studies have shown that people from the dominant racial group perceive far less racism in mainstream society than people from subordinate racial groups. The real question is: who is seeing things as they really are?
Early on the morning of February 20, 1976, two police officers approached a green Camaro parked at a rest stop near Pompano Beach, Florida. Inside, Jesse Tafero and his partner Sonia Jacobs were sleeping, along with Jacobs’ two young children. Tafero’s friend, Walter Rhodes, was asleep in the driver’s seat.
Exactly what happened next remains unclear. What’s certain is that one of the patrolmen asked the two men to step out of the vehicle, shots were fired, and both officers were killed. Tafero, Rhodes and the family tore down the interstate in the stolen police car. When they were stopped at a roadblock, the gun used in the shooting, which was registered to Jacobs, was in Tafero’s waistband.
Rhodes immediately struck a deal. In exchange for a lighter sentence, he testified that Jacobs and Tafero were solely responsible for the shooting. The couple, meanwhile, insisted that Rhodes, who was on probation, had panicked, shot the officers, and then given Tafero the gun while he drove away. Someone was lying, but with little physical evidence, it was difficult to know which story was true and which was false.
It is a sad paradox of modern existence that on a planet thick with humans—a place chock-full of them—so many are so desperately alone. A recent survey found that more than a third of Americans over 44 are lonely, and almost half of them have felt that way for more than six years. Here we are, desperate mariners floating through a sea of humanity—people everywhere but not a one to have a casual drink with on a Thursday evening while chatting about the latest episode of True Detective. What are we doing wrong?
There are plenty of potential reasons for this state of affairs, enough theories to fill sociological textbooks and fuel a thousand think-pieces. Is the big anonymous city isolating us? Whatever happened to bowling and community? Is it Facebook’s fault? One common explanation, the scapegoat in plenty of vaguely countercultural movies and high-school pot-smoking bullshit sessions, is materialism. Call it the Fight Club Thesis: our love of objects is making us sad.