Don’t read the comments! So goes the ubiquitous online exhortation warning readers away from the bile in the boxes below, now so common that it has its own Twitter account.
If only that view weren’t so mistaken. Lost in the broad strokes of that puritan refrain is that the space under a news story or blog post can be awful or it can be brilliant, a seething mass of hate and idiocy or a veritable kaleidoscope of crackling ideas and debate. For me and fortunate others, the comment section has so often been the latter of those binaries—a place that feels more like home than chaos. In conflating the good and the bad, that pernicious phrase, “Don’t read the comments,” erases this crucial distinction and more.
Three intriguing transportation technologies have floated past in the fog of headlines this month. First, South Korea unveiled an electric bus that is powered by a magnetic coil buried in the road. Second, nominal plans for a Jetsons-looking pod transit system for Tel Aviv apparently got closer to reality. And of course, on Monday, Elon Musk unveiled his notion of the Hyperloop.
I’ve listed these ideas both in ascending order of awesomeness and descending order of likelihood that you’ll ever actually ride one. That is, in 2050, I suspect a great many of us could be riding on electrically charged buses in areas where people are too precious to allow overhead wires (the trolleybus is a thing, look it up), but I suspect almost nobody will be riding elevated podcars on overhead rails or, alas, lining up to be shot at intercontinental speeds through an almost-airless, windowless coffin tube.
This summer, Guillermo del Toro’s Pacific Rim continued the illustrious Hollywood tradition of movie computer interfaces that look totally awesome and that no sane person would want to use. I don’t know about you, but if I were in the middle of helping building-sized robots fight monsters—and who knows, with climate change continuing apace, anything’s possible—I’d probably want to do something a little more precise than make what may or may not be circles in the air.
For all the silliness of Minority Report-style interfaces, though, their ubiquity in film makes sense: They are, after all, more visually arresting than, say, someone banging away at a keyboard. Yet, now that similar interfaces have started to infiltrate the real world—first with Microsoft’s Kinect, and most recently with the newly released Leap Motion—it’s also becoming clear there’s more to our affinity for these new modes of interaction than our appreciation for the whims of Hollywood’s VFX artists. Instead, the excitement over motion control seems to be about getting to “touch” the things behind the screen—as if what we really want is to break the barrier between the digital and the physical.
Sylvia broke up with me a few months after we’d both started university, over the phone, with a simple, curt phrase: “He kissed me and I kissed him back.” A couple weeks later, feeling like I should try to move on, I threw away the tiny figurine of Winnie the Pooh’s Tigger she’d given to me as a gift. I now can’t for the life of me remember why she had given me that or why it was important.
That was the last time I’d be involved with a woman without the Internet figuring in somehow, with its record of emails or social media posts or digital photographs—maybe something that would explain why a statue of a cartoon tiger was significant. Gaps like that in my memory have now turned me into something of an obsessive documenter. From pictures of meals I’ve cooked to email conversations from a decade ago, I hoard digital markers of memory. But when you can record and save everything, you’re also confronted with a difficult question: what do you need to remember, and what do you need to forget?
That can’t be right, can it? It says here that I’ve played the mobile game Real Racing 3 for 40 hours. That’s a full work-week I’ve spent racing virtual cars. Staring at the figure, I find it difficult not to recall The Simpsons’ Comic Book Guy, who, only in the moment before being hit by a nuclear missile, realizes there may have been more productive ways to spend his life.
Nevertheless, my guilt at that statistic and my experience playing the game seem to be two separate things. Driving a digital Porsche 911 GT3 endlessly around the same tracks has a strange hypnotic effect. The curves of the courses start to become imprinted on the mind so that, spinning through them, one feels a bit like a child being read a favourite bedtime story for the hundredth time: the familiarity is the point.
Wandering around London’s Tate Modern gallery a few years ago, I found myself getting bored—until I saw Bruce Nauman’s “Double No.” The installation was two screens of looping video in which a jester jumps up and down while saying “no” over and over again. It had a weird effect: first you smirk at the frustrated figure, then its looping starts to put you off, and finally you feel disturbed, as the image of a peevish, childish clown begins to remind you of every selfish, angry adult you’ve ever known.
It’s the looping that made it “art,” of course. But now that looping video has become so common in gifs and Vines, it seems worth thinking about what looping does to our experience of video, and whether or not Instagram’s decision to have its new video feature not loop might be an inadvertent stroke of genius.
The way many feel about books, I feel about video games. We tend to think that it’s games that are the menace to books, but they’re actually in the same boat: just as some argue video- and web-based forms of culture threaten to supplant the sustained attention of reading, quick and often superficial handheld games threaten the more traditional long-form gaming I was raised on. Basically, you damn kids get your Angry Birds off my pixelated, Mario-filled lawn.
So you might say that I welcome the less-than-stellar debut of the new, much-ballyhooed game console Ouya with the kind of relish a certain kind of bibliophile might greet the shutdown of the Internet. Which is to say, it’s a perspective that’s as wrong as it is stupid, but as examples of ugly schadenfreude go, it’s one both the gaming and book worlds should pay attention to.
Ian Brown thinks the glut of digital photographs is destroying the mindfulness with which we capture the world. This year, the acclaimed Globe and Mail writer was an adjudicator for the 2013 Banff Mountain Film and Book Festival photography competition. As he related in a feature for the Globe, for the first time, Brown and other jury members couldn’t pick a winner, or even a runner-up, as not one of the entries even “managed to tell the simplest of stories.”
Brown theorizes as to why this is happening. One proposal: like addicts, we turn to Instagram et al. and simply shoot rather than confront “the uncomfortable difficulty of actually seeing,” craving instead “the instant gratification and collective approval that the Internet deals out to us.” Another idea: “as we live less and less physically and more and more virtually, we take pictures as substitutes for the real.”
Though cyberwar and cybercrime may seem like a recent development, it's been a major concern for governments around the world since the early '70s. What started with annoying chain e-mails that touted get-rich-quick schemes and better sex has evolved into international breaches of security and impressive feats of cyber-stealing. To mark today's publication of Black Code: Inside the Battle for Cyberspace, and our interview with its author Ronald Deibert, we assembled this history of cyber-shenanigans.
When you’re an awkward person, social situations require strategy. One of mine: reading lots online so that I can contribute to conversations, or maybe even offer up an interesting anecdote. The trouble, though, is this: given the vast, overwhelming morass of things to read online, how do we know what’s good?
That question has plagued us since Internet media first became popular, and the progression of answers over time is like a series of photos of the ways in which our relationship to the web has changed. First came the search era, in which the Internet was an open treasure trove of information to be actively delved into by the brave and skilled. Then, it was all about aggregation, in which algorithms and sites like the Huffington Post did the sifting for us. Next came the social phase, where the filtering was left to the wisdom and whims of our friends. Now, however, it seems we are finally entering the next stage—and it looks a lot like the revamped Digg, and a newer platform called Medium.
A year ago, Paul Miller believed he was being corrupted by the Internet. But as it turns out, his enemy isn’t technology; it’s William Wordsworth.
In 2012, Miller, a writer for tech site The Verge, embarked on a stunt almost perfectly suited for the times: for one year, he would remain “off the internet.” This week, he returned with a long, intriguing post in which he reflected on his time offline, but you need only read the first line to gather the gist of what follows: “I was wrong,” says Miller.
Marketing has long since ceased shilling the virtues of a product. It is now about conjuring an ethos, then associating a product with that idea. If you want to get a sense of how to do that spectacularly wrong, you need look no further than Microsoft.
Witness its latest abomination, a Windows Phone ad in which a wedding party descends into chaos after iPhone and Android users exchange barbs. Amidst the ensuing food-fight madness, two attractive wait-staff—the reception was in the church, apparently?—comment on how the fight is futile because… Windows Phone! It’s a thing that exists! And may or may not have features you want! We don’t really know, because they don’t say. As a friend on Twitter pointed out, even their legal disclaimers are weird: “Do not attempt” appears beneath the nuptial brawl. Uh, thanks Microsoft, I’ll keep that in mind.
Who could have imagined, even a year ago, that we’d be talking about the end of Apple’s ascendancy? Yet, shockingly, here we are. With its stock price having fallen 40% from its all-time high, year-on-year profit dropping for the first time since 2003, and only vague promises of new products on the horizon, the once-unthinkable might now be true: Apple’s moment may be over.
Let’s not exaggerate, though. In a quarter in which the iPhone finally saw weakening demand, Apple still made a cool $9.5 billion in profit. What is new, however, is that its rapid expansion is probably over, and it’s quite plausible that the company will become the next Microsoft: huge, rich and mostly predictable.
Google Reader was a map; the mass of online information was the territory. It was a way of making sense of that which, by rights, should deny any attempts at organization. That’s the best way I can explain what, to the outsider, must have seemed a baffling explosion of outrage Wednesday night when Google announced it was shutting Reader down.
The service in question is an RSS reader. Almost every website (including this one) has something called an RSS feed, which is like an ongoing ticker of newly published pieces. An RSS reader lets you take those feeds—whether from the ten sites you read regularly, or the 400 you try to keep up with—and aggregate them all in one place.
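For the technically curious, here is a minimal sketch of what such a reader does under the hood, written in Python with the third-party feedparser library; the feed URLs are placeholders, not a real subscription list.

```python
# A rough sketch of an RSS reader's core loop: fetch a handful of feeds
# and merge their entries into one reverse-chronological stream.
# Requires feedparser (pip install feedparser); the URLs are placeholders.
import feedparser

FEED_URLS = [
    "https://example.com/feed.xml",
    "https://example.org/rss",
]

entries = []
for url in FEED_URLS:
    feed = feedparser.parse(url)
    entries.extend(feed.entries)

# Newest first; published_parsed is a time.struct_time (a tuple subclass),
# so entries missing a date fall to the bottom via the empty tuple.
entries.sort(key=lambda e: e.get("published_parsed") or (), reverse=True)

for entry in entries[:20]:
    print(entry.get("title", "(untitled)"), "->", entry.get("link", ""))
```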
So what’s the big deal?
Let’s say you’re a parent, and when you sit down to play video games with your daughter, she says she wants to be the Princess, not Super Mario. If you’re like most mums and dads, you merely lean over and explain, “Jesus! I’m not a freakin’ wizar—um, sorry baby, it’s like sexism and stuff. Use your imagination.” But if you’re Mike Mika, you stay up late and actually hack the game so your daughter can play the heroine and rescue the plumber, instead of the other way around.
It’s a heartwarming tale that has made the rounds thanks to the obvious appeal of a father’s impressive, innovative dedication. But it also comes at a vital moment in which, finally, gaming is starting to grapple with its longstanding gender problem.
“Wow, she’s so real!” That’s the sentiment it’s been impossible to escape since Jennifer Lawrence’s disarming performance at the Oscars a little while back. It’s now also being heard in relation to the alternately charming and excruciating Mila Kunis interview that has been passed around so much your grandma is probably emailing you a link to it as we speak.
So here we are at the new dawn of “post-celebrity.” Whereas once what we sought from the famous was their very preternatural unrealness, the age of participatory media has wrought something quite different: stars who are like us, stars who, like Lawrence, do a shot when celebrating, or who, like Kunis, gesture toward the performed, rote nature of the whole shtick.
Quitting Facebook, as writer and entrepreneur Rex Sorgatz once said, is the new “I don’t own a TV!” Like dismissing the idiot box, declaring your abstention from the social network is a kind of performance: here, the Facebook Quitter says ostentatiously, this is the kind of person I am. And in that sense, Facebook isn’t just the new TV, it’s the new suburbs, too—the place you flee when you are lucky enough to loudly and showily leave the mainstream behind.
That is perhaps uncharitable, though. As far as I can tell, there are roughly three reasons people leave Facebook, in descending order of legitimacy: that, like Douglas Rushkoff and many others, you have genuine concerns about the network’s policies or its culture of surveillance; that you are getting overwhelmed by the frippery of Facebook friends you hardly know; or that you simply think it’s gauche.
A few months ago, while emptying out some long-neglected moving boxes, I found an old business card of mine tucked away in a novel. Sure, I felt a twinge of nostalgia for the big-box computer store I worked at in the late nineties. But as an avowed ‘digiphile’, I was also struck by the fact that, with ebooks, you rarely stumble on these odd ephemera of your past—unless, I suppose, you’re in the habit of slipping notes into your iPad case.
Smartphones are many things: impressive, complicated, distracting and useful. But if Google co-founder Sergey Brin has anything to say about it, we need to tack on another adjective: emasculating.
Brin made the baffling assertion at a TED talk on Wednesday in Long Beach, California. He claimed it was so because with a smartphone, “you're standing around and just rubbing this featureless piece of glass.” The obvious implied contrast was with Google Glass, the augmented-reality glasses that Brin and Google have been pushing hard over the last couple of weeks, and which I wrote about here at Hazlitt last week.
Laughter, we all know, comes in many forms: belly laughs, chortles, snickers and guffaws. But if you’ve never heard what cynical, resigned laughter sounds like, just do this: Find a group of twenty-somethings somewhere in North America and ask them if they feel confident about their capacity to find stable, high-paying jobs in the next few years. Trust me: the bitter, knowing chuckles will last a solid minute.
As it turns out, though, the general feeling about employment that simmers in conversation across this country isn’t simply anecdotal. According to a recent study by the United Way and McMaster University, those who like the far-out idea of knowing where the next paycheque is coming from are increasingly out of luck. In the Greater Toronto Area, nearly half of all workers are in some form of “precarious employment” in which they lack security, benefits—or as I call it, ‘the kind of stability that helps people not go crazy’.