Though cyberwar and cybercrime may seem like recent developments, they have been major concerns for governments around the world since the early '70s. What started with annoying chain e-mails that touted get-rich-quick schemes and better sex has evolved into international breaches of security and impressive feats of cyber-stealing. To mark today's publication of Black Code: Inside the Battle for Cyberspace, and our interview with its author Ronald Deibert, we assembled this history of cyber-shenanigans.
When you’re an awkward person, social situations require strategy. One of mine: reading widely online so that I can contribute to conversations, or maybe even offer up an interesting anecdote. The trouble, though, is the vast, overwhelming morass of things to read online: how do we know what’s good?
That question has plagued us since Internet media first became popular, and the progression of answers over time is like a series of photos of the ways in which our relationship to the web has changed. First came the search era, in which the Internet was an open treasure trove of information to be actively delved into by the brave and skilled. Then, it was all about aggregation, in which algorithms and sites like the Huffington Post did the sifting for us. Next came the social phase, where the filtering was left to the wisdom and whims of our friends. Now, however, it seems we are finally entering the next stage—and it looks a lot like the revamped Digg, and a newer platform called Medium.
A year ago, Paul Miller believed he was being corrupted by the Internet. But as it turns out, his enemy isn’t technology; it’s William Wordsworth.
In 2012, Miller, a writer for tech site The Verge, embarked on a stunt almost perfectly suited for the times: for one year, he would remain “off the internet.” This week, he returned with a long, intriguing post in which he reflected on his time offline, but you need only read the first line to gather the gist of what follows: “I was wrong,” says Miller.
Marketing has long since ceased shilling the virtues of a product. It is now about conjuring an ethos, then associating a product with that idea. If you want to get a sense of how to do that spectacularly wrong, you need look no further than Microsoft.
Witness its latest abomination, a Windows Phone ad in which a wedding party descends into chaos after iPhone and Android users exchange barbs. Amidst the ensuing food-fight madness, two attractive wait-staff—the reception was in the church, apparently?—comment on how the fight is futile because… Windows Phone! It’s a thing that exists! And may or may not have features you want! We don’t really know, because they don’t say. As a friend on Twitter pointed out, even their legal disclaimers are weird: “Do not attempt” appears beneath the nuptial brawl. Uh, thanks Microsoft, I’ll keep that in mind.
Who could have imagined, even a year ago, that we’d be talking about the end of Apple’s ascendancy? Yet, shockingly, here we are. With its stock price having fallen 40% from its all-time high, year-on-year profit dropping for the first time since 2003, and only vague promises of new products on the horizon, the once-unthinkable might now be true: Apple’s moment may be over.
Let’s not exaggerate, though. In a quarter in which the iPhone finally saw weakening demand, Apple still made a cool $9.5 billion in profit. What is new, however, is that its rapid expansion is probably over, and it’s quite plausible that the company will become the next Microsoft: huge, rich and mostly predictable.
Google Reader was a map; the mass of online information was the territory. It was a way of making sense of that which, by rights, should deny any attempts at organization. That’s the best way I can explain what, to the outsider, must have seemed a baffling explosion of outrage Wednesday night when Google announced it was shutting Reader down.
The service in question is an RSS reader. Almost every website (including this one) has something called an RSS feed, which is like an ongoing ticker of newly published pieces. An RSS reader lets you take those feeds—whether from the ten sites you read regularly, or the 400 you try to keep up with—and aggregate them all in one place.
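For the curious, the mechanics are simple enough to sketch in a few lines. An RSS feed is just an XML document listing a site's latest items; a reader fetches many such documents and merges them into one list. The snippet below is a minimal, hypothetical illustration using Python's standard library with a hard-coded feed; a real reader would fetch each feed's URL over HTTP.

```python
import xml.etree.ElementTree as ET

# A hypothetical inline feed standing in for what a reader would
# download from each subscribed site's RSS URL.
FEEDS = {
    "Example Site": """<rss version="2.0"><channel>
        <title>Example Site</title>
        <item><title>First post</title></item>
        <item><title>Second post</title></item>
    </channel></rss>""",
}

def aggregate(feeds):
    """Collect (site, headline) pairs from every subscribed feed."""
    items = []
    for site, xml_text in feeds.items():
        root = ET.fromstring(xml_text)
        for item in root.iter("item"):  # each <item> is one published piece
            items.append((site, item.findtext("title")))
    return items

headlines = aggregate(FEEDS)  # one merged list, however many feeds you follow
```

The whole appeal of Reader was exactly this flattening: ten sites or 400, it all becomes one stream.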
So what’s the big deal?
Let’s say you’re a parent, and when you sit down to play video games with your daughter, she says she wants to be the Princess, not Super Mario. If you’re like most mums and dads, you merely lean over and explain, “Jesus! I’m not a freakin’ wizar—um, sorry baby, it’s like sexism and stuff. Use your imagination.” But if you’re Mike Mika, you stay up late and actually hack the game so your daughter can play the heroine and rescue the plumber, instead of the other way around.
It’s a heartwarming tale that has made its way around for the obvious appeal of a father’s impressive, innovative dedication. But it also comes at a vital moment in which, finally, gaming is starting to grapple with its longstanding gender problem.
“Wow, she’s so real!” That’s the sentiment it’s been impossible to escape since Jennifer Lawrence’s disarming performance at the Oscars a little while back. It’s now also being heard in relation to the alternately charming and excruciating Mila Kunis interview that has been passed around so much your grandma is probably emailing you a link to it as we speak.
So here we are at the new dawn of “post-celebrity.” Whereas once what we sought from the famous was their very preternatural unrealness, the age of participatory media has wrought something quite different: stars who are like us; stars who, like Lawrence, do a shot when celebrating, or who, like Kunis, gesture toward the performed, rote nature of the whole shtick.
Quitting Facebook, as writer and entrepreneur Rex Sorgatz once said, is the new “I don’t own a TV!” Like dismissing the idiot box, declaring your abstention from the social network is a kind of performance: here, the Facebook Quitter says ostentatiously, this is the kind of person I am. And in that sense, Facebook isn’t just the new TV, it’s the new suburbs, too—the place you leave when you are lucky enough to loudly and showily leave the mainstream behind.
That is perhaps uncharitable, though. As far as I can tell, there are roughly three reasons people leave Facebook, in descending order of legitimacy: that, like Douglas Rushkoff and many others, you have genuine concerns about the network’s policies or its culture of surveillance; that you are getting overwhelmed by the frippery of Facebook friends you hardly know; or that you simply think it’s gauche.
A few months ago, while emptying out some long-neglected moving boxes, I found an old business card of mine tucked away in a novel. Sure, I felt a twinge of nostalgia for the big-box computer store I worked at in the late nineties. But as an avowed ‘digiphile’, I was also struck by the fact that, with ebooks, you rarely find such odd ephemera of your past within them—unless, I suppose, you have the habit of tucking notes into your iPad case.
Smartphones are many things: impressive, complicated, distracting and useful. But if Google co-founder Sergey Brin has anything to say about it, we need to tack on another adjective: emasculating.
Brin made the baffling assertion at a TED talk on Wednesday in Long Beach, California. He claimed it was so because with a smartphone, “you're standing around and just rubbing this featureless piece of glass.” The obvious implied contrast was with Google Glass, the augmented-reality glasses that Brin and Google have been pushing hard over the last couple of weeks, and that I wrote about here at Hazlitt last week.
Laughter, we all know, comes in many forms: belly laughs, chortles, snickers and guffaws. But if you’ve never heard what cynical, resigned laughter sounds like, just do this: Find a group of twenty-somethings somewhere in North America and ask them if they feel confident about their capacity to find stable, high-paying jobs in the next few years. Trust me: the bitter, knowing chuckles will last a solid minute.
As it turns out, though, the general feeling about employment that simmers in conversation across this country isn’t simply anecdotal. According to a recent study by the United Way and McMaster University, those who like the far-out idea of knowing where the next paycheque is coming from are increasingly out of luck. In the Greater Toronto Area, nearly half of all workers are in some form of “precarious employment” in which they lack security, benefits—or as I call it, ‘the kind of stability that helps people not go crazy’.
Have you ever glanced at someone completely lost in their smartphone and despaired over the insidiousness of modern tech? If so, you may want to sit down for this. The next big thing is going to be wearable computing—and it’s exactly what it sounds like. It’s an idea that has popped back into the cultural ether recently because of two products, one real, the other only a rumour. The first is Google Glass, a distinctly Star Trek-like idea from the search company that places a tiny screen on the right side of some spectacle frames.
Tell me if this sounds familiar: an established long-form medium that emphasizes depth is now under threat from new technology that tends to privilege bite-size experiences. Proponents of the older format worry the shiny new thing will chip away at the cultural importance of their preferred medium, while tech bloggers breathlessly talk about the inevitability of what comes next.
I’m talking about books, right? Or magazines? Nope. I’m describing the contemporary world of video games.
It may sound strange to link these two fields. But not only do the game and book businesses have more in common than you might think, there might also be lessons for the literary world in the travails of gaming.
“Binge-watch.” That’s the phrase it’s been impossible to escape recently after Netflix released House of Cards, their first original series, at the beginning of the month. Unlike traditional TV, season one of the online-only political drama was released in its entirety at once, and chatter of people gorging themselves on multiple episodes ricocheted around the usual channels, often eclipsing discussion of the show itself.
Maybe this is unsurprising. For some time now, Netflix has been emblematic of how, thus far, the emphasis of digital tech has been on how we get media, rather than the media itself. There are notable exceptions—video games, certainly—but by and large, it’s the distribution and consumption of media that has changed. It thus feels worth asking: what does it mean that we are producing delivery mechanisms that allow for binging—and then encouraging the indulgence?
A drop in e-reader sales and a surge of interest in tablet devices forces a reconsideration of how we've been thinking about the book's future over the past few years.
Now that everyone is talking about Vine, the hot new app that lets you share six-second video clips, the main question I have is this: what exactly took so long? Like one’s first few hesitant hours with Twitter, using Vine elicits an initial “why on earth would anyone want this?” that quickly gives way to a feeling it was simply inevitable. Here we have the audio-visual equivalent of the tweet. It sounds like nothing, but make no mistake: Vine will be a big deal.
Just a few short years ago many believed the Internet represented the end of mass culture. Faced with the ability to pick and choose what they read and saw, people would use the web to simply sequester themselves in their own, personalized version of the world. Thus far at least, nothing could be further from the truth. If you’ve spent any time at all online in the past few days, you know this very well: awards season has been all but impossible to escape, regardless of where your browser happens to land.
Far from precipitating the end of the mainstream, social media has helped solidify certain popular events like the Oscars or the Olympics as our own versions of shared experience. Yet, the outpouring of feeling and thought during these moments forces one to wonder: Why do we do it? Why do we throw our thoughts about these happenings into online spaces often filled with casual acquaintances and strangers, only to have them lost amidst a rushing torrent of similar speech?
Five years ago, no one could've predicted the scale, shape and substance of today's Internet, from how social networks function to the invasion of surprisingly conventional forms of commercialization into the web's every corner. What of the web's once bold promise of social transformation? We hash out what's happened with Tim Carmody of The Verge.
Memory, for better and for worse, is like a highlight reel. That torrid first kiss in the silence of a bitterly cold winter night is likely burned forever into my brain. Last week’s cheese sandwich? Not so much.