Surveillance, Good and Evil
April 15, 2014
Social physics is an emerging (and ominous-sounding) discipline that wants to “connect the dots” of our data—but, ideally, as a force for good.
When people say they’re terrified of the future, it’s usually because of BigDog, or Google Glass, or maybe, one day, the horrific prospect of having to pay for porn again.
But there are certainly worse things—namely, the idea that everything from the tone of my voice to the way I move my hands might be measured, tracked, and distilled into a softly pulsing smartphone notification meant to remind me that maybe I’m not engaged enough, or not working optimally within a team.
The thought is both fascinating and terrifying, mostly in a how-the-hell-do-I-preemptively-opt-out-of-this-at-once sort of way. Because what is life, now, if not a trove of Big Data, all of our actions meticulously tracked? Cell phone companies know our routines, and credit card companies know what we like. You can’t lie to your friends about the books you’ve read because Amazon knows the truth.
But here’s the thing: most of that data exists in isolation. It’s siloed. The liquor store may know how often I’m buying wine, but not the number of times I’ve listened to Sea Change or how often I’ve called my ex after midnight this past week. If you could connect the dots, what sort of insights might you arrive at?
That’s where things have the potential to get truly scary. And it just so happens that Alex Pentland, director of MIT’s Human Dynamics Laboratory and a founding faculty member of the MIT Media Lab—a multidisciplinary research lab for emerging, unconventional technologies—is already connecting those dots.
Social physics, as Pentland calls it, uses digital signals and internet sources to track our relationships, our interactions, how we work, and how we live, tying seemingly disparate data together in new and interesting ways. In his new book, Social Physics: How Good Ideas Spread—The Lessons from a New Science, Pentland presents this study of connections as a predictive, computational algorithm for, well, us. In one example, he and his team use location data to track the movements of people through a city, while sensor data from smartphones is used to track behavioural changes in tandem. Put those two sources together, and you can reliably (astoundingly? terrifyingly?) track the spread of flu.
Pentland terms these types of experiments “living labs,” and in conducting them he presents an interesting tradeoff—a little slice of privacy in exchange for the betterment of our lives.
Of course, these sorts of experiments are already taking place, just without our consent. Intelligence agencies are scooping up call records, location data, and text message logs, drawing who knows what sort of insights. In an era when people are more sensitive than ever about being tracked and surveilled, to volunteer even more of our data is an interesting request.
But “volunteer” is the operative word. We’re so used to the thought of our data being tracked without our knowledge that we can hardly conceive of what we’d do with it ourselves. And choice, to Pentland, is key: we should be able to choose who to share our data with and how—or not to share it at all, or only for a limited time. The hope is that you might decide to share some of your data with people like him. After all, what is our data nowadays but currency for a transaction or exchange? We give apps access to our cameras and microphones because it’s part of the deal.
In describing this so-called “New Deal on Data,” Pentland writes that “maintaining protection of personal privacy and freedom is critical to the success of any society,” and acknowledges that mechanisms for collecting, storing, and sharing data would have to be overhauled. Once we liberate that data, Pentland argues, and know exactly how, when, and with whom it’s being shared, we’ll be able to make more informed—and hopefully, altruistic—decisions about its use. What if scientific studies could track the exact amount of sugar we ate? Would we be better at preventing child abductions if our every move were trackable? Would we be more inclined to use less energy if we saw that our neighbours were, too? All of this data in aggregate could reveal behavioural patterns yet unseen, and enable us to change the world for the better.
If it sounds too good to be true, you’re right. “We don’t yet have any consensus around the trade-offs between personal and social values,” Pentland admits, and so social physics draws our suspicion. We’re happy to give Google permission to scan our inbox and serve us targeted ads in return for free email. But when it’s our health records or location data at stake, some vague, distant notion of societal good may seem too abstract a reward.
Social physics is a gamble, because it means thinking about privacy and surveillance and the collection of data in a different way than we currently do. It also means thinking differently about our data itself: as an investment rather than an exchange, one that may yield results in the form of societal change, but just as likely may not. It’s the age-old question of whether we ought to value personal liberties over societal good, but with a version update and a couple of patches for the Internet age.
But even if you don’t agree with the means of social physics, Pentland’s New Deal on Data would allow you to opt out at any time. And that, at least, is progress—the freedom to not be tracked, and to work un-optimally on a team.