Snooping after Snowden: Surveillance, Big Data and Anxiety


MIT has hosted a series of thought-provoking events about surveillance

We accept unprecedented surveillance, Dr. David Lyon told an MIT audience, because routine surveillance has become so much a part of day-to-day life. Delivering the 2014 Arthur Miller Lecture on Science and Ethics, Lyon, Professor and Chair of Surveillance Studies at Queen's University, Canada, reviewed some of Edward Snowden's revelations, placed them in historical context, and outlined the corrosive harms that surveillance regimes inflict on society.

Lyon defines surveillance as the "focussed, systematic and routine attention to personal details for the purpose of influence, management, protection or direction." Modern surveillance technology is nothing new, said Lyon; its roots lie in the mid-1800s, with the invention of the telegraph, the telephone, and the camera. Even detailed tracking of customers began with department stores in the early 20th century. Lyon recounted awkward conversations with colleagues in Queen's University's business school who discuss the very same techniques: he calls them "surveillance"; his colleagues call them "marketing."

The rise of digital technology does change things, Lyon said. Digital surveillance is less expensive, and hence easier to deploy as mass, dragnet-style surveillance. Lyon quoted the CIA's Chief Technology Officer, Gus Hunt, as saying "[s]ince you can't connect dots you don't have, it drives us into a mode of, we fundamentally try to collect everything and hang on to it forever," something that would have been prohibitively expensive before digital technology.

The other change wrought by the digitization of surveillance is what Lyon termed a "shift into the future tense." More and more, surveillance is used not to detect current or past behavior but to predict future behavior. This is what "big data" is said to promise, whether it is prediction of consumer behavior or, as Lyon noted, "pre-crime," a reference to "Minority Report," a science fiction story and, later, a movie in which individuals are punished for crimes they are about to commit. As computational criminologist Richard Berk told an MIT audience last fall about advances in his field, "We're not at Minority Report yet, but we're getting there."

The effects of widespread surveillance are pernicious and largely invisible, creating a process of "social sorting": the grouping of individuals into categories based on the data acquired through surveillance. Viewed through the lens of surveillance systems, you are "someone who buys arugula," or "someone who talked to someone who talked to a terrorist," or "someone who attends a mosque," or "someone who called an abortion clinic." When carried out by the state, Lyon pointed out, this completely vitiates the presumption of innocence, placing you in a class of people subject to even further scrutiny. It also makes a mockery, according to Lyon, of the defense of surveillance that asserts that if you have nothing to hide, you have nothing to fear: sorting places you in a class assumed to have more to hide, regardless of whether you actually have anything to hide. For some classes of people, repeated episodes of sorting lead to ongoing, cumulative disadvantage. Arabs and Muslims, said Lyon, are the "usual suspects" for terrorism, regardless of the state of the evidence.

Lyon's proposed remedy for surveillance is legal reform; he believes security technologies are too difficult to use. Speaking at MIT, a place where future technologies are shaped, Lyon should have borrowed a page from Bruce Schneier, who has been reminding audiences of technologists that dragnet surveillance is governed by economics and that widespread use of encryption would render it economically infeasible. It is incumbent on technologists to make security trivial, Schneier told a San Francisco audience, saying that "one-click encryption" is one click too many.

Lyon's talk, presented by MIT's Program in Science, Technology and Society, is one of a series of events at MIT addressing big data, privacy, and surveillance. Lyon's presentation stood in stark contrast to a Big Data Privacy Workshop co-sponsored by MIT and the White House the previous week. That workshop accepted the collection of data not only as inevitable but as a societal good, one that will save lives and promote healthy behavior. Privacy, in that context, was a matter of protecting data already collected by corporations and the state. This, Lyon noted, was a case of "digital dreams", the belief that more technology will make life sublime.

Kate Crawford, Principal Researcher at Microsoft Research's Social Media Collective, took another MIT audience through "Squeaky Dolphin," a large PowerPoint presentation created by Government Communications Headquarters (GCHQ), the agency responsible for signals intelligence for the British government, and leaked by Edward Snowden. Crawford, who is co-organizing the next White House summit on Big Data and Privacy, focused on what she termed the anxiety of those doing the surveillance. The overall lesson of "Squeaky Dolphin" is that no matter how much data is collected, it is never enough to truly understand or predict; surveillance organizations always want more. This, she said, displays the anxiety of the powerful about managing the unpredictable, and a positive terror of the unforeseen. As intelligence agencies collect more and more yet still fail to achieve their objectives, the problems become epistemological, grounded in philosophical questions about what counts as knowledge. And, as Crawford noted, referencing an exploration of objectivity, "all epistemology begins as fear."

