June 9, 2016

Advocating for Online Privacy

Category: Digital Citizenship

As a researcher who actively engages in tech policy, Seda Gürses considers how a variety of actors may disrupt online wellbeing. She also brings an international perspective to her collaborative work. As part of the Trust Challenge team launching the Center for Solutions to Online Violence (CSOV), Gürses contributes her expertise as a computer scientist and privacy advocate. Now based at Princeton, she previously held positions at New York University and the University of Leuven.

In an interview with DML Central, Gürses mused about how her earliest digital literacy experiences were shaped by her childhood. “In the ’80s, my dad was stationed in Bulgaria for three years. I loved being in Bulgaria, but the relationship between Turkey and Bulgaria was tense and that had a significant impact on our family. But, as a child, the political tensions were the backdrop. During our time in Sofia, I met people from other countries, and they shared with me things I never even knew existed.”

She described acquiring an Atari 800 from an American family that lived in her building as a “life-changing experience.” Given her location, there were challenges to digital participation. “I couldn’t get floppy disks; the computer worked on a different voltage and the transformer that came with the computer would every so often blow out the fuses. The previous owners gave me a book for BASIC programming, but I had to rush to type in the program before the fuse went out.” The machine was her first step in a long engagement with computing. “I eventually got a Ph.D. in privacy and security in computing.”

Gürses explained that, in hindsight, the time in Bulgaria probably also informed her interest in computer privacy. “I guess one of the first interactions I was aware of was that my dad would communicate with Turkey through this big encryption machine.” Thus, she was much more conscious of “government secrets and spying and all that stuff” than most children. As she became more aware of issues about human rights and social justice, her interests in privacy matured: “I don’t think I was interested only in privacy; I was interested in how we can build systems that are fair and just.”

Privacy, Participation and Politics

As her projects developed, she became invested in approaches governed by “participatory design,” which she described as “how to work with communities to serve their needs” rather than developing “opportunistic” systems that work against them. She began by collaborating with free software developers and presented their work on creating an online university at the Wizards of OS conference in Berlin. She then expanded to other kinds of digitally engaged collectives. Gender and feminist critiques of computing were also an important theme. As she worked on projects in Berlin and Bremen, including an EU project (DATAWOMSCI) that studied the integration of online directories promoting women scientists, she observed that there were “very different understandings in different countries” of privacy and the politics of visibility. “In Poland or Czechoslovakia, the directories would include the complete CV and contact details.”

In contrast, in Germany, because of privacy and reputational concerns, personal data that may identify the scientists were obscured. According to Gürses, “a person searching the German directory would search for topics of interest and then could ask for the contact information of researchers working on that topic. There was no way to search researchers by name or profile. The directories in each country were hardly comparable and integration would have to come at a cost. But, who bears that cost, who decides?”

The different manifestations of the directories illustrated that the desire to promote women by making their work visible had to take seriously the potential consequences of such visibility: profiling, reputational concerns, and harassment in different contexts. The ambition to connect all these directories had to contend with these frictions, a matter that could not be resolved by purely technical means. (See the Global Database of Quotas for Women for more ways to think about this issue.)

Gürses explained her commitment to finding new ways to reach audiences about digital privacy. After all, the CSOV advocates novel approaches to teaching kids about online security, avoiding the pitfalls of preaching and lecturing that don’t seem relevant or don’t get through. Recently, CSOV participant Mikki Kendall released Paths, a young adult comic about harassment, as a way to invite deeper discussions.

Art as Advocate

For Gürses, Kendall’s work is exemplary of how art can be a powerful way to raise awareness about vulnerabilities and digital precarity. As part of the public programming that accompanied Laura Poitras’s recent Whitney exhibition about government surveillance, she had the pleasure of working with Harlo Holmes, a security researcher and trainer whom she praised for “a very good hands-on approach” to informing training participants “about technological infrastructure” and encouraging them “to think who is involved.”

For example, when people are playing a game on a mobile phone, “they might not think about the gaming companies or their phone provider as parties that are potentially profiling them. Harlo Holmes had this amazing talent for allowing the young people in the room to bring their own experiences and map the different actors that matter for their privacy. She empowered them to work for change and to identify better digital tools.”

In preparing their session, Harlo and Seda were also inspired by conversations with Bianca Laureano, one of the digital alchemists in the CSOV who specializes in training youth about sexual health and safety. “Bianca had great ideas on how to bring together privacy and critical media literacy. For example, she encouraged us to approach the topic of online vulnerability through a ‘rapid development’ game. Within 10 minutes, a team of three students would have to come up with an image and a good meme statement plastered on it.” Then, they had to come up with a plan to distribute the meme. “After planning to make their messages go viral, we would ask them to reflect critically on the ways in which they put the meme together: Did they ask for permission to use the image? Who did they make visible and invisible? What data traces did their 10 minutes of digital production leave behind on different platforms? And, what is the potential for improper access or possible misuse of the meme after distribution? Bianca’s game reflected everyday practices. You typically only have 10 minutes. This is also how a journalist works. People retweet without thinking too much.” Gürses argued that such “games and simulations” offer ways to model “real situations” so participants can “reflect from it and learn lessons for creativity and respect.”

Gürses asserted that in the U.S., “we tend to talk about users as if they are a universal category, unless we talk about digital rights from a human rights perspective. Both framings are not very helpful from an international perspective. We often tend to not question how technology plays a very important ideological role in our lives.”

As an example, she cited “the internet freedom agenda kicked off by then Secretary of State Clinton” that “basically makes this black-and-white distinction between democratic states and authoritarian states” and perpetuates “the mythology of the civilized nations and the uncivilized rest.” In this system, “internet freedom basically allows American companies and government agencies to claim that they are bringing about democratic change in authoritarian states.” Gürses lamented that “one of my battles has been to show that Turkey’s issues about surveillance or censorship are highly connected to market and political projects of this ‘civilized west.’ Sweden, for example, hosts internet freedom conferences yet exports deep packet inspection software that is leveraged for surveillance and censorship. UK legislators continue to lead in making terrible surveillance and censorship laws. Journalists have been exposing how the NSA is pushing not only legal but also technical limits to expand their surveillance activities. And, most U.S. companies normalize surveillance while disrespecting user privacy.”

She argues that such hypocrisy can have terrible global consequences. “For example, Turkey points to the U.S. government and companies to legitimize their need for home-brewed surveillance programs, to the UK to justify censorship, and turns to Sweden to buy the tools for implementation. We live in an interconnected world in which the distinction between democratic and authoritarian states as a way to organize technology politics is at best misleading. This is especially the case when it comes to the surveillance of (racial) minorities and vulnerable communities.”

Party with a Purpose

In her work with various privacy advocates, Gürses puts together events with unlikely partners and works with local communities. After the Snowden revelations, she and fellow advocates started a “Women and Surveillance” group in order to make space for discussions of surveillance that centered on the experience of women, trans persons, and minorities. Members of the group have actively supported CryptoParties with different target groups. “Matt Mitchell, a security researcher, organizes CryptoParties in Harlem. He recognizes that the black community has different needs and makes space for these needs to be expressed and attended to. We learned from his model. The Women and Surveillance group did a CryptoParty internally with the intention to rethink threat models of women and trans people. Depending on race, gender, class, sexuality, religion or disabilities, a person may have different concerns and risks and may have limited access to tools. We worked together to strategize about these matters and to develop alternative approaches that start with people’s needs rather than threats.”

Gürses lauded Melissa Morrone of the Brooklyn Public Library in New York for collaborating on initiatives like the Data Privacy Project, focusing on digital safety practices. “This is unique work that caters to the needs of local people, that makes great use of public institutions to do the trainings, and, in the meantime, gets those public institutions to rethink the digital environments and cultures they are creating. This kind of work illustrates one way to understand the direct concerns of different groups. If we just look for the average user, a happy average that covers most of the population, we reproduce the exclusion and mistreatment of racial minorities or underserved communities. We need to reframe our approaches to digital rights so that we don’t perpetuate current injustices and demand change from the responsible parties.

“One metaphor may help demonstrate what I mean. Often in advocating privacy, it is argued that taking care of your personal information is like eating good food. It’s healthy, like exercising, and part of taking care of yourself. But, to tell people to eat good food is a joke if the conditions of their participation are already set up in a certain way.” Gürses resists responsibilization, a stance that guided her work on SPION, a Belgian project that sought to reverse it in the context of privacy and online social networks. She worries that organizations that promote “resilient citizenship” perpetuate it by putting the burden of protection from market and government externalities on the shoulders of the most vulnerable. “If your local supermarket is stuffed with cheap and unhealthy food, and you don’t have further financial means, eating healthy is a non-option. Such framings are disrespectful to people and they redirect the emphasis from those who are providing us with the ‘food.’ The same holds for asking people to protect their privacy when their environment is saturated with intrusive technologies. As trainers or mentors, we can teach people some privacy tricks, but we are not solving the bigger problems. What we need is to change the way we consume technology AND how we produce it. Much of my academic work focuses on how we can change practices in computer science and software engineering. However, that work needs to be informed by valuable efforts to express and make visible when and how companies, governments and researchers fail to fulfill their responsibility to respect the privacy and needs of underserved and vulnerable communities. Only then can we really learn from past mistakes and create better futures.”

Banner image: Caroline Sinders at Women and Surveillance NYC outing at Laura Poitras exhibition #tengaze #2ndgaze. Photo courtesy of Bernadette Bakers.