“The future is already here – it’s just not very evenly distributed”: Panel Discussion on the ‘Darker Side of Digital: Human Rights Implications of Technology in Canada & Abroad’

Chelsey Legge (2L JD/MPP)

 

Full house: ‘Darker Side of Digital’ discussed the intersection of technology, privacy and human rights. Photography by Alison Thornton / Human Rights Watch.

We are living in an era of unprecedented technological change that is rapidly and fundamentally altering nearly every aspect of human life. From biometric identification software and advanced robotics to stem cell treatments and lab-grown meat, emerging technologies are characterized by their novelty and impact. While these technologies have enormous potential to improve our lives, making us happier, healthier, smarter, safer, and more efficient, there is a darker side to technological progress. Technology that enables the collection of massive amounts of digital information can compromise privacy interests. New and developing surveillance technologies threaten individual liberties such as freedom of expression and association. People from diverse fields and all walks of life are asking: how do we protect human rights in the digital age?

On Tuesday, February 6, a panel of experts convened at the Faculty of Law to discuss the human rights implications of new and rapidly evolving technologies. “Darker Side of Digital: Human Rights Implications of Technology in Canada & Abroad,” co-hosted by the International Human Rights Program (IHRP) and Human Rights Watch Canada, drew a sold-out crowd. It took place in the Rosalie Silberman Abella Moot Court Room at the University of Toronto’s Faculty of Law, and was also livestreamed on Facebook, an irony not lost on the organizers and panellists (video available here).

Samer Muscati, IHRP Director, opened the event by welcoming the full house of attendees and introducing the panellists: Stephen Northfield (moderator), digital director at Human Rights Watch; Felix Horne, Ethiopia and Eritrea researcher at Human Rights Watch; Lex Gill, research fellow at the Citizen Lab; and Professor Lisa Austin, chair in law and technology at the Faculty of Law. 

 

Before launching into the discussion, the panellists each talked about how their work has intersected with technology and human rights.

(From left) Stephen Northfield, Felix Horne, Lex Gill, and Professor Lisa Austin. Photography by Alison Thornton / Human Rights Watch.

Northfield began by discussing some of the positive aspects of technology, noting that “technology can be a real help in investigations and effecting change.” He explained how technology figures into the Human Rights Watch research methodology. For example, Human Rights Watch recently used satellite imagery to document the ethnic cleansing of Rohingya villages in Myanmar.

Horne shared some of his experiences conducting research in Ethiopia, a country engaged in pervasive censorship and surveillance. He mentioned that Human Rights Watch released a report on telecom and internet surveillance in Ethiopia in 2014. The Ethiopian government strictly controls internet and mobile technologies so it can monitor their use and limit the type of information being accessed and communicated. Unlike most other African governments, the Ethiopian government maintains a complete monopoly over the country’s rapidly growing telecommunications sector through the state-owned operator, Ethio Telecom. This monopoly allows the government to limit access to information and curtail freedoms of expression and association without any oversight, since there are no independent legislative or judicial mechanisms to ensure that surveillance capabilities are not misused.

Gill gave the audience a brief introduction to the Citizen Lab, an interdisciplinary laboratory based at the Munk School of Global Affairs, focusing on research, development, and high-level strategic policy and legal engagement at the intersection of information and communication technologies, human rights, and global security. She talked about her extensive work around encryption and anonymity laws, and told the audience about the Citizen Lab’s recent collaboration with the IHRP to produce a submission on technology-facilitated violence against women to the UN Special Rapporteur on violence against women. She also made a point of “challenging the idea of a dark side and light side [to technology],” suggesting instead that we think about darkness “not as malevolent or evil, but as unknown.” 

Austin spoke about her work on privacy issues and how the way we think about public space and privacy has changed as we have shifted from an analog world to a digital world. She encouraged the audience to consider, for example, what it means to be in public in the age of video surveillance, sensors, and “smart cities.” She proposed that we think about “how [we can] enlist new modes of technology to help us navigate this new [digital] space.”

All eyes on the panellists as Northfield opens the discussion. Photography by Alison Thornton / Human Rights Watch.

Northfield then posed a number of discussion questions, eliciting answers that were both informative and thought-provoking. He began by asking the panellists how worried people should be about infringements on their privacy. Gill suggested that some paranoia is reasonable, but also pointed out that certain communities (e.g., Indigenous communities, activist groups, political dissidents) have more cause to worry than others, since those communities are subject to disproportionate levels of surveillance. Quoting American-Canadian writer William Ford Gibson, Gill said, “The future is already here – it’s just not very evenly distributed.” Austin added that privacy is highly contextual: “that you share something in one context does not mean you have no privacy interest in other contexts.” She discussed the very real example of Presto card users’ private travel records. Presto users may be comfortable with their records being used to improve the public transit system, but considerably less comfortable with their records being shared with the police without a warrant.

The conversation shifted to the increasing powers of governments to surveil their citizens, and the implications of this surveillance. Using Ethiopia as an example, Horne talked about how pervasive state surveillance can impede the work of journalists, activists, and human rights organizations. Moreover, the ubiquity of surveillance and spying spreads paranoia and “tears at the social fabric of a community, [which] really affects every facet of life.” These effects can spill across borders – for example, the Ethiopian government has begun using spyware to surveil the diaspora. This has a chilling effect on Ethiopians around the world. Horne noted that “[the diaspora] are self-censoring because of this perception that they’re being surveilled.” 

Naturally, the discussion moved toward the trade-off between national security interests and individual privacy interests. Horne pointed out that “governments conflate peaceful expression of dissent with terrorism all the time.” Gill questioned whether there is always a trade-off between security and human rights. For example, governments and police forces have long complained about encryption technology impeding investigations, but fail to consider how much crime is prevented by encryption (e.g., safe online banking). “This is a very narrow view of what it means to be secure,” said Gill. Austin noted that the security-privacy balance was constructed in the analog world, and that we need to consider how it has changed as we have moved to the digital one. For example, what does it mean to be free from unreasonable search and seizure in the digital world? What is a reasonable expectation of privacy, and how has our understanding of what is reasonable changed? “What is getting lost in translation?” asked Austin. “Is the surveillance power growing because we’re not paying attention to what’s happening as we shift?”

Northfield then asked the panellists how well Canada protects privacy interests. Gill gave the audience a brief primer on Bill C-59, the current government’s answer to some of the problems with Bill C-51 (the anti-terrorism legislation passed in 2015 by the Harper government). She said this new legislation represents “two steps forward, six steps back.” On the one hand, Bill C-59 introduces new oversight and accountability functions. On the other hand, it normalizes mass surveillance through bulk collection, and gives extraordinarily broad powers to the Communications Security Establishment (CSE), the Government of Canada’s national cryptologic agency. Safeguards and restrictions on the use of data collected by CSE will be set out in regulations later, but Gill said that this “trust us” framework is not good enough, adding that “democracies are fragile, institutions are fragile – we need to have a longer-term view of what kind of infrastructure we want to be building.”

Corporations, like governments, are key players in the field of technology and human rights. Northfield asked the panellists to talk about some examples of ways in which technology companies are impacting human rights, and what governments ought to do in response. Austin talked about balancing privacy rights and the legitimate needs of businesses. She also discussed the central role played by the Personal Information Protection and Electronic Documents Act (PIPEDA). PIPEDA sets out the ground rules for how private-sector organizations must handle personal information in the course of commercial activity. “[This legislation] was about allowing people to feel okay about shopping online. [It is] consumer-friendly e-commerce legislation, and now we’re trying to make it do a bunch of things it was never meant to do,” said Austin. “PIPEDA is really inadequate to deal with this new era of platforms [like Facebook] that give you free services on the condition that you give up all your data.” 

Gill added that there are companies “actively engaged in the business of ‘Black Mirror’ – spyware companies, companies that sell hacking tools, internet filtering and censorship companies.” She said that businesses have a responsibility to account for and respect international human rights, and that our own governments should be mindful about doing business with these companies. Horne agreed, and noted that commercial spyware companies are “giving [these] technologies to an oppressive government with no monitoring [and] no accountability.” He said that with such rapid change, laws and policies cannot keep up, and so there is an absence of oversight.

One of Northfield’s final questions was on emerging technologies, such as artificial intelligence, and their potential impacts on human rights. Austin pointed out that there are different kinds of machine learning algorithms, and she would like to see human rights bodies focus on “algorithmic responsibility.” She said, “When you’re automating decision-making, or doing certain kinds of analysis, [the question is] how do we make sure that those processes are transparent? How do we make sure the algorithms are non-discriminatory?” Austin discussed examples such as predictive policing and automated sentencing decisions, and talked about the risk of discrimination being “baked in” to these models. “We need a broader conversation about what it means to be responsible and fair in the context of algorithms.”

Audience questions after the panel discussion ranged from inquiries about how the panellists manage their own online presences, to the ability of governments and regulators to keep pace with technological change, to the implications of biometric security software. One theme that wove through many of the questions was the role of ordinary individuals and communities in challenging excessive or rights-infringing state surveillance and censorship. All of the panellists expressed optimism on this front. Horne noted that “many tools exist that can be used to protect oneself and minimize risk, such as Tor, VPN, and encryption applications.” Gill pointed out that we all have different concerns about privacy, but “tools like the Citizen Lab’s Security Planner can help people to develop safer habits over time and at their own pace.” She cautioned, however, that the answer to state surveillance is not individual self-defence, and suggested that we focus on building censorship- and surveillance-resisting communities organized around an ethic of care. Austin encouraged civic action, and stressed the importance of “showing up, and insisting on best practices above the baseline.”  

The questions continued to pour in, and even after the panel concluded, many people in the audience approached the panellists to thank them for their insights and to continue the discussion. The degree of engagement suggests that the audience members had already internalized the advice of the panellists: learn, discuss, and don’t be complacent. We all have a role to play in creating a rights-respecting digital future.

Check out the discussion on Twitter, at #darksideTO.