Face Facts: A Forum on Facial Recognition Technology, Project No. P115406 #00075 

Submission Number:
00075 
Commenter:
Sam Gregory
Organization:
WITNESS
State:
New York
Initiative Name:
Face Facts: A Forum on Facial Recognition Technology, Project No. P115406

WITNESS is a non-profit with twenty years of experience supporting people who use video for human rights. Most recently we have been analyzing the risks and opportunities created by the spread of ubiquitous video, and in 2011 we published the 'Cameras Everywhere: Current Challenges and Opportunities at the Intersection of Human Rights, Video and Technology' report (http://www.witness.org/cameras-everywhere/report-2011). One area of risk we identify relates to privacy and security in an increasingly visually-mediated world. As we note in the report:

“With cameras now so widespread, and image-sharing so routine, it is alarming how little public discussion there is about visual privacy and anonymity. Everyone is discussing and designing for privacy of personal data, but almost no-one is considering the right to control one’s personal image or the right to be anonymous in a video-mediated world. Imagine a landscape where companies are able to commercially harvest and trade images of a person’s face as easily as they share email addresses and phone numbers. While it is technically illegal in some jurisdictions (such as the EU) to hold databases of such images and data, it is highly likely that without proactive policymaking, legislative or regulatory loopholes will be exploited where they exist. So far, policy discussions around visual privacy have largely centered on public concerns about surveillance cameras and individual liberties. But with automatic face-detection and recognition software being incorporated into consumer cameras, applications (apps) and social media platforms, the risk of automated or inadvertent identification of activists and others–including victims, witnesses and survivors of human rights abuses–is growing. No video-sharing site or hardware manufacturer currently offers users the option to blur faces or protect identity. As video becomes more prevalent as a form of communication and free expression, the human rights community’s longstanding focus on the importance of anonymity as an enabler of free expression needs to develop a visual dimension–the right to visual anonymity.”

Threats to free expression based on identification from publicly available social media have been seen recently in Syria, Iran and Burma. As facial identification technology evolves (see Alessandro Acquisti's presentation at the workshop), automated and social media-based identification of people in videos taken at protests is a growing concern for human rights defenders (HRDs). This risk is not restricted to protests in repressive societies. As Chris Conley noted at the FTC hearing, these risks arise in any situation where a single photo or video can be connected to a person's identity: for example, an LGBT person who is not publicly ‘out’, or a person who chooses in a single circumstance to blow the whistle or express an unpopular opinion.

As a corollary to technology companies' investment in facial recognition, we need much simpler ways to anonymize people's faces and voices. This would protect not just human rights activists in Syria but also everyday consumers, for instance a victim of trafficking in Illinois. To model one response, WITNESS and the Guardian Project have developed a technology that addresses peer-to-peer facial recognition risks for at-risk individuals who wish to exercise their right to free speech in a visually-mediated world without having their image correlated with the rest of their social media presence.
ObscuraCam (http://blog.witness.org/2011/06/obscuracam/) is an Android application that allows easy redaction of faces in images shot on mobile devices, giving individuals a way to protect the visual anonymity of vulnerable people in the footage they capture. For more on the rationale for this approach, see ‘Human Rights, Privacy and Visual Anonymity in the Facebook Age’ (http://blog.witness.org/2011/02/human-rights-video-privacy-and-visual-an...).
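To illustrate the general face-redaction technique that tools in this space rely on (detect faces, then obscure the detected regions so they cannot be re-identified), here is a minimal sketch in Python using OpenCV's bundled Haar cascade detector. This is an illustrative assumption of one possible approach, not ObscuraCam's actual implementation, which is an Android application; the file names are hypothetical.

```python
# Minimal face-redaction sketch: detect faces with OpenCV's pre-trained
# frontal-face Haar cascade, then pixelate each detected region.
# Illustrative only; not ObscuraCam's code.
import cv2

def redact_faces(input_path: str, output_path: str, pixel_size: int = 16) -> int:
    """Detect faces in an image, pixelate them, and return the count redacted."""
    image = cv2.imread(input_path)
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

    # Load the frontal-face Haar cascade shipped with OpenCV.
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
    )
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

    for (x, y, w, h) in faces:
        roi = image[y:y + h, x:x + w]
        # Pixelate: shrink the face region, then scale it back up without smoothing.
        small = cv2.resize(roi, (pixel_size, pixel_size), interpolation=cv2.INTER_LINEAR)
        image[y:y + h, x:x + w] = cv2.resize(small, (w, h), interpolation=cv2.INTER_NEAREST)

    cv2.imwrite(output_path, image)
    return len(faces)

if __name__ == "__main__":
    # Hypothetical file names for illustration.
    count = redact_faces("protest_photo.jpg", "protest_photo_redacted.jpg")
    print(f"Redacted {count} face(s)")
```

Pixelation (rather than a light blur) is used in this sketch because heavily downsampled regions retain far less information that could be recovered or matched by recognition systems; a production tool would also need to handle video frames and non-frontal faces.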