Transparency as a user experience problem

One of the top-level recommendations of the FTC privacy report was greater transparency about the data practices of companies and technologies. The report pointed to mobile apps as an area especially in need of better transparency. Indeed, a previous FTC staff report on mobile apps for kids found that hardly any of the apps studied offered full privacy disclosures.

Lawyers tend to think of disclosure as a writing problem: what should our privacy policy say? That's an entirely valid viewpoint--written policies could be a lot shorter and clearer than they are, and they are important and worth getting right.

But as a computer scientist, I tend to look beyond written privacy policies, seeking disclosure opportunities elsewhere, in the design of the technology itself. Beyond treating disclosure as something that sits alongside the technology--a link you can click--I look for ways to build technologies whose privacy attributes are clear to the user. The goal is to make transparency an integral part of the user experience.

One of my favorite examples is the little red light next to my laptop's built-in camera. The light comes on whenever the camera is active and capturing video. Users understand this naturally, without having to read a manual, and they can tell at any moment whether they are being observed. Could companies have gotten the same level of real-world transparency by putting statements into the privacy policies of programs on my laptop? I doubt it.

Of course, the red light can't be the only control on use of the laptop camera. The light tells me that the camera is active, but it doesn't tell me which applications have access to the video stream. The developers had more work to do: they had to make sure programs couldn't get camera access when they shouldn't have it, and couldn't eavesdrop on the video stream in ways that would surprise the user. Still, the red light lends valuable transparency without interfering with the user's flow of activity.

One of the reasons the red light works well is that it fits with user expectations about the camera. The design of the laptop makes the camera's presence and location obvious--which makes sense, because the user wants to look at the camera. The status of the camera is most sensitive when the user is in front of the camera and can see the light. And a red "on air" light is a standard feature on video cameras, so the user will know how to interpret the light when it comes on.

These same factors make the red light a poor fit for other areas where we want transparency. We can't secure access to my mobile phone's address book by putting a red light next to the address book (whatever that means). If we want the same kind of natural transparency elsewhere, we can't just follow a cookie-cutter approach--we have to think about the user's expectations in each instance. This is what great user experience (UX) designers do already. We just need to apply it to privacy.

There's one more thing that great UX designers know: you can't just build a technology and then bolt on a great UX at the end. You have to keep UX in mind through the entire design process, looking at your design through the user's eyes. What will the user expect this feature to do? How can we make this option clear to the user? Why does that function exist at all? A great UX isn't just a skin; it's a way of aligning the user's expectations and the underlying technology with each other.

The same is true for privacy. Explaining things to the user will only get you so far. To align your practices seamlessly with your users' expectations, you will need to shape both your user interface and your underlying practices. This doesn't mean that you need to refrain entirely from collecting and using users' information--but it does mean that you shouldn't collect and use data in ways that surprise your users. If you can do this successfully--if users find that your products don't give them unpleasant privacy surprises--then you can build the kind of trust that wins loyal customers.

[Note: Looking for a joke or Easter Egg in this post?  There's not one in the text.]

Note: This blog post was reposted from the former Tech @ FTC blog. Comments are now closed for this post.

Original Comments to “Transparency as a User Experience Problem.”

Peter Cranstone (@cranstone)
April 6, 2012 at 11:07 am  

Privacy By Design is the future – if – you can convince both consumers and content providers to offer real transparency. I think the only way to do this is to build an app that allows the user to control every aspect of what gets shared in real time. Only when you give them the choice to control their data do we have a chance of improving privacy. It really all boils down to the content providers. If they abuse the trust placed in them by the consumer sharing his/her data then you have to provide the “consumer” with the ability to turn off what gets shared.

This is the core problem with the Do Not Track standard – you have no way of verifying that the content provider is not sharing your data. It’s really hard to change decades of practice (not to say millions of lines of code) that currently abuses my privacy.

Reagan said it best – “Trust and Verify”. In the absence of verification then there’s the potential for a surprise no matter how well your UI is designed.

Peter

Nick Grossman
April 12, 2012 at 11:16 am

I would love to see approaches to this coded up and ultimately shared out, open source style. If it’s done well, then people will want to emulate it — especially if it gives developers and designers a leg up when starting a new project (the way that twitter bootstrap does for HTML/CSS).

See also Aza Raskin’s project w/ Mozilla on privacy icons: http://www.azarask.in/blog/post/privacy-icons/ I think this ended up falling flat, but I don’t know the whole story why. This is not as natural as a “red light” solution, though.

Nick Grossman
April 12, 2012 at 11:43 am

Here’s another example, also not quite a red button but at least a step — 500px.com’s “basically” version of their TOS http://news.ycombinator.com/item?id=3831357

The author’s views are his or her own, and do not necessarily represent the views of the Commission or any Commissioner.