Inclusive Design for Privacy
- Dana Ayotte
Many people rely on technology and “smart” services to provide accessibility or convenience in their everyday lives. For example, devices that allow voice commands to control lighting or other appliances in the home can significantly improve the accessibility of an interior space. Services that remember a user’s log-in or purchasing information can make it much easier to access those services on a regular basis. However, the use of these services comes with the cost and associated risk of sharing personal information online. Those who can benefit most from these smart services, including persons with disabilities, persons who are aging, and others who face discrimination, stereotyping, or exclusion, are often the most vulnerable to the misuse of private information, for example through denial of medical insurance, jobs, and services, or through fraud.
Putting control of online personal privacy into the hands of the user is an important aspect of inclusive design. Many people avoid using particular services (for example, online banking) out of fear that their personal information will be misused. Too often these choices are based on a lack of knowledge of how our personal information is being used and of how we can protect it, and many services do not allow a sufficient level of control over our privacy. At the same time, our fears are often reinforced by news of online hacking and security breaches.
By designing services that provide more transparency and individual control over how our personal information is being used, we can help to educate people about digital privacy, foster a sense of entitlement to that privacy, and facilitate more informed choices. Users must be able to personalize their experience to match not only the task at hand but also their acceptable level of risk.
Providing a way for users to weigh the risks and benefits of sharing their personal information online is also an important aspect of designing for privacy. Ultimately, however, the burden should not be on the user to choose between usability and privacy. By embedding privacy into our designs from the start, we are more likely to allow usability and privacy to co-exist. Designing for privacy in this way fosters an innovative space in which we can challenge ourselves to come up with exciting new solutions.
Examples of inclusive design for privacy:
Designing and implementing segmented privacy policies that can be accepted in part or in whole, and communicating these policies with clear, simple language
Providing default settings with a high level of privacy, and allowing the user to opt in to sharing information (rather than having to opt out)
Allowing the user to understand and weigh the risks and benefits (to them) of sharing their personal information with a service, and to self-assign value to their information based on this understanding
Conveying the idea that privacy is a right: it is not about hiding, but about having control over what is rightfully ours
Encouraging the development of trust lists or circles of trust of services or people with which to share personal information
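As a sketch of how a few of these ideas might combine in practice (segmented policies, privacy-protective defaults, and opt-in sharing), the following hypothetical Python model treats each policy segment as individually acceptable, with nothing shared unless the user explicitly opts in. All class and field names here are illustrative assumptions, not an existing API.

```python
from dataclasses import dataclass, field

@dataclass
class PolicySegment:
    """One individually acceptable part of a segmented privacy policy."""
    name: str
    plain_language: str     # clear, simple description of the data use
    accepted: bool = False  # privacy-protective default: opted out

@dataclass
class PrivacyPolicy:
    segments: dict = field(default_factory=dict)

    def add_segment(self, name, plain_language):
        self.segments[name] = PolicySegment(name, plain_language)

    def opt_in(self, name):
        """The user accepts one segment, in part rather than in whole."""
        self.segments[name].accepted = True

    def may_share(self, name):
        # Unknown or unaccepted segments default to "do not share".
        seg = self.segments.get(name)
        return seg is not None and seg.accepted

policy = PrivacyPolicy()
policy.add_segment("purchases", "Remember your purchase history to speed up checkout.")
policy.add_segment("location", "Use your location to suggest nearby services.")

policy.opt_in("purchases")

print(policy.may_share("purchases"))  # True: explicitly opted in
print(policy.may_share("location"))   # False: never opted in
```

The key design choice is that the default value of `accepted` is `False`, so a segment the user has never seen or acted on can never authorize sharing.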
Resources
International Council on Global Privacy and Security by Design
The Electronic Frontier Foundation
Projects by If - data permissions catalog
The Platform for Privacy Preferences Project (W3C)
Designing a Privacy Preference Specification Interface - A Case Study. Cranor, L.F.
User Interfaces for Privacy Agents. Cranor, L.F., Guduru, P. and Arjula, M.
Things to Try
Consider privacy from the beginning of your design process, so that it can be embedded in your design.
Aim for designs that neither limit usability for the sake of privacy nor limit privacy for the sake of usability.
Don’t assume user knowledge of privacy-related issues or how to change privacy settings.
Provide granularity in privacy policies and use clear, simple language.
Provide a way for users to opt in to personal information sharing (e.g. third-party tracking), rather than having to opt out.
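One way to prototype the “circles of trust” idea from the examples above is a simple deny-by-default allow-list keyed by circle. The sketch below is a hypothetical Python model (the circle names and categories are assumptions for illustration) in which information is shared only with recipients in a circle the user has explicitly linked to that kind of information.

```python
# Hypothetical "circles of trust": named groups of trusted people or services.
trust_circles = {
    "family": {"alice", "bob"},
    "health-providers": {"dr-lee"},
}

# User-chosen mapping from data category to the circles allowed to see it.
sharing_rules = {
    "medical-records": {"health-providers"},
    "photos": {"family"},
}

def may_share_with(category, recipient):
    """Deny by default; allow only via an explicit trust-circle rule."""
    for circle in sharing_rules.get(category, set()):
        if recipient in trust_circles.get(circle, set()):
            return True
    return False

print(may_share_with("medical-records", "dr-lee"))  # True: in a trusted circle
print(may_share_with("medical-records", "alice"))   # False: family circle not trusted here
```

As with the opt-in defaults above, the user’s explicit choices are the only path to sharing; an unknown category or recipient falls through to a refusal.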