More and more people rely on technology and “smart” services to provide convenience and accessibility in their everyday lives. For example, devices that allow voice commands to control lighting or other appliances in the home can significantly improve the accessibility of an interior space.
The use of these services, however, comes with the cost and associated risk of sharing personal information online. Those who can benefit most from these smart services, including persons with disabilities, persons who are aging and others who face discrimination, stereotyping, or exclusion, are often the most vulnerable to the misuse of private information (for example through the denial of medical insurance, jobs and services, or fraud).
Putting control of online personal privacy into the hands of the user is an important aspect of inclusive design. By designing services that provide more transparency and individual control over how our personal information is used, we can help to educate people about digital privacy, foster a sense of entitlement to that privacy, and facilitate more informed choices. Users must be able to personalize their online experience to match not only the task at hand, but also their acceptable level of risk. Ultimately, the burden should not be on the user to choose between usability and privacy.
- Consider privacy from the beginning of your design process, so that it can be embedded in your design.
- Aim for designs that do not limit usability for the sake of privacy, or vice versa.
- Don’t assume user knowledge of privacy-related issues or how to change privacy settings.
- Provide granularity in privacy policies and use clear, simple language.
- Provide a way for users to opt-in to information sharing, rather than having to opt-out.
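The last two recommendations (granular controls, opt-in rather than opt-out) can be sketched in code. This is a minimal illustration with hypothetical names, not a prescribed implementation: every category of sharing starts disabled, each category is toggled independently rather than through a single all-or-nothing switch, and unknown categories fail closed.

```python
# Sketch of opt-in, per-category privacy settings (hypothetical names).
from dataclasses import dataclass, field

# Separate categories give users granular control instead of one master switch.
SHARING_CATEGORIES = ("usage_analytics", "voice_recordings", "location", "third_party_ads")

@dataclass
class PrivacySettings:
    # Opt-in default: nothing is shared until the user explicitly enables it.
    consents: dict = field(
        default_factory=lambda: {c: False for c in SHARING_CATEGORIES}
    )

    def opt_in(self, category: str) -> None:
        if category not in self.consents:
            raise KeyError(f"Unknown sharing category: {category}")
        self.consents[category] = True

    def opt_out(self, category: str) -> None:
        if category not in self.consents:
            raise KeyError(f"Unknown sharing category: {category}")
        self.consents[category] = False

    def may_share(self, category: str) -> bool:
        # Fail closed: anything unrecognized is treated as not consented.
        return self.consents.get(category, False)

settings = PrivacySettings()
print(settings.may_share("location"))         # False until the user opts in
settings.opt_in("location")
print(settings.may_share("location"))         # True, for this category only
print(settings.may_share("usage_analytics"))  # still False
```

Because consent is stored per category, opting in to location sharing says nothing about analytics or advertising, and a user can later reverse any single choice without losing the rest of their configuration.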