Soraya Okuda, Education and Design Lead, Electronic Frontier Foundation (EFF)
Visualizing how the Internet works to present user data
ALIGNING USERS’ CONCEPTUAL MODELS OF WHERE THEIR DATA MOVES, AND WHO HAS ACCESS TO IT
How can users give consent if there are information asymmetries between their understanding of a service and the understanding of those who design it? For example, a service may know exactly in which jurisdiction and under what conditions user data is stored, as well as how that data is secured. A user, however, may neither easily find this information about data storage and retention, nor have a framework for understanding how digital information moves online. The result is information asymmetry.
Data collection is fairly abstract for many users, especially when it involves both a company and the third-party companies it partners with. If a designer is mindful not to take advantage of what users don’t understand, what might an informed interaction with a service look like?
I propose a clearer framework for a privacy setting page. Users will be able to understand and interact with:
what kind of data is collected from them
where their data is stored and how it is secured
a choice for how they can control access to this data at these various storage points
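The three elements above could be represented by a simple, user-facing data model. The following TypeScript sketch is illustrative only; the names (`DataConsent`, `defaultConsent`, `StoragePoint`) are assumptions for this example, not an existing API. Note the privacy-protective default: sharing is off until the user opts in.

```typescript
// Illustrative sketch: a consent record tying each collected data type
// to where it is stored and whether the user has allowed sharing.
// All names here are hypothetical.

type StoragePoint = "device" | "our-servers" | "partners";

interface DataConsent {
  dataType: string;         // e.g. "location", "search history"
  storedAt: StoragePoint[]; // every point where this data is held
  sharingAllowed: boolean;  // defaults to false: opt-in, not opt-out
}

function defaultConsent(dataType: string, storedAt: StoragePoint[]): DataConsent {
  // Privacy-protective default: nothing is shared until the user opts in.
  return { dataType, storedAt, sharingAllowed: false };
}

const settings: DataConsent[] = [
  defaultConsent("location", ["device", "our-servers"]),
  defaultConsent("search history", ["our-servers", "partners"]),
];
```

A privacy settings page could render one toggle per `DataConsent` entry, grouped by storage point, so the user sees both what is collected and where it goes.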
Borrowing from digital security training and educational approaches, these actions would take place within the context of onboarding (when someone visits a service like a company website for the first time) and on a clearly labeled privacy settings page. Information about the type of data, its location, and the choices for managing it could appear just in time: as the user moves through the natural flow of the service, permissions are requested only when needed, in the context of use.
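Just-in-time permissioning can be sketched as a small consent store that each feature consults before touching user data, so the prompt appears in context rather than as an up-front wall of requests. This is a hypothetical sketch; `ConsentStore`, `require`, and the `Permission` names are assumptions, not an existing API.

```typescript
// Sketch of just-in-time consent (all names are illustrative assumptions).

type Permission = "location" | "contacts" | "camera";

class ConsentStore {
  private granted = new Set<Permission>();

  // Called by the feature that needs the data, so the prompt (`ask`)
  // is shown in the context of use, and only when actually needed.
  require(p: Permission, ask: (p: Permission) => boolean): boolean {
    if (this.granted.has(p)) return true; // already consented this session
    if (ask(p)) {                         // show an in-context dialog
      this.granted.add(p);
      return true;
    }
    return false; // user declined; the feature should degrade gracefully
  }
}
```

A map feature, for example, would call `require("location", showDialog)` only when the user first opens the map, and a declined request simply leaves that feature unavailable rather than blocking the whole service.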
Figure 1: On the left (point 1), a user at their computer. This would be labeled with something in the vein of “You,” “Your data,” or “Your device,” and include a list below of the types of data collected during a session and over time. An arrow points from the user at their computer to a router (point 2), to a tower (point 3) representing the Internet, and to a cluster of server boxes representing intermediary networked computers (point 4). An arrow then points right to the intended website/service, labeled “Our service,” “Our website,” or “Our app” (point 5). Finally, that server points to another server as the final endpoint (point 6), which represents shared data agreements with other companies, labeled “Our partners.”
This interactive diagram (Figure 1) illustrates how users can better understand and interact with their data. The diagram begins at the left (point 1) with the user at their computer, followed to the right by an intermediary set of connecting devices: a router (point 2), a tower (point 3), and multiple servers (point 4). At the right-most point (point 5) is the website at the endpoint computer. Finally, the website’s host computer connects to an extension of other servers that will receive this information (point 6): additional servers within the company and third-party advertisers. At each point of this diagram, a user can interact to see what kind of data is collected.
Figure 2: On the right, the types of data a user might see at various points, such as Location, IP address, username, message history, searches, shopping history, friends, phones and connected devices, and so on.
Figure 3: Toggles for various types of data, where the default is switched “off”.
Finally, at the last two points of the interactive diagram (points 5 and 6, the website service and the shared data agreements with other companies), the user can learn more about, and interact with, what they consent to share. This can include, but is not limited to, their interests, location, friend connections, search history, and so on. For example, the user can click on the last point—the website server and the third-party computers receiving this data—and learn what information is received by the company, as well as what information is shared beyond the original company. In Figure 3, I have represented toggles for each type of data that a user might wish to withhold from sharing, with the default sharing setting configured to “off.”
Service providers can also include relevant information on that page, such as a summary of what measures are taken to protect that data (e.g. links to more information on the types of encryption used, links to relevant sections in their privacy policies and transparency reports, further details on the length of retention of data, and so on).
Just as designers test workflows for usability, they should also user-test these interactive privacy settings for users’ comprehension.
Who is this for?
The user is someone who is sight-impaired and needs graphics and text to be large and to meet web accessibility color contrast standards.
This user did not have formal schooling beyond middle school but is familiar with using a smartphone. They may be intimidated by legal language but still want an at-a-glance summary, and they are conscious of privacy and data collection. Thus, the wording would ideally be written at a fifth-grade reading level, with limited additional distracting stimuli.
The user is someone who may not be aware of machine-readable opt-out solutions, such as standards in the spirit of Do Not Track.
A user is coming to a social media service for the first time and is looking through public-facing comments shared by other users. In this imagined scenario, they decide that they want to sign up but are unsure how fully they want to participate in the service. In the onboarding flow, they would encounter this diagram of what information they are sharing. After selecting their data sharing preferences, they can proceed to use the service.
Figure 4: On the right, an image that shows a phone with a minimizable window that says “How we are using your data.”
If the user changes their mind about their preferences at any point, they should be able to find this page easily. The user would have a clearly communicated way to find information about their data settings, such as a large, easy-to-navigate section of the page labeled “site preferences,” “my privacy, my data,” or “how we are using your data.” This section would be large enough to be easily tapped by someone with imprecise navigation, and large enough to be seen as a distinct piece of the site.
Alexis Hancock, EFF: Designing Welcome Mats to Invite User Privacy https://www.eff.org/deeplinks/2019/02/designing-welcome-mats-invite-user-privacy-0
Adam Schwartz, Corynne McSherry, India McKinney and Lee Tien, EFF: New Rules Protect Data Privacy: Where to Focus, What to Avoid https://www.eff.org/deeplinks/2018/07/new-rules-protect-data-privacy-where-focus-what-avoid
Una Lee and Dann Toliver, And Also Too: Building Consentful Tech Zine https://www.andalsotoo.net/wp-content/uploads/2018/10/Building-Consentful-Tech-Zine-SPREADS.pdf
Surveillance Self-Defense, EFF: What Should I Know About Encryption? https://ssd.eff.org/en/module/what-should-i-know-about-encryption
Dark Patterns https://darkpatterns.org/
Norwegian Consumer Council: How tech companies use dark patterns to discourage us from exercising our rights to privacy https://fil.forbrukerradet.no/wp-content/uploads/2018/06/2018-06-27-deceived-by-design-final.pdf
Norwegian Consumer Council: Every Step You Take: How deceptive design lets Google track users 24/7 https://fil.forbrukerradet.no/wp-content/uploads/2018/11/27-11-18-every-step-you-take.pdf