present study, five interviews and three focus groups
with a total of 18 participants were carried out to
identify mental models and possible visualizations of
privacy protection. Participants with differing
technical knowledge about online privacy were
included in the study, as it was hypothesized that their
mental models would differ in level of detail and
accuracy (cf., e.g., Coopamootoo & Groß 2014a).
The analysis showed that privacy protection is
perceived as a complex concept with many
influencing factors. No simple, easy-to-use mental
model was identified in our sample, but clues for
some useful models were extracted. It was hard for
the participants to define privacy protection directly,
but many related topics were discussed: Who is
responsible for privacy protection? Against what and
whom is protection needed or wanted? How is
privacy protection currently managed? What are
preconditions for successful privacy protection?
Those different topics show that privacy protection is
context-dependent: it can be the protection of the
individual against targeted advertising by online
companies, or the protection of data stored by online
companies against hackers. And, in the participants’
view, it should always be supported in some way by
laws and regulations.
Complexity is the one attribute that all participants
agreed upon regarding privacy protection. The results
of Prettyman et al. (2015), namely that one important
perception is that privacy protection takes a lot of
effort, are also mirrored in this study. However, their
finding that privacy protection is perceived as
irrelevant because users have nothing to hide was not
replicated, at least not in this German sample. In our
sample, the participants emphasized the importance
of privacy protection. As this is a qualitative study
with a very small sample size, we cannot generalize
these findings. Still, the difference may indeed be
culturally sensitive: studies on international
differences regarding information privacy show that
there is large variation across nations in this regard
(cf. Culnan & Armstrong 1999; Trepte & Masur
2016).
The risk associations found by Camp (2009) were
mostly also present in our study. Many participants
described privacy protection as the absence of
negative consequences and listed those threats.
Criminal behavior and financial losses in particular
were addressed often. However, we found an additional
focus: at its core, our analysis showed privacy to be
understood
as the protection of the individual and his or her
identity. Additionally, the participants addressed data
collection itself, the “annoying” targeted advertising,
“unfair” individual pricing, and the protection from
manipulation of society and democracy.
Initially, privacy protection is felt to exist only on
a binary level: either one’s privacy is protected or it
is not. The participants revised this view once they
delved deeper into the topic, its complexity, and the
idea of adjustable privacy.
The central role of identity also shapes the
understanding of privacy protection in a scenario of
data provision. The scenario introduced the idea that,
when data is voluntarily provided to a data collector,
the user can decide on a level of privacy protection
that is applied to that data. Here, the participants
interpreted privacy protection as anonymization and
the level of privacy protection as proportional to the
k-anonymity of the data set. This idea was then carried
over into the visual representation and control elements
for privacy protection: the participants wished to see
the group of people among whom they would no longer
be distinguishable.
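To make this interpretation concrete, the following minimal sketch illustrates how the group size underlying k-anonymity could be computed and shown to a single user. The table, the quasi-identifier columns, and the helper function are hypothetical illustrations of the idea the participants referred to; they are not part of the myneData platform.

    from collections import Counter

    def k_anonymity_group_sizes(records, quasi_identifiers):
        """Count how many records share each combination of quasi-identifier values.

        A data set is k-anonymous for the smallest of these group sizes: every
        person is hidden in a group of at least k indistinguishable records.
        """
        keys = [tuple(record[attr] for attr in quasi_identifiers)
                for record in records]
        return Counter(keys)

    # Hypothetical example: age bracket and ZIP prefix as quasi-identifiers.
    records = [
        {"age": "30-39", "zip": "52xxx", "income": 41000},
        {"age": "30-39", "zip": "52xxx", "income": 38000},
        {"age": "30-39", "zip": "52xxx", "income": 52000},
        {"age": "40-49", "zip": "50xxx", "income": 61000},
    ]

    groups = k_anonymity_group_sizes(records, ["age", "zip"])
    k = min(groups.values())  # here k = 1: the last record is uniquely identifiable
    print(f"The data set is {k}-anonymous.")

    # The group among which one is no longer distinguishable, as the
    # participants wished to see it, is the size of one's own group:
    my_group_size = groups[("30-39", "52xxx")]  # 3 indistinguishable records

In a visualization along these lines, raising the chosen level of privacy protection could correspond to coarsening the quasi-identifiers until the size of one’s own group reaches the desired k.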
This focus on the concept of k-anonymity may
show that this is the mental model the participants
have for privacy protection. On the other hand, the
discussion could also have taken this focus, because
no alternative concept was available and this one was
easy to relate to. In such a qualitative approach,
participants are influenced both by the framing of the
interviewer’s questions and by the answers of other
focus group attendees. Thus, we cannot claim that
this is a pre-existing mental model.
Other concepts, like privacy protection as a
barrier or lock (cf. Dourish et al. 2003, Asgharpour et
al. 2007), were not readily applicable in the scenario
because they offer only two states: protected or not.
When given the choice, the participants wanted more
control and, thus, more nuances or gradations in the
setting. Still, the models of physical privacy
protection by a fence, wall, or padlock match the
initial assessment of some participants that privacy
protection is binary, and were initially preferred by
them. In other privacy contexts (physical,
psychological, and social privacy), protective means
are often binary, such as shutting a door or refusing
to speak to a person. These
measures have been known to people for centuries.
But the complexity of the online world is still new and
constantly changing. The idea of scalable privacy
protection may not be obvious to users and, hence,
does not fit existing mental models.
Within the research project myneData, the idea of
adjustable privacy protection is one central element.
If it is indeed the case that the only existing mental
models of privacy protection are binary, these models
cannot be used. To the concept of k-anonymity –