surveillance values compliance over freedom, and it
undermines human dignity.
In light of these concerns, this paper has argued
that designers and businesses should adopt an ethical
approach to persuasion design. The arguments
provided in this paper can help designers build
technologies that are more conducive to users'
autonomous decision-making and freedom of choice.
The proposed ethical principles include minimal
surveillance and the creation of explicit awareness
mechanisms that give users real-time awareness of
being under surveillance. Further principles include
the explicit disclosure of a persuasive technology's
intentions, as well as of its side effects and
foreseeable unintended consequences. For
technologies bordering on the coercive, the paper
suggests that digital products always give users the
freedom to opt out of persuasion, and that a
democratic process govern the integration of coercive
technologies into significant socio-economic systems.
The aim of this paper was to highlight that
surveillance-based persuasive technologies can be
used either to enhance human autonomy and freedom
or to diminish them. The long-term social
consequences of these technologies will depend
significantly on how they are integrated into
socio-economic systems and on whether policymakers
design technology policies with explicit
consideration of these factors. These technologies
are open to misuse: private companies and
governments alike can use them to create power
imbalances and to compel compliance from their
users, clients, employees or citizens. Designers
therefore need to take an ethical approach to
technology design, and policymakers need to
incorporate these insights into emerging policies in
the domain.