non-tampering, and proof of origin respectively, privacy definitions refer to the rights of a living subject through anonymity, pseudonymisation, and tokenisation (Varanda et al., 2021). Anonymisation means the inability to identify a living person within a data set: in health data for a group, for example, many subjects will share an age bracket or a gender, but none can be singled out as an individual. Anonymisation is often used in statistics for geography, health, demographics, and risk factors such as insurance. Pseudonymisation, compared to anonymisation, does not strip a dataset of all identifiable information, but weakens the link between the dataset and the original identity of an individual.
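To make the distinction concrete, the following sketch (illustrative only; the record fields and codes are invented for this example, not taken from the literature above) anonymises a hypothetical health record by generalising its identifying values, and pseudonymises it by swapping the identity for an artificial code:

```python
# Illustrative sketch: anonymisation vs. pseudonymisation of a
# hypothetical health record. All field names and codes are invented.

record = {"name": "Alice Smith", "age": 34, "condition": "asthma"}

# Anonymisation: drop direct identifiers and generalise the rest,
# so the subject blends into a group (here, an age bracket).
anonymised = {"age_band": "30-39", "condition": record["condition"]}

# Pseudonymisation: replace the identity with an artificial code.
# The link to the real person survives, but only via a separate
# index held elsewhere (e.g. on a central server).
pseudonym_index = {"P-0001": "Alice Smith"}  # stored apart from the data
pseudonymised = {"subject": "P-0001",
                 "age": record["age"],
                 "condition": record["condition"]}

print(anonymised)     # no route back to the individual
print(pseudonymised)  # reversible, but only with pseudonym_index
```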
Pseudonymisation and anonymisation as privacy functions overlap with security, as they could mean an encryption scheme, or a nickname with meaning to operatives but not to outsiders. They could also mean a reference code: an artificial identifier indexed against the real data in a central server. Tokenisation is likewise similar to encryption, but often operates by simple substitution to disguise data such as credit card numbers. Tokenisation can therefore be 'unlocked' without a key, particularly with the well-known credit card tokenisation algorithms used as part of the Cardholder Data Environment (CDE) (El Alloussi et al., 2014) of PCI DSS (Razikin and Widodo, 2021).
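To illustrate why a token can be 'unlocked' without a key, the sketch below implements vault-style tokenisation: the token is a random, format-preserving substitute, and detokenisation is a plain lookup rather than a cryptographic operation. The vault layout is an assumption for demonstration, not the CDE algorithms cited above:

```python
import secrets

# Illustrative vault-style tokenisation (not the PCI DSS CDE
# algorithms cited in the text). The token preserves the card-number
# format but has no mathematical relationship to the real number.

vault = {}  # token -> real card number, held on a central server

def tokenise(pan: str) -> str:
    # Substitute the leading digits with random ones, keeping the
    # last four (a common presentation convention).
    token = "".join(secrets.choice("0123456789") for _ in range(12)) + pan[-4:]
    vault[token] = pan
    return token

def detokenise(token: str) -> str:
    # 'Unlocking' needs no key: it is a simple index into the vault.
    return vault[token]

token = tokenise("4111111111111111")
assert detokenise(token) == "4111111111111111"
```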
The approval under PCI DSS of alternatives to cryptography, such as pseudonymisation, for sensitive data indicates that forgoing heavily mathematical functions can be safe, practical, and even encouraged under the governing bodies of financial regulation. This work investigates where it is safe to substitute privacy functions for security functions.
2.2 Nature and Vulnerabilities of IoT
IoT networks are by nature chaotic, constrained, and heterogeneous: the opposite of the capabilities and resources of typical computers such as desktop machines and smartphones. However, particularly in the case of microcontrollers (as opposed to microcomputers), their constraint and simplicity provide natural protection. In the same way that a calculator cannot be exploited because it lacks the architecture to be, a microcontroller is naturally without an Operating System (OS) or the problems that accompany one. Microcontroller-oriented developments such as the Serial Peripheral Interface Flash File System (SPIFFS) (Espressif, 2021a), Mongoose OS (Espressif, 2021b), and the Arduino web client (Microcontrollerslab, 2019) increase the threat landscape by removing this natural robustness of IoT simplicity. Attempting to bring machines comparable with those of the 1990s up to modern expectations weighs heavily on their storage and energy restrictions. TLS is a reflection of such unreasonable expectations; it is centralised, rigid, and contradicts notions of heterogeneous freedom and spontaneity. However, TLS is strong, widespread, and available for most, if not all, communications protocols.
2.3 Rigidity of TLS
As the long-standing de facto standard in online security, TLS has carried its ubiquity onto IoT, with libraries for both software and embedded devices, and dedicated hardware acceleration in some microcontrollers such as the ESP32 (ESP32, 2021). However, the benefits of TLS end at the convenience of ubiquity. Designing an agnostic security application to satisfy the same standards as TLS would be quite a challenge, particularly since the CIA functions are readily available. The challenge, therefore, is not to reinvent TLS security, but to disrupt everything about it apart from the ubiquitous functions; as a challenge centred on carbon-neutrality and device longevity, this is motivation enough to accept.
TLS relies on invoking a channel to agree on the authenticity of the server with which a client wishes to connect. The client interrogates the server to ascertain that it is who it claims to be, before agreeing to send over any sensitive information such as username and password credentials. The server's identity is confirmed through the medium of an X.509 certificate, or more specifically the Digital Signature Algorithm (DSA) signature on that certificate, and then the secure session begins. This is the way online transactions have taken place for decades.
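For concreteness, the sketch below walks through that sequence using Python's standard ssl module, one of many TLS implementations (the host name is a placeholder): the client opens the channel, the library validates the server's X.509 certificate chain during the handshake, and only afterwards is any application data sent.

```python
import socket
import ssl

# Sketch of the certificate-based handshake described above, using
# Python's standard ssl module. "example.com" is a placeholder host.
HOST, PORT = "example.com", 443

context = ssl.create_default_context()  # loads the trusted CA roots

with socket.create_connection((HOST, PORT)) as sock:
    # wrap_socket performs the TLS handshake: the server presents its
    # X.509 certificate, and the library verifies the signature chain
    # and host name before this call returns.
    with context.wrap_socket(sock, server_hostname=HOST) as tls:
        print(tls.version(), tls.getpeercert()["subject"])
        # Only now would credentials or other secrets be transmitted.
        tls.sendall(b"HEAD / HTTP/1.0\r\nHost: example.com\r\n\r\n")
        print(tls.recv(200))
```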
If the network entities do not intend to use web browsers, Certificate Authorities (CAs), an OS, SPIFFS, or any other complications operating to their detriment, they do not need the certificate either. The reason is that, without a browser, the certificate need not be kept, since it exists for browser display behind the TLS handshake. Without the browser, devices can authenticate perfectly well using regular TLS functions and without all the unnecessary infrastructure. This deals with the centralised aspect that would otherwise stifle equality and distribution throughout the network.
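The mechanism for certificate-free authentication is left open here; one minimal sketch, assuming a symmetric key pre-shared between two devices (an assumption of this example, not a prescription of this work), is a challenge-response built on HMAC, the same keyed-hash primitive TLS itself relies on, with no CA or certificate involved:

```python
import hashlib
import hmac
import secrets

# Hedged sketch: certificate-free authentication between two devices
# that already share a key. The provisioning of SHARED_KEY is assumed.
SHARED_KEY = b"pre-provisioned-device-key"

def respond(challenge: bytes) -> bytes:
    # Prove possession of the shared key without ever transmitting it.
    return hmac.new(SHARED_KEY, challenge, hashlib.sha256).digest()

# Device A issues a fresh challenge (a nonce, which prevents replay):
challenge = secrets.token_bytes(16)

# Device B answers it:
response = respond(challenge)

# Device A verifies the answer in constant time:
expected = hmac.new(SHARED_KEY, challenge, hashlib.sha256).digest()
assert hmac.compare_digest(response, expected)
```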
To address energy consumption, this is where risk assessment becomes so valuable. Since the TLS channel need no longer be invoked once the centralised infrastructure is removed, security functions can be employed without regard to the rules of that channel. This allows the freedom to employ CIA or privacy functions on a very flexible basis. It is important to em-