
family doctor, cardiologist office, and the gym for independent use. The three parties improve their models by drawing on knowledge from remote data sets that do not reside on their own servers. Federated learning limits the exposure of raw data to other parties and therefore protects its privacy. Although federated learning does not guarantee complete privacy protection, the residual risk is low in severity.
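To make the mechanism concrete, the following minimal sketch illustrates federated averaging in Python. It is our illustration under simplifying assumptions (a linear model trained by gradient descent, random stand-ins for the three parties' private data), not an implementation from the use case.

# Minimal federated-averaging sketch (illustrative only): three parties fit a
# linear model on local data; only parameter vectors leave each site.
import numpy as np

rng = np.random.default_rng(0)

def local_update(weights, X, y, lr=0.1, epochs=5):
    """One party's local training step: plain gradient descent on MSE."""
    w = weights.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

# Hypothetical private data sets held by the doctor, cardiologist, and gym.
parties = [(rng.normal(size=(50, 3)), rng.normal(size=50)) for _ in range(3)]

global_w = np.zeros(3)
for _ in range(10):
    # Each party trains locally; raw data never leaves its server.
    local_ws = [local_update(global_w, X, y) for X, y in parties]
    # The coordinator aggregates only the model parameters (FedAvg).
    global_w = np.mean(local_ws, axis=0)

print(global_w)

Only the parameter vectors are exchanged in each round, which is exactly what limits the exposure of information described above.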
5.2 Use Case 2: Smart Fridge
A smart fridge is an IoT device whose collected data is of medium or low sensitivity, unlike health or financial data. Such a device is commonly used in people's homes; its data may be synchronized with the owner's phone, shared with the fridge manufacturer to uncover usage patterns, or shared with a grocery store for item tracking. Following the path through the decision tree in Figure 3: the data is less private; the end user is a single individual with limited computational capability; the fridge is assumed to be difficult to steal, with no unauthorized access possible thanks to the physical security of the house; and only limited bandwidth for data transfer is available to the user. Based on this path, the system recommends synthetic data generation as the most appropriate PET.
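The branch of the decision tree that leads to this recommendation can be summarized in code. The predicate names below are our paraphrase of the questions in Figure 3, not a verbatim transcription of the figure.

# Illustrative encoding of the Figure 3 path for the smart-fridge case.
def recommend_pet(data_private, multi_party, high_compute,
                  high_bandwidth, physically_secure):
    if (not data_private and not multi_party and not high_compute
            and not high_bandwidth and physically_secure):
        return "synthetic data generation"
    return "continue down other branches of the decision tree"

print(recommend_pet(data_private=False, multi_party=False,
                    high_compute=False, high_bandwidth=False,
                    physically_secure=True))
# -> synthetic data generation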
The data is not shared among multiple parties, and no collaborative analysis is required. Owing to the limited bandwidth, the smart fridge transmits only a small subset of the real data to the manufacturer, who uses this small sample to generate synthetic data for mining usage patterns. Synthetic data release ensures the privacy of the user's data, since no real data containing personal information is released beyond this small sample. Although privacy is protected by not sharing the raw data at large, synthetic data is not an exact representation of the real data and may be somewhat less accurate. However, because the data is neither critical nor highly private, a small deviation from the real data is acceptable in this case.
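As an illustration of this workflow, the sketch below fits a very simple per-feature Gaussian generator to a small, hypothetical real sample and then produces an arbitrarily large synthetic data set. A production system would likely use a richer generative model, but the privacy rationale is the same.

# Minimal synthetic-data sketch (illustrative): the manufacturer fits a simple
# per-feature Gaussian to the small real sample received from the fridge and
# mines usage patterns on sampled synthetic records instead of real ones.
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical small real sample:
# [door_openings_per_day, avg_temp_C, items_stored]
real_sample = np.array([
    [12, 4.1, 35],
    [15, 4.3, 40],
    [9,  3.9, 28],
    [14, 4.0, 37],
])

mean, std = real_sample.mean(axis=0), real_sample.std(axis=0)

# Generate as many synthetic records as needed; no further real data required.
synthetic = rng.normal(loc=mean, scale=std, size=(1000, 3))
print(synthetic[:3])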
6 CONCLUSIONS AND FUTURE
WORK
Data sharing is unavoidable in the current world of IoT computing, because data belonging to different application domains requires a variety of processing for purposes such as anomaly detection, fine-tuning, data mining, and deduplication. Due to its sheer volume, processing such data typically incurs a large computational overhead. Organizations often do not have the resources to perform the required processing on their premises and need to outsource it. In this scenario, PETs offer a viable and effective tool to protect the privacy of the data after it leaves the owner's domain. In this paper, we have proposed a novel method, based on NIST standards, that recommends a PET optimized with respect to the privacy, efficiency, cost, and scalability parameters of the application domain. The proposed framework is easy to use and can be adjusted to meet changing needs.
We have also presented two use cases that support our framework and illustrate the reasoning behind the selection of a particular PET. One possible direction for future work is to incorporate machine learning models into the recommendation of the most suitable PET, as sketched below. The selection framework can also be augmented by adding more dimensions beyond privacy, prioritizing different factors, and using richer user responses for its recommendations.
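As a rough illustration of this future direction, the sketch below trains a scikit-learn decision-tree classifier on hypothetical questionnaire encodings. The features, labels, and training examples are ours, introduced purely for illustration, and are not derived from the framework's actual decision tree.

# Learning a PET recommender from questionnaire responses (illustrative).
from sklearn.tree import DecisionTreeClassifier

# Hypothetical features per deployment: [data_sensitivity (0-2),
# multi_party (0/1), compute_capability (0-2), bandwidth (0-2),
# physical_security (0/1)]
X = [
    [2, 1, 2, 2, 0],  # sensitive health data, multiple parties, strong compute
    [2, 1, 1, 1, 0],
    [0, 0, 0, 0, 1],  # smart fridge: low sensitivity, single user, low bandwidth
    [1, 0, 0, 0, 1],
]
y = [
    "federated_learning",
    "federated_learning",
    "synthetic_data",
    "synthetic_data",
]

model = DecisionTreeClassifier(max_depth=3).fit(X, y)
print(model.predict([[0, 0, 0, 0, 1]]))  # -> ['synthetic_data']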