of a fog data center. We leverage the potential of genetic algorithms to propose a strategy that explores the space of service-resource mappings to discover the configuration that best matches the end users' expectations in terms of service response time. Finally, we discuss the results of tests run to assess the viability of the proposed strategy under several boundary conditions.
In summary, the paper proposes the following innovative contributions:
• an analytical framework to model the placement of micro-service chains in a fog environment;
• an optimal placement strategy leveraging a genetic algorithm approach;
• a sensitivity analysis aimed at assessing the ability of the devised strategy to find suitable solutions.
The rest of the paper is structured as follows. In Section 2, we survey the state of the art on the placement of services in fog infrastructures. In Section 3, we introduce the motivation of the paper along with a basic use case scenario. We discuss a theoretical model to represent the performance of services deployed in a fog infrastructure in Section 4. In Section 5, we present the results of experiments aimed at evaluating the proposed approach. Finally, in Section 6 we conclude the paper and outline some future directions of the work.
2 LITERATURE REVIEW
While service placement in terms of Virtual Machine allocation in cloud datacenters has been extensively studied Mann (2015); Canali and Lancellotti (2017), the placement of micro-services over the nodes of a fog computing infrastructure has received far less attention.
Several studies propose mechanisms for service placement over the geographically distributed nodes of a fog infrastructure, starting from the simplifying assumption that an IoT application consists of a single independent micro-service. Among them, the solution proposed in Yu et al. (2018) is based on an optimization model to jointly study application placement and data routing. The authors in Skarlat et al. (2017) propose a solution for the placement of IoT services on fog resources, taking into account their QoS requirements. They rely on the concept of fog colonies and model the fog service placement problem as an Integer Linear Programming problem. The study presented in Canali and Lancellotti (2019) proposes for the first time a service placement strategy for fog computing systems based on genetic algorithms, demonstrating the efficacy of this kind of solution in a fog environment. However, in reality, complex applications are usually composed of multiple dependent micro-services, while none of the cited studies considers the existence of a chain of dependent services and the consequent constraints, which significantly increase the complexity of the problem.
Other studies focus on service placement in combined fog-to-cloud architectures Souza et al. (2018); Gupta et al. (2017); Yousefpour et al. (2017). The study in Souza et al. (2018) proposes novel strategies to offload service execution across the whole set of cloud and fog resources, according to the specific service needs and resource characteristics. The solutions proposed in Gupta et al. (2017); Yousefpour et al. (2017) place services with low latency requirements on the fog nodes, which are not powerful enough to host all services. In our solution, we focus on placing the micro-services only on the nodes of the fog layer in order to maximize user satisfaction, assuming that, for the considered service chains, fog nodes are able to process every request.
Only a small number of studies have considered the problem of modeling the service chains and their placement over the fog nodes. Among them, some solutions are based on completely distributed approaches Kayal and Liebeherr (2019); Xiao and Krunz (2017). In Kayal and Liebeherr (2019), the authors seek to optimize energy consumption and communication costs based on a game-theoretic approximation method. In Xiao and Krunz (2017), fog nodes cooperatively determine the optimal amount of workload to be forwarded and processed by each other to improve the users' quality of experience. On the other hand, in Santos et al. (2020) a centralized service chain controller optimizes the placement of service chains in fog environments. Our solution relies on Genetic Algorithms to cope with the non-linear nature of the optimization problem used to minimize the response time of the service chains, and we propose a wide sensitivity analysis to assess the impact of varying service chain length, load level, and number of fog nodes.
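To make the approach concrete, the following minimal sketch illustrates how a genetic algorithm can search the space of chain-to-node mappings using an estimated response time as fitness. All numeric parameters (node capacities, network delays, service demands) and the toy fitness function are purely illustrative assumptions; the actual strategy uses the analytical model of Section 4 and is evaluated in Section 5.

import random

random.seed(7)

N_NODES = 4                          # hypothetical number of fog nodes
CHAIN = [2.0, 1.5, 3.0]              # hypothetical demand of each micro-service in the chain
CAPACITY = [5.0] * N_NODES           # hypothetical processing capacity of each fog node
DELAY = [[0.0 if i == j else 1.0    # hypothetical inter-node network delays
          for j in range(N_NODES)] for i in range(N_NODES)]

def response_time(mapping):
    # Illustrative fitness: M/M/1-style processing delay on each node
    # plus network delay between consecutive micro-services of the chain.
    load = [0.0] * N_NODES
    for svc, node in enumerate(mapping):
        load[node] += CHAIN[svc]
    total = 0.0
    for svc, node in enumerate(mapping):
        spare = CAPACITY[node] - load[node]
        if spare <= 0:               # overloaded node: infeasible placement
            return float("inf")
        total += 1.0 / spare
        if svc > 0:
            total += DELAY[mapping[svc - 1]][node]
    return total

def crossover(a, b):
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def mutate(m, rate=0.2):
    return [random.randrange(N_NODES) if random.random() < rate else g for g in m]

def genetic_placement(pop_size=30, generations=100):
    # Each chromosome maps the i-th micro-service of the chain to a fog node index.
    pop = [[random.randrange(N_NODES) for _ in CHAIN] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=response_time)
        elite = pop[: pop_size // 2]                 # keep the fittest half
        offspring = [mutate(crossover(*random.sample(elite, 2)))
                     for _ in range(pop_size - len(elite))]
        pop = elite + offspring
    best = min(pop, key=response_time)
    return best, response_time(best)

best_mapping, best_rt = genetic_placement()
print("best mapping:", best_mapping, "estimated response time:", best_rt)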
3 MOTIVATING SCENARIO
The fog computing paradigm aims at compensating for the inability of cloud computing to guarantee the low latency typically required by applications in IoT contexts. This is achieved by deploying services close to the source of the data they need to process and/or the users they need to serve. Unfortunately, the processing and storage power of fog nodes is limited compared with that offered by the