tegration of Cloud-Fog technologies. They evaluated the scenario with regard to distributed computing, reducing network latency, optimizing communication, and lowering power consumption. The investigation showed that Fog-Cloud solutions improve task distribution, instance costs, energy usage, and network delay. Notably, a significant improvement in real-time processing was observed. Chakraborty et al. (Chakraborty et al., 2016) also point out these benefits. They introduced a solution to gather, analyze, and store time-sensitive data. To achieve minimum delay while ensuring data accuracy, they perform time-sensitive tasks within the Fog nodes and apply further processing and long-term persistence in the Cloud. We adopted these design principles and enhanced them with the capability to integrate measurement probes and container virtualization, creating an infrastructure focusing on distributed services.
Developing an application that is distributed over Cloud and Fog poses particular challenges, such as ensuring interoperability and providing consistent programming models and distribution algorithms. Recently introduced frameworks try to simplify application development. Cheng et al. (Cheng et al., 2018) described a framework for Smart City applications that considers distributed data processing, low latency, and reduced network usage. The main characteristic of this approach is a programming model that allows applications to be implemented uniformly across Fog and Cloud. Using this approach, they demonstrated three use cases: anomaly detection of energy consumption, video surveillance, and a smart city magnifier. There are several similarities to the work presented in our paper, such as enabling low-latency and scalable applications. To increase flexibility and support heterogeneous nodes, however, we use a container-based approach instead.
The feasibility of using containerization was investigated by Bellavista et al. (Bellavista and Zanni, 2017). They presented a fog-oriented framework based on extensions of the Kura gateway to investigate the feasibility of Docker-based containerization even on strongly resource-limited nodes. The results showed that containerization provides good scalability, limited overhead, and high flexibility. Another approach that utilizes containerization is presented by Woebker et al. (Woebker et al., 2018). They described a container-based solution to deploy and manage Fog applications. The deployment mainly relies on statically determined distribution criteria such as latency and bandwidth. Based on these criteria, the nodes are labeled and categorized using the labeling system provided by Kubernetes. Our approach builds upon the solutions presented above but focuses on providing an infrastructure that measures environmental conditions, such as the current load, location, or security level, to support the decision whether a service shall be carried out in the Cloud or on a Fog node.
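To illustrate this kind of condition-based placement decision, consider the following minimal Python sketch. It is not the paper's actual implementation; all field names, thresholds, and the decision logic are illustrative assumptions about how measured conditions could feed a Cloud-versus-Fog choice.

```python
# Hypothetical sketch: choosing between Cloud and Fog placement from
# measured environmental conditions. Names and thresholds are assumptions,
# not the framework's actual API.
from dataclasses import dataclass


@dataclass
class NodeConditions:
    cpu_load: float        # fraction of CPU in use, 0.0-1.0
    latency_ms: float      # round-trip time to the data source
    security_level: int    # higher value = stricter isolation guarantees


def place_service(fog: NodeConditions,
                  max_latency_ms: float,
                  min_security: int,
                  max_load: float = 0.8) -> str:
    """Return "fog" if the Fog node satisfies all constraints, else "cloud"."""
    if (fog.cpu_load <= max_load
            and fog.latency_ms <= max_latency_ms
            and fog.security_level >= min_security):
        return "fog"
    return "cloud"


# A lightly loaded, nearby, sufficiently secure Fog node is selected:
node = NodeConditions(cpu_load=0.35, latency_ms=4.0, security_level=2)
print(place_service(node, max_latency_ms=10.0, min_security=1))  # fog
```

In practice, such a decision function would be driven by the measurement probes mentioned above rather than by static values.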
A lot of effort has been made to investigate distribution algorithms in the areas of Cloud and High-Performance Computing. These solutions need to be adapted to the specific requirements of Fog and Cloud integration. Neto et al. (Neto et al., 2017) described current problems of Fog Computing, such as quality of service and load distribution. They presented an algorithm that optimizes load distribution by taking criteria such as latency and priorities into consideration. The solution presented in our paper aims to provide a framework to evaluate existing distribution algorithms and apply them in Fog-Cloud environments.
3 DISTRIBUTION CRITERIA
Selecting the most suitable node for a service is a primary challenge when combining Fog and Cloud. The relevance of the individual criteria highly depends on the optimization goal, such as energy optimization, privacy protection, or processing-time minimization. The following section presents static and dynamic criteria that can support the decision making.
Network Connection. Several network characteristics need to be considered to choose a suitable node. Ensuring low-latency data processing is crucial to provide appropriate query times and fast computation. Industry 4.0 applications need to collect local data, process it, and use the processing results to control the manufacturing process. Thus, it is necessary to deploy time-sensitive services to a node with low latency.

Industry 4.0 services often process large datasets and must extract insights in a timely manner. Transferring large data sets requires high network bandwidth and is critical when deploying services. For instance, short-running applications may perform better when deployed in the Cloud, because deploying them to low-bandwidth nodes introduces additional delay. The type of network connection is also relevant: applications with high data rates or a high demand for reliable network connections require a wired connection, whereas other applications perform well with wireless connections.
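The network criteria above can be sketched as a simple candidate filter. The following Python fragment is illustrative only; the node attributes, thresholds, and ordering are assumptions, not part of the presented infrastructure.

```python
# Illustrative sketch (not the paper's implementation): filtering candidate
# nodes by the network criteria discussed above, preferring low latency.
from dataclasses import dataclass
from typing import List


@dataclass
class Node:
    name: str
    latency_ms: float        # round-trip time to the data source
    bandwidth_mbps: float    # available link bandwidth
    wired: bool              # wired links assumed more reliable


def eligible_nodes(nodes: List[Node],
                   max_latency_ms: float,
                   min_bandwidth_mbps: float,
                   require_wired: bool) -> List[Node]:
    """Keep only nodes meeting the service's network requirements,
    sorted so that lower-latency nodes come first."""
    ok = [n for n in nodes
          if n.latency_ms <= max_latency_ms
          and n.bandwidth_mbps >= min_bandwidth_mbps
          and (n.wired or not require_wired)]
    return sorted(ok, key=lambda n: n.latency_ms)


nodes = [Node("fog-1", 3.0, 100.0, True),
         Node("fog-2", 1.5, 20.0, False),
         Node("cloud", 40.0, 1000.0, True)]
# A time-sensitive, high-data-rate service demands a wired, low-latency link:
print([n.name for n in eligible_nodes(nodes, 10.0, 50.0, True)])  # ['fog-1']
```

A real deployment framework would obtain these attributes from measurement probes and combine them with the performance criteria discussed next.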
Performance & Storage Capabilities. Industry 4.0 services require high computational and storage capabilities. In contrast, Fog nodes are typically resource-limited. Therefore, it is necessary to consider a node's performance (e.g., CPU and RAM). Besides the static performance, it is essential to dynamically check the
CLOSER 2019 - 9th International Conference on Cloud Computing and Services Science