Connectivity Layer
Implementing the communications protocols and
network infrastructure (HTTP, MQTT, sockets,
etc.);
Data Integration Layer
This layer is responsible for integrating data from several sources, as well as for persistence and data processing (e.g., time series);
Application Layer
The application layer implements the main logic, data access, and processing. It exposes data access and functionality via RESTful services, and also enables subscription to real-time services using web sockets, which allow two-way communication between the application and presentation layers;
Presentation Layer
This layer comprises the front-end modules and applications, intended to be executed by a web browser. Standard client-side web technologies such as HTML5, CSS, and JavaScript are the supporting technologies, although front-end frameworks and libraries can be integrated. The presentation is intended to be cross-platform and responsive to the specific characteristics of different devices (size, colors, etc.).
In our specific instance of the proposed architecture, the interface is fetched by a web browser that loads all the resources needed to start the HMI. In addition to HTML (structure and content), the loaded web page incorporates by reference CSS (presentation) and JavaScript / Ember.js (communication, interaction, and behavior) documents. Ember.js was the front-end framework selected for creating web interfaces using JavaScript.
Real-time services are subscribed to using the Socket.IO library, which enables real-time, bidirectional, and event-based communication between the browser and the server.
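As an illustration, a browser-side subscription to such a real-time service could look like the following sketch; the event names (machineState, gcodeLine, setAxisPosition) are assumptions made for illustration and not necessarily the channels used by the prototype.

const socket = io(); // connects back to the server that served the page

// Hypothetical event pushed by the server whenever the machine state changes
socket.on('machineState', (state) => {
  console.log('machine is now', state); // e.g. on/off, pause, auto or manual
});

// Hypothetical event carrying the GCode line currently being executed
socket.on('gcodeLine', (line) => {
  console.log('executing', line);
});

// Events can also be emitted from the browser towards the application layer
socket.emit('setAxisPosition', { axis: 'X', position: 120.5 });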
The web server was implemented in Node.js, making use of the Express.js framework and the ADS.js module. The latter enables communication with the automation software via TCP/IP.
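A minimal sketch of such a back end is shown below, using the standard Express and Socket.IO APIs; the readMachineState() helper and the route name are hypothetical placeholders standing in for the value that would actually be obtained through the ADS.js module.

const express = require('express');
const http = require('http');
const { Server } = require('socket.io');

const app = express();
const server = http.createServer(app);
const io = new Server(server);

// Hypothetical placeholder for a value read through the ADS.js module
function readMachineState() {
  return 'auto'; // e.g. on/off, pause, auto or manual
}

// RESTful access to data and functionality
app.get('/api/machine/state', (req, res) => {
  res.json({ state: readMachineState() });
});

// Real-time subscription for the presentation layer
io.on('connection', (socket) => {
  socket.emit('machineState', readMachineState());
});

server.listen(3000, () => console.log('HMI back end listening on port 3000'));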
Figure 1: First Interface Prototype.

At the integration layer we opted to incorporate the TwinCAT automation solution from the manufacturer Beckhoff, which makes it possible to develop automation solutions as well as to communicate with hardware devices connected to the system, such as motor drives, PLCs (Programmable Logic Controllers), input/output channels, etc. This software has a modular architecture that allows each module (consisting of software and possibly a hardware device) to be treated as a stand-alone device. Messages between the modules are exchanged through an ADS (Automation Device Specification) interface that each module has, and through the "ADS Router" in the software, which manages the messages and can identify their recipients. In practice this means that when a remote message arrives, the "ADS Router" can identify which module/device the message is addressed to.
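As a simplified illustration of this addressing scheme, the sketch below shows how a target module could be identified by its AmsNetId and ADS port; the concrete values and the routeMessage() helper are assumptions made for illustration, since the actual routing is performed by the TwinCAT ADS Router reached through the ADS.js module.

// Example target of an ADS message: an AmsNetId plus an ADS port
const target = {
  amsNetId: '192.168.0.10.1.1', // assumed AmsNetId of the TwinCAT runtime
  adsPort: 851                  // commonly the first TwinCAT 3 PLC runtime
};

// Hypothetical helper: the real routing is done by the "ADS Router",
// which is reached over TCP/IP (typically TCP port 48898)
function routeMessage(message, destination) {
  console.log(`forwarding ${message.command} to ${destination.amsNetId}:${destination.adsPort}`);
}

routeMessage({ command: 'readMachineState' }, target);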
Finally, at the sensing/actuator layer there are several different types of devices that can control and monitor the physical process, such as servos, temperature and pressure sensing devices, etc.
3.2 First Prototype
In order to validate the proposed architecture and technologies, a first functional prototype was implemented, although not on the final equipment, but on equipment with similar functionality and characteristics.
The developed prototype allowed us to assess possible implementation issues and to validate, on real equipment, the architecture and the technologies considered for the final system, thus giving us unidealized information about its behavior, forms of communication, and integration issues. It also allowed us to show a first approximation of the reality that would be put into practice in the development of the final system. Without strong concerns about usability or about requirements responding to user needs, the main focus of the prototype was to develop a system using a set of selected technologies, as well as to analyze its behavior.
This prototype has a set of functions that allow the process equipment and parameters to be controlled and monitored (see Figure 1), such as monitoring the state of the machine (which can vary between on/off, pause, auto, or manual), setting the axes to a certain position, sending a GCode program to the machine to interpret and execute, pausing the equipment, aborting the execution of a GCode program, viewing the GCode lines being executed in real time, tracking position, and viewing the 2D and 3D model drawings (Figure 2).
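As an illustration of how the presentation layer could invoke some of these functions, the sketch below issues a few of the listed commands against hypothetical REST endpoints; the routes and payloads are assumptions and not the prototype's actual API.

// Move one of the axes to a given position
async function setAxisPosition(axis, position) {
  await fetch('/api/axes/position', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ axis, position })
  });
}

// Send a GCode program for the machine to interpret and execute
async function sendGCodeProgram(program) {
  await fetch('/api/gcode', { method: 'POST', body: program });
}

// Pause the equipment
async function pauseMachine() {
  await fetch('/api/machine/pause', { method: 'POST' });
}

// Abort the execution of a GCode program
async function abortProgram() {
  await fetch('/api/gcode/abort', { method: 'POST' });
}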
In conclusion, we confirmed that the set of selected technologies was suitable for the development