and non-engineering contexts, such as biomedical,
electrical and mechanical engineering, computer sci-
ence, informatics, sports science, psychology, neu-
rosciences, medicine and many more. In previous
work, our group developed an application for off-line data annotation and ground-truth collection (Lourenço et al., 2013). Now, the main motivation for our contribution was the need for an easy-to-use, versatile and scalable software tool for real-time biosignal visualization, capable of direct interaction with an acquisition device, the BITalino (Alves et al., 2013), also a result of our group's previous research.
This paper is organized as follows: Section 2 de-
scribes the architecture used in our application and
the biosignal visualization software framework; Section 3 presents experimental results from the standpoints of software performance and usability; and
finally, Section 4 highlights the main conclusions and
future work perspectives.
2 SYSTEM ARCHITECTURE
Our framework was designed under a Model-View-
Controller (MVC) architecture, decoupling the pre-
sentation from the processing and persistency layers,
as proposed in (Silva et al., 2012). This approach divides the application into independent and interchangeable modules, which can be developed on different platforms and easily reused. In the approach presented in this
paper, the system is divided into three main modules, as
represented in Figure 3: a) View, a front-end based on
Web technologies that displays the user interface and
allows all the interaction; b) Controller, where all the
events triggered in the user interface are mapped into
operations; c) Model, which coordinates the applica-
tion logic by evaluating the messages received by the
controller, executing the operations and producing re-
sults.
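To make this flow concrete, the interaction between the three modules can be sketched as follows; this is a minimal illustration only, and the event names, operation names and message format are assumptions made for the example, not taken from the actual implementation:

```javascript
// Minimal sketch of the MVC event flow (all names are illustrative).
// The View forwards user events to the Controller, which maps them to
// operations; the Model executes the operations and produces results.

const model = {
  // Model: coordinates the application logic and holds the state.
  state: { acquiring: false, channels: [] },
  execute(operation, params) {
    switch (operation) {
      case "startAcquisition":
        this.state.acquiring = true;
        this.state.channels = params.channels || [];
        return { status: "ok", acquiring: true };
      case "stopAcquisition":
        this.state.acquiring = false;
        return { status: "ok", acquiring: false };
      default:
        return { status: "error", message: "unknown operation" };
    }
  },
};

const controller = {
  // Controller: maps events triggered in the user interface to operations.
  eventMap: {
    "start-button-click": "startAcquisition",
    "stop-button-click": "stopAcquisition",
  },
  handle(event, params = {}) {
    return model.execute(this.eventMap[event], params);
  },
};

// A View component would simply call the Controller on user interaction:
const result = controller.handle("start-button-click", { channels: [0, 1] });
console.log(result.status); // "ok"
```

Because the modules communicate only through such messages, each one can be replaced or developed on a different platform without affecting the others.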
2.1 View
One of the requirements for our work was for the
front-end to provide intuitive and interactive inter-
faces. Web browsers currently offer the versatility of
being used on all operating systems, combined with the possibility of creating a rich user-interface experience with relative ease of layout design and formatting. HTML is the base technology, responsible for modelling the web page structure. Alongside it, Cascading Style Sheets (CSS) control the style and layout of the web page. JavaScript provides a comprehensive set of functions for user-interface event handling, interaction logic, and browser-side computing.
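As a minimal illustration of such event handling, a key press can be translated into a visualization action; the function and action names below are hypothetical, while the key bindings mirror the interaction scheme described later in this section:

```javascript
// Illustrative sketch of browser-side keyboard handling; the action
// names are hypothetical, not taken from the actual implementation.
function keyToAction(key) {
  const bindings = {
    ArrowLeft: "increaseTimeScale",  // widen the visible time window
    ArrowRight: "decreaseTimeScale", // narrow the visible time window
    "+": "zoomOut",                  // amplitude zoom out
    "-": "zoomIn",                   // amplitude zoom in
    ArrowUp: "raiseBaseline",        // shift the signal baseline up
    ArrowDown: "lowerBaseline",      // shift the signal baseline down
  };
  return key in bindings ? bindings[key] : null;
}

// In a browser, the mapping would be wired to DOM keyboard events:
if (typeof document !== "undefined") {
  document.addEventListener("keydown", (e) => {
    const action = keyToAction(e.key);
    if (action !== null) {
      e.preventDefault();
      // dispatch `action` to the controller here
    }
  });
}
```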
Our web application user interface was designed using HTML5 and CSS3, the latest standards, which offer means for producing aesthetic, simple and
intuitive front-ends. Figure 1 shows the overall inter-
face that we entitled SignalBIT. A global overview
plot on the bottom of the work area provides a 1
minute summary of all the analog channels being ac-
quired. The graphics on the center of the screen show
the individual channels, which in this case are Ac-
celerometer (ACC), Light Dependent Resistor (LDR),
Electrodermal Activity (EDA) and Electrocardiography (ECG) sensors; finally, in the separation between the two areas we can visualize the state of the digital inputs or change the state of the digital outputs by clicking on the 4 buttons on the right.
Figure 1: SignalBIT application user interface.
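The digital-output interaction described above can be sketched as a simple state toggle; the names below are illustrative only, and in the real application the new state would be forwarded to the acquisition device:

```javascript
// Illustrative sketch of toggling one of the four digital outputs
// from the user interface (names are hypothetical).
const digitalOutputs = [false, false, false, false];

function toggleOutput(index) {
  if (index < 0 || index >= digitalOutputs.length) {
    throw new RangeError("digital output index out of range");
  }
  digitalOutputs[index] = !digitalOutputs[index];
  // In the real application, the new state would be sent to the
  // BITalino device through the Controller and Model layers.
  return digitalOutputs[index];
}
```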
The major interaction is provided by the menu,
shown in Figure 3, which contains the following op-
tions: a) Start/Stop acquisition; b) Choose acquisi-
tion configurations, such as the number of channels to
be acquired, or the device to communicate with; and
c) Save the recorded data for later processing.
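For instance, the acquisition configuration chosen through option b) could be conveyed from the interface to the controller as a simple message; the field names and values below are purely illustrative assumptions:

```javascript
// Hypothetical acquisition-configuration message (field names are
// illustrative, not taken from the actual implementation).
const acquisitionConfig = {
  device: "BITalino-01",        // identifier of the device to talk to
  samplingRate: 1000,           // sampling frequency in Hz
  analogChannels: [0, 1, 2, 3], // analog channels to be acquired
};

// The controller could validate the message before starting:
function isValidConfig(cfg) {
  return typeof cfg.device === "string"
    && Number.isFinite(cfg.samplingRate)
    && cfg.samplingRate > 0
    && Array.isArray(cfg.analogChannels)
    && cfg.analogChannels.length > 0;
}

console.log(isValidConfig(acquisitionConfig)); // true
```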
When the user starts a new recording session, the
signals start to appear in each corresponding plot, as
represented in Figure 1. By default, the timespan of the individual graphics corresponds to 12 seconds of acquired signal, marked in grey in the overview plot; however, the user can change the time scale by pressing the left or right arrow keys, increasing or decreasing it, respectively, from 1 to 30 seconds. Figure 2 a) and b) represent the visualization result of changing the time scale. The user can also change the amplitude scale by zooming, or change the offset of the graphic. The zoom event is triggered by pressing the 'plus' key for zoom out and the 'minus' key for zoom in. The user can also adjust the baseline of the signal by pressing the up or down arrow keys. In Figure 2 c) and d), we depict the result