BEYOND INDOOR PRESENCE MONITORING
WITH SIMPLE SENSORS
Tuan Anh Nguyen and Marco Aiello
Distributed Systems Group, University of Groningen, Groningen, The Netherlands
Keywords:
Context-awareness, Activity Recognition, Energy-awareness, Wireless Sensor Networks.
Abstract:
To have buildings that are able to adapt to user needs and at the same time operate efficiently, it is essential to know which activity the occupants are performing. Presence sensors, widely deployed in modern buildings, regulate lighting based on the presence of people in indoor spaces. However, much more in terms of comfort and energy efficiency can be achieved if more detailed information on the activity of the users is detected. In this paper, we provide an initial investigation on detecting indoor activities by using simple sensors (infrared, pressure, and acoustic), deliberately avoiding the use of rich sensors such as cameras. The sensors are low-cost, wireless, and retrofittable in existing structures. Our prototype is able to recognize five activities (working at a desk with or without a PC, having a meeting, and presence/absence in the office) with an accuracy of almost 95%, without affecting the user's behavior or comfort.
1 INTRODUCTION
Buildings account for more than 40% of energy con-
sumption in EU-27, with up to 50% in the UK
and Switzerland, and are the largest CO2 produc-
ers (EPBD, 2002). In addition, for typical industrial
and commercial buildings power usage amounts to
roughly 30% of the total operational costs. Hence, to-
wards a low carbon economy, making “smarter” use
of energy in buildings will fundamentally contribute
to energy and cost savings. Building automation sys-
tems (BAS) provide automatic control of the condi-
tions of indoor environments. Their primary goal is to
realize significant savings in energy and reduce cost.
While BAS can currently operate lighting and heat-
ing, ventilating and air conditioning (L-HVAC) sys-
tems efficiently, they are based on very simple solu-
tions to detect the presence of people. In fact, cur-
rent technology is based on infrared sensors that de-
tect movement and assume presence for a fixed tem-
poral interval after the last movement was detected.
Such a solution is reasonable as a first approximation, but a better understanding of user activities would clearly lead to better control and possibly higher user comfort.
User activity is an essential ingredient for maxi-
mizing the energy savings and spurring rapid adop-
tion in buildings. A recognition solution should be accurate enough to satisfy the user and provide effective energy savings, while consuming little energy itself and not invading the user's privacy. Most current BAS solutions still operate on pre-determined schedules: HVAC systems condition rooms assuming maximum occupancy rather than actual usage. Most occupancy sensors currently installed within buildings are fairly coarse-grained and inaccurate. Passive infrared (PIR) based sensors are often used for occupancy detection given their relatively low cost and energy consumption, though these sensors only sense movement and most of them use a timeout for determining room occupancy. More advanced systems have been deployed using cameras and vision algorithms, but these suffer from deployability, cost, and privacy issues.
We propose an accurate activity recognition sys-
tem working at the level of individual offices. Our
work contributes to one of the key components of an
energy-aware framework based on embedded service
middleware and a building-distributed architecture of
smart objects. The platform is currently under de-
velopment within the European Framework 7 project
GreenerBuildings (GreenerBuildings, 2010). In the present work, we design and illustrate a system based on a battery-powered wireless sensor network that can identify five activities of one person in an office, possibly with other people present in the room (e.g., a meeting at the table). The activities are: working at a desk
with or without a PC, having a meeting, and the pres-
ence/absence in the office, shown in Table 1. This is done by combining sensed data from PIR, pressure, and acoustic sensors. We evaluate the initial prototype in
terms of accuracy and show that it can detect the activ-
ities in single room offices. We stress the requirement
of not resorting to any advanced sensors which are ex-
pensive or require a change in user behavior, such as
cameras, RFID tags, or wearable sensors. Instead, simple, wireless, binary sensors are enough for accurate activity recognition. Through a number of
experiments performed in our offices, we show that
with the proposed solution it is possible to recognize
five activities of interest with 94.81% accuracy.
The remainder of the paper is organized as fol-
lows. In Section 2, we describe the requirements in-
cluding the desired set of recognized activities. Sec-
tion 3 presents the implementation of the solution in
an actual office and experimental results. We compare
our approach with related work in Section 4. Finally, we conclude by summing up our findings and proposing future directions for investigation in Section 5.
2 INDOOR ACTIVITY
RECOGNITION
The main requirements for performing indoor activity
recognition with the goal of providing input to a con-
trol strategy for energy savings in office buildings are
the following.
(R1) Discriminative. The solution should distin-
guish as many activities as possible which are typ-
ical of office presence.
(R2) Simple sensors. The sensors used in the solu-
tion should be inexpensive and easy to retrofit in
existing buildings.
(R3) Energy parsimonious. The sensing solution should itself consume little energy.
(R4) Privacy preserving. The solution should be as unobtrusive as possible and affect user privacy as little as possible.
(R5) Accurate. The solution should be accurate in
its recognition of user activities.
2.1 Requirement Analysis
We address (R1) by performing a survey over a complete working day (8 hours) in two offices: one in a higher education institution's office building in Groningen (Hanzehogeschool, The Netherlands) used by two staff members, and the other at the premises of Mennes & Jager in Groningen, an electronics company with five staff members.

Figure 1: Activities of interest survey results.
The offices were filmed and then the footage used
to classify by hand the activities of people over time.
The results of the manual classification are summa-
rized in Figure 1 as pie-charts. One notices in both
cases a dominance of being at the desk working with
a PC or without one. At both Mennes & Jager and
Hanzehogeschool, occupants mainly spend their time
working with computers at their desks, nearly 5 hours
and 3 hours, respectively. They spend the main part of
their remaining time sitting on their chairs at the desk
doing other tasks. In total, staff at Hanzehogeschool
use 5 hours (63%) out of their 8-hour working day sit-
ting at their desk versus 6.7 hours (84%) at Mennes &
Jager. Absence is considered as an activity, representing the unoccupied status of the office. Both offices are unoccupied for roughly half an hour during the filmed working day. Presence indicates that the user is in the room but the specific activity has no significant impact on the state of the appliances. Activities of this kind include temporarily walking around, standing and looking through the windows, drinking tea, and so on. In the data collection at Hanzehogeschool, presence accounts for two and a half hours, while the same occurs for just 0.7 hours at the Mennes & Jager office. With respect to energy-awareness, “having a meeting” is an activity that requires appliance adaptations, such as turning the lights on during the meeting and turning them off otherwise; this activity is thus worth taking into account in order to save energy and preserve user comfort. Therefore, in the first instance we take the activities that emerged from the video study as the starting point of the indoor activity recognition system, namely: 1) Working with PC, 2) Working without PC,
3) Having a meeting, 4) Presence, and 5) Absence.
Activities and their definition are described in Table 1.
Table 1: Office activities.

Activity                          Definition
Activity 1: Working with PC       Manipulating keyboard or mouse
Activity 2: Working without PC    Sitting at the desk but not manipulating keyboard and mouse
Activity 3: Having a meeting      Discussing at the meeting table
Activity 4: Presence              Being active in the room but the specific activity is not recognized
Activity 5: Absence               Being absent from the room
The requirements (R2)-(R4) are strictly interdepen-
dent as they mostly involve the choice of the type
of sensor and the characteristics of the hardware cur-
rently available that implements them. We need to
avoid cameras for privacy reasons and sensors which
require users to wear them, such as RFID, active
badges and the like. We also have to look at sensors
which provide discriminative data without costing too
much and consuming excessive amounts of energy. In
summary, we resort to a combination of anonymous
and binary wireless sensors that are cheap, easy to
install, require minimal maintenance and supervision,
and do not have to be worn or carried, such as 1) chair
pressure sensor, 2) acoustic sensor, and 3) passive in-
frared (PIR) motion sensor. These sensors are placed
so that they are able to sense crucial information. In addition, user privacy is unaffected, as information is sensed in a binary manner (i.e., TRUE or FALSE), for example whether the user is sitting on a chair or not, whether the keyboard/mouse is being operated or not, etc.
Finally, with respect to (R5), we propose the fol-
lowing recognition cases:
Working with PC: we recognize this activity when the user sits on the working chair AND uses the PC by operating the keyboard/mouse.
When the user sits on the working chair but does not operate the keyboard/mouse, the user is Working without PC.
Having a meeting is indicated when the user sits on a chair at the meeting table AND is speaking.
Our system reports the current activity as Presence in two cases: the user does not sit on the working chair, but might be sitting on a chair at the meeting table while no discussion is detected; or no chair in the room is occupied, but movement is still detected inside the room.
The last activity, Absence, is indicated if the following condition is satisfied: no chair in the room is occupied AND no movement is detected.
Table 2: Activities associated with sensors' values.

No.  Working  Key/Mouse  Meeting  Meeting   PIR  Activity
     chair    acoustic   chair    acoustic
 1   T        T          T        T         T    Working with PC
 2   T        T          T        T         F    Working with PC
 3   T        T          T        F         T    Working with PC
 4   T        T          T        F         F    Working with PC
 5   T        T          F        T         T    Working with PC
 6   T        T          F        T         F    Working with PC
 7   T        T          F        F         T    Working with PC
 8   T        T          F        F         F    Working with PC
 9   T        F          T        T         T    Working without PC
10   T        F          T        T         F    Working without PC
11   T        F          T        F         T    Working without PC
12   T        F          T        F         F    Working without PC
13   T        F          F        T         T    Working without PC
14   T        F          F        T         F    Working without PC
15   T        F          F        F         T    Working without PC
16   T        F          F        F         F    Working without PC
17   F        T          T        T         T    Having a meeting
18   F        T          T        T         F    Having a meeting
19   F        F          T        T         T    Having a meeting
20   F        F          T        T         F    Having a meeting
21   F        T          T        F         T    Presence
22   F        T          T        F         F    Presence
23   F        T          F        T         T    Presence
24   F        T          F        F         T    Presence
25   F        F          T        F         T    Presence
26   F        F          T        F         F    Presence
27   F        F          F        T         T    Presence
28   F        F          F        F         T    Presence
29   F        F          F        F         F    Absence
30   F        T          F        T         F    Absence
31   F        T          F        F         F    Absence
32   F        F          F        T         F    Absence
2.2 Activity Recognition
The task of recognizing an activity is reduced to read-
ing the sensor data and associating the data to a spe-
cific configuration which, in turn, identifies an activ-
ity. This is best represented as a truth table of basic
sensor values, shown in Table 2, in which T means
TRUE and F stands for FALSE. The table exhaustively covers all possible combinations of sensor readings.
The recognition of the described cases is performed on a base station, which makes a classification every minute. Each minute, the base station receives reports from the sensors and fuses the information to decide which activity is currently occurring.
Table 3: Information Fusion at the Base Station.

Activity 1: Working with PC
  condition-1 = (WorkingChairOccupy == TRUE ∧ Key/MouseOccupy == TRUE)
Activity 2: Working without PC
  condition-2 = (WorkingChairOccupy == TRUE ∧ Key/MouseOccupy == FALSE)
Activity 3: Having a meeting
  condition-3 = (WorkingChairOccupy == FALSE ∧ MeetingChairOccupy == TRUE ∧ HumanVoice == TRUE)
Activity 4: Presence
  condition-4 = ((WorkingChairOccupy == FALSE ∧ MeetingChairOccupy == TRUE ∧ HumanVoice == FALSE) ∨ (WorkingChairOccupy == FALSE ∧ MeetingChairOccupy == FALSE ∧ PIR == TRUE))
Activity 5: Absence
  condition-5 = (WorkingChairOccupy == FALSE ∧ MeetingChairOccupy == FALSE ∧ PIR == FALSE)
Table 3 shows the information fusion process for recognizing the current activity. In the first case, Activity 1 (Working with PC) is confirmed by the condition (WorkingChairOccupy == TRUE ∧ Key/MouseOccupy == TRUE). Activity 2 (Working without PC) is recognized through the change in keyboard/mouse occupancy status, with the condition (WorkingChairOccupy == TRUE ∧ Key/MouseOccupy == FALSE). The acoustic sensor at the meeting table is used to check for the presence of a human voice, and Activity 3 (Having a meeting) is recognized under the condition (WorkingChairOccupy == FALSE ∧ MeetingChairOccupy == TRUE ∧ HumanVoice == TRUE). In the case of Activity 4 (Presence), the following condition is checked: ((WorkingChairOccupy == FALSE ∧ MeetingChairOccupy == TRUE ∧ HumanVoice == FALSE) ∨ (WorkingChairOccupy == FALSE ∧ MeetingChairOccupy == FALSE ∧ PIR == TRUE)). Finally, (WorkingChairOccupy == FALSE ∧ MeetingChairOccupy == FALSE ∧ PIR == FALSE) confirms Activity 5 (Absence). Conditions 1 to 5 are then used in our recognition algorithm, given as Algorithm 1. For best performance, synchronization of the sensed data is important: the more precise the alignment of the reports, the more accurate the sensor fusion. We use the Packet-level Time Synchronization, limited to single-hop communication, described in (TEP133, 2008).
In the algorithm, the input is a vector of sensor data; in our case, V = [WorkingChairOccupy, Key/MouseOccupy, MeetingChairOccupy, HumanVoice, PIR]. Data is pushed to the base station periodically and the vector V is updated with the data and its time stamp (e.g., each minute, every five minutes, etc.).
Algorithm 1: Activity Recognition.
Input: V: vector of sensor readings
Output: A: recognized user activity
1: for each time interval do
2:   V̄ = read(V)
3:   A = activity(V̄)  {* see Table 3 *}
4:   return A
5: end for
The implemented time synchronization mechanism (Packet-level Time Synchronization) guarantees that sensor data timestamps are accurate up to ±(T_r/2 - min), where T_r is the round trip time and min is an estimated minimum time for sending any message. For a single-hop network, the accuracy is very high, given the small value of T_r, especially when compared with the size of the time interval. By using reliable messaging, the base station is also guaranteed to eventually get all sensor readings for each time interval; moreover, message omissions due to packet losses are handled by keeping the previous state as valid until a new message arrives. The algorithm works as an infinite cycle that, at the end of each time interval, reads all the pushes of the sensor data and uses them to determine which activity has taken place in that time interval; that is, the sensor data is read into V̄ from V by V̄ = read(V), and then the function activity(V̄) recognizes the occurring activity based on the information fusion cases of Table 3.
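To make the fusion step concrete, the following Java sketch shows one way the base station component (which, as described in Section 3, is written in Java) could implement the activity(V̄) function from the conditions of Table 3. It is an illustrative sketch, not the actual implementation; all class, field, and method names are ours.

// Illustrative sketch of the information fusion of Table 3;
// names are assumed, not taken from the actual base-station code.
public final class ActivityRecognizer {

    public enum Activity {
        WORKING_WITH_PC, WORKING_WITHOUT_PC,
        HAVING_A_MEETING, PRESENCE, ABSENCE
    }

    /** One synchronized vector V of binary readings for a time interval. */
    public static final class SensorVector {
        final boolean workingChairOccupy;
        final boolean keyMouseOccupy;
        final boolean meetingChairOccupy;
        final boolean humanVoice;
        final boolean pir;

        SensorVector(boolean workingChair, boolean keyMouse,
                     boolean meetingChair, boolean voice, boolean pir) {
            this.workingChairOccupy = workingChair;
            this.keyMouseOccupy = keyMouse;
            this.meetingChairOccupy = meetingChair;
            this.humanVoice = voice;
            this.pir = pir;
        }
    }

    /** Implements activity(V) by checking conditions 1-5 of Table 3 in order. */
    public static Activity activity(SensorVector v) {
        if (v.workingChairOccupy && v.keyMouseOccupy) {
            return Activity.WORKING_WITH_PC;      // condition-1
        }
        if (v.workingChairOccupy) {
            return Activity.WORKING_WITHOUT_PC;   // condition-2
        }
        if (v.meetingChairOccupy && v.humanVoice) {
            return Activity.HAVING_A_MEETING;     // condition-3
        }
        if (v.meetingChairOccupy || v.pir) {
            return Activity.PRESENCE;             // condition-4
        }
        return Activity.ABSENCE;                  // condition-5
    }
}

Because the conditions are tested in the order of Table 3, each of the 32 sensor-value combinations of Table 2 maps to exactly one activity.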
3 IMPLEMENTATION
To test the quality of the activity recognition with real sensors, we perform a study in our own office at the University of Groningen, using three types of sensors and
classifying five activities, as presented in the previous sections. In the room, the occupancy of the chairs is detected by pressure sensors. We also place one acoustic sensor near the keyboard and mouse at the desktop position. Another acoustic sensor is placed at the meeting table for detecting human voice as the sign of a meeting taking place. Finally, one ceiling-mounted PIR sensor is placed in the center of the room. Figure 6(a) gives an overview of our instrumented room.
3.1 Simple Sensors
We use IEEE 802.15.4 compliant wireless sensors
based on the original open-source “TelosB” platform
design developed and published by the University of
California, Berkeley (“UC Berkeley”). The hardware
is produced by Advantic Systems (ADV, 2011), see
the photos in Figure 4. The sensors are equipped with an ultra-low-power 16-bit MSP430 microcontroller and run a low-power consumption management algorithm. The motes also have an extension interface that can be used to connect various sensor boards containing photo, temperature, humidity, and pressure sensors, as well as accelerometers, magnetometers, and microphones. The on-board PIR and microphone are used together with the FlexiForce pressure sensor. The
motes are programmed in nesC and run on the TinyOS 2.1.1 platform (Levis et al., 2004), a lightweight, low-power operating system for embedded devices. The component for Information Fusion and Activity Recognition on the base station is written in Java.
Let us consider the sensors one by one.
3.1.1 Pressure Sensor
We use a pressure sensor to detect chair occupancy
status. The sensor designed by Advantic Systems uses the Tekscan A201-100 FlexiForce sensor (see Figure 4(b)), which provides force and load measurements. The FlexiForce sensor can be used to measure both static and dynamic forces (up to 100 lb or 400 N),
and is thin enough to enable non-intrusive measure-
ment. In our experiments, the sensor is placed on the chair as shown in Figure 6(c). Figure 2 shows an approximately 30-minute trace (from minute 28 to minute 55) of chair pressure from one of our tests. When no one sits on the chair, the pressure sensor returns a value below 5 Newtons, with an average of about 1 Newton. While the chair is occupied, the pressure value is greater than 20 Newtons and can reach almost 100 Newtons. Additionally, when working at the desk (with or without a PC), the user might stand up for a while or change sitting posture, so the pressure value fluctuates over time; however, we establish from experimental data that when the chair is occupied the one-minute average pressure is greater than 20 Newtons, while it is below 10 Newtons when the chair is unoccupied. Therefore, a one-minute average value of 20 Newtons is set as the threshold to distinguish between the two states of chair occupancy. To obtain a precise average, it is calculated from 10 values polled every minute. The average is then compared with the threshold (20 Newtons) to identify the chair status. The pressure sensor node wirelessly sends a chair status packet with a single value (TRUE if it detects the chair as occupied; FALSE otherwise) to the base station.
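The decision logic just described can be summarized with a short sketch. The mote-side code is actually written in nesC (see Section 3.1); the Java version below is only illustrative, with the 10-sample buffer and the 20 Newton threshold taken from the text.

// Illustrative Java version of the mote-side chair-occupancy decision;
// the deployed code is nesC. Threshold and sample count follow the text.
public final class ChairOccupancyDetector {
    private static final double OCCUPIED_THRESHOLD_NEWTONS = 20.0;
    private static final int SAMPLES_PER_MINUTE = 10;

    /** Averages the 10 force values polled in one minute, then thresholds. */
    public static boolean isOccupied(double[] forceSamples) {
        if (forceSamples.length != SAMPLES_PER_MINUTE) {
            throw new IllegalArgumentException("expected 10 samples per minute");
        }
        double sum = 0.0;
        for (double f : forceSamples) {
            sum += f;
        }
        // The one-minute average is compared with the 20 N threshold;
        // the resulting boolean is the chair status sent to the base station.
        return (sum / forceSamples.length) > OCCUPIED_THRESHOLD_NEWTONS;
    }
}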
Figure 2: Pressure Sensor Data.
3.1.2 PIR Motion Sensor
The Passive InfraRed (PIR) sensor detects the motion of users inside the office. The sensor uses the Perkin Elmer Optoelectronics LHI878 sensor, Figure 4(c). The LHI878 pyroelectric infrared-detector series is a standard dual-element design recommended for all variants of motion control. PIR sensor data indicates whether the user is likely to be in the office, especially when all other sensors do not detect any activity. Unlike pressure information, where the average value reflects the pressure over the polled period, user movement only triggers the PIR sensor for short time intervals. Figure 3 illustrates a 10-minute reading (from minute 45 to minute 55) of PIR data. The PIR value is measured in analog-to-digital converter (ADC) units. When there is no movement inside the field of view, the PIR sensor returns a small, stable value of one, two, or three ADC units. When movement is detected, the PIR is triggered and returns a value that is also stable, at around 2840 ADC units. Based on these experimental results, we use 2800 ADC units as the threshold for detecting motion of the user inside the office. After the polled period, the maximal PIR value is taken, instead of an average, as the key value to compare with the 2800 ADC threshold. The PIR sensor also sends
a single-value status (TRUE or FALSE) packet to the
base station each minute.
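In the same illustrative style as the pressure sketch (the deployed mote code is nesC; this Java fragment is ours), the PIR decision takes the maximum of the polled ADC readings rather than an average:

// Illustrative PIR motion decision: the maximum of the polled ADC
// readings in the interval is compared with the 2800 ADC threshold.
public final class MotionDetector {
    private static final int MOTION_THRESHOLD_ADC = 2800;

    /** TRUE when any reading in the polled period exceeds the threshold. */
    public static boolean motionDetected(int[] adcReadings) {
        int max = Integer.MIN_VALUE;
        for (int reading : adcReadings) {
            max = Math.max(max, reading);
        }
        return max > MOTION_THRESHOLD_ADC;
    }
}

Taking the maximum rather than the average matches the observation above that movement triggers the sensor only for short intervals within the minute.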
Figure 3: PIR Sensor Data.
Figure 4: Advantic sensors: (a) Advantic TelosB mote; (b) Tekscan A201-100; (c) PIR and acoustic sensors. Retrieved November 11, 2011 from http://www.advanticsys.com/.
3.1.3 Acoustic Sensor
The SE1000 acoustic sensor provided by Advantic Systems, mounted on the attachable sensor board, is a mini-microphone (20-16000 Hz, SNR 58 dB), Figure 4(c), capturing ambient audio information. It is designed to detect the presence/absence of sound.
Unlike PIR or pressure data, for which we are able to set an absolute value as a threshold to distinguish occupancy states, changes in sound level (decibels) depend on the office and its setting. For example, background noise differs between offices and at different times of the day. Therefore, the threshold cannot be given as an absolute value. However, the monitored sounds (keyboard typing, mouse clicking, a person speaking) produce a significant change in sound level compared with the ambient background noise. For this reason, before final deployment, one runs a training phase to establish the difference in decibel level of the acoustic data between the two cases. Acoustic data is sampled at 1 MHz, providing the microsecond time resolution needed for capturing the appearance of the expected sounds. The average value of one minute of polled data is used as the decibel level of the background noise, and the maximal value of the polled data is compared with this ambient noise level to check for the appearance of monitored sounds.
For sensing keyboard/mouse sounds, we place the acoustic sensor within 30 centimeters of the keyboard and mouse, as shown in Figure 6(b); another acoustic sensor is placed on the meeting table to sense human voice. For keyboard typing and/or mouse clicking, data from early experiments shows that the change in decibel level is greater than 20 dB. In the “person speaking” case, the difference between background noise and human voice is even bigger: subtracting the average value from the maximal one leaves a value greater than 30 dB. Figure 5 shows results from some of our experiments during the training phase, illustrating one minute of polling. Figure 5(a) exemplifies the result from keyboard/mouse usage, while Figure 5(b) represents the result from the meeting table when a person's speech is present. From the results of the training phase, we set the value of 20 dB, obtained by subtracting the average value from the maximal one, as the threshold for both the keyboard/mouse acoustic sensor at the working table and the human voice acoustic sensor at the meeting table.
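The following Java fragment sketches this decision under the same caveats as before (illustrative only; the mote code is nesC):

// Illustrative acoustic decision: the difference between the maximal
// and the average decibel value of one minute of polled data is
// compared with the 20 dB threshold set during the training phase.
public final class SoundEventDetector {
    private static final double EVENT_THRESHOLD_DB = 20.0;

    /** TRUE when a monitored sound (typing, clicking, speech) stands
     *  out from the ambient background noise of the interval. */
    public static boolean soundDetected(double[] decibelSamples) {
        double sum = 0.0;
        double max = Double.NEGATIVE_INFINITY;
        for (double db : decibelSamples) {
            sum += db;
            max = Math.max(max, db);
        }
        double backgroundAverage = sum / decibelSamples.length;
        return (max - backgroundAverage) > EVENT_THRESHOLD_DB;
    }
}

Using the interval's own average as the background estimate is what makes the threshold relative rather than absolute, as required by the varying office noise levels.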
3.2 Experimental Setup and Results
Using the layout of Figure 6, the sensors just described, and the classification algorithm presented in Section 2, we perform an experiment lasting 5 days, daily from 9:00 to 18:00, from August 22 to 26, 2011, to verify the accuracy of the proposed approach in terms of activity classification. During the experiment, the user takes accurate notes of the actual activities happening in the office every five minutes, which are used as the golden truth for the evaluation. Table 4 shows the actual occurrence time, in minutes, of each activity. As can be seen, the user spends most of his/her time Working with PC, 1350 out of 2700 minutes, followed by Working without PC with 595 minutes over the experiment.
Figure 5: Acoustic data: (a) keyboard/mouse acoustic data; (b) human voice acoustic data.
Figure 6: The instrumented room: (a) overview of the instrumented room; (b) acoustic sensors' placement; (c) pressure sensor placement.
Another 180 minutes during the working week are spent on the Having a meeting activity, while Presence and Absence take 265 and 310 minutes, respectively.

Table 4: Golden truth of activities' happening time.

Activity               Working   Working      Having     Presence  Absence
                       with PC   without PC   a meeting
Happening time (min.)  1350      595          180        265       310
Total (min.)           2700
To be more precise, we adapt the following performance measures to our case. Given an activity A, we measure accuracy in the following way:
FP (False Positive): the number of happening
minutes of other activities recognized incorrectly
as A.
FN (False Negative): the number of happening
minutes of A recognized falsely as not A.
TP (True Positive): the number of happening min-
utes of A recognized truly as A.
Recall: the proportion of the time correctly recog-
nized as A against the real happening time of A.
Also called “True Positive Rate” - the percentage
of positives the system recognizes correctly.
Recall = TP / (TP + FN)    (1)
Precision: the proportion of the time recognized as A during which A was actually happening.

Precision = TP / (TP + FP)    (2)
The overall success rate of the system is computed based on the following formula:

success rate = 1 - (Σ_activities (FP + FN)) / 2700    (3)

in which Σ_activities (FP + FN) is the total number of minutes of wrong recognition, and 2700 is the number of minutes in the total experimental period.
Our system makes a classification every minute. The results of our experiment are presented in Table 5. The TP of Working with PC is 1350 out of the 1350 minutes in which the activity actually occurred. The FN is kept at 0 minutes, which leads to a perfect Recall of 100%, while the Precision is 96.84% due to 44 minutes of FP. Our system recognizes 565 minutes of the Working without PC activity, with 30 minutes of FN and an FP of 0, resulting in a Recall of 94.96% and a Precision of 100%. The next row of the table gives the figures for the Having a meeting activity. The Precision of the system is 93.75%, as the recognized happening time is 192 minutes while the real one is 180 minutes; the system thus mistakenly recognizes 12 minutes more than the total happening time of this activity, reflected in an FP of 12, an FN of 0, and a Recall of 100%. The Recall of the system in recognizing the Presence and Absence activities is 94.34% and 96.45%, respectively. The FN values show that 15 minutes of Presence and 11 minutes of Absence are recognized incorrectly as other activities. However, the Precision for both of them is 100%.
Table 5: Experimental results.

Activity              FN (min.)  FP (min.)  TP (min.)  Recall (%)  Precision (%)
Working with PC       0          44         1350       100         96.84
Working without PC    30         0          565        94.96       100
Having a meeting      0          12         180        100         93.75
Presence              15         0          250        94.34       100
Absence               11         0          299        96.45       100
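As a worked check of equations (1) and (2) against Table 5 (illustrative code, not part of the evaluation tooling), the Working without PC row can be recomputed as follows:

// Worked example of equations (1) and (2) on the Working without PC
// row of Table 5; illustrative only.
public final class RecognitionMetrics {

    static double recall(int tp, int fn) {      // equation (1)
        return 100.0 * tp / (tp + fn);
    }

    static double precision(int tp, int fp) {   // equation (2)
        return 100.0 * tp / (tp + fp);
    }

    public static void main(String[] args) {
        // Working without PC: TP = 565 min, FN = 30 min, FP = 0 min.
        System.out.printf("Recall = %.2f%%%n", recall(565, 30));       // 94.96
        System.out.printf("Precision = %.2f%%%n", precision(565, 0));  // 100.00
    }
}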
3.3 Discussion
By analyzing the raw result data, we find that the 44 minutes of FP for “Working with PC” come from the FNs of the “Working without PC” and “Presence” activities, while the 12 minutes of FP for “Having a meeting” come from the FNs of the “Presence” and “Absence” activities. This is caused by inaccurate acoustic decisions: at some moments a louder sound appears, pushing the difference between the maximal and average values above the 20 dB threshold, so the acoustic sensor returns TRUE. As a consequence, the system indicates the activity as “Working with PC” or “Having a meeting”.
This error may cause unnecessary energy usage by increasing the PC-related or meeting-related load (e.g., the LCD monitor or lights are not turned off). On the other hand, it results in a Recall of 100% for the “Working with PC” and “Having a meeting” activities, making sure that PC- and meeting-related devices are working properly while the user is really working with the PC or having a meeting. This satisfies an important criterion of a smart building: user comfort has higher priority than energy saving.
4 RELATED WORK
In recent years, many research efforts have been carried out to design smart buildings. The advent of low-cost wireless sensor networks has enabled wider deployment of large numbers of connected sensors, allowing for improved sensing in buildings. We review here some of the most relevant previous work on using activity recognition as a driver for smart building control from an energy-awareness perspective.
Prior research on HVAC control systems shows that occupancy information can be used to drive a more optimized HVAC schedule. In (Erickson and Cerpa, 2010) and (Agarwal et al., 2010), the authors propose demand-response HVAC control strategies that use real-time occupancy monitoring with occupancy prediction to achieve efficient conditioning. Another example is (Newsham and Birt, 2010), where an ARIMAX model is developed to forecast the power demand of a building; a measure of building occupancy is a significant independent variable and increases the model accuracy. However, due to the difficulty of obtaining accurate real-time occupancy data, many of these techniques fall back on pre-determined schedules. Furthermore, these works only use occupancy data, which is far less detailed than user activity.
In the context of the Artemis SOFIA Project (SOFIA, 2009), the authors of (Niezen et al., 2010) provide a way for users to physically interact with devices to express their intentions. Through meaningful interaction devices, their system can infer what the user is trying to accomplish; nevertheless, the user needs to perform explicit actions. Our approach differs in trying to recognize the user's activity automatically.
Simple sensors are used in many modern buildings. For instance, Passive InfraRed (PIR) based sensors are often used (especially with lighting systems) for occupancy detection. The sensors are connected directly to local lighting fixtures. These PIR sensors are simple movement sensors and often cannot actually determine whether the room is occupied. Thus, most of them use a timeout for shutting off the lights (30 minutes is common), which can be sub-optimal. (Padmanabh et al., 2009) investigate the use of microphones and PIR sensors for the
efficient scheduling of conference rooms. (Delaney
et al., 2009) use PIR based wireless occupancy sen-
sors to measure wasted energy in lighting even when
there are no occupants. In the AIM Project (Barbato et al., 2009), the authors suggest measuring physical parameters like temperature and light, as well as user presence via PIR sensors, in each room of a house. (Gao and Whitehouse, 2009) seek to use
coarse occupancy data (leave home, return home) to
drive a self-programming home thermostat; however
the focus is on the thermostat self-programming al-
gorithms, and not on obtaining accurate occupancy.
(Wang and Jin, 1998) examine CO2-based occupancy detection, which is, however, very slow to detect events such as people entering. These efforts, moreover, neither use occupancy information to drive actual systems nor evaluate the accuracy of their detection sensors, and
are rarely used for intelligent energy management in
buildings. Our system uses the same simple sensors
but provides far more detailed information on the ac-
tivity of the users.
More advanced systems, using sonar-based sensing (Tarzia et al., 2009) or cameras and vision algorithms (Teixeira and Savvides, 2008), have been presented, though they suffer from deployability, cost, and privacy issues. SCOPES (Kamthe et al.,
2009) is an occupancy monitoring system that detects
near real-time occupant movement between rooms
with an accuracy of 80%. (Erickson et al., 2009) pro-
pose a wireless network of cameras (which have the
aforementioned privacy and cost issues) to determine
real-time occupancy across a larger area in a building,
focusing more towards coarse-grained floor-level oc-
cupancy detection. In (Singhvi et al., 2005), the occu-
pants are equipped with sensor badges, with which it
is possible to achieve relatively accurate localization
using, for example, RFID tags. In SPOTLIGHT (Kim
et al., 2008), the authors present a prototype system
that can monitor energy consumption by individuals
using a proximity sensor. While the authors tackle
the right challenge, the system either requires users
to carry active RFID tags or to explicitly tap tags on
RFID readers, which is cumbersome. In contrast, we
use simple binary sensors.
In summary, most related work either uses rich sensors (e.g., cameras, RFID tags, wearable sensors) or uses simple sensors but feeds coarse-grained occupancy information, rather than detailed activities, into appliance control strategies. In this paper, our system provides a finer-grained activity recognition solution with simple sensors, exhibiting very good classification results.
5 CONCLUSIONS
We believe that one of the key inputs for BAS is de-
tailed and precise user activity information that, in
turn, can drive the control of buildings and decrease
energy consumption while preserving the user com-
fort. Our proposed activity recognition approach is useful for accurate activity recognition at the level of individual offices. By using low-cost, binary, wireless sensors we are able to recognize five office activities of one person in a standard office room. Through initial experiments, we show that the solution can recognize user activity accurately, with a success rate of 94.81%. More importantly, false negatives for the activities that drive appliance use, Working with PC and Having a meeting (where they may lead to discomfort), are kept at 0%, satisfying user expectations at work. These promising results suggest investigating office activity recognition with simple sensors further. In particular, it is important to enrich the set of recognized activities, possibly using other types of simple sensors. It is also essential to consider multi-user and multi-room situations. Finally, we plan to couple approaches such as the one presented here with actual building control, in the context of the GreenerBuildings project, to measure the amount of energy actually saved and to test the resulting user experience.
ACKNOWLEDGEMENTS
We thank Peter Kamphuis, Jeroen Jager and Dim-
itri de Jong for providing data on the survey re-
ported in Section 2. We also thank Sharique Arshi
for useful discussions on acoustic sensing. Tuan Anh
Nguyen is supported by the Vietnam International Ed-
ucation Development program (VIED). The work is
supported by the EU FP7 Project GreenerBuildings,
contract no. 258888 and the Dutch National Research
Council under the NWO Smart Energy Systems pro-
gram, contract no. 647.000.004.
REFERENCES
ADV (2011). Advantic Systems website. http://www.advanticsys.com/.
Agarwal, Y., Balaji, B., Gupta, R., Lyles, J., Wei, M., and
Weng, T. (2010). Occupancy-driven Energy Manage-
ment for Smart Building Automation. In Proceedings
of the 2nd ACM Workshop on Embedded Sensing Sys-
tems for Energy-Efficiency in Building, BuildSys ’10,
pages 1–6, New York, NY, USA. ACM.
Barbato, A., Borsani, L., Capone, A., and Melzi, S. (2009).
Home Energy Saving through a User Profiling Sys-
tem based on Wireless Sensors. In Proceedings of
the First ACM Workshop on Embedded Sensing Sys-
tems for Energy-Efficiency in Buildings, BuildSys ’09,
pages 49–54, New York, NY, USA. ACM.
Delaney, D. T., O’Hare, G. M. P., and Ruzzelli, A. G.
(2009). Evaluation of Energy-efficiency in Lighting
Systems using Sensor Networks. In Proceedings of
the First ACM Workshop on Embedded Sensing Sys-
tems for Energy-Efficiency in Buildings, BuildSys ’09,
pages 61–66, New York, NY, USA. ACM.
EPBD (2002). European Union Directive on the Energy Performance of Buildings, 2002/91/EC.
Erickson, V. L. and Cerpa, A. E. (2010). Occupancy
based Demand Response HVAC Control Strategy. In
Proceedings of the 2nd ACM Workshop on Embed-
ded Sensing Systems for Energy-Efficiency in Build-
ing, BuildSys ’10, pages 7–12, New York, NY, USA.
ACM.
Erickson, V. L., Lin, Y., Kamthe, A., Rohini, B., Surana,
A., Cerpa, A. E., Sohn, M. D., and Narayanan, S.
(2009). Energy Efficient Building Environment Con-
trol Strategies using Real-time Occupancy Measure-
ments. In Proceedings of the First ACM Workshop
on Embedded Sensing Systems for Energy-Efficiency
in Buildings, BuildSys ’09, pages 19–24, New York,
NY, USA. ACM.
Gao, G. and Whitehouse, K. (2009). The Self-programming
Thermostat: Optimizing Setback Schedules based on
Home Occupancy Patterns. In Proceedings of the First
ACM Workshop on Embedded Sensing Systems for
Energy-Efficiency in Buildings, BuildSys ’09, pages
67–72, New York, NY, USA. ACM.
GreenerBuildings (2010). GreenerBuildings Project web-
site http://www.greenerbuildings.eu/.
Kamthe, A., Jiang, L., Dudys, M., and Cerpa, A. (2009).
SCOPES: Smart Cameras Object Position Estimation
System. In Proceedings of the 6th European Confer-
ence on Wireless Sensor Networks, EWSN ’09, pages
279–295, Berlin, Heidelberg. Springer-Verlag.
Kim, Y., Charbiwala, Z., Singhania, A., Schmid, T., and
Srivastava, M. B. (2008). Spotlight: Personal Natural
Resource Consumption Profiler. HotEmNets 2008.
Newsham, G. R. and Birt, B. J. (2010). Building-level
Occupancy Data to Improve ARIMA-based Electric-
ity Use Forecasts. In Proceedings of the 2nd ACM
Workshop on Embedded Sensing Systems for Energy-
Efficiency in Building, BuildSys ’10, pages 13–18,
New York, NY, USA. ACM.
Niezen, G., Hu, J., and Feijs, L. M. G. (2010). From Events
to Goals: Supporting Semantic Interaction in Smart
Environments. In 1st Workshop on Semantic Interop-
erability for Smart Spaces (SISS2010).
Padmanabh, K., Malikarjuna, V, A., Sen, S., Katru, S. P.,
Kumar, A., C, S. P., Vuppala, S. K., and Paul, S.
(2009). iSense: a Wireless Sensor Network based
Conference Room Management System. In Proceed-
ings of the First ACM Workshop on Embedded Sensing
Systems for Energy-Efficiency in Buildings, BuildSys
’09, pages 37–42, New York, NY, USA. ACM.
Levis, P., Madden, S., Polastre, J., Szewczyk, R., Woo, A., Gay, D., Hill, J., Welsh, M., Brewer, E., and Culler, D. (2004). TinyOS: An Operating System for Sensor Networks. In Ambient Intelligence. Springer Verlag.
Singhvi, V., Krause, A., Guestrin, C., Garrett, Jr., J. H., and
Matthews, H. S. (2005). Intelligent Light Control us-
ing Sensor Networks. In Proceedings of the 3rd inter-
national conference on Embedded networked sensor
systems, SenSys ’05, pages 218–229, New York, NY,
USA. ACM.
SOFIA (2009). Artemis SOFIA website http://www.sofia-
project.eu/.
Tarzia, S. P., Dick, R. P., Dinda, P. A., and Memik, G.
(2009). Sonar-based Measurement of User Presence
and Attention. In Proceedings of the 11th interna-
tional conference on Ubiquitous computing, Ubicomp
’09, pages 89–92, New York, NY, USA. ACM.
Teixeira, T. and Savvides, A. (2008). Lightweight People Counting and Localizing for Easily Deployable Indoors WSNs. IEEE Journal of Selected Topics in Signal Processing, volume 2, pages 493–502.
TEP133 (2008). TinyOS Enhancement Proposals 133: Packet-level Time Synchronization.
Wang, S. and Jin, X. (1998). CO2-Based Occupancy Detection for On-Line Outdoor Air Flow Control. Indoor and Built Environment, volume 7, pages 165–181.