Privacy as a Currency: Un-regulated?
Vishwas T. Patil and R. K. Shyamasundar
Department of Computer Science and Engineering, Information Security R&D Center,
Indian Institute of Technology Bombay, Mumbai 400076, India
Keywords:
Privacy, PII (Personally Identifiable Information), Social Networks, Access Control, Trust.
Abstract:
We are living in a time when many of the decisions that affect us are made by mathematical models. These
models rely on data, and the precision and relevance of their decisions depend on the quality of the data
fed to them. Therefore, there is a rush to collect personal data. The majority of organizations that
provide online services are at the forefront of collecting user data. Users, either voluntarily or by coercion,
divulge information about themselves in return for, for example, personalized services. These organizations'
revenue model is based on advertising, where advertisers are paired with user profiles built on top of the
collected data. This data is used for a variety of purposes apart from delivering targeted advertisements.
Mathematical decision models are impartial to the data on which they operate; an error, omission or
misrepresentation in the data can have irrevocable consequences on our lives, at times without corrective remedies.
This paper touches upon various facets of information gathering: information bias, the economics of privacy,
and information asymmetry, and their implications for our ecosystem if left unaddressed.
1 INTRODUCTION
The advent of communication media like the newspaper, telegraph, telephone, television and the Internet profoundly impacted human lives by allowing people to trade or exert influence beyond their immediate physical sphere. Being an efficient, cost-effective, real-time, two-way communication medium, the Internet upended all preceding media of communication and
has become the de-facto mode of communication to-
day. It has completely reformed the traditional meth-
ods of trade to the extent that online economy has
become the economy. In the early days of this trans-
formation from old services into new, there was a
lack of an obvious method to charge for these online
services. Fee-based and advertisement-based revenue
models emerged; the latter has prevailed and is prevalent even today. This is intriguing (Branstetter, 2015): if you do not pay for the product, you are the product. That observation is the theme of this paper.
Advertising existed before the Internet, but the central problem then was the conflation of the audience with traditional media outlets: big companies with large advertising budgets needed a way to convince (psychologically manipulate) people to buy their goods, and an outlet's readership was their only proxy for the audience they wanted to reach. Digital advertising fundamentally altered this model. Computers can watch what one does online and profile users based on their online behav-
ior. Through profiling, service platforms (and in turn
the advertisers) aim to know the exact extent to which
one is engaged with the service. User profiles grow more accurate with each use of online services and thus shrink the famous purchase funnel (a marketing model which illustrates the theoretical customer journey towards the purchase of a product). Further-
more, the profiles are enriched using auxiliary infor-
mation (Calandrino et al., 2011) available from of-
fline platforms/services. In fact, the penchant for data collection is so high that it has become a core objective of many online services; the reasons could be that data about user interaction is incidental and that the cost to store it is negligible compared to the monetary return the stored data promises.
Data is touted as new Gold (Popper, 2017;
Angwin, 2010). Therefore, it is subjected to hoarding, resale, barter, etc. There are some obvious downsides and legitimate concerns to data-driven revenue models (Ezrachi and Stucke, 2016). Online advertisement relies on the profiling of users, which
makes many people uncomfortable (Duhigg, 2012),
even if the service providers and advertisers say that
they do all of this anonymously (Hartzog and Rubin-
stein, 2017) and without invading privacy (Gao et al.,
2011; Manjoo, 2017). On the other hand, this trove
of harvested data has compelling usages beyond ad-
vertisement. For example, social networks help track
potential spread (thus containment strategies) of con-
tagious epidemics. Unlocking (WEF, 2013) this data
needs people’s trust. Whoever gets access to the trove
of user profiles has an advantage over others who do
not have it. Therefore the methods to gather and in-
terpret data have become a trade secret. Complex
mathematical models are used for decision making,
at times, on incomplete data with astonishing preci-
sion (O’Neil, 2016). An error in gathered data or the
model will have serious repercussions. Application scenarios range from the suitability of a candidate for a job, to premiums for health insurance, to inferring one's political beliefs. Data-driven mathematical models could lead
to empowerment or to discrimination.
Today, huge companies like Amazon, Facebook,
Google, and Netflix dominate the web. These cor-
porate giants enjoy an enormous amount of control
not only over what people see and do online but also over
users’ private data. Through their privacy policies
and settings they (under-) inform and (partially) al-
low users to see what private data is being collected
and how it might be shared with third-parties. Among
these giants Facebook is of our special interest be-
cause of its application domain: the social network. In
a social network, a user interacts with other users
and the service provider is an intermediary. There-
fore, even when one user among the two specifies
her privacy policy to be restrictive and the other user
specifies her privacy policy liberally, the intermedi-
ary does not have to (or cannot) respect the restrictive
user's policy; this is what we have observed, and we report our findings in the next section. The amount of data
Facebook collects on users has helped it become the
world’s second-largest advertising company on mo-
bile devices (Economist, 2016a). From time to time
it tries to assuage its users' privacy concerns through privacy settings. However, we have found that certain
functions/features of Facebook lead to subtle viola-
tions of those settings. In Section 5 we elaborate on
how those functions could be implemented to ensure
privacy-by-design. We also argue why the “privacy as a
currency” model is not sustainable, and in order to
regulate the use of PII (Personally Identifiable Infor-
mation) through a cohesive access control model, a
common platform for PII is necessary where users,
service providers, and advertisers negotiate PII usage.
2 BACKGROUND
Social network services have an innate appeal to the general population as they allow users to build social
relations with other people who share similar personal
or career interests, activities, backgrounds or real-life
connections. While doing so, users vet each other
into audience types, a godsend for advertisers. Ser-
vice providers innovate (Constine, 2016; Facebook,
2017b) ways to keep users engaged with the platform
by providing features and content relevant to a user.
User profiles are further enriched by all means possi-
ble, either by striking deals with other data aggrega-
tors (Halpern, 2016; Dewey, 2016) or through tech-
nology (FTC, 2017). The ultimate aim is to know as
much as possible about a user and her social neigh-
borhood.
Facebook owns four out of the five most down-
loaded apps worldwide (SensorTower, 2017). It
has become more like a holding company for popu-
lar communications platforms than a social network
(Economist, 2016b). It appears that human psychol-
ogy (convenience of socializing online for free, lack
of apparent harm, value we place on our privacy)
plays a role in the complicity of users in this massive data aggregation. For the privacy-aware users,
Facebook allows them to specify privacy settings so that they can decide who can see or interact with their
social persona. The privacy settings safeguard infor-
mation from other users, not from Facebook. Face-
book is implicitly a trusted party: it enforces the privacy policies of its users and assures them that it internally regulates PII usage, yet it is also the consumer of the PII it regulates, for its own business interests. There is a conflict of interest!
In the following we list the reasons and dynamics that are at play behind this subtle currency of privacy. There are three pillars of this ecosystem:
Platform Provider: Allows users to form social connections as they do in real life. Entices users to use the platform by providing compelling features while recording all their interactions. Deploys automated language processing and sentiment analysis on user interactions to categorize users into profiles, affinities, and communities such that the resulting organization of data is optimized for business practices. It mone-
tizes (Saez-Trumper et al., 2014) the organized data
through advertisers and other entities that find value
in this data.
Businesses and Organizations: Social media has become an essential medium for brands and organizations because it helps them in:
- advertisement and targeted promotions
- actionable insights for decision making
- brand monitoring & crisis detection
- measurable engagement with communities
- identifying emerging trends/markets
- identifying undecided voters (linkfluence, 2017)
Users (The Product): Each of the activities below reveals certain aspects of an individual to the platform.
- targeted feeds: news, entertainment, friends
- product reviews/referrals, redressal
- expression of opinions, starting interest groups
- match-making: romance, teachers, plumbers, etc.
- building reputation: seeking work/employment
- location-based assistance, travel
A sense of control and protection is provided to the
users through privacy settings and privacy policy of
the platform. However, we have observed that even the stated policies are violated through legitimate, innocuous user actions on the Facebook platform.
3 GAPS IN PRIVACY SETTINGS
Each user on Facebook is provided with pre-defined
relationship categories, called lists, along which
users can organize their relationships with others.
“Friends” is the basic relationship category to which
every user-to-user relationship (friendship) is added.
A user is allowed to organize friendship relations into
other pre-defined categories like “Family”, “Close
Friends”, “Acquaintances”, so that a distinct affinity
level could be imposed on relations. This is how people, in the real world, intend to organize their relationships. This notion of categorizing (or listing) friends into affinity levels helps users to specify who can have access to their information. Labels are used
as access control policies over a user’s information.
Any requester who satisfies membership of the label assigned to the post can access the post.
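As an illustration of this label semantics, the following is a minimal sketch (our own toy model, not Facebook's implementation; the names and sets are hypothetical) of an extensional label acting as an access control policy over a post:

```python
# A minimal sketch (toy model, not Facebook's code) of extensional
# labels acting as access control policies over posts.
labels = {
    "Family":        {"A"},          # members added by the owner's action
    "Close Friends": {"A", "C"},
    "Friends":       {"A", "C", "D"},
    "OnlyMe":        set(),          # nobody but the owner qualifies
}

def can_access(requester: str, owner: str, post_label: str) -> bool:
    """A requester may read a post iff she satisfies the post's label."""
    if requester == owner:
        return True                  # owners always see their own posts
    return requester in labels.get(post_label, set())

print(can_access("A", owner="B", post_label="Family"))  # True
print(can_access("C", owner="B", post_label="Family"))  # False
```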
Facebook provides another set of labels for infor-
mation classification that is intensional, i.e., whose membership is not determined by a direct action of the user. “Friends
of friends”, “Public”, and any other affiliation-based
smart label like “University” or “School” fall under
this category. The whole gamut of information la-
belling in Facebook provides a very rich and flexi-
ble access (thus privacy) policy specification over a
user's information. Users are allowed to change labels of their objects at their discretion. However, this flexibility in policy specification is not well-understood (Liu et al., 2011) by the majority of users, and users end up in states where their policy specification may look innocuous when it is not. We
show some such instances with the help of a hypo-
thetical scenario on Facebook as depicted in Table 1.
Each row represents a user's actions in chronological fashion. We use A.t_2 to denote an action of user A at time t_2, and P^X_i to denote object i of user X. All other actions, of every user, up to time t_2 constitute the environment/state of the social graph (Bronson et al., 2013) w.r.t. A.t_2. Thus, an action should be analyzed in the context of the current state of its environment. Note that, since the social graph is a co-creation of its users, an individual has little or no control over the environment in which he/she is operating. An action/setting that seems privacy-preserving can later be compromised by a change in the environment. This will become clear as we go through the scenarios below.
Nonrestrictive Change in Policy of an Object Risks Privacy of Others
Consider A.t_2 in the context of the environment trace B.t_1, B.t_3. As user A is a member of B's Family, through action A.t_2 user A has authored a comment on P^B_1. The environment at time t_2 ensured that only Family members of B had access to object P^B_1. At t_3, user B changed the policy of his object P^B_1 from Family to Friends. Thus, user A's comment is exposed to the friends of B without A's consent.
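A small simulation of this scenario, under our own simplified model in which a comment inherits the audience of the post it is attached to (not Facebook's actual data model), shows how B's re-labeling silently widens the audience of A's comment:

```python
# Sketch of the scenario above: relaxing a post's label from Family to
# Friends re-exposes an attached comment without the commenter's consent.
audience = {"Family": {"A"}, "Friends": {"A", "C"}}   # B's lists (toy data)
post = {"owner": "B", "label": "Family", "comments": []}

post["comments"].append({"author": "A", "text": "get well soon"})  # A.t_2

def visible_to(p):
    """Everyone who can currently see the post and its comments."""
    return {p["owner"]} | audience[p["label"]]

print(visible_to(post))    # {'A', 'B'}: only Family sees A's comment
post["label"] = "Friends"  # B.t_3: nonrestrictive policy change
print(visible_to(post))    # {'A', 'B', 'C'}: C sees it, without A's consent
```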
Restrictive Change in Policy of an Object Suspends Others' Privileges
Consider user action E.t_5 in the context of two other events in the environment, D.t_2 and F.t_4. User E has changed the policy of her object P^E_1 from Public to Only Me (restrictive). Prior to the policy change, users D and F had liked the object while it was public. A restrictive change in policy over P^E_1 locks users out from updating/retracting their own comments or likes. At a later point in time, user E can divulge the list of users associated with her post in the past.
Policy Composition using Intensional Labels is not Privacy-preserving
Consider user actions F.t_3 and E.t_4 in the context of the social list “School” at t_4. Through action F.t_3, user F created an object P^F_3 with the custom policy University − School. Here the intention of the user is to make the object available to his friends from University but not from his School. According to the state of the social graph at time t_3, nobody gets access to object P^F_3 because University ⊆ School, i.e., University − School = ∅. At time t_4, user E disassociates herself from the social list School and thus gains access to P^F_3. Disassociation from a social list allows users to bypass the privacy/access intention of a custom policy composed of social labels.
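The bypass follows directly from the fact that an intensional policy is re-evaluated against the current list memberships, as this minimal sketch (toy sets taken from Table 1) shows:

```python
# Sketch of the scenario above: the custom policy University - School is
# evaluated against CURRENT list memberships, so a member gains access
# simply by leaving a list.
university = {"E", "F"}
school     = {"A", "E", "F"}

def audience(univ: set, sch: set) -> set:
    """Custom policy University - School, re-evaluated on every access."""
    return univ - sch

print(audience(university, school))  # t_3: set(), nobody can read P^F_3
school.discard("E")                  # E.t_4: E disassociates from School
print(audience(university, school))  # t_4: {'E'}, E bypassed F's intent
```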
Like, Comment Operations are not Privacy-preserving
Consider user actions D.t_2 and F.t_4 in the context of the environment event E.t_1. On Facebook, the List of Friends is an object of the user profile. In its privacy settings, Facebook allows users to choose the intended audience for this object. We assume all users in this scenario have set the audience for this object to “Only Me”. The intention behind such a setting is to not let profile visitors know who their friends are, except
their mutual friends.

Table 1: Snapshots of associations formed in a social graph. Notation: authored_L(o) denotes creating or re-labeling object o with audience label L; authoring or liking another user's object denotes commenting on or liking it.

| user | t_1 | t_2 | t_3 | t_4 | t_5 |
|------|-----|-----|-----|-----|-----|
| A | authored_OnlyMe(P^A_1) | authored comment on P^B_1 | authored_Public(P^A_1) | likes P^F_2 | likes P^F_3 |
| B | authored_Family(P^B_1) | authored_Public comment on P^D_1 | authored_Friends(P^B_1) | | |
| C | authored_Friends(P^C_1) | authored comment on P^D_1 | authored comment on P^C_1 | authored comment on P^B_1 | likes A's comment on P^B_1 |
| D | authored_FFriends(P^D_1) | likes P^E_1 | | | |
| E | authored_Public(P^E_1) | likes P^F_1 | likes P^D_1 | authored_FFriends comment on P^F_3 | authored_OnlyMe(P^E_1) |
| F | authored_School(P^F_1) | authored_School−E(P^F_2) | authored_University−School(P^F_3) | likes P^E_1 | likes P^A_1 |

Assumptions:
at time t_0 (the state of the different sets):
users = {A, B, C, D, E, F}; objects = {P^A_1, P^B_1, P^C_1, P^D_1, P^E_1, P^F_1, P^F_2, P^F_3};
friendship edges = {(A,B), (B,C), (C,D), (D,E), (E,F), (F,A)}; Family_B = {A};
University = {E, F}; School = {A, E, F}
at time t_4 (user E has disassociated from list School): School = {A, F}
at time t_5: University = {A, E, F}; School = {F}
The way Facebook works, the Newsfeed of a user is supplied with relevant content from the user's social circle. With high probability, friends' posts appear in the Newsfeed, with which the user may interact by commenting or liking. These interactions get consumed by the underlying social graph. When a user interacts with objects whose access policy is set to Public, those interactions also become public. The social graph allows queries on public content. For example, fb.com/search/FBID/photos-commented returns all the photo-type objects on which Alice has commented. Similarly, fb.com/search/FBID/photos-liked returns all photos liked by Alice. For a typical user, these queries return objects from their friends. Any user of Facebook can issue these queries against the social graph for any other user of Facebook.
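For concreteness, a small helper (illustrative only; the endpoints are the ones quoted above, and whether they remain accessible is outside this sketch's control) that composes these graph-search URLs for a given numeric profile ID:

```python
# Illustrative only: compose the Facebook graph-search URLs quoted in
# the text for a given numeric profile ID (FBID).
BASE = "https://www.facebook.com/search"

def graph_search_urls(fbid: str) -> dict:
    """The public graph-search queries one can issue about `fbid`."""
    return {
        "photos_commented": f"{BASE}/{fbid}/photos-commented",
        "photos_liked":     f"{BASE}/{fbid}/photos-liked",
    }

print(graph_search_urls("123456789"))  # hypothetical FBID
```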
The list of friends is a sensitive object of any user and its privacy is important because knowing one's friends helps a social engineer to devise identity theft/cloning and phishing attacks on the user (Jagatic et al., 2007; Bilge et al., 2009). There are numerous other objects (Facebook, 2017a) that are associated with each user on Facebook; e.g., email, date of birth, mobile OS type, currency, timezone, etc. Only a portion of the complete object set is guarded by the owner's access control/privacy settings. We observed that the notion of privacy conferred upon its users by Facebook is limited to human subjects alone, i.e., privacy from fellow users or the public. However, the apps that are integrated with Facebook's ecosystem have quasi-unfettered access to users' objects. The ecosystem is indeed a colossal social experiment of our time involving platform owners (a few: the controllers), advertisers & trackers (many: the influencers & aggregators), and the users (a lot: voluntary reporters or powerless subjects). This leads us to the analogy of the "law of the jungle" to aptly describe the current state of the PII ecosystem, where subjects pay a privacy tax to derive benefits from the Internet.
4 THE LAW OF THE JUNGLE
Browsers and mobile apps have become our interfaces to online services. These interfaces, along with their underlying software & hardware platforms, have first-hand access to our PII. Evidently enough, this space is controlled by service providers whose revenue model is based on online targeted advertising. By providing features that are usually turned on by default, e.g., website screening for malware detection in browsers and improved data connectivity with the help of GPS on mobiles, they turn these interfaces into sensors that continuously report on user activity. By knowing the locale of these interfaces and their time-zones, it is straightforward to guess users' sleeping patterns, an important health factor that insurers may like to use to evaluate the risk of an individual.
Next in the pecking order are the content publishers. They have first-hand access to the transactional attributes of their users. Most content is delivered via HTTP, which is a stateless protocol. To maintain the state of a user across sessions over time, cookies are used. This helps publishers to track and learn user behavior. Publishers often support their content through the advertisement revenue they collect by allowing advertisers to target their users. Using third-party cookies, cross-publisher tracking of users is performed. Interfaces accept third-party cookies by default as a matter of convenience to users.
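The mechanism is simple enough to sketch: a single tracker embedded on many publishers sets one cookie in the browser, and every subsequent page view on any cooperating site carries that cookie back. The code below is our own toy model (the tracker, domains, and data structures are hypothetical, not any real ad network's code):

```python
# Toy model of third-party-cookie tracking: one cookie links a browser's
# visits across otherwise unrelated publishers.
import uuid
from collections import defaultdict
from typing import Optional

class Tracker:
    """Plays the role of tracker.example, embedded on many sites."""
    def __init__(self):
        self.profiles = defaultdict(list)   # cookie id -> visited pages

    def serve_pixel(self, cookie: Optional[str], page_url: str) -> str:
        if cookie is None:                  # first visit anywhere:
            cookie = uuid.uuid4().hex       # issue a fresh id (Set-Cookie)
        self.profiles[cookie].append(page_url)
        return cookie

tracker = Tracker()
c = tracker.serve_pixel(None, "https://news.example/politics")
c = tracker.serve_pixel(c, "https://shop.example/shoes")
c = tracker.serve_pixel(c, "https://health.example/sleep-aids")
print(tracker.profiles[c])  # one linked, cross-publisher browsing history
```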
Next in the pecking order are the connectivity providers. ISPs can (have to) track their users for various reasons. Though there is wide acceptance of the HTTPS protocol, they can still observe the metadata about the communication and, at times, the communication itself (Upturn, 2016; Dubin et al., 2016). ISPs have to collect PII like home/office address for billing, and details of the financial instrument used for payment. This auxiliary data has the potential to reveal an individual's credit rating, for example. Mobile ISPs have the advantage of knowing a customer's location in real time and call history. Mobile apps like Truecaller and Facebook Messenger are capable of collecting location and call logs, but these can be turned off by the user, unlike mobile ISPs, which collect logs unconditionally.
Next in the pecking order are the off-grid service providers. Service providers like Citibank, Visa, Uber, and Walmart record transactional data about their customers. These data sets, along with their metadata, are of interest to the collectors themselves and to others from the same industry domain or across domains. For example, (Cranshaw et al., 2010) establishes a relationship between users' mobility patterns (say, from Uber) and structural properties of their underlying social network.
Next in the pecking order are the advertisers & aggregators. They are responsible for targeting advertisements to their intended audience. The evolved revenue model is based on how many potential customers get converted into real customers, which greatly depends on accurately discovering the audience. Aggregators help link data points from different web events and off-grid events (BlueKai, 2010) and enrich the profiles of the intended audience.
Next in the pecking order are the analytics & intelligence providers. They work on huge troves of PII and off-grid auxiliary data sources in order to calculate/guess the intentions, future behaviors, values, preferences, habits, aspirations, etc. of an individual or an audience. It only takes a few data points to trace a set of web & off-grid events back to a real person. Facebook, in one of its experiments (Kramer et al., 2014), showed its capability to play with the emotional quotient of its users by tweaking its Newsfeed algorithm. A social network analytics and intelligence startup (linkfluence, 2017) offers to develop campaign tactics that identify pockets of undecided voters and try to win them over (linkfluence harvests 300 million sources and 100 million posts daily to derive these actionable insights).
Last in the pecking order are governments & law enforcement. They are the lawful users of PII and also its lawful collectors. In many countries they do not have the technical and financial wherewithal to collect data and derive actionable insights; they frequently make up for that shortcoming by subpoenaing (accessnow, 2017) the corporate giants. They find these corporate giants doing a complementary job of surveillance and intelligence gathering (Facebook has been described as something of an extra-territorial state run by a small, unelected government that relies extensively on privately held algorithms for social engineering (Bershidsky, 2017)). And therefore, plausibly, they may have no intention of disturbing the established pecking order.
The ability for individuals to interact online with-
out sacrificing their PII is a vital part of the Inter-
net’s value, and is intimately related to its trustwor-
thiness. Users are left with limited or inconvenient
options at their disposal to conduct themselves in a
PII-preserving fashion on the Internet.
Now these are the Laws of the Jungle, and
many and mighty are they; But the head and
the hoof of the Law and the haunch and the
hump is Obey! [Rudyard Kipling]
5 HOW TO PROTECT THE PII?
Today there is a constant battle between privacy ad-
vocates and advertisers, where advertisers try to push
new personalization technologies, and privacy advo-
cates try to stop them. However, as long as privacy
advocates are unable to propose an alternative person-
alization system that is private, this is a battle they are
destined to lose (Guha et al., 2011).
Privacy is about retaining the ability to disclose
data consensually, and with expectations about the
context and scope of sharing. Identifiability, linka-
bility of data, and the mining of vast quantities of ag-
gregated information all erode the individual’s ability
to manage disclosure, context and scope. Networks
depend on the use of unique (and often identifying)
numbers, and facilitate the instant global dissemina-
tion of information; increasingly, devices and appli-
cations gather and use geolocation data that builds up
into a unique profile for each user. A growing com-
mercial ecosystem based on targeted and behavioral
advertising results in an inexorable financial pressure
for service providers to exploit personal data. The pri-
vacy implications of the current Internet represent a
significant and growing concern (ISOC, 2017). The
concerted, persistent efforts of organizations to track
& target users will have detrimental consequences:
erosion of public trust (WEF, 2012) and inhibition of freedom of expression and freedom of action online (even offline, due to off-grid actors and the extent of technology integration in our daily lives). Before we delve into the solutions, we must understand the nature of information and the constraints it imposes.
5.1 Nature of PII
The issue of privacy comes to the fore as soon as an unintended observer observes a piece of information and learns something more, which could later be associated with the subject under observation. We have seen a range of intended and unintended observers in Section 4. User actions on online services are transactional; therefore, both parties involved in a transaction have access to, and ownership of, the incidental data arising out of it. Both parties must adhere to common expectations about the usage of this co-created data. It is difficult to verify or validate whether the expectations are respected, and therefore enforcement is difficult. Refuge is taken in legislation: the service provider, through its privacy policy, commits to the contexts in which a user's PII will be used or shared. In the presence of multiple observers (informed or lawful), the same events might be recorded by all of them. Therefore, tracing the leaker of recorded events becomes challenging. This fact indirectly encourages the observers to be lax/exploitative with users' PII.
Due to the architecture of our online ecosystem and the composition of modern interwoven web-services, a typical user request for a service spans different administrative domains that participate in the delivery of the web-service (for example, Amazon and FedEx). Each domain operates under its own stated privacy policy and geographic legislative assurances. There is no universal legislation for the Internet. The collected user data resides in different administrative domains with respective privacy oversights that are often difficult to verify until thoroughly investigated after a huge breach occurs.
User data is scattered across silos controlled by
different entities of the ecosystem. To enforce a uniform usage policy that is acceptable to all stakeholders of the ecosystem, the data needs to be available for uniform, consistent treatment. The incumbents who are greatly benefiting from the status quo have no incentive to contribute towards this effort.
5.2 Promising Approaches
5.2.1 Data Exchanges
Sensing the potential perils of PII violations, online services like respectnetwork.com have tapped into the business of providing privacy for PII. Such a service provides a platform where individuals can store and control their PII. In (Riederer et al., 2011), a mechanism called transactional privacy, applicable to the personal information of users, is provided: users decide what personal information about themselves is released and put on sale, receiving compensation for it (e.g., datacoup.com), while aggregators purchase access to exploit this information when serving advertisements to the user. Governments, being the biggest lawful collectors, consumers, and producers of data, are facilitating exchange of data (Eggers et al., 2013) under Open Data Initiatives. These platforms have the potential (WEF, 2014) to allow users to store, control, curate, and monetize data as per their preferences and intents. Such platforms bring in a level playing field, but the pecking order will continue to harness auxiliary data and metadata about users to get an edge over their competitors. Legislation can play a role in rectifying this behavior.
5.2.2 Legislation
There is a school of thought that believes governments should nudge corporations to realize the role of public trust in the success of the online economy. By enacting legislation, governments can force corporations to join hands and negotiate a strategy that contributes to bringing trustworthiness back to the Internet. Governments have methods (as they have used for the television industry) and tools (as for the telecommunications industry) at their disposal to bring in innovation through competition (Bergstein, 2017) and accessibility (Taplin, 2017). Legislation (e.g., the EU General Data Protection Regulation, GDPR) has geographic restrictions; it does not have universal reach. Legislation definitely has its utility, but it often falls short of coherent enforcement on the Internet.
5.2.3 Privacy-by-design
Unlike legislation, technology has a universal reach.
Privacy-by-design (Cavoukian, 2012) is an approach
where privacy expectations, out of the underlying
platform, are guaranteed through its design. Solu-
tions based on this approach are verifiable and en-
forceable. Privad (Private Verifiable Advertising) is
such an alternative system (Guha et al., 2011) for to-
day’s advertising ecosystem. FAITH (Facebook Ap-
plications: Identification, Transformation & Hyper-
visor) is another such approach (Lee et al., 2011)
designed for Facebook to mitigate privacy breaches
and manage social data. In (Juels, 2001; Hardt and
Nath, 2012; Guha et al., 2009; Cox et al., 2007) one
can find how systems can be designed to accommo-
date privacy concerns of users and audience measure-
ment requirements of platform owners. Security and
privacy enhancing measures usually come at a cost,
which is measured in terms of inconvenience. Users use information systems for certain primary objectives, and when pressed for time they tend to ignore/bypass the security or privacy enhancing measures in order to quickly achieve the primary objective (Kainda et al., 2010). Therefore, it is an important design consideration to keep privacy enhancing measures from getting in the way of users' primary objectives.
Privacy-as-a-Service
In privacy enhancing services like πBox (Lee et al., 2013), an external, communication-agnostic trusted platform is provided, where a user's PII is aggregated in a vault on the user's behalf and made available to advertisers. The platform takes the responsibility to safeguard the user's privacy. In a natural extension of this approach, there could be multiple, independent PII vault providers on a single platform. Extending it further, there could be multiple, independent platforms. Assuming interoperability of PII across vaults/platforms, there will be a competitive spirit among privacy service providers to innovate and offer user-driven privacy protections. The privacy-as-a-service approach moves users' trust away from a few powerful incumbents (like Facebook/Google) to a platform where users can observe, curate, and delete PII. One can imagine such platforms being offered by consortiums involving governments with regulatory oversight. In such a federated platform, guarantees about data access control, usage control, integrity, and trustworthiness will be desired.
Context-aware control via Blockchain
Owing to the design guarantees about trust and transparency emanating from blockchain algorithms, there are systems like Enigma (Zyskind et al., 2015) and MedRec (Azaria et al., 2016) that allow users to organize & store their PII in vaults in an encrypted format, while access rights and access policies (the contexts in which the rights should be honored) are specified through smart contracts on the blockchain. Though smart contracts provide flexible control over a user's PII through programmable privacy policies, a user would still require certain guarantees from the PII consumer regarding post-access usage of the PII. There could be scenarios in which users would like to participate in applications/surveys that require them to selectively declassify and desensitize their PII. This brings to life the following interesting challenges:
1. who should own and control the data generated out of interactions between two or more subjects?
2. how to express (Wilton, 2013) subtle inter-personal concepts like trust and discretion?
3. how to help a user in determining whether her intended online activity is privacy-preserving w.r.t. her current environment/state?
Since users do not trust PII consumers, the underlying platform has to safeguard PII against conflicts of interest and information leakage due to declassification.
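To make the smart-contract idea concrete, here is a minimal sketch in the spirit of the systems above (our own toy model, not Enigma's or MedRec's actual contract code; all names are hypothetical): an access right is recorded together with the context in which it should be honored, and every access re-checks that context.

```python
# Toy model of context-bound access rights over a PII vault entry:
# the grant and its context predicate live together, and every access
# re-evaluates the predicate.
import time

class PIIContract:
    def __init__(self, owner: str):
        self.owner = owner
        self.grants = {}                    # consumer -> context predicate

    def grant(self, by: str, consumer: str, context_ok) -> None:
        if by != self.owner:
            raise PermissionError("only the data owner can grant rights")
        self.grants[consumer] = context_ok

    def access(self, consumer: str, context: dict) -> bool:
        check = self.grants.get(consumer)
        return check is not None and check(context)

vault = PIIContract(owner="alice")
expiry = time.time() + 30 * 24 * 3600      # right lapses after 30 days
vault.grant("alice", "insurer",
            lambda ctx: ctx["purpose"] == "premium-quote"
                        and ctx["now"] < expiry)

now = time.time()
print(vault.access("insurer", {"purpose": "premium-quote", "now": now}))  # True
print(vault.access("insurer", {"purpose": "marketing", "now": now}))      # False
```

Note that such a contract can gate access, but, as argued above, it cannot by itself constrain what the consumer does with the PII after access.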
Information flow control via RWFM
Information flow security assures that there is no reverse flow (i.e., no transfer of high information to a low-level subject). The RWFM model (Narendra, N. V. and Shyamasundar, R. K., 2017) assures information flow security in a decentralized way and further introduces robust declassification rules (as opposed to other similar approaches), in the sense that a subject does not have the authority to declassify information to subjects who have not participated in creating the information thus far. These properties enable preserving privacy in the context of multi-level secure databases as well, in terms of views, transaction desensitization, declassification, etc. An important desired property is to ensure that any platform that collects and monetizes PII cannot simultaneously be both consumer and regulator of the collected data, i.e., a conflict of interest (Brewer and Nash, 1989; Narendra K and Shyamasundar, 2016) must be avoided.
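A minimal sketch of the declassification rule just described (our simplification of the RWFM label, which we model as an owner, a set of readers, and the set of subjects who influenced the information):

```python
# Sketch of an RWFM-style declassification check: information labeled
# (owner, readers, writers) may be declassified to a new reader only if
# that reader already appears among the writers, i.e., participated in
# creating the information. This label structure is our simplification.
from dataclasses import dataclass, field

@dataclass
class Label:
    owner: str
    readers: set = field(default_factory=set)   # who may read
    writers: set = field(default_factory=set)   # who influenced content

def can_declassify(label: Label, by: str, to: str) -> bool:
    """Only the owner may widen the reader set, and only to subjects who
    have participated in creating the information so far."""
    return by == label.owner and to in label.writers

post = Label(owner="B", readers={"A", "B"}, writers={"A", "B"})
print(can_declassify(post, by="B", to="A"))  # True: A co-created the content
print(can_declassify(post, by="B", to="C"))  # False: C never contributed
```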
Privacy-by-design approaches shift the onus of a policy's privacy guarantees from legislative assurance to technological assurance. Due to technology's universal reach, we believe these are the most promising and viable approaches.
To summarize, assuming that, in the near future,
our ecosystem converges towards tools and methods
for fair collection-and-usage of PII through a com-
mon platform, there will be a need for a cohesive ac-
cess control model for managing the PII throughout
its life-time. Some of the foreseeable expectations
from this platform would be: (i) assurance of the integrity of the intent and usage of PII, and (ii) a feedback loop alerting users to the privacy implications of their online activities against their stated privacy preferences on the platform. PII has become a currency and its usage
needs to be controlled. Once such control measures
are available, PII will be traded as a commodity on
this platform as any other commodity is traded in fi-
nancial markets.
6 CONCLUSION
Data should not be treated as Gold; it should be treated as Oil, because it not only has an intrinsic value but also has utility as a fuel to run the decision-making algorithmic engines. Its unfettered usage has the potential to pollute the online ecosystem and turn it into an unhealthy one. The fuel needs to be regulated and the engines need to be verifiably standardized to acceptable emission levels.
ACKNOWLEDGEMENT
The work was carried out as part of research at IS-
RDC (Information Security Research and Develop-
ment Center), supported by 15DEITY004, Ministry
of Electronics and Information Technology, Govt. of
India.
REFERENCES
accessnow (2017). Transparency Reporting Index.
https://www.accessnow.org/transparency-reporting-
index/. accessnow.org.
Angwin, J. (Jul 30, 2010). The web’s new gold mine:
Your secrets. https://www.wsj.com/articles/SB10001424052748703940904575395073512989404.
Wall Street Journal.
Azaria, A., Ekblaw, A., Vieira, T., and Lippman, A. (2016).
MedRec: Using Blockchain for Medical Data Access
and Permission Management. In 2
nd
International
Conference on Open and Big Data, pages 25–30.
Bergstein, B. (Apr 10, 2017). We Need More Al-
ternatives to Facebook. https://www.technology
review.com/s/604082/we-need-more-alternatives-to-
facebook/. MIT Technology Review.
Bershidsky, L. (Feb 17, 2017). Facebook Plans to
Rewire Your Life. Be Afraid. http://bv.ms/2lSQAtx.
Bloomberg.
Bilge, L., Strufe, T., Balzarotti, D., and Kirda, E. (2009).
All your contacts are belong to us: Automated iden-
tity theft attacks on social networks. In Proceedings
of the 18th International Conference on World Wide
Web, WWW ’09, pages 551–560. ACM.
BlueKai (2010). Take the mystery out of targeting.
http://www.bluekai.com/files/mediakitBlueKai.pdf.
The BlueKai Data Exchange.
Branstetter, G. (Feb 25, 2015). Why your privacy should
be a currency. https://www.dailydot.com/via/your-
privacy-should-be-a-currency/. The Daily Dot.
Brewer, D. F. C. and Nash, M. J. (1989). The chinese wall
security policy. In Proceedings of 1989 IEEE Sympo-
sium on Security and Privacy, pages 206–214.
Bronson, N., Amsden, Z., Cabrera, G., Chakka, P., Dimov,
P., Ding, H., Ferris, J., Giardullo, A., Kulkarni, S., Li,
H., Marchukov, M., Petrov, D., Puzar, L., Song, Y. J.,
and Venkataramani, V. (2013). TAO: Facebook’s Dis-
tributed Data Store for the Social Graph. In USENIX
ATC 13, pages 49–60.
Calandrino, J. A., Kilzer, A., Narayanan, A., Felten, E. W.,
and Shmatikov, V. (2011). “You might also like:” privacy risks of collaborative filtering. In Proceedings
of the 2011 IEEE Symposium on Security and Privacy,
S&P ’11, pages 231–246. IEEE Computer Society.
Cavoukian, A. (Oct 2012). Privacy by Design
and the Emerging Personal Data Ecosystem.
https://www.ipc.on.ca/wp-content/uploads/Resources/pbd-pde.pdf. Information and Privacy Commissioner, Ontario, Canada.
Constine, J. (Sep 06, 2016). How Facebook News Feed
Works. https://techcrunch.com/2016/09/06/ultimate-
guide-to-the-news-feed/. TechCrunch.
Cox, L. P., Dalton, A., and Marupadi, V. (2007). Smoke-
screen: Flexible privacy controls for presence-sharing.
In Proceedings of the 5th International Conference on
Mobile Systems, Applications and Services, MobiSys
’07, pages 233–245. ACM.
Cranshaw, J., Toch, E., Hong, J., Kittur, A., and Sadeh, N.
(2010). Bridging the gap between physical location
and online social networks. In Proceedings of the 12th
ACM International Conference on Ubiquitous Com-
puting, UbiComp ’10, pages 119–128. ACM.
Dewey, C. (Aug 19, 2016). 98 personal data
points that facebook uses to target ads to you.
http://wapo.st/2bz22Cb. The Washington Post.
Dubin, R., Dvir, A., Pele, O., and Hadar, O. (2016). I
know what you saw last minute - encrypted HTTP
adaptive video streaming title classification. CoRR,
abs/1602.00490.
Duhigg, C. (Feb 16, 2012). How companies learn
your secrets. http://www.nytimes.com/2012/
02/19/magazine/shopping-habits.html. New York
Times.
Economist (Apr 7, 2016a). Facebook, the world’s
most addictive drug. http://www.economist.com/
blogs/graphicdetail/2016/04/daily-chart-5. The
Economist.
Economist (Apr 9, 2016b). The new face of face-
book: How to win friends and influence people.
http://www.economist.com/news/briefing/21696507-
social-network-has-turned-itself-one-worlds-most-
influential-technology-giants. The Economist.
Eggers, W. D., Hamill, R., and Ali, A. (2013). Data as
the new currency: Government’s role in facilitating
the exchange. http://deloitte.wsj.com/ riskandcompli-
ance/files/2013/11/DataCurrency report.pdf. Deloitte
Review.
Ezrachi, A. and Stucke, M. (2016). Virtual Competition:
The Promise and Perils of the Algorithm-Driven Econ-
omy. Harvard University Press.
Facebook (2017a). Graph API Explorer. https://developers.facebook.com/tools/explorer/. Facebook for de-
velopers.
Facebook (2017b). How News Feed Works.
https://www.facebook.com/help/327131014036297.
Facebook.
FTC (Jan 2017). Cross-device tracking: A federal trade
commission staff report. https://www.ftc.gov/system/files/documents/reports/cross-device-tracking-federal-trade-commission-staff-report-january-2017/ftc_cross-device_tracking_report_1-23-17.pdf. Technical report, FTC, USA.
Gao, H., Hu, J., Huang, T., Wang, J., and Chen, Y. (2011).
Security issues in online social networks. IEEE Inter-
net Computing, 15(4):56–63.
Guha, S., Cheng, B., and Francis, P. (2011). Privad: Practi-
cal privacy in online advertising. In Proceedings of the
8th USENIX Conference on Networked Systems De-
sign and Implementation, NSDI’11, pages 169–182.
USENIX Association.
Guha, S., Reznichenko, A., Tang, K., Haddadi, H., and
Francis, P. (2009). Serving ads from localhost for per-
formance, privacy, and profit. In In Proc. of the 8th
Workshop on Hot Topics in Networks (HotNets 09).
Halpern, S. (Dec 22, 2016). They have, right
now, another you. http://www.nybooks.com/
articles/2016/12/22/they-have-right-now-another-
you/. The New York Review of Books.
Hardt, M. and Nath, S. (2012). Privacy-aware personal-
ization for mobile advertising. In Proceedings of the
2012 ACM Conference on Computer and Communi-
cations Security, CCS ’12, pages 662–673. ACM.
Hartzog, W. and Rubinstein, I. (2017). The anonymization
debate should be about risk, not perfection. Commun.
of the ACM, 60(5):22–24.
ISOC (2017). Your digital footprint matters.
https://www.internetsociety.org/your-digital-
footprint. Internet Society.
Jagatic, T. N., Johnson, N. A., Jakobsson, M., and Menczer,
F. (2007). Social phishing. Commun. of the ACM,
50(10):94–100.
Juels, A. (2001). Targeted advertising ... and privacy too.
In Proceedings of the 2001 Conference on Topics in
Cryptology: The Cryptographer’s Track at RSA, CT-
RSA 2001, pages 408–424. Springer-Verlag.
Kainda, R., Flechais, I., and Roscoe, A. W. (2010). Security
and usability: Analysis and evaluation. In ARES 2010,
Fifth International Conference on Availability, Reli-
ability and Security, 15-18 February 2010, Krakow,
Poland, pages 275–282.
Kramer, A. D. I., Guillory, J. E., and Hancock, J. T. (2014).
Experimental evidence of massive-scale emotional
contagion through social networks. Proceedings of the
National Academy of Sciences, 111(24):8788–8790.
Lee, R., Nia, R., Hsu, J., Levitt, K. N., Rowe, J., Wu, S. F.,
and Ye, S. (2011). Design & implementation of faith,
an experimental system to intercept and manipulate
online social informatics. In Int. Conf. on Advances in
Social Networks Analysis & Mining, pages 195–202.
Lee, S., Wong, E. L., Goel, D., Dahlin, M., and Shmatikov,
V. (2013). πbox: A platform for privacy-preserving
apps. In Proceedings of the 10th USENIX Symposium
on Networked Systems Design and Implementation,
NSDI 2013, pages 501–514.
linkfluence (2017). Social media intelligence for brands and
agencies. https://linkfluence.com/en/.
Liu, Y., Gummadi, K. P., Krishnamurthy, B., and Mislove,
A. (2011). Analyzing facebook privacy settings: User
expectations vs. reality. In Proceedings of the 2011
ACM SIGCOMM Conference on Internet Measure-
ment Conference, IMC ’11, pages 61–70. ACM.
Manjoo, F. (Apr 5, 2017). The online ad indus-
try is undergoing self-reflection. that’s good news.
https://nyti.ms/2oHtaWc. New York Times.
Marshall, C. C. and Shipman, F. M. (2017). Who owns the
social web? Commun. of the ACM, 60(5):52–61.
Narendra K, N. V. and Shyamasundar, R. K. (2016). De-
centralized information flow securing method and sys-
tem for multilevel security and privacy domains. US
Patent 9,507,929.
Narendra, N. V. and Shyamasundar, R. K. (2017). A Com-
plete Generative Label Model for Lattice-based Ac-
cess Control Models. In The 15th International Con-
ference on Software Engineering and Formal Meth-
ods, SEFM 2017, Trento, Italy, September 4-8, 2017,
pages xx–xx. Springer.
O’Neil, C. (2016). Weapons of Math Destruction: How Big
Data Increases Inequality and Threatens Democracy.
Crown/Archetype.
Patil, V. T. and Shyamasundar, R. K. (2017). Undoing of
Privacy Policies on Facebook. In Proceedings of 31st Annual IFIP WG 11.3 Conference on Data and Applications Security and Privacy (DBSec 2017), 10359,
pages xx–xx. Springer.
Popper, N. (March 23, 2017). Banks and tech firms
battle over something akin to gold: Your data.
https://nyti.ms/2mTx7ov. New York Times.
Riederer, C., Erramilli, V., Chaintreau, A., Krishnamurthy,
B., and Rodriguez, P. (2011). For sale: Your data, By:
You. In Proc. of the 10th ACM Workshop on Hot Top-
ics in Networks, HotNets-X, pages 13:1–13:6. ACM.
Saez-Trumper, D., Liu, Y., Baeza-Yates, R., Krishnamurthy,
B., and Mislove, A. (2014). Beyond CPM and CPC:
Determining the Value of Users on OSNs. In Proceed-
ings of the Second ACM Conference on Online Social
Networks, COSN ’14, pages 161–168. ACM.
SensorTower (2017). Top Non-Game Apps by Down-
loads and Revenue - Worldwide, Q1 2017.
https://sensortower.com/blog/top-apps-q1-2017.
SensorTower.
Taplin, J. (Apr 22, 2017). Is It Time to Break Up Google?
https://nyti.ms/2p7Emhp. New York Times.
Upturn (March 2016). What ISPs Can See. Technical Re-
port, Upturn.
WEF (Feb 2013). Unlocking the Value of Personal Data:
From Collection to Usage. http://www3.weforum.org/docs/WEF_IT_UnlockingValuePersonalData_CollectionUsage_Report_2013.pdf. World Economic Forum & The Boston Consulting Group.
WEF (May 2012). Rethinking Personal Data: Strengthen-
ing Trust. http://www3.weforum.org/docs/WEF_IT_RethinkingPersonalData_Report_2012.pdf. World Economic Forum & The Boston Consulting Group.
WEF (May 2014). Rethinking Personal Data: Trust
and Context in User-Centred Data Ecosystems.
http://www3.weforum.org/docs/WEF_RethinkingPersonalData_TrustandContext_Report_2014.pdf. World
Economic Forum.
Wilton, R. (2013). The Language of Privacy. http://www.
internetsociety.org/es/node/186003. Internet Society.
Zyskind, G., Nathan, O., and Pentland, A. (2015). De-
centralizing privacy: Using blockchain to protect per-
sonal data. In 2015 IEEE Security and Privacy Work-
shops, pages 180–184.