
known, but they are not well-suited for conflict resolution in OSNs. More fine-grained solutions use majority
or consensus protocols (Alshareef et al., 2020; Carmi-
nati and Ferrari, 2011) or more complex strategies,
like (Hu et al., 2013) where majority or consensus is
applied after a vote based on weighted preferences
and trust levels of the controllers. Other works apply
game theory notions to negotiation for conflict resolu-
tion (Rajtmajer et al., 2016). However, these methods
have limitations when applied to OSNs. Negotiation to
agree on a decision creates additional burden for users.
Game theory is not flexible enough to represent human
behaviour due to social idiosyncrasies (Such and Cri-
ado, 2015), even if some works base their games on
trade-off between privacy and utility (Hu et al., 2014).
Some works focus on access conflict management
for pictures, proposing solutions to blur faces (Ilia
et al., 2015) or parts of the picture (Vishwamitra et al.,
2017). However, the generalisation of these methods
to other types of content is not easy.
Other works base their resolution method on thresh-
old values. These values are generally based on trust
levels assigned to users or on sensitivity levels associ-
ated with objects. For example, in (Hu et al., 2013),
the authors propose, in addition to the previously cited
vote strategy, to use the computed decision value as a
threshold to authorize or deny the access. In (Such and Criado, 2015), the authors propose to estimate each controller's willingness to change their decision, which is then used as a threshold to modify the controllers' decisions. These methods may overload users with value calculations and often require manual parameter setting, which can be nontrivial. An alternative solution is
the assistant ELVIRA (Mosca and Such, 2022) which
implements a negotiation protocol without burdening
the controllers. However, the negotiation is done at the level of sets of users, losing granularity.
Most of the methods in the literature are based
on some common decision-making factors we briefly
describe next. Some of them are used also in the
solution we propose (see Sec. 4.1).
Trust: trust can be defined as the belief that a
person is acting sincerely and won’t do any harm. In
conflict resolution methods, the trust or tie strength
(Such and Criado, 2015) represents the degree of trust
a user has in another one. Quantifying this concept between users unknown to each other is difficult and has led to the design of dedicated algorithms for OSNs, such as TidalTrust (Golbeck, 2005).
Privacy Preferences: users may have different
points of view about personal information protection,
usually reflected in their default policies. The privacy
preferences give an estimation of the level of impor-
tance that privacy has for a user (Hu et al., 2011).
Item Sensitivity: the degree of sensitivity of an
object is linked to the concept of privacy. The item
sensitivity for a controller indicates the harm that can be done to them if the object falls into the wrong hands. This value can be arbitrary, depending only on the controller's personal point of view (Hu et al., 2011), or it can be computed from the policies, relations
and trust among users (Such and Criado, 2015).
Importance of a Conflict: the importance of a conflict measures how much a user cares that the conflict is resolved in conformance with their own preferences. This notion is used together with the item sensitivity in (Such and Criado, 2015) to compute a user's willingness to change their decision.
Sharing Gain vs Privacy Risk: the privacy risk
estimates the risk of being harmed if the object is ac-
cessed by some requesters. It is usually used to balance
the sharing gain, also called share loss (Hu et al., 2011)
or sharing utility (Mosca and Such, 2022). The sharing
gain measures the social advantages, such as maintain-
ing relationships or mutual empathy (Krasnova et al.,
2010), a user can have by giving access to an object.
3 OSN REPRESENTATION
Our work relies on threshold-based strategies built on some decision-making factors, which are computed automatically from the social network model. The model is composed of two graphs representing, respectively, the interpersonal relations between users and the multi-management of data.
A social network is basically composed of a set of
interconnected users and their shared content. Users
in OSN have a personal space (profile page, wall,...)
where they can share information such as texts, photos
or videos. Users are connected to each other through
interpersonal relationships, as for instance follower,
friend, close friend, etc. Relationships may be asym-
metric: for example, a user can follow someone, but
not be followed back. Users have privacy preferences
on what the other users can do with their information
and on their personal spaces. These preferences are
formalized as policies and are used to evaluate access
requests. We denote Pol_u(u', o) the evaluation result of the policy of the user u for the object o when the requester is u'. This expression can be evaluated to 0 (or 1), meaning the access is denied (or permitted) following this policy.
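As an illustration, such a binary policy evaluation could be sketched as follows; the allow-list structure and names are our own assumptions, not the paper's implementation:

```python
# Hypothetical sketch: a user's policy decides, for a (requester, object)
# pair, whether access is permitted (1) or denied (0).

def make_policy(allowed):
    """Build a simple allow-list policy: permit only listed
    (requester, object) pairs, deny everything else."""
    def pol(requester, obj):
        return 1 if (requester, obj) in allowed else 0
    return pol

# Example: user u permits u' to access photo1 but nothing else.
pol_u = make_policy({("u_prime", "photo1")})
print(pol_u("u_prime", "photo1"))  # -> 1 (access permitted)
print(pol_u("u_prime", "photo2"))  # -> 0 (access denied)
```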
User relationships at a given time t are represented by a labelled directed graph S = (U, E_s, L_s), where U are the nodes, E_s the edges, and L_s a set of labels.
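A minimal sketch of such a labelled directed graph, using plain Python dictionaries (the class and attribute names are illustrative assumptions):

```python
# Hypothetical sketch of S = (U, E_s, L_s): users as nodes,
# labelled directed edges as relationships.
class SocialGraph:
    def __init__(self):
        self.nodes = set()   # U: the users
        self.edges = {}      # E_s with labels from L_s: (u, v) -> label

    def relate(self, u, v, label):
        """Add a directed labelled edge u -> v.
        Relations may be asymmetric: u -> v does not imply v -> u."""
        self.nodes.update((u, v))
        self.edges[(u, v)] = label

    def is_related(self, u, v):
        return (u, v) in self.edges

# Asymmetric example: alice follows bob, but bob does not follow alice.
g = SocialGraph()
g.relate("alice", "bob", "follower")
print(g.is_related("alice", "bob"))  # -> True
print(g.is_related("bob", "alice"))  # -> False
```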
Nodes represent users and a directed edge, labeled
isRelated
, exists between two nodes if the first one is