require manual operations by the system
operator, recovery from a system failure takes a
long time. This long downtime results in low
availability of the DB system.
Consequently, the probability that the time
interval in which data might be lost falls within the
RPO is strongly affected by the synchronization
interval. The probability increases as the ratio of the
RPO to the synchronization interval increases.
Meanwhile, as mentioned previously, a shorter
synchronization interval leads to lower performance
of the DB system.
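This relationship can be sketched as follows, under an illustrative assumption that is not part of the paper's model: the failure instant is uniformly distributed over a synchronization interval, so the unsynchronized window at failure time is uniform on [0, T].

```python
def prob_loss_within_rpo(rpo, sync_interval):
    """Probability that the potential data-loss window fits within the RPO.

    Illustrative assumption: the failure instant is uniformly distributed
    over a synchronization interval, so the amount of unsynchronized work
    at failure time is uniform on [0, sync_interval]. Then
    P(loss window <= RPO) = min(RPO / sync_interval, 1).
    """
    if sync_interval <= 0:
        raise ValueError("sync_interval must be positive")
    return min(rpo / sync_interval, 1.0)

# The probability grows with the ratio RPO / synchronization interval:
for t in (5.0, 10.0, 20.0):  # candidate synchronization intervals (minutes)
    print(t, prob_loss_within_rpo(2.0, t))
```

As the sketch shows, halving the synchronization interval doubles the probability (until it saturates at 1), which is the trade-off against performance discussed above.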
To handle the trade-off between performance
and availability, many techniques for determining
the optimal checkpoint interval in terms of
performance, availability, and reliability have been
studied. Many researchers proposed performance
models based on periodic checkpoints (e.g., (Dohi,
Ozaki and Kaio, 2002), (Young, 1974), (Chandy,
1975), (Baccelli, 1981), and (Gelenbe and
Hernandez, 1990)). Aperiodic checkpoint placement
methods that minimize the execution time of
programs or tasks were proposed in (Duda, 1983)
and (Toueg and Babaoglu, 1984), and methods for
identifying the cost-optimal checkpoint placement
can be found in (Fukumoto et al., 1992), (Ling et al.,
2001), (Dohi et al., 2002), (Ozaki et al., 2004), and
(Ozaki et al., 2006). However, these existing works
do not consider the effect of the time-consuming
manual resolution of data inconsistencies on
performability.
To address this issue, we propose a method to
identify a synchronization interval that optimizes
performability, taking into account the effect of the
time-consuming manual resolution of data
inconsistencies. The proposed method identifies the
optimal synchronization interval by solving a
stochastic reward net (SRN) model (Trivedi, 2001)
that describes the manual and automatic failure-
recovery behaviors of the DB system under a given
RPO. The proposed method is quantitatively
investigated through numerical examples of
identifying the performability-optimal
synchronization interval. The proposed method was
studied as part of the development of an in-house
model-based system design and non-functional-
property evaluation environment called CASSI
(Izukura et al., 2011). In the design phase of a
system, CASSI predicts performance and
availability based on analytic models that are
automatically synthesized from the system design
expressed in the Systems Modeling Language
(SysML). We have proposed several techniques for
this automatic model synthesis (e.g., (Machida et al.,
2011) and (Tadano et al., 2012)), and the model
proposed in this paper serves as an analytic model
that improves the prediction for DB systems.
This paper is organized as follows. Section 2
proposes the performability optimization method.
Section 3 shows numerical examples of the
proposed method. Section 4 gives a summary and
future directions.
2 OPTIMAL SYNCHRONIZATION INTERVAL IDENTIFICATION METHOD
This section describes the proposed method for
identifying the optimal synchronization interval in
terms of performability. To identify this interval
while taking into account the effect of the time-
consuming manual resolution of data
inconsistencies, a performability model representing
the manual and automatic failure-recovery behavior
of the DB system is introduced.
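As a minimal illustration of what such a model computes (a toy three-state Markov chain, not the SRN developed in this paper; all state names and rates below are hypothetical assumptions), performability can be obtained as throughput weighted by the steady-state probability of the up state:

```python
# Toy availability model with three states: up, automatic recovery, and
# manual recovery (resolving data inconsistencies). The manual-recovery
# rate is typically much smaller, which is what degrades performability.
def performability(lam, cov, mu_auto, mu_manual, throughput):
    """Steady-state performability of a three-state failure-recovery model.

    lam       : failure rate of the DB system (illustrative)
    cov       : fraction of failures recovered automatically
    mu_auto   : automatic recovery rate
    mu_manual : manual inconsistency-resolution rate (usually much smaller)
    throughput: reward rate earned while the system is up
    """
    # Balance equations express the recovery-state probabilities
    # relative to the up-state probability; then we normalize.
    p_auto = lam * cov / mu_auto            # pi_auto / pi_up
    p_manual = lam * (1.0 - cov) / mu_manual  # pi_manual / pi_up
    p_up = 1.0 / (1.0 + p_auto + p_manual)
    return throughput * p_up

# Slow manual resolution (small mu_manual) visibly lowers performability:
print(performability(0.01, 0.9, 2.0, 0.05, 100.0))
```

The key qualitative point carried over to the full SRN model is that time spent in manual recovery directly subtracts from the reward-earning up state.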
2.1 Overview
The proposed method identifies the optimal
synchronization interval based on the performability
model. As shown in Figure 1, in the proposed
method, the following steps are performed:
1. Input of the parameter values of the
performability model
2. Performability model analysis
3. Identification of the optimal
synchronization interval
4. Modification of design of the DB system
In Step 1, the system designer inputs the
parameter values of the performability model
according to the current design of the DB system.
In Step 2, the proposed method analyzes the
performability based on the performability model
with the input parameter values.
In Step 3, based on the analysis results, the
proposed method identifies the optimal
synchronization interval which maximizes
performability.
In Step 4, the system designer modifies the
design of the DB system based on the identified
optimal synchronization interval.
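Steps 1 through 3 can be sketched as a simple sweep over candidate intervals. The `evaluate_model` function below is a hypothetical stand-in for the SRN analysis, encoding only the qualitative trade-off (per-interval synchronization overhead versus the growing manual-recovery penalty); all parameter names and values are illustrative assumptions, not the paper's model.

```python
def evaluate_model(interval, sync_cost=1.0, manual_penalty=0.05, base=100.0):
    """Hypothetical performability of a candidate synchronization interval.

    Shorter intervals pay more synchronization overhead per unit time;
    longer intervals increase the expected manual-recovery penalty.
    """
    overhead = sync_cost / interval      # synchronization overhead term
    risk = manual_penalty * interval     # manual-recovery penalty term
    return base - overhead - risk

def optimal_interval(candidates, model=evaluate_model):
    """Step 3: pick the candidate that maximizes the analyzed performability."""
    return max(candidates, key=model)

# Steps 1-2: evaluate the model over a grid of candidate intervals.
candidates = [i / 2 for i in range(1, 41)]   # 0.5, 1.0, ..., 20.0
best = optimal_interval(candidates)
```

Under this toy trade-off the optimum lies near the square root of the ratio of the two cost coefficients; the actual method replaces `evaluate_model` with the solution of the SRN model.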
MODELSWARD 2013 - International Conference on Model-Driven Engineering and Software Development