Annealing by Increasing Resampling in the Unified View of Simulated Annealing

Yasunobu Imamura¹, Naoya Higuchi¹, Takeshi Shinohara¹, Kouichi Hirata¹ and Tetsuji Kuboyama²
¹Kyushu Institute of Technology, Kawazu 680-4, Iizuka 820-8502, Japan
²Gakushuin University, Mejiro 1-5-1, Toshima, Tokyo 171-8588, Japan
Keywords: Annealing by Increasing Resampling, Simulated Annealing, Logit, Probit, Meta-heuristics, Optimization.
Abstract: Annealing by Increasing Resampling (AIR) is a stochastic hill-climbing optimization that evaluates an objective function by resampling with increasing sample size. In this paper, we introduce a unified view of conventional Simulated Annealing (SA) and AIR. In this view, we generalize both SA and AIR to stochastic hill-climbing for objective functions with stochastic fluctuations, given by the logit and probit functions, respectively. Since the logit function is approximated by the probit function, we show that AIR can be regarded as an approximation of SA. Experimental results on sparse pivot selection and annealing-based clustering also support the view that AIR approximates SA. Moreover, when an objective function requires a large number of samples, AIR is much faster than SA without sacrificing the quality of the results.
1 INTRODUCTION
Similarity search is an important task for information retrieval in high-dimensional data spaces. Dimensionality reduction methods such as SIMPLE-MAP (Shinohara and Ishizaka, 2002) and Sketch (Dong et al., 2008) are known to be effective approaches for efficient indexing and fast searching. In dimensionality reduction, we have to select a small number of axes with low distortion from the original space. This optimal selection gives rise to a hard combinatorial optimization problem.
Simulated annealing (SA) (Kirkpatrick and Gelatt Jr., 1983) is known to be one of the most successful methods for solving combinatorial optimization problems. It is a metaheuristic search method for finding an approximately optimal value of an objective function. SA starts at a high temperature, moving over a wide range of the search space by random walk. Then, by cooling the temperature slowly, it narrows the range of the search so that it finally converges toward the global optimum.
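The cooling process described above can be sketched as follows. This is a generic, textbook-style SA loop on a toy one-dimensional objective; the objective, neighbor move, and schedule parameters are illustrative assumptions, not the setup used in this paper:

```python
import math
import random

def simulated_annealing(objective, neighbor, x0, t0=1.0, cooling=0.995, steps=5000):
    """Minimal SA sketch: always accept improvements, accept worse moves
    with probability exp(-delta/T), and cool T geometrically."""
    x, fx = x0, objective(x0)
    t = t0
    for _ in range(steps):
        y = neighbor(x)
        fy = objective(y)
        # Boltzmann acceptance: worse moves pass with probability exp((fx - fy)/t)
        if fy <= fx or random.random() < math.exp((fx - fy) / t):
            x, fx = y, fy
        t *= cooling  # slow geometric cooling narrows the effective search range
    return x, fx

# toy usage: minimize a quartic with two basins, starting far from both minima
random.seed(0)
best, val = simulated_annealing(
    objective=lambda x: (x * x - 1.0) ** 2 + 0.3 * x,
    neighbor=lambda x: x + random.uniform(-0.5, 0.5),
    x0=2.0,
)
```

At high temperature nearly every move is accepted (random walk); as `t` shrinks, the exponential acceptance probability for uphill moves vanishes and the loop degenerates into a local search.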
On the other hand, we present a method called annealing by increasing resampling (AIR), originally introduced for sparse pivot selection in SIMPLE-MAP as a hill-climbing algorithm that resamples with increasing sample size and evaluates pivots at every resampling (Imamura et al., 2017). AIR is suitable for optimization problems in which sampling is used to reduce computational cost and the value of the objective function is given by the average of the evaluations over the samples. For example, in the pivot selection problem (Bustos et al., 2001), the objective function is the average of the pairwise distances in the pivot space over a set of samples, and pivots are selected so as to maximize this average.
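As a rough illustration of such a sampled objective, the sketch below averages pairwise L∞ distances in the pivot space over a sample. The projection, the distance choices, and the data are hypothetical, only loosely modeled on SIMPLE-MAP-style pivot projection, and do not reproduce the exact objective used in the paper:

```python
import itertools
import math
import random

def project(point, pivots, dist):
    # map a point to its vector of distances to the selected pivots
    return [dist(point, p) for p in pivots]

def sampled_objective(pivots, sample, dist):
    """Average pairwise L_inf distance in the pivot space over a sample.
    Larger values mean the projection preserves more of the original
    distances, i.e., lower distortion."""
    proj = [project(x, pivots, dist) for x in sample]
    total, pairs = 0.0, 0
    for a, b in itertools.combinations(proj, 2):
        total += max(abs(u - v) for u, v in zip(a, b))
        pairs += 1
    return total / pairs

# toy usage: 2-D points with Euclidean distance in the original space
random.seed(0)
points = [(random.random(), random.random()) for _ in range(10)]
score = sampled_objective(points[:2], points[2:], math.dist)
```

Because the objective is a sample average, its value fluctuates from one random sample to the next, which is exactly the stochastic fluctuation AIR exploits.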
In the early stages of AIR, the sample size is small, so the local optimum is unstable and moves drastically, because AIR replaces the previous sample with an independent one at every resampling. In the later stages, the sample size has grown and the local optimum becomes stable; this part of the process resembles a conventional hill-climbing algorithm. The larger the sample size grows, the smaller the error in the evaluation becomes. At the final stage, AIR works like a local search, as SA does; in other words, AIR realizes SA-like behavior. In addition, AIR is superior to SA in computational cost, especially when the sample size required to evaluate the objective function is very large, because AIR uses small samples in the early stages, for which the evaluation can be done in a very short time.
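This increasing-resampling loop can be sketched as follows. The initial sample size, growth rate, and toy objective (a sample-mean estimate of E[(x − u)²] for u uniform on [0, 1], minimized at x = 0.5) are illustrative assumptions, not parameters taken from the paper:

```python
import random

def air(sample_objective, neighbor, x0, n0=8, growth=1.05, n_max=4096):
    """Minimal AIR sketch: hill-climb while comparing the current and
    candidate solutions on a fresh random sample whose size grows each
    step.  Small samples early on make acceptance noisy (exploration);
    large samples late make it nearly exact (exploitation)."""
    x = x0
    n = float(n0)
    while n <= n_max:
        sample = [random.random() for _ in range(int(n))]  # fresh, independent resample
        y = neighbor(x)
        # both solutions are evaluated on the SAME sample, so only the
        # sampling noise in their difference drives random acceptance
        if sample_objective(y, sample) <= sample_objective(x, sample):
            x = y
        n *= growth  # the growing sample size shrinks the evaluation error
    return x

# toy usage: minimize the sampled mean of (x - u)^2, true minimizer x = 0.5
random.seed(1)
obj = lambda x, sample: sum((x - u) ** 2 for u in sample) / len(sample)
best = air(obj, lambda x: x + random.uniform(-0.2, 0.2), x0=2.0)
```

The growing sample size plays the role of SA's falling temperature: early iterations accept moves almost at random, while late iterations compare near-exact objective values.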
In previous work (Imamura et al., 2017), we introduced AIR for a specific problem, pivot selection. In this paper, we show that AIR is applicable as a more general optimization method through the unified
Imamura, Y., Higuchi, N., Shinohara, T., Hirata, K. and Kuboyama, T.
Annealing by Increasing Resampling in the Unified View of Simulated Annealing.
DOI: 10.5220/0007380701730180
In Proceedings of the 8th International Conference on Pattern Recognition Applications and Methods (ICPRAM 2019), pages 173-180
ISBN: 978-989-758-351-3
Copyright © 2019 by SCITEPRESS – Science and Technology Publications, Lda. All rights reserved