Figure 8: Real data example – behaviour of unbounded (blue) and bounded (green) parameter estimates. The estimated parameter of the reduced model is plotted in red.
determination of rules for construction of the fictive G, it was possible to introduce limiting functions for the mentioned statistical moments and to integrate them into the recursive algorithm for bounded parameter estimation. In addition, parallel estimation of the full and reduced models makes it possible to minimize the prediction error in each estimation step.
The behaviour of the estimator was illustrated first on simulated data and then on real data taken from a cold rolling mill.
REFERENCES
Åström, K. J. and Kumar, P. (2014). Control: A perspective. Automatica, 50:3–43.
Benavoli, A., Chisci, L., Farina, A., Ortenzi, L., and Zappa, G. (2006). Hard-constrained vs. soft-constrained parameter estimation. IEEE Transactions on Aerospace and Electronic Systems, 42(4):1224–1239.
Ettler, P. and Andrýsek, J. (2007). Mixing models to improve gauge prediction for cold rolling mills. In Proceedings of the 12th IFAC Symposium on Automation in Mining, Mineral and Metal Processing, Québec, Canada.
Ettler, P. and Kárný, M. (2010). Parallel estimation respecting constraints of parametric models of cold rolling. In Proceedings of the 13th IFAC Symposium on Automation in Mineral, Mining and Metal Processing (IFAC MMM 2010), pages 63–68, Cape Town, South Africa.
Kárný, M. (1982). Recursive parameter estimation of regression model when the interval of possible values is given. Kybernetika, 18(2):164–178.
Kárný, M., Böhm, J., Guy, T., Jirsa, L., Nagy, I., Nedoma, P., and Tesař, L. (2005). Optimized Bayesian Dynamic Advising: Theory and Algorithms. Springer, London.
Kopylev, L. (2012). Constrained parameters in applications: Review of issues and approaches. ISRN Biomathematics, 2012:Article ID 872956.
Kulhavý, R. and Zarrop, M. B. (1993). On a general concept of forgetting. International Journal of Control, 58(4):905–924.
Mandelkern, M. (2002). Setting confidence intervals for bounded parameters. Statistical Science, 17(2):149–172.
Milanese, M., Norton, J., Piet-Lahanier, H., and Walter, E., editors (1996). Bounding Approaches to System Identification. Springer.
Murakami, K. and Seborg, D. E. (2000). Constrained parameter estimation with applications to blending operations. Journal of Process Control, 10:195–202.
Norton, J. P. (1987). Identification and application of bounded-parameter models. Automatica, 23(4):497–507.
Peterka, V. (1981). Bayesian approach to system identification. In Eykhoff, P., editor, Trends and Progress in System Identification. Pergamon Press, Eindhoven, Netherlands.
Toulias, T. L. and Kitsos, C. P. (2014). On the properties of the generalized normal distribution. Discussiones Mathematicae Probability and Statistics, 34(1-2):35–49.
APPENDIX
Generalized Normal Distribution
The symmetric version of the generalized normal distribution G is defined by three parameters: $\mu_G$ (location), $\alpha$ (scale) and $\beta$ (shape).

The pdf of G is given by
$$ f_G(x \mid \mu_G, \alpha, \beta) = \frac{\beta}{2\alpha\,\Gamma(1/\beta)} \exp\!\left(-\left(\frac{|x - \mu_G|}{\alpha}\right)^{\beta}\right), \qquad (62) $$
where $\alpha > 0$, $\beta > 0$ and $\Gamma$ denotes the gamma function
$$ \Gamma(x) = \int_0^{\infty} t^{x-1} \exp(-t)\,\mathrm{d}t. \qquad (63) $$
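As a minimal numerical sketch of how (62) can be evaluated, assuming only the standard-library `math` module (the function name `gnorm_pdf` is illustrative, not from the paper):

```python
import math

def gnorm_pdf(x, mu, alpha, beta):
    """Pdf of the symmetric generalized normal distribution, eq. (62)."""
    # beta / (2 * alpha * Gamma(1/beta)) * exp(-(|x - mu| / alpha)^beta)
    norm_const = beta / (2.0 * alpha * math.gamma(1.0 / beta))
    return norm_const * math.exp(-(abs(x - mu) / alpha) ** beta)

# For beta = 2 the density coincides with N(mu, alpha^2 / 2):
mu, sigma = 0.3, 1.2
alpha = sigma * math.sqrt(2.0)   # alpha^2 / 2 = sigma^2
x = 0.8
normal = math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))
print(abs(gnorm_pdf(x, mu, alpha, 2.0) - normal))   # ~0 up to rounding

# For large beta the density approaches the uniform 1/(2*alpha) on (mu - alpha, mu + alpha):
print(gnorm_pdf(0.0, 0.0, 1.0, 200.0))              # close to 0.5
```

The two printed checks mirror the limiting cases discussed below equation (62).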
For $\beta = 2$, G coincides with the normal distribution $\mathcal{N}(\mu_G, \alpha^2/2)$. For $\beta \to \infty$, G converges pointwise to the uniform density on $(\mu_G - \alpha, \mu_G + \alpha)$.
The cdf of G is given by
$$ F_G(x \mid \mu_G, \alpha, \beta) = \frac{1}{2}\left[ 1 + \frac{\operatorname{sgn}(x - \mu_G)}{\Gamma(1/\beta)}\, \gamma\!\left(1/\beta, \left(\frac{|x - \mu_G|}{\alpha}\right)^{\beta}\right) \right], \qquad (64) $$
where $\gamma$ denotes the lower incomplete gamma function
$$ \gamma(x, x_0) = \int_0^{x_0} t^{x-1} \exp(-t)\,\mathrm{d}t. \qquad (65) $$
The remaining properties of G can be found, for example, in (Toulias and Kitsos, 2014).