where $c(x)_+ = \max\{c(x),0\}$, and $\nu > 0$ is a fixed parameter. It is easy to see that $F(x,x) = 0$, where either the left branch $f(\cdot) - f(x) - \nu c(x)_+$ or the right branch $c(\cdot) - c(x)_+$ in the expression of $F(\cdot,x)$ is active at $x$, i.e., attains the maximum, depending on whether $x$ is feasible for (12) or not. If $x$ is infeasible, meaning $c(x) > 0$, then the right-hand term in the expression of $F(\cdot,x)$ is active at $x$, whereas the left-hand term equals $-\nu c(x) < 0$ at $x$. Reducing $F(\cdot,x)$ below its value $0$ at the current $x$ therefore reduces the constraint violation. If $x$ is feasible, meaning $c(x) \le 0$, then the left-hand term in $F(\cdot,x)$ becomes dominant, so reducing $F(\cdot,x)$ below its current value $0$ at $x$ now reduces $f$ while maintaining feasibility; this is where the true optimization of $f$ takes place.
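The case analysis above can be checked numerically. The following is a minimal sketch; the objective $f$, constraint $c$, and the value of $\nu$ are illustrative assumptions, not taken from the paper:

```python
# Hypothetical smooth test data for program (12): minimize f s.t. c <= 0.
def f(x):
    return x[0] ** 2 + x[1] ** 2

def c(x):
    return x[0] - 1.0          # feasible set: x[0] <= 1

nu = 0.5                       # fixed parameter nu > 0

def F(y, x):
    """Progress function F(y,x) = max{f(y) - f(x) - nu*c(x)_+, c(y) - c(x)_+}."""
    cxp = max(c(x), 0.0)
    return max(f(y) - f(x) - nu * cxp, c(y) - cxp)

x_feas = (0.5, 0.0)            # c = -0.5 <= 0, feasible
x_infeas = (2.0, 0.0)          # c =  1.0 >  0, infeasible

print(F(x_feas, x_feas), F(x_infeas, x_infeas))   # both 0.0

# At the infeasible point, pushing F(., x) below 0 reduces the violation:
y = (1.5, 0.0)
print(F(y, x_infeas) < 0.0, c(y) < c(x_infeas))   # True True
```

In both cases $F(x,x) = 0$, and the point $y$ that makes $F(y,x_{\text{infeas}})$ negative indeed has a smaller constraint violation than $x_{\text{infeas}}$.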
Observe that if $x^*$ is a local minimum of program (12), it is also a local minimum of $F(\cdot,x^*)$, and then $0 \in \partial_1 F(x^*,x^*)$. The symbol $\partial_1$ here stands for the Clarke subdifferential with respect to the first variable. Indeed, if $x^*$ is a local minimum of (12) then $c(x^*) \le 0$, and so for $y$ in a neighborhood of $x^*$ we have
$$
F(y,x^*) = \max\{f(y) - f(x^*),\, c(y)\} \ge f(y) - f(x^*) \ge 0 = F(x^*,x^*),
$$
where the second inequality uses local optimality of $x^*$ for feasible $y$; for infeasible $y$ one has $F(y,x^*) \ge c(y) > 0$ directly.
This implies that $x^*$ is a local minimum of $F(\cdot,x^*)$, and therefore $0 \in \partial_1 F(x^*,x^*)$. We now present Algorithm 3 for computing solutions of program (5). The convergence theory of the iterative Algorithm 3 is discussed in (Gabarrou et al., 2013; Noll, 2010), and based on these results we can prove the following theorem.
Theorem 1. Assume that the functions $f$ and $c$ in program (12) are lower-$C^1$ and satisfy the following conditions:

(i) $f$ is weakly coercive on the constraint set $\Omega = \{x \in \mathbb{R}^{n_x} : c(x) \le 0\}$ in the sense that if $x^j \in \Omega$ and $\|x^j\| \to \infty$, then $f(x^j)$ is not monotonically decreasing.

(ii) $c$ is weakly coercive in the sense that if $\|x^j\| \to \infty$, then $c(x^j)$ is not monotonically decreasing.

Then the sequence $x^j$ of serious iterates generated by Algorithm 3 is bounded, and every accumulation point $x^*$ of the $x^j$ satisfies $0 \in \partial_1 F(x^*,x^*)$.
Notice that $f = \|\cdot\|_H^2 \circ G(\cdot)$ is a composite function of a semi-norm and a smooth mapping $x \mapsto G(x)$, which implies that it is lower-$C^2$, and therefore also lower-$C^1$ in the sense of (Rockafellar and Wets, 1998, Definition 10.29). Theoretical properties of the spectral abscissa $c(x)$, used in the constraint, have been studied in (Burke and Overton, 1994). Lower-$C^2$ functions cover the preponderant part of non-smooth functions encountered in applications. Convergence
Algorithm 3: Proximity control with downshift.

Parameters: $0 < \gamma < \widetilde{\gamma} < 1$, $0 < \gamma < \Gamma < 1$, $0 < q < \infty$, $0 < c < \infty$.

1: Initialize outer loop. Choose initial iterate $x^1$ and matrix $Q_1 = Q_1^\top$ with $-qI \preceq Q_1 \preceq qI$. Initialize memory control parameter $\tau_1^\sharp$ such that $Q_1 + \tau_1^\sharp I \succ 0$. Put $j = 1$.
2: Stopping test. At outer loop counter $j$, stop if $0 \in \partial_1 F(x^j,x^j)$. Otherwise, go to inner loop.
3: Initialize inner loop. Put inner loop counter $k = 1$, initialize $\tau_1 = \tau_j^\sharp$, and build the initial working model $F_1(\cdot,x^j)$ using matrix $Q_j$.
4: Trial step generation. Compute
$$
y^k = \operatorname*{arg\,min}_{y} \; F_k(y,x^j) + \frac{\tau_k}{2}\,\|y - x^j\|^2.
$$
5: Acceptance test. If
$$
\rho_k = \frac{F(y^k,x^j)}{F_k(y^k,x^j)} \ge \gamma,
$$
put $x^{j+1} = y^k$ (serious step), quit the inner loop and go to step 8. Otherwise (null step), continue the inner loop with step 6.
6: Update working model. Generate a cutting plane $m_k(\cdot,x^j) = a_k + g_k^\top(\cdot - x^j)$ at null step $y^k$ and counter $k$ using downshifted tangents. Compute the aggregate plane $m_k^*(\cdot,x^j) = a_k^* + g_k^{*\top}(\cdot - x^j)$ at $y^k$, and then build the new working model $F_{k+1}(\cdot,x^j)$.
7: Update proximity control parameter. Compute the secondary control parameter
$$
\widetilde{\rho}_k = \frac{F_{k+1}(y^k,x^j)}{F_k(y^k,x^j)}
$$
and put
$$
\tau_{k+1} =
\begin{cases}
\tau_k & \text{if } \widetilde{\rho}_k < \widetilde{\gamma},\\
2\tau_k & \text{if } \widetilde{\rho}_k \ge \widetilde{\gamma}.
\end{cases}
$$
Increase the inner loop counter $k$ and loop back to step 4.
8: Update $Q_j$ and memory element. Update matrix $Q_j \to Q_{j+1}$ respecting $Q_{j+1} = Q_{j+1}^\top$ and $-qI \preceq Q_{j+1} \preceq qI$. Then store the new memory element
$$
\tau_{j+1}^\sharp =
\begin{cases}
\tau_k & \text{if } \rho_k < \Gamma,\\
\tfrac{1}{2}\tau_k & \text{if } \rho_k \ge \Gamma.
\end{cases}
$$
Increase $\tau_{j+1}^\sharp$ if necessary to ensure $Q_{j+1} + \tau_{j+1}^\sharp I \succ 0$. Increase the outer loop counter $j$ and loop back to step 2.
ICINCO 2013 - 10th International Conference on Informatics in Control, Automation and Robotics