3.2 Propagation of Focusing Parameters
With (10) and (12) we are able to calculate the change of focusing at a single bounding surface. As the parameters $k_m$ and $k_s$ are defined relative to a certain intersection point $P_i$, they have to be remapped to the next intersection point $P_{i+1}$ before applying these equations for the next surface. In the following, $k^+$ denotes the position of a focus point (meridional or sagittal) after this remapping. With the Euclidean distance $s$ between the two reference points, the propagation rule for the focus point description is:
$$\frac{1}{k^+} = \frac{1}{k} - s. \qquad (18)$$
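As a minimal numerical sketch of the remapping rule (18) (the function name and the numbers are illustrative, with distances in arbitrary length units):

```python
def propagate_focus(k, s):
    """Remap a focus-point parameter k (measured from P_i) to the next
    reference point P_{i+1} at Euclidean distance s, via 1/k+ = 1/k - s."""
    return 1.0 / (1.0 / k - s)

# a focus 2 units from P_i, remapped over a distance of 0.25 units:
# 1/k+ = 1/2 - 0.25 = 0.25, hence k+ = 4
print(propagate_focus(2.0, 0.25))  # -> 4.0
```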
In order to propagate the change of sagittal focusing dependent on $t$, which is influenced by the surfaces' curvatures, we need the first-order Taylor approximation of (18). With the linear approximations $k(t) \approx k_0 + k_t t$ and $s(t) \approx s_0 + s_t t$ we get:
$$k^+(t) \approx k^+_0 + k^+_t\, t = \frac{k_0}{1 - s_0 k_0} + \frac{k_t + s_t k_0^2}{(1 - s_0 k_0)^2}\, t. \qquad (19)$$
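The coefficients in (19) follow from expanding the exact remapping $k^+(t) = k(t)/(1 - s(t)k(t))$ to first order. As a quick sanity check (the parameter values below are arbitrary test numbers, not taken from the paper), the value and slope predicted by (19) can be compared against a central finite difference:

```python
# With k(t) = k0 + kt*t and s(t) = s0 + st*t, the exact remapped focus is
# k_plus(t) = k(t) / (1 - s(t)*k(t)); equation (19) gives its value and
# slope at t = 0.
k0, kt, s0, st = 1.5, 0.3, 0.2, 0.1

def k_plus(t):
    k, s = k0 + kt * t, s0 + st * t
    return k / (1.0 - s * k)

# coefficients predicted by (19)
k_plus_0 = k0 / (1.0 - s0 * k0)
k_plus_t = (kt + st * k0**2) / (1.0 - s0 * k0)**2

h = 1e-6
slope_fd = (k_plus(h) - k_plus(-h)) / (2 * h)  # central difference at t = 0

assert abs(k_plus(0.0) - k_plus_0) < 1e-12
assert abs(slope_fd - k_plus_t) < 1e-6
```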
By alternating between the calculation of focusing at a bounding surface and the propagation of the corresponding parameters to the next surface, we can calculate the object-sided focusing parameters $k_m$, $k_s$ and $k_{s,t}$. As already mentioned, $k_s$ cannot be influenced directly via the surfaces' curvatures.
If we want to focus to infinity, we have to satisfy

$$k_m \overset{!}{=} 0 \quad \text{and} \quad k_{s,t} \overset{!}{=} 0 \qquad (20)$$

and in addition initially ensure that $k_s = 0$.
3.3 Single-viewpoint Condition
Additional conditions can be satisfied if the optical system design has enough degrees of freedom. For example, we can demand a single viewpoint, which means that all object-sided principal rays intersect in one point $V = (0, 0, v)^T$ on the optical axis. A single viewpoint is necessary if we want to remap a captured image to other projection models, like the cylindrical projection, without knowledge of scene depth. In order to satisfy the single-viewpoint condition, the exit angle $\varphi(t)$ must satisfy

$$\varphi(t) = \arctan\frac{\zeta_N - v}{\rho_N}. \qquad (21)$$
So we have to demand

$$\varphi_t(t) - \frac{\rho_N\, \zeta_{N,t} - (\zeta_N - v)\, \rho_{N,t}}{(\rho_N)^2 + (\zeta_N - v)^2} \overset{!}{=} 0. \qquad (22)$$
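The fraction in (22) is exactly the $t$-derivative of the right-hand side of (21), obtained via the chain rule for $\arctan$ and the quotient rule. This can be verified numerically; the trajectories $\rho_N(t)$ and $\zeta_N(t)$ below are hypothetical smooth placeholders, not taken from an actual system:

```python
import math

# Check that the fraction in (22) equals d/dt arctan((zeta_N - v)/rho_N).
v = 0.5
rho  = lambda t: 1.0 + 0.2 * t                       # hypothetical rho_N(t)
zeta = lambda t: 2.0 - 0.3 * t                       # hypothetical zeta_N(t)
phi  = lambda t: math.atan((zeta(t) - v) / rho(t))   # equation (21)

t0, h = 0.4, 1e-6
rho_t, zeta_t = 0.2, -0.3                            # derivatives w.r.t. t

# analytic derivative, as it appears in (22)
num = rho(t0) * zeta_t - (zeta(t0) - v) * rho_t
den = rho(t0)**2 + (zeta(t0) - v)**2
d_analytic = num / den

d_fd = (phi(t0 + h) - phi(t0 - h)) / (2 * h)         # central difference
assert abs(d_analytic - d_fd) < 1e-6
```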
3.4 Combined Root Finding Problem
With (20) and (22) we have three root-finding problems that share a common parameter vector (17). The task of finding the corresponding parameter vector can be formulated as a multi-dimensional nonlinear least squares problem. To do so, we simply sum up the squared conditions. Such a nonlinear least squares problem can be solved using the Levenberg-Marquardt algorithm. Starting from an initial guess, this algorithm combines the Gauss-Newton algorithm with gradient descent to robustly find a local minimum. This local minimum should also be the global minimum, with a sum of squared errors equal to zero. If the value at the local minimum is non-zero, the initial root-finding problem was not solved properly and we have not found a valid solution.
In general, more than one global minimum exists. A different minimum corresponds to a different shape of the final system and often comes along with a flipped inside-outside characteristic.
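The solution procedure can be sketched with SciPy's Levenberg-Marquardt implementation. The residuals below are illustrative placeholders standing in for the stacked conditions (20) and (22), not the actual optical equations; the point is the pattern of stacking the conditions, minimizing, and checking that the final cost is (numerically) zero:

```python
from scipy.optimize import least_squares

# Toy stand-in for the combined root-finding problem: three conditions
# sharing one parameter vector x, stacked as residuals and driven to
# zero with Levenberg-Marquardt.
def residuals(x):
    return [x[0] + x[1] + x[2] - 3.0,   # placeholder condition 1
            x[0] * x[1] - 1.0,          # placeholder condition 2
            x[0]**2 - x[2]]             # placeholder condition 3

sol = least_squares(residuals, x0=[0.5, 0.5, 2.0], method='lm')

# sol.cost is half the sum of squared residuals; a value near zero means
# a genuine root was found, a clearly non-zero value means no valid
# solution (as discussed in the text).
print(sol.x, sol.cost)
```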
3.5 Final ODE System
The parameter vector (17) and the equations (14) and (22) define a set of differential equations for $\rho_i$, $\zeta_i$ and $\varphi$ ($i = 2 \ldots N$). Given appropriate initial values, this system of ordinary differential equations can be solved via numerical integration with standard methods like the Runge-Kutta methods.

It has to be mentioned that a valid solution cannot be found for all sets of initial values. Sometimes it is simply not possible to satisfy the conditions, or the solution's range of validity is not sufficiently large. However, for appropriate initial values the calculation of the optical system is very fast.
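The numerical-integration step can be sketched with SciPy's ODE solver; a toy scalar ODE stands in for the actual system in $\rho_i$, $\zeta_i$ and $\varphi$. The `dense_output` interpolant is analogous to the interpolated ODE solution later used for ray tracing:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Toy ODE y' = -2*y with y(0) = 1 (exact solution: y = exp(-2t)),
# integrated with an explicit Runge-Kutta method.
sol = solve_ivp(lambda t, y: -2.0 * y, t_span=(0.0, 1.0), y0=[1.0],
                method='RK45', dense_output=True, rtol=1e-9, atol=1e-12)

# evaluate the continuous interpolant between the integration steps
y_half = sol.sol(0.5)[0]
assert abs(y_half - np.exp(-1.0)) < 1e-6
```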
4 SIMULATION RESULTS
In the last section we presented a construction scheme for optical systems that directly considers meridional and sagittal focus as well as a single viewpoint. This section shows ray tracing results for a system draft that was calculated using this scheme. Ray tracing was performed using the spline-interpolated numerical ODE solution. Material characteristics leading to chromatic dispersion are considered as well.

As the construction scheme currently does not consider chromatic aberration, we limited the refracting entry and exit surfaces in such a way that the principal rays traverse them perpendicularly. To bundle the beams of rays on the image sensor we use two standard achromatic lenses (see Figure 5).
VISAPP 2013 - International Conference on Computer Vision Theory and Applications