confidence in the robustness of the satellite on-board
image processing application.
6 CONCLUSION
Due to the complex computations performed by satellite on-board image processing applications, it is difficult to find test cases that provoke mission-critical behavior in a potentially huge input domain. In this paper, we have presented a genetic algorithm that is specifically tailored to automatically find test cases that provoke real-time critical behavior or scenarios where the mathematical accuracy becomes critically low.
To achieve this, we have defined a novel two-
criteria fitness function that is based on execution
time and mathematical accuracy of a given satellite
on-board image processing application. Using this function, our genetic algorithm automatically steers the search toward test cases that provoke long execution times, inaccurate results, or both. The tester is able to
specify which criterion has more impact on the fitness
value of a test case. Moreover, the tester specifies the input parameters of the genetic algorithm, for example, the population size and the termination conditions. This
makes our genetic algorithm flexible and adaptable to
different test goals and various on-board image pro-
cessing applications. Further, the search space and
individual representation are based on the partitioning
of input parameters into equivalence classes. Areas that are not relevant to solutions are eliminated because redundant test cases are removed, which makes our search faster.
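The two building blocks summarized above, a weighted two-criteria fitness function and individuals encoded as equivalence-class choices, can be sketched as follows. This is a minimal illustration, not the paper's actual implementation: the weight w, the normalization bounds, and the example equivalence classes are all assumptions.

```python
import random

def fitness(exec_time, accuracy_error, w=0.5, max_time=1.0, max_error=1.0):
    """Combine normalized execution time and accuracy error.

    w in [0, 1] lets the tester decide which criterion has more impact:
    w = 1 rewards long execution times only, w = 0 rewards inaccurate
    results only. Both raw values are normalized to [0, 1] against
    assumed upper bounds before being combined.
    """
    time_score = min(exec_time / max_time, 1.0)
    error_score = min(accuracy_error / max_error, 1.0)
    return w * time_score + (1.0 - w) * error_score

# Illustrative equivalence classes: each input parameter is partitioned
# into value ranges; an individual selects one class index per parameter.
EQUIV_CLASSES = {
    "star_position": [(0, 128), (128, 256)],    # pixel ranges (assumed)
    "magnitude":     [(5.0, 8.0), (8.0, 11.0)], # stellar magnitudes (assumed)
}

def random_individual():
    """Encode an individual as one equivalence-class index per parameter."""
    return {p: random.randrange(len(c)) for p, c in EQUIV_CLASSES.items()}

def sample_test_case(individual):
    """Draw one concrete test input from each selected equivalence class."""
    case = {}
    for param, idx in individual.items():
        lo, hi = EQUIV_CLASSES[param][idx]
        case[param] = random.uniform(lo, hi)
    return case
```

Because individuals range over class indices rather than raw input values, the search space shrinks and test cases drawn from the same class are treated as redundant, which matches the speed-up argument above.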
To demonstrate the efficiency of our genetic ap-
proach, we have investigated the capability of the al-
gorithm to automatically find test cases that support
robustness testing of a given satellite on-board image
processing application, namely the FGS algorithm as
an application with high criticality for the PLATO
mission. In our experiments, our genetic algorithm automatically evolves test cases that yield higher execution times and lower mathematical accuracy of the FGS algorithm than random testing does.
In this paper, we have considered the TASTE value
as a qualitative measure of mathematical accuracy. To
investigate the accuracy of the application more pre-
cisely, we plan to additionally consider errors of the
results, for example, angle errors for each axis, as
criteria for the mathematical accuracy. Furthermore,
we have evaluated our approach by means of a single
satellite on-board image processing application. Due to the flexibility of our approach, its suitability for other applications, for example, blob feature extraction in the robotics domain, can be investigated.
A Genetic Algorithm for Automated Test Generation for Satellite On-board Image Processing Applications