Deep Learning for Radar Pulse Detection
Ha Q. Nguyen, Dat T. Ngo and Van Long Do
Viettel Research and Development Institute, Hoa Lac High-tech Park, Hanoi, Vietnam
Keywords:
Deep Neural Network, Radar Pulse Detection, Radar Pulse Parameter Estimation, Pulse Description Word,
Change Point Detection, Pruned Exact Linear Time.
Abstract:
In this paper, we introduce a deep-learning-based framework for the sequential detection of rectangular radar pulses with varying waveforms and pulse widths under a wide range of noise levels. The method is divided into two stages. In the first stage, a convolutional neural network is trained to determine whether a pulse, or part of a pulse, appears in a segment of the signal envelope. In the second stage, the change points in the segment are found by solving an optimization problem and then combined with previously detected edges to estimate the pulse locations. The proposed scheme is noise-blind as it does not require a noise floor estimation, unlike the threshold-based edge detection (TED) method. Simulations also show that our method significantly outperforms TED in highly noisy cases.
1 INTRODUCTION
The detection of radar pulses—or the estimation of the times of arrival (TOAs) and the times of departure (TODs)—plays a central role in passive location systems as it provides input for other algorithms to locate the emitter (Torrieri, 1984; Poisel, 2005). This is a challenging task since radar pulses are modulated and coded with a variety of waveforms and are often buried in noise. Existing methods for radar pulse detection are usually threshold-based (Torrieri, 1974; Iglesias et al., 2014), in which the thresholds are determined via an estimation of the noise statistics. These methods work well at high or moderate Signal-to-Noise Ratios (SNRs) but perform poorly at low SNRs. Furthermore, the noise floor estimation—a prerequisite of these algorithms—is itself a hard problem, especially in highly varying environments.
In the past few years, Deep Learning (LeCun et al., 2015; Goodfellow et al., 2016) has proved to be a powerful tool for many tasks in computer vision and signal processing, notably image classification (Krizhevsky et al., 2012; Szegedy et al., 2015; He et al., 2016) and object detection (Ren et al., 2017; Redmon et al., 2016; Liu et al., 2016). Motivated by these successes, we propose a novel method for radar pulse detection in which edges are sequentially estimated from segments of the received signal envelope via a deep-learning-based segment classification followed by a find-change-points algorithm (Killick et al., 2012). The segment classification essentially determines whether a pulse is present, partially present, or absent in a segment through a Convolutional Neural Network (CNN). For a segment of small enough length, it can only fall into one of five categories: ‘2 edges’, ‘TOA only’, ‘TOD only’, ‘All pulse’, and ‘All noise’. Based on the output of the CNN, the find-change-points routine seeks edges in the segment by minimizing a cost function associated with the number of edges, instead of using a thresholding procedure. This approach therefore dispenses with the unreliable noise floor estimation. Illustrative sketches of the two stages are given after the contribution list below. The contributions of our paper are summarized as follows.
• A novel CNN architecture for segment classification.
• An algorithm for adaptive segment classification in which the CNN predicts the class of the current segment based on the confidence of its previous prediction.
• An algorithm for sequential pulse localization that combines the segment classification with a find-change-points algorithm. This method significantly surpasses the performance of the Threshold-based Edge Detection (Iglesias et al., 2014), especially in low-SNR regimes.
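To make the first stage concrete, the listing below is a minimal sketch of a five-class segment classifier built from generic 1-D convolutional layers in PyTorch. The class name, layer sizes, and segment length are placeholders chosen for illustration; they do not correspond to the CNN architecture proposed in Sec. 3.

# Generic 1-D CNN for five-class segment classification (illustrative only;
# the architecture used in this paper is described in Sec. 3).
import torch
import torch.nn as nn

NUM_CLASSES = 5  # '2 edges', 'TOA only', 'TOD only', 'All pulse', 'All noise'

class SegmentClassifier(nn.Module):
    def __init__(self, segment_len=1024):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=7, padding=3), nn.ReLU(), nn.MaxPool1d(4),
            nn.Conv1d(16, 32, kernel_size=7, padding=3), nn.ReLU(), nn.MaxPool1d(4),
            nn.Conv1d(32, 64, kernel_size=7, padding=3), nn.ReLU(), nn.MaxPool1d(4),
        )
        self.classifier = nn.Linear(64 * (segment_len // 64), NUM_CLASSES)

    def forward(self, x):  # x: (batch, 1, segment_len) envelope segments
        h = self.features(x)
        return self.classifier(h.flatten(1))  # unnormalized class scores

# Example: classify a batch of 8 random segments.
model = SegmentClassifier()
scores = model(torch.randn(8, 1, 1024))
probs = scores.softmax(dim=1)  # per-class confidences, usable by an adaptive scheme

The softmax outputs provide the per-class confidences that an adaptive classification scheme, such as the one described in Sec. 3, can feed back into the prediction for the next segment.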
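For the second stage, the find-change-points step of Killick et al. (2012), i.e. the Pruned Exact Linear Time (PELT) algorithm, is available, for example, in the open-source Python package ruptures. The sketch below runs PELT on a synthetic envelope segment; the noise levels, segment layout, and penalty value are placeholders for illustration and are not the cost-function settings used in this paper.

# Minimal change-point sketch using the 'ruptures' package, which implements
# PELT (Killick et al., 2012). Penalty and segment values are placeholders.
import numpy as np
import ruptures as rpt

rng = np.random.default_rng(0)

# Synthetic envelope segment: noise, then a rectangular pulse, then noise.
segment = np.concatenate([
    rng.normal(0.0, 0.1, 300),   # noise floor
    rng.normal(1.0, 0.1, 400),   # pulse (elevated envelope)
    rng.normal(0.0, 0.1, 300),   # noise floor
])

# PELT with a mean-shift (L2) cost; the penalty trades data fit against the
# number of detected edges, so no noise-floor threshold is needed.
algo = rpt.Pelt(model="l2", min_size=20).fit(segment)
change_points = algo.predict(pen=10.0)  # e.g. [300, 700, 1000]; last entry is the segment end

# If the CNN labels the segment '2 edges', the first two change points are the
# candidate TOA and TOD sample indices.
toa_idx, tod_idx = change_points[0], change_points[1]
print(toa_idx, tod_idx)

The key design point, as stated above, is that the edges are obtained by minimizing a penalized cost rather than by comparing the envelope against an estimated noise floor.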
The rest of the paper is outlined as follows: Sec. 2 formulates the problem. Sec. 3 constructs the CNN and integrates it into an adaptive algorithm for segment classification. Sec. 4 presents the main algorithm.