The differences in the final MSE(n) for the M-SMFTF and SFRLS algorithms are due to the use of different forgetting factors.
4.2 The Reduced M-SMFTF Case
In this simulation, we compare the convergence performance of the reduced-size-predictor M-SMFTF algorithm with that of the NLMS algorithm. Figure 4 presents the results obtained with the speech signal, sampled at 16 kHz, for the filter order L=256. We simulated an abrupt change in the impulse response at the 56320th sample. We use the following parameters: the predictor order is P=20 and the forgetting factor is λ_a = 1 − 1/P. From this plot, we observe that the re-convergence of the M-SMFTF is again faster than that of the NLMS.
Figure 4: Comparative performance of the M-SMFTF and NLMS with speech input, L=256; M-SMFTF: P=20, λ_a=0.950, λ=0.99, c_a=0.1, E_0=1; NLMS: μ=1.
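To make the experimental protocol of Figure 4 easier to reproduce, the sketch below sets up a comparable system-identification run for the NLMS reference alone: an unknown FIR response of length L=256 that changes abruptly at sample 56320, identified with the standard NLMS recursion (μ=1, as in Figure 4), with the learning curve MSE(n) estimated by exponential smoothing of the squared error. The white-noise input, the random impulse responses, the observation-noise level and the smoothing constant are illustrative assumptions; the original experiment uses a 16 kHz speech signal, and the M-SMFTF itself is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

L = 256             # adaptive filter order, as in Figure 4
N = 100_000         # number of input samples
change_at = 56_320  # sample index of the abrupt impulse-response change
mu, delta = 1.0, 1e-6   # NLMS step size (mu = 1 in Figure 4) and regularization

x = rng.standard_normal(N)                       # white-noise stand-in for the speech input
h_before = rng.standard_normal(L) / np.sqrt(L)   # illustrative "true" echo path
h_after = rng.standard_normal(L) / np.sqrt(L)    # echo path after the abrupt change

w = np.zeros(L)           # adaptive filter coefficients
mse = np.zeros(N)         # smoothed learning curve MSE(n)
beta, power = 0.999, 0.0  # smoothing constant and running power of e(n)

for n in range(L - 1, N):
    xn = x[n - L + 1:n + 1][::-1]               # regression vector [x(n), ..., x(n-L+1)]
    h = h_before if n < change_at else h_after
    d = h @ xn + 1e-3 * rng.standard_normal()   # desired (echo) signal plus weak observation noise
    e = d - w @ xn                              # a priori output error
    w = w + mu * e * xn / (xn @ xn + delta)     # standard NLMS coefficient update
    power = beta * power + (1.0 - beta) * e * e
    mse[n] = power

print("MSE(n) just before the change: %.1f dB" % (10 * np.log10(mse[change_at - 1])))
print("MSE(n) at the end of the run:  %.1f dB" % (10 * np.log10(mse[-1])))
```

The drop and recovery of MSE(n) around the change sample gives the NLMS re-convergence behaviour against which the M-SMTF curve of Figure 4 is compared.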
Simulations have been carried out for different sizes L and P, and all these results show that there is no degradation in the final steady-state MSE(n) of the reduced-size-predictor algorithm, even for P<<L. The convergence speed and tracking capability of the reduced-size-predictor algorithm can be adjusted through the choice of the parameters λ, λ_a and c_a.
5 CONCLUSIONS
We have proposed a further complexity reduction of the SMFTF algorithm (M-SMFTF) by using a new recursive method to compute the likelihood variable. The computational complexity of the M-SMFTF algorithm is 6L operations per sample, and it can be reduced significantly, to (2L+4P), when the algorithm is used with a reduced P-size forward predictor (P<<L). The low computational complexity of the M-SMFTF when dealing with long filters, together with its performance capabilities, makes it very attractive for applications such as acoustic echo cancellation. The simulations have shown that the performance of the M-SMFTF algorithm is better than that of the NLMS algorithm. The M-SMFTF outperforms classical adaptive algorithms: its convergence speed approaches that of the RLS algorithm, while its computational complexity is only slightly greater than that of the NLMS algorithm.
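As a quick check of these operation counts for the configuration of Figure 4 (L=256, P=20), the small calculation below uses the 6L and (2L+4P) figures quoted above; the ~2L count used for the NLMS is a common textbook approximation, included only for scale and not taken from this paper.

```python
# Operation counts per sample for the Figure 4 configuration.
L, P = 256, 20

full_msmftf = 6 * L             # M-SMFTF with full-size predictor: 6L
reduced_msmftf = 2 * L + 4 * P  # M-SMFTF with reduced P-size predictor: 2L + 4P
nlms_approx = 2 * L             # rough NLMS figure (assumption, for scale only)

print(full_msmftf, reduced_msmftf, nlms_approx)   # -> 1536 592 512
```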