fine tuning and more sophisticated models could lead to improved results. Nevertheless, we have shown that using an LSTM network to predict trading behavioral features is a sound choice, since it improves prediction performance. The detector's performance is not yet outstanding, but several avenues for improvement have already been identified within the team.
First, as emphasized in the last part of the paper, the definition of the behavioral features is key. Since we focus on transactional data, the chosen features must capture more aspects of the structure of the transactions. Furthermore, the contextual data could be enriched with additional market data, information about traders' communications, or data linked to economic news announcements.
Second, implementing the first prospect above will multiply the number of series in both the input and output vectors. For this reason, a more sophisticated prediction model would probably be needed, such as DeepAR (Salinas et al., 2017) or an attention-based LSTM network (Qin et al., 2017).
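To illustrate this prospect, the sketch below shows a plain multi-output LSTM forecasting several feature series at once; it is only an assumed baseline (the library choice, window length, number of series and layer sizes are illustrative and not those of our detector), which DeepAR or an attention mechanism would replace.

```python
# Minimal sketch (assumption): a multi-output LSTM predicting several
# behavioral feature series from a window of past observations.
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

n_steps, n_series = 30, 8            # illustrative lookback window and number of series

model = Sequential([
    LSTM(64, input_shape=(n_steps, n_series)),
    Dense(n_series)                  # one prediction per behavioral series
])
model.compile(optimizer="adam", loss="mse")

# Dummy data standing in for windows of behavioral/contextual series.
X = np.random.rand(256, n_steps, n_series)
y = np.random.rand(256, n_series)
model.fit(X, y, epochs=2, batch_size=32, verbose=0)
```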
Finally, the parameters of the detector would have to be optimized before the methodology could enter any production phase. To do so, either experts invest time to help us analyze the transactions detected under different parameter settings in an iterative process, or we design a generator of abnormal transactions that must be identified in order to calibrate the detector. This parametrization could also involve a more robust distance to define the anomaly score, as in (Cabana et al., 2019).
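As a hedged sketch of such a robust score, the example below uses scikit-learn's Minimum Covariance Determinant estimator as a stand-in for the shrinkage-based robust Mahalanobis distance of Cabana et al. (2019); the data, dimensions and threshold are purely illustrative.

```python
# Sketch (assumption): robust Mahalanobis-style anomaly score on prediction errors.
import numpy as np
from scipy.stats import chi2
from sklearn.covariance import MinCovDet

rng = np.random.default_rng(0)
errors = rng.normal(size=(500, 4))   # dummy prediction errors on 4 behavioral series

mcd = MinCovDet(random_state=0).fit(errors)
scores = mcd.mahalanobis(errors)     # squared robust Mahalanobis distances

# Flag observations above a chi-square quantile as candidate anomalies.
threshold = chi2.ppf(0.999, df=errors.shape[1])
anomalies = np.where(scores > threshold)[0]
```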
REFERENCES
Borovkova, S. and Tsiamas, I. (2019). An ensemble of LSTM neural networks for high-frequency stock market classification. Journal of Forecasting.
Cabana, E., Lillo, R., and Laniado, H. (2019). Multivariate outlier detection based on a robust Mahalanobis distance with shrinkage estimators. Statistical Papers.
Chan, P. and Mahoney, M. (2005). Modeling multiple time series for anomaly detection. In Fifth IEEE International Conference on Data Mining (ICDM'05).
Chandola, V., Cheboli, D., and Kumar, V. (2009). Detecting anomalies in a time series database. Computer Science Department TR 09-004, University of Minnesota.
Elman, J. (1990). Finding structure in time. Cognitive Science.
Filzmoser, P. (2004). A multivariate outlier detection method.
Goodfellow, I., Bengio, Y., and Courville, A. (2016). Deep Learning. MIT Press. http://www.deeplearningbook.org.
Hochreiter, S. and Schmidhuber, J. (1997). Long Short-Term Memory. Neural Computation, 9.
Lanbouri, Z. and Achchab, S. (2019). A new approach for trading based on Long-Short Term Memory technique. International Journal of Computer Science Issues.
Mahalanobis, P. (1936). On the generalized distance in statistics. Proceedings of the National Institute of Sciences (Calcutta).
Malhotra, P., Vig, L., Shroff, G., and Agarwal, P. (2015). Long Short Term Memory networks for anomaly detection in time series. In ESANN.
Marchi, E., Vesperini, F., Weninger, F., Eyben, F., Squartini, S., and Schuller, B. (2015). Non-linear prediction with LSTM recurrent neural networks for acoustic novelty detection. In 2015 International Joint Conference on Neural Networks (IJCNN).
Qin, Y., Song, D., Chen, H., Cheng, W., Jiang, G., and Cottrell, G. (2017). A dual-stage attention-based recurrent neural network for time series prediction. CoRR.
Salehinejad, H., Sankar, S., Barfett, J., Colak, E., and Valaee, S. (2018). Recent advances in Recurrent Neural Network (RNN). ArXiv, 1801.01078.
Salinas, D., Flunkert, V., and Gasthaus, J. (2017). DeepAR: Probabilistic forecasting with autoregressive recurrent networks. ArXiv, 1704.04110.
Salvador, C., Chan, P., and Brodie, J. (2004). Learning states and rules for detecting anomalies in time series. In FLAIRS Conference.
Sang, C. and Pierro, M. D. (2019). Improving trading technical analysis with TensorFlow Long Short-Term Memory (LSTM) neural network. The Journal of Finance and Data Science.
Sherstinsky, A. (2018). Fundamentals of Recurrent Neural Network (RNN) and Long Short-Term Memory (LSTM) network. ArXiv, 1808.03314.
Thi, N., Cao, V., and Le-Khac, N. (2018). One-class collective anomaly detection based on Long Short-Term Memory recurrent neural networks. ArXiv, 1802.00324.
Troiano, L., Villa, E. M., and Loia, V. (2018). Replicating a trading strategy by means of LSTM for financial industry applications. IEEE Transactions on Industrial Informatics, 14.
Yankov, D., Keogh, E., and Rebbapragada, U. (2008). Disk aware discord discovery: Finding unusual time series in terabyte sized datasets. Knowledge and Information Systems, 17.