
data characteristics into attention mechanisms. This novel approach represents a step forward in enhancing the predictive capabilities of machine learning models, laying the foundation for more sophisticated and adaptive forecasting methodologies.