
dexes. In Proceedings of the ACM International Con-
ference on Web Search and Data Mining (WSDM).
Ding, S. and Suel, T. (2011). Faster top-k document re-
trieval using block-max indexes. In Proceedings of
the Annual International ACM SIGIR Conference on
Research and Development in Information Retrieval
(SIGIR).
Fan, R., Chang, K., Hsieh, C., X.R., W., and Lin, C.
(2008). Liblinear: A library for large linear classifica-
tion. Journal of Machine Learning Research, 9:1871–
–1874.
Harrell, F. (2001). Ordinal logistic regres-
sion. (In Regression Modeling Strategies).
Springer:Berlin/Heidelberg,Germany.
Hosmer, D. and Lemeshow, S. (2000). R.X. Applied Logis-
tic Regression. John Wiley and Sons: Hoboken, NJ,
USA.
Ifrim, G., Bakir, G., and Weikum, G. (2008). Fast logistic
regression for text categorization with variable-length
n-grams. In Proceedings of the International Con-
ference on Knowledge Discovery and Data Mining
(KDD).
Ifrim, G. and Wiuf, C. (2010). Bounded coordinate-descent
for biological sequence classification in high dimen-
sional predictor space. In Proceedings of the Interna-
tional Conference on Knowledge Discovery and Data
Mining (KDD).
Joachims, T. (1998). Text categorization with support vector
machines: Learning with many relevant features. In
In Proceedings of the 10th European Conference on
Machine Learning, pages 137—-142.
Johnson, M., Schuster, M., Le, Q., Krikun, M., Wu, Y.,
Chen, Z., Thorat, N., Viégas, F., Wattenberg, M., Cor-
rado, G., Hughes, M., and Dean, J. (2017). Google’s
multilingual neural machine translation system: En-
abling zero-shot translation. Transactions of the As-
sociation for Computational Linguistics, pages 339–
351.
Joulin, A., Grave, E., Bojanowski, P., and Mikolov, T.
(2017). Bag of tricks for efficient text classification.
In In Proceedings of the 15th Conference of the Eu-
ropean Chapter of the Association for Computational
Linguistics.
Kim, Y., Hahn, S., and Zhang, B. (2000). Text filtering
by boosting naive bayes classifiers. In In Proceedings
of the 23rd Annual International ACM SIGIR Confer-
ence on Research and Development in Information Re-
trieval.
Knuth, D., Morris, J., and Pratt, V. (1977). Fast pattern
matching in strings. SIAM Journal on Computing,
6(2):323—-350.
Kudo, T., Maeda, E., and Matsumoto, Y. (2005). An applica-
tion of boosting to graph classification. In Advances
in Neural Information Processing Systems 17 , pages
729–736. MIT Press.
Mamitsuka, H. and Naoki, A. (1998). Query learning strate-
gies using boosting and bagging. In Proceedings of the
Fifteenth International Conference on Machine Learn-
ing (ICML98).
Medhat, W., Hassan, A., and Korashy, H. (2014). Sentiment
analysis algorithms and applications: A survey. Ain
Sham Engineering Journal, 5:1093—-1113.
Mikolov, T., Chen, K., Corrado, G., and Dean, J. (2013).
Efficient estimation of word representations in vector
space. In In Proceedings of the 1st International Con-
ference on Learning Representations.
Pang, B. and Lee, L. (2008). Opinion mining and sentiment
analysis. Foundations and Trends in Information Re-
trieval, 2(1-2):1––135.
Schapire, R. and Singer, Y. (2000). Boostexter: A boosting-
based system for text categorization. Mach. Learn.,
39:135—-168.
Vaswani, A., Shazeer, N., Parmar, N., Uszkoreit, J., Jones,
L., Gomez, A., Kaiser, L., and Polosukhin, I. (2017).
Attention is all you need. In In Proceedings of the
31th Conference on Neural Information Processing
Systems.
Wang, S. and Manning, C. (2012). Baselines and bigrams:
Simple, good sentiment and topic classification. In In
Proceedings of the 50th Annual Meeting of the Associ-
ation for Computational Linguistics, pages 90––94.
ICPRAM 2024 - 13th International Conference on Pattern Recognition Applications and Methods
214