REFERENCES
Amato, D., Giancarlo, R., and Bosco, G. L. (2021). Learned
sorted table search and static indexes in small space:
Methodological and practical insights via an experi-
mental study. CoRR, abs/2107.09480.
Bloom, B. H. (1970). Space/time trade-offs in hash cod-
ing with allowable errors. Commun. ACM, 13(7):422–
426.
Boffa, A., Ferragina, P., and Vinciguerra, G. (2021).
A “learned” approach to quicken and compress
rank/select dictionaries. In Proceedings of the SIAM
Symposium on Algorithm Engineering and Experi-
ments (ALENEX).
BOTW (2021). Best of the Web – Free Business Listing.
https://botw.org. Last checked on Oct. 18, 2021.
Broder, A. and Mitzenmacher, M. (2002). Network Appli-
cations of Bloom Filters: A Survey. In Internet Math-
ematics, volume 1, pages 636–646.
Cho, K., van Merrienboer, B., Gülçehre, Ç., Bahdanau, D.,
Bougares, F., Schwenk, H., and Bengio, Y. (2014).
Learning phrase representations using RNN encoder-
decoder for statistical machine translation. In Proc.
of the 2014 Conf. on Empirical Methods in Natural
Language Processing, EMNLP 2014, October 25-29,
2014, Doha, Qatar, A meeting of SIGDAT, a Special
Interest Group of the ACL, pages 1724–1734. ACL.
Cortes, C. and Vapnik, V. (1995). Support-vector networks.
Machine learning, 20(3):273–297.
Cox, D. R. (1958). The regression analysis of binary se-
quences. Journal of the Royal Statistical Society: Se-
ries B (Methodological), 20(2):215–232.
Dai, Z. and Shrivastava, A. (2020). Adaptive Learned
Bloom Filter (Ada-BF): Efficient utilization of the
classifier with application to real-time information fil-
tering on the web. In Advances in Neural Information
Processing Systems, volume 33, pages 11700–11710.
Curran Associates, Inc.
Duda, R. O. and Hart, P. E. (1973). Pattern Classification
and Scene Analysis. John Wiley & Sons, New York.
Duda, R. O., Hart, P. E., and Stork, D. G. (2000). Pattern
Classification, 2nd Edition. Wiley.
Ferragina, P., Lillo, F., and Vinciguerra, G. (2021). On the
performance of learned data structures. Theoretical
Computer Science, 871:107–120.
Ferragina, P. and Vinciguerra, G. (2020a). Learned Data
Structures. In Recent Trends in Learning From Data,
pages 5–41. Springer International Publishing.
Ferragina, P. and Vinciguerra, G. (2020b). The PGM-index:
a fully-dynamic compressed learned index with prov-
able worst-case bounds. PVLDB, 13(8):1162–1175.
Freedman, D. (2005). Statistical Models: Theory and Prac-
tice. Cambridge University Press.
Kipf, A., Marcus, R., van Renen, A., Stoian, M., Kemper,
A., Kraska, T., and Neumann, T. (2020). Radixspline:
A single-pass learned index. In Proc. of the Third In-
ternational Workshop on Exploiting Artificial Intelli-
gence Techniques for Data Management, aiDM ’20,
pages 1–5. Association for Computing Machinery.
Kraska, T., Beutel, A., Chi, E. H., Dean, J., and Polyzotis,
N. (2018). The case for learned index structures. In
Proc. of the 2018 Int. Conf. on Management of Data,
SIGMOD ’18, pages 489–504, New York, NY, USA.
Association for Computing Machinery.
Long, X., Ben, Z., and Liu, Y. (2019). A survey of related
research on compression and acceleration of deep neu-
ral networks. Journal of Physics: Conference Series,
1213:052003.
Ma, J. and Liang, C. (2020). An empirical analysis of the
learned Bloom filter and its extensions. Unpublished
manuscript; paper and code are no longer available online.
Machine Learning Lab (2021). Hidden fraudulent URLs
dataset. https://machinelearning.inginf.units.it/data-
and-tools/hidden-fraudulent-urls-dataset. Last
checked on Oct. 18, 2021.
Maltry, M. and Dittrich, J. (2021). A critical analysis of
recursive model indexes. CoRR, abs/2106.16166.
Marcus, R., Kipf, A., van Renen, A., Stoian, M., Misra,
S., Kemper, A., Neumann, T., and Kraska, T.
(2020a). Benchmarking learned indexes. arXiv
preprint arXiv:2006.12804, 14:1–13.
Marcus, R., Zhang, E., and Kraska, T. (2020b). CDFShop:
Exploring and optimizing learned index structures. In
Proc. of the 2020 ACM SIGMOD Int. Conf. on Man-
agement of Data, SIGMOD ’20, pages 2789–2792.
Marinò, G. C., Ghidoli, G., Frasca, M., and Malchiodi,
D. (2021a). Compression strategies and space-
conscious representations for deep neural networks.
In Proceedings of the 25th International Conference
on Pattern Recognition (ICPR), pages 9835–9842.
doi:10.1109/ICPR48806.2021.9412209.
Marinò, G. C., Ghidoli, G., Frasca, M., and Malchiodi,
D. (2021b). Reproducing the sparse huffman address
map compression for deep neural networks. In Re-
producible Research in Pattern Recognition, pages
161–166, Cham. Springer International Publishing.
doi:10.1007/978-3-030-76423-4_12.
Mitzenmacher, M. (2018). A model for learned Bloom fil-
ters and optimizing by sandwiching. In Advances in
Neural Information Processing Systems, volume 31.
Curran Associates, Inc.
Mitzenmacher, M. and Vassilvitskii, S. (2020). Algorithms
with predictions. CoRR, abs/2006.09123.
Python Software Foundation (2021). pickle – Python object
serialization. https://docs.python.org/3/library/pickle.
html. Last checked on Oct. 18, 2021.
UNIMAS (2021). Phishing dataset. https://www.fcsit.
unimas.my/phishing-dataset. Last checked on Oct. 18,
2021.
Vaidya, K., Knorr, E., Kraska, T., and Mitzenmacher, M.
(2021). Partitioned learned Bloom filters. In Interna-
tional Conference on Learning Representations.
Zell, A. (1994). Simulation neuronaler Netze. Habilitation
thesis, University of Stuttgart.