Conneau, A., Khandelwal, K., Goyal, N., Chaudhary, V., Wenzek, G., Guzmán, F., Grave, E., Ott, M., Zettlemoyer, L., and Stoyanov, V. (2019). Unsupervised cross-lingual representation learning at scale. arXiv preprint arXiv:1911.02116.
Devlin, J., Chang, M.-W., Lee, K., and Toutanova, K. (2018). BERT: Pre-training of deep bidirectional transformers for language understanding. arXiv preprint arXiv:1810.04805.
Kano, Y., Kim, M.-Y., Goebel, R., and Satoh, K. (2017). Overview of COLIEE 2017. In COLIEE@ICAIL, pages 1–8.
Lewis, M., Liu, Y., Goyal, N., Ghazvininejad, M., Mohamed, A., Levy, O., Stoyanov, V., and Zettlemoyer, L. (2019). BART: Denoising sequence-to-sequence pre-training for natural language generation, translation, and comprehension. arXiv preprint arXiv:1910.13461.
Mikolov, T., Sutskever, I., Chen, K., Corrado, G. S., and Dean, J. (2013). Distributed representations of words and phrases and their compositionality. Advances in Neural Information Processing Systems, 26.
Nguyen, H., Tran, V., and Nguyen, L. (2019). A deep learning approach for statute law entailment task in COLIEE-2019. Proceedings of the 6th Competition on Legal Information Extraction/Entailment. COLIEE.
Nguyen, H. T., Binh, D. T., Quan, B. M., and Le Minh, N. (2021a). Evaluate and visualize legal embeddings for explanation purpose. In 2021 13th International Conference on Knowledge and Systems Engineering (KSE), pages 1–6. IEEE.
Nguyen, H.-T. and Nguyen, L.-M. (2021). Sublanguage: A serious issue affects pretrained models in legal domain. arXiv preprint arXiv:2104.07782.
Nguyen, H.-T., Tran, V., Nguyen, P. M., Vuong, T.-H.-Y., Bui, Q. M., Nguyen, C. M., Dang, B. T., Nguyen, M. L., and Satoh, K. (2021b). ParaLaw Nets: Cross-lingual sentence-level pretraining for legal text processing. arXiv preprint arXiv:2106.13403.
Nguyen, H.-T., Vuong, H.-Y. T., Nguyen, P. M., Dang, B. T., Bui, Q. M., Vu, S. T., Nguyen, C. M., Tran, V., Satoh, K., and Nguyen, M. L. (2020). JNLP team: Deep learning for legal processing in COLIEE 2020. arXiv preprint arXiv:2011.08071.
Rabelo, J., Kim, M.-Y., Goebel, R., Yoshioka, M., Kano, Y., and Satoh, K. (2019). A summary of the COLIEE 2019 competition. In JSAI International Symposium on Artificial Intelligence, pages 34–49. Springer.
Rabelo, J., Kim, M.-Y., Goebel, R., Yoshioka, M., Kano, Y., and Satoh, K. (2020). COLIEE 2020: Methods for legal document retrieval and entailment. In JSAI International Symposium on Artificial Intelligence, pages 196–210. Springer.
Raffel, C., Shazeer, N., Roberts, A., Lee, K., Narang, S., Matena, M., Zhou, Y., Li, W., Liu, P. J., et al. (2020). Exploring the limits of transfer learning with a unified text-to-text transformer. Journal of Machine Learning Research, 21(140):1–67.
Shao, Y., Mao, J., Liu, Y., Ma, W., Satoh, K., Zhang, M., and Ma, S. (2020). BERT-PLI: Modeling paragraph-level interactions for legal case retrieval. In IJCAI, pages 3501–3507.
Vaswani, A., Shazeer, N., Parmar, N., Uszkoreit, J., Jones, L., Gomez, A. N., Kaiser, Ł., and Polosukhin, I. (2017). Attention is all you need. Advances in Neural Information Processing Systems, 30.
Vuong, Y. T.-H., Bui, Q. M., Nguyen, H.-T., Nguyen, T.-T.-T., Tran, V., Phan, X.-H., Satoh, K., and Nguyen, L.-M. (2022). SM-BERT-CR: A deep learning approach for case law retrieval with supporting model. Artificial Intelligence and Law, pages 1–28.
How Fine Tuning Affects Contextual Embeddings: A Negative Result Explanation