
7 CONCLUSION
In this paper, we presented SynCRF, an approach
for mining TRIZ parameters from patents. This
approach is part of a solved-contradiction mining
process whose purpose is a fine-grained understand-
ing of the inventions described in patents. SynCRF
combines a deep neural encoder with a Conditional
Random Field. It relies on the syntactic structure
of sentences to estimate pairwise potentials and im-
prove the consistency of the predicted label se-
quences. SynCRF shows solid improvements over
the state of the art, with absolute gains of 3 to 5%
on all metrics over the best baseline (XLNet-CRF-cs).
We also showed that SynCRF learns forbidden label
transitions more easily, improving precision by more
than 20% over the best baseline trained without tran-
sition constraints (XLNet-CRF).
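As a conceptual illustration only (not the authors' implementation), the role of position-specific pairwise potentials in CRF decoding can be sketched as follows. The labels, scores, and the idea of encoding a forbidden transition as an infinitely negative potential are illustrative; SynCRF learns its potentials from syntactic features.

```python
# Viterbi decoding for a linear-chain CRF whose pairwise (transition)
# potentials may differ at each position, e.g. as a function of syntax.
# All scores below are illustrative, not learned values.

NEG_INF = float("-inf")

def viterbi(emissions, transitions):
    """emissions: list of {label: score} dicts, one per token.
    transitions: list of {(prev, cur): score} dicts, one per adjacent
    token pair, allowing position-specific pairwise potentials.
    Returns the highest-scoring label sequence."""
    labels = list(emissions[0])
    score = dict(emissions[0])  # best score ending in each label
    back = []                   # backpointers per position
    for t in range(1, len(emissions)):
        new_score, ptr = {}, {}
        for cur in labels:
            best_prev, best = None, NEG_INF
            for prev in labels:
                s = score[prev] + transitions[t - 1].get((prev, cur), NEG_INF)
                if s > best:
                    best_prev, best = prev, s
            new_score[cur] = best + emissions[t][cur]
            ptr[cur] = best_prev
        score, back = new_score, back + [ptr]
    # Backtrack from the best final label.
    last = max(score, key=score.get)
    path = [last]
    for ptr in reversed(back):
        path.append(ptr[path[-1]])
    return path[::-1]
```

Setting a forbidden transition (e.g. O followed by I in a BIO scheme) to `NEG_INF` guarantees it never appears in the decoded sequence, regardless of the emission scores.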
ACKNOWLEDGEMENTS
This research was funded in part by the French Na-
tional Research Agency (ANR) under the project
"ANR-22-CE92-0007-02".
REFERENCES
Altshuller, G. (1984). Creativity As an Exact Science. CRC
Press.
Berdyugina, D. and Cavallucci, D. (2020). Setting up
context-sensitive real-time contradiction matrix of a
given field using unstructured texts of patent contents
and natural language processing. In Triz Future 2020.
Cetintas, S. and Si, L. (2012). Effective query genera-
tion and postprocessing strategies for prior art patent
search. J. Assoc. Inf. Sci. Technol., 63:512–527.
Chai, Z., Jin, H., Shi, S., Zhan, S., Zhuo, L., and Yang,
Y. (2022). Hierarchical shared transfer learning for
biomedical named entity recognition. BMC Bioinfor-
matics, 23.
Chiu, J. P. and Nichols, E. (2016). Named Entity Recogni-
tion with Bidirectional LSTM-CNNs. Transactions of
the Association for Computational Linguistics, 4:357–
370.
Cho, K., van Merrienboer, B., Gülçehre, Ç., Bahdanau, D.,
Bougares, F., Schwenk, H., and Bengio, Y. (2014).
Learning phrase representations using RNN encoder-
decoder for statistical machine translation. In Mos-
chitti, A., Pang, B., and Daelemans, W., editors, Pro-
ceedings of the 2014 Conference on Empirical Meth-
ods in Natural Language Processing, EMNLP 2014,
October 25-29, 2014, pages 1724–1734.
Chu, X., Ouyang, W., Li, H., and Wang, X. (2016). CRF-
CNN: Modeling structured information in human pose
estimation. In Lee, D., Sugiyama, M., Luxburg, U.,
Guyon, I., and Garnett, R., editors, Advances in Neu-
ral Information Processing Systems, volume 29. Cur-
ran Associates, Inc.
Devlin, J., Chang, M.-W., Lee, K., and Toutanova,
K. (2018). BERT: Pre-training of deep bidirec-
tional transformers for language understanding. arXiv
preprint arXiv:1810.04805.
Hochreiter, S. and Schmidhuber, J. (1997). Long short-term
memory. Neural Comput., 9(8):1735–1780.
Lafferty, J. D., McCallum, A., and Pereira, F. C. N. (2001).
Conditional random fields: Probabilistic models for
segmenting and labeling sequence data. In Proceed-
ings of the Eighteenth International Conference on
Machine Learning, ICML ’01, pages 282–289, San
Francisco, CA, USA. Morgan Kaufmann Publishers
Inc.
Lample, G., Ballesteros, M., Subramanian, S., Kawakami,
K., and Dyer, C. (2016). Neural architectures for
named entity recognition. In Proceedings of the 2016
Conference of the North American Chapter of the As-
sociation for Computational Linguistics: Human Lan-
guage Technologies, pages 260–270, San Diego, Cal-
ifornia. Association for Computational Linguistics.
Li, X., Zhang, H., and Zhou, X.-H. (2020). Chinese clini-
cal named entity recognition with variant neural struc-
tures based on bert methods. Journal of Biomedical
Informatics, 107:103422.
Peng, J., Bo, L., and Xu, J. (2009). Conditional neural
fields. In Bengio, Y., Schuurmans, D., Lafferty, J.,
Williams, C., and Culotta, A., editors, Advances in
Neural Information Processing Systems, volume 22.
Curran Associates, Inc.
Ramshaw, L. and Marcus, M. (1999). Text Chunking Us-
ing Transformation-Based Learning, pages 157–176.
Springer Netherlands, Dordrecht.
Saha, T., Saha, S., and Bhattacharyya, P. (2018). Explor-
ing deep learning architectures coupled with crf based
prediction for slot-filling. In Cheng, L., Leung, A.
C. S., and Ozawa, S., editors, Neural Information Pro-
cessing, pages 214–225, Cham. Springer International
Publishing.
Sun, J., Liu, Y., Cui, J., and He, H. (2022). Deep learning-
based methods for natural hazard named entity recog-
nition. Scientific Reports, 12:4598.
Vemulapalli, R., Tuzel, O., Liu, M.-Y., and Chellappa, R.
(2016). Gaussian conditional random field network
for semantic segmentation. In 2016 IEEE Conference
on Computer Vision and Pattern Recognition (CVPR),
pages 3224–3233.
Yang, Z., Dai, Z., Yang, Y., Carbonell, J., Salakhutdinov, R.,
and Le, Q. V. (2019). XLNet: Generalized Autoregres-
sive Pretraining for Language Understanding. Curran
Associates Inc., Red Hook, NY, USA.
Zheng, S., Jayasumana, S., Romera-Paredes, B., Vineet,
V., Su, Z., Du, D., Huang, C., and Torr, P. H. S.
(2015). Conditional random fields as recurrent neu-
ral networks. In 2015 IEEE International Conference
on Computer Vision (ICCV), pages 1529–1537.
SynCRF: Syntax-Based Conditional Random Field for TRIZ Parameter Minings