Neural Sequence Modeling in Physical Language Understanding

Avi Bleiweiss

2019

Abstract

Automating the tasks of generating test questions and analyzing content for assessment of written student responses has been one of the more sought-after applications to support classroom educators. However, a major impediment to algorithmic advances in developing such tools is the lack of large, publicly available domain corpora. In this paper, we explore deep learning of physics word problems at scale using the transformer, a state-of-the-art self-attention neural architecture. Our study proposes an intuitive, novel approach to tree-based data generation that relies mainly on physical knowledge structure and defers the compositionality of natural-language clauses to the terminal nodes. Applying our method to the simpler kinematics domain, which describes the motion of an object under uniform acceleration, and using our neural sequence model pretrained on a dataset of ten thousand machine-produced problems, we achieved BLEU scores of 0.54 and 0.81 for predicting derivation expressions on real-world and synthetic test sets, respectively. Notably, increasing the number of training problems yielded diminishing returns on performance.
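To make the tree-based generation idea concrete, below is a minimal Python sketch of how kinematics word problems and their derivation expressions might be sampled from a small knowledge structure, with natural-language clauses attached to the terminal nodes. The clause templates, relation layout, and function names are illustrative assumptions for this sketch, not the paper's actual generator.

```python
import random

# Terminal nodes: illustrative natural-language clause templates for the
# standard uniform-acceleration quantities (assumed phrasings, not the paper's).
CLAUSES = {
    "v0": "starts with an initial velocity of {val} m/s",
    "v":  "reaches a final velocity of {val} m/s",
    "a":  "accelerates uniformly at {val} m/s^2",
    "t":  "moves for {val} s",
    "x":  "covers a displacement of {val} m",
}

# Internal nodes: each relation ties an unknown quantity to the known
# quantities that derive it, i.e. (unknown, knowns, derivation expression).
RELATIONS = [
    ("v", ("v0", "a", "t"), "v = v0 + a * t"),
    ("x", ("v0", "a", "t"), "x = v0 * t + 0.5 * a * t**2"),
    ("v", ("v0", "a", "x"), "v = (v0**2 + 2 * a * x) ** 0.5"),
]

def generate_problem(rng=random):
    """Sample one (question, derivation) pair from the relation tree."""
    unknown, knowns, derivation = rng.choice(RELATIONS)
    values = {q: rng.randint(1, 20) for q in knowns}
    clauses = [CLAUSES[q].format(val=v) for q, v in values.items()]
    question = "An object " + ", ".join(clauses) + f". What is {unknown}?"
    return question, derivation

if __name__ == "__main__":
    question, derivation = generate_problem()
    print(question)
    print(derivation)
```

In a setup like this, the sampled question would serve as the source sequence and the derivation expression as the target sequence for the transformer model the abstract describes.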

Paper Citation


in Harvard Style

Bleiweiss A. (2019). Neural Sequence Modeling in Physical Language Understanding. In Proceedings of the 11th International Joint Conference on Computational Intelligence (IJCCI 2019) - Volume 1: NCTA; ISBN 978-989-758-384-1, SciTePress, pages 464-472. DOI: 10.5220/0008071104640472


in BibTeX Style

@conference{ncta19,
  author={Avi Bleiweiss},
  title={Neural Sequence Modeling in Physical Language Understanding},
  booktitle={Proceedings of the 11th International Joint Conference on Computational Intelligence (IJCCI 2019) - Volume 1: NCTA},
  year={2019},
  pages={464-472},
  publisher={SciTePress},
  organization={INSTICC},
  doi={10.5220/0008071104640472},
  isbn={978-989-758-384-1},
}


in EndNote Style

TY - CONF
JO - Proceedings of the 11th International Joint Conference on Computational Intelligence (IJCCI 2019) - Volume 1: NCTA
TI - Neural Sequence Modeling in Physical Language Understanding
SN - 978-989-758-384-1
AU - Bleiweiss A.
PY - 2019
SP - 464
EP - 472
DO - 10.5220/0008071104640472
PB - SciTePress