Author:
Avi Bleiweiss
Affiliation:
BShalem Research, Sunnyvale, U.S.A.
Keyword(s):
Kinematics, Recurrent Neural Networks, Long Short-term Memory, Sequence Model, Attention.
Related Ontology Subjects/Areas/Topics:
Artificial Intelligence; Biomedical Engineering; Biomedical Signal Processing; Computational Intelligence; Health Engineering and Technology Applications; Higher Level Artificial Neural Network Based Intelligent Systems; Human-Computer Interaction; Learning Paradigms and Algorithms; Methodologies and Methods; Neural Networks; Neurocomputing; Neurotechnology, Electronics and Informatics; Pattern Recognition; Physiological Computing Systems; Sensor Networks; Signal Processing; Soft Computing; Theory and Methods
Abstract:
Automating the generation of test questions and the content analysis of written student responses has long been one of the more sought-after applications for supporting classroom educators. A major impediment to algorithmic advances in such tools, however, is the lack of large, publicly available domain corpora. In this paper, we explore deep learning of physics word problems at scale using the transformer, a state-of-the-art self-attention neural architecture. We propose an intuitive, novel approach to tree-based data generation that relies mainly on physical knowledge structure and defers the compositionality of natural-language clauses to the terminal nodes. Applying our method to the simpler kinematics domain, which describes the motion of an object under uniform acceleration, and using our neural sequence model pretrained on a dataset of ten thousand machine-produced problems, we achieved BLEU scores of 0.54 and 0.81 for predicting derivation expressions on real-world and synthetic test sets, respectively. Notably, increasing the number of training problems yielded diminishing returns in performance.
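As a rough illustration of the tree-based generation idea described in the abstract, the sketch below shows how a small knowledge tree for uniform-acceleration kinematics might be sampled: internal structure (which quantities are given and which is asked for, plus the derivation expression) is fixed by the physics, while natural-language clauses are rendered only at the terminal nodes. This is a hypothetical sketch, not the authors' implementation; the equation table, clause templates, and function names are assumptions for illustration. Pairs produced this way would then serve as (problem text, derivation expression) training examples for a sequence model evaluated with BLEU.

```python
# Hypothetical sketch of tree-based kinematics problem generation: the physical
# knowledge structure lives in the internal nodes, and natural-language clause
# templates are deferred to the terminal nodes. Not the paper's actual generator.
import random

# Uniform-acceleration relations the tree can draw on: unknown -> (givens, derivation).
EQUATIONS = {
    "v": (("u", "a", "t"), "v = u + a*t"),
    "s": (("u", "a", "t"), "s = u*t + 0.5*a*t**2"),
}

# Terminal-node templates: compositionality of the language is handled here.
CLAUSES = {
    "u": "starts with an initial velocity of {val} m/s",
    "a": "accelerates uniformly at {val} m/s^2",
    "t": "moves for {val} s",
}
QUESTIONS = {
    "v": "What is its final velocity?",
    "s": "How far does it travel?",
}

def generate_problem(rng):
    """Sample one (problem text, derivation expression) pair from the tree."""
    unknown = rng.choice(list(EQUATIONS))      # root: pick the asked-for quantity
    givens, derivation = EQUATIONS[unknown]    # internal node: physical structure
    values = {q: rng.randint(1, 20) for q in givens}
    clauses = [CLAUSES[q].format(val=values[q]) for q in givens]  # terminal nodes
    text = "An object " + ", ".join(clauses) + ". " + QUESTIONS[unknown]
    return text, derivation

if __name__ == "__main__":
    rng = random.Random(0)
    for _ in range(3):
        text, deriv = generate_problem(rng)
        print(text, "->", deriv)
```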