# A Knowledge-Aware Sequence-to-Tree Network for Math Word Problem Solving

Qinzhuo Wu, Qi Zhang*, Jinlan Fu, Xuanjing Huang
Shanghai Key Laboratory of Intelligent Information Processing, School of Computer Science, Fudan University, Shanghai, China
(qzwu17, qz, fujl16, xjhuang)@fudan.edu.cn

# Abstract

With the advancements in natural language processing, math word problem solving has received increasing attention. Previous methods have achieved promising results, but they ignore background common-sense knowledge that is not directly provided by the problem. In addition, during generation they focus on local features while neglecting global information. To incorporate external knowledge and global expression information, we propose a novel knowledge-aware sequence-to-tree (KA-S2T) network in which the entities in the problem sequence and their categories are modeled as an entity graph. Based on this entity graph, a graph attention network is used to capture knowledge-aware problem representations. Further, we use a tree-structured decoder with a state aggregation mechanism to capture long-distance dependencies and global expression information. Experimental results on the Math23K dataset show that the KA-S2T model achieves better performance than previously reported best results.

# 1 Introduction

Math word problem solving has attracted increasing attention, and many math word problem solving systems have been developed. Early statistical learning methods (Feigenbaum et al., 1963; Fletcher, 1985; Bakman, 2007; Roy and Roth, 2016) extracted templates or features from problems and generated the corresponding math expressions based on them. These methods require a large number of manually crafted features, or can only be applied to small problems in narrow domains.
In recent years, many methods (Wang et al., 2017, 2018b; Xie and Sun, 2019) have applied neural networks to math word problems, with promising results. These methods use end-to-end models to directly generate the corresponding math expressions from the problem text.

Problem: Alan bought 2 green apples, 3 red apples, and 4 oranges for a total of \$50. Each apple weighs 0.4 kg and is worth \$6. Each orange weighs half as much as an apple. How much does each orange cost?

![](images/331f70e704507000f228a9ecbd0b6cc1dec55cc2369e40345d606a6c8a86a3f3.jpg)

Figure 1: An example of a math word problem. With external knowledge, a model can capture the relationships between the entities in the problem. With the global information of a generated expression tree, a model can capture information between long-distance nodes.

Although previous methods have achieved promising results, several problems remain to be addressed: 1) Background knowledge and common sense should be incorporated. For example, as shown in Figure 1, both apples and oranges are fruit. Humans are naturally aware of this common-sense information, but it is difficult for a model to learn it from the problem text alone. 2) When generating expressions, sequence-to-sequence (Seq2Seq) methods tend to focus on local features and ignore global information. For example, the root node "/" of the expression tree in Figure 1 is directly adjacent to its right child "4", but the two are eight steps apart in the pre-order expression sequence. Xie and Sun (2019) proposed a sequence-to-tree (Seq2Tree) method that generates an expression tree in pre-order based on the parent node and the left sibling tree of each node. However, it still does not consider global information from the generated expression tree.
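To make the locality problem concrete, the Figure 1 problem can be solved by the expression (50 - (2 + 3) * 6) / 4 = 5, i.e. each orange costs \$5. The small sketch below (the `Node` and `preorder` helpers are illustrative, not part of the model) shows that the root "/" and its right child "4", although adjacent in the tree, are eight positions apart in the pre-order sequence:

```python
class Node:
    """A minimal binary expression-tree node (illustrative helper)."""
    def __init__(self, label, left=None, right=None):
        self.label, self.left, self.right = label, left, right

def preorder(node):
    """Pre-order traversal: root, then left subtree, then right subtree."""
    if node is None:
        return []
    return [node.label] + preorder(node.left) + preorder(node.right)

# Expression tree for (50 - (2 + 3) * 6) / 4
tree = Node('/',
            Node('-',
                 Node('50'),
                 Node('*', Node('+', Node('2'), Node('3')), Node('6'))),
            Node('4'))

seq = preorder(tree)
print(seq)  # ['/', '-', '50', '*', '+', '2', '3', '6', '4']

# '/' and '4' are parent and child in the tree, yet 8 steps apart here:
print(seq.index('4') - seq.index('/'))  # 8
```

A Seq2Seq decoder emitting this pre-order sequence token by token must bridge those eight steps to relate "/" to "4", which is exactly the long-distance dependency the tree-structured decoder is designed to capture.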
To overcome these problems, we propose a novel knowledge-aware sequence-to-tree (KA-S2T) method that better utilizes external knowledge and captures global expression information. The proposed model connects related entities and categories based on external knowledge bases to capture common-sense information and obtain better interaction between words. In addition, we design a tree-structured decoder to capture long-distance dependencies and global expression information. KA-S2T updates all nodes in the generated expression at each time step, whereby each node's state is updated by recursively aggregating its neighboring nodes. Through multiple iterations of aggregation, the model can use global information associated with the generated expression to generate the next node and thereby achieve better predictions.

The main contributions of this paper can be summarized as follows:

- We incorporate common-sense knowledge from external knowledge bases into the math word problem solving task.
- We propose a tree-structured decoder for generating math expressions. To incorporate global information associated with generated partial expressions, we recursively aggregate the neighbors of each node in the expression at each time step.
- We conducted experiments on the Math23K dataset to verify the effectiveness of our KA-S2T model; the results show that it achieves better performance than previous methods.

# 2 Models

In this section, we define the problem and present our proposed KA-S2T model. As shown in Figure 2, we first use a bidirectional long short-term memory (LSTM) network to encode the math word problem sequence (Section 2.2). Then, we construct an entity graph based on external knowledge to model the relationships between the different entities and categories in the problem (Section 2.3), and use a two-layer graph attention network (GAT) to obtain knowledge-aware problem representations (Section 2.4).
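The following is a minimal single-head sketch of graph attention over a toy entity graph, in the spirit of GAT: each node attends over its neighbors and forms a weighted sum of their projected features. The tiny dimensions, random weights, tanh scoring function, and the apple/orange/fruit graph are illustrative assumptions, not the paper's actual configuration; stacking two such layers would correspond to the two-layer GAT mentioned above.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def gat_layer(H, adj, W, a):
    """One simplified graph-attention layer.

    H: node features (N, d); adj: adjacency with self-loops (N, N);
    W: projection (d, d'); a: attention vector (2*d',).
    """
    Z = H @ W                       # project node features
    out = np.zeros_like(Z)
    for i in range(Z.shape[0]):
        nbrs = np.nonzero(adj[i])[0]
        # score each neighbor j against node i (tanh used for simplicity)
        logits = np.array([np.tanh(a @ np.concatenate([Z[i], Z[j]]))
                           for j in nbrs])
        alpha = softmax(logits)     # attention weights over neighbors
        out[i] = alpha @ Z[nbrs]    # attention-weighted neighbor sum
    return out

# Toy entity graph: 0 = "apple", 1 = "orange", 2 = "fruit" (category node).
# Both entities link to their shared category; self-loops on the diagonal.
adj = np.array([[1, 0, 1],
                [0, 1, 1],
                [1, 1, 1]])
H = rng.normal(size=(3, 4))         # initial (e.g. LSTM-derived) node states
W = rng.normal(size=(4, 4))
a = rng.normal(size=8)
H1 = gat_layer(H, adj, W, a)        # knowledge-aware node representations
```

After one layer, "apple" and "orange" each mix in the "fruit" category node; after a second layer, information flows between the two entities through their shared category, which is how the entity graph lets related words interact.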
Finally, we use a tree-structured decoder with a state aggregation mechanism to generate math expression trees in pre-order (Section 2.5).

# 2.1 Problem Definition

Consider the input sequence of a math word problem $\mathrm{X} = (x_{1}, x_{2}, \ldots, x_{n})$. Our goal is to train a model that can generate its math expression $\mathrm{Y} = (y_{1}, y_{2}, \ldots, y_{n'})$. The task is to estimate the probability distribution

$$\mathrm{P}(\mathrm{Y} \mid \mathrm{X}) = \prod_{t=1}^{n'} \mathrm{P}(y_{t} \mid y_{<t}, \mathrm{X}),$$

where $y_{<t} = (y_{1}, \ldots, y_{t-1})$ denotes the previously generated tokens.
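This factorization means the log-probability of a full expression is the sum of per-token conditional log-probabilities, one per decoding step. The sketch below illustrates only that bookkeeping; `cond_prob` is a hypothetical stand-in for the decoder, and the uniform distribution over a tiny vocabulary is purely illustrative.

```python
import math

# Toy output vocabulary: operators plus the numbers from the Figure 1 problem.
VOCAB = ['+', '-', '*', '/', '50', '6', '4', '2', '3']

def cond_prob(token, prefix, problem):
    """Stand-in for P(y_t | y_<t, X); a real decoder conditions on both
    the generated prefix and the encoded problem. Uniform for illustration."""
    return 1.0 / len(VOCAB)

def log_prob(expression, problem):
    """log P(Y | X) = sum_t log P(y_t | y_<t, X), per the factorization."""
    total = 0.0
    for t, token in enumerate(expression):
        total += math.log(cond_prob(token, expression[:t], problem))
    return total

expr = ['/', '-', '50', '*', '+', '2', '3', '6', '4']
lp = log_prob(expr, "problem text")
# With the uniform placeholder this is 9 * log(1/9); a trained model
# would assign the gold expression a much higher log-probability.
```

Training maximizes this log-probability of the gold expression; at inference, the decoder emits one pre-order token per step, each conditioned on the prefix generated so far.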