Previous work has shown that jointly modeling two Natural Language Processing (NLP) tasks is effective for improving the performance of both, and many task-specific joint models have been proposed. This paper proposes a Hierarchical Long Short-Term Memory (HLSTM) model and several variants for modeling two tasks jointly. The models are flexible enough to handle different types of task combinations and avoid task-specific feature engineering. Beyond exploiting correlation information between the tasks, our models take the hierarchical relations between the two tasks into consideration, which has not been discussed in previous work. Experimental results show that our models outperform strong baselines on three different types of task combinations. While both correlation information and hierarchical relations between the two tasks help improve performance on both tasks, the models especially boost the performance of the task at the top of the hierarchy.
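The abstract does not spell out the architecture, but the core idea it describes, a lower LSTM layer serving one task whose hidden states feed an upper LSTM serving the higher-level task, with the two task losses trained jointly, can be sketched as below. This is a minimal illustration only: the `HierarchicalLSTM` class, both linear output heads, and all dimensions are assumptions for the sketch, not the paper's exact model.

```python
import torch
import torch.nn as nn

class HierarchicalLSTM(nn.Module):
    """Minimal sketch of a two-task hierarchical LSTM (illustrative, not
    the paper's exact design). The lower LSTM feeds a head for the
    lower-level task; its hidden states also feed an upper LSTM whose
    head predicts the higher-level task."""

    def __init__(self, vocab_size, emb_dim, hidden_dim, n_low_tags, n_high_tags):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.lower_lstm = nn.LSTM(emb_dim, hidden_dim, batch_first=True)
        self.upper_lstm = nn.LSTM(hidden_dim, hidden_dim, batch_first=True)
        self.low_head = nn.Linear(hidden_dim, n_low_tags)    # lower-level task
        self.high_head = nn.Linear(hidden_dim, n_high_tags)  # higher-level task

    def forward(self, token_ids):
        x = self.embed(token_ids)                  # (batch, seq, emb_dim)
        low_states, _ = self.lower_lstm(x)         # shared lower representation
        high_states, _ = self.upper_lstm(low_states)
        return self.low_head(low_states), self.high_head(high_states)

# Joint training: summing the two task losses lets gradients from the
# higher-level task also update the shared lower layer.
model = HierarchicalLSTM(vocab_size=10000, emb_dim=100,
                         hidden_dim=128, n_low_tags=10, n_high_tags=5)
tokens = torch.randint(0, 10000, (2, 20))          # dummy batch of token ids
low_gold = torch.randint(0, 10, (2, 20))           # dummy lower-task labels
high_gold = torch.randint(0, 5, (2, 20))           # dummy higher-task labels
low_logits, high_logits = model(tokens)
loss = (nn.functional.cross_entropy(low_logits.reshape(-1, 10), low_gold.reshape(-1))
        + nn.functional.cross_entropy(high_logits.reshape(-1, 5), high_gold.reshape(-1)))
loss.backward()
```

In this reading, the stacking order encodes the hierarchical relation between the tasks: the task placed on top receives representations already shaped by the lower task, which is consistent with the abstract's observation that the top task benefits most.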