Skipping Word

Lingxun Meng, Yan Li, Mengyi Liu, Peng Shu
2016 Proceedings of the 25th ACM International on Conference on Information and Knowledge Management - CIKM '16  
Recent works using artificial neural networks based on distributed word representations have greatly boosted performance on various natural language learning tasks, especially question answering. However, they also carry attendant problems, such as corpus selection for embedding learning and dictionary transformation across different learning tasks. In this paper, we propose to model sentences directly as character sequences, and then utilize convolutional neural networks to integrate character embedding learning with point-wise answer selection training. Compared with deep models pre-trained on the word embedding (WE) strategy, our character-sequential representation (CSR) based method involves a much simpler procedure and delivers more stable performance across different benchmarks. Extensive experiments on two benchmark answer selection datasets show competitive performance compared with state-of-the-art methods.
doi:10.1145/2983323.2983861 dblp:conf/cikm/MengLLS16 fatcat:snwshlnbyfgidflumlczmglrvq
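
To make the idea in the abstract concrete, the sketch below shows one plausible character-sequential setup: a character-level CNN encodes both question and answer, and a point-wise scorer predicts a binary relevance label. This is not the authors' exact architecture; the class names, layer sizes, and training objective are illustrative assumptions.

```python
# Minimal sketch (assumed architecture, not the paper's exact model):
# character embeddings -> 1-D convolution -> max-over-time pooling,
# shared between question and answer, with a point-wise relevance scorer.
import torch
import torch.nn as nn

class CharCNNEncoder(nn.Module):
    def __init__(self, vocab_size, char_dim=32, num_filters=100, kernel_size=5):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, char_dim, padding_idx=0)
        self.conv = nn.Conv1d(char_dim, num_filters, kernel_size, padding=kernel_size // 2)

    def forward(self, char_ids):                     # char_ids: (batch, seq_len)
        x = self.embed(char_ids).transpose(1, 2)     # (batch, char_dim, seq_len)
        x = torch.relu(self.conv(x))                 # (batch, num_filters, seq_len)
        return x.max(dim=2).values                   # max-over-time pooling

class PointwiseAnswerSelector(nn.Module):
    def __init__(self, vocab_size):
        super().__init__()
        self.encoder = CharCNNEncoder(vocab_size)    # shared encoder for question and answer
        self.scorer = nn.Linear(2 * 100, 1)          # concat(question, answer) -> relevance logit

    def forward(self, q_chars, a_chars):
        q, a = self.encoder(q_chars), self.encoder(a_chars)
        return self.scorer(torch.cat([q, a], dim=1)).squeeze(1)

# Toy usage: two (question, answer) pairs over an assumed 128-symbol character vocabulary,
# trained point-wise with binary cross-entropy on relevance labels.
model = PointwiseAnswerSelector(vocab_size=128)
q = torch.randint(1, 128, (2, 40))
a = torch.randint(1, 128, (2, 60))
loss = nn.BCEWithLogitsLoss()(model(q, a), torch.tensor([1.0, 0.0]))
loss.backward()
```

Because the input is a raw character sequence, this setup needs no external word-embedding corpus and no per-task dictionary, which is the simplification the abstract highlights over WE-pretrained models.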