Research Blog

Long Short-Term Memory With Dynamic Skip Connections

In recent years, the long short-term memory network (LSTM) has been successfully used to model sequence data of variable length. However, LSTMs still encounter difficulties in capturing the long-term dependencies of natural language. To address this problem, some researchers have proposed fixed-length skip connections to improve the connection structure between time steps. Although this design is simple and effective, it still cannot handle variable-length dependencies. We therefore propose the following improvements:

Dynamic Skip Connections Mechanism

A dynamic skip connection mechanism is introduced to alleviate the weakness of LSTMs on long-distance dependencies. The mechanism can directly connect two dependent words, no matter how far apart they are.
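The core idea can be sketched as an LSTM step whose recurrent input is not always h_{t-1} but a past hidden state chosen dynamically from a window of recent steps. The sketch below is a minimal, illustrative implementation in NumPy, not the paper's exact formulation: the chooser here is a greedy learned scoring vector standing in for the paper's learned agent, and all names and shapes are assumptions.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step_with_skip(x_t, history, c_prev, params, K=3):
    """One LSTM step with a dynamic skip connection: the recurrent input
    is a hidden state chosen from the last K steps instead of always
    h_{t-1}. The greedy scorer `w_score` is an illustrative stand-in
    for the paper's learned agent."""
    Wx, Wh, b, w_score = params
    # Score each candidate past hidden state and pick the best one.
    cands = history[-K:]
    scores = np.array([w_score @ h for h in cands])
    h_skip = cands[int(np.argmax(scores))]
    # Standard LSTM gate computations, using the skipped-to hidden state.
    z = Wx @ x_t + Wh @ h_skip + b
    i, f, o, g = np.split(z, 4)
    c = sigmoid(f) * c_prev + sigmoid(i) * np.tanh(g)
    h = sigmoid(o) * np.tanh(c)
    return h, c
```

Because `h_skip` can come from several steps back, the gradient path between two distant but dependent words is shortened to a single hop.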


Dependency Learning Method Based On Reinforcement Learning

Since the training data contains no labels for these dependencies, we propose a new method based on reinforcement learning to learn the dependencies automatically from the data.
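Treating the skip choice as an action, the choosing policy can be trained with a policy-gradient signal from the downstream task. The following is a minimal REINFORCE-style update for a categorical skip policy, under assumed names and shapes; it is a sketch of the general technique, not the paper's exact training procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

def reinforce_skip_update(theta, state, reward, lr=0.1):
    """One REINFORCE update for a categorical skip policy.
    theta: (K, D) scoring matrix over K skip distances; state: (D,)
    summary of the current step; reward: scalar signal from the
    downstream task (e.g. final log-likelihood). Names illustrative."""
    logits = theta @ state
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    k = int(rng.choice(len(probs), p=probs))     # sample a skip distance
    # grad of log pi(k | state) w.r.t. theta is (onehot_k - probs) x state
    grad_log = (np.eye(len(probs))[k] - probs)[:, None] * state[None, :]
    return theta + lr * reward * grad_log, k
```

A positive reward pushes the policy toward the sampled skip distance; a negative reward pushes it away, so useful dependencies are reinforced without any dependency annotation.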

Tao Gui, Qi Zhang, Lujun Zhao, Yaosong Lin, Minlong Peng, Jingjing Gong, Xuanjing Huang, Long Short-Term Memory with Dynamic Skip Connections, AAAI-19.

Adversarial Multi-Criteria Learning For Chinese Word Segmentation

Chinese word segmentation is a fundamental task in Chinese natural language processing. Many segmentation corpora exist, each built through expensive and time-consuming manual annotation, but their segmentation standards are mutually inconsistent. As a result, segmentation models have traditionally been trained on a single corpus while the others were ignored, which wastes a large amount of annotated data. If the information in corpora with different segmentation standards could be exploited jointly, a model could be trained on much larger data and improve its accuracy under every standard, so the multi-criteria segmentation problem has high academic and practical value. Experiments show that our method can use data with different standards to improve Chinese segmentation: on 8 Chinese word segmentation datasets with different standards, we achieve better results than single-criterion training. The paper was published at ACL 2017 and won an Outstanding Paper Award.
Xinchi Chen, Zhan Shi, Xipeng Qiu, Adversarial Multi-Criteria Learning for Chinese Word Segmentation, ACL-2017.
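The adversarial multi-criteria idea can be illustrated with a shared-private training objective: a shared encoder feeds both a criterion-specific tagger and a corpus discriminator, and the discriminator's loss is subtracted (the effect of a gradient-reversal layer) so the shared features become criterion-invariant. The sketch below is a hedged illustration of this general shared-private adversarial setup; all shapes, weight names, and the `lam` coefficient are assumptions, not the paper's exact architecture.

```python
import numpy as np

def softmax_xent(logits, y):
    """Cross-entropy of one example: -log softmax(logits)[y]."""
    z = logits - logits.max()
    return float(np.log(np.exp(z).sum()) - z[y])

def multi_criteria_loss(shared, private, W_tag, W_disc, tag, corpus_id,
                        lam=0.05):
    """Training-signal sketch for adversarial multi-criteria learning.
    `shared` is the shared-encoder feature, `private` the
    criterion-specific one; the tagger sees both, the discriminator
    sees only `shared`. Subtracting the discriminator loss pushes the
    shared encoder toward corpus-invariant features. Illustrative only."""
    tag_logits = W_tag @ np.concatenate([shared, private])
    disc_logits = W_disc @ shared
    tag_loss = softmax_xent(tag_logits, tag)          # segmentation loss
    disc_loss = softmax_xent(disc_logits, corpus_id)  # which corpus?
    return tag_loss - lam * disc_loss
```

In this setup each corpus keeps its own private features (preserving its standard), while the shared features let every corpus contribute training signal to every other.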