Bidirectional RNN-based Attention Model for Joint Intent Detection and Slot Filling

Khaldoon H. Alhussayni, Alexander Zamyatin and S. Eman Alshamery


Abstract:

The natural language understanding (NLU) module is a critical component of dialogue systems, which interact with humans in natural language. The purpose of NLU is to translate user text into a formal representation that a computer can understand. NLU naturally includes identifying the user's intent, often referred to as intent detection, and extracting semantic constituents from the user's query, usually referred to as slot filling. Intent detection and slot filling each have a predefined set of labels used to assign an intent class and to fill slot values in a semantic frame. The two tasks are usually handled either by separate models or by a joint model. A joint model simplifies the NLU system, since only one model needs to be trained and fine-tuned, and it can optimise performance for both tasks; furthermore, it obtains a better semantic frame by learning the relationship between the intent and the slots. Joint training has been shown to improve intent detection accuracy and slot filling F1 score over independently trained models. This work presents a bidirectional RNN-based attention model for NLU that optimises intent detection and slot filling by combining local and global semantic context, focusing on the information in the input that is most relevant to the current output. Our proposed model outperforms attention-based baselines on all tasks on the benchmark ATIS dataset, with relative improvements of around 2.5% for the semantic frame, 1.3% for intent detection, and 0.3% for slot filling. When tested with intent attention only, without any additional features on the slot filling side, the results also outperform the baselines on all tasks: 2% for the semantic frame, 0.8% for intent detection, and 0.1% for slot filling.
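To illustrate the general idea of a joint bidirectional RNN with attention, the following is a minimal sketch, not the authors' exact architecture: a BiLSTM encoder produces local hidden states, an attention layer builds a global context vector for sentence-level intent classification, and that context is concatenated with each local state for token-level slot tagging. PyTorch is assumed, and all class names, layer sizes, and hyperparameters are hypothetical.

```python
import torch
import torch.nn as nn

class JointBiRNNAttention(nn.Module):
    """Illustrative sketch: BiLSTM encoder + attention for joint
    intent detection (sentence level) and slot filling (token level)."""

    def __init__(self, vocab_size, num_intents, num_slots,
                 emb_dim=128, hidden_dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Bidirectional LSTM yields 2 * hidden_dim features per token.
        self.encoder = nn.LSTM(emb_dim, hidden_dim,
                               batch_first=True, bidirectional=True)
        enc_dim = 2 * hidden_dim
        self.attn = nn.Linear(enc_dim, 1)            # one score per encoder state
        self.intent_out = nn.Linear(enc_dim, num_intents)
        self.slot_out = nn.Linear(2 * enc_dim, num_slots)

    def forward(self, tokens):
        # tokens: (batch, seq_len) integer word ids
        h, _ = self.encoder(self.embed(tokens))       # local states (B, T, 2H)
        weights = torch.softmax(self.attn(h), dim=1)  # attention weights (B, T, 1)
        context = (weights * h).sum(dim=1)            # global context (B, 2H)
        intent_logits = self.intent_out(context)      # sentence-level intent
        # Concatenate the global context to every local state for slot tagging.
        ctx = context.unsqueeze(1).expand_as(h)
        slot_logits = self.slot_out(torch.cat([h, ctx], dim=-1))  # (B, T, num_slots)
        return intent_logits, slot_logits
```

In this sketch the two output heads share the same encoder and attention, so a combined loss (intent cross-entropy plus per-token slot cross-entropy) trains both tasks jointly, which is the property the abstract attributes to joint models.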

Volume: 10 | Issue: 11

Pages: 404-410
