Leveraging BERT for Natural Language Understanding of Domain-Specific Knowledge
Tags: BERT, intent detection, natural language understanding, slot filling, task-oriented dialogue system
Abstract:
Natural Language Understanding (NLU) is a core task when building conversational agents, fulfilling the objectives of understanding the user's goal and detecting any valuable information related to it. NLU comprises Intent Detection and Slot Filling, which together semantically parse the user's utterance. One caveat when training a deep learning model for domain-specific NLU is the lack of in-domain datasets, which leads to poorly performing models. To overcome this, we experiment with fine-tuning BERT to jointly detect the user's intent and the related slots, using a custom-generated dataset built around an organization-specific knowledge base. Our results show that well-constructed datasets lead to high detection performance, and the resulting model has the potential to enhance a future task-oriented dialogue system.