Abstract:
Cross-domain slot filling is a widely explored problem in spoken language understanding (SLU), which requires a model to transfer between different domains under data-sparsity conditions. Dominant two-step hierarchical models first extract slot entities and then compute a similarity score between slot description-based prototypes and the last hidden layer of each slot entity, selecting the closest prototype as the predicted slot type. However, these models rely solely on slot descriptions as prototypes, which limits robustness. Moreover, they largely ignore the knowledge inherent in the slot entity embedding and therefore suffer from overfitting. In this letter, we propose a Robust Multi-prototypes Aware Integration (RMAI) method for zero-shot cross-domain slot filling. RMAI exploits more robust slot entity-based prototypes and the inherent knowledge in the slot entity embedding to improve classification performance and alleviate the risk of overfitting. Furthermore, a multi-prototypes aware integration approach is proposed to effectively combine the proposed slot entity-based prototypes with the slot description-based prototypes. Experimental results on the SNIPS dataset demonstrate the strong performance of RMAI.
Published in: IEEE Signal Processing Letters (Volume: 31)
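To make the two-step prototype matching described in the abstract concrete, the sketch below shows a nearest-prototype slot-type classifier: an extracted slot entity's embedding is compared by cosine similarity against one prototype vector per slot type, and the closest prototype gives the predicted type. All names, shapes, and the toy vectors are illustrative assumptions, not the paper's actual implementation or its RMAI integration scheme.

```python
import numpy as np

def classify_slot_entity(entity_emb, prototypes):
    """Return the slot type whose prototype vector has the highest
    cosine similarity with the slot-entity embedding.

    entity_emb: 1-D numpy array (slot-entity representation, e.g. a
                last-hidden-layer vector; dimensionality is illustrative).
    prototypes: dict mapping slot-type name -> 1-D prototype vector
                (e.g. built from slot descriptions, as in the baseline
                two-step models the abstract describes).
    """
    best_type, best_score = None, -np.inf
    for slot_type, proto in prototypes.items():
        # Cosine similarity between prototype and entity embedding.
        score = float(proto @ entity_emb) / (
            np.linalg.norm(proto) * np.linalg.norm(entity_emb)
        )
        if score > best_score:
            best_type, best_score = slot_type, score
    return best_type

# Toy example with 3-dimensional embeddings for two hypothetical slot types.
prototypes = {
    "city": np.array([1.0, 0.0, 0.0]),
    "artist": np.array([0.0, 1.0, 0.0]),
}
entity = np.array([0.9, 0.1, 0.0])
print(classify_slot_entity(entity, prototypes))  # prints "city"
```

The abstract's critique is that using only one description-based prototype per type (as above) is brittle; RMAI instead integrates multiple prototype sources, including slot entity-based ones.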