Task-oriented Domain-specific Meta-Embedding for Text Classification

Research output: Chapter in Book/Report/Conference proceeding › Conference paper › peer-review

Xin Wu, Yi Cai, Qing Li, Tao Wang, Kai Yang

Original language: English
Title of host publication: Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing
DOIs
Published: 1 Nov 2020

Abstract

Meta-embedding learning, which combines complementary information from different word embeddings, has shown superior performance across a range of Natural Language Processing tasks. However, existing meta-embedding methods still ignore domain-specific knowledge, which results in unstable performance on specific domains. Moreover, the relative importance of general and domain word embeddings depends on the downstream task, and how to regularize meta-embeddings to adapt to downstream tasks remains an unsolved problem. In this paper, we propose a method that incorporates both domain-specific and task-oriented information into meta-embeddings. We conduct extensive experiments on four text classification datasets, and the results show the effectiveness of the proposed method.
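To make the general idea concrete, the sketch below shows one common way a meta-embedding can be formed: project a general and a domain-specific word embedding into a shared space and combine them with attention weights that are trained end-to-end with the downstream classification loss, so the task itself decides how much each source contributes. This is an illustrative, minimal example in PyTorch under assumed dimensions and module names; it is not the authors' implementation.

```python
# Minimal sketch (assumptions): an attention-weighted meta-embedding layer that
# fuses a general embedding and a domain-specific embedding. All names and
# dimensions here are hypothetical, for illustration only.
import torch
import torch.nn as nn
import torch.nn.functional as F


class AttentionMetaEmbedding(nn.Module):
    def __init__(self, general_dim: int, domain_dim: int, out_dim: int):
        super().__init__()
        # Project each source embedding into a shared space.
        self.proj_general = nn.Linear(general_dim, out_dim)
        self.proj_domain = nn.Linear(domain_dim, out_dim)
        # Scores each projected source; trained with the task loss, so the
        # downstream task determines the general vs. domain weighting.
        self.score = nn.Linear(out_dim, 1)

    def forward(self, general_emb: torch.Tensor, domain_emb: torch.Tensor) -> torch.Tensor:
        # general_emb: (batch, seq_len, general_dim)
        # domain_emb:  (batch, seq_len, domain_dim)
        sources = torch.stack(
            [self.proj_general(general_emb), self.proj_domain(domain_emb)], dim=2
        )  # (batch, seq_len, 2, out_dim)
        weights = F.softmax(self.score(sources), dim=2)  # (batch, seq_len, 2, 1)
        # Weighted sum over the two sources yields the meta-embedding.
        return (weights * sources).sum(dim=2)  # (batch, seq_len, out_dim)


if __name__ == "__main__":
    layer = AttentionMetaEmbedding(general_dim=300, domain_dim=200, out_dim=256)
    g = torch.randn(4, 10, 300)  # e.g. general-purpose vectors (GloVe-sized)
    d = torch.randn(4, 10, 200)  # e.g. domain-specific vectors
    print(layer(g, d).shape)     # torch.Size([4, 10, 256])
```

Because the attention scorer receives gradients from the classifier's loss, the combination weights are task-oriented rather than fixed, which is the property the abstract highlights.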

