Paper
3 April 2023
Research on LatticeLSTM model based on data enhancement and self-attention mechanism
Yi Pan, Yong Zhou, He Liu, JinTao Zhang, JiaHua Wu
Proceedings Volume 12599, Second International Conference on Digital Society and Intelligent Systems (DSInS 2022); 1259928 (2023) https://doi.org/10.1117/12.2673458
Event: 2nd International Conference on Digital Society and Intelligent Systems (DSInS 2022), 2022, Chengdu, China
Abstract
Named Entity Recognition (NER) and Relation Extraction (RE) are two basic tasks in Natural Language Processing (NLP). Because entity boundaries in Chinese text are hard to distinguish and lack obvious formal markers, named entity recognition has long been a difficult problem in the Chinese-language setting. Although general Chinese NER has made good progress, models still lack semantic understanding in specialized domains, where results remain unsatisfactory. In this paper, deep learning algorithms and the self-attention mechanism are studied in depth. By improving the LatticeLSTM model and integrating a self-attention mechanism, the model's ability to understand Chinese semantics is improved; in addition, a small amount of labeled data is expanded through data augmentation to build a specialized-domain dataset and complete the named entity recognition task.
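The abstract does not include code, but the self-attention mechanism it integrates into LatticeLSTM is standard scaled dot-product attention over a sequence of character embeddings. A minimal NumPy sketch of that building block follows; the input `x` and the weight matrices `w_q`, `w_k`, `w_v` are hypothetical placeholders, not values from the paper.

```python
import numpy as np

def self_attention(x, w_q, w_k, w_v):
    """Scaled dot-product self-attention over a sequence of
    character embeddings x with shape (seq_len, d_model)."""
    q = x @ w_q                                # queries
    k = x @ w_k                                # keys
    v = x @ w_v                                # values
    d_k = k.shape[-1]
    scores = q @ k.T / np.sqrt(d_k)            # (seq_len, seq_len)
    # Row-wise softmax turns scores into attention weights.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v                         # context-weighted values

# Toy example: 4 characters with 8-dimensional embeddings.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
w_q, w_k, w_v = (rng.normal(size=(8, 8)) for _ in range(3))
out = self_attention(x, w_q, w_k, w_v)
print(out.shape)  # each position now mixes information from all positions
```

In the paper's setting, an output like `out` would feed into (or be combined with) the LatticeLSTM hidden states so that each character representation can attend to the whole sentence, which is what improves semantic understanding for Chinese NER.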
© (2023) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Yi Pan, Yong Zhou, He Liu, JinTao Zhang, and JiaHua Wu "Research on LatticeLSTM model based on data enhancement and self-attention mechanism", Proc. SPIE 12599, Second International Conference on Digital Society and Intelligent Systems (DSInS 2022), 1259928 (3 April 2023); https://doi.org/10.1117/12.2673458
KEYWORDS: Data modeling, Computer networks, Education and training, Performance modeling, Semantics, Deep learning, Artificial intelligence