Paper
12 October 2020
Multi-model transfer and optimization for cloze task
Jiahao Tang, Long Ling, Chenyu Ma, Hanwen Zhang, Jianqiang Huang
Proceedings Volume 11574, International Symposium on Artificial Intelligence and Robotics 2020; 115740T (2020) https://doi.org/10.1117/12.2579412
Event: International Symposium on Artificial Intelligence and Robotics (ISAIR), 2020, Kitakyushu, Japan
Abstract
Substantial progress has been made recently in training context-aware language models. CLOTH is a human-created cloze dataset that evaluates machine reading comprehension more faithfully than automatically generated alternatives. Although the authors of CLOTH experimented with BERT and context2vec, the performance of other models remains worth studying. We applied the CLOTH dataset to several other models and evaluated their performance in light of their different model mechanisms. The results show that ALBERT performs well on the cloze task: its accuracy is 92.24%, 6.34 percentage points higher than human performance. In addition, we introduce adversarial training into the models. Experiments show that adversarial training noticeably improves both the robustness and the accuracy of a model; on BERT-large, it raises accuracy by up to 0.15 percentage points.
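The abstract names two techniques without implementation detail, so the sketches below are illustrative reconstructions rather than the authors' code. The first is a minimal sketch of scoring a cloze blank with a pretrained ALBERT masked language model via the Hugging Face transformers library; the checkpoint name (albert-xxlarge-v2) and the first-subword scoring shortcut are assumptions, not details from the paper.

```python
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

# Hypothetical checkpoint; the paper does not name the ALBERT variant it used.
tokenizer = AutoTokenizer.from_pretrained("albert-xxlarge-v2")
model = AutoModelForMaskedLM.from_pretrained("albert-xxlarge-v2").eval()

def pick_option(sentence_with_blank, options):
    """Replace the blank with [MASK] and return the candidate whose first
    subword the masked LM scores highest (a simplification: multi-subword
    candidates are judged by their first piece only)."""
    text = sentence_with_blank.replace("_", tokenizer.mask_token)
    inputs = tokenizer(text, return_tensors="pt")
    mask_pos = (inputs.input_ids[0] == tokenizer.mask_token_id).nonzero().item()
    with torch.no_grad():
        logits = model(**inputs).logits[0, mask_pos]
    scores = [logits[tokenizer.convert_tokens_to_ids(tokenizer.tokenize(o))[0]].item()
              for o in options]
    return options[scores.index(max(scores))]

print(pick_option("She opened the _ and walked in.", ["door", "sky", "meal", "song"]))
```

The paper likewise does not say which adversarial-training scheme was applied to BERT-large; a common choice in this setting is FGM-style perturbation of the word-embedding matrix, sketched below under that assumption (epsilon is a hypothetical hyperparameter).

```python
import torch

class FGM:
    """Fast Gradient Method: nudge the embedding weights along the loss
    gradient, train on the perturbed input, then restore the weights."""
    def __init__(self, model, epsilon=1.0, emb_name="word_embeddings"):
        self.model, self.epsilon, self.emb_name = model, epsilon, emb_name
        self.backup = {}

    def attack(self):
        for name, param in self.model.named_parameters():
            if param.requires_grad and self.emb_name in name and param.grad is not None:
                self.backup[name] = param.data.clone()
                norm = torch.norm(param.grad)
                if norm != 0 and not torch.isnan(norm):
                    param.data.add_(self.epsilon * param.grad / norm)

    def restore(self):
        for name, param in self.model.named_parameters():
            if name in self.backup:
                param.data = self.backup[name]
        self.backup = {}

# Inside a fine-tuning loop:
#   loss = model(**batch).loss; loss.backward()   # gradients on clean input
#   fgm.attack()                                  # perturb embeddings
#   model(**batch).loss.backward()                # accumulate adversarial gradients
#   fgm.restore()                                 # undo the perturbation
#   optimizer.step(); optimizer.zero_grad()
```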
© (2020) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Jiahao Tang, Long Ling, Chenyu Ma, Hanwen Zhang, and Jianqiang Huang "Multi-model transfer and optimization for cloze task", Proc. SPIE 11574, International Symposium on Artificial Intelligence and Robotics 2020, 115740T (12 October 2020); https://doi.org/10.1117/12.2579412
KEYWORDS
Data modeling, Performance modeling, Transformers, Neural networks