Citation: ZHANG Chunxiang, SUN Ying, GAO Kexin, GAO Xueyao. Combine the Pre-trained Model with Bidirectional Gated Recurrent Units and Graph Convolutional Network for Adversarial Word Sense Disambiguation[J]. Journal of Electronics & Information Technology. doi: 10.11999/JEIT250386
[1] MENTE R, ALAND S, and CHENDAGE B. Review of word sense disambiguation and it’s approaches[EB/OL]. https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4097221, 2022. doi: 10.2139/ssrn.4097221.
[2] ABRAHAM A, GUPTA B K, MAURYA A S, et al. Naïve Bayes approach for word sense disambiguation system with a focus on parts-of-speech ambiguity resolution[J]. IEEE Access, 2024, 12: 126668–126678. doi: 10.1109/ACCESS.2024.3453912.
[3] WANG Yue, LIANG Qiliang, YIN Yaqi, et al. Disambiguate words like composing them: A morphology-informed approach to enhance Chinese word sense disambiguation[C]. Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics, Bangkok, Thailand, 2024: 15354–15365. doi: 10.18653/v1/2024.acl-long.819.
[4] LI Linlin, LI Juxing, WANG Hongli, et al. Application of the transformer model algorithm in Chinese word sense disambiguation: A case study in Chinese language[J]. Scientific Reports, 2024, 14(1): 6320. doi: 10.1038/s41598-024-56976-5.
[5] WAEL T, ELREFAI E, MAKRAM M, et al. Pirates at ArabicNLU2024: Enhancing Arabic word sense disambiguation using transformer-based approaches[C]. Proceedings of the Second Arabic Natural Language Processing Conference, Bangkok, Thailand, 2024: 372–376. doi: 10.18653/v1/2024.arabicnlp-1.31.
[6] MISHRA B K and JAIN S. Word sense disambiguation for Indic language using Bi-LSTM[J]. Multimedia Tools and Applications, 2024, 84(16): 16631–16656. doi: 10.1007/s11042-024-19499-9.
[7] LYU Meng and MO Shasha. HSRG-WSD: A novel unsupervised Chinese word sense disambiguation method based on heterogeneous sememe-relation graph[C]. Proceedings of the 19th International Conference on Advanced Intelligent Computing Technology and Applications, Zhengzhou, China, 2023: 623–633. doi: 10.1007/978-981-99-4752-2_51.
[8] PU Xiao, PAPPAS N, HENDERSON J, et al. Integrating weakly supervised word sense disambiguation into neural machine translation[J]. Transactions of the Association for Computational Linguistics, 2018, 6: 635–649. doi: 10.1162/tacl_a_00242.
[9] PADWAD H, KESWANI G, BISEN W, et al. Leveraging contextual factors for word sense disambiguation in Hindi language[J]. International Journal of Intelligent Systems and Applications in Engineering, 2024, 12(12s): 129–136.
[10] LI Zhi, YANG Fan, and LUO Yaoru. Context embedding based on Bi-LSTM in semi-supervised biomedical word sense disambiguation[J]. IEEE Access, 2019, 7: 72928–72935. doi: 10.1109/ACCESS.2019.2912584.
[11] BARBA E, PROCOPIO L, CAMPOLUNGO N, et al. MuLaN: Multilingual label propagation for word sense disambiguation[C]. Proceedings of the 29th International Joint Conference on Artificial Intelligence, Yokohama, Japan, 2021: 3837–3844. doi: 10.24963/ijcai.2020/531.
[12] JIA Xiaojun, ZHANG Yong, WU Baoyuan, et al. Boosting fast adversarial training with learnable adversarial initialization[J]. IEEE Transactions on Image Processing, 2022, 31: 4417–4430. doi: 10.1109/TIP.2022.3184255.
[13] RIBEIRO A H, SCHÖN T B, ZACHARIAH D, et al. Efficient optimization algorithms for linear adversarial training[C]. Proceedings of the 28th International Conference on Artificial Intelligence and Statistics, Mai Khao, Thailand, 2025: 1207–1215.
[14] LI J W, LIANG Renwei, YEH C H, et al. Adversarial robustness overestimation and instability in TRADES[EB/OL]. https://arxiv.org/abs/2410.07675, 2024.
[15] CHENG Xiwei, FU Kexin, and FARNIA F. Stability and generalization in free adversarial training[EB/OL]. https://arxiv.org/abs/2404.08980, 2024.
[16] ZHU Chen, CHENG Yu, GAN Zhe, et al. FreeLB: Enhanced adversarial training for natural language understanding[C]. Proceedings of the 8th International Conference on Learning Representations, Addis Ababa, Ethiopia, 2020: 11232–11245.
[17] BAI Tao, LUO Jinqi, ZHAO Jun, et al. Recent advances in adversarial training for adversarial robustness[C]. Proceedings of the 30th International Joint Conference on Artificial Intelligence, Montreal, Canada, 2021: 4312–4321. doi: 10.24963/ijcai.2021/591.
[18] ZHANG Liwei. Word sense disambiguation model based on Bi-LSTM[C]. Proceedings of the 2022 14th International Conference on Measuring Technology and Mechatronics Automation, Changsha, China, 2022: 848–851. doi: 10.1109/ICMTMA54903.2022.00172.
[19] KIM Y. Convolutional neural networks for sentence classification[C]. Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing, Doha, Qatar, 2014: 1746–1751. doi: 10.3115/v1/d14-1181.
[20] YAO Liang, MAO Chengsheng, and LUO Yuan. Graph convolutional networks for text classification[C]. Proceedings of the 33rd AAAI Conference on Artificial Intelligence, Honolulu, USA, 2019: 7370–7377. doi: 10.1609/aaai.v33i01.33017370.
[21] HAMILTON W L, YING Z, and LESKOVEC J. Inductive representation learning on large graphs[C]. Proceedings of the 31st International Conference on Neural Information Processing Systems, Long Beach, USA, 2017: 1025–1035.
[22] CUI Yiming, CHE Wanxiang, LIU Ting, et al. Pre-training with whole word masking for Chinese BERT[J]. IEEE/ACM Transactions on Audio, Speech, and Language Processing, 2021, 29: 3504–3514. doi: 10.1109/TASLP.2021.3124365.
[23] LIU Yinhan, OTT M, GOYAL N, et al. RoBERTa: A robustly optimized BERT pretraining approach[EB/OL]. https://arxiv.org/abs/1907.11692, 2019.
[24] CUI Yiming, CHE Wanxiang, LIU Ting, et al. Revisiting pre-trained models for Chinese natural language processing[C]. Proceedings of the Findings of the Association for Computational Linguistics: EMNLP, Online, 2020: 657–668. doi: 10.48550/arXiv.2004.13922.
[25] CUI Yiming, CHE Wanxiang, WANG Shijin, et al. LERT: A linguistically-motivated pre-trained language model[EB/OL]. https://ymcui.com/pdf/lert.pdf, 2022.
[26] STURUA S, MOHR I, AKRAM M K, et al. jina-embeddings-v3: Multilingual embeddings with task LoRA[EB/OL]. https://arxiv.org/abs/2409.10173, 2024.
[27] 张春祥, 张育隆, 高雪瑶. 基于多通道残差混合空洞卷积的注意力词义消歧[J]. 北京邮电大学学报, 2024, 47(5): 128–134. doi: 10.13190/j.jbupt.2023-179.
ZHANG Chunxiang, ZHANG Yulong, and GAO Xueyao. Multi-channel residual hybrid dilated convolution with attention for word sense disambiguation[J]. Journal of Beijing University of Posts and Telecommunications, 2024, 47(5): 128–134. doi: 10.13190/j.jbupt.2023-179.