Wavelet Transform and Attentional Dual-Path EEG Model for Virtual Reality Motion Sickness Detection

CHEN Yuechi, HUA Chengcheng, DAI Zhian, FU Jingqi, ZHU Min, WANG Qiuyu, YAN Ying, LIU Jia

Citation: CHEN Yuechi, HUA Chengcheng, DAI Zhian, FU Jingqi, ZHU Min, WANG Qiuyu, YAN Ying, LIU Jia. Wavelet Transform and Attentional Dual-Path EEG Model for Virtual Reality Motion Sickness Detection[J]. Journal of Electronics & Information Technology. doi: 10.11999/JEIT251233


doi: 10.11999/JEIT251233 cstr: 32379.14.JEIT251233
Funds: The National Natural Science Foundation of China (62206130), The Natural Science Foundation of Jiangsu Province (BK20200821), The Startup Foundation for Introducing Talent of NUIST (2020r075)
    About the authors:

    CHEN Yuechi: Female, master's student. Her research interests include EEG signal processing and virtual reality motion sickness.

    HUA Chengcheng: Male, associate professor. His research interests include EEG signals and brain-computer interfaces.

    DAI Zhian: Male, master's student. His research interest is EEG transfer learning algorithms.

    FU Jingqi: Male, master's student. His research interests include cross-subject EEG algorithms and graph neural networks.

    ZHU Min: Female, master's student. Her research interests include EEG signals and few-lead detection of virtual reality motion sickness.

    WANG Qiuyu: Master's student. Research interests include EEG signals, EEG feature extraction, and deep learning analysis of EEG signals.

    YAN Ying: Male, associate professor. His research interests include fault diagnosis and prognosis of complex systems, brain-computer interfaces, machine learning, and optimization.

    LIU Jia: Female, professor. Her research interests include multimodal sentiment analysis, computer vision and image processing, and virtual/augmented reality.

    Corresponding author:

    HUA Chengcheng, huachengcheng@nuist.edu.cn

  • CLC number: TN911.7; TP391

  • Abstract: Virtual Reality Motion Sickness (VRMS) is the severe dizziness that users experience in Virtual Reality (VR) environments when vestibular and visual information conflict, and it hinders the application and adoption of immersive VR technology. This paper proposes WTATNet, a dual-path fusion model combining wavelet transforms with attention mechanisms, which provides a new method for objective VRMS detection by decoupling the temporal and spatial features of resting-state ElectroEncephaloGraphy (EEG) recorded after exposure to VR motion stimuli. The model consists of two branches. Branch 1 applies a two-dimensional discrete wavelet transform over the time and lead (electrode) dimensions of the EEG and feeds the resulting wavelet coefficients into convolutional layers for feature extraction. In branch 2, the EEG is first filtered by a one-dimensional convolutional layer and then passed through a channel-attention module and a lead-attention module in turn to strengthen the key features along the feature-channel and lead dimensions. The features of the two branches are finally fused and classified. The VR game "超级滑翔翼2" was used to induce VRMS in subjects, and their resting-state EEG recorded before and after the task was used to evaluate WTATNet, i.e., to classify motion-sick versus non-motion-sick resting-state EEG. The model achieves a VRMS recognition accuracy, F1-score, precision, and recall (averaged over ten-fold cross-validation) of 98.39%, 98.39%, 98.38%, and 98.40%, respectively, outperforming current state-of-the-art EEG recognition models. The results show that the proposed method can detect VRMS and can support further study of its causes and mitigation, offering guidance for optimizing VR systems.
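
    To make the two-branch design concrete, the sketch below shows one way a wavelet branch and an attention branch could be combined, assuming PyTorch and PyWavelets; all layer sizes, the db4 wavelet, and the form of the attention modules are illustrative assumptions, not the authors' implementation.

```python
# Minimal dual-path EEG classifier sketch in the spirit of WTATNet (assumptions throughout).
import numpy as np
import pywt
import torch
import torch.nn as nn


class ChannelAttention(nn.Module):
    """Squeeze-and-excitation style weighting over feature channels (assumed form)."""
    def __init__(self, channels, reduction=4):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction), nn.ReLU(),
            nn.Linear(channels // reduction, channels), nn.Sigmoid())

    def forward(self, x):                          # x: (batch, channels, leads, time)
        w = self.fc(x.mean(dim=(2, 3)))            # global average pool -> channel weights
        return x * w[:, :, None, None]


class LeadAttention(nn.Module):
    """Same idea applied along the electrode (lead) dimension (assumed form)."""
    def __init__(self, leads, reduction=5):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(leads, leads // reduction), nn.ReLU(),
            nn.Linear(leads // reduction, leads), nn.Sigmoid())

    def forward(self, x):                          # x: (batch, channels, leads, time)
        w = self.fc(x.mean(dim=(1, 3)))            # pool over channels and time -> lead weights
        return x * w[:, None, :, None]


class DualPathEEGNet(nn.Module):
    def __init__(self, n_leads=30, n_classes=2):
        super().__init__()
        # Branch 1: convolution over 2-D DWT sub-bands (computed outside the autograd graph).
        self.wavelet_conv = nn.Sequential(
            nn.Conv2d(4, 16, kernel_size=3, padding=1), nn.BatchNorm2d(16), nn.ELU(),
            nn.AdaptiveAvgPool2d((4, 8)), nn.Flatten())
        # Branch 2: 1-D temporal filtering, then channel and lead attention.
        self.temporal = nn.Conv2d(1, 16, kernel_size=(1, 25), padding=(0, 12))
        self.chan_att = ChannelAttention(16)
        self.lead_att = LeadAttention(n_leads)
        self.reduce = nn.Sequential(nn.AdaptiveAvgPool2d((4, 8)), nn.Flatten())
        self.classifier = nn.Linear(2 * 16 * 4 * 8, n_classes)

    @staticmethod
    def dwt2(eeg):
        """2-D DWT over the (lead, time) plane; stack the four sub-bands as input channels."""
        cA, (cH, cV, cD) = pywt.dwt2(eeg.cpu().numpy(), "db4")   # eeg: (batch, leads, time)
        return torch.from_numpy(np.stack([cA, cH, cV, cD], axis=1)).float()

    def forward(self, eeg):                        # eeg: (batch, leads, time)
        f1 = self.wavelet_conv(self.dwt2(eeg))     # wavelet-branch features
        x = self.temporal(eeg.unsqueeze(1))        # (batch, 16, leads, time)
        x = self.lead_att(self.chan_att(x))        # emphasize key channels, then key leads
        f2 = self.reduce(x)                        # attention-branch features
        return self.classifier(torch.cat([f1, f2], dim=1))


# Smoke test on random data: 8 epochs of 30-lead, 500-sample resting-state EEG.
logits = DualPathEEGNet()(torch.randn(8, 30, 500))
print(logits.shape)                                # torch.Size([8, 2])
```
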
  • Figure 1  Experimental procedure

    Figure 2  Dual-path model based on wavelet transform and lead attention (WTATNet)

    Figure 3  Schematic of the two lead-ordering schemes

    Figure 4  Structure of the two attention modules

    Figure 5  Feature visualization for the comparison experiments

    Figure 6  Feature visualization for the ablation experiments

    Table 1  One-dimensional lead ordering methods

    Method      | Lead order
    Horizontal  | [Fp1, Fp2, F11, F7, F3, Fz, F4, F8, F12, FT11, FC3, FCz, FC4, FT12, T7, C3, Cz, C4, T8, CP3, CPz, CP4, P7, P3, Pz, P4, P8, O1, Oz, O2]
    Vertical    | [F11, FT11, Fp1, F7, T7, P7, F3, FC3, C3, CP3, P3, O1, Fz, FCz, Cz, CPz, Pz, Oz, F4, FC4, C4, CP4, P4, O2, Fp2, F8, T8, P8, F12, FT12]
    Random 1    | [O1, C3, P3, C4, F12, FT11, Oz, Pz, FC4, Fp1, F3, Cz, Fz, FT12, FCz, P7, Fp2, F11, P4, F7, CP4, P8, T8, O2, CPz, F8, FC3, T7, CP3, F4]
    Random 2    | [F8, O2, Fz, P8, F12, O1, FC4, CP4, FCz, P3, T8, F3, F7, Pz, Cz, T7, CPz, FT11, C3, P4, Fp1, Fp2, P7, FC3, CP3, C4, F4, Oz, F11, FT12]
    Random 3    | [C4, FC4, P7, Fp2, F4, P8, Fp1, P3, FT11, Cz, F3, F11, T8, CP3, F7, F12, O2, F8, Oz, CP4, FT12, CPz, Pz, T7, C3, FC3, P4, FCz, Fz, O1]
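    These orderings are applied by permuting the lead axis of each EEG epoch before it enters the network. The short sketch below illustrates this with the horizontal order from Table 1; the reorder_leads helper and the alphabetical recording order are assumptions made for the example, not part of the paper.

```python
import numpy as np

# Horizontal ordering from Table 1; the other orderings are permutations of the same 30 leads.
HORIZONTAL = ["Fp1", "Fp2", "F11", "F7", "F3", "Fz", "F4", "F8", "F12", "FT11",
              "FC3", "FCz", "FC4", "FT12", "T7", "C3", "Cz", "C4", "T8", "CP3",
              "CPz", "CP4", "P7", "P3", "Pz", "P4", "P8", "O1", "Oz", "O2"]

def reorder_leads(eeg, recorded_order, target_order=HORIZONTAL):
    """Permute the lead axis of an (n_leads, n_samples) array into target_order."""
    idx = [recorded_order.index(name) for name in target_order]
    return eeg[idx, :]

# Example: 30 leads x 1000 samples, recorded here (for illustration) in alphabetical order.
rng = np.random.default_rng(0)
recorded = sorted(HORIZONTAL)
data = rng.standard_normal((30, 1000))
reordered = reorder_leads(data, recorded)
assert reordered.shape == (30, 1000)
```
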

    Table 2  Comparison experiment results (mean ± standard deviation) (%)

    Model            | Accuracy    | F1          | Precision   | Recall
    WTATNet (ours)   | 98.39±0.56  | 98.39±0.53  | 98.38±0.53  | 98.40±0.52
    DeepConvNet      | 67.49±6.03  | 64.00±8.13  | 68.12±5.88  | 78.98±2.73
    ShallowConvNet   | 95.55±4.03  | 95.49±4.13  | 95.44±4.20  | 96.14±3.16
    EEGNet           | 72.40±4.45  | 70.76±5.32  | 72.91±4.08  | 79.56±2.34
    EEGNet_old       | 67.78±3.73  | 65.04±4.98  | 68.32±3.58  | 76.79±2.08
    EEGNet_SSVEP     | 79.25±4.59  | 78.58±4.92  | 79.59±4.42  | 83.68±2.91
    ConTraNet[33]    | 84.97±6.05  | 84.66±6.57  | 85.09±6.03  | 96.82±3.75
    Conformer[34]    | 93.34±2.28  | 93.32±2.29  | 93.42±2.24  | 93.84±1.86
    FBCNet[35]       | 98.04±0.92  | 98.03±0.93  | 98.02±0.95  | 98.08±0.82

    Table 3  Model ablation results under horizontal and vertical electrode orderings (mean ± standard deviation) (%)

    Horizontal ordering
    Model    | Accuracy    | F1          | Precision   | Recall
    WTATNet  | 98.39±0.56  | 98.39±0.53  | 98.38±0.53  | 98.40±0.52
    Model 1  | 96.61±0.55  | 96.60±0.55  | 96.59±0.56  | 96.64±0.54
    Model 2  | 97.03±0.60  | 97.02±0.61  | 96.99±0.63  | 97.15±0.54
    Model 3  | 97.38±0.33  | 97.38±0.33  | 97.36±0.32  | 97.41±0.35
    Model 4  | 96.73±0.61  | 96.72±0.61  | 96.69±0.62  | 96.81±0.59
    Model 5  | 96.57±0.47  | 96.57±0.47  | 96.57±0.47  | 96.58±0.47
    Model 6  | 97.15±0.64  | 97.14±0.64  | 97.13±0.65  | 97.18±0.62
    Model 7  | 96.81±0.34  | 96.80±0.34  | 96.78±0.35  | 96.86±0.32
    Model 8  | 96.61±0.73  | 96.61±0.73  | 96.59±0.74  | 96.65±0.73

    Vertical ordering
    Model    | Accuracy    | F1          | Precision   | Recall
    WTATNet  | 97.97±0.50  | 97.97±0.50  | 97.97±0.50  | 97.98±0.49
    Model 1  | 96.12±0.74  | 96.11±0.73  | 96.10±0.73  | 96.16±0.74
    Model 2  | 96.91±0.54  | 96.90±0.54  | 96.86±0.55  | 97.03±0.51
    Model 3  | 94.54±5.37  | 94.46±5.55  | 94.57±5.22  | 95.28±3.82
    Model 4  | 95.66±0.57  | 95.65±0.58  | 95.60±0.61  | 95.81±0.47
    Model 5  | 96.91±0.68  | 96.91±0.68  | 96.89±0.68  | 96.95±0.67
    Model 6  | 97.20±0.46  | 97.20±0.46  | 97.19±0.46  | 97.24±0.44
    Model 7  | 97.05±0.42  | 97.05±0.42  | 97.02±0.42  | 97.12±0.43
    Model 8  | /           | /           | /           | /

    Table 4  WTATNet results under different lead reordering methods (mean ± standard deviation) (%)

    Ordering   | Accuracy    | F1          | Precision   | Recall
    Horizontal | 98.39±0.56  | 98.39±0.53  | 98.38±0.53  | 98.40±0.52
    Vertical   | 97.97±0.50  | 97.97±0.50  | 97.97±0.50  | 97.98±0.49
    Random 1   | 96.30±0.68  | 96.29±0.69  | 96.29±0.66  | 96.36±0.73
    Random 2   | 97.09±0.39  | 97.08±0.39  | 97.07±0.38  | 97.13±0.39
    Random 3   | 96.95±0.64  | 96.94±0.64  | 96.92±0.63  | 96.99±0.65
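    The figures in Tables 2 to 4 are averages over ten-fold cross-validation. The sketch below shows one common way such mean ± standard deviation entries are computed with scikit-learn; the logistic-regression placeholder and random data merely stand in for the actual model and EEG features.

```python
import numpy as np
from sklearn.model_selection import StratifiedKFold
from sklearn.metrics import accuracy_score, f1_score, precision_score, recall_score
from sklearn.linear_model import LogisticRegression   # placeholder classifier

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 64))       # placeholder features (e.g. flattened EEG features)
y = rng.integers(0, 2, 200)              # motion-sick vs. non-motion-sick labels

scores = {"accuracy": [], "F1": [], "precision": [], "recall": []}
for train_idx, test_idx in StratifiedKFold(n_splits=10, shuffle=True, random_state=0).split(X, y):
    clf = LogisticRegression(max_iter=1000).fit(X[train_idx], y[train_idx])
    pred = clf.predict(X[test_idx])
    scores["accuracy"].append(accuracy_score(y[test_idx], pred))
    scores["F1"].append(f1_score(y[test_idx], pred))
    scores["precision"].append(precision_score(y[test_idx], pred))
    scores["recall"].append(recall_score(y[test_idx], pred))

# Report each metric as "mean ± std" over the ten folds, in percent.
for name, vals in scores.items():
    print(f"{name}: {100 * np.mean(vals):.2f} ± {100 * np.std(vals):.2f} (%)")
```
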
  • [1] LIU Ran, XU Miao, ZHANG Yanzhen, et al. A pilot study on electroencephalogram-based evaluation of visually induced motion sickness[J]. Journal of Imaging Science and Technology, 2020, 64(2): 020501. doi: 10.2352/J.ImagingSci.Technol.2020.64.2.020501.
    [2] LIANG Tie, HONG Lei, XIAO Jinzhuang, et al. Directed network analysis reveals changes in cortical and muscular connectivity caused by different standing balance tasks[J]. Journal of Neural Engineering, 2022, 19(4): 046021. doi: 10.1088/1741-2552/ac7d0c.
    [3] KENNEDY R S, DREXLER J, and KENNEDY R C. Research in visually induced motion sickness[J]. Applied Ergonomics, 2010, 41(4): 494–503. doi: 10.1016/j.apergo.2009.11.006.
    [4] GU Zhantao, DING Ding, CHEN Yiting, et al. Evaluation and mitigation methods for virtual reality motion sickness[J]. Journal of Zhejiang University: Science Edition, 2025, 52(1): 30–37. doi: 10.3785/j.issn.1008-9497.2025.01.004.
    [5] GOLDING J F. Motion sickness susceptibility[J]. Autonomic Neuroscience, 2006, 129(1/2): 67–76. doi: 10.1016/j.autneu.2006.07.019.
    [6] CAI Li, WENG Dongdong, ZHANG Zhenliang, et al. Impact of consistency between visually perceived movement and real movement on cybersickness[J]. Journal of System Simulation, 2016, 28(9): 1950–1956. doi: 10.16182/j.cnki.joss.2016.09.004.
    [7] MITTELSTAEDT J M. Individual predictors of the susceptibility for motion-related sickness: A systematic review[J]. Journal of Vestibular Research, 2020, 30(3): 165–193. doi: 10.3233/ves-200702.
    [8] XU Zichao, ZHANG Ling, XIAO Shuifeng, et al. Motion sickness mechanism and control techniques: Research progress and prospect[J]. Academic Journal of Naval Medical University, 2024, 45(8): 923–928. doi: 10.16781/j.CN31-2187/R.20240056.
    [9] CAI Yongqing, HAN Cheng, QUAN Wei, et al. Visual induced motion sickness estimation model based on attention mechanism[J]. Journal of Zhejiang University: Engineering Science, 2025, 59(6): 1110–1118. doi: 10.3785/j.issn.1008-973X.2025.06.002.
    [10] FENG Naishi, ZHOU Bin, ZHANG Qianqian, et al. A comprehensive exploration of motion sickness process analysis from EEG signal and virtual reality[J]. Computer Methods and Programs in Biomedicine, 2025, 264: 108714. doi: 10.1016/j.cmpb.2025.108714.
    [11] CAI Mengpu, CHEN Junxiang, HUA Chengcheng, et al. EEG emotion recognition using EEG-SWTNS neural network through EEG spectral image[J]. Information Sciences, 2024, 680: 121198. doi: 10.1016/j.ins.2024.121198.
    [12] CHAUDARY E, KHAN S A, and MUMTAZ W. EEG-CNN-souping: Interpretable emotion recognition from EEG signals using EEG-CNN-souping model and explainable AI[J]. Computers and Electrical Engineering, 2025, 123: 110189. doi: 10.1016/j.compeleceng.2025.110189.
    [13] MEISER A, LENA KNOLL A, and BLEICHNER M G. High-density ear-EEG for understanding ear-centered EEG[J]. Journal of Neural Engineering, 2024, 21(1): 016001. doi: 10.1088/1741-2552/ad1783.
    [14] LIAO C Y, TAI S K, CHEN R C, et al. Using EEG and deep learning to predict motion sickness under wearing a virtual reality device[J]. IEEE Access, 2020, 8: 126784–126796. doi: 10.1109/access.2020.3008165.
    [15] HAN Min, SUN Leilei, and HONG Xiaojun. Extraction of the EEG signal feature based on echo state networks[J]. Journal of Biomedical Engineering, 2012, 29(2): 206–211.
    [16] ZHANG Cheng, TANG Xuan, YANG Dongping, et al. Method for EEG signal feature detection based on deep learning[J]. Electronic Design Engineering, 2024, 32(15): 156–160. doi: 10.14022/j.issn1674-6236.2024.15.033.
    [17] DAS A, SINGH S, KIM J, et al. Enhanced EEG signal classification in brain computer interfaces using hybrid deep learning models[J]. Scientific Reports, 2025, 15(1): 27161. doi: 10.1038/s41598-025-07427-2.
    [18] RAPARTHI M, MITTA N R, DUNKA V K, et al. Deep learning model for patient emotion recognition using EEG-tNIRS data[J]. Neuroscience Informatics, 2025, 5(3): 100219. doi: 10.1016/j.neuri.2025.100219.
    [19] GUO Xiang, LIANG Ruiqi, XU Shule, et al. An investigation of echo state network for EEG-based emotion recognition with deep neural networks[J]. Biomedical Signal Processing and Control, 2026, 111: 108342. doi: 10.1016/j.bspc.2025.108342.
    [20] CHEN He, SONG Yan, and LI Xiaoli. A deep learning framework for identifying children with ADHD using an EEG-based brain network[J]. Neurocomputing, 2019, 356: 83–96. doi: 10.1016/j.neucom.2019.04.058.
    [21] WANG Teng, HUANG Xiaoqiao, XIAO Zenan, et al. EEG emotion recognition based on differential entropy feature matrix through 2D-CNN-LSTM network[J]. EURASIP Journal on Advances in Signal Processing, 2024, 2024(1): 49. doi: 10.1186/s13634-024-01146-y.
    [22] WANG Chunli, LI Jinxu, GAO Yuxin, et al. A short-time window electroencephalogram auditory attention decoding network based on multi-dimensional characteristics of temporal-spatial-frequency[J]. Journal of Electronics & Information Technology, 2025, 47(3): 814–824. doi: 10.11999/JEIT240867.
    [23] DHONGADE D, CAPTAIN K, and DAHIYA S. EEG-based schizophrenia detection: Integrating discrete wavelet transform and deep learning[J]. Cognitive Neurodynamics, 2025, 19(1): 62. doi: 10.1007/s11571-025-10248-8.
    [24] MINHAS R, PEKER N Y, HAKKOZ M A, et al. Improved drowsiness detection in drivers through optimum pairing of EEG features using an optimal EEG channel comparable to a multichannel EEG system[J]. Medical & Biological Engineering & Computing, 2025, 63(10): 3019–3036. doi: 10.1007/s11517-025-03375-1.
    [25] SHEN Mingkan, WEN Peng, SONG Bo, et al. An EEG based real-time epilepsy seizure detection approach using discrete wavelet transform and machine learning methods[J]. Biomedical Signal Processing and Control, 2022, 77: 103820. doi: 10.1016/j.bspc.2022.103820.
    [26] WEN Dong, JIAO Wenlong, LI Xiaoling, et al. The EEG signals steganography based on wavelet packet transform-singular value decomposition-logistic[J]. Information Sciences, 2024, 679: 121006. doi: 10.1016/j.ins.2024.121006.
    [27] QIN Yuxin, LI Baojiang, WANG Wenlong, et al. ETCNet: An EEG-based motor imagery classification model combining efficient channel attention and temporal convolutional network[J]. Brain Research, 2024, 1823: 148673. doi: 10.1016/j.brainres.2023.148673.
    [28] ZHOU Kai, HAIMUDULA A, and TANG Wanying. Dual-branch convolution network with efficient channel attention for EEG-based motor imagery classification[J]. IEEE Access, 2024, 12: 74930–74943. doi: 10.1109/access.2024.3404634.
    [29] KIEU H D. Graph attention network for motor imagery classification[C]. 2024 RIVF International Conference on Computing and Communication Technologies (RIVF), Danang, Vietnam, 2024: 255–260. doi: 10.1109/RIVF64335.2024.11009062.
    [30] LENG Jiancai, GAO Licai, JIANG Xiuquan, et al. A multi‐feature fusion graph attention network for decoding motor imagery intention in spinal cord injury patients[J]. Journal of Neural Engineering, 2024, 21(6): 066044. doi: 10.1088/1741-2552/ad9403.
    [31] HUA Chengcheng, ZHOU Zhanfeng, TAO Jianlong, et al. Virtual reality motion sickness recognition model driven by lead-attention and brain connection[J]. Journal of Electronics & Information Technology, 2025, 47(4): 1161–1171. doi: 10.11999/JEIT240440.
    [32] ALI O, SAIF-UR-REHMAN M, GLASMACHERS T, et al. ConTraNet: A hybrid network for improving the classification of EEG and EMG signals with limited training data[J]. Computers in Biology and Medicine, 2024, 168: 107649. doi: 10.1016/j.compbiomed.2023.107649.
    [33] SONG Yonghao, ZHENG Qingqing, LIU Bingchuan, et al. EEG conformer: Convolutional transformer for EEG decoding and visualization[J]. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 2023, 31: 710–719. doi: 10.1109/tnsre.2022.3230250.
    [34] MANE R, CHEW E, CHUA K, et al. FBCNet: A multi-view convolutional neural network for brain-computer interface[J]. arXiv preprint arXiv: 2104.01233, 2021. doi: 10.48550/arXiv.2104.01233.
    [35] PAN Junting, SAYROL E, GIRO-I-NIETO X, et al. Shallow and deep convolutional networks for saliency prediction[C]. The IEEE Conference on Computer Vision and Pattern Recognition, Las Vegas, USA, 2016: 598–606. doi: 10.1109/CVPR.2016.71.
    [36] LIN Yanfei, ZANG Boyu, GUO Rongxiao, et al. A deep learning method for SSVEP classification based on phase and frequency characteristics[J]. Journal of Electronics & Information Technology, 2022, 44(2): 446–454. doi: 10.11999/JEIT210816.
    [37] XIONG Peng, LIU Xuepeng, DU Haiman, et al. Detection of ECG signal P and T wave based on stationary and continuous wavelet transform fusion[J]. Journal of Electronics & Information Technology, 2021, 43(5): 1441–1447. doi: 10.11999/JEIT200049.
Publication history
  • Received:  2025-11-24
  • Revised:  2026-01-10
  • Accepted:  2026-01-12
  • Published online:  2026-01-24
