Volume 47, Issue 6, Jun. 2025
LI Ning, WANG Zan, SHU Gaofeng, ZHANG Tingwei, GUO Zhengwei. Siamese Network-assisted Multi-domain Feature Fusion for Radar Active Jamming Recognition Method[J]. Journal of Electronics & Information Technology, 2025, 47(6): 1837-1849. doi: 10.11999/JEIT240797

Siamese Network-assisted Multi-domain Feature Fusion for Radar Active Jamming Recognition Method

doi: 10.11999/JEIT240797 cstr: 32379.14.JEIT240797
Funds:  The Natural Science Foundation of Henan (242300421170)
  • Received Date: 2024-09-14
  • Rev Recd Date: 2025-04-18
  • Available Online: 2025-05-08
  • Publish Date: 2025-06-30
Objective  The rapid development of electronic warfare technology has introduced complex scenarios in which active jamming presents considerable challenges to radar systems. On modern battlefields, the electromagnetic environment is highly congested, and various forms of active jamming signals frequently disrupt radar functionality. Although existing recognition algorithms can identify certain types of radar active jamming, their performance declines under low Jamming-to-Noise Ratio (JNR) conditions or when training data are scarce. Low JNR reduces the detectability of jamming signals by conventional methods, and limited sample sizes further constrain recognition accuracy. To address these challenges, neural network-based methods have emerged as viable alternatives. This study proposes a radar active jamming recognition approach based on multi-domain feature fusion assisted by a Siamese network, which enhances recognition capability under low JNR and small-sample conditions. The proposed method offers an intelligent framework for improving jamming recognition in complex environments and provides theoretical support for battlefield awareness and the design of effective counter-jamming strategies.

Methods  The proposed method comprises a multi-domain feature fusion subnetwork, a Siamese architecture, and a joint loss design. To extract jamming features effectively under low JNR conditions, a multi-domain feature fusion subnetwork is developed. Specifically, a semi-soft thresholding shrinkage module is constructed by integrating a semi-soft threshold function with an attention mechanism; this module extracts time-domain features efficiently and removes the need for manual threshold selection (a minimal illustrative sketch is given below). To enhance the extraction of time-frequency domain features, a multi-scale convolution module and an additional attention mechanism are incorporated. To reduce the model’s dependence on large training datasets, a weight-sharing Siamese network is constructed. By comparing the similarity between sample pairs, this network increases the number of effective training iterations, thereby mitigating the limitations imposed by small sample sizes. Finally, three loss functions are jointly applied: an improved weighted contrastive loss, an adaptive cross-entropy loss, and a triplet loss. This joint strategy promotes intra-class compactness and inter-class separability of jamming features.
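The module below is a minimal sketch, not the paper’s implementation: it assumes a per-channel attention branch that scales the mean absolute feature value into a lower threshold, pairs it with a fixed-ratio upper threshold, and applies a standard semi-soft (firm) shrinkage rule; the exact threshold function, attention design, and hyperparameters used by the authors may differ.

```python
# Illustrative sketch only (assumptions noted above); PyTorch is assumed.
import torch
import torch.nn as nn


class SemiSoftShrinkage(nn.Module):
    """Attention-estimated semi-soft (firm) thresholding for 1-D feature maps."""

    def __init__(self, channels: int, ratio: float = 2.0):
        super().__init__()
        self.ratio = ratio  # upper threshold t2 = ratio * t1 (assumed fixed)
        # Attention branch: per-channel statistics -> scaling factor in (0, 1).
        self.fc = nn.Sequential(
            nn.Linear(channels, channels),
            nn.ReLU(inplace=True),
            nn.Linear(channels, channels),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, length) time-domain feature maps.
        abs_x = x.abs()
        avg = abs_x.mean(dim=-1)                 # (batch, channels)
        t1 = (self.fc(avg) * avg).unsqueeze(-1)  # learned lower threshold
        t2 = self.ratio * t1                     # upper threshold
        # Semi-soft (firm) thresholding:
        #   0                                    if |x| <= t1
        #   sign(x) * t2 * (|x| - t1)/(t2 - t1)  if t1 < |x| <= t2
        #   x                                    if |x| >  t2
        mid = torch.sign(x) * t2 * (abs_x - t1) / (t2 - t1 + 1e-8)
        return torch.where(abs_x <= t1, torch.zeros_like(x),
                           torch.where(abs_x <= t2, mid, x))
```

Because the thresholds are tied to the feature magnitudes, the shrinkage adapts to the signal level at each JNR, which is what removes the dependence on a manually chosen threshold.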
Results and Discussions  When the number of training samples is limited (Table 6), the proposed method achieves an accuracy of 96.88% at a JNR of –6 dB with only 20 training samples, indicating its effectiveness under data-scarce conditions. When the sample size is reduced further, to only 15 training samples per jamming type, the recognition performance of the other methods declines substantially, whereas the proposed method maintains higher recognition accuracy, demonstrating enhanced stability and robustness under low JNR and limited-sample conditions. This performance advantage is attributable to three factors: (1) Multi-domain feature fusion integrates jamming features from multiple domains, preventing the loss of discriminative information commonly observed under low JNR conditions. (2) The weight-sharing Siamese network increases the number of effective training iterations by evaluating sample similarities, thereby mitigating the limitations associated with small datasets. (3) The combined use of an improved weighted contrastive loss, an adaptive cross-entropy loss, and a triplet loss promotes intra-class compactness and inter-class separability of jamming features, enhancing the model’s generalization capability.

Conclusions  This study proposes a radar active jamming recognition method that performs effectively under low JNR and limited training sample conditions. A multi-domain feature fusion subnetwork is developed to extract representative features from both the time and time-frequency domains, enabling a more comprehensive and discriminative characterization of jamming signals. A weight-sharing Siamese network is then introduced to reduce reliance on large training datasets by leveraging sample similarity comparisons to expand the number of training iterations. In addition, three loss functions (an improved weighted contrastive loss, an adaptive cross-entropy loss, and a triplet loss) are jointly applied to promote intra-class compactness and inter-class separability; a sketch of this pairing and joint loss is given below. Experimental results validate the effectiveness of the proposed method: at a JNR of –6 dB with only 20 training samples, it achieves a recognition accuracy of 96.88%, demonstrating robustness and adaptability in challenging electromagnetic environments. These findings provide technical support for the development of anti-jamming strategies and enhance the operational reliability of radar systems in complex battlefield scenarios.
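The sketch below shows, under stated assumptions, how the weight-sharing Siamese comparison and the joint loss could be wired together: the hypothetical `backbone` stands in for the multi-domain feature fusion subnetwork, batches are assumed to be sampled as (anchor, positive, negative) triplets, and plain contrastive, cross-entropy, and triplet losses serve as stand-ins for the improved weighted, adaptive, and triplet variants, whose exact formulations are not reproduced here.

```python
# Illustrative sketch only; PyTorch is assumed, and `backbone` is a placeholder
# for the multi-domain feature fusion subnetwork described in the abstract.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SiameseRecognizer(nn.Module):
    """Weight-sharing encoder plus classification head: every branch reuses
    the same backbone parameters, which is what makes the network Siamese."""

    def __init__(self, backbone: nn.Module, feat_dim: int, num_classes: int):
        super().__init__()
        self.backbone = backbone
        self.classifier = nn.Linear(feat_dim, num_classes)

    def forward(self, x: torch.Tensor):
        feat = self.backbone(x)
        return feat, self.classifier(feat)


def joint_loss(model: SiameseRecognizer, anchor, positive, negative, labels,
               margin: float = 1.0, w_con: float = 1.0,
               w_ce: float = 1.0, w_tri: float = 1.0) -> torch.Tensor:
    # anchor/positive share a jamming class; negative comes from a different class.
    f_a, logits_a = model(anchor)
    f_p, _ = model(positive)   # same weights as the anchor branch
    f_n, _ = model(negative)

    # Contrastive term: pull the same-class pair together and push the
    # different-class pair beyond the margin (plain contrastive loss as a
    # stand-in for the improved weighted version).
    d_pos = F.pairwise_distance(f_a, f_p)
    d_neg = F.pairwise_distance(f_a, f_n)
    con = (d_pos.pow(2) + F.relu(margin - d_neg).pow(2)).mean()

    # Cross-entropy term keeps the anchor classification accurate
    # (stand-in for the adaptive cross-entropy loss).
    ce = F.cross_entropy(logits_a, labels)

    # Triplet term enforces d(anchor, positive) + margin < d(anchor, negative).
    tri = F.triplet_margin_loss(f_a, f_p, f_n, margin=margin)

    return w_con * con + w_ce * ce + w_tri * tri
```

Because each real sample can appear in many (anchor, positive, negative) combinations, pair sampling multiplies the number of effective training comparisons, which is how the Siamese setup eases the small-sample limitation described above.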