Citation: XIE Guobo, LIAO Wenkang, LIN Zhiyi, ZHANG Jiayuan. Remote sensing images change detection based on MCRASN[J]. Journal of Applied Optics, 2024, 45(2): 430-437. DOI: 10.5768/JAO202445.0203005

Remote sensing images change detection based on MCRASN

Abstract: To improve the accuracy of change detection on co-registered high-resolution remote sensing image pairs, a Siamese network combining mobile convolution and relative attention (mobile convolution and relative attention Siamese network, MCRASN) is proposed on the basis of ChangeFormer. The network combines mobile convolution and relative attention in a vertical layout to build a multi-stage combined encoder that replaces the original encoder, efficiently capturing the required multi-scale detail features and inter-pixel relationship information. The difference module is improved into a learnable distance metric module that performs the distance computation, and the equalized focal loss (EFL) is introduced to address the imbalance between positive and negative samples in the dataset, enabling accurate change detection. Experimental results show that the proposed MCRASN achieves better change detection performance on the LEVIR-CD dataset, with precision, recall, F1 score and overall accuracy of 93.94%, 89.26%, 91.54% and 99.18%, respectively, outperforming a range of previous detection methods.
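The abstract names two components without giving their details: a learnable distance metric that replaces the fixed difference module, and a focal-loss-style objective that counters the changed/unchanged pixel imbalance. As a rough illustration only, under assumed shapes, and with hypothetical names (`learnable_distance`, `focal_bce`), the two ideas can be sketched in NumPy as follows; the paper's actual modules may differ, and EFL itself further re-weights the focusing factor per category.

```python
import numpy as np

rng = np.random.default_rng(0)

def learnable_distance(f1, f2, w, b):
    """Instead of a fixed metric such as |f1 - f2|, concatenate the
    bi-temporal feature maps and project them with learned weights
    (a 1x1 convolution, written here as a matrix multiply over the
    channel axis, followed by ReLU)."""
    x = np.concatenate([f1, f2], axis=-1)    # (H, W, 2C)
    return np.maximum(x @ w + b, 0.0)        # ReLU(1x1 conv)

def focal_bce(probs, labels, gamma=2.0, alpha=0.25):
    """Focal-loss core: (1 - p_t)^gamma down-weights easy pixels so
    the rare 'changed' pixels dominate the gradient. This is the
    mechanism EFL builds on, not EFL itself."""
    probs = np.clip(probs, 1e-7, 1.0 - 1e-7)
    p_t = np.where(labels == 1, probs, 1.0 - probs)   # prob. of true class
    a_t = np.where(labels == 1, alpha, 1.0 - alpha)   # class-balance term
    return -a_t * (1.0 - p_t) ** gamma * np.log(p_t)  # per-pixel loss

# Toy bi-temporal features with C = 4 channels on an 8 x 8 grid.
C = 4
f1 = rng.standard_normal((8, 8, C))
f2 = rng.standard_normal((8, 8, C))
w = rng.standard_normal((2 * C, C)) * 0.1            # learned parameters
b = np.zeros(C)
dist = learnable_distance(f1, f2, w, b)

# A confident correct prediction (0.9) vs. a hard one (0.1), both labeled 1:
# the hard pixel receives a far larger loss.
loss = focal_bce(np.array([0.9, 0.1]), np.array([1, 1]))
```

The focusing exponent `gamma` controls how aggressively easy (mostly unchanged) pixels are suppressed; with `gamma = 0` the expression reduces to ordinary class-weighted cross-entropy.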
