Citation: ZHANG Dongdong, WANG Chunping, FU Qiang. Detection network of critical parts for remote sensing ship based on semantic features[J]. Journal of Applied Optics, 2023, 44(3): 595-604. DOI: 10.5768/JAO202344.0303004

Detection network of critical parts for remote sensing ship based on semantic features

    Abstract: In near-shore scenes, ship critical parts suffer from a high false-detection probability and low detection accuracy due to background interference. To address these problems, a detection network for ship critical parts based on semantic features was proposed and named the critical part detection network (CPDNet). Firstly, by optimizing the network structure and introducing an attention mechanism, the feature expression ability and the perception of ship critical parts were improved. Secondly, a semantic mask module based on semantic information was designed to reduce the impact of background on detection accuracy. In addition, an angle parameter was added to make the network applicable to oriented targets. Finally, a ship critical parts dataset, named CP-Ship, was constructed to verify the effectiveness of the proposed network. Experimental results on the CP-Ship dataset show that the average precision of the proposed network is 11.35% higher than that of RetinaNet, and that, compared with other network models, it performs well in both detection accuracy and speed.
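
    As a rough illustration of two ideas named in the abstract, the semantic mask module and the added angle parameter, the following PyTorch sketch shows one way such components could be wired into a RetinaNet-style feature pipeline. It is not the authors' CPDNet code: the module names, channel sizes, and the residual gating form are assumptions made for the example.

```python
# Illustrative sketch only (not the authors' released code): a semantic-mask
# gating block and an oriented (angle-aware) regression head, assuming a
# RetinaNet-style FPN feature pipeline. Names and channel sizes are hypothetical.
import torch
import torch.nn as nn

class SemanticMask(nn.Module):
    """Predicts a per-pixel foreground probability map from a feature map and
    uses it to suppress background responses before the detection heads."""
    def __init__(self, channels=256):
        super().__init__()
        self.mask_head = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, 1, 1),
            nn.Sigmoid(),          # per-pixel foreground probability
        )

    def forward(self, feat):
        mask = self.mask_head(feat)   # (N, 1, H, W)
        return feat * mask + feat     # residual gating keeps the original cues

class OrientedHead(nn.Module):
    """Regression head that outputs (dx, dy, dw, dh, dtheta) per anchor,
    i.e. the extra angle parameter for oriented targets."""
    def __init__(self, channels=256, num_anchors=9):
        super().__init__()
        self.reg = nn.Conv2d(channels, num_anchors * 5, 3, padding=1)

    def forward(self, feat):
        return self.reg(feat)

if __name__ == "__main__":
    p3 = torch.randn(1, 256, 64, 64)      # one FPN level
    gated = SemanticMask()(p3)
    deltas = OrientedHead()(gated)
    print(gated.shape, deltas.shape)      # (1, 256, 64, 64) and (1, 45, 64, 64)
```

    In this sketch the gated features would feed the classification and regression heads at each pyramid level, and the fifth regression channel per anchor carries the orientation angle of the rotated box.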

     
