Citation: Tong Zhixue, Zhao Tao, Wang Xiaowei. Localization and ego-velocity estimation for vehicle based on binocular image sequences[J]. Journal of Applied Optics, 2017, 38(5): 764-769. DOI: 10.5768/JAO201738.0502004


Localization and ego-velocity estimation for vehicle based on binocular image sequences


Abstract: A real-time ranging, localization, and ego-velocity estimation method based on binocular image sequences is presented to obtain the relative position and velocity of a vehicle while it is moving. The method uses a vehicle-mounted binocular vision sensor to collect image sequences of the surrounding environment. The left and right images captured at the same instant are stereo-matched using speeded-up robust features (SURF) to recover the depth of environmental feature points, which provides the vehicle's ranging and localization. Meanwhile, feature points in two adjacent frames are tracked and matched, also with SURF. From the 3D coordinates of the corresponding points in the camera coordinate systems of the two adjacent frames, the transformation parameters of the camera coordinate system before and after the vehicle motion are computed, and the vehicle velocity is estimated from these parameters. Simulation experiments show that the method is feasible and the speed estimates are stable, with an average error within 6%.
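The pipeline described in the abstract (stereo SURF matching for depth, temporal SURF matching for the inter-frame rigid transform and speed) can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes a rectified stereo rig with known focal length f (pixels), baseline B (metres), principal point (cx, cy), and a fixed inter-frame interval dt; SURF requires an opencv-contrib build with the non-free modules enabled, and ORB could be substituted. The function names (surf_match, triangulate, rigid_transform, ego_speed) and the use of the Arun/Kabsch SVD method for the rigid transform are assumptions made here, not details taken from the paper.

```python
# Minimal sketch of the pipeline in the abstract (not the authors' code).
# Assumes a rectified stereo rig with focal length f (pixels), baseline B (m),
# principal point (cx, cy), and a fixed inter-frame interval dt (s).
import cv2
import numpy as np


def surf_match(img_a, img_b, ratio=0.7):
    """SURF keypoint matches between two images, kept after Lowe's ratio test."""
    # SURF lives in opencv-contrib (non-free); ORB is a drop-in alternative.
    surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)
    kp_a, des_a = surf.detectAndCompute(img_a, None)
    kp_b, des_b = surf.detectAndCompute(img_b, None)
    pairs = cv2.BFMatcher(cv2.NORM_L2).knnMatch(des_a, des_b, k=2)
    good = [p[0] for p in pairs
            if len(p) == 2 and p[0].distance < ratio * p[1].distance]
    pts_a = np.float32([kp_a[m.queryIdx].pt for m in good])
    pts_b = np.float32([kp_b[m.trainIdx].pt for m in good])
    return pts_a, pts_b


def triangulate(pts_l, pts_r, f, B, cx, cy):
    """Depth from disparity for a rectified pair: Z = f*B/d, then back-project."""
    d = pts_l[:, 0] - pts_r[:, 0]      # horizontal disparity in pixels
    keep = d > 1.0                     # drop tiny or negative disparities
    Z = f * B / d[keep]
    X = (pts_l[keep, 0] - cx) * Z / f
    Y = (pts_l[keep, 1] - cy) * Z / f
    return np.column_stack([X, Y, Z]), keep   # 3D points in the left-camera frame


def rigid_transform(P, Q):
    """Least-squares R, t with Q ~ R @ P + t (Arun/Kabsch SVD method)."""
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cP).T @ (Q - cQ)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:           # repair an improper reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = cQ - R @ cP
    return R, t


def ego_speed(P_prev, P_curr, dt):
    """Vehicle speed from the same 3D points observed in two consecutive frames."""
    _, t = rigid_transform(np.asarray(P_prev), np.asarray(P_curr))
    return np.linalg.norm(t) / dt      # static scene: |t| equals the ego-motion
```

In this sketch, P_prev and P_curr are the 3D coordinates of the same physical points reconstructed at frames k and k+1, obtained by chaining the stereo matches of each frame through the temporal matches; wrapping rigid_transform in an outlier-rejection loop such as RANSAC is a common refinement when the scene contains moving objects.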
