In the terminal visual guidance phase of autonomous UAV shipboard landing, estimating the relative pose between the UAV and the ship can be regarded as the airborne camera estimating the 3D motion of the deck plane. An optical-flow stratification method is proposed. First, an airborne camera with known focal length captures an image sequence of the landing-target region on the deck, and the optical flow field between adjacent frames is computed with the Lucas method. A stratified model then converts the nonlinear problem of recovering 3D motion from the optical flow field into two linear problems. The method requires no feature matching between images and linearly solves for the three-dimensional motion parameters of the landing-target region relative to the UAV, from which the relative pose between the ship and the UAV is obtained. Experiments on both computer-synthesized images and real camera images verify the correctness and effectiveness of the algorithm.
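For readers unfamiliar with the stratified formulation, the following standard planar-scene motion-field model illustrates why the estimation splits naturally into linear stages. It is a textbook result stated in normalized image coordinates under one common sign convention, and is not necessarily the exact parameterization used in the paper. For a camera undergoing rigid motion with angular velocity $\boldsymbol{\omega}=(\omega_x,\omega_y,\omega_z)$ and translational velocity $\mathbf{T}=(T_x,T_y,T_z)$, the motion field at normalized image point $(x,y)$ of a scene point at depth $Z$ is

$$
\begin{aligned}
u(x,y) &= \frac{-T_x + x\,T_z}{Z} + \omega_x xy - \omega_y\,(1+x^2) + \omega_z y,\\[2pt]
v(x,y) &= \frac{-T_y + y\,T_z}{Z} + \omega_x\,(1+y^2) - \omega_y xy - \omega_z x .
\end{aligned}
$$

If the scene is a single plane (here, the deck) satisfying $\mathbf{n}^\top\mathbf{X}=d$, then $1/Z=(n_1 x + n_2 y + n_3)/d$, and substituting this into the flow equations yields a quadratic flow field with eight independent coefficients:

$$
u = a_1 + a_2 x + a_3 y + a_7 x^2 + a_8 xy,\qquad
v = a_4 + a_5 x + a_6 y + a_7 xy + a_8 y^2 .
$$

Because $a_7$ and $a_8$ are shared between the two flow components, fitting $a_1,\dots,a_8$ to the measured flow field is a linear least-squares problem (the first linear stage); the rigid-motion parameters $(\boldsymbol{\omega},\ \mathbf{T}/d)$ and the deck-plane normal $\mathbf{n}$ are then recovered from the eight coefficients in a second stage.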
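A minimal sketch of the first stage is given below, assuming OpenCV and NumPy. It tracks corner features with the pyramidal Lucas-Kanade implementation (the paper computes a dense flow field over the target region, so sparse tracking is only an approximation), and the image file names and the intrinsics f, cx, cy are placeholder assumptions, not values from the paper.

```python
# Sketch: Lucas-Kanade flow between two frames of the landing-target region,
# followed by a linear least-squares fit of the 8-parameter planar-flow model.
import cv2
import numpy as np

def planar_flow_coefficients(frame0_path="deck_t0.png", frame1_path="deck_t1.png"):
    img0 = cv2.imread(frame0_path, cv2.IMREAD_GRAYSCALE)
    img1 = cv2.imread(frame1_path, cv2.IMREAD_GRAYSCALE)

    # Track corner features with the (pyramidal) Lucas-Kanade method.
    p0 = cv2.goodFeaturesToTrack(img0, maxCorners=400, qualityLevel=0.01, minDistance=7)
    p1, status, _ = cv2.calcOpticalFlowPyrLK(img0, img1, p0, None)
    good0 = p0[status.ravel() == 1].reshape(-1, 2)
    good1 = p1[status.ravel() == 1].reshape(-1, 2)

    # Convert pixel coordinates to normalized camera coordinates
    # (f, cx, cy are assumed intrinsics of the calibrated airborne camera).
    f, cx, cy = 800.0, img0.shape[1] / 2.0, img0.shape[0] / 2.0
    x, y = (good0[:, 0] - cx) / f, (good0[:, 1] - cy) / f
    u = (good1[:, 0] - good0[:, 0]) / f
    v = (good1[:, 1] - good0[:, 1]) / f

    # First linear problem: fit the quadratic planar-flow model
    #   u = a1 + a2*x + a3*y + a7*x^2 + a8*x*y
    #   v = a4 + a5*x + a6*y + a7*x*y + a8*y^2
    n = x.size
    zero, one = np.zeros(n), np.ones(n)
    Au = np.column_stack([one, x, y, zero, zero, zero, x * x, x * y])
    Av = np.column_stack([zero, zero, zero, one, x, y, x * y, y * y])
    A = np.vstack([Au, Av])
    b = np.concatenate([u, v])
    coeffs, *_ = np.linalg.lstsq(A, b, rcond=None)
    # a1..a8; the 3D motion and deck-plane parameters are recovered from
    # these coefficients in the second (also linear, per the paper) stage.
    return coeffs

if __name__ == "__main__":
    print(planar_flow_coefficients())
```

The fit stays linear regardless of the image-axis convention; only the interpretation of the coefficients in the second stage depends on it.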