Research on Observability Analysis and Dynamic Filtering Algorithms for Monocular Vision/Inertial Integrated Navigation
Abstract
Vision/inertial integrated navigation, with its strong complementarity and autonomy, has become a new research focus and an important development direction in the navigation field. Urgent demand in both military and civil applications highlights its research and application value, making it one of the most promising navigation technologies. The vision/inertial integrated navigation system is strongly nonlinear; to estimate the system state with high accuracy under both linear and angular motion of the vehicle, the fundamental properties of the nonlinear system must be understood thoroughly, in particular the observability characteristics that are closely tied to state-estimation performance. Meanwhile, most existing visual navigation algorithms require observations of point or line features with known position or size, and are therefore difficult to apply in unknown environments. This dissertation therefore focuses on two problems of monocular vision/inertial integrated navigation in unknown environments: observability analysis and dynamic filtering algorithms. The main work is as follows:
     1) An improved pure monocular-vision motion-estimation algorithm based on line features is proposed. Building on Goddard's line-feature method, an error in its derivation is identified and corrected, and the target's acceleration is augmented into the state vector, overcoming the limitation that Goddard's method applies only when the relative velocity changes slowly.
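The core idea of item 1 — augmenting the target's acceleration into the filter state so the slowly-changing-velocity assumption can be dropped — can be sketched in one dimension with a linear Kalman filter standing in for the thesis's EKF (hypothetical noise values; the thesis works with full 6-DoF relative motion and line-feature measurements):

```python
import numpy as np

dt = 0.1
# State: [position, velocity, acceleration] -- acceleration augmented
# so the filter no longer assumes a slowly changing relative velocity.
F = np.array([[1.0, dt, 0.5 * dt**2],
              [0.0, 1.0, dt],
              [0.0, 0.0, 1.0]])
H = np.array([[1.0, 0.0, 0.0]])    # only position is observed
Q = np.eye(3) * 1e-4
R = np.array([[1e-2]])

x = np.zeros(3)
P = np.eye(3)

def kf_step(x, P, z):
    # Predict with the constant-acceleration model, then update.
    x = F @ x
    P = F @ P @ F.T + Q
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(3) - K @ H) @ P
    return x, P

# Track a target accelerating at a constant 0.5 m/s^2; a filter
# without the acceleration state would lag this trajectory.
true_a = 0.5
for k in range(200):
    t = (k + 1) * dt
    z = np.array([0.5 * true_a * t**2])
    x, P = kf_step(x, P, z)
```

After the run, the augmented state has recovered the acceleration even though only position is measured.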
     2) The limitations of monocular visual navigation without inertial aiding are analyzed. For high-dynamic or long-range applications, filtering algorithms based on either point or line features estimate vehicle attitude with limited accuracy; as attitude errors accumulate, the positioning error in long-distance navigation grows rapidly and nonlinearly with distance. Hence, in unknown environments, and especially in high-dynamic or long-range applications, pure monocular visual navigation cannot complete the navigation task on its own, and other sensors, such as an inertial system, must be introduced.
     3) The observability conditions of a monocular vision/inertial integrated navigation system with a high-accuracy IMU are derived. The error equations are linearized and an EKF estimates the error state. With the camera height known, the PWCS method is used to prove the observability of the attitude and velocity errors of the integrated navigation for both point and line features. When the camera looks horizontally forward, observability of the attitude and velocity errors requires observing at least three non-collinear point features on the ground in the point-feature case, and at least two non-parallel line features on the ground in the line-feature case; the two conditions are equivalent.
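The mechanics of the PWCS test — stack the observability matrix of each piece-wise constant segment and check the rank of the stacked matrix — can be illustrated on a toy 2-state system (a simplified form of the test; the thesis applies it to the full integrated error model):

```python
import numpy as np

def pwcs_observability_rank(segments):
    """Stack each segment's local observability matrix
    [H; HF; ...; HF^(n-1)] and return the rank of the stacked
    matrix (simplified PWCS-style test)."""
    blocks = []
    for F, H in segments:
        n = F.shape[0]
        rows, M = [], H.copy()
        for _ in range(n):
            rows.append(M)
            M = M @ F
        blocks.append(np.vstack(rows))
    return np.linalg.matrix_rank(np.vstack(blocks))

# Toy discrete position-velocity system.
F = np.array([[1.0, 0.1],
              [0.0, 1.0]])
H_pos = np.array([[1.0, 0.0]])   # measuring position: both states observable
H_vel = np.array([[0.0, 1.0]])   # measuring velocity: position unobservable

rank_pos = pwcs_observability_rank([(F, H_pos)])
rank_vel = pwcs_observability_rank([(F, H_vel)])
```

The rank deficit in the second case identifies the unobservable direction, exactly the kind of conclusion the PWCS analysis delivers for the integrated error states.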
     4) A filtering algorithm for monocular vision/inertial integrated navigation with a high-accuracy IMU is proposed. For point features, the integrated navigation operates in a velocity-matching mode: the inertial navigation system supplies attitude, so the camera need not estimate its own orientation. When feature points are indistinct, the good short-term accuracy of the INS maintains navigation continuity. To improve the visual motion-estimation accuracy, the camera scale-factor error is calibrated online through the integration filter. Compared with a pure-vision algorithm that periodically corrects attitude, short-distance cart experiments and long-distance vehicle experiments show that the filtering algorithm achieves higher positioning accuracy and a lower growth rate of positioning error over time. For line features, a dual-quaternion operator is introduced to construct an expression for the camera position increment; the camera-estimated velocity is fused with the INS-measured velocity, and the filter equations correct the navigation parameters. Experiments show the algorithm achieves a high line-feature matching rate and can track features across many consecutive frames.
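A minimal one-dimensional sketch of the velocity-matching mode with online scale-factor calibration (all models and noise values hypothetical): the observation is the INS-minus-camera velocity difference, and a Kalman filter estimates the INS velocity error together with the camera scale-factor error, which becomes identifiable once the speed varies:

```python
import numpy as np

# State: [INS velocity error dv, camera scale-factor error ds],
# both modeled as slow random walks.
Q = np.diag([1e-7, 1e-9])
R = np.array([[1e-3]])

x = np.zeros(2)
P = np.diag([1.0, 1e-2])

rng = np.random.default_rng(0)
true_dv, true_ds = 0.3, 0.02
for k in range(2000):
    v = 5.0 + 2.0 * np.sin(0.01 * k)   # varying true speed excites ds
    # First-order model: z = v_ins - v_cam ~= dv - v * ds
    z = np.array([true_dv - v * true_ds]) + rng.normal(0.0, 0.03, 1)
    H = np.array([[1.0, -v]])
    # Predict (random-walk states), then update.
    P = P + Q
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(2) - K @ H) @ P
```

With constant speed the two states are only observable in combination; the speed variation in the loop is what separates them, mirroring the excitation requirements discussed in the observability analysis.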
     5) The observability conditions of a monocular vision/inertial integrated navigation system with a low-accuracy IMU and a magnetometer are derived. For a low-accuracy inertial system, linearizing the error equations introduces a large nonlinearity error, so a matrix Kalman filter estimates the attitude matrix and the velocity directly as states, avoiding the error caused by linearizing the nonlinear system. Based on Lie-derivative theory, an observability matrix is constructed to derive the feature-observation sets that make the system observable; with the camera height known and the gyro drift and magnetic variation estimated, the observability criteria are analyzed. For point features, observability of the attitude and velocity requires at least three non-collinear point features on the ground; for line features, at least two non-parallel lines on the ground; in both cases the vehicle motion must excite at least one rotational degree of freedom, and the two conditions are equivalent. Compared with the observability results of traditional visual navigation, the conditions for monocular vision/inertial integration require fewer features and no longer require feature positions or sizes to be known, which is of practical value for the design and implementation of integrated navigation systems in unknown environments.
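The Lie-derivative construction can be shown on a toy system (a hypothetical unicycle-style model with only its first coordinate measured, not the thesis's camera/IMU state): stack the gradients of the output and of its successive Lie derivatives along the drift field, then test the rank of the stacked matrix:

```python
import sympy as sp

x1, x2, theta = sp.symbols('x1 x2 theta')
state = sp.Matrix([x1, x2, theta])
f = sp.Matrix([sp.cos(theta), sp.sin(theta), 0])  # drift vector field
h = sp.Matrix([x1])                               # only x1 is measured

def lie_derivative(phi, f, x):
    # L_f phi = (d phi / d x) * f
    return phi.jacobian(x) * f

# Stack the gradients of h, L_f h, L_f^2 h and test the rank.
rows = [h.jacobian(state)]
Lh = h
for _ in range(2):
    Lh = lie_derivative(Lh, f, state)
    rows.append(Lh.jacobian(state))
O = sp.Matrix.vstack(*rows)
rank = O.rank()   # rank 2 < 3: the coordinate x2 is unobservable
```

The rank deficit pinpoints an unobservable direction; in the thesis the same test, applied to the full system, yields the feature-count and rotational-excitation conditions quoted above.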
     6) A filtering algorithm for monocular vision/inertial integrated navigation with a low-accuracy IMU and a magnetometer is proposed. For both point and line features, the attitude matrix as a whole is estimated as a state. Compared with a quaternion attitude description, the attitude matrix has a clearer physical meaning. Compared with other nonlinear filtering algorithms, introducing the whole attitude matrix lets the original nonlinear process be written as a pseudo-linear process, containing only first-order terms in the state with no quadratic or higher-order terms, which simplifies the filter design.
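The pseudo-linear property can be seen directly from the attitude kinematics: Ċ = C·[ω×] is linear in the entries of C, so vectorizing C (column-major) gives vec(Ċ) = ([ω×]ᵀ ⊗ I₃)·vec(C), a process model with only first-order state terms. A propagation-only sketch with a hypothetical constant body rate (the thesis's MKF additionally carries velocity states and measurement updates):

```python
import numpy as np

def skew(w):
    # Cross-product (skew-symmetric) matrix of a 3-vector.
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

dt = 0.01
omega = np.array([0.0, 0.0, 0.1])      # constant body rate (rad/s)

# Discrete pseudo-linear transition for vec(C):
# vec identity: vec(C @ W) = (W.T kron I3) @ vec(C).
Phi = np.eye(9) + dt * np.kron(skew(omega).T, np.eye(3))

c = np.eye(3).reshape(9, order='F')    # column-major vec of C, start at identity
for _ in range(1000):                  # propagate for 10 s
    c = Phi @ c
C = c.reshape(3, 3, order='F')
# After 10 s at 0.1 rad/s, C is close to a 1 rad rotation about z.
```

Because the transition acts linearly on vec(C), a standard (matrix) Kalman filter can propagate the attitude state without the linearization step an EKF would need.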
Vision/SINS integrated navigation is an important topic in integrated navigation and has attracted considerable attention from both civil and military areas. Since the vision/SINS integrated navigation system is nonlinear, observability analysis is necessary in order to estimate the system state precisely under linear and angular dynamics. Most visual navigation algorithms need observations of known features, which is hard to satisfy in an unknown environment. Motivated by these insufficiencies, the main purpose of this dissertation is to address two important and challenging issues of monocular camera/SINS integrated navigation in unknown environments: observability analysis and dynamic filter algorithms. The main contributions include the following aspects.
     1. The estimation of the position and orientation of a moving object from a monocular camera was investigated. An EKF was designed following the Goddard method, and some mistakes in that method were rectified. At the same time, the second derivatives of the relative translation and rotation of the object were added to the state vector. Compared with the Goddard method, simulated and actual experimental data show that the improved method achieves better precision and robustness.
     2. The insufficiency of monocular visual navigation was analyzed. The improved method still cannot estimate the state precisely under high dynamics or over long distances; it needs to be combined with other sensors, such as SINS.
     3. For the monocular camera/high-accuracy IMU integrated navigation system, an EKF is used to estimate the error state. Under the condition that the camera mounting height is known, the observability of the attitude and velocity errors is proved by the PWCS method. For point features, observability of the attitude and velocity errors requires that at least three non-collinear features be detected on the ground; for line features, at least two non-parallel line features. The two conditions are equivalent.
     4. EKF algorithms based on point and line features, respectively, are proposed. For point features, a velocity-matching mode is used in the integrated navigation. With the help of the SINS, the camera need not estimate orientation itself; the SINS continuously provides the camera with orientation. The camera velocity, solved after sensor alignment and time synchronization, is used to correct the SINS velocity and position through a Kalman filter, and the integrated navigation calibrates the camera scale factor online. Compared with a visual navigation method that periodically updates the orientation estimate, the proposed method is more accurate and more robust, with a lower rate of error growth. For line features, dual quaternions are introduced to describe the position and attitude of the camera, and the proposed algorithm derives the increment formula relating image features to camera position. The difference between the velocities computed by the SINS and by the camera is chosen as the observation of the integrated navigation, and a Kalman filter corrects the integrated navigation errors, including the camera scale factor. The experimental results show the proposed algorithm is accurate on structured roads.
     5. For the monocular camera/low-accuracy IMU/magnetometer integrated navigation system, a matrix Kalman filter (MKF) is used to estimate the state. Under the condition that the camera mounting height is known and the gyro biases and magnetic variations are estimated, the observability analysis is performed directly on the nonlinear system using Lie derivatives. For point features, observability of the attitude and velocity requires at least three non-collinear features detected on the ground; for line features, at least two non-parallel line features; in both cases at least one rotational degree of freedom must be excited. The observability conditions for attitude and velocity are equivalent for point and line features. Compared with visual navigation, the observability conditions of monocular camera/IMU integration need fewer features, and the positions of the features no longer need to be known, which is valuable for the filter design and realization of the integrated navigation system.
     6. MKF algorithms based on point and line features, respectively, are proposed. The filter is designed so that the estimate of the state matrix is expressed in terms of the matrix parameters of the original plant. The MKF preserves the natural formulation of the attitude matrix, rearranging the original nonlinear process model into a pseudo-linear process model.
