CLC number: TK242    Document code: A    Article ID: 0493-2137(2019)12-1262-08

Hou Yonghong¹, Liu Yan¹, Lü Hualong¹, Wu Qi¹, Zhao Jian², Chen Yanfang²
(1. School of Electrical and Information Engineering, Tianjin University, Tianjin 300072, China;
2. Tianjin Zhongwei Aerospace Data System Technology Co., Ltd., Tianjin 300072, China)
Abstract: Unmanned aerial vehicles (UAVs) have wide application in wilderness search and rescue, environmental exploration, and other fields, and UAVs with autonomous navigation functions have recently been reported. To mitigate the real-time autonomous navigation problem of UAVs, a rotor UAV system was designed and implemented herein. The designed UAV system can autonomously perceive an unknown outdoor environment and realize real-time trajectory planning. The bundle-adjustment-optimized (BA-optimized) ORB SLAM2 algorithm is first used to obtain the posture information of the UAV from a binocular camera. The push-broom perception method and a modified sum of absolute differences (SAD) algorithm are then used to obtain environmental information and obstacle points. Local obstacle maps are constructed from the environmental obstacle points and the UAV posture information using the parallel calculation framework CUDA to improve system performance. To solve the real-time autonomous navigation problem, the modified SAD algorithm focuses only on the sparse matching of pixel blocks with a fixed parallax size. According to the current local obstacle map and a local trajectory library, the motion track can be selected independently, achieving real-time local trajectory planning. All of the above functions are processed on the embedded NVIDIA Jetson TX2 carried by the drone. Simulation and actual flying experiments show that the designed system realizes real-time autonomous perception and trajectory planning for UAVs in unknown outdoor scenes; when the resolution of the captured video is 1 280×720, the processing speed reaches 60 frames per second. In summary, this design provides a reference for improving the obstacle avoidance and navigation functions of low-cost UAVs.

Keywords: binocular vision; visual odometer; parallel calculation; local trajectory planning; autonomous navigation

Unmanned aerial vehicles (UAVs) are widely used in wilderness search and rescue, environmental monitoring, and related tasks[1-4]. With the development of embedded graphic processing units (GPUs), onboard visual processing has become practical for small, low-cost platforms. Navigation that relies only on GPS degrades or fails in complex environments, which motivates vision-based autonomous navigation[5-6]. Simultaneous localization and mapping (SLAM) estimates the vehicle pose without depending on GPS[7], and deep-learning and RGB-D based methods have also been applied to UAV perception and interaction[8-11]. ORB-SLAM[12] and ORB-SLAM2, which supports stereo and RGB-D cameras[13], are representative feature-based visual SLAM systems. In this paper, a rotor UAV system is designed that combines a BA-optimized ORB-SLAM2 visual odometer, push-broom stereo perception with a modified SAD algorithm, CUDA-based parallel computation, and a local trajectory library to achieve real-time autonomous navigation in unknown outdoor scenes.
1 System design

The aerial platform is a quadrotor built on a Tarot 650 frame with a take-off weight of about 2 kg. A Pixhawk flight controller performs the low-level flight control and exchanges data with the onboard computer through the robot operating system (ROS) and the MavLink protocol. The onboard computer is an NVIDIA Jetson TX2, which integrates a quad-core ARM A57 CPU and a 256-core CUDA GPU with 8 GB of memory, consumes about 7.5 W, and delivers roughly 1.3 TOPS (trillion operations per second) of computing power. A StereoLabs ZED binocular camera with a sensing range of up to 20 m provides the stereo video stream. On this platform, a BA-optimized ORB-SLAM2 visual odometer, a push-broom perception module based on a modified SAD algorithm, a local obstacle mapping module, and a local trajectory planner run together so that the UAV can navigate autonomously without relying on GPS.

Visual odometry (VO) estimates the rotation R and translation t of the camera between frames. The visual odometer in this system is based on ORB-SLAM[12-13], which extracts and matches ORB features[14].
An ORB feature combines an oriented FAST corner[15] with a BRIEF (binary robust independent elementary features) descriptor[16]. The matched features are used to estimate the camera pose by PnP (perspective-n-point), and the pose is then refined by bundle adjustment (BA). BA optimizes the camera orientation R ∈ SO(3) and position t ∈ R³ by minimizing the reprojection error between the matched keypoints x_i and the corresponding 3D map points X_i ∈ R³:

$$\{R,t\}=\mathop{\arg\min}_{R,t}\sum_{i}\rho\big(\lVert x_i-\pi(RX_i+t)\rVert^2\big)\qquad(1)$$

$$\pi(X)=\begin{bmatrix} f_x X/Z+c_x\\ f_y Y/Z+c_y\\ f_x (X-b)/Z+c_x \end{bmatrix}\qquad(2)$$

where ρ(·) is the Huber robust cost function, (f_x, f_y) is the focal length, b is the baseline of the binocular camera, (c_x, c_y) is the principal point, and (X, Y, Z) are the coordinates of the map point in the camera frame. Following ORB-SLAM[12], a local BA jointly refines the poses of the local keyframes K_L and the 3D map points P_L observed by them:

$$\{X^i,R_l,t_l\mid i\in P_L,\,l\in K_L\}=\mathop{\arg\min}_{X^i,R_l,t_l}\sum_{k\in K_L}\sum_{j\in\chi_k}\rho(E_{kj}),\qquad E_{kj}=\lVert x^j-\pi(R_k X^j+t_k)\rVert^2\qquad(3)$$

where χ_k is the set of matches between the keypoints of keyframe k and the points in P_L, and E_{kj} is the corresponding reprojection error. With the BA-optimized ORB-SLAM2 running on the stereo images, the system obtains the pose of the UAV in real time without GPS.
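As a concrete illustration of Eqs. (1) and (2), the following C++ sketch (using Eigen, on which ORB-SLAM2 itself depends) evaluates the stereo reprojection residual and its Huber-weighted cost for a single observation. The intrinsic parameters and the Huber threshold are placeholder values chosen for illustration, not values taken from this system.

#include <Eigen/Core>

// Stereo pinhole projection of Eq. (2): returns (u_left, v_left, u_right).
Eigen::Vector3d projectStereo(const Eigen::Vector3d& Xc,          // point in the camera frame
                              double fx, double fy, double cx, double cy,
                              double b)                            // stereo baseline
{
    const double invZ = 1.0 / Xc.z();
    return Eigen::Vector3d(fx * Xc.x() * invZ + cx,
                           fy * Xc.y() * invZ + cy,
                           fx * (Xc.x() - b) * invZ + cx);
}

// One robustified reprojection term of Eq. (1) for a single matched keypoint:
// rho(||x_obs - pi(R*X + t)||^2) with a Huber cost.
double reprojectionCost(const Eigen::Matrix3d& R, const Eigen::Vector3d& t,
                        const Eigen::Vector3d& Xw,                 // map point in the world frame
                        const Eigen::Vector3d& xObs,               // observed (u_L, v_L, u_R)
                        double fx, double fy, double cx, double cy, double b,
                        double huberDelta = 2.5)                   // assumed threshold, in pixels
{
    const Eigen::Vector3d r = xObs - projectStereo(R * Xw + t, fx, fy, cx, cy, b);
    const double e = r.norm();
    // Huber cost: quadratic for small residuals, linear for large ones.
    return (e <= huberDelta) ? e * e : 2.0 * huberDelta * e - huberDelta * huberDelta;
}

ORB-SLAM2 minimizes the sum of such terms over all observations with the g2o optimizer; the sketch only shows how a single term of Eq. (1) is evaluated.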
11、XRt 3 2j()jkj k kE xR Xt BA BA LK LP 3D K k k LP ORB SLAM2 GPS 2019 12 1265 LOG(,)4(,)(1,)Lxy Ixy Ix y(1,)(,1)(,1)Ix y Ixy Ixy 4(,)I xy(,)x y(,)Lxy SAD sum of abso-lute differences 17 SAD SAD CUDA 1280 720 12 12 NVIDIA Jetson TX2 GPU SAD 1 nn 2 3 d 4 SAD SAD left right0left right0|()()|()()nninniI i
12、IiSLiLi 5()Li 2 BfdZ(6)B f B f 5m d SAD 1266 52 SAD 2.1 2.2 PCL18 octomap 8 lb1xyx 7 x 01 y y y 7 3 4s 2s 2s 1 obsd refd obs refdd 2 01,Sss 0s 1s 0123,A aaaa 3a a 1 0s 1aa 2a 30aaa 0aa 3a 30aa 1s 1aa 2a a 0aa 3a 21aa 19 CPU NVIDIA Jetson TX2 GPU 256 CUDA CPU CUDA CUDA 2019 12 1267 NVIDIA Jetson TX2
3 Experimental results

The processing speed of the perception and mapping pipeline was first measured on the NVIDIA Jetson TX2 and compared with the push-broom stereo system of Ref. [20]:

Scheme     Resolution   Depth computation/ms   Map construction/ms   Frame rate/(frame/s)
1          640×480      27.48                  25.81                  29.6
2          640×480       5.32                   2.20                 122.0
3          1 280×720     8.68                   2.23                  85.0
Ref. [20]  376×240       -                      -                    120.0

Scheme 1 runs the pipeline without CUDA acceleration, while schemes 2 and 3 use the CUDA implementation on the GPU. At 640×480 the frame rate rises from 29.6 to 122.0 frame/s, about 4 times higher, and 85.0 frame/s is still reached at 1 280×720; the push-broom system of Ref. [20] reaches 120 frame/s only at a resolution of 376×240.

The system was then tested in simulation. A 3D outdoor scene was built with Unreal Engine 4 (UE4), and Microsoft AirSim[21] was used to simulate the quadrotor together with its GPS and IMU; the simulated sensor data and the control commands are exchanged through the AirSim API. In the simulated scene the UAV detected obstacles about 5 m ahead and selected avoidance trajectories from the local trajectory library.

Fig. 5 Simulation experiment, panels (a)-(d)
Outdoor flight experiments were then carried out on the real platform. The captured 1 280×720 video is processed onboard at 60 frames per second, and the UAV perceives obstacles about 5 m ahead and completes the avoidance maneuver autonomously.

Fig. 6 Outdoor flight experiment, panels (a)-(c)

4 Conclusion

A rotor UAV system for real-time autonomous perception and trajectory planning in unknown outdoor scenes was designed and implemented. A BA-optimized ORB-SLAM2 visual odometer provides the pose of the UAV, the push-broom perception method with a modified SAD algorithm extracts obstacle points, local obstacle maps are built with CUDA-based parallel computation, and motion tracks are selected from a local trajectory library; all modules run on the embedded NVIDIA Jetson TX2 and its GPU. Simulation and actual flight experiments show that the system processes 1 280×720 video at 60 frames per second, providing a reference for improving the obstacle avoidance and navigation functions of low-cost UAVs.

References
[1] Sun J, Li B, Jiang Y, et al. A camera-based target detection and positioning UAV system for search and rescue (SAR) purposes[J]. Sensors, 2016, 16(11): 1778. doi: 10.3390/s16111778
[2] Dunbabin M, Marques L. Robots for environmental monitoring: Significant advancements and applications[J]. IEEE Robotics & Automation Magazine, 2012, 19(1): 24-39.
[3] Wu X, Abrahantes M, Edgington M. MUSSE: A designed multi-ultrasonic-sensor system for echolocation on multiple robots[C]//Intelligent Robot Systems, IEEE. Tokyo, Japan, 2016: 79-83.
[4] Liang X, Chen H, Li Y, et al. Visual laser-SLAM in large-scale indoor environments[C]//IEEE International Conference on Robotics & Biomimetics. Qingdao, China, 2017: 19-24.
[5] Loquercio A, Maqueda A I, Del-Blanco C R, et al. DroNet: Learning to fly by driving[J]. IEEE Robotics and Automation Letters, 2018, 3(2): 1088-1095.
[6] Smith R C, Cheeseman P. On the representation and estimation of spatial uncertainty[J]. The International Journal of Robotics Research, 1986, 5(4): 56-68.
[7] Qin T, Li P, Shen S. VINS-Mono: A robust and versatile monocular visual-inertial state estimator[J]. IEEE Transactions on Robotics, 2018, 34(4): 1004-1020.
[8] Hou Yonghong, Ye Xiufeng, Zhang Liang, et al. A UAV human robot interaction method based on deep learning[J]. Journal of Tianjin University (Science and Technology), 2017, 50(9): 967-974 (in Chinese).
[9] Mantecón T, del Blanco C R, Jaureguizar F, et al. New generation of human machine interfaces for controlling UAV through depth-based gesture recognition[C]//International Society for Optics and Photonics. Baltimore, USA, 2014: 90840C-1-90840C-11.
[10] Pfeil K, Koh S L, LaViola J. Exploring 3D gesture metaphors for interaction with unmanned aerial vehicles[C]//Proceedings of the 2013 International Conference on Intelligent User Interfaces. Santa Monica, USA, 2013: 257-266.
[11] Naseer T, Sturm J, Cremers D. FollowMe: Person following and gesture recognition with a quadrocopter[C]//2013 IEEE/RSJ International Conference on Intelligent Robots and Systems. Tokyo, Japan, 2013: 624-630.
[12] Mur-Artal R, Montiel J M M, Tardós J D. ORB-SLAM: A versatile and accurate monocular SLAM system[J]. IEEE Transactions on Robotics, 2015, 31(5): 1147-1163.
[13] Mur-Artal R, Tardós J D. ORB-SLAM2: An open-source SLAM system for monocular, stereo and RGB-D cameras[J]. IEEE Transactions on Robotics, 2017, 33(5): 1255-1262.
[14] Rublee E, Rabaud V, Konolige K, et al. ORB: An efficient alternative to SIFT or SURF[C]//2011 IEEE International Conference on Computer Vision. Barcelona, Spain, 2011: 2564-2571.
[15] Rosten E, Drummond T. Machine learning for high-speed corner detection[C]//European Conference on Computer Vision. Graz, Austria, 2006: 430-443.
[16] Calonder M, Lepetit V, Strecha C, et al. BRIEF: Binary robust independent elementary features[C]//European Conference on Computer Vision. Heraklion, Greece, 2010: 778-792.
[17] Hamzah R A, Rahim R A, Noh Z M. Sum of absolute differences algorithm in stereo correspondence problem for stereo matching in computer vision application[C]//IEEE International Conference on Computer Science & Information Technology. Chengdu, China, 2010: 652-657.
[18] Rusu R B, Cousins S. 3D is here: Point Cloud Library (PCL)[C]//IEEE International Conference on Robotics and Automation (ICRA). Shanghai, China, 2011: 1-4.
[19] Liu S, Watterson M, Mohta K, et al. Planning dynamically feasible trajectories for quadrotors using safe flight corridors in 3-D complex environments[J]. IEEE Robotics & Automation Letters, 2017, 2(3): 1688-1695.
[20] Barry A J, Florence P R, Tedrake R. High-speed autonomous obstacle avoidance with pushbroom stereo[J]. Journal of Field Robotics, 2018, 35(1): 52-68.
[21] Shah S, Dey D, Lovett C, et al. AirSim: High-fidelity visual and physical simulation for autonomous vehicles[C]//Field and Service Robotics. Zurich, Switzerland, 2018: 621-635.