Solution of problem of returning to the starting point of autonomously flying UAV by visual navigation
https://doi.org/10.37661/1816-0301-2020-17-2-17-24
Abstract
An autonomous visual navigation algorithm is considered, designed for the "home" return of an unmanned aerial vehicle (UAV) equipped with an on-board video camera and an on-board computer and operating without GPS and GLONASS navigation signals. The proposed algorithm is similar to well-known visual navigation algorithms such as V-SLAM (visual simultaneous localization and mapping) and visual odometry; however, it differs in the separate implementation of the mapping and localization processes. It calculates the geographical coordinates of features in the frames taken by the on-board video camera during the flight, from the start point until the moment the GPS and GLONASS signals are lost. After the loss of the signal, the return mission is launched, which estimates the position of the UAV relative to the map built from the previously found features. The proposed approach does not require computations as complex as those of V-SLAM and, in contrast to visual odometry and traditional methods of inertial navigation, does not accumulate errors over time. The algorithm was implemented and tested using a DJI Phantom 3 Pro quadcopter.
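The two-phase structure described in the abstract can be illustrated with a minimal sketch. All names and the localization rule below are assumptions for illustration (the paper's actual pose estimation is not specified here): during the outbound flight, feature descriptors are stored together with their geographic coordinates; after signal loss, each current-frame feature is matched to its nearest stored descriptor, and the UAV position is estimated from the matched map coordinates.

```python
import numpy as np

class ReturnHomeNavigator:
    """Sketch of separate mapping and localization phases.

    Mapping phase: while GPS/GLONASS is available, record frame-feature
    descriptors along with the geographic coordinates computed for them.
    Return phase: match current-frame features against the stored map and
    estimate the camera position from the matches.
    """

    def __init__(self):
        self.map_descriptors = []  # descriptors recorded on the outbound leg
        self.map_positions = []    # geographic (x, y) of each recorded feature

    def record(self, descriptors, positions):
        # Mapping phase: called per frame while navigation signals exist.
        self.map_descriptors.extend(descriptors)
        self.map_positions.extend(positions)

    def localize(self, descriptors, offsets):
        # Return phase (hypothetical rule): for each current feature, find the
        # nearest stored descriptor, subtract the feature's offset from the
        # camera, and average the resulting per-feature position estimates.
        M = np.asarray(self.map_descriptors)
        P = np.asarray(self.map_positions)
        estimates = []
        for d, off in zip(descriptors, offsets):
            i = int(np.argmin(np.linalg.norm(M - d, axis=1)))
            estimates.append(P[i] - off)
        return np.mean(estimates, axis=0)
```

Because localization always refers back to the fixed map built during the outbound flight, errors do not accumulate over time, unlike frame-to-frame visual odometry.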
About the Authors
R. S. Zhuk
Belarus
Raman S. Zhuk, Junior Researcher
Minsk
B. A. Zalesky
Belarus
Boris A. Zalesky, Dr. Sci. (Phys.-Math.), Head of the Laboratory of Image Processing and Recognition
Minsk
Ph. S. Trotski
Belarus
Philip S. Trotski, Junior Researcher
Minsk
References
1. Durrant-Whyte H., Bailey T. Simultaneous localization and mapping: part I. IEEE Robotics and Automation Magazine, 2006, vol. 13, no. 2, pp. 99–110.
2. Bailey T., Durrant-Whyte H. Simultaneous localization and mapping (SLAM): part II. IEEE Robotics and Automation Magazine, 2006, vol. 13, no. 3, pp. 108–117.
3. Scaramuzza D., Fraundorfer F. Visual odometry [tutorial]. Part I: The first 30 years and fundamentals. IEEE Robotics and Automation Magazine, 2011, vol. 18, no. 4, pp. 80–92.
4. Fraundorfer F., Scaramuzza D. Visual odometry: part II: matching, robustness, optimization, and applications. IEEE Robotics and Automation Magazine, 2012, vol. 19, no. 2, pp. 78–90.
5. Forster C., Zhang Z., Gassner M., Werlberger M., Scaramuzza D. SVO: semidirect visual odometry for monocular and multicamera systems. IEEE Transactions on Robotics, 2017, vol. 33, no. 2, pp. 249–265.
6. Castro G., Nitsche M., Pire T., Fischer T., De Cristóforis P. Efficient on-board Stereo SLAM through constrained-covisibility strategies. Robotics and Autonomous Systems, 2019, vol. 116, pp. 192–205.
7. Qin T., Li P., Shen S. VINS-Mono: a robust and versatile monocular visual-inertial state estimator. IEEE Transactions on Robotics, 2018, vol. 34, no. 4, pp. 1004–1020.
8. Nisar B., Foehn P., Falanga D., Scaramuzza D. VIMO: simultaneous visual inertial model-based odometry and force estimation. IEEE Robotics and Automation Letters, 2019, vol. 4, no. 3, pp. 2785–2792.
9. Zalesky B. A., Trotski Ph. S. Parallel'naja versija detektora jekstremal'nyh osobyh tochek izobrazhenij [Parallel version of detector of extremal key points on images]. Informatika [Informatics], 2018, vol. 15, no. 2, pp. 55–63 (in Russian).
10. Forsyth D. A., Ponce J. Computer Vision: a Modern Approach. Prentice Hall, 2003, 693 p.
11. Hartley R. I. In defense of the eight-point algorithm. IEEE Transactions on Pattern Analysis and Machine Intelligence, 1997, vol. 19, no. 6, pp. 580–593.
12. Nister D. An efficient solution to the five-point relative pose problem. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2004, vol. 26, no. 6, pp. 756–770.
For citations:
Zhuk R.S., Zalesky B.A., Trotski P.S. Solution of problem of returning to the starting point of autonomously flying UAV by visual navigation. Informatics. 2020;17(2):17-24. (In Russ.) https://doi.org/10.37661/1816-0301-2020-17-2-17-24