Energy Efficient RANSAC Algorithm for Flat Surface Detection in Point Clouds

Received: March 30, 2023
Revised: May 29, 2023
Accepted: June 08, 2023

A. Zhuchenko, O. Kuchkin, A. Sazonov, D. Zghurskyi. Energy efficient RANSAC algorithm for flat surface detection in point clouds. Energy Engineering and Control Systems, 2023, Vol. 9, No. 1, pp. 47 – 53.

National Technical University of Ukraine “Igor Sikorsky Kyiv Polytechnic Institute”

Mobile robot control systems achieve greater efficiency through the use of robust environmental analysis algorithms based on data collected from optical sensors such as depth cameras and Light Detection and Ranging (LIDAR) sensors. These sources provide information about the control object's environment in the form of a point cloud. Such algorithms typically aim to detect objects of interest, search for specified objects, and localize the robot's own position in the scene. Many approaches exist for solving the object detection problem in point clouds, but most require substantial computational resources. In this work, several variations of the random sample consensus (RANSAC) method are analyzed for objects defined by a mathematical model in analytical form. Statistical characteristics of the data analysis were used to compare the methods. The results identify the most energy-efficient flat surface detection method, which processes 60 RGB-D camera frames per second.
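To make the approach concrete, the following is a minimal sketch of classical RANSAC plane detection in a point cloud, of the kind the abstract describes. It is an illustrative implementation, not the paper's own code; the function name and parameters (`iters`, `dist_thresh`) are our assumptions.

```python
import numpy as np

def ransac_plane(points, iters=200, dist_thresh=0.01, seed=None):
    """Detect the dominant plane in an (N, 3) point cloud with basic RANSAC.

    Returns (a, b, c, d) for the plane a*x + b*y + c*z + d = 0 (unit normal)
    and a boolean inlier mask. Illustrative sketch only.
    """
    rng = np.random.default_rng(seed)
    best_inliers = np.zeros(len(points), dtype=bool)
    best_plane = None
    for _ in range(iters):
        # Minimal sample: 3 non-collinear points define a candidate plane.
        idx = rng.choice(len(points), 3, replace=False)
        p0, p1, p2 = points[idx]
        normal = np.cross(p1 - p0, p2 - p0)
        norm = np.linalg.norm(normal)
        if norm < 1e-12:          # degenerate (collinear) sample, skip
            continue
        normal /= norm
        d = -normal @ p0
        # Consensus step: count points within dist_thresh of the plane.
        dist = np.abs(points @ normal + d)
        inliers = dist < dist_thresh
        if inliers.sum() > best_inliers.sum():
            best_inliers, best_plane = inliers, (*normal, d)
    return best_plane, best_inliers

# Usage: a noisy z = 0 plane plus uniformly scattered outliers.
rng = np.random.default_rng(0)
plane_pts = np.column_stack([rng.uniform(-1, 1, (500, 2)),
                             rng.normal(0, 0.002, 500)])
outliers = rng.uniform(-1, 1, (100, 3))
cloud = np.vstack([plane_pts, outliers])
plane, mask = ransac_plane(cloud, seed=1)
```

The RANSAC variants compared in the paper (R-RANSAC, MLESAC, PROSAC, and others) modify the hypothesis-verification step above, e.g. by early-terminating the consensus count or weighting inliers, which is where the computational (and hence energy) savings arise.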
