Abstract
Automation is a necessity in the changing digital era; it increases the efficiency of any process that otherwise requires constant manual work. The automation process faces difficulties whenever an object-arrangement task involves objects that lack texture and distinctive features. An even more challenging case arises when the objects are previously unseen and of unknown dimensions, rendering conventional methods inapplicable. To overcome these challenges, we propose an approach for random bin picking based on the fusion of image and range-sensor data. This approach improves performance on objects lacking distinctive features, texture, or unique characteristics, especially when they are positioned in an obstructed manner within a storage bin. The proposed solution introduces efficient O-RANSAC-based algorithms designed to leverage range data obtained from a laser scanner [1], thereby addressing the unseen-object problem. This range data is instrumental in picking and placing objects within the bin. The paper rigorously tests the effectiveness of the proposed algorithm across various scenarios, encompassing real-time operation, obscured placements, generalized situations, and featureless objects. The model achieved an accuracy of 94% in pose estimation for the pickup of large cylindrical pellets and 74% for general cylindrical objects, surpassing state-of-the-art methods on generalized objects. The paper also demonstrates reduced parameter-estimation errors compared to a deterministic approach. The quantitative and qualitative analysis shows that the proposed model can be used to reduce manual work.
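The abstract does not specify the O-RANSAC variant, so as an illustration only, the sketch below shows the underlying robust-fitting principle RANSAC applies to range data: repeatedly fit a model to a minimal random sample and keep the hypothesis supported by the most inliers. Here a 2D circle (a cylinder's cross-section in a single laser-scan slice) is recovered from points contaminated by outliers; all function names, thresholds, and the synthetic data are hypothetical and not taken from the paper.

```python
import math
import random

def circle_from_3_points(p1, p2, p3):
    """Circumcenter and radius of the circle through three points; None if collinear."""
    ax, ay = p1; bx, by = p2; cx, cy = p3
    d = 2 * (ax * (by - cy) + bx * (cy - ay) + cx * (ay - by))
    if abs(d) < 1e-12:
        return None  # degenerate (collinear) sample
    ux = ((ax**2 + ay**2) * (by - cy) + (bx**2 + by**2) * (cy - ay)
          + (cx**2 + cy**2) * (ay - by)) / d
    uy = ((ax**2 + ay**2) * (cx - bx) + (bx**2 + by**2) * (ax - cx)
          + (cx**2 + cy**2) * (bx - ax)) / d
    return (ux, uy, math.hypot(ax - ux, ay - uy))

def ransac_circle(points, iters=500, tol=0.05, seed=0):
    """Return (cx, cy, r) of the circle hypothesis with the most inliers."""
    rng = random.Random(seed)
    best, best_inliers = None, 0
    for _ in range(iters):
        model = circle_from_3_points(*rng.sample(points, 3))
        if model is None:
            continue
        cx, cy, r = model
        # An inlier lies within `tol` of the hypothesized circle boundary.
        inliers = sum(abs(math.hypot(x - cx, y - cy) - r) < tol for x, y in points)
        if inliers > best_inliers:
            best, best_inliers = model, inliers
    return best

# Synthetic scan slice: 40 points on a circle of radius 2 centred at (1, 1),
# plus 15 gross outliers simulating clutter in the bin.
rng = random.Random(42)
pts = [(1 + 2 * math.cos(t), 1 + 2 * math.sin(t))
       for t in (2 * math.pi * i / 40 for i in range(40))]
pts += [(rng.uniform(-5, 5), rng.uniform(-5, 5)) for _ in range(15)]
cx, cy, r = ransac_circle(pts)
```

A full 3D cylinder fit on laser-scanner data would instead sample surface points with normals and estimate an axis direction, axis point, and radius, but the consensus-counting loop is the same.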
