
An Environment-Aware Robot Arm Platform Using Low-Cost Sensors and Deep Learning

Mišeikis, Justinas
Doctoral thesis
PhD-Miseikis-2019.pdf (18.43Mb)
Year
2019
Permanent link
http://urn.nb.no/URN:NBN:no-73530

Appears in the following Collection
  • Institutt for informatikk [3652]
Abstract
The main aim of the thesis is to use low-cost hardware components to create an environment-aware robot arm system capable of analysing its workspace using vision sensors and ensuring collision-free operation. The objective is not only to build a physical system: the focus is on developing algorithms that understand the environment using image and depth information from multiple cameras, allowing the robot to calculate and execute safe movement trajectories in a dynamically changing environment.

In this thesis, we have developed an automatic camera-to-robot calibration system that performs both camera-internal and Eye-to-Hand calibration. Furthermore, a vision-based reactive-reflexive robot behaviour method allows the robot to find safe trajectories through a dynamically changing environment, with the capability of reacting quickly to unexpectedly appearing nearby obstacles. The robot was also used to charge electric vehicles autonomously using vision-based guidance. Eventually, the work evolved towards deep learning approaches that recognise the robot and estimate its 3D position from a simple 2D colour image as the only input. Multi-objective convolutional neural networks and transfer learning techniques allowed the method to be extended to additional robot types when only a limited amount of training data was available.
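To illustrate how an Eye-to-Hand calibration result is typically used downstream, the sketch below composes a hypothetical camera-to-base transform with a point detected in the camera frame, expressing it in the robot base frame for collision checking. The transform values, frame names, and point are invented for the example and are not taken from the thesis.

```python
import numpy as np

def make_transform(R, t):
    """Build a 4x4 homogeneous transform from rotation R (3x3) and translation t (3,)."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Hypothetical Eye-to-Hand calibration result: the camera's pose in the robot
# base frame (in practice this comes from a calibration routine such as Paper I's).
R_base_cam = np.array([[0.0, -1.0, 0.0],
                       [1.0,  0.0, 0.0],
                       [0.0,  0.0, 1.0]])   # camera rotated 90 degrees about z
t_base_cam = np.array([0.5, 0.2, 1.0])      # camera offset from the robot base (m)
T_base_cam = make_transform(R_base_cam, t_base_cam)

# An obstacle detected at this point in the camera frame (homogeneous coords, metres):
p_cam = np.array([0.1, 0.3, 0.8, 1.0])

# Express the obstacle in the robot base frame so the planner can avoid it:
p_base = T_base_cam @ p_cam
```

Chaining such transforms is what lets observations from several independently mounted 3D cameras be fused in one common robot-base frame.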

The thesis concludes that commercially available hardware can be integrated, using advanced algorithms, to precisely model the workspace of the robot and to allow path planning and quick reactions to unexpected situations. Several 3D cameras and robots, including Universal Robots, Kuka iiwa LBR and Franka Emika Panda, were used to perform all the experiments in real-world conditions.
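The multi-objective idea mentioned above, one shared feature extractor feeding several task-specific output heads, can be sketched in a few lines. The layer sizes, head names, and plain-numpy forward pass below are illustrative assumptions for the sketch, not the architecture from the papers.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: a flattened image feature in, a shared trunk,
# then one linear head per objective.
W_trunk = rng.standard_normal((64, 128)) * 0.1   # shared feature extractor
W_pos   = rng.standard_normal((3, 64)) * 0.1     # head 1: 3D base position
W_joint = rng.standard_normal((6, 64)) * 0.1     # head 2: six joint values

def forward(x):
    """One forward pass: shared trunk, then independent task heads."""
    h = np.maximum(0.0, W_trunk @ x)   # ReLU trunk features
    return W_pos @ h, W_joint @ h      # both heads read the same features

x = rng.standard_normal(128)           # stand-in for CNN image features
position, joints = forward(x)
```

Transfer learning in this setting amounts to keeping the shared trunk fixed and retraining only the small heads for a new robot type, which is why it works with limited training data.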
List of papers
Paper I: Automatic calibration of a robot manipulator and multi 3D camera system. J. Miseikis, K. Glette, O. J. Elle and J. Torresen. 2016 IEEE/SICE International Symposium on System Integration (SII), Sapporo, 2016, pp. 735-741. This article is included in the thesis. Also available at: https://doi.org/10.1109/SII.2016.7844087
Paper II: Multi 3D camera mapping for predictive and reflexive robot manipulator trajectory estimation. J. Miseikis, K. Glette, O. J. Elle and J. Torresen. 2016 IEEE Symposium Series on Computational Intelligence (SSCI), Athens, 2016, pp. 1-8. This article is included in the thesis. Also available at: https://doi.org/10.1109/SSCI.2016.7850237
Paper III: 3D Vision Guided Robotic Charging Station for Electric and Plugin Hybrid Vehicles. J. Miseikis, M. Rüther, B. Walzel, M. Hirz and H. Brunner. OAGM/AAPR & ARW Joint Workshop 2017, Wien, Austria. Nominated for the Best Student Paper Award. This article is included in the thesis. Also available at: https://arxiv.org/abs/1703.05381
Paper IV: Robot Localisation and 3D Position Estimation Using a Free-Moving Camera and Cascaded Convolutional Neural Networks. J. Miseikis, P. Knobelreiter, I. Brijacak, S. Yahyanejad, K. Glette, O. J. Elle and J. Torresen. 2018 IEEE/ASME International Conference on Advanced Intelligent Mechatronics (AIM), Auckland, New Zealand, 2018, pp. 181-187. Finalist of the Best Student Paper Award. This article is included in the thesis. Also available at: http://urn.nb.no/URN:NBN:no-70950
Paper V: Multi-Objective Convolutional Neural Networks for Robot Localisation and 3D Position Estimation in 2D Camera Images. J. Miseikis, I. Brijacak, S. Yahyanejad, K. Glette, O. J. Elle and J. Torresen. 2018 15th International Conference on Ubiquitous Robots (UR), Honolulu, HI, USA, 2018, pp. 597-603. This article is included in the thesis. Also available in DUO at: http://urn.nb.no/URN:NBN:no-70949
Paper VI: Transfer Learning for Unseen Robot Detection and Joint Estimation on a Multi-Objective Convolutional Neural Network. J. Miseikis, I. Brijacak, S. Yahyanejad, K. Glette, O. J. Elle and J. Torresen. 2018 IEEE International Conference on Intelligence and Safety for Robotics (ISR), Shenyang, 2018, pp. 337-342. Winner of the Best Student Paper Award. This article is included in the thesis. Also available in DUO at: http://urn.nb.no/URN:NBN:no-70937
Paper VII: Two-Stage Transfer Learning for Heterogeneous Robot Detection and 3D Joint Position Estimation in a 2D Camera Image using CNN. J. Miseikis, I. Brijacak, S. Yahyanejad, K. Glette, O. J. Elle and J. Torresen. 2019 IEEE International Conference on Robotics and Automation (ICRA), Montreal, 2019. This article is included in the thesis. Also available at: https://arxiv.org/abs/1902.05718
 
Responsible for this website 
University of Oslo Library


Contact Us 
duo-hjelp@ub.uio.no


Privacy policy