Fritze Clemens: Hand-Gesture Based Mobile Robot Teleoperation

Supervisors: Linus Nwankwo, M.Sc.;
Univ.-Prof. Dr Elmar Rückert
Start date: 10th January 2022


Theoretical difficulty: mid
Practical difficulty: mid

Abstract

Nowadays, robots used to survey indoor and outdoor environments are operated in one of three modes: fully autonomous mode, where the robot makes decisions by itself and has complete control of its actions; semi-autonomous mode, where the robot’s decisions and actions are controlled partly manually (by a human) and partly autonomously (by the robot); and full manual mode, where the robot’s actions and decisions are controlled entirely by a human. In full manual mode, the robot can be operated with a teach pendant, computer keyboard, joystick, mobile device, etc.

Although the Robot Operating System (ROS) provides roboticists with easy and efficient tools to teleoperate or command robots whose hardware and software are compatible with the ROS framework, there is still a need for alternative approaches that encourage non-experts to interact with robots. The human hand-gesture approach not only enables users to teleoperate the robot by demonstration but also makes the interaction between robots and humans more user-friendly.

In the context of this thesis, we propose to teleoperate our mobile robot with human hand gestures, using embedded computers and inertial measurement unit (IMU) sensors. First, we will teleoperate the robot on the ROS platform and then via hand gestures, leveraging the frameworks developed in [1] and [2].
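For illustration only, the sketch below shows one minimal way such a gesture interface could look on the ROS side (assuming ROS 1 / rospy): the orientation of a hand-worn IMU is mapped to linear and angular velocity commands. The topic names (/hand_imu/data, /cmd_vel), the tilt-to-velocity mapping, and the velocity limits are illustrative assumptions, not the final design.

#!/usr/bin/env python
# Minimal gesture-teleoperation sketch (not the thesis implementation):
# map the orientation of a hand-worn IMU to robot velocity commands.
import rospy
from sensor_msgs.msg import Imu
from geometry_msgs.msg import Twist
from tf.transformations import euler_from_quaternion

MAX_LIN, MAX_ANG = 0.5, 1.0  # assumed velocity limits (m/s, rad/s)

def imu_callback(msg, pub):
    q = msg.orientation
    roll, pitch, _ = euler_from_quaternion([q.x, q.y, q.z, q.w])
    cmd = Twist()
    # Tilt the hand forward/backward to drive, sideways to turn.
    cmd.linear.x = max(-MAX_LIN, min(MAX_LIN, -pitch))
    cmd.angular.z = max(-MAX_ANG, min(MAX_ANG, -roll))
    pub.publish(cmd)

if __name__ == "__main__":
    rospy.init_node("gesture_teleop")
    pub = rospy.Publisher("/cmd_vel", Twist, queue_size=10)
    rospy.Subscriber("/hand_imu/data", Imu, imu_callback, callback_args=pub)
    rospy.spin()

A continuous tilt-to-velocity mapping is only one option; classifying discrete gestures (e.g., stop, turn left) and mapping each to a fixed command would be an equally valid design choice.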

Tentative Work Plan

To achieve this aim, we will focus on the following concrete tasks:

  • Design and generate the Unified Robot Description Format (URDF) for the mobile robot
  • Simulate the mobile robot via the ROS framework (Gazebo, Rviz)
  • Set up the interfaces and the serial connection between ROS and the robot control devices (a serial-bridge sketch follows this list)
  • Develop an algorithm to teleoperate the robot from the ROS platform and then using human hand gestures
  • Perform Simultaneous Localization and Mapping (SLAM) for indoor applications with the robot
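As a sketch of the serial-connection task above, the following hypothetical bridge node reads orientation readings from the hand-worn device over a serial port and republishes them as a ROS message. The port, baud rate, and the comma-separated "roll,pitch,yaw" line protocol are all assumptions about the device, not its actual interface.

#!/usr/bin/env python
# Hedged sketch of a serial bridge between the gesture device and ROS,
# assuming the device streams ASCII lines of the form "roll,pitch,yaw".
import serial  # pyserial
import rospy
from geometry_msgs.msg import Vector3

if __name__ == "__main__":
    rospy.init_node("imu_serial_bridge")
    pub = rospy.Publisher("/hand_imu/rpy", Vector3, queue_size=10)
    # Port and baud rate are assumptions; adapt to the real device.
    with serial.Serial("/dev/ttyUSB0", 115200, timeout=1.0) as port:
        while not rospy.is_shutdown():
            line = port.readline().decode("ascii", errors="ignore").strip()
            try:
                roll, pitch, yaw = (float(v) for v in line.split(","))
            except ValueError:
                continue  # skip empty or malformed lines
            pub.publish(Vector3(roll, pitch, yaw))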

References

[1] Nils Rottmann et al., ROS-Mobile: An Android application for the Robot Operating System, https://github.com/ROS-Mobile/ROS-Mobile-Android, 2020.

[2] Wei Zhang, Hongtai Cheng, Liang Zhao, Lina Hao, Manli Tao and Chaoqun Xiang, "A Gesture-Based Teleoperation System for Compliant Robot Motion", Appl. Sci. 2019, 9(24), 5290; https://doi.org/10.3390/app9245290.

Nikolaus Feith: Sensor fusion with spiking neural networks

Supervisor: Univ.-Prof. Dr Elmar Rückert
Start date: 1st July 2021

Theoretical difficulty: high
Practical difficulty: low

Abstract

Sensor systems in cyber-physical systems are highly complex and comprise many different sensor types. Some of them work redundantly, exploiting different physical measuring effects. The goal of this thesis is to investigate sensor fusion using spiking neural networks in movement planning. The first step is therefore to simulate two or more sensors with dynamic measurement noise in Matlab, and subsequently to find an optimal sensor fusion using spiking neural networks. The main objective is to exploit the strengths of the respective sensor types while minimizing their weaknesses as far as possible. This could be a key technology for autonomous robotics and driving.
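As a point of reference for what the spiking network should at least match, the sketch below simulates two noisy sensors observing the same signal and fuses them with classical inverse-variance weighting, i.e., the minimum-variance linear combination of two unbiased measurements. All signal and noise parameters are illustrative; the thesis plans to run such simulations in Matlab, but the idea is shown here in Python.

# Classical fusion baseline (not the SNN itself): two noisy sensors
# measuring the same scalar state, fused by inverse-variance weighting.
import numpy as np

rng = np.random.default_rng(0)
T = 200
true_state = np.sin(np.linspace(0.0, 4.0 * np.pi, T))  # ground-truth signal

sigma1, sigma2 = 0.3, 0.1                     # assumed noise levels
z1 = true_state + rng.normal(0.0, sigma1, T)  # noisy sensor 1
z2 = true_state + rng.normal(0.0, sigma2, T)  # noisy sensor 2

# Weight each measurement by 1/sigma^2; this minimizes the variance of
# the fused estimate for unbiased, independent Gaussian noise.
w1, w2 = 1.0 / sigma1**2, 1.0 / sigma2**2
fused = (w1 * z1 + w2 * z2) / (w1 + w2)

for name, z in (("sensor 1", z1), ("sensor 2", z2), ("fused", fused)):
    rmse = np.sqrt(np.mean((z - true_state) ** 2))
    print(f"{name}: RMSE = {rmse:.3f}")

The fused RMSE is lower than that of either sensor alone; an SNN-based fusion as in [1] would be evaluated against exactly this kind of baseline, with the added challenge that the noise levels here are dynamic rather than fixed.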

Plan

Related Work

[1] O. Bobrowski, R. Meir and Y. C. Eldar, “Bayesian Filtering in Spiking Neural Networks: Noise, Adaptation, and Multisensory Integration,” in Neural Computation, vol. 21, no. 5, pp. 1277-1320, May 2009, doi: 10.1162/neco.2008.01-08-692.

Implementation of spiking neural networks in robotics (sensory and planning):

[2] Xiuqing Wang, Zeng-Guang Hou, Feng Lv, Min Tan, Yongji Wang, "Mobile robots' modular navigation controller using spiking neural networks", Neurocomputing, Volume 134, 2014, Pages 230-238, ISSN 0925-2312, https://doi.org/10.1016/j.neucom.2013.07.055.

[3] Xiuqing Wang, Zeng-Guang Hou, Anmin Zou, Min Tan, Long Cheng, “A behavior controller based on spiking neural networks for mobile robots”, Neurocomputing, Volume 71, Issues 4–6, 2008, Pages 655-666, ISSN 0925-2312, https://doi.org/10.1016/j.neucom.2007.08.025.

[4] F. Alnajjar and Kazuyuki Murase, “Sensor-fusion in spiking neural network that generates autonomous behavior in real mobile robot,” 2008 IEEE International Joint Conference on Neural Networks (IEEE World Congress on Computational Intelligence), 2008, pp. 2200-2206, doi: 10.1109/IJCNN.2008.4634102.

Spiking neural networks and filtering:

[5] Aditya Gilra and Wulfram Gerstner, "Predicting non-linear dynamics by stable local learning in a recurrent spiking neural network", eLife, vol. 6, e28295, 2017.