Nowadays, robots used to survey indoor and outdoor environments are operated in one of three modes: fully autonomous mode, in which the robot makes decisions by itself and has complete control of its actions; semi-autonomous mode, in which the robot’s decisions and actions are controlled partly manually (by a human) and partly autonomously (by the robot); and full manual mode, in which the robot’s actions and decisions are controlled entirely by a human. In full manual mode, the robot can be operated using a teach pendant, computer keyboard, joystick, mobile device, etc.
Although the Robot Operating System (ROS) provides roboticists with easy and efficient tools to teleoperate or command robots, with both hardware and software compatibility within the ROS framework, there is still a need for an alternative approach that encourages non-experts to interact with a robot. The human hand-gesture approach not only enables robot users to teleoperate the robot by demonstration but also makes interaction between humans and robots more user-friendly.
In the context of this thesis, the application of human hand gestures is proposed to teleoperate our mobile robot using embedded computers and inertial measurement sensors. First, we will teleoperate the robot on the ROS platform and then via hand gestures, leveraging the frameworks developed by  and .
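The gesture-based teleoperation idea can be sketched as a mapping from IMU orientation to robot velocity commands. The function below is a minimal illustration only: the tilt thresholds, speed limits, and axis conventions are assumptions, not measured values for our robot. On the actual system, these values would fill a `geometry_msgs/Twist` message published on a ROS topic such as `/cmd_vel`.

```python
import math

# Hypothetical tilt thresholds (radians) and speed limits; real values
# would be tuned for the specific IMU and robot platform.
DEAD_ZONE = math.radians(10)   # ignore small, unintentional hand tilts
MAX_TILT = math.radians(45)    # tilt at which full speed is commanded
MAX_LINEAR = 0.5               # m/s
MAX_ANGULAR = 1.0              # rad/s

def _scale(angle, limit):
    """Map a tilt angle to a signed velocity with a dead zone around zero."""
    if abs(angle) < DEAD_ZONE:
        return 0.0
    sign = 1.0 if angle > 0 else -1.0
    magnitude = min(abs(angle), MAX_TILT) - DEAD_ZONE
    return sign * magnitude / (MAX_TILT - DEAD_ZONE) * limit

def gesture_to_twist(pitch, roll):
    """Convert hand pitch/roll (radians) to a (linear, angular) velocity pair.

    Pitching the hand forward/backward drives the robot forward/backward;
    rolling it left/right turns the robot. On the robot, these two values
    would populate a geometry_msgs/Twist message.
    """
    linear = _scale(pitch, MAX_LINEAR)
    angular = _scale(roll, MAX_ANGULAR)
    return linear, angular

# A level hand (no tilt) commands the robot to stay still.
print(gesture_to_twist(0.0, 0.0))  # (0.0, 0.0)
```

The dead zone is the important design choice here: without it, sensor noise and small involuntary hand movements would continuously drive the robot.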
Tentative Work Plan
To achieve our aim, we will focus on the following concrete tasks:
- Design and generate the Unified Robot Description Format (URDF) for the mobile robot
- Simulate the mobile robot via the ROS framework (Gazebo, RViz)
- Set up the interfaces and serial connection between ROS and the robot control devices
- Develop an algorithm to teleoperate the robot from the ROS platform and then using human hand gestures
- Perform Simultaneous Localization and Mapping (SLAM) for indoor applications with the robot
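As a first step toward the teleoperation task above, ROS-based teleoperation typically maps single key presses to velocity commands. The sketch below shows a hypothetical key-to-command table in the style of the standard ROS `teleop_twist_keyboard` package; the specific keys, base speeds, and topic name are assumptions to be configured for our robot.

```python
# Hypothetical key bindings in the style of ROS teleop_twist_keyboard:
# each key maps to (linear, angular) multipliers applied to base speeds.
KEY_BINDINGS = {
    'i': (1, 0),    # forward
    ',': (-1, 0),   # backward
    'j': (0, 1),    # turn left
    'l': (0, -1),   # turn right
    'k': (0, 0),    # stop
}

BASE_LINEAR = 0.2   # m/s, assumed default
BASE_ANGULAR = 0.6  # rad/s, assumed default

def key_to_command(key):
    """Return a (linear, angular) velocity for a key press; stop if unknown.

    In a ROS teleoperation node, these two values would populate a
    geometry_msgs/Twist message published on a topic such as /cmd_vel.
    """
    lin, ang = KEY_BINDINGS.get(key, (0, 0))
    return lin * BASE_LINEAR, ang * BASE_ANGULAR

print(key_to_command('i'))  # (0.2, 0.0)
```

Treating any unrecognized key as "stop" is a deliberate safety default: releasing or mistyping a key should never leave the robot moving.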