
07.10.2022 – Innovative Research Discussion

Meeting notes on the 7th of October, 2022

Location: Chair of CPS

Date & Time: 7th October, 2022, 09:15 am to 10:25 am

Participants: Univ.-Prof. Dr. Elmar Rueckert, Linus Nwankwo, M.Sc.

Agenda

  1. Discussion on ROS-Mobile Control
  2. Discussion on ODrive torque control

ROS-Mobile and ODrive torque control

  1. Re-implement the O2S control approach to accommodate the information in the attached figure.
  2. Write the Arduino code taking into account the rotation matrices.
  3. Implement the open-loop torque control approach.
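The rotation-matrix and open-loop torque computations behind items 2 and 3 can be sketched as below. This is an illustrative Python sketch only (the actual implementation will be Arduino code); the planar-base kinematics, function names and parameters are all assumptions, not the approach from the attached figure.

```python
import numpy as np

def body_to_world(vx, vy, theta):
    """Rotate a body-frame velocity command into the world frame
    using a 2D rotation matrix, for a base with heading theta."""
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    return R @ np.array([vx, vy])

def open_loop_wheel_torque(force_n, wheel_radius_m, gear_ratio=1.0):
    """Open-loop torque command: motor torque needed to produce a
    desired traction force at the wheel, with no feedback correction."""
    return force_n * wheel_radius_m / gear_ratio
```

A command of 1 m/s forward with the robot facing 90° maps to pure world-frame y motion, which is the sanity check any re-implementation should pass first.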

04.10.2022 – Innovative Research Discussion

Meeting notes on the 4th of October, 2022

Location: Chair of CPS

Date & Time: 4th October, 2022, 12:15 pm to 1:15 pm

Participants: Univ.-Prof. Dr. Elmar Rueckert, Linus Nwankwo, M.Sc.

 

Agenda

  1. Discussion of the research progress
  2. Discussion on the Hardware-X journal publication 
  3. Discussion of the next conference publication

Journal Publication: Hardware-X

  1. Put the HardwareX manuscript on arXiv to enable us to:
  • cite it in the subsequent article
  • obtain the DOI
  • update the publication in our cloud with the arXiv number

Conference Paper: O2S: Open Source Open Shuttle - A comparison of SLAM algorithms

  1. Start working on the conference paper using the shared LaTeX template provided
  • Compare the various 2D SLAM algorithms
  • Establish the key metrics for the evaluation
  • Evaluate their performance in the real world with the O2S
  • Evaluate visual SLAM (optional)
  • Check how 2D SLAM can be combined with RGB-D cameras and a deep neural network to improve the map quality (optional)
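One candidate for the key-metrics bullet above is the absolute trajectory error (ATE), the RMSE between estimated and ground-truth poses. A minimal Python sketch, assuming the two trajectories are already time-aligned and expressed in the same frame (function name illustrative):

```python
import numpy as np

def ate_rmse(estimated, ground_truth):
    """Absolute trajectory error (RMSE) between two time-aligned
    trajectories given as (N, 2) arrays of x-y positions."""
    diff = np.asarray(estimated) - np.asarray(ground_truth)
    return float(np.sqrt(np.mean(np.sum(diff ** 2, axis=1))))
```

The same metric applies to every algorithm under comparison, which makes it a natural common yardstick for the evaluation.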

Future Steps

Intention signalling to improve human-robot interaction (HRI)

Next Meeting

Yet to be defined

29.08.2022 – Innovative Research Discussion

Meeting notes on the 29th of August, 2022

Location: Chair of CPS

Date & Time: 29th August, 2022, 11:30 am to 12:00 noon

Participants: Univ.-Prof. Dr. Elmar Rueckert, Linus Nwankwo, M.Sc.

 

Agenda

  1. Discussion of the research progress
  2. Discussion on the Hardware-X journal publication 
  3. Discussion of the next project after the Hardware-X submission

Topic 1: Journal Publication - Hardware-X

  1. Re-design the robot base for the Hardware-X publication
  2. Publish the following files:
    • CAD files including the URDF
    • ROS nodes and schematics
    • Show some experimental results, e.g., real-time control and monitoring with ROS-Mobile and Android-based devices; remote control from Arduino and joystick, etc.
    • Perform SLAM and demonstrate some experiments.

Topic 2: Conference paper for possible publication in IROS, ECMR, etc.

  1. Open source open shuttle (O2S) for SLAM and navigation applications: compare the existing SLAM algorithms for efficient navigation in a cluttered and dynamic environment
  2. O2S intention signalling: for safe human-robot interaction (HRI) using intention signalling
  3. Mobile robot teleoperation through hand gestures (possibly for publication at a national European conference)
  4. Deep navigation with O2S (proposal for Christopher's M.Sc. thesis, subject to changes pending the outcome of the meeting on Friday)

Future Steps

Survey of open shuttles in logistics: possible AI-based journal publication

Next Meeting

Yet to be defined

24.03.2022 – Innovative Research Discussion

Meeting notes on the 23rd of March 2022

Location: Chair of CPS

Date & Time: 23rd March 2022, 12:00 pm to 1:00 pm

Participants: Univ.-Prof. Dr. Elmar Rueckert, Linus Nwankwo, M.Sc.

 

Agenda

  1. Discussion of the research progress
  2. Discussion of related literature for 2D Lidar-based SLAM and path planning for improved autonomous navigation

Topic 1: SLAM + Path Planning Algorithms

  1. Compare Lidar-based SLAM with RGB-D-based SLAM
  2. Select the appropriate algorithm to build the SLAM system
  3. Evaluate the performance of the chosen SLAM + path planning algorithms in terms of:
    • computational efficiency
    • collision avoidance in a dynamic environment

Topic 2: Real-time Implementation

  1. Compare our two mobile bases
  2. Implement the chosen SLAM algorithm to solve the problem of loop closure that we are currently facing
  3. Send the robot to a specific goal location within the laboratory

Future Steps

Toward improved autonomous navigation in real-time with any of our mobile bases.

Next Meeting

Yet to be defined

B.Sc. Thesis: Fritze Clemens on Gesture-Based Mobile Robot Teleoperation for Navigation Application

Supervisor: Linus Nwankwo, M.Sc.;
Univ.-Prof. Dr Elmar Rückert
Start date: 10th January 2022

 

Theoretical difficulty: mid
Practical difficulty: mid

Abstract

Nowadays, robots used to survey indoor and outdoor environments are operated in one of three modes: fully autonomous mode, where the robot makes decisions by itself and has complete control of its actions; semi-autonomous mode, where the robot’s decisions and actions are controlled both manually (by a human) and autonomously (by the robot); and full manual mode, where the robot’s actions and decisions are controlled by a human. In full manual mode, the robot can be operated using a teach pendant, computer keyboard, joystick, mobile device, etc.

Although the Robot Operating System (ROS) has provided roboticists with easy and efficient tools to teleoperate or command robots with both hardware and software compatibility within the ROS framework, there is still a need for an alternative approach that encourages non-robotics experts to interact with a robot. The human hand-gesture approach not only enables robot users to teleoperate the robot by demonstration but also supports user-friendly interaction between robots and humans.

In the context of this thesis, the application of human hand gestures is proposed to teleoperate our mobile robot using embedded computers and inertial measurement sensors. First, we will teleoperate the robot on the ROS platform and then via hand gestures, leveraging the frameworks developed by [1] and [2].

Tentative Work Plan

To achieve our aim, the following concrete tasks will be focused on:

  • Design and generate the Unified Robot Description Format (URDF) for the mobile robot
  • Simulate the mobile robot within the ROS framework (Gazebo, Rviz)
  • Set up the interfaces and serial connection between ROS and the robot control devices
  • Develop an algorithm to teleoperate the robot in ROS using hand gestures
  • Use the algorithm to perform Simultaneous Localization and Mapping (SLAM) for indoor applications (simulation only)
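The gesture-teleoperation step above could, for instance, map IMU hand-tilt angles to a velocity command in the style of a ROS geometry_msgs/Twist message. The Python sketch below is purely illustrative and is not the framework of [1] or [2]; the function name, dead zone and scaling are all assumptions.

```python
def gesture_to_twist(pitch_rad, roll_rad, max_lin=0.5, max_ang=1.0, dead_zone=0.1):
    """Map hand tilt from an IMU to a velocity command: pitch drives
    forward speed, roll drives turning. Small tilts inside the dead
    zone are ignored; larger tilts are clamped to the maximum."""
    lin = 0.0 if abs(pitch_rad) < dead_zone else max_lin * max(-1.0, min(1.0, pitch_rad))
    ang = 0.0 if abs(roll_rad) < dead_zone else max_ang * max(-1.0, min(1.0, roll_rad))
    return {"linear_x": lin, "angular_z": ang}
```

The dead zone keeps the robot stationary when the hand is roughly level, which is the main usability concern with tilt-based teleoperation.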

References

[1] Nils Rottmann et al., ROS-Mobile-Android, https://github.com/ROS-Mobile/ROS-Mobile-Android, 2020.

[2] Wei Zhang, Hongtai Cheng, Liang Zhao, Lina Hao, Manli Tao and Chaoqun Xiang, “A Gesture-based Teleoperation System for Compliant Robot Motion”, Appl. Sci. 2019, 9(24), 5290; https://doi.org/10.3390/app9245290

Thesis Document

09.12.2021 – Innovative Research Discussion

Meeting on the 9th of December 2021

Location: Chair of CPS

Date & Time: 9th Dec. 2021, 1:00 pm to 1:30 pm

Participants: Univ.-Prof. Dr. Elmar Rueckert, Linus Nwankwo, M.Sc.

 

Agenda

  1. Discussion of the research progress
  2. Discussion of related literature for robot intention communication and human-robot interaction

Topic 1: Robot Intention Communication and Human-Robot Interaction

  1. Study the given literature to identify the areas of possible improvement
  2. Investigate the integration of LED projector for robot intention communication to humans
  3. Implement a visual signal algorithm to communicate the robot’s intention and direction of motion to humans

Topic 2: Probabilistic SLAM

  1. Review the AMCL approach to track the pose of our robot against the map of the environment while mapping.
  2. Design a SLAM algorithm based on the probabilistic framework and evaluate its performance in terms of loop closure detection, accurate pose estimation and computational cost.
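AMCL-style pose tracking boils down to a particle filter: predict with the motion model, weight by the measurement, resample. The sketch below is a deliberately minimal 1D Monte Carlo localizer to make that cycle concrete; it is not AMCL itself, and the Gaussian sensor model and noise levels are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def particle_filter_step(particles, control, measurement, noise_std=0.1):
    """One predict-update-resample cycle of a minimal 1D Monte Carlo
    localizer over an array of particle positions."""
    # Predict: apply the motion model with additive noise.
    particles = particles + control + rng.normal(0.0, noise_std, len(particles))
    # Update: Gaussian likelihood of the range measurement per particle.
    weights = np.exp(-0.5 * ((particles - measurement) / noise_std) ** 2)
    weights /= weights.sum()
    # Resample proportionally to the weights.
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx]
```

AMCL adds adaptive sample sizes and a 2D laser likelihood model on top of exactly this loop.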

Future Steps

Integrate a deep learning approach with a Faster R-CNN model to:

  • detect, for example, whether the person interacting with the robot is wearing a face mask
  • classify whether the face mask worn is of a recommended type or not

Next Meeting

Yet to be scheduled

13.10.2021 – Innovative Research Discussion

Meeting on the 13th of October 2021

Location: Chair of CPS

Date & Time: 13th October 2021, 1:00 pm to 2:00 pm

Participants: Univ.-Prof. Dr. Elmar Rueckert, Linus Nwankwo, M.Sc.

 

Agenda

  1. Review of the previous meeting action points
  2. Discussion of the research progress and possible improvement

Topic 1: Review of the SLAM state-of-the-art

  1. Review the current literature to identify the areas of possible improvement
  2. Investigate the possible integration of RGB-D sensors with thermal sensors for indoor or outdoor SLAM
  3. Evaluate the impact of semantic segmentation and pose estimation on the quality of the map in dynamic environment

Topic 2: Existing Algorithms Performance Evaluation

  1. Implement a loop closure trigger for possible use with any of the SLAM algorithms
  2. Implement the following SLAM algorithms and evaluate their performance in terms of optimality, accurate pose estimation, computational cost, etc.
  • RTABMap
  • Hector SLAM
  • GMapping
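The simplest form a loop-closure trigger (item 1 above) can take is geometric: flag pose pairs that are close in space but far apart along the trajectory. This Python sketch is only a naive O(N²) baseline under that assumption; real triggers in RTABMap and friends use appearance-based place recognition instead.

```python
import numpy as np

def loop_closure_candidates(poses, min_gap=10, radius=0.5):
    """Naive loop-closure trigger: return (i, j) index pairs of poses
    that are within `radius` of each other spatially but separated by
    more than `min_gap` steps along the trajectory."""
    poses = np.asarray(poses)
    pairs = []
    for i in range(len(poses)):
        for j in range(i + min_gap, len(poses)):
            if np.linalg.norm(poses[i] - poses[j]) < radius:
                pairs.append((i, j))
    return pairs
```

On a square trajectory that returns near its start, only the start/end pair fires, which is the behaviour a trigger needs before handing candidates to the pose-graph optimizer.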

Next Steps

Implement our own SLAM algorithm robust to:

  • Odometry loss
  • Variation in weather and illumination
  • Weak dynamic-object detection

Next Meeting

Yet to be scheduled

B.Sc. or M.Sc. Thesis/Project: Simultaneous Localization and Mapping (SLAM) in challenging real-world environments.

Supervisor: Linus Nwankwo, M.Sc.;
Univ.-Prof. Dr Elmar Rückert
Start date: ASAP, e.g., 1st of October 2021

Theoretical difficulty: low
Practical difficulty: high

Introduction

The SLAM problem as described in [3] is the problem of building a map of the environment while simultaneously estimating the robot’s position relative to the map given noisy sensor observations. Probabilistically, the problem is often approached by leveraging the Bayes formulation due to the uncertainties in the robot’s motions and observations. 
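The recursive Bayes formulation referred to above can be written out explicitly (notation as in [3]: x_t the robot pose, m the map, z_{1:t} the observations, u_{1:t} the controls, and η a normalizer):

```latex
p(x_t, m \mid z_{1:t}, u_{1:t}) \;=\;
  \eta\, p(z_t \mid x_t, m)
  \int p(x_t \mid x_{t-1}, u_t)\,
       p(x_{t-1}, m \mid z_{1:t-1}, u_{1:t-1})\, dx_{t-1}
```

The motion model p(x_t | x_{t-1}, u_t) and the observation model p(z_t | x_t, m) are exactly where the uncertainties in the robot's motions and observations enter.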

SLAM has found many applications not only in navigation, augmented reality, and autonomous vehicles, e.g., self-driving cars and drones, but also in indoor & outdoor delivery robots, intelligent warehousing, etc. While many possible solutions have been presented in the literature to solve the SLAM problem, in challenging real-world scenarios with feature-poor or geometrically constrained characteristics, the reality is far different.

 

Some of the most common challenges with SLAM are the accumulation of errors over time due to inaccurate pose estimation (localization errors) as the robot moves from the start location to the goal location, and the high computational cost of image and point-cloud processing and optimization [1]. These challenges can cause significant deviations from the actual values and at the same time lead to inaccurate localization if the images and point clouds are not processed at a very high frequency [2]. This would also impair the frequency with which the map is updated, and hence the overall efficiency of the SLAM algorithm would be affected.

For this thesis, we propose to investigate in depth the visual or LiDAR SLAM approach using our state-of-the-art Intel Realsense cameras and light detection and ranging (LiDAR) sensors. For this, the following concrete tasks will be focused on:

Tentative Work Plan

  • Study the concept of visual or LiDAR-based SLAM as well as its application in the survey of an unknown environment.
  • Perform 2D/3D mapping in both static and dynamic environments.
  • Localise the robot in the environment using the adaptive Monte Carlo localization (AMCL) approach.
  • Write a path planning algorithm to navigate the robot from the starting point to the destination while avoiding collisions with obstacles.
  • Perform real-time experimentation, simulation (MATLAB, ROS & Gazebo, Rviz, C/C++, Python, etc.) and validation.

About the Laboratory

The Robotics & AI-Lab of the Chair of Cyber-Physical Systems is an innovative research lab focusing on robotics, artificial intelligence, machine and deep learning, embedded smart sensing systems and computational models. To support its research and training activities, the laboratory currently has:
  • an additive manufacturing unit (3D and laser printing technologies)
  • a metallic production workshop
  • a robotics unit (mobile robots, robotic manipulators, robotic hands, unmanned aerial vehicles (UAVs))
  • a sensors unit (Intel Realsense (LiDAR, depth and tracking cameras), Inertial Measurement Units (IMU), OptiTrack cameras, etc.)
  • an electronics and embedded systems unit (Arduino, Raspberry Pi, etc.)

Expression of Interest

Students interested in carrying out their Master of Science (M.Sc.) or Bachelor of Science (B.Sc.) thesis on the above topic should immediately contact or visit the Chair of Cyber-Physical Systems.

Phone: +43 3842 402 – 1901 

Map: click here

References

[1] V. Barrile, G. Candela, A. Fotia, ‘Point cloud segmentation using image processing techniques for structural analysis’, The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, Volume XLII-2/W11, 2019.

[2] Łukasz Sobczak, Katarzyna Filus, Adam Domanski and Joanna Domanska, ‘LiDAR Point Cloud Generation for SLAM Algorithm Evaluation’, Sensors 2021, 21, 3313. https://doi.org/10.3390/s21103313

[3] Wolfram Burgard, Cyrill Stachniss, Kai Arras, and Maren Bennewitz, ‘SLAM: Simultaneous Localization and Mapping’, http://ais.informatik.uni-freiburg.de/teaching/ss12/robotics/slides/12-slam.pdf

18.08.2021 – Innovative Research Discussion

Meeting on the 18th of August 2021

Location: Chair of CPS

Date & Time: 18th Aug. 2021, 1pm to 2pm

Participants: Univ.-Prof. Dr. Elmar Rueckert, Linus Nwankwo, M.Sc.

 

Agenda

  1. Review of the previous meeting action points
  2. Discussion of the first research topic
  3. Robot for Control, SLAM, Navigation & Motion Planning Research

Topic 1: Design and Implementation of an Environment Survey Robot

Considerations

Safe human-robot interaction: The robot must project its direction of motion in the working environment (enabling humans to note the direction in which the robot is moving).

Navigation and Motion Planning: Move around the environment with smooth (non-jerky) turning and gradual stopping motions. The robot must at the same time detect and avoid obstacles in the workspace.

Localization and Mapping: Update the map of the unknown environment while at the same time keeping track of the robot's current location with respect to the map.

Topic 2: Robots into consideration for the tasks

  1. Segway RMP Mobile Platform – RMP Lite 220/400 
  2. Segway RMP Mobile Platform – RMP Smart 260 – request for technical details sent (awaiting reply)

Next Steps

  1. Investigate the interconnection of Realsense cameras as a Kalman filter
  2. Experimentation with the Lego EV3

Next Meeting

The next meeting was scheduled for Thursday, 26th August 2021, 11:00 am – 12:00 pm.

14.08.2021 – Meeting Notes

Meeting on the 13th of August 2021

Location: Chair of CPS

Date & Time: 13th Aug. 2021, 1pm to 2pm

Participants: Univ.-Prof. Dr. Elmar Rueckert, Linus Nwankwo, M.Sc.

 

Agenda

  1. Organisational progress
  2. Feedback on the research talk presentation
  3. Topics of promising future research directions
  4. Next steps
  5. Date of the next meeting

Topic 1: Organisational Update

  1. Resident permit application – still in progress (awaiting appointment)

Topic 2: Feedback on the Research Talk Presentation

  1. Use of the University of Leoben slide template is required for official presentations.
  2. Social media platforms (Facebook, LinkedIn, Instagram, etc.) should not be used as footers or headers in any presentation.
  3. Microsoft Office tools, LaTeX, etc. can be used for document preparation.

Topic 3: Topics of Promising Future Research Directions

  1. Linear Quadratic Regulator (LQR) optimization using reinforcement learning: In this topic, we propose to study how reinforcement learning techniques could be used to optimize LQR controllers.
  2. Design and implementation of an Environmental Survey Robot (to be called MoLBot): In this topic, we propose to develop a telepresence robot to be used as a demonstration system for testing control designs, navigation and motion planning algorithms, with the capability of interacting with the environment (providing one with a remote pair of mobile eyes and ears for remote presence at any location). Interactive and sensing features should include a touch screen, a LiDAR sensor, an RGB-D camera, or a laser scanner, etc.
  3. Development of an Innovative Stair-Climbing Robot: This project is proposed to develop a robot capable of climbing floor steps. The type of wheels needed to adapt the robot for step climbing is not yet decided.
  4. Elevator Caller Robot: This project aims to develop a robot that can autonomously operate the university elevator. The features should at least include ROS Bluetooth with secure encryption.
  5. EV3 Lego Project: In this project, we propose to assemble the existing EV3 Lego robot parts and train students on environmental perception, navigation and motion planning.
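For topic 1 above, the classical (non-learned) baseline that any reinforcement-learning approach would be compared against is the LQR gain obtained by iterating the discrete Riccati recursion to a fixed point. A minimal numpy sketch (function name illustrative):

```python
import numpy as np

def dlqr_gain(A, B, Q, R, iters=200):
    """Discrete-time LQR gain K (u = -K x) obtained by iterating the
    Riccati recursion P <- Q + A'P(A - BK) with K = (R + B'PB)^-1 B'PA."""
    P = Q.copy()
    for _ in range(iters):
        K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
        P = Q + A.T @ P @ (A - B @ K)
    return K
```

For the scalar system A = B = Q = R = 1 the recursion converges to the golden-ratio cost, giving K = (√5 − 1)/2 ≈ 0.618, a handy closed-form check for any learned controller.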

Next Steps

  1. Search for adaptable robot wheels for the implementation of topic 3 above
  2. Search for a robot base for the implementation of topic 2 above

Next Meeting

The next meeting was scheduled for Wednesday, 18th August 2021, 11:00 am – 12:00 pm.