
Meeting on the 5th, October 2022

Location: Chair of CPS

Date & Time: 5th Oct 2022

Participants: Univ.-Prof. Dr. Elmar Rueckert, DI Nikolaus Feith, BSc

 

Agenda

  1.  Update and more detailed definition of the project
  2. Master thesis GNN for Motion Planning
  3. Next steps

Top 1: Update and more detailed definition of the project

  • Last week, work continued on the ROS framework. There were problems with the inverse kinematics solver because the Jacobian matrix is generated incorrectly by the Python packages in use. For the time being, we will work only with Franka's Impedance Effort Controller.
  • Furthermore, the CPS Lecture Repository was created and the CoppeliaSim simulation file and the Python script to operate it were programmed. In addition, a description of how to work with the simulation was written.
  • Further details of the ROS framework were discussed. The framework should work with different simulation programs (CoppeliaSim, Isaac Sim). The goal is to implement an RL environment in which an initial solution can be given with the pen on the tablet. This initial solution (a trajectory) will be used as motion primitives (MPs) for the RL, whose weights are to be learned. Furthermore, it should be possible to manipulate the MPs in real time, so that the user can further refine the trajectory during the learning process. The Gym library will be used for the evaluation. For the experiments, the lathe and the milling machine will be used (button press, lever operation, etc.).
  • Related work shall be researched, in particular whether such systems/frameworks have already been developed with ROS2.
  • Not all sensors, hardware devices, robots, etc. need to be implemented by us, but there needs to be a template for the different parts so that everyone can extend the framework.
  • Xbox Controller: The Xbox controller will be used for teleoperating the robots, and the possibility of recording experiments should also be implemented, e.g., fixing a configuration so that the same experiment can be recorded repeatedly from the same starting configuration, or separating recordings by a button press so that the experiment has to be started only once rather than anew for every run.
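The planned RL setup (an initial tablet trajectory encoded as motion primitives whose weights are learned) could be prototyped as a minimal Gym-style environment. The sketch below is illustrative only, not part of the framework: the class name `MPTrajectoryEnv`, the RBF parameterisation, and the placeholder target trajectory are all assumptions.

```python
import numpy as np

class MPTrajectoryEnv:
    """Minimal Gym-style sketch: actions are movement-primitive (MP)
    weights, and the trajectory is a weighted sum of radial basis
    functions (RBFs) over normalised time."""

    def __init__(self, n_basis=5, n_steps=50):
        centers = np.linspace(0, 1, n_basis)
        t = np.linspace(0, 1, n_steps)
        # Basis matrix: one row per time step, one column per primitive
        self.phi = np.exp(-((t[:, None] - centers[None, :]) ** 2) / 0.02)
        self.phi /= self.phi.sum(axis=1, keepdims=True)
        # Placeholder for the user's initial pen trajectory
        self.target = np.sin(np.pi * t)

    def reset(self):
        # Observation: the demonstrated target trajectory
        return self.target

    def step(self, weights):
        # Roll out the trajectory encoded by the MP weights
        traj = self.phi @ np.asarray(weights)
        reward = -np.mean((traj - self.target) ** 2)  # tracking cost
        return traj, reward, True, {}  # single-step episode
```

Learning would then amount to searching over the weight vector passed to `step()`; a real implementation would subclass `gym.Env` and return the robot state as the observation.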

Top 2: Master thesis GNN for Motion Planning

  1. Definition of a master thesis with the topic: GNN for Motion Planning
  2. https://github.com/rainorangelemon/gnn-motion-planning
  3. https://rainorangelemon.github.io/NeurIPS2021/paper.pdf

Top 3: Next Steps

  1. ROS
  2. ROS2
  3. Tablet for shared control (user correction)
  4. Hardware: which hardware is needed to work with the Framework



04.10.2022 – Innovative Research Discussion

Meeting notes on the 4th of October, 2022

Location: Chair of CPS

Date & Time: 4th October, 2022, 12:15 pm to 1:15 pm

Participants: Univ.-Prof. Dr. Elmar Rueckert, Linus Nwankwo, M.Sc.

Agenda

  1. Discussion of the research progress
  2. Discussion on the HardwareX journal publication
  3. Discussion of the next conference publication

Journal Publication: HardwareX

  1.  Put the HardwareX manuscript on arXiv to enable us to:
  • cite it in the subsequent article
  • obtain the DOI
  • update the publication entry in our cloud with the arXiv number

Conference Paper: O2S: Open Source Open Shuttle – A comparison of SLAM algorithms

  1.  Start working on the conference paper using the shared LaTeX template provided
  • Compare the various 2D SLAM algorithms
  • Establish the key metrics for the evaluation
  • Evaluate their performance in the real world with the O2S
  • Evaluate visual SLAM (optional)
  • Check how 2D SLAM can be combined with RGB-D cameras and a deep neural network to improve map quality (optional)

Future Steps

Intention signalling to improve human-robot interaction (HRI)

Next Meeting

Yet to be defined




30.09.2022 Meeting Notes

Meeting Details

Date : 30th September 2022

Time : 11:30 – 12:30

Location : Chair of CPS, Montanuniversität Leoben

Participants: Univ.-Prof. Dr. Elmar Rueckert, Vedant Dave

Agenda

  1. Get the Humanoids paper ready based on the reviewers' comments
  2. Extend the Conference paper for the Journal
  3. Active Exploration with Forward and Inverse Model learning

Topic 1: Humanoids Paper

  1. Change the paper according to the reviews.
  2. Add Real-world Experiments.

Topic 2: Science Robotics Paper

  1. Extend the paper for learning objects at different locations.
  2. Conduct experiments with multiple objects on the table.
  3. Enable object tracking and extend it.
  4. Extension to Riemannian manifolds to reduce the orientation errors.

Topic 3: Active Exploration

  1. Survey on Exploration strategies and Empowerment.
  2. Trying to work on relationships between Maximum Entropy of Latent variables and Tasks.
  3. Trying to find literature on Learning Phase Jumps.
  4. Goal Babbling.

Literature

Inverse Dynamic Predictions

  1. S. Bechtle, B. Hammoud, A. Rai, F. Meier and L. Righetti, “Leveraging Forward Model Prediction Error for Learning Control,” 2021 IEEE International Conference on Robotics and Automation (ICRA), 2021, pp. 4445-4451, doi: 10.1109/ICRA48506.2021.9561396.
  2. Eysenbach, Benjamin, et al. “Diversity is all you need: Learning skills without a reward function.” arXiv preprint arXiv:1802.06070 (2018).
  3. Klyubin, Alexander S., Daniel Polani, and Chrystopher L. Nehaniv. “All else being equal be empowered.” European Conference on Artificial Life. Springer, Berlin, Heidelberg, 2005.

Next Meeting

TBD 




Digital Competencies – Data Safety, Privacy and Content Search on the net

Getting started with Pytorch using Cuda acceleration

This tutorial gives instructions for installing CUDA and enabling CUDA acceleration with PyTorch on Windows 10. Installation on Linux or Mac systems is also possible. An additional .py file will verify whether the current computer configuration uses CUDA or not. The following instructions assume that you have already installed a Python IDE, e.g., Anaconda, PyCharm, Visual Studio…

Step 1: Check which CUDA version is supported by your current GPUs under this website. For example, the A100 supports CUDA 11.0, and other blogs/forums report that the A100 can also support CUDA 11.1. In this post, we install CUDA 11.1.

Step 2: Download the Nvidia CUDA Toolkit 11.1 (the same version as the CUDA in Step 1) from the website. On Windows 10, for instance, choose the exe (local) installer, which is around 3.1 GB. After downloading, run the .exe and perform the installation; it may take a few minutes to complete.


Step 3: On the PyTorch homepage, choose the appropriate options (OS, package manager, language, CUDA version). IMPORTANT: The CUDA version must be the same as in Step 1. It is also recommended to use the Stable version. After finishing the selection, copy the generated command into the Anaconda PowerShell Prompt or whichever command prompt you use to install Python packages. The installation may require more than 1 GB of disk space and take a few minutes. You can also find historical versions of PyTorch on that homepage.

Verify your installation with a .py file

You can download a cuda-test.py file and run it. If the result shows ‘cuda’, you can enjoy CUDA acceleration for training neural networks!
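The cuda-test.py file itself is not reproduced in this post; a minimal equivalent check (assuming PyTorch is already installed as in Step 3) could look like this:

```python
import torch

# Report whether PyTorch can see a CUDA device, and select it if so
use_cuda = torch.cuda.is_available()
device = torch.device("cuda" if use_cuda else "cpu")
print("CUDA available:", use_cuda)
print("Selected device:", device)

# Small sanity computation on the selected device
x = torch.rand(3, 3, device=device)
print("Tensor sum:", x.sum().item())
```

If the script prints `cuda`, training will run on the GPU; on a CPU-only machine it falls back gracefully to the CPU.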

Using Multiple GPUs for further acceleration

Running PyTorch with multiple GPUs can further increase efficiency. We have 8 GPU cards, which can be used in parallel for training. Please refer to (1) (2) (3) for details.
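As a hedged sketch of the simplest multi-GPU option, `torch.nn.DataParallel` splits each batch across all visible GPUs (for larger jobs, PyTorch generally recommends `DistributedDataParallel` instead); the tiny model below is only illustrative:

```python
import torch
import torch.nn as nn

model = nn.Linear(10, 2)
if torch.cuda.device_count() > 1:
    # Replicate the model on every visible GPU; inputs split along dim 0
    model = nn.DataParallel(model)
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model = model.to(device)

# The forward pass looks the same with one GPU, many GPUs, or CPU only
batch = torch.rand(16, 10, device=device)
out = model(batch)
print(out.shape)
```

With 8 GPUs, each card would receive a slice of 2 samples from this batch of 16; the outputs are gathered back automatically.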




150.000 MINT – Digital competencies (0.66SH P, WS)


This introductory course discusses the major competencies all students need for their studies at the MUL. The Chair of CPS provides tutorials on

  • Data Safety, Privacy and Content Search on the net.
  • Learning Python using online tools.
  • Learning to develop 3D-CAD models using online tools.
  • Using powerful online team working tools including shared documents.
  • Using data repositories and creating your personal webpage.

 

Links and Resources

Location & Time

  • See the MUOnline link. 
  • CPS presentations are on 17.10.2022 at 11:00 in the HS1 Studierendenzentrum.

Posts on Digital Competencies




Meeting on the 28th, September 2022

Meeting on the 13th of August 2021

Location: Chair of CPS

Date & Time: 13th Aug. 2021, 9 am to 9:30 am

Participants: Univ.-Prof. Dr. Elmar Rueckert, Nikolaus Feith, BSc

 

Agenda

  1.  Project definition
  2. Update and next steps

Top 1: Project definition

Three main projects:

  1. ROS CPS Framework
  2. Reimplementation GNN for Motion Planning
  3. Literature research

The development of a ROS/ROS2 framework to control the CPS robots.
Furthermore, an interface for using a tablet is to be programmed in ROS2. On the tablet, trajectories (movement primitives) can be designed and changed directly with the pen. This requires a simple GUI for displaying and manipulating trajectories and for displaying the robot in 3D in real time, as well as support for the CPS glove.
This program is intended as an application of active learning methods, especially shared and preference learning.
Finally, a connection will be made to the methods of probabilistic inference; see GNN for Motion Planning.
If possible, a comprehensive platform including ROS Mobile should be developed.

Top 2: Update and next steps

  1. Update: 
    1. Last week, a Joint Effort Controller was implemented and tested in Gazebo. Its application in combination with the inverse solvers of the Robotics Toolbox was tested, but still leads to errors in the control.
    2. Literature research: Bishop
  2. Next steps:
    1. In the coming week, these errors will be corrected and the tests on the Franka robot arm will start.
    2. Literature research: finish Bishop



Integrated CPS Project or B.Sc. Thesis: Mobile Navigation via micro-ROS

Supervisors:

Start date: October 2022

 

Qualifications

  • Interest in controlling and simulating mobile robotics
  • Interest in Programming in Python and ROS or ROS2

 Keywords: Mobile robot control, robot operating system (ROS), ESP32

Description

The goal of this project or thesis is to develop a control and sensing interface for our mobile robot “RMP220”. The RMP220 has two powerful brushless motors equipped with two magnetic encoders.

In this project, you will learn how to read the sensor values and how to control the motors via micro-ROS on an ESP32 controller.

Links:

 

Note: This project is also offered as an internship position.


https://www.youtube.com/watch?v=-MfNrxHXwow

Single Person Project or Team Work

You may work on the project alone or in teams of up to 4 persons.

For a team work task, the goals will be extended to control the robot via ROS 2 and to simulate it in Gazebo or RViz.

Interested?

If this project sounds like fun to you, please contact Linus Nwankwo or Elmar Rueckert, or simply visit us at our chair in the Metallurgie building, 1st floor.




Internship Position – Mixed Reality Robot Teleoperation with Hololens 2

Description

A Mixed Reality (AR) interface based on Unity 3D for intuitive programming of robotic manipulators (UR3). The interface will be implemented on the ROS 2 robotic framework.

Note: This project is also offered under the Integrated CPS project course or as a B.Sc. or M.Sc. thesis.

Qualifications

  • Basic skills in Python or C++
  • ROS
  • Unity3D or C#

 Keywords: Augmented Reality, Robotic Interfaces, Engineering, Graphical Design

Duration

Minimum 3 months. Preferably 5-6 months.

Abstract

Robots will become a necessity for every business in the near future. Companies that rely heavily on the constant manipulation of objects will especially need to repurpose their robots continually to meet ever-changing demands. Furthermore, with the rise of machine learning, human collaborators or “robot teachers” will need a more intuitive interface to communicate with robots, both when interacting with them and when teaching them.

In this project we will develop a novel Mixed (Augmented) Reality Interface for teleoperating the UR3 robotic manipulator. For this purpose we will use AR glasses to augment the user’s reality with information about the robot and enable intuitive programming of the robot. The interface will be implemented on a ROS 2 framework for enhanced scalability and better integration potential to other devices.

Outcomes

This internship will result in an innovative graphical interface that enables non-experts to program a robotic manipulator.

The intern will gain valuable experience with the Robot Operating System (ROS) framework and with developing graphical interfaces in Unity. The intern will also gain a good understanding of robotic manipulators (like the UR3) and complete a full engineering project.

Funding

Please contact us via cps@unileoben.ac.at if you want to join us for an internship.

We will support you during your application for an internship grant. Below we list some relevant grant application details.

CEEPUS grant (European for undergrads and graduates)

Find details on the Central European Exchange Program for University Studies program at https://grants.at/en/ or at https://www.ceepus.info.

In principle, you can apply for a scholarship at any time. However, your country of origin also matters, and several countries form networks with their own contingents.

Ernst Mach Grant (Worldwide for PhDs and Seniors)

Find details on the program at https://grants.at/en/ or at https://oead.at/en/to-austria/grants-and-scholarships/ernst-mach-grant.

Other Funding Resources

Apply online at http://www.scholarships.at/




Integrated CPS Project or B.Sc./M.Sc. Thesis: Mixed Reality Robot Teleoperation with Hololens 2

Supervisors:

Start date: October 2022

 

Qualifications

  • Basic skills in Python or C++
  • ROS or Unity3D/C#

 Keywords: Augmented Reality, Robotic Interfaces, Engineering, Graphical Design

Description

A Mixed Reality (AR) interface based on Unity 3D for intuitive programming of robotic manipulators (UR3). The interface will be implemented on the ROS 2 robotic framework.

Note: This project is also offered as an internship position.


https://www.youtube.com/watch?v=-MfNrxHXwow

Abstract

Robots will become a necessity for every business in the near future. Companies that rely heavily on the constant manipulation of objects will especially need to repurpose their robots continually to meet ever-changing demands. Furthermore, with the rise of machine learning, human collaborators or “robot teachers” will need a more intuitive interface to communicate with robots, both when interacting with them and when teaching them.

In this project we will develop a novel Mixed (Augmented) Reality Interface for teleoperating the UR3 robotic manipulator. For this purpose we will use AR glasses to augment the user’s reality with information about the robot and enable intuitive programming of the robot. The interface will be implemented on a ROS 2 framework for enhanced scalability and better integration potential to other devices.

Outcomes

This thesis will result in an innovative graphical interface that enables non-experts to program a robotic manipulator.

The student will gain valuable experience with the Robot Operating System (ROS) framework and with developing graphical interfaces in Unity. The student will also gain a good understanding of robotic manipulators (like the UR3) and complete a full engineering project.

Interested?

If this project sounds like fun to you, please contact Fotios Lygerakis by email at fotios.lygerakis@unileoben.ac.at or simply visit us at our chair in the Metallurgie building, 1st floor.




13.09.2022 Meeting Notes

Meeting Details

Date : 13th September 2022

Time : 12:30 – 1:30

Location : Chair of CPS, Montanuniversität Leoben

Participants: Univ.-Prof. Dr. Elmar Rueckert, Vedant Dave

Agenda

  1. Learning Consistent Forward and Inverse Dynamics.

Topic 1: Idea Development

  1. Thinking in terms of Closed loop systems and feedback controllers.
  2. Regularizing Forward model via Inverse model.
  3. Single-step and Multi-step prediction models.
  4. Comparing Multi-step predictions with Movement Primitives.
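Point 3 above can be illustrated with a tiny numerical sketch (the toy dynamics and all numbers are illustrative assumptions): a single-step model that fits the data well can still drift badly when its own predictions are fed back in over multiple steps.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy nonlinear dynamics: x_{t+1} = sin(x_t) + small noise
xs = rng.uniform(-2, 2, 500)
ys = np.sin(xs) + 0.01 * rng.normal(size=500)

# Fit a single-step linear forward model x_{t+1} ≈ a * x_t + b
A = np.stack([xs, np.ones_like(xs)], axis=1)
(a, b), *_ = np.linalg.lstsq(A, ys, rcond=None)

# Single-step prediction error on the training distribution
one_step_err = np.mean(np.abs(a * xs + b - ys))

# Multi-step rollout: feed predictions back in; errors compound
x_true, x_pred = 1.5, 1.5
rollout_errs = []
for _ in range(10):
    x_true = np.sin(x_true)
    x_pred = a * x_pred + b
    rollout_errs.append(abs(x_pred - x_true))

print(one_step_err, rollout_errs[-1])
```

The compounding rollout error is one motivation for comparing multi-step predictions with movement primitives, which encode a whole trajectory at once.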

Topic 2: Toy Example

  1. Generate a toy dataset (temperature) with just a single parameter.
  2. Try a forward model to approximate out-of-distribution test data.
  3. If it fails, try to regularize it with the inverse model and check whether that works.
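The first two steps could be prototyped as below; the sinusoidal “temperature” process, the polynomial forward model, and the train/test split are all illustrative assumptions (the inverse-model regularization of step 3 is not shown).

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy "temperature" process with a single parameter: a daily sinusoid
def temperature(hour):
    return 15 + 10 * np.sin(2 * np.pi * hour / 24)

# Train a polynomial forward model only on morning hours (in-distribution)
train_h = rng.uniform(0, 12, 200)
train_T = temperature(train_h) + 0.1 * rng.normal(size=200)
coeffs = np.polyfit(train_h, train_T, deg=3)

# Evaluate in-distribution vs. out-of-distribution (afternoon hours)
in_h = np.linspace(0, 12, 50)
ood_h = np.linspace(12, 24, 50)
in_err = np.mean(np.abs(np.polyval(coeffs, in_h) - temperature(in_h)))
ood_err = np.mean(np.abs(np.polyval(coeffs, ood_h) - temperature(ood_h)))
print(in_err, ood_err)
```

The forward model fits the morning hours it was trained on but degrades sharply on the unseen afternoon hours, which is exactly the failure case that step 3 would try to repair.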

Literature

Inverse Dynamic Predictions

  1. Cooper, Richard. (2010). Forward and Inverse Models in Motor Control and Cognitive Control. Proceedings of the International Symposium on AI Inspired Biology – A Symposium at the AISB 2010 Convention.
  2. Moore, Andrew. “Fast, robust adaptive control by learning only forward models.” Advances in neural information processing systems 4 (1991).

Next Meeting

TBD