
HRI-SL: Human-Robot Interaction with Sign Language

Start date: Open

Location: Leoben

Position Types: Thesis/Internship

Duration: 3-6 months, depending on the applicant’s proficiency in the required qualifications.

 

Keywords: Human-Robot Interaction (HRI), Human Gesture Recognition, Sign Language, Robotics, Computer Vision, Large Language Models (LLMs), Behavior Cloning, Reinforcement Learning, Digital Twin, ROS-2

Supervisor:

You can work on this project either by doing a B.Sc. or M.Sc. thesis or an internship*.

Abstract

As the interaction with robots becomes an integral part of our daily lives, there is an escalating need for more human-like communication methods with these machines. This surge in robotic integration demands innovative approaches to ensure seamless and intuitive communication. Incorporating sign language, a powerful and unique form of communication predominantly used by the deaf and hard-of-hearing community, can be a pivotal step in this direction. 

By doing so, we not only provide an inclusive and accessible mode of interaction but also establish a non-verbal and non-intrusive way for everyone to engage with robots. This evolution in human-robot interaction will undoubtedly pave the way for more holistic and natural engagements in the future.

Image: robot hand communicating with sign language (DALL·E)

Project Description

The implementation of sign language in human-robot interaction will not only improve the user experience but will also advance the field of robotics and artificial intelligence.

This project will encompass five crucial elements.

  1. Human Gesture Recognition with CNNs and/or Transformers – Recognizing human gestures in sign language through the development of deep learning methods utilizing a camera.
    • Letter-level
    • Word/Gloss-level
  2. Chat Agent with Large Language Models (LLMs) – Developing a gloss chat agent.
  3. Finger Spelling/Gloss gesture with Robot Hand/Arm-Hand –
    • Human Gesture Imitation
    • Behavior Cloning
    • Offline Reinforcement Learning
  4. Software Engineering – Create a seamless human-robot interaction framework using sign language.
    • Develop a ROS-2 framework
    • Develop a robot digital twin on simulation
  5. Human-Robot Interaction Evaluation – Evaluate the methods and adopt those that enable the most human-like interaction with a robotic signer.
Hardware Set-Up for Character-level Human-Robot Interaction with Sign language.
Example of letter-level HRI with sign language: Copying agent
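The recognition component (element 1 above) could start from something as simple as a softmax classifier over hand-landmark features. The sketch below is a minimal stand-in, assuming flattened landmark vectors such as those produced by a hand-tracking library (e.g. MediaPipe, 21 landmarks × 3 coordinates); the synthetic data and three-letter set are illustrative assumptions, not the project's actual pipeline.

```python
import numpy as np

# Minimal letter-level classifier sketch: softmax regression over flattened
# hand-landmark coordinates (assumed 21 landmarks x 3 coords per frame).
rng = np.random.default_rng(0)
N_LANDMARKS, N_LETTERS = 21, 3           # e.g. letters 'A', 'B', 'C' (illustrative)
D = N_LANDMARKS * 3                      # flattened feature dimension

# Synthetic stand-in for real landmark data: one cluster per letter.
centers = rng.normal(size=(N_LETTERS, D))
X = np.concatenate([c + 0.1 * rng.normal(size=(50, D)) for c in centers])
y = np.repeat(np.arange(N_LETTERS), 50)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# Train a linear softmax classifier with plain gradient descent.
W = np.zeros((D, N_LETTERS))
onehot = np.eye(N_LETTERS)[y]
for _ in range(200):
    P = softmax(X @ W)
    W -= 0.1 * X.T @ (P - onehot) / len(X)

accuracy = (softmax(X @ W).argmax(axis=1) == y).mean()
```

In practice a CNN or Transformer (as named in element 1) would replace the linear layer, but the data flow — landmarks in, letter probabilities out — stays the same.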

Qualifications

  • Currently pursuing a Bachelor’s or Master’s degree in Computer Science, Electrical Engineering, Mechanical Engineering, Mathematics or related fields.
  • Strong programming skills in Python and experience with deep learning frameworks such as PyTorch or TensorFlow.
  • Experience working with robotics hardware.
  • Knowledge of computer vision and image processing techniques.
  • Strong problem-solving skills and ability to work independently and collaboratively.
  • Good written and verbal communication skills in English.
  • Passion for creating technology that is accessible and inclusive for everyone.
  • Experience in working on research projects or coursework related to robotics or artificial intelligence is a plus.

Opportunities

This project provides an excellent opportunity to gain hands-on experience in cutting-edge research, working with a highly collaborative and supportive team. The student/intern will also have the opportunity to co-author research papers and technical reports, and participate in conferences and workshops.

Application/Questions

Send us your CV accompanied by a letter of motivation at fotios.lygerakis@unileoben.ac.at with the subject: “Internship/Thesis Application | Sign Language Robot Hand”

Funding

* This project does not offer a funded position. Below we list some relevant grant application details.

CEEPUS grant (Europe-wide, for undergraduates and graduates)

Find details on the Central European Exchange Program for University Studies program at https://grants.at/en/ or at https://www.ceepus.info.

In principle, you can apply for a scholarship at any time. Note, however, that your country of origin matters: networks of several countries maintain their own contingents.

Ernst Mach Grant (Worldwide for PhDs and Seniors)

Find details on the program at https://grants.at/en/ or at https://oead.at/en/to-austria/grants-and-scholarships/ernst-mach-grant.

Other Funding Resources

Apply online at http://www.scholarships.at/




Sign Language Robot Hand [M.Sc. Thesis/Int. CPS Project]

Abstract

Human-Robot Interaction using Sign Language is a project that aims to revolutionize the way we communicate with machines. With the increasing use of robots in our daily lives, it is important to create a more natural and intuitive way for humans to communicate with them.

Sign language is a unique and powerful form of communication that is widely used by the deaf and hard-of-hearing community. By incorporating sign language into robot interaction, we can create a more inclusive and accessible technology for everyone.

Moreover, sign language will provide a new and innovative way to interact with robots, making it possible for people to control and communicate with them in a way that is both non-verbal and non-intrusive.

Note: This project is also offered as an internship position.

Image: robot hand communicating with sign language (DALL·E)

Thesis Description

The implementation of sign language in human-robot interaction will not only improve the user experience but will also advance the field of robotics and artificial intelligence. This project has the potential to bring about a new era of human-robot interaction, where machines and humans can communicate in a more natural and human-like way. Therefore, the Human-Robot Interaction using Sign Language project is a crucial step toward creating a more accessible and user-friendly technology for everyone.

This thesis will encompass three crucial elements. The first part will focus on recognizing human gestures in sign language through the development of deep learning methods utilizing a camera. The second part will involve programming a robotic hand to translate text back into gestures. Finally, the third part will bring together the first two components to create a seamless human-robot interaction framework using sign language.
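The second part — translating text back into gestures — can be illustrated, at its simplest, as a lookup from characters to finger-joint targets for the robot hand. The joint values and letter set below are invented placeholders for illustration, not real ASL handshapes or the joint limits of any particular hand.

```python
# Sketch of the text-to-gesture direction: map each character of a word to a
# (hypothetical) finger-joint configuration. Values are made-up placeholders.
FINGERSPELL = {
    "a": [0.0, 1.4, 1.4, 1.4, 1.4],   # thumb..pinky flexion (rad), illustrative
    "b": [1.2, 0.0, 0.0, 0.0, 0.0],
    "c": [0.6, 0.6, 0.6, 0.6, 0.6],
}

def text_to_gestures(text):
    """Translate text into a sequence of joint-angle targets,
    skipping characters without a known handshape."""
    return [FINGERSPELL[ch] for ch in text.lower() if ch in FINGERSPELL]

trajectory = text_to_gestures("abc")
```

A real implementation would send each target through the hand's position controller and interpolate between handshapes; the lookup table itself would be learned or calibrated rather than hand-written.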

Qualifications

  • Currently pursuing a Bachelor’s or Master’s degree in Computer Science, Electrical Engineering, Mechanical Engineering, Mathematics or related fields.
  • Strong programming skills in Python.
  • Experience with deep learning frameworks such as PyTorch or TensorFlow.
  • Experience working with robotics hardware.
  • Knowledge of computer vision and image processing techniques.
  • Good written and verbal communication skills in English.

Interested?

If this project sounds like fun to you, please contact Fotios Lygerakis by email at fotios.lygerakis@unileoben.ac.at or simply visit us at our chair in the Metallurgie building, 1st floor.




Meeting Notes December 2022

Meeting 01/12

Updates

  • experiments with Caltech-101
    • too small a dataset; the network needs pretraining
    • too big images; problems with GPU memory when training and large storage space when saving models
  • refactor code to better scale for more evaluation techniques
  • reviewed XAI methods.
  • Further literature review for representation learning

MS Student Updates

  • Melanie
    • Image Segmentation on Steel Defect dataset
    • Next on: Deep Optical Flow

Other activities

  • HoloLens 2 review
  • plan to publish the AR project as an internship position
    • LinkedIn -> CPS page?
    • MUL
    • Emails
  • Share Christmas video with the public relations team of MUL
  • LinkedIn account -> page

Next on

  • do experiments for interpolating latent space
  • run experiments with more datasets
  • reconstruct the paper for
  • literature review on representation learning
  • seminar talk on latent-space representations and explainability in neural networks (feature maps); organize meetings (1 paper per week)

Meeting 08/12

Updates

  •  

MS Student Updates

  • Melanie
    •  
    • Next on:

Other activities

  • PhD in Computer Science

Next on

  •  



Meeting on the 23rd of November, 2022

Location: Chair of CPS

Date & Time: 23rd Nov 2022

Participants: Univ.-Prof. Dr. Elmar Rueckert, DI Nikolaus Feith, BSc

 

Agenda

  1. Update
  2. Future Steps

Top 1: Update

  • RH8D:
    • Finished the hardware interface, no more communication issues with the left hand.
    • Sample Position controller
  • Webserver:
    • basics of websockets and try-outs with JS/CSS/HTML
    • established connection with ROS2 via rosbridge
    • literature research on related work (shared control, webservers in robotics)
    • search for libraries to display dynamic plots (flot.js or dc.js)
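The rosbridge connection mentioned above exchanges JSON messages over a WebSocket. A sketch of the rosbridge v2 protocol messages a web client would send is shown below; the topic name and message type are examples, not the project's actual topics.

```python
import json

# rosbridge v2 protocol: plain JSON objects with an "op" field, sent over a
# WebSocket. Topic and type below are example values.
subscribe = {"op": "subscribe", "topic": "/joint_states",
             "type": "sensor_msgs/msg/JointState"}
publish = {"op": "publish", "topic": "/cmd", "msg": {"data": "wave"}}

# In the browser these would go through a WebSocket; here we just serialize
# and parse them to show the wire format.
wire = json.dumps(subscribe)
decoded = json.loads(wire)
```

On the JS side, a library such as roslibjs wraps exactly these messages, which is why establishing the rosbridge connection first makes the later plotting work (flot.js or dc.js) independent of ROS specifics.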

Top 2: Future Steps

  •  



Maximilian Pettinger, B.Sc.

Student Assistant at the Montanuniversität Leoben

IMG_E2067[1]

Short bio: Maximilian Pettinger, B.Sc. started at CPS in November 2022.

Maximilian Pettinger is a master student in Polymer Engineering and a bachelor student in Mechanical Engineering, both at the Montanuniversität Leoben. Prior to his master program, he studied Polymer Engineering at the Montanuniversität Leoben, where he passed his Bachelor defense in January 2022. Furthermore, he is a member of the MotoStudent team (MontanFactory Racing) of the University of Leoben.

Research Interests

  • Robotics, micro-ROS

Thesis

Contact

Maximilian Pettinger, B.Sc.
Student Assistant at the Chair of Cyber-Physical-Systems
Montanuniversität Leoben
Franz-Josef-Straße 18, 
8700 Leoben, Austria 

Email:   maximilian.pettinger@stud.unileoben.ac.at




Unitree GO1

https://cps.unileoben.ac.at/wp/UnitreeGO1_firststeops_at_CPS.mov

The video shows our Unitree GO1 robot taking its first steps at CPS. This quadruped robot can locomote on rough terrain, autonomously avoids obstacles such as stones or blocking barriers, and provides a large number of sensors for navigation and mapping research projects.

Links

Videos

  • Research videos using the robot will be presented here. 

 

Publications

  • Publications about the robot as well as related topics will be found here.



10.11.2022 – Innovative Research Discussion

Meeting notes on the 10th of November, 2022

Location: Chair of CPS

Date & Time: 10th November, 2022, 1:30 pm to 2:28 pm

Participants: Univ.-Prof. Dr. Elmar Rueckert, Linus Nwankwo, M.Sc.

 

Agenda

  1. General Discussion
  2. Discussion on the data set from Privatklinik Graz
  3. Next action

General Discussion

  1. New cards were added to the Deck app; check the ones that require actions.
  2. There will be a ROS2 meeting with Nils at 5 pm on 11.11.2022.

Data-set from Privatklinik Graz

  1. Reproduce the failed experiments.

  2. Evaluate the S-PTAM and ORB-SLAM visual SLAM algorithms on the recorded data.

  3. Evaluate Hector SLAM and GMapping algorithms on the recorded dataset(s) with a limited field of view of the lidar data.

    • Remove 90° in the frontal direction
    • Remove 120° in the frontal direction
    • Remove 90° or 120° in the frontal and in the backward direction
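The limited-field-of-view evaluations above can be simulated on the recorded data by invalidating ranges inside the removed sector. A minimal sketch, assuming scans in the usual angle_min/angle_increment form (as in a sensor_msgs/LaserScan); the synthetic scan here is illustrative.

```python
import math

def restrict_fov(ranges, angle_min, angle_inc, remove_deg):
    """Set ranges within +/- remove_deg/2 of the frontal direction (0 rad)
    to inf, mimicking a blocked frontal sector of the lidar."""
    half = math.radians(remove_deg) / 2.0
    out = []
    for i, r in enumerate(ranges):
        angle = angle_min + i * angle_inc
        out.append(float("inf") if -half <= angle <= half else r)
    return out

# Synthetic scan: 360 beams over [-pi, pi), remove a 90 deg frontal sector.
ranges = [1.0] * 360
inc = 2 * math.pi / 360
filtered = restrict_fov(ranges, -math.pi, inc, 90.0)
```

The same function covers the frontal-plus-backward case by applying it a second time with the angles shifted by pi.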

Do next

  1. Implement a filter node that filters out the noise from the data.
  2. Build the map from the hospital’s building plan (bp)
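For the planned filter node, one candidate is a sliding-window median, which suppresses isolated range spikes without blurring edges. The window size below is a guess; in ROS 2 this logic would sit inside a node's scan callback.

```python
def median_filter(values, window=3):
    """Replace each value by the median of its window (edges padded by
    repeating the boundary value)."""
    half = window // 2
    padded = [values[0]] * half + list(values) + [values[-1]] * half
    return [sorted(padded[i:i + window])[half] for i in range(len(values))]

noisy = [1.0, 1.0, 9.0, 1.0, 1.0]   # a single spike at index 2
clean = median_filter(noisy)
```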



Meeting on the 16th, November 2022

Location: Chair of CPS

Date & Time: 5th Oct 2022

Participants: Univ.-Prof. Dr. Elmar Rueckert, DI Nikolaus Feith, BSc

 

Agenda

  1. Update
  2. Future Steps

Top 1: Update

  • A literature search on related work on online reinforcement learning and preference learning was started.
  • The hardware interface for the RH8D was implemented and now needs to be tested. Some issues with exceeding the cycle time of the main loop were found.

Top 2: Future Steps

  • Start working on webservices and ROS2
  • MP:
    • What do we need for kinesthetic teaching, shared control, etc.?
    • Which parameters are changed and how to display them
  • Start with simulation and not a real robot
  • GNN: for the future reimplementation of toy tasks; research what is new in this field or what is still left out
  •  RH8D: Try fixing the communication error with Vedant’s SDK
  • Webserver/Websockets:
    • What are requirements for the webserver/websocket to use it with ROS2 and shared control



Customs / Imports / Exports

For exports and imports from/to non-EU countries

  • Customs regulations: https://mydhl.express.dhl/at/de/help-and-support/customs-clearance-advice/customs-regulatory-updates.html
  • HS code check: https://hs.e-to-china.com/

Imports / Exports via DHL

All data should be entered in advance on MyDHL+ online.

Help is available here:

  • https://dhl-news.com/624-82NYR-FTBW2X-4YH5EJ-1/c.aspx

Contact person at MUL

Financial accounting: Nadja Schulhofer (nadja.schulhofer@unileoben.ac.at).




04.11.2022 Meeting Notes

Meeting Details

Date: 3rd October 2022

Time : 08:30 – 09:00

Location : Chair of CPS, Montanuniversität Leoben

Participants: Univ.-Prof. Dr. Elmar Rueckert, Vedant Dave

Agenda

  1. Extension idea formulation for Journal Paper
  2. Work on exploration and curiosity

Topic 1: Science Robotics Paper

  1. Bi-level Probabilistic Movement Primitives due to uneven error propagation in different stages.
  2. Read literature [1] and see if we find something.

Topic 2: Dynamic Exploration and Curiosity

  1. We got our baseline [2] and now we try to implement this paper.
  2. Try the same model on more complex environments and more out-of-the-box goals.
  3. In the process, first implement [3].

Literature

  1. R. Lioutikov, G. Maeda, F. Veiga, K. Kersting and J. Peters, “Inducing Probabilistic Context-Free Grammars for the Sequencing of Movement Primitives,” 2018 IEEE International Conference on Robotics and Automation (ICRA), 2018, pp. 5651-5658, doi: 10.1109/ICRA.2018.8460190.
  2. Mendonca, Russell, et al. “Discovering and achieving goals via world models.” Advances in Neural Information Processing Systems 34 (2021): 24379-24391.
  3. Hafner, Danijar, et al. “Learning latent dynamics for planning from pixels.” International conference on machine learning. PMLR, 2019.

Next Meeting

TBD