
190.017 Advanced Machine and Deep Learning (5SH IL, WS)

Course Content & Topics

Theoretical and practical aspects of computing and learning with neural networks. Investigation of the most commonly used algorithms for deep learning. From the practical perspective, various learning algorithms and types of neural networks will be implemented and applied to artificial and real-world problems. A list of the topics that will be covered is as follows (a short illustrative code sketch follows the list):

  • Theoretical background on machine learning
  • Feedforward neural networks
    • Training methods, optimization algorithms
    • Regularization, generalization
  • Convolutional neural networks
  • Recurrent neural networks (LSTMs, GRUs, etc.)
  • Deep generative models
    • Variational autoencoders
    • Generative adversarial networks
    • Denoising diffusion probabilistic models
  • Attention & Transformers
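
To give a flavour of the practical part of the course, below is a minimal sketch of a small feedforward network trained with a standard optimisation loop. PyTorch is assumed here purely for illustration; the course description does not prescribe a specific framework.

    # Minimal illustrative sketch: a tiny feedforward classifier trained with SGD.
    # PyTorch and the toy data are assumptions for illustration only.
    import torch
    import torch.nn as nn

    model = nn.Sequential(
        nn.Linear(20, 64),   # 20 input features -> 64 hidden units
        nn.ReLU(),
        nn.Linear(64, 3),    # hidden units -> 3 class logits
    )
    optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
    loss_fn = nn.CrossEntropyLoss()

    # Toy data: 128 random samples with 20 features and 3 classes.
    x = torch.randn(128, 20)
    y = torch.randint(0, 3, (128,))

    for epoch in range(10):
        optimizer.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()      # backpropagation
        optimizer.step()     # gradient-descent update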

Learning Objectives

After positive completion of the course, students will be able to:

  • Understand and apply fundamental learning principles to implement commonly used deep learning architectures and to develop novel ones.
  • Design and train complex deep neural networks in supervised and unsupervised learning scenarios, which requires a thorough theoretical and practical understanding of the algorithms.
  • Identify relevant features, benefits and limitations of different neural network models and algorithms, e.g., with respect to their practical applicability and their generalization and regularization abilities.
  • Apply state-of-the-art deep learning methods to different problems and to analyze, monitor and visualize the models’ performance and limitations.

Location & Time

Location: HS Thermoprozesstechnik (05ME01124) during October. Starting in November, lectures are planned to be held in HS 3 Studienzentrum (35SZ02211).

Time: Tuesdays & Thursdays, 10:00 – 12:00. A detailed schedule can be found here.

Grading

* Continuous assessment and written exam: Details will be presented in the first lecture unit.

* Task assignments: Several practical assignments have to be implemented. For each assignment, a written report and/or slides must be submitted. Details will be presented in the first lecture unit.

* Grading scheme: 0-49.9 Points (5), 50-62.4 Points (4), 62.5-74.9 Points (3), 75-87.4 Points (2), 87.5-100 Points (1)

(With an overall score of up to 75%, an additional oral performance review MAY (!) also be required if the positive performance record is not clear. In this case, you will be informed as soon as the grades are released. If you have already received a grade via MU online, you will not be invited to another oral performance review.)
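
For illustration only, the point-to-grade mapping above can be expressed as a small function. This is a sketch, not an official grading tool:

    def grade_from_points(points: float) -> int:
        """Map an overall score (0-100 points) to the grade scale listed above."""
        if points >= 87.5:
            return 1
        if points >= 75.0:
            return 2
        if points >= 62.5:
            return 3
        if points >= 50.0:
            return 4
        return 5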

Prerequisites

  • Formal prerequisite: Introduction to Machine Learning VU (“190.018”) or L+P (“190.012” and “190.013”).
  • Recommended prerequisites: Good Python programming skills, Fundamentals of Probability Theory, Basic Algebra & Vector Calculus.

Literature

– Ian Goodfellow, Yoshua Bengio and Aaron Courville, “Deep Learning”, 2016.
– Christopher M Bishop, “Pattern Recognition and Machine Learning”, 2006.




Open Project, M.Sc. or B.Sc. Thesis – Multimodal Human-Autonomous Agents Interaction Using Pre-trained Language and Visual Foundation Models

Supervisor: Linus Nwankwo, M.Sc.;
Univ.-Prof. Dr Elmar Rückert
Start date:  As soon as possible

 

Theoretical difficulty: mid
Practical difficulty: high

Abstract

In this project or thesis, we aim to enhance the method proposed in [1] for robust natural human-autonomous agent interaction through verbal and textual conversations. 

The primary focus will be to develop a system that can enhance natural language conversations, understand the semantic context of the robot’s task environment, and abstract this information into actionable commands or queries. This will be achieved by leveraging the capabilities of pre-trained large language models (LLMs) such as GPT-4, visual language models (VLMs) such as CLIP, and audio language models (ALMs) such as AudioLM.
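
As a rough illustration of the kind of visual grounding involved (this is not the pipeline of [1], whose code will be provided to the student), the sketch below uses the Hugging Face transformers implementation of CLIP to score how well candidate object labels match a camera image. The label list and the dummy image are placeholders:

    # Illustrative sketch only: zero-shot matching of scene-object labels to an
    # image with CLIP via Hugging Face transformers. Not the pipeline from [1].
    from PIL import Image
    from transformers import CLIPModel, CLIPProcessor

    model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
    processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

    image = Image.new("RGB", (224, 224), "gray")  # placeholder for a robot camera frame
    labels = ["a door", "a chair", "a person", "an elevator"]  # placeholder scene labels

    inputs = processor(text=labels, images=image, return_tensors="pt", padding=True)
    probs = model(**inputs).logits_per_image.softmax(dim=-1)  # image-label similarities

    print("Most likely object in view:", labels[probs.argmax().item()])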

Tentative Work Plan

To achieve the objectives, the following concrete tasks will be focused on:

  • Initialisation and Background:
    • Study the concepts of LLMs, VLMs, and ALMs.
    • Study how LLMs, VLMs, and ALMs can be grounded for autonomous robotic tasks.
    • Familiarise yourself with the methods at the project website – https://linusnep.github.io/MTCC-IRoNL/.
  • Setup and Familiarity with the Simulation Environment
    • Build a robot model (URDF) for the simulation (optional if you wish to use the existing one).
    • Set up the ROS framework for the simulation (Gazebo, Rviz); a minimal command-publishing sketch is shown after this work plan.
    • Recommended programming tools: C++, Python, Matlab.
  • Coding
    • Improve the existing code of the method proposed in [1] to incorporate the aforementioned modalities (the code will be provided to the student).
    • Integrate other LLMs (e.g., LLaMA) and VLMs (e.g., GLIP) into the framework and compare their performance with the baseline (GPT-4 and CLIP).
  • Intermediate Presentation:
    • Present the results of your background study and the work done so far.
    • Detailed planning of the next steps.
  • Simulation & Real-World Testing (If Possible):
    • Test your implemented model with a Gazebo-simulated quadruped or differential drive robot.
    • Perform the real-world testing of the developed framework with our Unitree Go1 quadruped robot or with our Segway RMP 220 Lite robot.
    • Analyse and compare the model’s performance in real-world scenarios versus simulation with the different LLM and VLM pipelines.
  • Optimize the Framework for Optimal Performance and Efficiency (Optional):
    • Validate the model to identify bottlenecks within the robot’s task environment.
  • Documentation and Thesis Writing:
    • Document the entire process, methodologies, and tools used.
    • Analyse and interpret the results.
    • Draft the project report or thesis, ensuring that the primary objectives are achieved.
  • Research Paper Writing (optional)
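
To make the notion of actionable commands concrete, the following is a minimal ROS 1 (rospy) sketch that publishes the kind of velocity command a language pipeline might emit for “move forward”. The /cmd_vel topic name and the use of ROS 1 are assumptions typical of such Gazebo setups, not project requirements:

    # Minimal sketch: publish a velocity command to a simulated differential-drive
    # robot via ROS 1. The /cmd_vel topic name is an assumption.
    import rospy
    from geometry_msgs.msg import Twist

    rospy.init_node("language_command_demo")
    pub = rospy.Publisher("/cmd_vel", Twist, queue_size=10)
    rate = rospy.Rate(10)  # 10 Hz

    cmd = Twist()
    cmd.linear.x = 0.2   # e.g., "move forward" mapped to 0.2 m/s forward velocity
    cmd.angular.z = 0.0

    for _ in range(30):  # publish for about 3 seconds
        pub.publish(cmd)
        rate.sleep()
    pub.publish(Twist())  # zero command stops the robot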

Related Work

[1]  Linus Nwankwo and Elmar Rueckert. 2024. The Conversation is the Command: Interacting with Real-World Autonomous Robots Through Natural Language. In Companion of the 2024 ACM/IEEE International Conference on Human-Robot Interaction (HRI ’24). Association for Computing Machinery, New York, NY, USA, 808–812. https://doi.org/10.1145/3610978.3640723.

[2]  Nwankwo, L., & Rueckert, E. (2024). Multimodal Human-Autonomous Agents Interaction Using Pre-Trained Language and Visual Foundation Models. arXiv preprint arXiv:2403.12273.




After Business Trip Paperwork

New Obligation: Submit comparative offers

(This starts from 30 July 2024)

You are required to submit a comparative analysis of the prices for your trip along with the other claim documents.

Documents to submit and print in hard copy:

  • Conference/summer school schedule
  • Transport ticket (flight/intercity train/city train/bus)
  • Registration fee
  • Spesenabrechnung/Reisekostennachweis (from SAP)
  • Accommodation
  • Comparative price analysis (only 50% of the flight costs will be reimbursed if this document is not submitted)

New Obligation: Monthly data entry for ÖBB tickets

(This starts from 26 Jun 2024)

You can find the entry form here: https://cloud.cps.unileoben.ac.at/index.php/s/GTFTrT8btK7mMtW

Procedure to submit paperwork to Financial Department

Published on 21 May 2024

Update 1 on 26 Jun 2024

Update 2 on 30 July 2024

1. Log in to SAP

In SAP, click on “Meine Reisen und Spesen”.

2. Click on your desired trip

In this example, I will show a trip within Austria.

 

Click “Weiter” to proceed.

3. The main page with 4 steps

Step 1: Verify all information, especially the Kontierung (your project number)

Next, tick the checkbox marked with * and then proceed with “Belege erfassen”

Step 2: Add all related claims

Step 3: Validate

There are two options: save the trip for later or send it to the financial department.


4. Final step

  • Prepare all the original receipts and keep a copy with you.
  • Print out the above documents from the system
  • Put the documents into the “Dienstreisen” folder at Regina’s place
  • Bring the folder to the university post office on the 1st floor of the old building.




Print a Poster

Kindly ask for permission before proceeding with poster printing.

 

To print a poster, you can choose one of the following options:

Option 1: Mail Boxes Leoben

Price list:

A0: ~20.00 euro

CPS account: KST 101900

Email them, and they will charge it to the CPS account.


Option 2: ÖH Leoben

Fill in the form at: https://www.oeh-leoben.at/de/plotauftrag

Price list:

A0: ~6.63 euro

A1: ~3.35 euro

Cash payment only; pay when picking up the poster.




Newspaper Interview with Kosmo

Our apprentice Kosmo Obermayer reports on his apprenticeship as an information technologist specialising in operations technology (Informationstechnologe mit Betriebstechnik) and gives exciting insights into his everyday work at the Chair of Cyber-Physical-Systems.

The article is available via this link.

A cheerful Kosmo




Internship/Thesis in Robot Learning

Are you fascinated by the intricate dance of robots and objects? Do you dream of pushing the boundaries of robotic manipulation? If so, this internship is your chance to dive into the heart of robotic innovation!

You can work on this project either by doing a B.Sc. or M.Sc. thesis or an internship.

Job Description

This internship offers a unique opportunity to explore the exciting world of robotic learning. You’ll join our team, working alongside cutting-edge robots like the UR3 and Franka Emika cobots, and advanced grippers like the 2F Adaptive Gripper (Robotiq), the dexterous RH8D Seed Robotics Hand and the LEAP hand. Equipped with tactile sensors, you’ll delve into the world of grasping, manipulation, and interaction with diverse objects using Deep Learning methods.

Start date: Open

Location: Leoben

Duration: 3-6 months

Supervisors:

Keywords:

  • Robot learning
  • Robotic manipulation
  • Reinforcement Learning
  • Sim2Real
  • Robot Teleoperation
  • Imitation Learning
  • Deep learning
  • Research

Responsibilities

  • Collaborate with researchers to develop and implement novel robotic manipulation learning algorithms in simulation and in the real world.
  • Gain hands-on experience programming and controlling robots like the UR3 and Franka Emika cobots.
  • Experiment with various grippers like the 2F Adaptive Gripper, the RH8D Seed Robotics Hand and the LEAP hand, exploring their functionalities.
  • Develop data fusion methods for vision and tactile sensing (a minimal sketch follows this list).
  • Participate in research activities, including data collection, analysis, and documentation.
  • Contribute to the development of presentations and reports to effectively communicate research findings.
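
As a rough sketch of what a simple visuo-tactile fusion model could look like (an illustrative assumption, not the lab’s actual architecture), the PyTorch module below encodes an image and a tactile reading separately and fuses the two embeddings:

    # Illustrative sketch of late fusion of vision and tactile inputs in PyTorch.
    # Input sizes and layer widths are arbitrary placeholders.
    import torch
    import torch.nn as nn

    class VisuoTactileEncoder(nn.Module):
        def __init__(self, tactile_dim=24, embed_dim=64):
            super().__init__()
            self.vision = nn.Sequential(            # tiny CNN for 64x64 RGB images
                nn.Conv2d(3, 16, 5, stride=2), nn.ReLU(),
                nn.Conv2d(16, 32, 5, stride=2), nn.ReLU(),
                nn.Flatten(),
                nn.LazyLinear(embed_dim),
            )
            self.tactile = nn.Sequential(           # MLP for a flat tactile vector
                nn.Linear(tactile_dim, 64), nn.ReLU(),
                nn.Linear(64, embed_dim),
            )
            self.head = nn.Linear(2 * embed_dim, embed_dim)

        def forward(self, image, tactile):
            z = torch.cat([self.vision(image), self.tactile(tactile)], dim=-1)
            return self.head(z)

    # Usage with random placeholder data.
    enc = VisuoTactileEncoder()
    fused = enc(torch.randn(8, 3, 64, 64), torch.randn(8, 24))
    print(fused.shape)  # torch.Size([8, 64])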

Qualifications

  • Currently pursuing a Bachelor’s or Master’s degree in Computer Science,
    Electrical Engineering, Mechanical Engineering, Mathematics or related
    fields.
  • Solid foundation in robotics fundamentals (kinematics, dynamics, control theory).
  • Solid foundation in machine learning concepts (e.g., supervised learning, unsupervised learning, reinforcement learning, neural networks).
  • Strong programming skills in Python and experience with deep learning frameworks such as PyTorch or TensorFlow.
  • Excellent analytical and problem-solving skills.
  • Effective communication and collaboration skills to work seamlessly within the research team.
  • Good written and verbal communication skills in English.
  • (optional) Prior experience in robot systems and published work.

Opportunities and Benefits of the Internship

  • Gain invaluable hands-on experience with state-of-the-art robots and grippers.
  • Work alongside other researchers at the forefront of robot learning.
  • Develop your skills in representation learning, reinforcement learning, robot learning and robotics.
  • Contribute to novel research that advances the capabilities of robotic manipulation.
  • Build your resume and gain experience in a dynamic and exciting field.
     

Application

Send us your CV accompanied by a letter of motivation at fotios.lygerakis@unileoben.ac.at with the subject: “Internship Application | Robot Learning”

Related Work

  • MViTac: Self-Supervised Visual-Tactile Representation Learning via Multimodal Contrastive Training
  • M2CURL: Sample-Efficient Multimodal Reinforcement Learning via Self-Supervised Representation Learning for Robotic Manipulation

Funding

We will support you during your application for an internship grant. Below we list some relevant grant application details.

CEEPUS Grant (Europe, for undergraduates and graduates)

Find details on the Central European Exchange Program for University Studies (CEEPUS) at https://grants.at/en/ or at https://www.ceepus.info.

In principle, you can apply for a scholarship at any time. However, your country of origin matters, and there are networks of several countries that have their own contingents.

Ernst Mach Grant (worldwide, for PhDs and senior researchers)

Find details on the program at https://grants.at/en/ or at https://oead.at/en/to-austria/grants-and-scholarships/ernst-mach-grant.

Further Funding Resources

Apply online at http://www.scholarships.at/




B.Sc. Thesis: Reineke Peter on Deep Learning for Predicting Fluid Dynamics

Supervisor: Univ.-Prof. Dr Elmar Rückert

Project: K1-MET P3.4
Start date: 1st of May 2024

Theoretical difficulty: high
Practical difficulty: mid

Topic

In steel production, the steel quality heavily depends on the dynamic processes of the meniscus level fluctuations in the mold. These complex dynamic processes can be observed using IR cameras that monitor the surface level and the casting powder temperature.

The goal of this thesis is to develop and compare deep learning approaches (CNNs, transformers) for predicting fluid dynamics in a lab prototype environment.
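
As an illustrative sketch of the CNN variant (architecture, frame size, and data format are assumptions to be settled during the thesis), such a model could take a short stack of IR frames and predict the next frame:

    # Illustrative sketch: predict the next IR frame from a stack of k previous
    # frames with a small fully convolutional network. Sizes are placeholders.
    import torch
    import torch.nn as nn

    k = 4  # number of input frames
    model = nn.Sequential(
        nn.Conv2d(k, 32, 3, padding=1), nn.ReLU(),
        nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
        nn.Conv2d(32, 1, 3, padding=1),   # predicted next frame (1 channel)
    )
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.MSELoss()

    frames = torch.rand(16, k, 64, 64)   # toy batch of input frame stacks
    target = torch.rand(16, 1, 64, 64)   # toy "next frame" targets

    optimizer.zero_grad()
    loss = loss_fn(model(frames), target)
    loss.backward()
    optimizer.step()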

Tasks

  • Literature research on the state of the art (see references)
  • Set up a lab prototype environment for generating complex (structured and chaotic) fluid dynamics
  • Dataset recording, visualization and annotation
  • Deep Learning algorithm implementation (CNNs & Transformers)
  • Evaluation on different datasets (predictable dynamics, complex dynamics, synchronous and async. surface level dynamics, chaotic dynamics).
  • Thesis writing.

References




B.Sc. Thesis: Sukal Tanja on Creating a Python development environment for LEGO Ev3 robot systems

Supervisor: Univ.-Prof. Dr Elmar Rückert

Start date: 1st of August 2023

Theoretical difficulty: mid
Practical difficulty: mid

Topic

LEGO EV3 robot systems are used in teaching at the chair to provide an easy introduction to robotics. Numerous algorithms can be tested:

  • Path planning and navigation
  • Kalman filters
  • Mapping / SLAM
  • Object manipulation
  • Camera-based object detection
  • Control algorithms
  • Telemetry tasks
  • etc.

Our EV3 systems run a Linux operating system (https://www.ev3dev.org) and can be programmed in MicroPython.

The goal of this thesis is to create a development environment for the standard Python programming language. In the process, example projects shall be implemented and limitations documented.
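
One possible starting point (an assumption; evaluating suitable libraries and workflows is part of the thesis) is the python-ev3dev2 package, which exposes the EV3 motors and sensors to standard Python scripts running on the ev3dev brick:

    # Illustrative sketch using the python-ev3dev2 library; the exact library and
    # workflow choices are part of the thesis, not fixed requirements.
    from ev3dev2.motor import MoveTank, OUTPUT_A, OUTPUT_B, SpeedPercent
    from ev3dev2.sensor.lego import TouchSensor

    tank = MoveTank(OUTPUT_A, OUTPUT_B)   # two large motors on ports A and B
    touch = TouchSensor()                 # auto-detects the connected touch sensor

    # Drive forward at 30% speed until the touch sensor is pressed, then stop.
    tank.on(SpeedPercent(30), SpeedPercent(30))
    touch.wait_for_pressed()
    tank.off()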

Tasks

  • Research and documentation of the state of the art
  • Implement example projects
  • Create a Git repository with documentation.
  • Document the work steps & write the thesis

Scientific Contribution

  • Development and implementation of a Python development environment for teaching.
  • Publication of the source code.

Completed Thesis

B.Sc. Thesis by Tanja Sukal on Open-Source LEGO EV3 Python Framework for Teaching, 2024.




M.Sc. Thesis: Einberger Stefan on Retrofitting of a Cyber-Physical System to a reactive molding machine for thermoset resins

Supervisor: Univ.-Prof. Dr Elmar Rückert

Company: Ottronic GmbH
Start date: 1st of October 2023

Theoretical difficulty: mid
Practical difficulty: mid

Topic

At Ottronic, the encapsulation of our electronics and motors by means of a specially adapted Reactive Injection Molding (RIM) process forms the basis for the production of media-resistant electronics and high-performance electric drives. In the course of RIM, so-called b-staged thermosets are processed, shaped, and finally cured under a precise pressure and temperature profile. To guarantee the targeted highest product quality, with a view to medical technology applications, this process must be optimally tuned for every shot. Therefore, a Cyber-Physical System (CPS) is to be developed and retrofitted onto our RIM machines.

The goal of the thesis is a final CPS that can autonomously detect process fluctuations (batch variations, hall/machine temperature, humidity, etc.) and adjust the control parameters of the injection molding process (melting time, curing time, pressing force, etc.) in order to guarantee consistent product quality without human supervision.

In a first step, the current process shall be described. From this, the necessary production parameters shall be derived, their effects on the process analyzed, the key points for process optimization defined, and a model of the process developed. This model will then serve as the basis for the CPS to evaluate the process, detect deviations, and adjust control parameters.

Subsequently, the model shall be integrated and implemented on the machine control as part of the CPS. Finally, the newly gained intelligence of the CPS must be linked with the machine’s existing control in order to guarantee nearly autonomous process operation as well as a new, more resource- and energy-efficient transfer molding method.
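
As a purely illustrative sketch (all signal names, thresholds, and the adjustment rule are assumptions; the actual process model and control coupling are the subject of the thesis), deviation detection on a single process signal could start from a rolling z-score that triggers a control-parameter recommendation:

    # Illustrative sketch: flag deviations in one process signal with a rolling
    # z-score and emit a parameter recommendation. All values are placeholders.
    from collections import deque
    from statistics import mean, stdev

    window = deque(maxlen=50)   # recent measurements, e.g., mold temperature

    def check_measurement(value, threshold=3.0):
        """Return a recommended curing-time adjustment in seconds (0 = no change)."""
        if len(window) >= 10:
            mu, sigma = mean(window), stdev(window)
            z = (value - mu) / sigma if sigma > 0 else 0.0
            if abs(z) > threshold:
                # Placeholder rule: adjust curing time proportionally to the deviation
                # and keep the outlier out of the baseline window.
                return -0.5 * z
        window.append(value)
        return 0.0

    # Example: feed in a stream of measurements with one outlier at the end.
    for t, reading in enumerate([80.1, 80.3, 79.9, 80.2] * 10 + [95.0]):
        adjustment = check_measurement(reading)
        if adjustment:
            print(f"t={t}: deviation detected, recommended curing-time change {adjustment:+.1f} s")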

Tasks

  • Research and documentation of the state of the art
  • Identification of all necessary production parameters and analysis of their influence on product quality
  • Development of a model for the virtual description of the injection molding process and implementation of the model on the machine control as a basis for a CPS
  • Identification of deviations in the process and implementation of countermeasures
  • Linking the process control with control recommendations from the CPS.
  • Documentation of the work steps & writing of the thesis

Scientific Contribution

  • Development and implementation of a CPS that can capture and model an RIM process.
  • Development of methods to detect and evaluate changes in the RIM process based on the CPS
  • Retrofitting the gained intelligence into an existing machine control system



Montanuniversität Leoben logos

Here is a link to download the logos in full resolution:

https://qm.unileoben.ac.at/en/qm-documents/q4-communication