
GelSight Mini Tactile Sensor

The GelSight Mini is an affordable, compact tactile sensor that provides high-resolution 2D and 3D surface mapping. Its plug-and-play design enables users to begin operations within five minutes of setup. Key features include:

  • High-Performance Mapping: Delivers detailed 2D and 3D surface imagery, capturing fine textures and features.

  • AI Integration: Compatible with existing computer vision techniques and software, facilitating AI training and research.

  • Compact and Robust Design: Easily integrates into various robotic systems and withstands rigorous prototyping environments.

  • Quick Gel Replacement: Allows toolless and rapid gel changes, enhancing efficiency during iterative testing.

The GelSight Mini is suitable for professionals, academics, and hobbyists interested in touch-based interfaces, robotics, and industrial scanning. Its user-friendly interface and affordability make it an accessible tool for advancing tactile sensing applications.
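
The GelSight Mini streams tactile images over USB, so standard computer vision tooling can be applied directly. Below is a minimal sketch (not GelSight's own SDK) assuming the sensor enumerates as an ordinary camera at device index 0; the index and threshold are assumptions to adapt to your setup. It highlights contact by comparing each frame against a no-contact reference frame.

```python
# Hedged sketch: visualize contact on a vision-based tactile sensor exposed
# as a USB camera. Device index 0 and the threshold of 15 are assumptions.
import cv2
import numpy as np

cap = cv2.VideoCapture(0)                 # assumed: sensor shows up as a UVC camera
ok, ref = cap.read()                      # reference frame with nothing touching the gel
if not ok:
    raise RuntimeError("Could not read a frame from the tactile sensor")
ref = cv2.cvtColor(ref, cv2.COLOR_BGR2GRAY).astype(np.float32)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.float32)
    diff = cv2.absdiff(gray, ref)                      # gel deformation shows up here
    contact = (diff > 15).astype(np.uint8) * 255       # crude binary contact mask
    cv2.imshow("tactile frame", frame)
    cv2.imshow("contact mask", contact)
    if cv2.waitKey(1) & 0xFF == ord("q"):              # press q to quit
        break

cap.release()
cv2.destroyAllWindows()
```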

Videos

  • Research videos using this sensor will be presented here.
 

Publications


Digit Tactile Sensor

The DIGIT sensor is a compact, low-cost, and high-resolution vision-based tactile sensor designed for in-hand robotic manipulation tasks. It improves upon traditional tactile sensors by offering a smaller form factor, enhanced durability, and streamlined manufacturing, making it suitable for multi-fingered robotic hands. DIGIT utilizes an elastomer surface to measure contact forces via image deformation captured by an embedded camera, delivering precise tactile feedback. Its modular design allows easy replacement of components, supports task-specific elastomers, and ensures robustness under repeated use. With a cost of approximately $15 per unit in batch manufacturing, DIGIT provides an accessible and effective solution for tactile sensing in robotics.
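
For quick experiments, the DIGIT authors also maintain an open-source Python package, digit-interface, for grabbing frames from the sensor. The sketch below assumes that package is installed (e.g. pip install digit-interface) and uses a placeholder serial number; exact call names may vary between package versions.

```python
# Hedged sketch using the open-source digit-interface package.
# "D00001" is a placeholder; use the serial number printed on your sensor.
from digit_interface import Digit

d = Digit("D00001")          # reference the sensor by its serial number
d.connect()                  # open the video stream
frame = d.get_frame()        # one RGB tactile image as a NumPy array
print(frame.shape)           # resolution depends on the sensor settings
d.disconnect()
```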

Videos

  • Research videos using this sensor will be presented here.
 

Publications


Leap Hand

The LEAP Hand is a low-cost, efficient, and anthropomorphic robotic hand designed for dexterous manipulation and robot learning. The hand is robust, durable, and capable of exerting large torques over extended periods. With a novel kinematic structure that retains all degrees of freedom in any finger position, it supports a wide range of manipulation tasks, including grasping, teleoperation, and in-hand object rotation. The LEAP Hand is open-source, with detailed assembly instructions, simulation tools, and APIs, making it accessible and scalable for research and development.
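
Since the LEAP Hand is assembled from Dynamixel servos, its joints can also be driven directly with the DynamixelSDK, while the official open-source LEAP Hand API wraps these details. The sketch below is only an illustration: the serial port, baud rate, motor ID, and control-table addresses are assumptions that depend on the particular assembly.

```python
# Hedged sketch: move a single LEAP Hand joint by writing a goal position to
# its Dynamixel servo. Port, baud rate, motor ID, and addresses are assumptions.
from dynamixel_sdk import PortHandler, PacketHandler

PORT, BAUD, PROTOCOL = "/dev/ttyUSB0", 4000000, 2.0
ADDR_TORQUE_ENABLE, ADDR_GOAL_POSITION = 64, 116     # X-series control table (assumed)
MOTOR_ID, GOAL_TICKS = 1, 2048                       # 2048 ticks is roughly mid-range

port = PortHandler(PORT)
packet = PacketHandler(PROTOCOL)
port.openPort()
port.setBaudRate(BAUD)
packet.write1ByteTxRx(port, MOTOR_ID, ADDR_TORQUE_ENABLE, 1)           # enable torque
packet.write4ByteTxRx(port, MOTOR_ID, ADDR_GOAL_POSITION, GOAL_TICKS)  # command the joint
port.closePort()
```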

Videos

  • Research videos using this robotic hand will be presented here.
 

Publications


Internship/Thesis in Robot Learning

Are you fascinated by the intricate dance of robots and objects? Do you dream of pushing the boundaries of robotic manipulation? If so, this internship is your chance to dive into the heart of robotic innovation!

You can work on this project either as a B.Sc. or M.Sc. thesis or as an internship.

Job Description

This internship offers a unique opportunity to explore the exciting world of robot learning. You’ll join our team, working alongside cutting-edge robots like the UR3 and Franka Emika cobots and advanced grippers like the 2F Adaptive Gripper (Robotiq), the dexterous RH8D Seed Robotics Hand, and the LEAP Hand. With tactile sensors on these platforms, you’ll delve into grasping, manipulation, and interaction with diverse objects using deep learning methods.

Start date: Open

Location: Leoben

Duration: 3-6 months

Supervisors:

Keywords:

  • Robot learning
  • Robotic manipulation
  • Reinforcement Learning
  • Sim2Real
  • Robot Teleoperation
  • Imitation Learning
  • Deep learning
  • Research

Responsibilities

  • Collaborate with researchers to develop and implement novel robotic manipulation learning algorithms in simulation and in the real world.
  • Gain hands-on experience programming and controlling robots like the UR3 and Franka Emika cobots.
  • Experiment with various grippers like the 2F Adaptive Gripper, the RH8D Seed Robotics Hand, and the LEAP Hand, exploring their functionalities.
  • Develop data fusion methods for vision and tactile sensing (a minimal fusion sketch follows this list).
  • Participate in research activities, including data collection, analysis, and documentation.
  • Contribute to the development of presentations and reports to effectively communicate research findings.
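
As a rough illustration of the vision-tactile fusion item above, the following PyTorch sketch encodes each modality with its own small CNN and fuses the embeddings by concatenation. The architecture, dimensions, and fusion strategy are illustrative assumptions, not the lab's actual method.

```python
# Illustrative late-fusion module for vision + tactile inputs (PyTorch).
import torch
import torch.nn as nn

class ConvEncoder(nn.Module):
    """Small CNN that maps a 3-channel image to a fixed-size embedding."""
    def __init__(self, out_dim: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 5, stride=2), nn.ReLU(),
            nn.Conv2d(32, 64, 5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(64, out_dim),
        )

    def forward(self, x):
        return self.net(x)

class VisionTactileFusion(nn.Module):
    """Encodes each modality separately and fuses by concatenation."""
    def __init__(self, emb_dim: int = 128, fused_dim: int = 128):
        super().__init__()
        self.vision_enc = ConvEncoder(emb_dim)
        self.tactile_enc = ConvEncoder(emb_dim)
        self.fuse = nn.Sequential(nn.Linear(2 * emb_dim, fused_dim), nn.ReLU())

    def forward(self, rgb, tactile):
        z = torch.cat([self.vision_enc(rgb), self.tactile_enc(tactile)], dim=-1)
        return self.fuse(z)

# Example usage with dummy batches
model = VisionTactileFusion()
rgb = torch.randn(4, 3, 64, 64)        # camera images
tactile = torch.randn(4, 3, 64, 64)    # tactile sensor images
print(model(rgb, tactile).shape)       # torch.Size([4, 128])
```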

Qualifications

  • Currently pursuing a Bachelor’s or Master’s degree in Computer Science,
    Electrical Engineering, Mechanical Engineering, Mathematics or related
    fields.
  • Solid foundation in robotics fundamentals (kinematics, dynamics, control theory).
  • Solid foundation in machine learning concepts (e.g., supervised learning, unsupervised learning, reinforcement learning, neural networks).
  • Strong programming skills in Python and experience with deep learning frameworks such as PyTorch or TensorFlow.
  • Excellent analytical and problem-solving skills.
  • Effective communication and collaboration skills to work seamlessly within the research team.
  • Good written and verbal communication skills in English.
  • (optional) Prior experience in robot systems and published work.

Opportunities and Benefits of the Internship

  • Gain invaluable hands-on experience with state-of-the-art robots and grippers.
  • Work alongside other researchers at the forefront of robot learning.
  • Develop your skills in representation learning, reinforcement learning, robot learning and robotics.
  • Contribute to novel research that advances the capabilities of robotic manipulation.
  • Build your resume and gain experience in a dynamic and exciting field.
     

Application

Send us your CV accompanied by a letter of motivation at fotios.lygerakis@unileoben.ac.at with the subject: “Internship Application | Robot Learning”

Related Work

  • MViTac: Self-Supervised Visual-Tactile Representation Learning via Multimodal Contrastive Training
  • M2CURL: Sample-Efficient Multimodal Reinforcement Learning via Self-Supervised Representation Learning for Robotic Manipulation

Funding

We will support you during your application for an internship grant. Below we list some relevant grant application details.

CEEPUS grant (European, for undergraduates and graduates)

Find details on the Central European Exchange Program for University Studies (CEEPUS) at https://grants.at/en/ or at https://www.ceepus.info.

In principle, you can apply for a scholarship at any time. However, your country of origin also matters: there are networks of several countries, each with its own contingent.

Ernst Mach Grant (worldwide, for PhD students and senior researchers)

Other Funding Resources

Apply online at http://www.scholarships.at/

Meeting Notes July 2023

Meeting 06/07

Research

  • Investigating Representation Collapse in Reinforcement Learning Agents from Vision
    • plan/structure?
    • what RL algorithms?
      • visual data
      • Gerhard Neumann, Marc Toussaint, Justus Piater (Innsbruck)
      • Define a research question
      • Focus on some domain
  • Unnormalized Contrastive learning
    • All CL models use l2 normalization of the representation
      • Stability: Normalizing the representations ensures that they all have the same magnitude. This can make the learning process more stable, as it prevents the model from assigning arbitrarily large or small magnitudes to the representations.

      • Focus on direction: By constraining the representations to have a fixed magnitude, the learning process focuses on the direction of the vectors in the embedding space. This is often what we care about in tasks like contrastive learning, where the goal is to make the representations of similar inputs point in similar directions.

      • Computational convenience: As mentioned earlier, many computations, such as the dot product between two vectors, are easier to perform and interpret in normalized spaces.

      • Interpretability: Normalized representations are often more interpretable, as the angle between two vectors can be directly interpreted as a measure of similarity or dissimilarity.

    • BUT, this comes at the expense of
      • Decreased Capacity: With normalization, the model’s capacity to represent data is reduced since it can only rely on the direction of vectors in the embedding space. This limitation may result in the model being less able to capture complex patterns in the data.
      • Missing Magnitude Information: The absence of magnitude information in normalized vectors removes the ability to convey meaningful data properties such as confidence levels or other relevant characteristics. Normalization discards this information, limiting the model’s understanding of the data.
    • IDEA: remove the l2 normalization (a minimal loss sketch follows this list)
      • Regularize the model to penalize large magnitudes.
      • Scale the representations to a desired range.
      • Design a custom loss function considering both direction and magnitude
  • Breaking Binary: Towards a Continuum of Conceptual Similarities in Self-Supervised Learning
    • will take more time to set up
    • will leave it for later
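
To make the "remove the l2 normalization" idea above concrete, here is a minimal InfoNCE-style sketch that can switch between the standard normalized similarity and an unnormalized variant with an explicit magnitude penalty. The temperature, penalty form, and weight are illustrative assumptions.

```python
# Sketch: InfoNCE loss with and without l2 normalization, plus a simple
# magnitude penalty for the unnormalized case.
import torch
import torch.nn.functional as F

def info_nce(z1, z2, temperature=0.1, normalize=True, magnitude_weight=0.0):
    """z1, z2: (batch, dim) embeddings of two views of the same inputs."""
    if normalize:
        z1 = F.normalize(z1, dim=-1)   # standard: keep only direction
        z2 = F.normalize(z2, dim=-1)
    logits = z1 @ z2.t() / temperature                   # pairwise similarities
    labels = torch.arange(z1.size(0), device=z1.device)  # positives on the diagonal
    loss = F.cross_entropy(logits, labels)
    if not normalize and magnitude_weight > 0:
        # penalize large norms instead of normalizing them away
        loss = loss + magnitude_weight * (z1.norm(dim=-1) ** 2 + z2.norm(dim=-1) ** 2).mean()
    return loss

z1, z2 = torch.randn(8, 128), torch.randn(8, 128)
print(info_nce(z1, z2))                                          # normalized baseline
print(info_nce(z1, z2, normalize=False, magnitude_weight=1e-3))  # unnormalized variant
```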

PhD Registration

  • registered

M.Sc. Students/Interns

  • Iye Szin will present her work so far next week.

ML Course

  • Publish video tutorial on PyTorch

Miscellaneous

  • Summer School in Cambridge
    • Poster?

Meeting 25/07

Research

  • Goal-oriented working mode:
    • define subgoals and milestones
    • (make sure that you can evaluate them, and define criteria of success, scores, etc.)
    • till 17.08.2023 10:00
  • Define topic, sub-problem, open challenge, your approach, toy task, full experiment
  • RAAD 2024 (20.12.2023): concept paper with first results
  • Spring 2024: A+ robotics conference paper on simulation experiments
  • Summer 2024: A+ robotics paper on real-robot experiments

M.Sc. Students/Interns

Miscellaneous

 

Meeting Notes June 2023

Meeting 15/06

Research

  • reviews for ECAI (2/6) (Vedant is working on one of them)
  • Research leads:
    1. Dimensionality Collapse of Visual Representations in Reinforcement Learning
    2. Improve SwAV architecture by creating better latent space clusters with the use of Sparse Autoencoders

PhD Registration

  • waiting for admission office response

M.Sc. Students/Interns

  • Iye Szin has a working prototype

ML Course

  • Tutorial on PyTorch
  • pending grading for assignments 5 and 6

Miscellaneous

  • Summer School
    • Registration done
    • Air tickets booked
    • Accommodation booked
  • English course got postponed
 

Meeting 22/06

Research

  • Reviews for ECAI 2023 done.

M.Sc. Students/Interns

ML Course

Miscellaneous

 

Meeting 29/06

Research

M.Sc. Students/Interns

ML Course

Miscellaneous

Meeting Notes May 2023

Meeting 11/05

Research

  • submitted CR-VAE paper to ECAI
  • Research leads:
    1. Dimensionality Collapse of Visual Representations in Reinforcement Learning
    2. Improve SwAV architecture by creating better latent space clusters with the use of Sparse Autoencoders

PhD Registration

  • Signed Application
  • Will hand it over to the Admissions office

M.Sc. Students/Interns

  • Possible PhD position for Iye Szin
  • Early June first draft presentation

ML Course

  • Assignment 5

Miscellaneous

  • Kleinwassertal
 

Meeting 25/05

Research

Literature Review

  1. Dimensionality Collapse of Visual Representations in Reinforcement Learning
  2. Improve SwAV architecture by creating better latent space clusters with the use of Sparse Autoencoders
 
ECAI review papers
  • 6 papers assigned × 16 hours (2 days) per paper = 96 hours (12 days)
  • More feasible to review 2 papers.
  • deadline 16 June

M.Sc. Students/Interns

ML Course

  • Graded up to assignment 4
  • Assignment 6 out

Miscellaneous

 

Meeting Notes April 2023

Meeting 20/04

Research

  • reviewing paper for IROS
  • working on CR-VAE paper
  • experiment for the SL competition

PhD Registration

  • waiting for Toussaint’s response
  • maybe contact other professors?
    • Rudolf

M.Sc. Students/Interns

  • Iye Szin
    • SL competition; deadline May 1

ML Course

  • grades for assignment 2 out

Miscellaneous

  • Summer Schools
    • ProbAI accepted (registration until 26/04)
    • ETH & RLSS waiting list
  • May-June Leaves
    • 19 May
    • 30 May – 6 June
  • Move May 1 to May 10 vacation
 

Meeting 27/04

Research

  • working on CR-VAE paper
  • image encoder for the SL competition

PhD Registration

  • Mentor: Rudolf Lioutikov
  • Application needs signature from Rudolf

M.Sc. Students/Interns

  • Iye Szin
    • SL competition; deadline May 1

ML Course

  • Assignment 4: Regression

Miscellaneous

  • Summer Schools
    • Accepted:
      • M2LSS registered
      • ProbAI (accepted, but declined it)
    • Rejected
      • ETH
      • RLSS
    • Applied:
      • LxMLS
      • ELLIS (recommendation letter)
  • internship application

Meeting Notes March 2023

Meeting 10/03

Research

  • Plan to participate in the air hockey challenge
  • Literature review for the right model

PhD Registration

  • todo: prepare Email

M.Sc. Students/Interns

  • Iye Szin work plan
  • Internship will lead to her thesis

ML Assistantship

  • Syllabus
  • Prepare exercises 

ML Course

  • Moodle to upload files (discussed)
  • Link to latex for the report (done)

Miscellaneous

  • No time to attend the research seminar; the ML course takes too much of my time. (discussed)
  • 2 days work from home 31.05 & 01.06
  • Vacation 02.06 – 11.06
  • Medium GPUs for WS in the lab (RTX 3060 or 3070)
 

Meeting 13/03

Research

  • Rebuttal

ML Course

  • Assignment 1 preparation

Meeting 23/03

Research

  • respond to ICML Chairs about reviewer 1
  • Searched for alternative conferences
    • ECAI
    • BCCV
  • Literature review on SSL problems
  • RL Revision

M.Sc. Students/Interns

  • Iye Szin steady progress

Ph.D. registration

  • Email sent to Toussaint

ML Course

  • Assignment 1 grades
  • post the PDF

Miscellaneous

  • Summer School Applications
  • Paper Review accepted for IROS 2023
  • fill in the form for the IAS retreat
 

Meeting 30/03

Research

  • waiting for ICML final decision
  • once it is out, I will compile the comments
    • data augmentation influence on MI
    • etc
  • submit to ECAI
    • ICVS ranking is C
  • Next on: Dimensionality collapse in representation learning
    • currently reading about it
  • Air hockey challenge
    • start with SAC
    • continue with a model-based RL method, like world models

M.Sc. Students/Interns

  • Iye Szin is struggling with ROS 2, but within a reasonable frame

Ph.D. registration

  • Email sent to Toussaint. Waiting for response

ML Course

  • Assignment 3 is out

Miscellaneous

  • Summer School Applications
  • Paper Review for IROS 2023
  • submitted the application for IAS retreat
 
Li Jing, Pascal Vincent, Yann LeCun, & Yuandong Tian (2021). Understanding Dimensional Collapse in Contrastive Self-supervised Learning. arXiv preprint arXiv:2110.09348.