Main areas of work in the Robot and Motion Lab

In the Robot and Human Motion Lab (RaHM-Lab for short), we work with a wide variety of methods for mobile and stationary robot systems. In particular, we explore the question of efficient collaboration between these systems and humans (human-robot collaboration). This includes work on new concepts, including proofs of concept, as well as the development of prototypes up to validation in cooperative research between university and industry. To this end, we operate in the following areas of competence:

  • Development of mobile robot systems for indoor and outdoor areas and special applications
  • Mobile manipulation in household and industry
  • Autonomous mobile systems
  • Collaborative robot systems
  • Human-Machine-Interface (HMI)
  • Optical marker-based 3D motion analysis
  • 2D/3D sensor data analyses
  • 3D object measurement and modelling
  • Sensor data fusion (e.g. Kalman filter for inertial sensors)
  • Deep Neural Networks (e.g. for anomaly detection)
  • AI-based robot calibration
  • Metrological determination of Denavit-Hartenberg parameters
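To illustrate the last point: each row of Denavit-Hartenberg parameters defines one homogeneous transform, and chaining them yields the end-effector pose that a metrological calibration refines. A minimal sketch in Python using the classic DH convention (the two-link parameter values at the end are hypothetical, not measured data from the lab):

```python
import numpy as np

def dh_transform(theta, d, a, alpha):
    """4x4 homogeneous transform for one joint from classic
    Denavit-Hartenberg parameters (angles in radians)."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

def forward_kinematics(dh_rows):
    """Chain the per-joint transforms base-to-tip."""
    T = np.eye(4)
    for row in dh_rows:
        T = T @ dh_transform(*row)
    return T

# Hypothetical two-link planar arm: a = link lengths, d = alpha = 0.
T = forward_kinematics([( np.pi / 2, 0.0, 0.3, 0.0),
                        (-np.pi / 2, 0.0, 0.2, 0.0)])
```

Calibration then amounts to adjusting the four parameters per joint until poses predicted this way match the optically measured ones.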

Recommended products

The compact Vero cameras have sensor resolutions of either 1.3 or 2.2 megapixels. The camera's variable zoom lens makes it especially well suited to smaller capture volumes, where an optimum field of view matters most. The Vero's attractive price, combined with its light weight and small size, makes it a great choice for smaller labs and studios.

The Optima is the flagship among force plates. The patented calibration technology guarantees the highest possible accuracy across the entire surface of the plate – ideal for gait analysis, biomechanical research and other applications where the highest quality data is essential.

Tracker has been designed for the requirements and workflow of engineering users who want to track the position and orientation of objects with as little effort and as low latency as possible. Perfect for many applications in robotics, UAV tracking, VR and human-machine interaction, Tracker lets you define what you want to track with a couple of mouse clicks – and then leave it tracking in the background. A simple SDK lets you connect the output data stream to your own software.

Motion analysis laboratory for human-robot collaborations

For motion analysis of humans and robots, especially in collaborative applications, a marker-based optical system is ideal due to its high flexibility. Laser scanners used in the context of robotics in principle achieve higher accuracy, but they do not allow simultaneous measurement of the poses of several robot segments or of the human. For this reason, we decided to use an 11-camera Vicon Vero system for motion capture and two force plates to collect ground reaction forces and centres of pressure.

Because the observation volume is reduced to a few cubic metres, the retroreflective markers in static poses can be tracked to within a few hundredths of a millimetre. This allows us to collect the kinematic data for calibrating the robot itself. To complement this, force plates measure the worker's standing position in a collaborative scenario with the robot. These plates are additionally used to calibrate the force-torque sensors integrated in the robot. After a simple modification, we can also carry out gait analyses and movement analyses of the upper extremities according to the HUX model (Heidelberg Upper Extremities Model).
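One way such a force-torque calibration might look: fit an affine correction (gain and offset) of the robot sensor's readings against the force plate reference by least squares. This is a sketch of the general technique, not the lab's actual procedure, and all numeric values below are made up for illustration:

```python
import numpy as np

# Reference vertical forces from the force plate (N) and the raw
# readings of the robot's integrated force sensor for the same loads.
# All values are illustrative, not measured data.
f_plate  = np.array([0.0, 50.0, 100.0, 150.0, 200.0])
f_sensor = np.array([1.8, 46.0, 90.5, 134.8, 179.2])

# Least-squares fit of f_plate ≈ gain * f_sensor + offset.
A = np.column_stack([f_sensor, np.ones_like(f_sensor)])
gain, offset = np.linalg.lstsq(A, f_plate, rcond=None)[0]

# Apply the correction to the raw sensor readings.
corrected = gain * f_sensor + offset
```

The same idea extends to all six force-torque axes with a 6x6 gain matrix and an offset vector in place of the two scalars.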

The motion data are mainly collected using Vicon Tracker, as the software offers robust, low-latency streaming of marker-cluster poses. In addition, the software provides a quality measure for each pose, which greatly facilitates automation under experimentally demanding conditions.
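Consuming such a stream in one's own software might look like the sketch below. The `Pose` record, the quality field's range, and the threshold are assumptions made for illustration – this is not the actual Tracker or DataStream SDK API:

```python
from dataclasses import dataclass

@dataclass
class Pose:
    name: str          # marker-cluster / object name (hypothetical)
    translation: tuple  # x, y, z in mm
    rotation: tuple     # quaternion (x, y, z, w)
    quality: float      # per-pose quality measure, assumed 0..1

def usable_poses(frame, min_quality=0.5):
    """Keep only poses whose quality measure clears the threshold,
    so downstream automation never acts on unreliable data."""
    return [p for p in frame if p.quality >= min_quality]

# Illustrative frame: one well-tracked object, one occluded cluster.
frame = [
    Pose("robot_flange", (412.0, 88.5, 730.2), (0, 0, 0, 1), 0.97),
    Pose("hand_cluster", (0.0, 0.0, 0.0), (0, 0, 0, 1), 0.12),
]
good = usable_poses(frame)
```

Filtering on the quality measure in this way is what allows experiments to run unattended: frames with occluded or poorly reconstructed clusters are simply dropped instead of corrupting the analysis.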

Are you interested in a similar solution?

Ask here!