Interactive Realtime Lab

Investigating human motion in real time

The Interactive Realtime Lab (IRL) investigates human movement, brain activity, and motor adaptation using advanced technologies. Equipped with state-of-the-art motion capture, mobile brain imaging, virtual and augmented reality, movement sensor systems, and a 6-degree-of-freedom motion platform, the IRL precisely measures detailed movement kinematics, kinetics, and muscular activation patterns, along with brain activity, during tasks like walking and balancing. Real-time biomechanical and neuromechanical models estimate internal metrics such as joint loads and muscle activations, which are displayed on an immersive 3D projection screen. This setup lets participants interact with video game elements and deepens our understanding of how real-time feedback and rewards facilitate motor learning and adaptation. The lab also collects large datasets from neurological populations and serves as a rapid prototyping platform for technical development, addressing both fundamental and applied science questions.

Key Technologies

Our 9-camera Vicon system captures movement at 330 Hz with 2.2-megapixel resolution, enabling precise tracking of even the smallest movements. With minimal latency, it provides real-time insight into biomechanics, whether for refining athletic performance, studying movement neuromechanics, or enhancing rehabilitation. This platform helps create a clearer picture of your unique movement attributes.
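
To illustrate what the low-latency stream makes possible, here is a minimal sketch that estimates instantaneous marker speed from 330 Hz position samples. It uses plain NumPy on a synthetic trajectory; it is not the actual Vicon data interface.

    import numpy as np

    RATE_HZ = 330.0        # Vicon capture rate
    DT = 1.0 / RATE_HZ

    def marker_speed(positions: np.ndarray) -> np.ndarray:
        """Instantaneous speed (m/s) from an (N, 3) array of marker positions."""
        velocities = np.diff(positions, axis=0) / DT   # finite differences
        return np.linalg.norm(velocities, axis=1)

    # Example: a marker moving on a 0.2 m circle at 5 rad/s, i.e. 1.0 m/s
    R, OMEGA = 0.2, 5.0
    t = np.arange(0, 1, DT)
    circle = np.stack([R * np.cos(OMEGA * t), R * np.sin(OMEGA * t),
                       np.zeros_like(t)], axis=1)
    print(marker_speed(circle).mean())  # ~1.0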


Our setup also includes three high-speed RGB cameras positioned at fixed locations within the IRL to capture the intricacies of every step. As individuals navigate the dynamic environment, these high-resolution 2.2 MP cameras record every movement at 120 Hz. The resulting high-quality RGB videos serve as reference data for training and evaluating computer-vision models of gait across different pathologies.
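
As a sketch of how such reference recordings might be consumed downstream, the snippet below reads a clip with OpenCV and reports its duration; the file name is an illustrative placeholder.

    import cv2

    cap = cv2.VideoCapture("gait_trial_cam1.mp4")  # hypothetical recording
    fps = cap.get(cv2.CAP_PROP_FPS)                # expected ~120 for these cameras
    frames = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        frames.append(frame)                       # e.g. pass to a pose estimator
    cap.release()
    print(f"{len(frames)} frames at {fps:.0f} Hz = {len(frames) / fps:.1f} s of gait")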


Electromyography (EMG) sensors provide a real-time glimpse into the neuromuscular symphony of your every move. Our 16-channel Cometa Pico system is quick to mount, records with high fidelity, and streams with extremely low latency. As you move, the EMG sensors capture the electrical impulses generated by your muscles, unveiling an individual map of how you control each movement.
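
The following is a minimal sketch of a standard surface-EMG envelope pipeline (band-pass, rectify, low-pass) of the kind such recordings feed into. The 2000 Hz sampling rate is an assumption, and this is generic SciPy code, not the Cometa interface.

    import numpy as np
    from scipy.signal import butter, filtfilt

    FS = 2000.0  # assumed EMG sampling rate (Hz)

    def emg_envelope(raw: np.ndarray) -> np.ndarray:
        """Band-pass, rectify, and low-pass one raw EMG channel to its envelope."""
        b, a = butter(4, [20 / (FS / 2), 450 / (FS / 2)], btype="band")
        band = filtfilt(b, a, raw)          # remove motion artefact and HF noise
        rect = np.abs(band)                 # full-wave rectification
        b, a = butter(4, 6 / (FS / 2), btype="low")
        return filtfilt(b, a, rect)         # smooth to a linear envelope

    # Synthetic test: a 0.5 s 'contraction' burst embedded in baseline noise
    t = np.arange(0, 2, 1 / FS)
    raw = 0.05 * np.random.randn(t.size)
    burst = (t > 0.75) & (t < 1.25)
    raw[burst] += np.sin(2 * np.pi * 120 * t[burst])
    print(emg_envelope(raw).max())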


We use a treadmill to record walking data, enabling the collection of many strides for biomechanical modeling. Positioned within the optimal range of our motion capture system, the Forcelink treadmill accelerates at up to 15 m/s² and reaches a maximum speed of 10 m/s with minimal belt slippage. Its two independently controlled belts allow for a variety of perturbation and adaptation protocols that are difficult to achieve in a typical overground lab.
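
To put those specifications in perspective: reaching the full 10 m/s from standstill at 15 m/s² takes only about 0.67 s. The sketch below shows that arithmetic plus a toy split-belt speed profile of the kind used in adaptation protocols; the speeds are illustrative and the returned arrays stand in for whatever the treadmill controller actually consumes.

    import numpy as np

    ACCEL = 15.0   # m/s^2, maximum belt acceleration
    V_MAX = 10.0   # m/s, maximum belt speed
    print(f"0 to {V_MAX} m/s in {V_MAX / ACCEL:.2f} s")

    def split_belt_profile(v_slow=0.5, v_fast=1.0, duration_s=600, dt=0.01):
        """Constant-speed split-belt protocol: left belt slow, right belt fast."""
        n = int(duration_s / dt)
        left = np.full(n, v_slow)
        right = np.full(n, v_fast)
        return left, right  # to be streamed to the treadmill controller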


The treadmill and force plates are mounted on a 6-degree-of-freedom motion platform similar to those used in flight simulators. The platform can move rapidly in any direction, allowing a wide range of perturbations during walking or standing. Its motion is tracked in the virtual environments shown through VR goggles or on the semi-circular projection screen, enhancing immersion and adding a new dimension to gait training and assessment in the IRL.
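
As a hypothetical illustration of what a 6-degree-of-freedom command stream might look like, the sketch below represents a pose as six coordinates and generates a sinusoidal medio-lateral (sway) perturbation. The pose class, rates, and amplitudes are all assumptions, not the platform's actual API.

    import numpy as np
    from dataclasses import dataclass

    @dataclass
    class PlatformPose:
        """6-DoF pose: translations in metres, rotations in degrees."""
        surge: float = 0.0   # forward/backward
        sway: float = 0.0    # left/right
        heave: float = 0.0   # up/down
        roll: float = 0.0
        pitch: float = 0.0
        yaw: float = 0.0

    def sway_perturbation(amplitude_m=0.05, freq_hz=0.5, duration_s=10, rate_hz=100):
        """Yield a sinusoidal medio-lateral perturbation as a stream of poses."""
        for k in range(int(duration_s * rate_hz)):
            t = k / rate_hz
            yield PlatformPose(sway=amplitude_m * np.sin(2 * np.pi * freq_hz * t))

    # Preview the first few poses of a 0.5 Hz, 5 cm sway
    for pose in list(sway_perturbation())[:3]:
        print(pose)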


Underneath the split-belt treadmill, two force platforms capture data at 1000 Hz as you walk. These recordings are one of the prerequisites for full biomechanical modeling and enable inverse dynamics for each of the main lower-limb joints.
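
One standard computation these channels support is the center of pressure under the foot. Below is a minimal sketch assuming the plate reports forces (N) and moments (N·m) about its origin; the load threshold and sign conventions follow a common plate convention and may differ from our exact hardware.

    import numpy as np

    def center_of_pressure(fx, fy, fz, mx, my, plate_height=0.0):
        """Center of pressure (m) in the plate frame from force/moment channels."""
        fz = np.where(np.abs(fz) < 10, np.nan, fz)   # mask swing phase (low load)
        cop_x = (-my - fx * plate_height) / fz
        cop_y = (mx - fy * plate_height) / fz
        return cop_x, cop_y

    # One quiet-standing sample: 700 N vertical load, small moments
    print(center_of_pressure(0.0, 0.0, np.array([700.0]),
                             np.array([14.0]), np.array([-35.0])))
    # -> cop_x = 0.05 m, cop_y = 0.02 m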


The lab-wide interconnection layer, developed in-house, enables modular communication between all IRL systems through high-bandwidth, high-fidelity, low-latency data streams. Its modular design allows easy integration of any new device with a programmatic communication interface. Built on modern open-source technologies such as MQTT, ROS, and Airflow, the interconnect layer supports real-time processing nodes that provide various levels of processed and combined signals within the data space.
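
As an illustration of the MQTT side of such a layer, the sketch below subscribes to a data topic with the open-source paho-mqtt client. The broker hostname, topic layout, and JSON payload are hypothetical stand-ins for the lab's actual naming scheme.

    import json
    import paho.mqtt.client as mqtt

    BROKER = "irl-interconnect.local"   # hypothetical broker hostname
    TOPIC = "irl/forceplate/left"       # hypothetical topic layout

    def on_message(client, userdata, msg):
        sample = json.loads(msg.payload)          # e.g. {"t": ..., "fz": ...}
        print(f"{msg.topic}: Fz = {sample['fz']:.1f} N")

    client = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2)  # paho-mqtt >= 2.0
    client.on_message = on_message
    client.connect(BROKER, 1883)
    client.subscribe(TOPIC)
    client.loop_forever()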


A 5 m wide semi-circular projection screen is installed in front of the treadmill, allowing us to display digital content during walking. Virtual environments can, for instance, replicate real-life scenarios, providing a dynamic and engaging platform for gait rehabilitation. We also develop games and scenes tailored to individual patient requirements, constructing augmented feedback scenarios or letting individuals navigate through immersive landscapes, overcome obstacles, or refine their walking patterns.


Alongside the semi-circular projection screen, the lab includes head-mounted systems for virtual reality, such as the HTC Vive Pro 2, and augmented reality, like the Microsoft HoloLens. These systems create immersive experiences and enhance the visual complexity of projected environments. Connected through the interconnect layer, they allow for seamless integration of visual scenes. This setup enables us to investigate sensory conflict paradigms, exploring how visual and proprioceptive inputs contribute to motor control and helping us understand individual sensory processing abilities.


Wearable haptic displays are crucial for creating movement illusions that influence movement patterns both consciously and subconsciously. Comprising small, configurable haptic motors, they provide synchronized vibrations with low latency across physiological frequency bands. We use the Elitac Science Suit and a self-developed modular haptic platform to deliver feedback to the wearer’s skin. These systems let us investigate tightly controlled multimodal feedback and design movement paradigms for real-world applications.
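
As a hypothetical sketch of how such a vibration pattern might be described in software, the snippet below models a wave of activation travelling across adjacent tactors. The event structure, tactor count, and timing are illustrative, not the Elitac or in-house driver API.

    import time
    from dataclasses import dataclass

    @dataclass
    class VibrationEvent:
        tactor_id: int       # which motor on the garment
        amplitude: float     # normalized drive level, 0.0 - 1.0
        duration_s: float

    def sweep(n_tactors=8, step_s=0.1):
        """A wave of vibration travelling across adjacent tactors."""
        return [VibrationEvent(i, 0.8, step_s) for i in range(n_tactors)]

    for event in sweep():
        print(f"tactor {event.tactor_id}: amplitude {event.amplitude}")  # stand-in for the device call
        time.sleep(event.duration_s)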


Using open-source designs from Stanford, we developed a 2-unit cable-based robot, the Bump’em, that attaches to a participant’s pelvis on the treadmill. The system applies forces equivalent to 30 kg (roughly 300 N) in either direction with a rise time under 150 ms. Our custom control software provides access to raw data from the motors and sensors and integrates motion capture and force plate data through the lab-wide interconnect for adaptive operation. The Bump’em acts as a perturbation device, triggering specific force profiles based on predefined movement events. It can also generate force fields that interact continuously with the participant’s motion, enabling us to investigate movement adaptation and learning processes in support of rehabilitation goals.
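
The sketch below illustrates the two ingredients this paragraph describes, under assumed numbers: detecting a movement event (heel strike, from the vertical force channel) and shaping a force pulse that respects the sub-150 ms rise time. Thresholds, magnitudes, and sampling rate are illustrative, not the actual controller code.

    import numpy as np

    FS = 1000.0           # force-plate sampling rate (Hz)
    FZ_THRESHOLD = 50.0   # N; loading rising above this marks heel strike

    def heel_strikes(fz: np.ndarray) -> np.ndarray:
        """Indices where the vertical load rises through the threshold."""
        above = fz > FZ_THRESHOLD
        return np.where(above[1:] & ~above[:-1])[0] + 1

    def pulse_profile(peak_n=100.0, rise_s=0.15, hold_s=0.2):
        """Trapezoidal pull: ramp to peak within the rise time, hold, release."""
        rise = np.linspace(0.0, peak_n, int(rise_s * FS))
        hold = np.full(int(hold_s * FS), peak_n)
        return np.concatenate([rise, hold, rise[::-1]])

    # Demo: a stance phase embedded in two swing phases
    fz = np.concatenate([np.zeros(100), np.full(300, 700.0), np.zeros(100)])
    print(heel_strikes(fz))          # -> [100]
    print(pulse_profile().shape)     # samples of commanded force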


The IRL features a 5-sensor wearable motion capture solution based on inertial measurement units (IMUs). Data are recorded synchronously with all other lab systems, allowing the development and validation of generalized and patient-specific models for calculating key gait parameters. Our focal interests include metrics of movement stability and clinically relevant gait parameters for stroke and Parkinson’s disease. Developed models are evaluated for their clinimetric properties and then deployed in the mobile lab setting. The wearable system supports onboard logging and data streaming, which we use in motion-driven game development.
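
As one example of a gait parameter such models compute, the sketch below extracts stride times by peak-picking the mid-swing peaks of a shank angular velocity signal (about the medio-lateral axis). The sampling rate, threshold, and synthetic signal are assumptions.

    import numpy as np
    from scipy.signal import find_peaks

    FS = 100.0  # assumed IMU sampling rate (Hz)

    def stride_times(gyro_ml: np.ndarray) -> np.ndarray:
        """Stride times (s) from shank angular velocity about the medio-lateral
        axis: mid-swing produces one dominant peak per stride."""
        peaks, _ = find_peaks(gyro_ml, height=1.0, distance=int(0.5 * FS))
        return np.diff(peaks) / FS

    # Synthetic signal: one ~2 rad/s swing peak every 1.1 s
    t = np.arange(0, 10, 1 / FS)
    gyro = 2 * np.maximum(0, np.sin(2 * np.pi * t / 1.1)) ** 8
    print(stride_times(gyro))  # ~1.1 s per stride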


Associated Projects

Lab Manager

Lake Lucerne Institute AG
Rubistrasse 9
6354 Vitznau