Exoskeleton User Intent Detection
Overview
Humans have been walking on two legs for millennia, while the popularity of bipedal locomotion in robots has surged only in the past few decades. Humans have perfected the art of walking to the point that the mechanics of the task are subconscious. However, these motor pathways can be severely disrupted by neuromuscular injury, such as a spinal cord injury (SCI). Despite the apparent ease of healthy human gait, it remains an open research question to replicate that ease of locomotion in standalone robots and in robotic lower-limb exoskeletons. Exoskeletons, such as the EksoGT by EksoBionics, offer the opportunity to provide repeatable, overground gait rehabilitation for people with mobility impairments. When the robot and human are coupled as closely as locomotion assistance requires, it is imperative that their interactions be safe and intuitive.
In the exoskeleton intent-recognition thrust of the ROAM lab, we have partnered with EksoBionics to study the interactions between human gait intent and lower-limb exoskeleton control laws so that assistive devices can perform in concert with the human body. The better the robot understands the human's intentions, the better it can serve the human in restoring walking ability that has been lost.
Experiment
Our first approach to studying intent signals in exoskeleton-assisted walking was to perform experiments in which exoskeleton users changed their intent while walking in the device. Specifically, subjects were told to speed up, slow down, or make no change to their gait. Our study revealed that signals already available on the device, such as motor encoder readings and torque commands, show statistically significant differences in their trajectories before and after the commanded intent change (Gambon et al., ICORR, 2019). Ongoing work in our lab seeks to leverage these onboard measurements for an online intent-detection algorithm.
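To make the pre/post comparison concrete, the following is a minimal sketch of the kind of offline analysis described above, assuming a time-aligned onboard signal and a known cue time. The chosen signal, window length, and statistical test are illustrative assumptions, not the exact analysis reported in Gambon et al. (2019).

```python
# Hypothetical sketch: compare an onboard signal (e.g., a knee encoder
# velocity trace) in equal-length windows before and after the commanded
# intent change. Window length and the use of Welch's t-test are assumptions.
import numpy as np
from scipy import stats

def pre_post_difference(signal, cue_index, window=200):
    """Return (t statistic, p-value) for the mean shift across the cue."""
    pre = signal[cue_index - window:cue_index]
    post = signal[cue_index:cue_index + window]
    return stats.ttest_ind(pre, post, equal_var=False)

# Synthetic trace standing in for real exoskeleton data: steady walking,
# then a faster cadence after a "speed up" cue at sample 200.
rng = np.random.default_rng(0)
trace = np.concatenate([rng.normal(1.0, 0.1, 200),
                        rng.normal(1.3, 0.1, 200)])
t_stat, p_value = pre_post_difference(trace, cue_index=200)
```

An online version of this idea would replace the fixed cue index with a sliding window, flagging an intent change whenever the most recent window departs significantly from the steady-state baseline.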
Technical Approach
One approach currently being studied in the group for exoskeleton user intent detection is multi-model Kalman filtering, which infers the user's desired gait from the position and velocity of the exoskeleton user's center of mass. Humans are high-degree-of-freedom systems, so the full dynamics of walking are very complex. Template models abstract those dynamics into simple models that capture the salient features of legged locomotion, and we use them to search for periodic passive gaits. This strategy allows us to generate a reference gait library against which we can compare the exoskeleton sensor data and estimate the gait parameters of the user. Next steps include testing these algorithms on the exoskeleton and studying the human-robot interaction that results.
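To illustrate the estimation strategy, here is a minimal sketch of a multi-model Kalman filter bank, assuming a small library of linear center-of-mass models, one per candidate gait. The model matrices, noise covariances, and class/function names are hypothetical placeholders rather than the actual gait library or exoskeleton interface.

```python
# Hypothetical sketch: a bank of Kalman filters, one per reference gait.
# Each filter propagates a linear CoM model [position, velocity]; the
# innovation likelihoods give a posterior probability over candidate gaits.
import numpy as np

class GaitModeFilter:
    def __init__(self, A, Q, R, x0, P0):
        self.A = A                      # CoM dynamics for this gait hypothesis
        self.Q, self.R = Q, R           # process / measurement noise covariances
        self.x, self.P = x0.copy(), P0.copy()

    def step(self, z):
        # Predict the CoM state under this gait, then update with measurement z
        # (the measurement is the full CoM state, so H = I).
        x_pred = self.A @ self.x
        P_pred = self.A @ self.P @ self.A.T + self.Q
        y = z - x_pred                  # innovation
        S = P_pred + self.R
        K = P_pred @ np.linalg.inv(S)
        self.x = x_pred + K @ y
        self.P = (np.eye(len(z)) - K) @ P_pred
        # Gaussian likelihood of the innovation, used to score this gait.
        norm = 1.0 / np.sqrt(np.linalg.det(2.0 * np.pi * S))
        return norm * np.exp(-0.5 * y @ np.linalg.solve(S, y))

def classify_gait(filters, priors, z):
    """Update every filter with CoM measurement z; return gait posteriors."""
    likelihoods = np.array([f.step(z) for f in filters])
    posterior = priors * likelihoods
    return posterior / posterior.sum()
```

In practice the priors could be updated recursively with the previous posteriors (or mixed as in an interacting multiple model filter) so that the estimate tracks gradual changes in the user's intended gait.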