Moving Assistive Mobility Forward with AI
Data-driven controller enables real-time assistance modulation by accurately estimating both user and environment states.
As we go through our daily routines of work, chores, errands, and leisure, most of us take our mobility for granted. Yet many people live with permanent or temporary mobility impairments caused by neurological disorders, stroke, injury, and age-related conditions. Research on robotic exoskeletons has shown significant potential both to provide assistive support for people with permanent mobility constraints and to serve as an effective tool for rehabilitation and recovery after injury.
Though the field has made great progress in the hardware for these assistive technologies, the devices remain limited in ease of use and in their ability to transition from walking to running, from flat ground to slopes and stairs, and across different terrains. Controllers that respond to the user's environment through user-based variables, such as gait and slope calculations, produce rapid but imprecise outputs. More recent data-driven approaches, such as vision-based terrain labeling and classification, are promising steps toward a truly synchronous user-device interface. A major hindrance to this data-driven approach, however, is its reliance on burdensome mounted cameras and onboard computing to adjust in real time to the terrain the user encounters.
To address these barriers, Aaron Young, Associate Professor in the Woodruff School of Mechanical Engineering and Director of the Exoskeleton and Prosthetic Intelligent Controls (EPIC) Lab, and Dawit Lee, Postdoctoral Scholar at Stanford, have created an artificial intelligence (AI)-based universal exoskeleton controller that relies on onboard mechanical sensors alone, avoiding the added weight and complexity of mounted vision systems. The new work, published in Science Advances, presents a controller that holistically captures the major variations encountered during community walking in real time. The team combined a slope estimator, built on the Americans with Disabilities Act (ADA) building guidelines that characterize ambulatory terrains by slope in degrees, with a gait phase estimator, allowing the controller to switch assistance dynamically across terrains and slopes and deliver it to the user with little to no delay.
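Conceptually, the control scheme comes down to evaluating an assistance profile over two continuous estimates: the user's gait phase and the surface slope. The sketch below illustrates that idea in Python; the torque values, reference slopes, and interpolation scheme are placeholder assumptions for illustration, not the paper's trained models or actual assistance profiles.

```python
import numpy as np

# Knee-assistance torque profiles (illustrative, normalized Nm/kg vs. gait
# phase), defined at a few reference slopes. These values are placeholders,
# not the profiles used in the paper.
PHASE_GRID = np.linspace(0.0, 1.0, 101)    # gait phase, 0 = heel strike
REF_SLOPES = np.array([-15.0, 0.0, 15.0])  # surface slope in degrees
REF_PROFILES = np.stack([
    0.15 * np.sin(np.pi * PHASE_GRID),     # decline walking (placeholder)
    0.10 * np.sin(np.pi * PHASE_GRID),     # level ground (placeholder)
    0.25 * np.sin(np.pi * PHASE_GRID),     # incline walking (placeholder)
])

def assistance_torque(phase: float, slope_deg: float) -> float:
    """Blend torque profiles over continuous gait phase and slope estimates.

    Treating slope as a continuous variable removes the need for discrete
    switching between level, ramp, and stair modes.
    """
    # Evaluate each reference profile at the current gait phase...
    torque_at_phase = [np.interp(phase, PHASE_GRID, p) for p in REF_PROFILES]
    # ...then interpolate across slopes for the current terrain estimate.
    return float(np.interp(slope_deg, REF_SLOPES, torque_at_phase))

# In the real system, phase and slope_deg would come from the learned
# estimators running on onboard mechanical sensor data each control cycle.
print(assistance_torque(phase=0.3, slope_deg=7.5))
```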
In this work, we have created a new, open-source knee exoskeleton design that is intended to support community mobility. Knee assist devices have tremendous value in activities such as sit-to-stand, stairs, and ramps where we use our biological knees substantially to accomplish these tasks. The neat accomplishment in this work is that by leveraging AI, we avoid the need to classify these different modes discretely but rather have a single continuous variable (in this case rise over run of the surface) to enable continuous and unified control over common ambulatory tasks such as walking, stairs, and ramps. We demonstrate that on novel users of the device, we can track both the environment and the user's gait state with very high accuracy out of the lab in community settings. It is an exciting time in the field as we see more studies, such as this one, showing promise in tackling real-world mobility challenges.
- Aaron Young, Associate Professor in the Woodruff School of Mechanical Engineering and Director of the Exoskeleton and Prosthetic Intelligent Controls (EPIC) Lab
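The "rise over run" variable maps directly to the degree measures used in the ADA guidelines. As a quick worked example (the helper function here is ours, for illustration):

```python
import math

def slope_degrees(rise: float, run: float) -> float:
    """Convert a rise-over-run ratio to a slope angle in degrees."""
    return math.degrees(math.atan2(rise, run))

# An ADA-compliant ramp may rise at most 1 unit per 12 units of run,
# i.e. roughly a 4.8-degree incline.
print(slope_degrees(1.0, 12.0))  # ~4.76
```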
With this combination of a universal slope estimator and a gait phase estimator, the team achieved dynamic modulation of exoskeleton assistance beyond what previous approaches have demonstrated, moving the field closer to an adaptive, effective assistive technology that integrates seamlessly into individuals' daily lives and promotes enhanced mobility and overall well-being. The work also opens the door to a mode-specific assistance approach tailored to the user's specific biomechanical needs.
The assistance approach using our intelligent controller, presented in this work, provides users with support at the right time and with a magnitude that closely matches the varying biomechanical effort they produce as they move through the community. Our assistance approach was preferred for community navigation and was more effective in reducing the user's energy consumption compared to conventional methods. With this publication, we also open-sourced the design of the robotic knee exoskeleton hardware and the dataset used to train the models, which allows other researchers to build upon our developments and further advance the field. This work demonstrates an exciting example of AI integration into a wearable robotic system, showcasing its successful outcomes and significant potential.
- Dawit Lee, Postdoctoral Scholar, Stanford
- Christa M. Ernst, Research Communications Program Manager
Original Publication
Dawit Lee, Sanghyub Lee, and Aaron J. Young, “AI-Driven Universal Lower-Limb Exoskeleton System for Community Ambulation,” Science Advances, Vol. 10, Issue 51, Dec. 18, 2024, https://doi.org/10.1126/sciadv.adq0288
Prior Related Work
D. Lee, I. Kang, D. D. Molinaro, A. Yu, A. J. Young, Real-time user-independent slope prediction using deep learning for modulation of robotic knee exoskeleton assistance. IEEE Robot. Autom. Lett. 6, 3995–4000 (2021).
Funding Provided by
NIH Director’s New Innovator Award DP2-HD111709