<img src="https://spectrum.ieee.org/media-library/side-view-of-a-power-wheelchair-equipped-with-a-padded-bucket-seat-and-tablet-sized-monitor-below-a-computer-generated-maps-of.jpg?id=65316423&width=1245&height=700&coordinates=0%2C469%2C0%2C469"/><br/><br/><p>Wheelchair users with severe disabilities can often navigate tight spaces better than most robotic systems can. <span>A wave of new smart-wheelchair research, including findings presented in Anaheim, Calif., earlier this month, is now testing whether AI-powered systems can, or should, fully close this gap.</span></p><p><a href="https://user.informatik.uni-bremen.de/cmandel/" target="_blank">Christian Mandel</a>—senior researcher at the <a href="https://www.dfki.de/en/web" target="_blank">German Research Center for Artificial Intelligence</a> (DFKI) in Bremen, Germany—<span>co-led a research team with his colleague <a href="https://user.informatik.uni-bremen.de/autexier/index.php" target="_blank">Serge Autexier</a></span><span> that developed prototype sensor-equipped electric wheelchairs designed to navigate a room full of potential obstacles. The researchers also tested a new safety system that integrated sensor data from the wheelchair and from sensors in the room, including from </span><a href="https://spectrum.ieee.org/tag/drones" target="_self">drone</a><span>-based </span>color and depth cameras<span>.</span></p><p>Mandel says the team’s smart wheelchairs could operate both semi-autonomously and fully autonomously.</p><p>“Semi-autonomous is the shared control system where the person sitting in the wheelchair uses the joystick to drive,” Mandel says. “Fully autonomous is controlled by natural language input. 
You say, ‘Please drive me to the coffee machine.’”</p><p class="shortcode-media shortcode-media-rebelmouse-image rm-float-left rm-resized-container rm-resized-container-25" data-rm-resized-container="25%" style="float: left;"> <img alt="Close-up of a thin rectangular camera installed underneath an electric wheelchair's joystick controller." class="rm-shortcode" data-rm-shortcode-id="29dd9c2e4ce2197312ba056b72e6a791" data-rm-shortcode-name="rebelmouse-image" id="d4669" loading="lazy" src="https://spectrum.ieee.org/media-library/close-up-of-a-thin-rectangular-camera-installed-underneath-an-electric-wheelchair-s-joystick-controller.jpg?id=65317537&width=980"/> <small class="image-media media-caption" placeholder="Add Photo Caption...">A camera mounted below the wheelchair’s joystick controller.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">DFKI</small></p><p>The researchers conducted experiments (part of a larger project called the <a href="https://www.dfki.de/en/web/research/projects-and-publications/project/rexasi-pro" target="_blank">Reliable and Explainable Swarm Intelligence for People With Reduced Mobility</a>, or REXASI-PRO) using two identical smart wheelchairs, each equipped with two lidars, a 3D camera, odometers, user interfaces, and an embedded computer.</p><p>In semi-autonomous operation, participants steered the wheelchair with a joystick. In autonomous mode, the wheelchairs navigated with the open-source <a href="https://roboticsbackend.com/ros2-nav2-tutorial/" rel="noopener noreferrer" target="_blank">ROS2 Nav2</a> system, directed by natural-language commands. 
The wheelchairs also used Simultaneous Localization and Mapping (<a href="https://en.wikipedia.org/wiki/Simultaneous_localization_and_mapping" rel="noopener noreferrer" target="_blank">SLAM</a>) maps and local obstacle-avoidance motion controllers.</p><p>One scenario that Mandel and his team tested involved the user pressing a key on the wheelchair’s human-machine interface, speaking a command, then confirming or rejecting the instruction via that same interface. Once the user confirmed the command, the wheelchair guided the user along a path to the destination, using its sensors to detect obstacles and adjusting its course to avoid them.</p><h3>When Are Smart Wheelchairs Bad Value?</h3><p>According to Pooja Viswanathan, CEO and founder of the Toronto-based Braze Mobility, research in the field of mobile assistive technology should also prioritize keeping these devices affordable and readily available to everyday consumers.</p><p>“Cost remains a major barrier,” she says. “Funding systems are often not designed to support advanced add-on intelligence unless there is very clear evidence of value and safety. Reliability is another barrier. A smart wheelchair has to work not just in ideal conditions, but in the messy, variable conditions of daily life. And there is also the human factors dimension. Users have different cognitive, motor, sensory, and environmental needs, so one solution rarely fits all.”</p><p>For its part, Braze makes <a href="https://brazemobility.com/" rel="noopener noreferrer" target="_blank">blind spot sensors</a> for electric wheelchairs. The sensors detect obstacles in areas that are difficult for a user to see, and they can be added to any wheelchair to transform it into a smart wheelchair by providing multi-modal alerts to the user. 
This approach aims to support the user rather than replace them.</p><p>According to Louise Devigne, a biomedical research engineer at <a href="https://en.wikipedia.org/wiki/Research_Institute_of_Computer_Science_and_Random_Systems" rel="noopener noreferrer" target="_blank">IRISA</a> (Research Institute of Computer Science and Random Systems) in Rennes, France, the increased complexity of smart wheelchairs demands more sensing, which in turn requires careful management of communication and synchronization within the wheelchair’s system. “The more sensing, computation, and autonomy you add,” she says, “the harder it becomes to ensure robust performance across the full range of real-world environments that wheelchair users encounter.”</p><p>In the near term, in other words, the field’s biggest challenge is not about replacing the wheelchair user with AI smarts but about designing better partnerships between the user and the technology.</p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="Rendering of an electric wheelchair moving towards a wall. The chair is divided into four ground-parallel quadrants that each represent a different safety zone where intersections with obstacles are checked. At the same height as these quadrants, are four lines on the wall that represent virtual laser scans. " class="rm-shortcode" data-rm-shortcode-id="2855c65574c8448acce8716906bddddc" data-rm-shortcode-name="rebelmouse-image" id="f0d73" loading="lazy" src="https://spectrum.ieee.org/media-library/rendering-of-an-electric-wheelchair-moving-towards-a-wall-the-chair-is-divided-into-four-ground-parallel-quadrants-that-each-re.jpg?id=65316452&width=980"/> <small class="image-media media-caption" placeholder="Add Photo Caption...">Data representations used by the 3D Driving Assistant. These include immutable sensor percepts such as laser scans and point clouds, as well as derived representations like the virtual laser scans and grid maps. 
Finally, the robot shape collection describes the wheelchair’s physical borders at different heights.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">DFKI</small></p><h3>Where Will Smart Wheelchairs Go From Here?</h3><p>Mandel says he expects to see smart wheelchairs ready for the mainstream marketplace within ten years.</p><p>Viswanathan says the REXASI-PRO system, while beyond the reach of <a href="https://spectrum.ieee.org/assistive-technology-lidar-wheelchair" target="_self">present-day smart wheelchair technologies</a>, is important for the longer term. “It reflects the more ambitious end of the smart wheelchair spectrum,” she says. “Its strengths appear to lie in intelligent navigation, advanced sensing, and the broader effort to build a wheelchair that can interpret and respond to complex environments in a more autonomous way. From a research standpoint, that is exactly the kind of work that pushes the field forward. It also appears to take seriously the importance of trustworthy and explainable AI, which is essential in any mobility technology where safety, reliability, and user confidence are paramount.”</p><p>Mandel says he’s ultimately in pursuit of the inspiration that got him into this field years ago. 
As a young researcher, he says, he helped develop a smart wheelchair system controllable with a head joystick.</p><p>However, Mandel says he realized after many trials that the system had a long way to go. “At that point in time, I realized that even persons that had severe handicaps [traveling through] a narrow passage, they did very, very well.</p><p>“And then I realized, okay, there is this need for this technology, but never underestimate what [wheelchair users] can do without it.”</p><p>The DFKI researchers presented <a href="https://www.dfki.de/en/web/research/projects-and-publications/publication/16538" target="_blank">their work</a> earlier this month at the <a href="https://conference.csun.at/event/2026/session-schedule" rel="noopener noreferrer" target="_blank">CSUN Assistive Technology Conference</a> in Anaheim, Calif.</p><p><em>This article was supported by the <a href="https://spectrum.ieee.org/tag/ieee-foundation" target="_self">IEEE Foundation</a> and a Jon C. Taenzer fellowship grant.</em></p>
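To make the confirm-then-navigate interaction described above concrete, here is a minimal, hypothetical Python sketch of that flow: a spoken destination is matched against known places, the user confirms or rejects it, and only a confirmed command becomes a navigation goal. All names here (`CommandSession`, `KNOWN_PLACES`, the coordinates) are invented for illustration; the actual REXASI-PRO wheelchairs use ROS2 Nav2 with speech input and SLAM maps.

```python
from dataclasses import dataclass

# Illustrative sketch only. Real speech understanding and path planning
# are replaced by keyword matching and a stored goal, respectively.

@dataclass
class Destination:
    name: str
    x: float  # map coordinates (meters), hypothetical values
    y: float

KNOWN_PLACES = {
    "coffee machine": Destination("coffee machine", 4.2, 1.5),
}

class CommandSession:
    """Spoken command -> user confirmation -> navigation goal."""

    def __init__(self):
        self.pending = None  # destination awaiting confirmation
        self.goal = None     # confirmed navigation goal

    def hear(self, utterance: str) -> str:
        # Naive keyword spotting stands in for real speech recognition.
        for name, dest in KNOWN_PLACES.items():
            if name in utterance.lower():
                self.pending = dest
                return f"Drive to the {name}? Confirm or reject."
        return "Destination not recognized."

    def confirm(self) -> str:
        if self.pending is None:
            return "Nothing to confirm."
        self.goal, self.pending = self.pending, None
        # In a real system, the goal pose would now be sent to the
        # navigation stack (e.g., a Nav2 goal), with obstacle avoidance
        # handled by local motion controllers along the way.
        return f"Navigating to the {self.goal.name}."

    def reject(self) -> str:
        self.pending = None
        return "Command cancelled."
```

The design mirrors the safety pattern in the article: no motion command is issued until the user explicitly confirms, so a misheard utterance can never move the chair.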