It can sometimes be difficult to weave your way through busy public thoroughfares on foot, but just think how much more difficult it is for users of electric wheelchairs. Imagine if the chair were equipped with sensors and used adaptive AI control to provide obstacle avoidance and autonomous steering. That would offer users a whole new level of independence.

A research team from the University of Kent (UK) has been working on this problem, with the aim of integrating sophisticated control and AI learning into the wheelchair's control system to improve quality of life and give wheelchair users greater independence.

The team has developed a wheelchair capable of navigating very narrow spaces and avoiding obstacles by using a LiDAR (light detection and ranging) sensor to map the environment. The controller combines user input signals with a degree of AI learning to adapt to the user's driving style, and takes into account physiological variables such as heart rate and reaction time to estimate the user's level of awareness and set an appropriate level of control autonomy.
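The article does not describe how this blending is implemented, but the idea can be sketched in a few lines of Python. Everything below is a hypothetical illustration, not the Kent team's code: the function names, the heart-rate and reaction-time ranges, and the simple weighted mix of user and avoidance commands are all assumptions chosen to show the principle of scaling autonomy with estimated awareness.

```python
# Hypothetical sketch: blend the user's steering command with an autonomous
# avoidance command according to an awareness score estimated from heart rate
# and reaction time. All ranges and weights are illustrative assumptions.

def awareness_score(heart_rate_bpm: float, reaction_time_s: float) -> float:
    """Map physiological readings to a 0..1 awareness estimate (illustrative only)."""
    hr_factor = max(0.0, min(1.0, (heart_rate_bpm - 50.0) / 70.0))        # assume 50-120 bpm range
    rt_factor = max(0.0, min(1.0, 1.0 - (reaction_time_s - 0.2) / 0.8))   # assume 0.2-1.0 s range
    return 0.5 * hr_factor + 0.5 * rt_factor

def blended_command(user_cmd: float, avoid_cmd: float,
                    heart_rate_bpm: float, reaction_time_s: float) -> float:
    """The less aware the user appears, the more authority the avoidance command gets."""
    a = awareness_score(heart_rate_bpm, reaction_time_s)
    return a * user_cmd + (1.0 - a) * avoid_cmd

# Example: a tired user (slow reaction time) steering toward an obstacle
print(blended_command(user_cmd=0.8, avoid_cmd=-0.5,
                      heart_rate_bpm=60.0, reaction_time_s=0.9))
```

In a real system the awareness estimate would come from a learned model rather than fixed thresholds, but the blending idea stays the same: user intent is never discarded, only weighted against the autonomous safety behaviour.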

The video shows how the person controls the wheelchair using head movement and an eye-tracking function. The wheelchair user said, "Essentially, the wheelchair tracks my head movement and controls it as I move. Turning my head to the left causes the chair to turn left and vice versa." The method of user control input is highly individual, according to the user's level of ability; in addition to head tracking, users can also use iris- or nose-direction control.
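The mapping from a tracked head angle to a steering command can be illustrated with a short sketch. This is an assumption-laden example, not the project's implementation: the dead zone, maximum yaw and turn-rate values are invented for illustration.

```python
# Illustrative only: convert a tracked head-yaw angle into a turn command.
# Thresholds and the left-positive sign convention are assumptions.

def head_yaw_to_turn_rate(yaw_deg: float,
                          dead_zone_deg: float = 5.0,
                          max_yaw_deg: float = 30.0,
                          max_turn_rate: float = 0.6) -> float:
    """Map head yaw (degrees, left positive) to angular velocity (rad/s).
    A dead zone ignores small involuntary head movements."""
    if abs(yaw_deg) < dead_zone_deg:
        return 0.0
    yaw = max(-max_yaw_deg, min(max_yaw_deg, yaw_deg))
    return max_turn_rate * yaw / max_yaw_deg

print(head_yaw_to_turn_rate(3.0))    # inside dead zone -> no turn
print(head_yaw_to_turn_rate(20.0))   # turn left
print(head_yaw_to_turn_rate(-30.0))  # turn right at full rate
```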

C, C++ and Python have been used for software development. The team also makes use of the open-source Robot Operating System (ROS), which offers a whole host of software libraries, drivers and tools for robotics developers; on this project it also provides a common framework for sharing code with development partners and maintaining interoperability.
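To give a flavour of how ROS ties such a system together, here is a minimal Python (rospy) node that stops forward motion when the LiDAR reports an obstacle closer than a threshold. It is a sketch under assumptions, not the team's software: the topic names /scan and /cmd_vel are common ROS defaults, and the speed and safety distance are arbitrary.

```python
#!/usr/bin/env python
# Minimal ROS 1 (rospy) sketch, for illustration only.
# Assumed topics: /scan (sensor_msgs/LaserScan) and /cmd_vel (geometry_msgs/Twist).

import rospy
from sensor_msgs.msg import LaserScan
from geometry_msgs.msg import Twist

STOP_DISTANCE_M = 0.5  # assumed safety threshold

class SimpleAvoider:
    def __init__(self):
        self.cmd_pub = rospy.Publisher('/cmd_vel', Twist, queue_size=1)
        rospy.Subscriber('/scan', LaserScan, self.on_scan)

    def on_scan(self, scan):
        # Keep only valid range readings and find the nearest return.
        valid = [r for r in scan.ranges if scan.range_min < r < scan.range_max]
        nearest = min(valid) if valid else float('inf')

        cmd = Twist()
        cmd.linear.x = 0.0 if nearest < STOP_DISTANCE_M else 0.3  # m/s
        self.cmd_pub.publish(cmd)

if __name__ == '__main__':
    rospy.init_node('wheelchair_avoider')
    SimpleAvoider()
    rospy.spin()
```

Because ROS nodes communicate over well-defined topics and message types, a sensor driver, an obstacle-avoidance node and a user-input node can be developed by different partners and still work together, which is the interoperability benefit the team is drawing on.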