Abstract
The tongue drive system (TDS) is a tongue-operated, minimally invasive, unobtrusive, noncontact, and wireless assistive technology that infers users' intentions by detecting and classifying their voluntary tongue motions and translating them into user-defined commands. We have developed customized interface circuitry between an external TDS (eTDS) prototype and a commercial powered wheelchair (PWC), along with three control strategies, to evaluate tongue motion as an alternative control input for wheeled mobility. We tested eTDS performance in driving PWCs on 12 able-bodied human subjects, 11 of whom were novices. The results showed that all subjects could complete the navigation tasks by operating the PWC with their tongue motions. Despite little prior experience, the average completion time using the eTDS and the tongue was only about three times longer than using a joystick and the fingers. Navigation time was strongly dependent on the number of issued commands, which decreased as subjects gained experience. In particular, unintended commands (the Midas touch problem) were rare, demonstrating the effectiveness of the tongue-tracking and external magnetic field cancellation algorithms as well as the safety of the TDS for wheeled mobility.