TY - JOUR
T1 - An EEG/EMG/EOG-based multimodal human-machine interface to real-time control of a soft robot hand
AU - Zhang, Jinhua
AU - Wang, Baozeng
AU - Zhang, Cheng
AU - Xiao, Yanqing
AU - Wang, Michael Yu
N1 - Funding Information:
The study reported was funded by the National Natural Science Foundation of China (Grant No. 51675413) and the 13th 5-Year Development Plan of the Equipment Pre-Research Field Foundation of China (Grant No. 61400030701).
Publisher Copyright:
Copyright © 2019 Zhang, Wang, Zhang, Xiao and Wang.
PY - 2019/3/29
Y1 - 2019/3/29
AB - Brain-computer interface (BCI) technology shows potential for application to motor rehabilitation therapies that use neural plasticity to restore motor function and improve the quality of life of stroke survivors. However, it is often difficult for BCI systems to provide the variety of control commands necessary for natural multi-task real-time control of a soft robot. In this study, a novel multimodal human-machine interface system (mHMI) is developed using combinations of electrooculography (EOG), electroencephalography (EEG), and electromyography (EMG) to generate numerous control instructions. Moreover, we explore subject acceptance of an affordable wearable soft robot that performs basic hand actions during robot-assisted movement. Six healthy subjects separately perform left- and right-hand motor imagery, looking-left and looking-right eye movements, and different hand gestures in different modes to control the soft robot in a variety of actions. The results indicate that the number of mHMI control instructions is significantly greater than is achievable with any individual mode. Furthermore, the mHMI achieves an average classification accuracy of 93.83% with an average information transfer rate of 47.41 bits/min, which is equivalent to a control speed of 17 actions per minute. This study is expected to yield a more user-friendly mHMI for real-time control of a soft robot, helping healthy or disabled persons perform basic hand movements in a friendly and convenient way.
KW - Electroencephalogram (EEG)
KW - Electromyogram (EMG)
KW - Electrooculogram (EOG)
KW - Multimodal human-machine interface (mHMI)
KW - Soft robot hand
UR - http://www.scopus.com/inward/record.url?scp=85065593463&partnerID=8YFLogxK
U2 - 10.3389/fnbot.2019.00007
DO - 10.3389/fnbot.2019.00007
M3 - Article
C2 - 30983986
AN - SCOPUS:85065593463
SN - 1662-5218
VL - 13
JO - Frontiers in Neurorobotics
JF - Frontiers in Neurorobotics
M1 - 7
ER -