
Details

Author(s) / Contributors
Title
Analysis of motor imagery data from EEG device to move prosthetic hands by using deep learning classification
Is part of
  • AIP Conference Proceedings, 2022, Vol.2537 (1)
Place / Publisher
Melville: American Institute of Physics
Year of publication
2022
Descriptions/Notes
  • Controlling an artificial hand with the mind is a dream for many people who have lost their limbs. Brain-Computer Interface (BCI) technology is expected to make this possible by connecting commands and responses to the brain as information in the control system. However, the complexity of the EEG signal makes this difficult to realize. A deep learning-based classification model is expected to provide a solution by classifying the hand movements imagined by the user as input to the electric artificial hand control system. The main aim of this study is to classify EEG signals from the human brain in real time using a non-invasive EEG headset for two different hand operations: rest and grip. The OpenBCI Ultracortex Mark IV Headset was used in this study. This study proposes a solution for classifying rest and grip hand movements by exploiting a Long Short-Term Memory (LSTM) network and a Convolutional Neural Network (CNN) to learn the electroencephalogram (EEG) time-series information. EEG signals were recorded from one healthy subject at specific scalp locations: F3, Fz, F4, FC1, FC2, C3, Cz, and C4. A wide range of time-domain features was extracted from the EEG signals and used to train the LSTM and CNN to perform the classification task. The headset captures brain waves that include artefacts such as limb movement, heartbeat, and eye blinks. Raw EEG from the headset was processed for event detection and filtered with a Butterworth bandpass filter to separate the signal data into new datasets containing the alpha range, the beta range, and both ranges combined. The results of this study indicate that the CNN-based classification model for the two types of hand movements achieves an accuracy of up to 95.45%, while the LSTM technique achieves 93.64%. Detected events were then used to trigger control signals to a prosthetic hand controlled by a microcontroller.
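
The band-separation step described in the abstract can be sketched as follows, assuming SciPy, a 250 Hz sampling rate (the OpenBCI Cyton default), and conventional band edges of 8-13 Hz (alpha) and 13-30 Hz (beta); the filter order and all names here are illustrative assumptions, not details taken from the paper.

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 250.0  # sampling rate in Hz (assumption: OpenBCI Cyton default)

def bandpass(data, low_hz, high_hz, order=4):
    """Zero-phase Butterworth bandpass along the time axis."""
    nyq = FS / 2.0
    b, a = butter(order, [low_hz / nyq, high_hz / nyq], btype="band")
    return filtfilt(b, a, data, axis=-1)

# raw: (channels, samples) array from the headset, e.g. 8 x 1000
raw = np.random.randn(8, 1000)  # placeholder for recorded EEG

alpha = bandpass(raw, 8.0, 13.0)   # alpha-band dataset
beta = bandpass(raw, 13.0, 30.0)   # beta-band dataset
both = bandpass(raw, 8.0, 30.0)    # combined alpha+beta dataset
```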
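For the classification step, a minimal 1-D CNN over windowed, band-filtered EEG could look like the sketch below (Keras/TensorFlow). The window length, layer sizes, and training settings are illustrative assumptions; the record does not give the paper's actual architectures, and an LSTM variant would replace the convolutional layers with recurrent ones.

```python
import numpy as np
from tensorflow.keras import layers, models

WINDOW = 250   # samples per window (assumption: 1 s at 250 Hz)
CHANNELS = 8   # F3, Fz, F4, FC1, FC2, C3, Cz, C4

# Binary classifier: 0 = rest, 1 = grip
model = models.Sequential([
    layers.Input(shape=(WINDOW, CHANNELS)),
    layers.Conv1D(32, kernel_size=7, activation="relu"),
    layers.MaxPooling1D(2),
    layers.Conv1D(64, kernel_size=5, activation="relu"),
    layers.GlobalAveragePooling1D(),
    layers.Dense(32, activation="relu"),
    layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])

# Placeholder data standing in for windowed EEG and rest/grip labels.
X = np.random.randn(64, WINDOW, CHANNELS).astype("float32")
y = np.random.randint(0, 2, size=64)
model.fit(X, y, epochs=2, batch_size=16, verbose=0)
```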
Language
English
Identifiers
ISSN: 0094-243X
eISSN: 1551-7616
DOI: 10.1063/5.0098178
Title ID: cdi_scitation_primary_10_1063_5_0098178
