
Movements of the same upper limb can be classified from low-frequency time-domain EEG signals

P. Ofner¹, A. Schwarz¹, J. Pereira¹, G. R. Müller-Putz¹*

¹Institute of Neural Engineering, Graz University of Technology, Graz, Austria

*Stremayrgasse 16/IV, A-8010 Graz, Austria. E-mail: gernot.mueller@tugraz.at

Introduction: Brain-computer interfaces (BCIs) can be used to control neuroprostheses for persons with spinal cord injury (SCI). A neuroprosthesis can restore different movement functions (e.g., hand open/close, supination/pronation), and therefore requires a BCI with a sufficiently high number of classes. However, sensorimotor rhythm-based BCIs often provide fewer than three classes, and new types of BCIs need to be developed.

In recent years, a new EEG feature has emerged: low-frequency time-domain signals. For example, movement trajectories [1] and movement directions [2] have been decoded from this feature. In the present study, we investigated whether low-frequency time-domain signals can also be used to classify several (executed) hand/arm movements of the same limb. A BCI relying on the imagination of such movements could provide a higher number of classes and allow a more natural control of a neuroprosthesis.

Material, Methods and Results: We recorded the electroencephalogram (EEG) from 61 channels and movement data from 15 healthy subjects using g.USBamps (g.tec, Austria) and an ArmeoSpring exoskeleton (Hocoma, Switzerland). The subjects sat in a chair with the arm supported by the ArmeoSpring. We instructed the subjects to execute hand open/close, supination/pronation, and elbow extension/flexion movements according to cues presented on a computer screen, i.e., 6 movement types/classes. In total, 360 trials (60 trials per class) were recorded. We removed independent components contaminated with artifacts and down-sampled the data to 16 Hz. Subsequently, we re-referenced to a common average reference and applied a 0.3 – 3 Hz 4th-order zero-phase Butterworth band-pass filter to extract the low-frequency signals. Finally, we time-locked the data to the movement onset and classified the EEG with a shrinkage-regularized linear discriminant analysis. To avoid overfitting, we applied a 10x10-fold cross-validation. Fig. 1 shows the classification accuracies of all subjects and their average. The average classification accuracy peaked at 0.0625 s after the movement onset with a value of 37%. Classification accuracies above 24% were significant (p = 0.05, Bonferroni-corrected for multiple time-points, adjusted Wald interval).
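As an illustration of this processing chain, the following minimal Python sketch (NumPy/SciPy/scikit-learn) walks through the described steps on already artifact-cleaned, down-sampled data. The choice of libraries, the use of a single sample of all channels as the feature vector per time-point, and all function and variable names are assumptions on our part, not the authors' implementation.

import numpy as np
from scipy.signal import butter, sosfiltfilt
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import RepeatedStratifiedKFold, cross_val_score

FS = 16  # Hz, sampling rate after down-sampling (artifactual ICs assumed already removed)

def preprocess(eeg):
    # eeg: continuous EEG, shape (n_channels, n_samples)
    eeg = eeg - eeg.mean(axis=0, keepdims=True)  # common average reference
    # 0.3-3 Hz 4th-order Butterworth, applied forward-backward for zero phase
    sos = butter(4, [0.3, 3.0], btype="bandpass", fs=FS, output="sos")
    return sosfiltfilt(sos, eeg, axis=-1)

def accuracy_at_timepoint(epochs, labels, sample):
    # epochs: (n_trials=360, n_channels=61, n_samples), time-locked to movement onset
    X = epochs[:, :, sample]  # channel amplitudes at one time-point
    clf = LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto")  # shrinkage-regularized LDA
    cv = RepeatedStratifiedKFold(n_splits=10, n_repeats=10)            # 10x10-fold cross-validation
    return cross_val_score(clf, X, labels, cv=cv).mean()

Repeating accuracy_at_timepoint over all samples of the epoch yields an accuracy-versus-time curve per subject, as plotted in Fig. 1.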

Figure 1. Classification accuracies of all 15 subjects. Time point 0 s corresponds to the movement onset; the thick black line is the average; the horizontal solid line is the chance level; the horizontal dashed line is the significance level.
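The significance level (dashed line in Fig. 1) follows from the confidence interval of a random classifier. A minimal sketch of such an adjusted Wald (Agresti-Coull) computation is given below; the number of tested time-points entering the Bonferroni correction is not stated in the abstract and is assumed here, so the resulting threshold of roughly 24% is illustrative only.

import numpy as np
from scipy.stats import norm

def adjusted_wald_upper(p0, n_trials, alpha):
    # Upper bound of the adjusted Wald confidence interval for a random
    # classifier with chance accuracy p0 evaluated on n_trials trials.
    z = norm.ppf(1 - alpha / 2)
    k = n_trials * p0                  # expected number of correct trials by chance
    n_adj = n_trials + z ** 2
    p_adj = (k + z ** 2 / 2) / n_adj
    return p_adj + z * np.sqrt(p_adj * (1 - p_adj) / n_adj)

n_timepoints = 100                      # assumed number of tested time-points (not stated)
threshold = adjusted_wald_upper(p0=1 / 6, n_trials=360, alpha=0.05 / n_timepoints)
print(round(threshold, 3))              # roughly 0.24 under these assumptions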

Discussion: We have shown that low-frequency time-domain signals can be used to discriminate between different movements of the same upper limb. Classification accuracies peaked after the movement onset but already exceeded the significance level before the movement onset. This shows that upcoming movements can be classified from the movement planning phase, which is crucial for a BCI intended for end users with SCI who can no longer execute all movements. However, the classification accuracies are rather low, and further studies have to investigate whether user training improves them.

Significance: Low-frequency time-domain signals contain information about upcoming movements and may serve as control signals for future neuroprostheses. This would allow more degrees of freedom and a more natural control.

Acknowledgements: This work is supported by the European ICT Programme Project H2020-643955 "MoreGrasp" and the ERC Consolidator Grant "Feel Your Reach".

References

[1] Bradberry T J, Gentili R J, Contreras-Vidal J L. Reconstructing Three-Dimensional Hand Movements from Noninvasive Electroencephalographic Signals. J Neurosci, 30(9): 3432-3437, 2010.

[2] Lew E Y, Chavarriaga R, Silvoni S, Millan J del R. Single trial prediction of self-paced reaching directions from EEG signals. Front Neurosci, 8(222), 2014.

DOI: 10.3217/978-3-85125-467-9-69. Proceedings of the 6th International Brain-Computer Interface Meeting, organized by the BCI Society. Published by Verlag der TU Graz, Graz University of Technology.
