Session to session transfer learning using regularized four parameters common spatial pattern method

Zaineb M. Alhakeem, Ramzy S. Ali

Abstract


A brain-computer interface (BCI) has many useful applications for helping disabled people whose brains are active but who have difficulty moving and speaking. One such application is the wheelchair: this device is always operated by a single user, with no sharing or borrowing. Single-user applications need feature-extraction methods that achieve high classification accuracy from small training datasets. Variability in the subject's mood across recording sessions, and fatigue during long sessions, are serious problems that degrade classification accuracy in these applications. Transfer learning can address this problem by recording short, separate sessions for the same subject at different times or on different days. The method proposed in this paper uses motor imagery (MI) signals from different sessions recorded by one user to build a training dataset of acceptable size. To regularize the different recording sessions, four mutually independent tuning parameters are generated in a loop; these parameters are used to find the ratios of the covariance matrices. The suggested method performs very well across different numbers of training samples compared with six common spatial pattern (CSP) methods, using only two channels.
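The abstract does not give the paper's exact four-parameter formulation, but the general idea of regularized CSP with session mixing can be sketched as follows. This is an illustrative sketch only: the function names, the single mixing parameter `lam` (standing in for the paper's four tuning parameters), and the trial shapes are assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.linalg import eigh

def class_covariance(trials):
    """Average trace-normalized spatial covariance over trials.
    trials: array of shape (n_trials, n_channels, n_samples)."""
    covs = []
    for x in trials:
        c = x @ x.T
        covs.append(c / np.trace(c))
    return np.mean(covs, axis=0)

def regularized_csp(cur_a, cur_b, prev_a, prev_b, lam=0.3, n_filters=2):
    """CSP spatial filters where each class covariance mixes the current
    session with earlier sessions via lam (0 = current session only).
    lam is a generic stand-in for the paper's four tuning parameters."""
    ca = (1 - lam) * class_covariance(cur_a) + lam * class_covariance(prev_a)
    cb = (1 - lam) * class_covariance(cur_b) + lam * class_covariance(prev_b)
    # Generalized eigenvalue problem: ca w = mu (ca + cb) w.
    mu, w = eigh(ca, ca + cb)
    order = np.argsort(mu)
    # Eigenvectors at both extremes carry the most class-discriminative variance.
    pick = np.r_[order[:n_filters // 2], order[-(n_filters - n_filters // 2):]]
    return w[:, pick].T  # shape: (n_filters, n_channels)

# Usage with synthetic two-channel MI-like trials:
rng = np.random.default_rng(0)
make_trials = lambda: rng.standard_normal((5, 2, 50))
W = regularized_csp(make_trials(), make_trials(),
                    make_trials(), make_trials(), lam=0.3)
print(W.shape)  # (2, 2): two spatial filters over two channels
```

Projecting a trial as `W @ x` and using the log-variance of each filtered signal as a feature is the standard CSP pipeline this sketch feeds into.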

Keywords


Brain computer interface; Common spatial pattern; Electroencephalography; Motor imagery; Session to session learning



DOI: https://doi.org/10.11591/eei.v12i6.6079



This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.


Bulletin of Electrical Engineering and Informatics (BEEI)
ISSN: 2089-3191, e-ISSN: 2302-9285
This journal is published by the Institute of Advanced Engineering and Science (IAES) in collaboration with Intelektual Pustaka Media Utama (IPMU).