Developing Explainable Deep Learning Models Using EEG for Brain Machine Interface Systems

dc.contributor.advisor: Contreras-Vidal, Jose L.
dc.contributor.committeeMember: Parikh, Pranav J.
dc.contributor.committeeMember: Buckner, Cameron J.
dc.contributor.committeeMember: Mayerich, David
dc.contributor.committeeMember: Nguyen, Hien Van
dc.creator: Sujatha Ravindran, Akshay
dc.creator.orcid: 0000-0001-7213-9587
dc.date.accessioned: 2022-06-15T23:41:40Z
dc.date.created: December 2021
dc.date.issued: 2021-12
dc.date.submitted: December 2021
dc.date.updated: 2022-06-15T23:41:41Z
dc.description.abstract: Deep learning (DL) based decoders for brain-computer interfaces (BCIs) using electroencephalography (EEG) have gained immense popularity recently. However, the interpretability of DL models remains an under-explored area. This thesis aims to develop and validate computational neuroscience approaches that make DL models more robust and explainable. First, a simulation framework was developed to evaluate the robustness and sensitivity of twelve back-propagation-based visualization methods. When compared against ground-truth features, and after randomizing model weights and labels, multiple methods showed reliability issues: for example, the gradient approach, the most widely used visualization technique in EEG, was neither class- nor model-specific. Overall, DeepLift was the most reliable and robust method. Second, we demonstrated how model explanations combined with a clustering approach can complement the analysis of DL models applied to measured EEG in three tasks. In the first task, DeepLift identified the EEG spatial patterns associated with hand motor imagery in a data-driven manner from a database of 54 individuals. The explanations revealed different strategies used by individuals and exposed the limitations of restricting decoding to sensorimotor channels. The clustering approach improved decoding in high-performing subjects. In the second task, we used GradCAM to explain a convolutional neural network's (CNN) decisions when detecting balance perturbations while wearing an exoskeleton, a capability deployable for fall prevention. Perturbation-evoked potentials (PEPs) in EEG (∼75 ms) preceded both the peak in electromyography (∼180 ms) and the center-of-pressure response (∼350 ms). The explanations showed that the model relied on electro-cortical components of the PEP and was not driven by artifacts. They also aligned with dynamic functional connectivity measures and prior studies, supporting the feasibility of BCI-exoskeleton systems for fall prevention.
In the third task, the susceptibility of DL models to eye-blink artifacts was evaluated. Frequent blinks (present in 50% of trials or more), whether or not they biased a particular class, led to a significant difference in decoding performance when using a CNN. In conclusion, the thesis contributes to improving DL-based BCI decoders through model-explanation approaches. Specific recommendations and best practices for using back-propagation-based visualization methods in BCI decoder design are discussed.
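The model-randomization sanity check described in the abstract can be illustrated with a minimal, hypothetical sketch (not the thesis's actual code): for a toy linear model y = w·x, the gradient of the output with respect to the input is simply w, so a gradient-times-input attribution is w[i]·x[i]; a reliable explanation method should produce a clearly different map once the trained weights are replaced with random ones. All weights, inputs, and function names below are illustrative assumptions.

```python
import random

def grad_x_input(weights, x):
    """Attribution for a linear model y = w . x: elementwise gradient * input.
    For a linear model, dy/dx[i] = weights[i], so the map is weights[i] * x[i]."""
    return [w * xi for w, xi in zip(weights, x)]

def randomization_sensitivity(weights, x, seed=0):
    """Model-randomization sanity check: compare attribution maps computed
    with the trained weights vs. freshly randomized weights. Returns the
    mean absolute change; a method whose map barely changes is not
    model-specific (the failure mode noted for the plain gradient approach)."""
    rng = random.Random(seed)
    randomized = [rng.gauss(0.0, 1.0) for _ in weights]  # destroy the "training"
    a_trained = grad_x_input(weights, x)
    a_random = grad_x_input(randomized, x)
    return sum(abs(t - r) for t, r in zip(a_trained, a_random)) / len(x)

# Toy "trained" weights over four EEG-channel features (hypothetical values).
w_trained = [0.9, -0.2, 0.05, 0.4]
x_trial = [1.0, 2.0, -1.0, 0.5]
print(grad_x_input(w_trained, x_trial))               # per-channel attribution
print(randomization_sensitivity(w_trained, x_trial))  # > 0: map is model-specific
```

In a real EEG pipeline the "model" would be a trained CNN and the attribution would come from a method such as DeepLift, but the pass/fail logic of the randomization test is the same: the attribution map must depend on the learned weights.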
dc.description.department: Electrical and Computer Engineering, Department of
dc.format.digitalOrigin: born digital
dc.format.mimetype: application/pdf
dc.identifier.uri: https://hdl.handle.net/10657/9229
dc.language.iso: eng
dc.rights: The author of this work is the copyright owner. UH Libraries and the Texas Digital Library have their permission to store and provide access to this work. Further transmission, reproduction, or presentation of this work is prohibited except with permission of the author(s).
dc.subject: EEG
dc.subject: Deep Learning
dc.subject: Explainability
dc.subject: Interpretability
dc.subject: Brain-Machine Interface
dc.title: Developing Explainable Deep Learning Models Using EEG for Brain Machine Interface Systems
dc.type.dcmi: Text
dc.type.genre: Thesis
dcterms.accessRights: The full text of this item is not available at this time because the student has placed this item under an embargo for a period of time. The Libraries are not authorized to provide a copy of this work during the embargo period.
local.embargo.lift: 2023-12-01
local.embargo.terms: 2023-12-01
thesis.degree.college: Cullen College of Engineering
thesis.degree.department: Electrical and Computer Engineering, Department of
thesis.degree.discipline: Electrical Engineering
thesis.degree.grantor: University of Houston
thesis.degree.level: Doctoral
thesis.degree.name: Doctor of Philosophy

Files

Original bundle

Name: SUJATHARAVINDRAN-DISSERTATION-2021.pdf
Size: 13.3 MB
Format: Adobe Portable Document Format

License bundle

Name: PROQUEST_LICENSE.txt
Size: 4.44 KB
Format: Plain Text

Name: LICENSE.txt
Size: 1.82 KB
Format: Plain Text