Achieving Low-Cost and High-Efficient Robust Inference and Training for Convolutional Neural Networks

dc.contributor.advisorFu, Xin
dc.contributor.committeeMemberChen, Jinghong
dc.contributor.committeeMemberWu, Xuqing
dc.contributor.committeeMemberNguyen, Hien Van
dc.contributor.committeeMemberJoardar, Biresh Kumar
dc.creatorWang, Lening
dc.creator.orcid0000-0001-7717-1199
dc.date.accessioned2024-01-24T16:39:16Z
dc.date.createdAugust 2023
dc.date.issued2023-08
dc.date.updated2024-01-24T16:39:16Z
dc.description.abstractThe popularity of Convolutional Neural Networks (CNNs) has skyrocketed in recent years, making them one of the most widely used and influential deep learning architectures in computer vision and image recognition. However, the practical deployment of CNNs poses several notable challenges. The first concerns training efficiency, as the computational demands of training CNNs can be substantial. The second arises in the domain of security, as CNNs are susceptible to a variety of adversarial attacks, including adversarial inputs and backdoor insertions. This dissertation addresses both of these challenges. To improve training efficiency, two novel approaches are presented: WRR (Write Reduction on ReRAM) and BS-pFL (Bit Stream guided personalized Federated Learning). WRR introduces an architecture for an in-memory CNN training accelerator that leverages the emerging resistive random access memory (ReRAM). BS-pFL presents a lightweight pruning-based CNN training framework designed for edge devices; it improves training efficiency by predicting and pruning insignificant parameters using bit streams, eliminating the need to compute those parameters. To address the security concerns, two defense schemes are introduced: PV-NA (Process Variation Guided Neuron Aware Noise Injection) and LP-RFL (Label Guided Pruning for Robust Federated Learning). PV-NA is a noise-based defense scheme that specifically targets adversarial inputs; it exploits undervolting and process variation to generate diverse hardware-based noise patterns, effectively mitigating the impact of misleading adversarial noise during inference. LP-RFL is a low-cost weight-gradient-pruning-based defense scheme developed to counter backdoor insertions during training. By efficiently pruning the significant malicious weight gradients produced by backdoored training examples, LP-RFL prevents backdoor insertions and preserves the integrity of the CNN model.
dc.description.departmentElectrical and Computer Engineering, Department of
dc.format.digitalOriginborn digital
dc.format.mimetypeapplication/pdf
dc.identifier.citationPortions of this document appear in: L. Wang, M. Sistla, M. Chen, and X. Fu, “Bs-pfl: Enabling low-cost personalized federated learning by exploring weight gradient sparsity,” in 2022 International Joint Conference on Neural Networks (IJCNN), 2022, pp. 1–8; and in: L. Wang, Q. Wan, P. Ma, J. Wang, M. Chen, S. L. Song, and X. Fu, “Enabling high-efficient reram-based cnn training via exploiting crossbar-level insignificant writing elimination,” IEEE Transactions on Computers, pp. 1–12, 2023.
dc.identifier.urihttps://hdl.handle.net/10657/16028
dc.language.isoeng
dc.rightsThe author of this work is the copyright owner. UH Libraries and the Texas Digital Library have their permission to store and provide access to this work. UH Libraries has secured permission to reproduce any and all previously published materials contained in the work. Further transmission, reproduction, or presentation of this work is prohibited except with permission of the author(s).
dc.subjectMachine Learning
dc.subjectCNN
dc.subjectRobustness
dc.subjectSecurity
dc.subjectAccelerator
dc.titleAchieving Low-Cost and High-Efficient Robust Inference and Training for Convolutional Neural Networks
dc.type.dcmitext
dc.type.genreThesis
dcterms.accessRightsThe full text of this item is not available at this time because the student has placed this item under an embargo for a period of time. The Libraries are not authorized to provide a copy of this work during the embargo period.
local.embargo.lift2025-08-01
local.embargo.terms2025-08-01
thesis.degree.collegeCullen College of Engineering
thesis.degree.departmentElectrical and Computer Engineering, Department of
thesis.degree.disciplineElectrical Engineering
thesis.degree.grantorUniversity of Houston
thesis.degree.levelDoctoral
thesis.degree.nameDoctor of Philosophy

Files

License bundle

Name: PROQUEST_LICENSE.txt
Size: 4.43 KB
Format: Plain Text

Name: LICENSE.txt
Size: 1.81 KB
Format: Plain Text