Performance Tuning and Modeling of Communication in Parallel Applications

dc.contributor.advisor: Gabriel, Edgar
dc.contributor.committeeMember: Subhlok, Jaspal
dc.contributor.committeeMember: Shi, Weidong
dc.contributor.committeeMember: Gurkan, Deniz
dc.creator: Jha, Shweta 1986-
dc.date.accessioned: 2019-09-13T18:32:22Z
dc.date.available: 2019-09-13T18:32:22Z
dc.date.created: May 2017
dc.date.issued: 2017-05
dc.date.submitted: May 2017
dc.date.updated: 2019-09-13T18:32:24Z
dc.description.abstract: The goal of high performance computing is to execute very large problems in the least amount of time, typically by deploying parallelization techniques. However, introducing parallelization to an application also introduces synchronization and communication overhead, which in turn creates a performance bottleneck. Performance modeling and tuning can be used to predict and ease this bottleneck, improving the overall performance of the application. There are two aspects of an application that can be improved from a performance point of view: the computational section and the communication section. The time spent in communication operations is a major factor in determining the scalability of parallel applications. Tuning the parameters of a communication library can adapt its characteristics to a particular platform, minimizing the communication time of an application. On the other hand, performance modeling can be used to predict performance from network and application attributes. The goal of this dissertation is to improve the performance of a parallel application through performance tuning and performance modeling. Specifically, we introduce the notion of a personalized MPI library, highlighting the necessity of, and the methodology for, tuning a communication library to each application on a particular platform. Secondly, this dissertation contributes to the theoretical understanding of the impact and limitations of point-to-point communication performance on collective communication and the overall application. This study has been further extended to develop performance models for the communication aspect of collective I/O for one- and two-dimensional data decompositions, and for two file partitioning strategies, namely even and static partitioning.
dc.description.department: Computer Science, Department of
dc.format.digitalOrigin: born digital
dc.format.mimetype: application/pdf
dc.identifier.citation: Portions of this document appear in: Jha, Shweta, and Edgar Gabriel. "Impact and limitations of point-to-point performance on collective algorithms." In 2016 16th IEEE/ACM International Symposium on Cluster, Cloud and Grid Computing (CCGrid), pp. 261-266. IEEE, 2016. And in: Jha, Shweta, Edgar Gabriel, and Saber Feki. "A Personalized MPI library for Exascale Applications and Environments." Workshop on Exascale MPI at Supercomputing Conference 2014, November 17, 2014, New Orleans, LA, USA.
dc.identifier.uri: https://hdl.handle.net/10657/4509
dc.language.iso: eng
dc.rights: The author of this work is the copyright owner. UH Libraries and the Texas Digital Library have their permission to store and provide access to this work. UH Libraries has secured permission to reproduce any and all previously published materials contained in the work. Further transmission, reproduction, or presentation of this work is prohibited except with permission of the author(s).
dc.subject: Performance tuning
dc.subject: Performance models
dc.subject: Collective communication
dc.subject: Point-to-point communication
dc.title: Performance Tuning and Modeling of Communication in Parallel Applications
dc.type.dcmi: Text
dc.type.genre: Thesis
thesis.degree.college: College of Natural Sciences and Mathematics
thesis.degree.department: Computer Science, Department of
thesis.degree.discipline: Computer Science
thesis.degree.grantor: University of Houston
thesis.degree.level: Doctoral
thesis.degree.name: Doctor of Philosophy

Files

Original bundle
Name: JHA-DISSERTATION-2017.pdf
Size: 8.72 MB
Format: Adobe Portable Document Format

License bundle
Name: LICENSE.txt
Size: 1.81 KB
Format: Plain Text