Digitized Theses and Dissertations (1940 - 2009)
Permanent URI for this community: https://hdl.handle.net/10657/3932
About UH Libraries' Digitized Theses and Dissertations Project
University of Houston (UH) Libraries is engaged in a multi-year project to digitize and deliver online its collection of print theses and dissertations dating back to 1940, making the full breadth of scholarship produced by UH students more readily accessible around the world. There is no cost to the author for this service.
Alumni and other readers will be able to view these works as they are processed and made available through UH's open access repository. Works that are presumed to be under copyright will be restricted only to users who have an active CougarNet ID.
Please note, text may be faint or difficult to read, and pages may be missing or misnumbered in the print copies of theses or dissertations. UH Libraries staff have made every effort to provide the highest possible quality representation of the original works. To protect privacy and other rights, some personally identifiable information and/or copyrighted material is redacted from the works in this collection.
Theses and dissertations will continue to be made available to other libraries through interlibrary loan (ILL), just as they were when available only in print.
Requests for withdrawing works (except electronic theses and dissertations) must be directed to the online Takedown Request Form. Any other questions about this project may be directed to cougarroar@uh.edu.
Browse
Browsing Digitized Theses and Dissertations (1940 - 2009) by Department "Computer Science, Department of"
Item: 3D forward modeling by four way domain decomposition and four way I/O concurrent management with absorbing boundaries implemented on the Cray X-MP/48 (1985)
Juang, Shou-an; King, Willis K.; Johnson, Olin G.; Pyle, Leonard Duane; Gardner, Gerald H. F.
A 3D acoustic wave equation modeling program was designed and implemented on the Cray X-MP vector processor. Domain decomposition is used to split the problem into two or more concurrent sub-tasks. The program is designed to handle problems on the order of 256x256x128 spatial grid points and approximately 3000 time steps. The Fast Fourier Transform method is used as the computational basis of the 3D modeling program. A brief overview of the Cray X-MP system is given, and the algorithm used in solving the wave equation is also discussed. The design of the 3D forward modeling program has the following properties: (1) four way concurrent I/O management (SLICE4); (2) a physical model input facility; (3) domain decomposition; and (4) absorbing boundaries. The measured times, snapshots, and time section graphs of the test models are presented. Factors affecting the computing time and improvements to the 3D forward modeling program are also discussed. Finally, the physical model results (tank data) are compared with the numerical results of the forward modeling to obtain comparisons of seismic events.

Item: 3D high order finite difference modeling and migration program implemented on the Cray X-MP out-of-core version (1985)
Shen, Liyang, 1961-; King, Willis K.; Johnson, Olin G.; Pyle, Leonard Duane; Gardner, Gerald H. F.
This paper describes the development and testing of a vectorized, 3D out-of-core modeling/migration program with absorbing boundary conditions for the Cray X-MP supercomputers using a high order finite difference method. The program offers several alternatives for the user to select: (1) modeling or migration; (2) fourth order or sixth order finite differences; (3) with or without absorbing boundary conditions.
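As an illustration of the fourth-order finite-difference time stepping named in the abstract above, here is a toy 1D sketch; the function names, the 1D restriction, and the fixed boundaries are our own simplifications (the theses work in 3D, out of core, on vector hardware):

```python
# Toy 1D sketch of fourth-order finite-difference time stepping for the
# acoustic wave equation u_tt = v^2 * u_xx (constant density).
# All names here are illustrative, not taken from the theses.

def second_derivative_4th(u, h):
    """Fourth-order central approximation of d2u/dx2 at interior points."""
    d2 = [0.0] * len(u)
    for i in range(2, len(u) - 2):
        d2[i] = (-u[i-2] + 16*u[i-1] - 30*u[i] + 16*u[i+1] - u[i+2]) / (12*h*h)
    return d2

def wave_step(u_prev, u_curr, v, dt, h):
    """One explicit time step; the outermost points are simply held fixed."""
    d2 = second_derivative_4th(u_curr, h)
    return [2*uc - up + (v*dt)**2 * d
            for up, uc, d in zip(u_prev, u_curr, d2)]
```

The sixth-order variant mentioned in the abstract would widen the stencil to seven points; absorbing boundaries would replace the fixed-endpoint treatment above.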
In the program, density is assumed to be a constant. The velocity v(x,y,z) can vary with x, y, and z; the velocity matrix should be provided by the user. The program is aimed at very large data sets: to process a large amount of data efficiently, four parallel input/output channels are used to transfer data between main memory and disk, and the I/O is asynchronous.

Item: 3D migration on the Cray-XMP (1986)
Lhemann, Olivier; Johnson, Olin G.; Gardner, Gerald H. F.; King, Willis K.; Leiss, Ernst L.; Pyle, Leonard Duane
The implementation on the Cray X-MP of 3D migration algorithms derived from the Phase Shift method is presented. Due to the enormous amount of data to process, 3D migration algorithms are very demanding in I/O resources and CPU time. A package of subroutines, called Slice-4, has been designed to offer an efficient scheme for storing and retrieving data between disk and main memory. Efficiency is achieved through an appropriate storage scheme and through asynchronism and parallelism in the data transfers. The implementation of the Phase Shift migration algorithm using Slice-4 is described. This algorithm assumes the absence of lateral variations in the velocity field and is thus not appropriate for the migration of complicated geological structures. A new algorithm, called the Phase Shift Plus Correction (PSPC) method, is presented. In this algorithm, which takes lateral velocity variations into account, the migration is performed as a two-step process. In the first step, a downward extrapolation is performed using the Phase Shift algorithm and a constant velocity. In the second step, a correction term is introduced; this term is a function of the difference between the actual spatially varying velocity field and the constant reference velocity. Depending on the complexity of the correction term, several schemes are available. In the first order scheme, the correction term is merely a time-shift.
In the second order and higher-order schemes, corrections are performed for both the time-shift and the diffraction term. The interest of the PSPC algorithm is that, in contrast to other methods, the correction is performed without the use of a finite difference scheme. The method is thus more accurate, its stability conditions are minimal, and it is very fast. The algorithm is tested against real 2D and 3D data. Benchmark results obtained on a dedicated Cray X-MP are also presented.

Item: A 2D time reversal depth migration program for the Cyber 205 vector processor (1983)
Horng-Jyh, Yang; King, Willis K.; Johnson, Olin G.; Pyle, Leonard Duane; Gardner, Gerald H. F.
A time reversal two-dimensional (2D) depth migration program was designed and implemented on the CDC Cyber 205 vector processor. The program was designed to handle a 256x256 spatial grid. The Fourier method was used as the computational basis of the program. In this thesis, we first describe the depth migration method in mathematical generality. We then discuss the basic features of vector processors and present an overview of the Cyber 205 system. Finally, we give details of the program implementation. The purpose of this research was to implement the above migration program on a high performance vector processor in order to shorten processing time. Testing was performed by comparison to a program using the same algorithm on the VAX-11/780 with an FPS-100 Array Processor, written by Chung (1982). The measured computing time and plotted snapshots for various test cases are shown, and the results are discussed with respect to improvement of the computing time.

Item: A 3D forward modeling program for the Cyber 205 vector processor (1982)
Cheng, Tien You; Johnson, Olin G.; Pyle, Leonard Duane; Gardner, Gerald H. F.
A 3D acoustic wave equation modeling program was designed and implemented on the CDC Cyber 205 vector processor.
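The constant-velocity phase-shift extrapolation at the heart of the migration abstracts above can be sketched for a single Fourier component of the wavefield; the function name, arguments, and the evanescent-energy treatment below are our own illustrative choices, not code from the theses:

```python
import cmath
import math

def phase_shift_step(P, omega, kx, v, dz):
    """Downward-continue one (omega, kx) wavefield component by dz:
    multiply by exp(i*kz*dz) with kz = sqrt(omega^2/v^2 - kx^2).
    Evanescent components (kz^2 < 0) are exponentially attenuated."""
    kz2 = (omega / v) ** 2 - kx ** 2
    if kz2 <= 0.0:
        return P * math.exp(-math.sqrt(-kz2) * dz)
    return P * cmath.exp(1j * math.sqrt(kz2) * dz)
```

Because the operator only rotates the phase of propagating components, their magnitude is preserved exactly, which is why the method is unconditionally stable at constant velocity; the PSPC correction step described above then accounts for the departure of the true velocity field from the constant reference.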
The program was designed to handle a 256x256x128 spatial grid and approximately 3000 time steps within 40 hours of computing time. The Fourier method was selected as the computational basis of the 3D modeling program. Some basic features of vector processors, such as pipelining and vector instructions, are introduced. A brief overview of the Cyber 205 system and the algorithm used in solving the wave equation are also discussed. The design of the 3D forward modeling program has the following properties: (1) halfword (32-bit) storage and computation; (2) four way concurrent I/O management (SLICE4); (3) dynamic startup computation involving only those spatial points at which wave activity is present. The measured times and snapshots of the test models are presented. Factors affecting the computing time and improvements to the 3D forward modeling program are also discussed.

Item: A backup and recovery system for a relational database (1983)
Wang, Jim-mei Vivian; Huang, Stephen S. H.; Rusinkiewicz, Marek; Scamell, Richard W.
Database recovery is always of essential concern in a database processing environment. A database could be damaged in one of several ways, including system crashes, physical damage, inadvertent erasing by an operator, or an application program error. The effect and impact of such loss on the user's installation can be minimized through the use of database recovery programs. In this thesis, utilities for the REQUEST Relational Database System are implemented to protect the database by making periodic copies of the database and by recording data changes on the log/journal tape(s). This thesis discusses the design and implementation of REQUEST's five utility programs: the Database Image Copy Utility, Database Log/Journal Utility, Log Purging Utility, Log Cumulation Utility, and Database Restore Utility.
In the event of a failure in the REQUEST database, the latest 'good' copy is updated with the changes that have been logged in the journal since the copy was made, thus restoring the REQUEST database to its condition at the point of failure.

Item: A basic operating system for TI-960A minicomputer (1977)
Shang, Hen Kuo; Huang, J.
Presented in this thesis is an operating system designed for teaching minicomputer operating system fundamentals. With a minimum investment in a minicomputer, software support is needed to provide aids for student training in the development of programs. This TTYOS operating system was developed to meet that need. The computer used is a Texas Instruments Model 960A processor with 8K memory, two cassette tape drives, an interval timer, and a teletype. Major functions included in this operating system are input/output supervision, interrupt handling, program debugging, operator communication, storage assignment, and other utility services. A programmer's User's Manual and a source listing of this operating system are included.

Item: A basic real-time operating system for TI-960A minicomputer (1979)
Lee, Peter Ying-Huang; Huang, J. C.; King, Willis K.; Tavora, Carlos J.
Presented in this thesis is a real-time operating system for process control applications in which a fixed number of concurrent tasks are executed periodically with frequencies chosen by the operator. The computer used is a Texas Instruments Model 960A processor with 16K 16-bit words of memory, an interval timer, a teletype, and two cassette drives. Major functions included in this real-time operating system are an interrupt handler, a scheduler, a clock process, an operator communication process, and supervisor call subroutines.
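The fixed-frequency periodic dispatching described in the real-time operating system abstract above can be simulated in miniature; the task names, the time unit, and the tie-breaking rule below are our own assumptions:

```python
import heapq

# Minimal simulation of a fixed set of periodic tasks: each (name, period)
# task is released at t = 0, period, 2*period, ... and releases up to the
# horizon are dispatched in time order (ties broken by task name).

def schedule(tasks, horizon):
    heap = [(0, name, period) for name, period in tasks]
    heapq.heapify(heap)
    order = []
    while heap:
        t, name, period = heapq.heappop(heap)
        if t > horizon:
            continue                      # past the horizon: drop this task
        order.append((t, name))           # dispatch the task at time t
        heapq.heappush(heap, (t + period, name, period))
    return order
```

A real implementation, like the one the thesis describes, would be driven by the interval timer's interrupts rather than a simulated clock, but the release pattern is the same.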
A source listing of this operating system is included.

Item: A batch operating system for a Microdata 1600/30 : intercommunication between processes (1975)
Konrat, Jean-Luc
Part of the implementation of a general purpose batch operating system for a MICRODATA 1600/30 is presented in this thesis. This operating system, with a resident portion of only 4K bytes, works in a minimum configuration of 16K bytes. It may be used both in a batch configuration, with a spooling system, and in a stand-alone configuration. In this thesis, the system is described in terms of its decomposition into processes; a general scheme for intercommunication between processes is presented, and the control language is viewed as a tool for the description of the processes in the system. The memory constraints are solved by a succession of overlays, and the linkage to the context of a user program is demonstrated. The description of the rest of the system may be found in the thesis of Xavier Mangin [June, 1975]. Both theses are needed to get a full understanding of the operating system.

Item: A categorization of the theory of generalized-inverses of matrices (1970)
Dreussi, Joseph F.; Meicler, Marcel; Newhouse, Albert
This study is a survey of the theory of the generalized-inverses of matrices as defined by Penrose. The material has been categorized under the following major areas: the Generalized-Inverse, the Reflexive Generalized-Inverse, the Least Squares Generalized-Inverse, the Left Weak Generalized-Inverse, the Right Weak Generalized-Inverse, and the Pseudo-Inverse of matrices.

Item: A COBOL subroutine library for time calculations (1979)
Miller, Marilyn J.; Johnson, Olin G.; King, Willis K.; Gelb, Betsy D. B.; Newhouse, Albert
Subroutines are a well known enhancement to many data processing systems, but for some reason, business programmers working in COBOL have never developed a comprehensive subroutine library analogous to the IBM FORTRAN scientific subroutine package (SSP) or the IMSL library.
This thesis intends to show that there is a need for such a library and presents twenty-seven routines dealing with commonly needed time calculations. This would be only one "chapter" of a comprehensive collection which could also include such subjects as depreciation calculations, interest calculations, and income tax tables.

Item: A comparative study of height-balanced trees (1987)
Wang, Ronghuey Alice; Huang, Stephen S. H.; Simpson, Anne L.; Wu, Tiee-Jian
Height-balanced trees (H-trees), a recently proposed data structure, are a variant of B-trees. An H([beta], [gamma], [delta]) tree is defined by three parameters: [beta], the size of a node; [gamma], the minimal number of grandsons which a node must have; and [delta], the minimal number of leaves which bottom nodes must have. The purpose of this research is to study H-trees empirically. Algorithms to insert and delete elements are implemented. The results of our experiments confirm the validity of existing theories. For example, by varying the parameters [delta] and [gamma], significant changes in the tree's performance are observed: the height of H-trees decreases as [gamma] increases, and the storage utilization increases as [delta] increases. Moreover, comparisons of H-trees with other variants of B-trees demonstrate the superiority of H-trees. Specifically, the average storage utilization of H-trees may be higher than that of B-trees by almost 20%.

Item: A comparative study of several classes of binary split trees (1982)
Ong, Mei-Ling Lin; Huang, Stephen S. H.; Bastani, Farokh B.; Cheng, Philip E.
Split trees are a data structure for storing static records with skewed frequency distributions. Each node of a split tree contains two values, one being the key and the other a split value. By separating the split value from the key value we can "decouple" the conflicting functions of frequency ordering and subtree construction.
When a key is not found in the root, the split value is used as a guide for further search: by comparing the search value to the split value, rather than the key value, we decide whether to go right or left from a node for the remaining search. Several classes of suboptimal binary split trees (Median Split Trees, Weight Split Trees, and Restricted Median Split Trees) are discussed in this research. These suboptimal binary split trees can be constructed efficiently, with satisfactory performance in terms of average access cost. The definitions of, and algorithms for constructing, these suboptimal binary split trees are stated. Comparisons among these suboptimal binary split trees, optimal binary search trees, and optimal binary split trees are made by analyzing the behavior of extremal (best or worst) cases and by extensive computer simulations.

Item: A comparison of two hidden surface removal algorithms (1985)
Huang, Jen-Jer; Elmasri, Ramez A.; Pyle, Leonard Duane; Auchmuty, J. F. Giles
Recently there has been a greater need for using computers to produce pictures of 3D opaque objects. Because most practical display devices are 2D, 3D objects must be projected onto 2D display devices, with a considerable attendant loss of information that can sometimes create ambiguities in the image. In the real world, the opaque material of objects obstructs the light rays from hidden parts and prevents us from seeing them. Nevertheless, when we use a computer to generate an image, no such automatic elimination takes place when 3D objects are projected onto the 2D projection plane. Therefore, we must apply a hidden-line or hidden-surface removal algorithm to the set of objects.
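The lookup rule from the binary split tree abstract above can be sketched directly; the node layout and the example tree are hypothetical, invented only to show the mechanism:

```python
# Each split-tree node stores a key (with its record) plus a separate
# split value; on a key mismatch, the split value alone routes the
# search left or right.

class Node:
    def __init__(self, key, split, left=None, right=None):
        self.key, self.split = key, split
        self.left, self.right = left, right

def search(node, target):
    """Return True iff target is stored in the split tree."""
    while node is not None:
        if target == node.key:
            return True
        # the split value, not the key, guides the descent
        node = node.left if target < node.split else node.right
    return False
```

A frequently accessed key can therefore sit near the root even when it is nowhere near the median of the key ordering, which is the "decoupling" of frequency ordering from subtree construction that the abstract describes.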
In this thesis, we implement two hidden-surface removal algorithms and then compare their performance to give us clues as to which algorithm to use under different conditions.

Item: A computational study of a non-negative, fixed point formulation of the linear programming problem (1986)
Shih, Chao-tso; Pyle, Leonard Duane; Decell, Henry P., Jr.; Czejdo, Bogdan
Computer codes implementing Dantzig's Simplex Algorithm [2] are widely used to solve linear programming problems. Geometrically, the Simplex Algorithm generates a finite sequence of pair-wise adjacent vertices of the polytope of feasible solutions, and the structure of the Simplex Algorithm depends, in an essential way, upon the properties of vertices. In this thesis a non-vertex oriented alternative to the Simplex Algorithm is studied. An earlier version of a non-negative, fixed point formulation of the general linear programming problem is refined, and some computational experiments seeking to accelerate convergence of a related infinite process are reported. This infinite process involves the generation of a vector sequence converging to a non-negative fixed point, z = Pz, where P is an orthogonal projection matrix depending explicitly on A, b, c, and A+; A, b, and c define the related linear programming problem, and A+ is the generalized inverse of A [8]. The formulation z = Pz >= 0 is obtained by combining the primal, a problem equivalent to its dual, and the Duality Theorem of linear programming. The computational experiments reported treat the special class of problems called Transportation Problems [9], for which a closed form of the generalized inverse is known [10]. One scheme reported here was successful in reducing iteration counts, as compared to the method proposed in [10]. A conjecture is stated, based on observations made during the computational experiments, regarding upper and lower bounds on the range of the objective function.
It is noted that the non-negative fixed point formulation given in chapter II provides a non-trivial set of problems for which Karmarkar's method [5] fails.

Item: A computer assisted instruction system to teach the Fortran language (1983)
Le, Gia-Loi Thi; Czejdo, Bogdan; El-Asfouri, Souhail; Garson, James W.
This thesis discusses the use of computers in teaching programming languages. A system to teach FORTRAN to students who have no previous programming experience is proposed. The main functions of the system (instruction, simulation, and testing) are examined. The modules of the program designed to implement the system are analyzed, and the data structures for this program are described. Some possible extensions to the system are recommended for future consideration. The system is written in Pascal and implemented on the DEC VAX 11/780, using the special graphics mode of the VT100 terminal.

Item: A computer program for multiple-attributes multiple-alternatives decision making problem using fuzzy sets (1978)
Cheng, Yuen-Yee Marie; McInnis, Bayliss C.; Anderson, Robert B.; Schatz, Joseph A.
This thesis presents an algorithm for a multiple-attributes multiple-alternatives decision making problem based on fuzzy set theory. The computer program which implements the algorithm will handle a large data base defining imprecise relationships between input information and possible outcomes. An application of the algorithm to the problem of ranking possible diseases using chemical laboratory test data is given to show the capability of this program. The computer program has been designed to minimize memory size and computation time. The three main components of this program are: (1) determination of membership functions based on input data; (2) determination of possible outcomes using a fuzzy algorithm; and (3) ranking of the possible outcomes.
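The outcome-ranking step from the fuzzy decision making abstract above can be illustrated at toy scale; the data layout and the minimum (fuzzy 'and') combination rule are our own assumptions, not necessarily the thesis's scheme:

```python
# memberships: {outcome: {attribute: membership degree in [0, 1]}}.
# Each outcome is scored by its weakest attribute match (fuzzy 'and'),
# then outcomes are ranked best-first.

def rank_outcomes(memberships):
    scores = {o: min(d.values()) for o, d in memberships.items()}
    return sorted(scores, key=scores.get, reverse=True)
```

In the medical-diagnosis application the abstract mentions, the outcomes would be candidate diseases and the membership degrees would come from laboratory test data.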
Results of testing the computer program with the medical diagnosis problem and other decision problems are given.

Item: A data dictionary system for a high-level data model (1983)
Chang, Kei; Elmasri, Ramez A.; Rusinkiewicz, Marek; Scamell, Richard W.
Data is a resource in both its physical and descriptive aspects, and both should be managed like any other major organizational resource. A data dictionary system is a formal method to handle and control data information. It enables management to enforce data definition standards; it supplies information about the creation, usage, and relationships of data; it eliminates data redundancy and data inconsistency; and it aids the security of sensitive data definitions against unauthorized use. In this paper, we first introduce the data dictionary system concept and discuss the Entity-Category-Relationship (ECR) model. Then a data dictionary design based on the ECR model is presented, with both the method and examples of the schema and the data dictionary files. The creation, contents, functions, and relationships of the data dictionary files are discussed. In addition, function procedures which help users get information from the data dictionary files are presented.

Item: A data-flow analysis method (1976)
Lim, Kwang Cook; Huang, J.; Bau, Betty
Presented in this thesis is a method by which one can determine, for a given program graph, (1) what data definitions reach each node in the graph, (2) what data items have an upward exposed use at each node, and (3) what data definitions are "live" on each edge in the graph. The method is developed based on the topological sorting of edges and the acyclic image of a program graph. It is conceptually simple and easy to implement.

Item: A database design tool for the entity-category-relationship model (1985)
Huang, Doris C. P. Chin; Elmasri, Ramez A.; Johnson, Olin G.; Scamell, Richard W.
A user-friendly Database Design Tool based on the Entity-Category-Relationship (ECR) model is developed in this thesis. This tool provides the user with the following capabilities: (1) to create an ECR database schema by using the schema design system; (2) to specify transactions which can be used to update the database by using the transaction design system. This tool is an implementation of the GORDAS (Graph-Oriented Data Selection) DDL statements proposed for the ECR database model. An important feature of the database design tool is its user-friendliness: the user is guided through a list of options, straightforward questions, and help and schema display facilities. In doing this, the system converts the database/transaction design into a sequence of (1) selections from multiple choices, (2) answers to true/false questions, and (3) input in response to inquiries about key information such as datatypes, class names, and expressions. This approach makes database/transaction design easier and friendlier for a user who knows little of the ECR database model as well as for an experienced user, and the system can also be considered a training tool.
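Fact (1) from the data-flow analysis abstract above, reaching definitions, is a classic iterative fixpoint computation; the sketch below uses the standard gen/kill formulation rather than the thesis's topological-sorting method, and all identifiers are illustrative:

```python
# Reaching definitions over a program graph.
# nodes: list of node ids; edges: (src, dst) pairs;
# gen/kill: {node: set of definition ids generated/killed at the node}.

def reaching_definitions(nodes, edges, gen, kill):
    """Return {node: set of definitions reaching the node's entry}."""
    preds = {n: [s for s, d in edges if d == n] for n in nodes}
    IN = {n: set() for n in nodes}
    OUT = {n: set(gen[n]) for n in nodes}
    changed = True
    while changed:
        changed = False
        for n in nodes:
            new_in = set()
            for p in preds[n]:          # union of predecessors' exits
                new_in |= OUT[p]
            new_out = gen[n] | (new_in - kill[n])
            if new_in != IN[n] or new_out != OUT[n]:
                IN[n], OUT[n], changed = new_in, new_out, True
    return IN
```

The same framework, with the transfer functions changed, yields the upward-exposed-use and liveness facts the abstract lists as (2) and (3).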