State Mandated Summer Programs in Texas: Determining the Effect on High School Students' English Test Scores

dc.contributor.advisor: Mountain, Lee
dc.contributor.committeeMember: Hutchison, Laveria F.
dc.contributor.committeeMember: Day, Susan X.
dc.contributor.committeeMember: Hale, Margaret A.
dc.creator: Jones, Heather A.
dc.creator.orcid: 0000-0001-6581-0731
dc.date.accessioned: 2017-04-10T02:30:51Z
dc.date.available: 2017-04-10T02:30:51Z
dc.date.created: May 2015
dc.date.issued: 2015-05
dc.date.submitted: May 2015
dc.date.updated: 2017-04-10T02:30:53Z
dc.description.abstract: High-stakes testing holds schools accountable for student performance. When students fail standardized tests, schools are held responsible for raising student scores. On May 31, 2014, ninth-grade students in Texas took the STAAR English I End-of-Course exam. Of the more than 350,000 students in the state who took it, 28 percent failed the combined reading and writing exam (Texas Education Agency, 2014). The state requires that school districts provide “accelerated instruction” for any student who fails “to perform satisfactorily on an end-of-course assessment instrument required for graduation” (HB5, 2013, p. 36). A large suburban school district offered this accelerated instruction as a summer program in June 2014 to prepare students for the retest in July. Summer programs have long been used by many districts to further students’ coursework and are now more frequently used for test preparation. However, the research on whether summer programs raise test scores is inconclusive. Therefore, this study focused on whether one state-mandated summer program in Texas improved student test scores. It also compared the summer program’s curriculum and instructional strategies to best practices for teaching reading and writing. The study answered this research question: Was there a significant difference between the July passing rates of the students who took the summer program (Group A) and the students who did not (Group B)? The data were analyzed using an analysis of covariance (ANCOVA) to compare the difference in scores between the attendees (N = 390) and the non-attendees (N = 257); see the illustrative analysis sketch following this record. The demographic subgroups were then further analyzed using an ANCOVA to detect significant differences between and within groups. Attendance in the summer program was found to have a significant effect on students’ retest scores. Further demographic analysis showed that economically disadvantaged attendees and non-attendees had higher average scores as a group than the attendees as a whole. Additionally, the session a student attended (AM or PM) had a significant effect on the retest score. The comparison of the summer program curriculum to best practice showed that the program used some best-practice strategies, but the majority of its strategies fell under the category of test preparation. Interpretation of these results suggested that, although the summer program appeared to help 60 percent of its attendees pass the retest, it was not effective for the 40 percent of attendees who failed the exam a second time. Moreover, 47 percent of non-attendees passed the exam on their second try without attending the summer program. These results suggest that the summer program had limited educational significance.
dc.description.department: Curriculum and Instruction, Department of
dc.format.digitalOrigin: born digital
dc.format.mimetype: application/pdf
dc.identifier.uri: http://hdl.handle.net/10657/1699
dc.language.iso: eng
dc.rights: The author of this work is the copyright owner. UH Libraries and the Texas Digital Library have their permission to store and provide access to this work. Further transmission, reproduction, or presentation of this work is prohibited except with permission of the author(s).
dc.subject: Standardized testing
dc.subject: State mandated
dc.subject: Summer programs
dc.subject: Struggling readers
dc.subject: Struggling writers
dc.subject: Literacy
dc.subject: Secondary
dc.title: State Mandated Summer Programs in Texas: Determining the Effect on High School Students' English Test Scores
dc.type.dcmi: Text
dc.type.genre: Thesis
thesis.degree.college: College of Education
thesis.degree.department: Curriculum and Instruction, Department of
thesis.degree.discipline: Curriculum and Instruction
thesis.degree.grantor: University of Houston
thesis.degree.level: Doctoral
thesis.degree.name: Doctor of Education
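
The abstract above describes an ANCOVA comparing July retest scores between summer-program attendees (Group A, N = 390) and non-attendees (Group B, N = 257), with a prior score as the covariate. As a minimal sketch of how such an analysis might be run, the Python snippet below fits an ANCOVA with statsmodels. The column names (may_score, july_score, attended) and the synthetic data are illustrative assumptions, not the study's actual variables or data.

import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

rng = np.random.default_rng(0)

# Synthetic stand-in data sized like the study's groups:
# 390 attendees (Group A) and 257 non-attendees (Group B).
n_a, n_b = 390, 257
may = rng.normal(3500.0, 300.0, n_a + n_b)      # first-attempt scores (covariate)
attended = np.array([1] * n_a + [0] * n_b)      # 1 = attended the summer program
july = may + 120.0 * attended + rng.normal(0.0, 150.0, n_a + n_b)

df = pd.DataFrame({"may_score": may, "july_score": july, "attended": attended})

# ANCOVA: July retest score by group, controlling for the first-attempt score.
model = ols("july_score ~ C(attended) + may_score", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))          # F-test for the group effect

Subgroup comparisons like those in the abstract (for example, by demographic group or AM/PM session) would follow the same pattern, swapping C(attended) for the factor of interest.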

Files

Original bundle

Name: JONES-DISSERTATION-2015.pdf
Size: 1.89 MB
Format: Adobe Portable Document Format

License bundle

Name: LICENSE.txt
Size: 1.81 KB
Format: Plain Text