State Mandated Summer Programs in Texas: Determining the Effect on High School Students' English Test Scores

Date

2015-05

Abstract

High-stakes testing holds schools accountable for student performance. When students fail standardized tests, schools are held responsible for raising those students' scores.
On May 31, 2014, ninth grade students in Texas took the STAAR English I End-of-Course exam. Of the more than 350,000 students in the state who took it, 28 percent failed the combined reading and writing exam (Texas Education Agency, 2014). The state requires that school districts provide “accelerated instruction” for any student who fails “to perform satisfactorily on an end-of-course assessment instrument required for graduation” (HB5, 2013, p. 36). A large suburban school district offered this accelerated instruction as a summer program in June 2014 to prepare students for the retest in July. Summer programs have long been used by many districts to advance students’ coursework and are now increasingly used for test preparation. However, research on the effectiveness of summer programs for raising test scores is inconclusive. Therefore, this study focused on whether one state-mandated summer program in Texas improved student test scores. It also compared the summer program’s curriculum and instructional strategies to best practices for teaching reading and writing. The study answered this research question: Was there a significant difference between the July passing rates of the students who attended the summer program (Group A) and the students who did not (Group B)? The data were analyzed using an analysis of covariance (ANCOVA) to compare the difference in scores between attendees (N = 390) and non-attendees (N = 257). Demographic subgroups were then further analyzed with an ANCOVA to detect significant differences between and within groups. Attendance in the summer program was found to have a significant effect on students’ retest scores. Further demographic analysis showed that economically disadvantaged attendees and non-attendees had higher average scores as a group than the attendees as a whole. Additionally, the session a student attended (AM/PM) had a significant effect on that student’s retest score.
Comparison of the summer program curriculum to best practice showed that the program used some best-practice strategies, but the majority of its strategies fell under the category of test preparation. Interpretation of these results suggested that, although the summer program appeared to help 60 percent of its attendees pass the retest, it was not effective for the 40 percent of attendees who failed the exam a second time. Moreover, 47 percent of non-attendees passed the exam on their second try without attending the summer program. These results suggest that the summer program had limited educational significance.
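A minimal, hypothetical sketch of the ANCOVA described above, written in Python with pandas and statsmodels, models the July retest score on a summer-program attendance indicator plus the first-administration (May) score as the covariate. The file name and column names are illustrative assumptions, not the study's actual data or variable names.

import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical data: one row per student with the May score, the July retest
# score, and whether the student attended the mandated summer program.
df = pd.read_csv("staar_english1_retest.csv")  # assumed file name

# ANCOVA via OLS: group effect (attended vs. not) adjusted for the May covariate.
model = smf.ols("july_score ~ C(attended) + may_score", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))  # Type II ANOVA table with the adjusted group effect

The same model could be extended with additional categorical factors (for example, economic disadvantage or AM/PM session) to mirror the subgroup analyses described in the abstract.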

Keywords

Standardized testing, State mandated, Summer programs, Struggling readers, Struggling writers, Literacy, Secondary
