Wednesday, June 1, 2011

Report of Impacts of Library Summer Reading Program

Libraries around the country are poised to start their summer reading programs. I just ran into this evaluation report, completed in June 2010, on the impacts of library summer reading programs:

The Dominican Study: Public Library Summer Reading Programs Close the Reading Gap (http://www.dom.edu/academics/gslis/downloads/DOM_IMLS_book_2010_FINAL_web.pdf)

The report was done by Susan Roman, Deborah Carran, and Carole Fiore through Dominican University Graduate School of Library & Information Science. It was funded by an IMLS National Leadership Grant.

The study was a much-needed follow-up to 30-year-old seminal research on library reading programs and took on the ambitious scope of being a national study, looking at the effects of reading programs in several states across the country. The study sample was all 3rd grade students (going into 4th grade) from 11 schools in large and small communities in urban, rural, and suburban areas. Schools had to have 50% or more students receiving free/reduced lunch (a standard measure of children living in poverty). Researchers collected data using surveys of students, teachers, public librarians, and school librarians, as well as results from the Scholastic Reading Inventory administered before and after the summer reading programs.

Because the researchers were aiming for a broad, national scale, they were unable to create a carefully controlled environment for the research. The design was causal-comparative: there was no control group, students opted into the summer reading programs as they chose, and the summer reading programs (as well as the experiences of non-participants) varied.

Results of the research show some great data and insight into the value of summer reading programs as identified by students, parents, teachers, and librarians. These groups strongly believe summer reading programs make a difference. Unfortunately, the researchers were unable to demonstrate a correlation between participation in a summer reading program and increased scores on the Scholastic Reading Inventory (SRI). While students who participated in the program universally scored higher on the pre- and post-tests than students who didn't participate, there was no evidence that the program caused a further increase in these already-high scores.

Such a correlation would have been a real Holy Grail: demonstrated impact of the kind funders so often want to see. In fact, the results actually showed an increase in SRI scores for students who did not participate in the summer reading program. This is a puzzling result, since we usually take it for granted that reading levels drop over the summer months. As the researchers point out, they don't know what was going on for non-participants - maybe they were involved in alternative reading programs. Without the ability to control the context more, it's difficult to interpret these results.

I wonder if the researchers dug into the effects of the summer reading program while controlling for socio-economic status. For example, they could have modeled SRI scores with participation in the program and socio-economic status as independent variables, along with an interaction term. I'm imagining a regression along the lines of:

SRI score = B0 + B1(Program Participation) + B2(Socio-economic Status) + B3(Program Participation * Socio-economic Status) + error

My hypothesis is that perhaps there is a differential effect of summer reading programs: they are nice but not necessary for students from high-income families, while they are invaluable resources for students from low-income families. This would certainly support the idea of libraries as democratizing agents in communities.
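
Out of curiosity, here is a minimal sketch of what fitting that model could look like in Python with pandas and statsmodels. The data frame and column names (sri_score, participated, ses) are entirely made up for illustration; the study's actual data and variable coding aren't available here.

import pandas as pd
import statsmodels.formula.api as smf

# Toy stand-in data: one row per student (hypothetical values)
df = pd.DataFrame({
    "sri_score":    [610, 655, 580, 700, 640, 690, 560, 720],
    "participated": [0, 1, 0, 1, 0, 1, 0, 1],   # 1 = joined the summer reading program
    "ses":          [0, 0, 1, 1, 0, 1, 1, 0],   # 1 = higher-income proxy (assumed coding)
})

# SRI score ~ participation + SES + their interaction (the intercept B0 is implicit)
model = smf.ols("sri_score ~ participated * ses", data=df).fit()
print(model.summary())

A negative coefficient on the participated:ses interaction term would be consistent with the hunch above: the program's boost shrinks as family income rises.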

In the end, the researchers call for a more focused, controlled study of summer reading programs to drill down to quantifiable impacts. I am intrigued by their work so far and hope that it goes further.
