One of my favorite evaluation tools is the Card Sort. To understand visitors' perspective on something, hand them a stack of cards, each with a 1-2 word description related to the topic you have in mind. Ask visitors to pull out the words that best and least describe the topic (limit them to 3-4; many people will want to pull out 8 or 9 cards). Then follow up and ask them why they chose the cards they did.
When you analyze the data, it's interesting to add up the numbers for which cards were chosen the most and which were ignored the most. Add to that some really great qualitative information about why people made their selections. Often, you'll find that people interpret the words differently than you, as the researcher, did.
Here's an example. Let's say you want to know how people perceive the library's current fiction collection. Put together a list of 10-15 words that could possibly describe the collection - and don't be afraid to include some negative words (good selection, new materials, lots of options, worn out, not relevant to me, etc.). The words you choose are important. Keep them simple, so people can process them quickly. But be specific and even a bit daring - that will bring out interesting comments and discussion. Above all, make sure they are relevant to what you want to know about. Test your words with a few people from your organization, then try the list out on a few patrons before going live with the study.
As a rule of thumb, when you go live, keep asking people to do the card sort until the answers start to feel redundant. If you must have a number, I would recommend 30 people as a minimum.
To analyze results, tally up the total "best" and "worst" hits each word got. Then compare the reasons people gave for choosing those words. What patterns or themes do you see? One note: the more accurately you transcribe people's responses about why they chose words, the better your qualitative analysis will be. Resist the temptation to summarize people's statements when collecting the data. Try to write down what they say as close to word-for-word as possible. It's tough, but worth it. You don't want to add your layer of interpretation until all the data is collected.
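If you record each response as its own row (respondent, card, best/worst pick, verbatim reason), the tallying is quick to script. Here is a minimal sketch in Python; the CSV file and column names are hypothetical, just to show the idea, not a prescribed format.

import csv
from collections import Counter, defaultdict

# Hypothetical input: one row per card a respondent pulled, with columns
# respondent_id, card, pick ("best" or "worst"), reason (verbatim quote).
best = Counter()
worst = Counter()
reasons = defaultdict(list)

with open("card_sort_responses.csv", newline="") as f:
    for row in csv.DictReader(f):
        if row["pick"] == "best":
            best[row["card"]] += 1
        else:
            worst[row["card"]] += 1
        reasons[row["card"]].append(row["reason"])

# Show which cards were chosen most and which were largely ignored,
# keeping the verbatim reasons grouped by card for qualitative review.
for card in sorted(set(best) | set(worst), key=lambda c: best[c], reverse=True):
    print(f"{card}: best={best[card]}, worst={worst[card]}")

The grouped reasons dictionary is where the real analysis happens - read through each card's quotes looking for the patterns and themes described above.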
Tuesday, June 14, 2011
US Impact Study - National Research on the Benefits from Internet Access at Public Libraries
The University of Washington Information School recently released findings from the US IMPACT Public Library Study. This was a large-scale national study, funded by IMLS and the Bill & Melinda Gates Foundation, looking at internet use at public libraries.
The data was gathered at an impressive scale: telephone surveys, online surveys delivered via public library computers, and case studies/interviews at a few libraries.
The report lays out broad use statistics and demographics, but combines this with insightful and detailed analysis. It's a thought-provoking read (though time-consuming, unless you opt for the executive summary).
I'm a little wary of some of their findings since part of their data comes from people who are already library computer users (the online survey respondents). In evaluation-speak, this is selecting on the dependent variable, which could introduce bias into the results. They did have a substantial sample from the random phone surveys, and it looks like they did some creative statistical weighting to combine the phone and web samples and reduce bias as much as possible. But it's something to keep in mind.
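For readers curious what combining samples like that can look like, here is a toy post-stratification sketch in Python. It only illustrates the general idea of weighting a convenience sample back toward known population proportions; the study's actual weighting procedure is more sophisticated, and the categories and numbers below are made up.

# Toy post-stratification example (made-up numbers, not the study's method):
# weight each web-survey respondent so the sample's age mix matches the
# population age mix estimated from the random phone survey.

population_share = {"14-18": 0.10, "19-24": 0.12, "25-64": 0.60, "65+": 0.18}
sample_counts = {"14-18": 400, "19-24": 350, "25-64": 900, "65+": 150}
n_sample = sum(sample_counts.values())

weights = {
    group: population_share[group] / (count / n_sample)
    for group, count in sample_counts.items()
}

for group, w in weights.items():
    print(f"{group}: weight per respondent = {w:.2f}")

Groups over-represented in the web sample get weights below 1, under-represented groups get weights above 1, and weighted estimates lean on the phone survey's picture of the population.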
A few findings I found particularly interesting:
Almost 1/3 of Americans used their public library for internet access.
While there is higher use of library internet connections among people in households living below the poverty line (44%, higher for young adults and seniors), "people of all ages, incomes, races, and levels of education go to the library for Internet access, whether they have a connection at home or not". Libraries are still great agents of democracy - computer access is for everyone, used by everyone.
Internet access = young adult access. Young adults (14-18 year olds) are high library computer users. What a great initial step for libraries to involve this traditionally hard-to-reach group.
Patrons rely on library computers to take care of everyday routine tasks as well as to take life-changing steps.
Library computer access differs from other options for computers and wireless (cafes, etc.) because it is truly free (no feeling obligated to buy a drink first), it offers a quiet space for work, and it comes with staff who can help navigate the full range of computer and technology issues.
Library internet users can be segmented into three groups: power users, who use the library as their sole access point and come almost daily; supplemental users, who use the library internet routinely but have other internet options; and occasional users, who use the library internet in an emergency, during a time of transition, or for the quick occasional task.
Low-income patrons are less likely to use library internet access overall, but if they do use it, they are more likely to be very frequent users. Same for 19-24 year olds. Youth 14-18 are the most likely user group of library internet and are also very frequent users.
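If you wanted to replicate that kind of segmentation with your own patron survey, the classification can be as simple as a couple of cutoffs. The rules below are purely illustrative assumptions on my part - the study defines its segments from its own survey items, not these thresholds.

# Hypothetical rules for sorting respondents into the three segments;
# the cutoffs are illustrative, not taken from the US IMPACT study.

def segment(visits_per_week: int, has_other_access: bool) -> str:
    if visits_per_week >= 5 and not has_other_access:
        return "power user"
    if visits_per_week >= 1 and has_other_access:
        return "supplemental user"
    return "occasional user"

print(segment(6, False))  # power user
print(segment(2, True))   # supplemental user
print(segment(0, True))   # occasional user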
Labels: evaluation report, IMLS, internet access, library evaluation
Thursday, June 2, 2011
New survey resource! All Our Ideas
Check out this nifty new tool for doing a quick survey to rank items: http://www.allourideas.org/
This is a great resource for institutions like museums and libraries that want to poll visitors on potential new services or programs. Imagine one of those staff meetings where you create a wish list of things you'd like to do for your visitors. This list may look something like:
- Validate parking
- Provide more food options in vending/snack area
- Have more staff in exhibit areas to give visitors more information about exhibits
- Install more benches, chairs in gallery areas
- Section off part of the library as a "No Shhhh Zone" that people can use for group work
- Offer story time programs on Saturday mornings
It's great for staff to generate ideas about how to improve visitor services. With All Our Ideas you can quickly take it to the next step and ask visitors to respond to the wish list. The site takes your list and shows visitors two of the options at a time; they vote for one item over the other (or select "I don't know"). They repeat this process over and over, in a matter of seconds or minutes. The data is aggregated into a ranking of options, along with fun visualizations of the data.
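Under the hood, pairwise votes like these get rolled up into a ranking. All Our Ideas uses its own statistical scoring, but the basic intuition can be shown with a simple win-rate tally; the votes below are invented for illustration.

from collections import Counter

# Each vote is (winner, loser); invented example data, not site output.
votes = [
    ("Saturday story time", "Validate parking"),
    ("No Shhhh Zone", "Validate parking"),
    ("Saturday story time", "No Shhhh Zone"),
    ("More benches", "Saturday story time"),
]

wins = Counter(winner for winner, _ in votes)
appearances = Counter(idea for pair in votes for idea in pair)

# Rank ideas by share of matchups won (a rough stand-in for the site's scoring).
ranking = sorted(appearances, key=lambda idea: wins[idea] / appearances[idea], reverse=True)
for idea in ranking:
    print(f"{idea}: won {wins[idea]} of {appearances[idea]} matchups")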
A great feature of the tool is that respondents can also add their own idea. That idea is then put into the list to be voted on by others. What a fun way to bring fresh ideas to the table - and have immediate visitor feedback on them.
I will add my cautionary, unsolicited advice about asking visitors for ideas: ask about the things they are experts on, not the things you are an expert on.
For example, if you ask visitors "What kind of programs do you want us to do?", you inevitably get answers that are way out of your budget, mission, or capacity. But what do you expect? You're the expert in program development - you know the profession and the feasibility. On the other hand, if you ask parents/caregivers "What new play items would you like to see in the baby area?", they can fill you in on the latest trends in baby toys that they talk about every week at playgroup.
Keep visitors talking about what they know about, and you can translate that into exceptional experiences that meet their needs.
Labels: all our ideas, library evaluation, museum evaluation, online visitor surveys, visitor feedback
Wednesday, June 1, 2011
Report of Impacts of Library Summer Reading Program
Libraries around the country are poised to start their summer reading programs. I just ran into this evaluation report, published in June 2010, on the impacts of library summer reading programs:
The Dominican Study: Public Library Summer Reading Programs Close the Reading Gap (http://www.dom.edu/academics/gslis/downloads/DOM_IMLS_book_2010_FINAL_web.pdf)
The report was done by Susan Roman, Deborah Carran, and Carole Fiore through Dominican University Graduate School of Library & Information Science. It was funded by an IMLS National Leadership Grant.
The study was a much-needed follow-up to 30-year-old seminal research on library reading programs and took on the ambitious scope of a national study, looking at the effects of reading programs in several states across the country. The study sample was all 3rd grade students (going into 4th grade) from 11 schools in large and small communities in urban, rural, and suburban areas. Schools had to have 50% or more students receiving free/reduced lunch (a standard measure for children living in poverty). Researchers collected data using surveys of students, teachers, public librarians, and school librarians, as well as results from the Scholastic Reading Inventory administered before and after the summer reading programs.
Because researchers were trying to have a broad, national scale for the study, they were unable to create a carefully controlled environment for the research. The design was causal comparative - there was no control group, students opted into the summer reading programs as they chose, and summer reading programs (as well as the experiences of non-participants) varied.
Results of the research show some great data and insight into the value of summer reading programs as identified by students, parents, teachers, and librarians. These groups strongly believe summer reading programs make a difference. Unfortunately, researchers were unable to demonstrate a correlation between participation in a summer reading program and increased scores on the Scholastic Reading Inventory (SRI). While students who participated in the program universally scored higher on the pre- and post-tests than students who didn't participate, there was no evidence that the program caused a further increase in these already-high scores.
This would have been a real Holy Grail for demonstrating impacts in the ways funders so often want to see. In fact, results actually showed an increase in SRI scores for students who did not participate in the summer reading program. This is a puzzling result, since we usually take it for granted that reading levels drop over the summer months. As the researchers point out, they don't know what was going on for non-participants - maybe they were involved in alternative reading programs. Without the ability to control the context more, it's difficult to interpret these results.
I wonder if the researchers dug into the effects of the summer reading program while controlling for socio-economic status - for example, looking at SRI scores with participation in the program and socio-economic status as independent variables, along with an interaction term. I'm imagining a regression that looks like:
SRI score = B0 + B1(Program Participation) + B2(Socio-economic Status) + B3(Program Participation x Socio-economic Status) + error
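As a quick sketch of what fitting that model could look like in Python (with an entirely made-up dataset, just to show how the interaction term enters the model):

import pandas as pd
import statsmodels.formula.api as smf

# Made-up example data: participation (0/1), low_income (0/1), post-summer SRI scores.
df = pd.DataFrame({
    "participated": [1, 1, 0, 0, 1, 0, 1, 0],
    "low_income":   [1, 0, 1, 0, 1, 0, 0, 1],
    "sri_score":    [620, 710, 540, 700, 640, 690, 720, 530],
})

# OLS with both main effects and the participation x income interaction;
# the formula "participated * low_income" expands to all three terms.
model = smf.ols("sri_score ~ participated * low_income", data=df).fit()
print(model.summary())

A meaningful coefficient on the interaction term is what would signal the differential effect hypothesized below.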
My hypothesis is that perhaps there is a differential effect of summer reading programs - they are nice but not necessary for students from high-income families but they are invaluable resources for students from low-income families. This would certainly support the idea of libraries as democratizing agents in communities.
In the end, the researchers call for a more focused, controlled study of summer reading programs to drill down to quantifiable impacts. I am intrigued by their work so far and hope that it goes further.