How do you know what works and what doesn’t when it comes to child nutrition or related human services? Ever since cruise lines in the 1940s began building mock suites for passengers to try out before installing the rooms on ocean liners, businesses have been devising trial runs in which a small number of consumers test merchandise before mass production; drug companies do much the same with clinical trials. How do kids and parents feel about such trial runs, also known as usability testing, when they are applied to human services?
Who conducts nutrition trial runs, such as usability testing, with kids in foster care? Can human services or nutrition products be tested in trial runs with children and families as the users? How do you test the use, popularity, or health benefits of gadgets or any other products?
Today, companies still make important changes based on this “usability testing” before taking their goods to the wider market. According to the October 1, 2013 news release from the Frank Porter Graham Child Development Institute at the University of North Carolina at Chapel Hill (UNC), “Researchers bring product testing to foster care system,” UNC researchers are part of a team that says what works for cell phones and video games may also work for human services. You can read the original study or its abstract, “Usability testing, initial implementation, and formative evaluation of an evidence-based intervention: Lessons from a demonstration project to reduce long-term foster care.”
Karen Blase, senior scientist at UNC’s Frank Porter Graham Child Development Institute, and Mark Testa, a distinguished professor in UNC’s School of Social Work, collaborated with a research team from the University of Kansas and James Bell Associates to examine usability testing in the foster care system. They found that usability testing not only diagnosed what did and didn’t work with a new intervention for families; it also encouraged problem-solving and collaboration.
Business leaders have long understood the value of subjecting a product to intended and unintended uses before moving to full-scale production.
“That wrist strap on your video game remote is probably the result of an exuberant child accidentally flinging a prototype through a TV during pilot testing,” says Blase in the news release. Blase is a longtime expert on the science of putting ideas into practice. “A strap is a lot easier to add before you ship three or four million remotes.”
Despite usability testing’s history in the corporate sector, it remains new and under-researched in the fields of program evaluation and applied social science. The research team believed that, before an intervention is put widely into place, usability testing could create a critical window for analysis and improvement.
In 2010, the federal government created the Permanency Innovations Initiative (PII) to improve outcomes among children in foster care and designated six grantees with innovative interventions designed to help children find permanent homes. To determine usability testing’s effectiveness in a human services setting, the researchers examined a new intervention from one of these grantees: the Kansas Intensive Permanency Project (KIPP), a public-private partnership of the University of Kansas, the state’s Department for Children and Families, and the state’s network of private providers of foster care.
KIPP was establishing a new intervention for families to help children with “serious emotional disturbance” find permanent homes within three years; these children are more than three times as likely as their peers to remain in long-term foster care.
The new intervention aimed to enhance effective parenting practices to reduce the need for foster care. Usability testing assessed the viability and acceptance of the new intervention as the first families began to participate.
Together with Becci A. Akin, the study’s co-principal investigator from the University of Kansas, Blase, Testa, and their colleagues published the findings in the journal Evaluation and Program Planning. Just as the addition of a simple wrist strap has saved untold televisions from hurtling remotes, usability testing identified several challenges for KIPP leaders to address in order to avoid unwanted consequences later.
First, the testing identified the types of families least likely to accept the new intervention. Administrators then devised strategies that brought the participation rate up to their 70% target.
Usability testing also showed that a 7-workday timeframe for initial “evaluation assessments” was not feasible due to the need to schedule multiple parties while addressing logistical obstacles. As a result, KIPP administrators adjusted their expectations, doubled the timeframe, and relieved families and professionals of an unnecessarily stressful deadline.
Just as importantly, usability testing told KIPP what was working. Because the intervention required video recording each session, administrators were concerned that families might not engage in the process. However, nearly every parent who agreed to participate in the new intervention continued through several sessions. Although usability testing revealed that parents of older youth were less likely to consent to the intervention, on measures of engagement with families the intervention overall exceeded expectations.
Usability testing and leadership procedures that work or don’t work
Usability testing also encouraged front-line staff to report to leadership about procedures that didn’t work. In turn, staff combined efforts with their immediate supervisors and administrators to fix early problems. With such collaboration, procedural changes improved engagement with families, including new talking points tailored for parents based upon the age of their child.
Blase sees potential for further applications of usability testing in human service settings. “Making small tests of change, which translate to improvements early on, can prevent us from walking too far down a path that won’t lead us out of the woods,” says Blase in the news release. For more information, check out the sites of the National Implementation Research Network (NIRN), the Frank Porter Graham Child Development Institute, and the State Implementation and Scaling-up of Evidence-based Practices Center (SISEP).