This blog is part of a series in collaboration with Brock University. Written by a student in the ‘Program Evaluation in Professional Practice’ course, this blog details a student’s first-hand experience conducting a program evaluation during a placement with the Brock Niagara Penguins. The blog aims to provide reflections and best practices for sport stakeholders who are conducting program evaluations in similar contexts.
The Brock Niagara Penguins are a para sport organization in St. Catharines, Ontario, offering a variety of programs (e.g., swimming, sitting volleyball, wheelchair basketball) to youth and adults living with physical disabilities. The organization strives to provide a safe, barrier-free environment where participants can be physically active, enhance their confidence, and become well-rounded community leaders.
In March 2020, the COVID-19 pandemic forced the Ontario government to shut down non-essential businesses. Such closures prompted a shift from in-person to virtual programming for the Penguins—a reality for many sport organizations across the country.
As part of a field placement in the Penguins’ organization beginning in January 2021, we worked with stakeholders to design and implement an evaluation of their programming with the goal of exploring how the transition from in-person to virtual programming impacted athletes’ sense of self-confidence and physical activity levels.
Building It Up: Choosing the Right Tools
When building our evaluation plan, an important first step was to choose data collection methods that best fit our program context, stakeholders, and evaluation needs. To evaluate the transition from in-person to virtual programming at the Niagara Penguins, we selected two data collection methods: (1) surveys and (2) semi-structured interviews.
Surveys
Surveys were chosen as our primary data collection method because they allowed for flexibility in content and question style, as well as timely distribution (Newcomer & Triplett, 2015). Specifically, we chose electronic surveys because the pandemic required remote program delivery. Electronic surveys are a cost-friendly alternative to paper surveys, can be completed in a relatively short time frame, and can reach a broad audience (Newcomer & Triplett, 2015).
We created an electronic survey for this evaluation using a free online survey platform (Google Forms) and distributed a link to program participants via e-mail. This included all registered Penguins athletes who attended a virtual fitness program that ran once a week beginning in January 2021. Participants were surveyed in March, about three months into programming, and were asked to rate their confidence and physical activity levels while participating in the program.
In developing our survey, we learned that we needed to build it up and then knock it down to ensure that we were asking the right questions. For us, the “right” questions were:
- Clear, answerable, and to the point
- Relevant to our evaluation questions
- Engaging for participants without taking too much of their time
Nine participants who regularly attended the weekly fitness program returned the survey over a two-week period. Their responses helped us gain a sense of whether participants felt that virtual programming impacted their physical activity adherence and self-confidence amid the COVID-19 pandemic.
Semi-structured interviews
Semi-structured interviews (completed virtually) were chosen as a complementary data collection tool to further explore the impact of the Penguins’ virtual programming on participants’ physical activity levels and self-confidence. While the surveys allowed us to understand the “what,” semi-structured interviews enabled us to have in-depth discussions with our participants to understand the “how” and “why” (DiCicco-Bloom & Crabtree, 2006).
By using semi-structured interviews in our evaluation, we were able to learn more about how a sample of Penguins participants experienced virtual programming. However, we also learned that it is important to think about who to interview to best answer specific evaluation questions. For example, we chose to interview participants who had attended both in-person (pre-pandemic) and virtual programming (during the pandemic). It would not have been relevant to interview a participant who had experienced only one form of delivery, as we asked interviewees to compare their experiences. We also felt it was best to interview athletes, rather than coach facilitators, to get a perspective on the programming that was not shaped by a role in delivering it.
When using semi-structured interviews for program evaluations, we also learned that it is important to:
- Develop rapport with participants by offering to speak with them before the interview, providing sample interview questions, and clearly explaining their role in the study and how their responses will be used, so that they feel comfortable speaking freely and openly with you.
- Ask whether participants consent to having the interview recorded, which makes it easier to review and get a clear sense of their responses.
- Rehearse the interview guide (e.g., practice interviewing a colleague or friend) to maximize flow and enhance participant engagement when conducting the interview.
- Consider having an interpreter or translator present during the interview(s) to aid in language comprehension.
Completing follow-up interviews with seven of the nine surveyed athletes provided clarity around how virtual programming helped support athletes’ physical activity and confidence levels during the COVID-19 pandemic, as well as how it differed from in-person programming. For instance, one Penguins participant described how virtual programming, much like an in-person class, created a sense of accountability:
“It [virtual programming] sort of sets this routine where you think okay, I have to work out at this certain time… I can’t just put it off and make an excuse not to. There’s a scheduled appointment and people expecting you to be there.”
Knocking It Down: Lessons Learned Along the Way
Through our evaluation, we learned that the virtual programming run by the Niagara Penguins had a positive impact on participants’ physical activity levels and self-confidence. In addition, despite the many challenges that COVID-19 presented to community-based sport organizations, the pandemic offered several advantages for the evaluation process.
For instance, over the last year, the pandemic increased technology use and enabled us to explore virtual interview and survey platforms. Instead of creating hard-copy surveys and distributing them to select participants, we were able to share our survey electronically with all program participants. We were also able to hold interviews over a virtual meeting platform, where participants could feel comfortable in their own home environment and choose whether or not to use a camera. Interviews also took less time because participants did not have to travel to a physical program site. All of these factors increased participants’ willingness to take part.
Based on this evaluation, we encourage other community-based sport programs to consider leveraging their current resources to evaluate which parts of a program are working well and which areas could be improved, particularly given the recent transition of many programs into virtual settings. Every evaluation is different, and what we learned is that “building it up” and “knocking it down” are important steps in selecting the right questions, choosing the right tools to answer those questions, and ultimately answering those questions in a meaningful way.