The Sport Information Resource Centre

The Youth Concussion Awareness Network (You-CAN) is a novel, peer-led program focused on concussion education and awareness for high-school students across Canada. Findings from the use of the You-CAN program in school settings show that youth with higher concussion knowledge are more likely to report a concussion to an adult and to provide social support to a peer.

Using an evidence-based approach, the Coaching Association of Canada (CAC) developed tools to improve the experiences of coaches in mentorship programs. Training for Effective Mentees is a free resource that equips mentees with the knowledge, connections, and tools to create a better mentorship experience.

For students, a blog writing assignment can enhance learning of course concepts through opportunities to self-reflect and put the research into their own words. Blogs are also an important tool for delivering evidence-based research and information to sport stakeholders. In the SIRC blog, discover three steps that instructors can use to integrate blog writing into learning activities for the benefit of students and sport practitioners alike.

Blogs are an important tool for delivering evidence-based research and information to the sport community. In the SIRC blog, read how course instructors at Brock University used blog writing as a powerful teaching tool to reinforce student learning and mobilize sport and recreation research.

This blog introduces a new blog series written by students in the ‘Program Evaluation in Professional Practice’ and ‘Child and Youth Work in Community Recreation’ courses at Brock University. Drawing on research evidence and their own experiences, students in both classes wrote blogs focused on the application of course-related concepts, such as how to implement a program evaluation in a sport setting. The top blogs were published by SIRC and can be viewed here and here.

To kick off the series, course instructors Drs. Meghan Harlow and Corliss Bean describe how they used blog writing as a powerful teaching tool to reinforce student learning and mobilize sport and recreation research.

Blog writing and knowledge mobilization

With no shortage of sport-related research published each year, it is increasingly important that research evidence be tailored and accessible to knowledge users—including sport and recreation leaders, coaches, staff, volunteers, athletes, and policymakers. Knowledge mobilization is the process of translating academic research into easily accessible resources and products for knowledge users (Heilig & Brewer, 2019). By emphasizing clear, non-academic language, blog writing is one way of mobilizing knowledge.

Blogs are increasingly recognized as an effective platform to share research information with diverse audiences (Phipps et al., 2012). When used in academia, the publication of blogs through credible organizations can increase the likelihood that evidence-based research is delivered to those who can benefit from it most. With COVID-19 restrictions limiting in-person learning opportunities, blogging has become an innovative yet accessible learning tool throughout the pandemic.

Blogging could be considered part of an online ecosystem—a system of interconnecting and interacting social media platforms that empower researchers and practitioners to connect, share and collaborate.

(Stoneham & Kite, 2017, p. 333)

The blog assignment

Throughout each course, students were tasked with writing two blogs. The first blog required students to use academic literature to support the application of key course-related concepts in practice (e.g., how to implement a program evaluation in a sport setting, best practices for fostering youth development in a youth recreation context). The second blog required students to reflect on their personal, professional, and academic development through their placement experiences in a sport and/or recreation context. For example, two students worked with the Niagara Penguins, an organization that provides recreational and competitive sport programs for youth and young adults with disabilities, while another student worked with Start2Finish, a youth-serving organization that promotes the health and wellness of children through literacy and fitness education.

The top student contributions were selected, submitted to, and published by community sport and recreation knowledge brokers across the country, including Youth Research and Evaluation eXchange (YouthREX), Tamarack Institute, and SIRC.

The benefits of blog writing

Blog writing can enhance students’ learning of course concepts through opportunities to self-reflect and put research into their own words (Chretien et al., 2008). Blog writing can also develop students’ personal and professional skills, including written communication, critical thinking and reflection, and the application of research to practice. For example, one student described the blog writing assignment as a “really important exercise to explore my writing skills and get comfortable with using academic language or processes in casual or ‘plain language’ settings.”

Through the process of publishing student blogs with knowledge brokers such as SIRC, top students were provided with opportunities to collaborate with their instructors and people working in the sport and recreation sectors. Such initiatives can also benefit sport and recreation communities by working to bridge research and practice, and by reinforcing the value of students’ experiential education in sport and recreation settings.

3 steps to using blog writing in the classroom

Considering the potential value of blog writing for both students and the broader sport community, we reflect on three steps that instructors can use to integrate blog writing into their own classrooms.

1. Prepare the assignment: Create a blog writing assignment outline and decide on the desired structure of the blog(s), including length, focus, and topic. Consider what blog platforms (e.g., Wix, WordPress) could be used to host and share blogs for the purpose of the assignment. When introducing the assignment to students, discuss the mechanics of effective blog writing by reviewing examples of quality blogs for formatting, style, word count, and audience, while connecting them with plain-language writing tools.

2. Connect with potential partners: Reach out to organizations with blog platforms that complement your course learning objectives. Instructors who engage in community-based research may consider starting this process by leveraging existing relationships that they have within the community. Sharing student blogs from a class-based blog platform—giving partners an idea of what the student blogs might look like—may be a good place to start. Take the time to determine the desired parameters of the organization’s blog (e.g., timelines, scope, and formatting), and consider how you can make the partnership beneficial for all parties involved (i.e., organizations, students, and instructors).

3. Work with students: Be prepared to work with students to enhance, refine, and format their blogs to ensure they are high quality before forwarding them to partners for review, sharing, and publication. Organizations may also help amplify the blogs through sharing on their social media platforms.

Students’ perspectives

At the end of both courses, students were asked about their experience with blog writing. Several students described blog writing as a welcome opportunity to engage in creative, personal, and plain-language writing:

I like the opportunity for creativity in writing! …You can get across the same information in a blog assignment as an academic paper assignment but you can incorporate so much more of your tone, voice and personality!

Writing about my experience in plain language allowed me to share my experience more authentically.

It’s not often that I get to include my own personal thoughts and opinions rather than regurgitating what someone wrote in a published journal. It seems much less daunting to write an 800-word blog than an 800-word paper, something I appreciate during the chaos of fourth year and the pandemic.

With benefits for students, instructors, and knowledge users, blog writing is an innovative and accessible way to enhance student learning and mobilize research to sport and recreation communities.

This blog is part of a series in collaboration with Brock University. Written by a student in the ‘Program Evaluation in Professional Practice’ course, this blog details a student’s first-hand experience conducting a program evaluation during a placement with the Brock Niagara Penguins. The blog aims to provide reflections and best practices for sport stakeholders who are conducting program evaluations in similar contexts.

The Brock Niagara Penguins are a para sport organization in St. Catharines, Ontario, offering a variety of programs (e.g., swimming, sitting volleyball, wheelchair basketball) to youth and adults living with physical disabilities. The organization strives to provide a safe, barrier-free environment where participants can be physically active, enhance their confidence, and become well-rounded community leaders.  

In March 2020, the COVID-19 pandemic forced the Ontario government to shut down non-essential businesses. Such closures prompted a shift from in-person to virtual programming for the Penguins—a reality for many sport organizations across the country.

As part of a field placement in the Penguins’ organization beginning in January 2021, we worked with stakeholders to design and implement an evaluation of their programming with the goal of exploring how the transition from in-person to virtual programming impacted athletes’ sense of self-confidence and physical activity levels.  

Building It Up: Choosing the Right Tools

When building our evaluation plan, an important first step was to choose data collection methods that best fit our program context, stakeholders, and evaluation needs. To evaluate the transition from in-person to virtual programming in the Niagara Penguins, two different data collection methods were selected: (1) surveys and (2) semi-structured interviews.


Surveys were chosen as our primary source of data collection because they allowed for flexibility in terms of content and question style, as well as timeliness of distribution (Newcomer & Triplett, 2015). Specifically, we chose to use electronic surveys due to the remote nature of the pandemic. Electronic surveys are a cost-friendly alternative to paper surveys, can be completed in a relatively short time frame, and can reach a broad audience (Newcomer & Triplett, 2015).

We created an electronic survey for this evaluation using a free online survey platform (Google Forms), and distributed a link to the survey to program participants via e-mail. This included all registered Penguins athletes who attended a virtual fitness program that ran once a week beginning in January of 2021. Participants were surveyed in March, about three months into programming. On the surveys, participants were asked to rate their confidence and physical activity levels while participating in the program.

In developing our survey, we learned that we needed to build it up and then knock it down to ensure that we were asking the right questions. For us, the “right” questions were:

Over a two-week period, nine participants who regularly attended the weekly fitness program returned the electronic survey. This helped us gain a sense of whether participants felt that virtual programming impacted their physical activity adherence and self-confidence amidst the COVID-19 pandemic.

Semi-structured interviews

Semi-structured interviews (completed virtually) were chosen as a complementary data collection tool to help further explore the impact of virtual Penguins programming on participants’ physical activity levels and self-confidence. While the surveys allowed us to understand the “what,” semi-structured interviews enabled us to have in-depth discussions with our participants to understand the “how” and “why” (Crabtree & DiCicco-Bloom, 2006).

By using semi-structured interviews in our evaluation, we were able to learn more about how a sample of Penguins participants experienced virtual programming. However, we also learned that it is important to think about who to interview to best answer specific evaluation questions. For example, in our evaluation, we chose to interview participants who attended both in-person (pre-pandemic) and virtual programming (during the pandemic). It would not have been relevant for us to interview a participant who only experienced one form of delivery, as we asked them to compare their experiences. We also felt it was best to interview athletes, as opposed to coach facilitators, to get an unbiased perspective of programming.

When using semi-structured interviews for program evaluations, we also learned that it is important to:

Completing follow-up interviews with seven of the nine surveyed athletes provided clarity around how virtual programming helped support athletes’ physical activity and confidence levels during the COVID-19 pandemic, as well as how it differed from in-person programming. For instance, similar to an in-person class, a Penguins participant shared that:

“It [virtual programming] sort of sets this routine where you think okay, I have to work out at this certain time… I can’t just put it off and make an excuse not to. There’s a scheduled appointment and people expecting you to be there.”

Knocking It Down: Lessons Learned along the Way

Through our evaluation, we learned that the virtual programming run through Niagara Penguins did have a positive impact on participants’ physical activity levels, as well as self-confidence. In addition, despite the many challenges that COVID-19 has presented to community-based sport organizations, the pandemic offered several advantages for the evaluation process.

For instance, over the last year, the pandemic increased technology use and enabled us to explore virtual interview and survey platforms. Instead of creating hard-copy surveys and distributing them to select participants, we were able to share our survey electronically with all program participants. We were also able to hold interviews over a virtual meeting platform, where participants could feel comfortable in their own home environment and choose whether to use a camera. Interviews also took less time because participants did not have to travel to a physical program site to complete them, all of which increased participants’ willingness to take part.

Based on this evaluation, we encourage other community-based sport programs to consider leveraging their current resources to evaluate the parts of a program that are working well and the areas of a program that could be improved, particularly given the recent transition for many programs into virtual settings. Every evaluation is different, and what we learned is that “building it up” and “knocking it down” are important steps involved in selecting the right questions, choosing the right tools to answer those questions, and ultimately answering those questions in a meaningful way.

This blog is part of a series in collaboration with Brock University. Written by a student in the ‘Program Evaluation in Professional Practice’ course, this blog draws on a student’s first-hand experience conducting a program evaluation during a placement with the Bounce Back League. In this blog, Ashley Romano, a 4th year undergraduate student, offers practical tips to engage children in program evaluations, and make evaluation fun!

Designing activities that cater to children’s skills and interests is a routine part of any children’s sport program. In the context of program evaluation, choosing data collection methods that work for kids is just as important.

Research has shown children are capable, accurate, and valuable communicators, yet many data collection methods come with considerable limitations related to effectively engaging children and understanding their experiences (Driessnack & Furukawa, 2011). To overcome these limitations, creative approaches to data collection can account for children’s skills and abilities while making evaluation engaging and fun.

Using an evaluation of a children’s sport program—the Bounce Back League—as an example, this blog explores creative approaches to data collection, including movement-based methods, and offers tips and tricks for implementing movement-based methods with children.

Limitations of traditional data collection methods

Traditional approaches to data collection, such as surveys or assessments, may not be suitable for children for several reasons. First, methods that rely on adults to observe and/or report on children’s progress or experiences (e.g., physical literacy assessments, progress reports) diminish what children have to offer. Children think differently than adults, but not less (Carter & Ford, 2013). Second, many self-report techniques, such as surveys, were developed for adults and adapted to be “child friendly” (Driessnack & Furukawa, 2011), but these adaptations do not always ensure sensitivity to the childhood experience. Often, these methods receive low response rates, have poor reliability, and are influenced by social desirability (i.e., wanting to give the “right” answer; Soland et al., 2019). In contrast, creative approaches to data collection can be developmentally tailored, provide valuable insight directly from children’s experiences, and save time by doubling as a program activity.

Creative approaches to data collection

The term “creative” fits nicely within the scope of methods for collecting data with children. Creative approaches incorporate developmentally appropriate strategies, including, but not limited to, individual or group interviews, creative play, or creative thinking (Christian et al., 2010). Creative approaches resonate with the childhood experience and cater to children’s skills and interests (Carter & Ford, 2013). Movement-based methods are one example of a creative approach to data collection that is particularly relevant for evaluations of sport and physical activity programs. These methods tend to be low-cost, easy to implement, and typically well-received by children due to the active nature of involvement, and they can help improve children’s on-task behaviour and performance (Savina et al., 2016).

Movement-based methods can be understood as integrating physical activity into data collection, and they can take many forms, such as cooperative games or team sports in which children use the structure or rules of the game to provide feedback about a program or their experiences. For example, in Ship, Shore, Anchor, children run to the wall labelled “yes/agree” or “no/disagree” when questions are called out about the program. This method is explained in more detail below.

Movement-based methods can be used to capture how and why sport fosters development through children’s perspectives. Using movement is an engaging process that helps children ‘buy in’—and it is fun!

Movement-based methods in the Bounce Back League

The Bounce Back League (BBL) is a nationally run, trauma-sensitive sport program for children aged eight to 12. BBL uses a proactive approach to build children’s skills, including resiliency, as many who attend BBL are at risk of experiencing trauma or adversity. In the BBL, program evaluations are crucial to understand children’s program experiences, and given its sport-focused nature, movement-based methods are a natural fit. While several movement-based methods have been used to inform BBL evaluation, Ship, Shore, Anchor is a notable example.

Ship, Shore, Anchor uses space and movement to elicit children’s thoughts and ideas about their program experience and can be run by program staff or an external evaluator. To prepare, staff label one wall “yes/agree” and the opposite wall “no/disagree.” Another space or object (e.g., a hula-hoop or mat) is labelled the anchor—a neutral space for children who are unsure or do not feel comfortable answering. A staff member then calls out “I-statements” that relate to children’s experiences (e.g., “I feel safe at BBL”) or outcomes (e.g., “I learned to not give up when things are difficult”). How children respond to the questions can be documented by a facilitator. At the end of the activity, staff can foster a reflective discussion in a debrief with the children that is aligned with the evaluation purpose. This discussion can include questions such as “What skills did you use today?” or “What did you learn about yourself through this activity?”. If feasible, the debrief can be audio-recorded or notes can be taken and analyzed as part of the evaluation data.

Tips and tricks for implementing movement-based methods

From player recruitment and retention, to leadership development, to effective coach education, SIRC’s Researcher/Practitioner Match Grant recipients will address a range of issues to help achieve gender equality in the Canadian sport system. Learn more about this initiative and all the sport organization/university matches that were supported. Outcomes and key learnings from all projects will be shared by SIRC to support evidence-based program planning and policy development.

In the newest SIRCTalks episode, Dr. Nicole LaVoi, Director of The Tucker Center for Research on Girls and Women in Sport, discusses the importance of using data and evidence to facilitate social change. Watch the episode here.

SIRCTalks is an exciting new series of SIRC videos and podcasts profiling innovative areas of sport research. Some of the latest episodes feature insight from Dr. Joe Baker (York University) on athlete development, Dr. Tara-Leigh McHugh (University of Alberta) on the effect of major games on Indigenous youth, and Mike Bara (Hockey Canada) on using research to inform injury prevention. Click here for more episodes.