PME 802: Program Inquiry and Evaluation
StrongStart BC: Program Evaluation Design
Step 1: Select and Describe Program Context
Program Focus:
StrongStart BC is an early childhood education program offered to children ages 0-5 in British Columbia. It provides “rich learning environments designed for early learning development – language, physical, cognitive, social and emotional” (StrongStart BC, n.d.).
Program Goals:
“The overall goal of StrongStart BC is to support the development of young children consistent with the goals of the British Columbia Early Learning Framework and to provide opportunities for adults to observe and practice effective strategies that support early learning” (StrongStart BC Early Learning Programs, n.d.).
Program Details:
There are over 329 StrongStart BC programs in British Columbia, each led by certified early childhood educators and offered at no cost to families. The learning services are play based and include music, art, dramatic play, puzzles, blocks, and story time. “StrongStart BC’s early learning centres are located in school facilities and operate five days per week, for a minimum of three hours per day. Centres are most often open in the mornings, Monday to Friday” (StrongStart BC Early Learning Programs, n.d.).
Community Demographics:
StrongStart BC’s targeted demographics are:
-families or caregivers with children ages 0-5
-families or caregivers with varying socioeconomic statuses
Step 2: Identify Purpose for Evaluation and Specify Evaluation Questions
Program Inquiry Purpose:
The purpose of this evaluation is to assess StrongStart BC’s effectiveness in achieving its intended goals. The evaluation approach best suited for this program evaluation is impact evaluation. “Impact Evaluation assesses program effectiveness in achieving its ultimate goals” (Types of Evaluation, n.d.).
Key Evaluation Questions:
-Have StrongStart BC’s early learning centres attracted new families and raised community awareness of the program?
-Have StrongStart BC’s early learning centres supplied children with the academic and social skills needed for Kindergarten?
-After attending the program, are children demonstrating a preparedness for Kindergarten?
-Has StrongStart BC provided parents/caregivers with skills to support their child’s learning?
-How well did the program work and did it produce the intended outcomes in the short and long term?
-What particular features of the program are working well?
Step 3: Construct Program Theory
If parents/caregivers and their children are provided with the knowledge, resources, and opportunities for social and academic interactions within the StrongStart BC program, then the participants will demonstrate a developmental readiness for Kindergarten.
Logic Model:
Step 4: Identify, Describe, and Rationalize your Evaluation Approach
Evaluation Approach:
The evaluation approach I plan to use for this program evaluation is impact evaluation. StrongStart BC wants to understand how the program can be improved each year, based on its short-term objectives. Peersman (2015), writing for BetterEvaluation, adds that “...the findings of an impact evaluation can be used to improve implementation of a programme for the next intake of participants by identifying critical elements to monitor and tightly manage”, which is the purpose of this evaluation. After the evaluation is conducted, the program coordinators for StrongStart BC will have an in-depth understanding of its effectiveness in preparing children and families for Kindergarten in British Columbia.
Step 5: Identify Data Collection Methods and Analysis Strategies
As noted in the logic model above, this evaluation will collect quantitative data (e.g., surveys, attendance numbers, and Early Development Instrument (EDI) assessments) as well as qualitative data (e.g., observations, anecdotal notes, discussions, reflections, and open-ended questionnaire responses). Quantitative data will, however, mainly determine whether the intended results have been achieved.
The results from this impact evaluation will provide StrongStart BC with the following:
The percentage of parents/caregivers who feel sufficiently skilled in supporting their child’s development based on what they learned while participating in the program:
- The data collected here would be quantitative and gathered via a survey administered at the end of the program year
- The survey would use a three-point scale: yes, somewhat, and no
- A section at the bottom of the survey would allow parents/caregivers to explain their rating
Feedback from parents/caregivers regarding the effect the program had on them and their child’s development.
- This data would be qualitative and gathered via an open-ended questionnaire administered at the end of the program year
The percentage of children who display sufficient academic and social knowledge required for kindergarten as per the EDI assessment
- This quantitative data would be collected through an EDI assessment conducted on each child
- The results from each EDI assessment would provide the evaluator with data that could be converted into the percentage of children with “sufficient” versus “insufficient” academic and social skills
The percentage of families registered for the program who actually attend the weekly sessions
- This data would be gathered quantitatively through registration and attendance records
- This data would then be converted into attendance-rate percentages
- These percentages would also be displayed as a bar graph depicting yearly trends (a minimal analysis sketch follows this list)
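The percentage calculations described above are simple tallies and ratios. The following is a minimal sketch of how they might be carried out, assuming the survey responses and registration/attendance records are exported as CSV files; the file names, column names, and the pandas/matplotlib tooling are illustrative assumptions, not part of the StrongStart BC design.

# Minimal analysis sketch (assumed inputs: survey_responses.csv with a
# "rating" column of yes/somewhat/no answers, and attendance.csv with
# "year", "registered", and "attended" columns).
import pandas as pd
import matplotlib.pyplot as plt

# Parent/caregiver survey: percentage of each rating (yes / somewhat / no)
survey = pd.read_csv("survey_responses.csv")            # one row per respondent
rating_pct = (survey["rating"]
              .value_counts(normalize=True)             # proportion of each rating
              .mul(100)
              .round(1))
print("Caregiver self-reported skill ratings (%):")
print(rating_pct)

# Registration vs. attendance: yearly attendance rate shown as a bar graph
attendance = pd.read_csv("attendance.csv")
attendance["attendance_rate"] = attendance["attended"] / attendance["registered"] * 100

attendance.plot(x="year", y="attendance_rate", kind="bar", legend=False)
plt.ylabel("Attendance rate (%)")
plt.title("StrongStart BC: weekly session attendance by year")
plt.tight_layout()
plt.savefig("attendance_trends.png")                    # yearly-trend bar graph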
Step 6: Describe Approach to Enhance Evaluation Use
To increase utility for StrongStart BC, this evaluation will focus on “process use.” Process use is defined as “knowledge gained through the course of conducting program evaluation” (Peck & Gorzalski, 2009). Multiple studies have affirmed the benefits of engaging stakeholders and sharing new learnings throughout the evaluation process. Strategies for enhancing evaluation use have been described by Patricia Rogers, CEO of BetterEvaluation. Rogers (2018) suggests scheduling a series of analysis and reporting cycles throughout the year and involving both evaluators and stakeholders in reporting the findings, which is crucial for process use. By following these strategies, I believe this evaluation will have a positive impact on the StrongStart BC program and that its findings will be put to use.
Step 7: Commitment to Standards of Practice
This impact evaluation will be conducted in the interest of StrongStart BC and is designed to comply with the Program Evaluation Standards (Sanders, 1994). Because of the nature of this program, certain standards have been highlighted due to their significance in this evaluation. Drawing on the summary of the standards, this evaluation promises to:
Utility Standards:
i) U5 Report Clarity- clearly describe the program, its context, purposes, procedures, and findings in the evaluation so that essential information is provided and easily understood.
ii) U6 Report Timeliness and Dissemination- disseminate all evaluation findings in a timely fashion to all intended users.
iii) U7 Evaluation Impact- plan, conduct, and report this evaluation so follow-through by stakeholders is increased.
Feasibility Standards:
i) F1 Practical Procedures- make evaluation procedures practical and keep disruption to a minimum while obtaining information.
Propriety Standards:
i) P3 Rights of Human Subjects- design and conduct this evaluation to respect and protect the rights and welfare of the human subjects.
ii) P4 Human Interactions- respect human dignity and worth in any interactions with persons associated with this evaluation, so that no participants are threatened or harmed.
iii) P5 Complete and Fair Assessment- be fair and record the strengths and weaknesses of the program being evaluated, so that the strengths can be built upon and the problem areas addressed.
Accuracy Standards:
i) A8 Analysis of Quantitative Information- appropriately and systematically analyze the quantitative information obtained by the evaluation, so the guiding questions can be effectively answered.
ii) A9 Analysis of Qualitative Information- appropriately and systematically analyze the qualitative information obtained by the evaluation, so the guiding questions can be effectively answered.
iii) A10 Justified Conclusions- explicitly justify all conclusions reached in the evaluation, so the stakeholders can assess them.
iv) A11 Impartial Reporting- guard against distortion caused by the personal feelings and biases of any party to the evaluation, so the evaluation report fairly reflects the evaluation findings.
References:
British Columbia Ministry of Education. (2007, September). Evaluation of StrongStart BC. Retrieved from the province of British Columbia’s website: https://www2.gov.bc.ca/gov/content/education-training/early-learning/teach/strongstart-bc
British Columbia Ministry of Education. (n.d.). StrongStart BC Early Learning Programs. Retrieved from the province of British Columbia’s website: https://www2.gov.bc.ca/gov/content/education-training/early-learning/teach/strongstart-bc
Peck, L. R., & Gorzalski, L. M. (2009). An Evaluation Use Framework and Empirical Assessment. Journal of MultiDisciplinary Evaluation, 6(12), 139-156.
Peersman, G. (2015). Impact evaluation. BetterEvaluation. Retrieved from http://www.betterevaluation.org/themes/impact_evaluation
Rogers, P. (2018). 7 strategies to improve evaluation use and influence. BetterEvaluation. Retrieved from https://www.betterevaluation.org/en/blog/strategies_for_improving_evaluation_use_and_influence
Sanders, J. R. (1994). The program evaluation standards: How to assess evaluations of educational programs. Sage.
Types of Evaluation. (n.d.). Retrieved from https://www.cdc.gov/std/Program/pupestd/Types%20of%20Evaluation.pdf

