Program Review - Alice Jamieson Girls' Academy: An internal evaluation conducted by Dr. Ursula Steele for Dr. Brendan Croskery, Chief Superintendent of Schools, Calgary Board of Education.
Introduction
This program evaluation was performed by the Calgary Board of Education to determine whether the Alice Jamieson Girls’ Academy, an alternative program, was effective in implementing the program’s ten protocols. Under the Calgary Board of Education Administrative Regulations, a program evaluation is required within three years of a program being created, a requirement reinforced by the accountability plan outlined in the Alberta Education Alternative Programs Handbook.
Model / Methodology
The main purpose of the evaluation was to conform to the Alberta Education accountability plan by ensuring that: students are not disadvantaged in their achievement; students and parents are satisfied with the program; the program is attaining the intended results; and the program is maintaining the elements that make it alternative (Alberta Education, 2010). To this end the evaluation would be summative; however, the Calgary Board of Education analyzed the same data in order to make recommendations on ways of improving the program, and in doing so the evaluation would also be formative.
The methodology fits with Provus’s Discrepancy Evaluation Model (DEM) in that it explores the program’s design and purpose, its ability to meet that design and purpose, whether the resources and techniques being used fit the design and purpose, and whether the major objectives of the program were achieved. The DEM works to provide greater accountability of educators to the public, which is also the purpose of Alberta Education's accountability plan.
Pros
Context
The evaluation begins with background information on why the school was established and the foundations on which it is based. This is followed by a clear explanation of the purpose of the evaluation, supported by various governmental and organizational regulations. This provides relevance, justification, and a sense of validity to the evaluation.
Follows Guidelines
The evaluation does what it intended to do in that it evaluates the program against the program's protocols. Each protocol is explained, followed by a comment on what was being done in the school to meet it; this appears to be based on evaluator observations. In a later section each protocol is revisited, this time supported with quantitative and qualitative data collected through a number of means.
Data
Multiple data sets were used, including a survey of students, parents, and staff; data from the Provincial Achievement Tests for Grade 6 and 9 students; and the Accountability Pillar Survey, which is administered by Alberta Education and given to Grade 4 and Grade 7 students as well as parents and teachers. This information allows for greater comparisons to be made. Both quantitative and qualitative data are examined, which allows for the analysis of personal experience.
Format / Readability
The evaluation is well organized in a sequential order, which makes it much easier to follow. However, this may not have been difficult to do since the evaluation is rather short, especially when one considers that one quarter of it is simply background or procedural information. The voice used is professional but not overwhelming, and could be considered business casual rather than black tie, to make an analogy to clothing.
Cons
Evaluator & Participant Bias
The evaluators have a vested interest in the survival of the program, as they were all involved in its creation and initiation. The main evaluator is the Principal of the school in which the Girls’ Academy is housed. Therefore, I believe there may be a conflict of interest. The same is true when looking at the participants in the survey: the teachers involved likely want the program to continue, both for job retention and because they have been involved in the development and running of the program.
Because this is an alternative program, parents pushed for it to be established and believe in its values, as they felt it was a better fit for their children. Even though they are removed from the day-to-day running of the program, this may call into question subject effects in the results. The same is true for the students, who may be influenced by demand characteristics.
Surveys & Quantitative Data
The surveys are provided in the appendix, which was nice to see; however, in the evaluators' attempt to keep the wording consistent, some of the language may be difficult to understand for younger students, such as those in Grades 4 and 5, English Language Learners, or students with other needs. If teacher support or explanation was required, this could also have had an impact on how the students responded.
The evaluators seemed to be selective about which anecdotal data was shared in the report, as the vast majority of it was positive. This is possible, yet when the anecdotes are compared with the statistical representations, things do not match up. The data provided reinforces the positive aspects of the program, what is being worked on in the program, or what is being recommended for the program.
The anecdotal comments included in the evaluation are used in place of summaries, rather than the evaluators summarizing the full set of comments themselves. This likely served a dual purpose: to show parental and student support, and to decrease the workload for the evaluators. It would have been preferable to include all of the anecdotal comments in an appendix.
The sample size for the surveys is small. The student response rate is close to 80%, which is impressive; however, only eighty-two parents and five teachers responded. No data is provided regarding the total number of possible respondents, but it is believed, based on estimations, that these numbers represent less than 50% of the total.
The quantitative data used for the evaluation is based on provincial standardized testing and does not account for variables such as previous experiences (SES, schools, languages, etc.). When looking at the Grade 3 data, the students would not yet have been at Alice Jamieson Girls’ Academy, making it difficult to compare that data to the Grade 6 and Grade 9 data. Even the Grade 6 and 9 data is difficult to use because of students entering and leaving the program. Some also question the validity of such standardized tests, and the Alberta Government is phasing out the Provincial Achievement Tests beginning in September 2014.
Thoughts
Overall, the program evaluation meets the needs of the review and is clear and easy to follow. Recommendations and conclusions are provided, though they are minimal, and as a result the evaluation reads more as praise for the program than as a review of it. However, this is to be expected given who was asked to perform the evaluation. It would have been worthwhile to see whether an external review would have produced consistent findings; however, given the situation and a lack of funding, an external review would be unlikely because of the extra costs involved.
Alberta Education. (2010). Alternative programs handbook 2010. Retrieved from http://www.education.alberta.ca/media/434640/alternative programs handbook policy branch edits march 27 2009 april 13 2010 kwrt.pdf
Croskery, B., & Steele, U. (2007). Program review - Alice Jamieson Girls' Academy. Retrieved from http://www.cbe.ab.ca/Trustees/reports/alicejamieson18dec07.pdf