Appendices

Appendix 1: Changes From 2005/06–2007/08 Service Plan Update (September 2005) to 2006/07–2008/09 Service Plan

1. Ministry Overview and Core Business

  • The Ministry’s Mission Statement was changed to more accurately reflect the purpose of the Ministry of Education.
  • The Ministry’s Values Statement was also adapted to better describe the principles which guide the work of the Ministry of Education.

2. Strategic Context

  • The Ministry’s strategic focus for the next three years was adjusted to reflect changes in the operating environment.

3. Core Business

  • No substantive changes since the September 2005 Update.

4. Resource Summary

  • For this reporting period, the Ministry did not have any major capital projects to report (the Ministry has defined “major capital projects” as having a threshold of $50 million or greater).

5. Goals, Objectives, Strategies and Results

  • A section was added that explicitly links the Ministry’s work to the Government’s great goals.
  • The changes to the Ministry’s objectives, strategies and performance measures from the previous 2005/06–2007/08 Service Plan (September Update) can be found at: http://www.bced.gov.bc.ca/annualreport/2006/appl.pdf.

Appendix 2: The Accountability Framework

The Ministry’s Accountability Framework36 focuses school and school board attention and resources on improving student achievement. Key elements of the Accountability Framework include:

School Plans — each school has a School Plan, developed by its School Planning Council. The focus of the annual School Plan is on improving achievement for all students in the school. The Council considers classroom, school and district data related to student achievement, and this information is shared with the school community. Progress is assessed in relation to the plan from the previous year, and schools are expected to monitor and adapt their strategies in order to better meet the needs of students.

School Plans are subject to school board approval and plans may be approved, approved with modifications, or rejected. School boards must also consult with School Planning Councils regarding:

  • The allocation of staff and resources in the school;
  • Matters contained in the board’s Accountability Contract that pertain to the school; and
  • Educational programs and services in the school.

School Plans developed by School Planning Councils inform district Accountability Contracts.

Each School Planning Council comprises three parents, one teacher, the school principal and, in schools that have Grades 10 to 12, one student, so that the Council reflects the characteristics, values and needs of the school community. School Planning Councils are advisory bodies whose main responsibility is to assist and guide planning at the school level. School Planning Councils reach out to all members of the school community, which consists of all parents, students, teachers and support staff.


36  For more information on the Accountability Framework, please visit:
http://www.bced.gov.bc.ca/policy/policies/accountability_framework.htm.

Accountability Contracts (District Plans) — each of the 60 school districts must complete an Accountability Contract every year. An Accountability Contract is a school board’s public commitment to improve student achievement, and is based on thoughtful consideration of students’ performance information. Accountability Contracts reflect the unique characteristics, priorities and needs of each school district. School boards submit their Accountability Contracts to the Minister on or before October 31 each year, and make them available to the public. The Ministry monitors Accountability Contracts and provides feedback to districts through Deputy Minister’s visits, Ministry contact, and District Reviews.

District Reviews — District Reviews are intended to improve student achievement through a focused review of district results, followed by the sharing of effective practices throughout the province. The Reviews are designed to provide feedback and recommendations to the district, the Ministry and the public regarding the school districts’ work in improving student achievement. District Reviews focus on ten key areas related to school and district improvement, and 20 districts are reviewed annually, with results made publicly available. In each district, a team chaired by a superintendent and composed of educators, parents and ministry staff will:

  • review district and school achievement data;
  • review the district accountability contracts and school plans to improve student achievement;
  • make recommendations to the board and to the Minister about improving student achievement; and
  • identify promising practices that will assist other districts and schools in their efforts to improve student achievement.

Aboriginal Education Enhancement Agreements — British Columbia schools have not been successful in ensuring that Aboriginal students receive a quality education, one that allows these students to succeed in the larger provincial economy while maintaining ties to their culture. Growing recognition of this problem led to the signing of a Memorandum of Understanding in 1999: “We the undersigned, acknowledge that Aboriginal learners are not experiencing school success in British Columbia. We state our intention to work together within the mandates of our respective organizations to improve school success for Aboriginal learners in British Columbia.”

The Memorandum of Understanding led to a framework for the creation of Aboriginal Education Enhancement Agreements (EAs). Through these agreements, new relationships and commitments were made to improve the educational success of Aboriginal students. An EA is a working agreement between a school district, all local Aboriginal communities and the Ministry of Education, intended to continually improve the quality of education achieved by Aboriginal students. The EA establishes a collaborative partnership between Aboriginal communities and school districts that involves shared decision-making and specific goal setting to meet the educational needs of Aboriginal students.

Enhancement Agreements highlight the importance of academic performance and, more importantly, stress the integral role of traditional Aboriginal culture and languages in Aboriginal student development and success. Fundamental to EAs is the requirement that school districts provide strong programs on the culture of the local Aboriginal peoples on whose traditional territories the districts are located.

Enhancement Agreements:

  • support strong cooperative, collaborative relationships between Aboriginal communities and school districts;
  • provide Aboriginal communities and districts with greater autonomy to find solutions that work for Aboriginal students, the schools and the communities; and
  • require a high level of respect and trust to function effectively.

Appendix 3: Data Confidence for Performance Measures

1: Data Confidence — Completion Rate

Source: Ministry of Education — data for Completion Rates are based on the Ministry’s 1701 form, student transcripts, and provincial exams.

Collection Method: This measure quantifies the proportion of Grade 8 students who graduate within six years of entering secondary school for the first time. Schools and school districts provide the Ministry with the number of courses leading to graduation and the school marks for both examinable and non-examinable courses. This data is then transferred into the Education Data Warehouse (EDW), where it undergoes a final check.

Data Reliability: Completion Rate estimates are based on a statistical model that corrects for out-migration, which introduces a small degree of error into the estimates. Results depend on accurate PENs and on schools providing course data on time. The system is checked twice a year to ensure that no PEN is shared by more than one student (less than 0.002 per cent of records).
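The calculation described above can be sketched in a few lines. This is an illustrative simplification only, not the Ministry's actual statistical model, and the function name and all numbers below are hypothetical:

```python
def completion_rate(cohort_size, graduates, est_out_migrants):
    """Six-year completion rate: graduates as a share of the entering
    Grade 8 cohort, with the denominator reduced by estimated
    out-migration (students who left the province, not the system)."""
    adjusted_cohort = cohort_size - est_out_migrants
    return graduates / adjusted_cohort

# Hypothetical cohort: 50,000 Grade 8 entrants, 38,400 graduating within
# six years, 2,000 estimated to have moved out of the province.
rate = completion_rate(50_000, 38_400, 2_000)
print(f"{rate:.1%}")  # 80.0%
```

Without the out-migration adjustment the denominator would be too large, understating the rate; this is the error the statistical model is designed to correct.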

Reporting Period: The data are collected from July through October, with the results publicly reported in November of each year.

Timeliness: The rate is available in February each year.

2: Data Confidence — Adult Literacy

Source: The Adult Literacy and Life Skills Survey, The National Center for Education Statistics

Collection Method: Data were collected from a survey of individuals, administered for the first time in 2003 and published in May 2005.

Data Reliability: The survey is an accepted standard for large-scale adult literacy assessment.

Reporting Period: Provincial results were released in November 2005.

Timeliness: The turnaround from the time the survey was administered to the time the results were made public was 18 months.

3: Data Confidence — Reading and Numeracy Skills

Source: Foundation Skills Assessment (FSA) tests are written by students in Grades 4 and 7 at school, and are administered by school staff.

Collection Method: Each answer sheet has the student’s Personal Education Number (PEN) affixed. The tests are collected at the school and sent to the Ministry of Education for marking. Multiple-choice answers are scanned and open-ended responses are marked by specially trained teachers. All results are then transferred into the Education Data Warehouse (EDW), where they undergo a final check.

Data Reliability: The FSA is a standardized measure, which is designed by B.C. teachers to reflect British Columbia’s school curriculum.

Reporting Period: The FSA results are reported publicly on the Ministry of Education website in September of each year.

Timeliness: The turnaround time for the FSA tests is approximately four months, from the time they are written to when the results are reported. The data are therefore available quickly enough for educators and partners to plan and respond effectively to changes in achievement.

Participation Rates (per cent):

                         2000/01  2001/02  2002/03  2003/04  2004/05
Grade 4
  Reading Comprehension     93       92       93       91       92
  Writing                   93       92       92       91       91
  Numeracy                  93       92       93       91       92
Grade 7
  Reading Comprehension     93       92       92       92       92
  Writing                   93       92       92       91       91
  Numeracy                  93       91       92       91       92

4: Data Confidence — School Readiness

Source: The Human Early Learning Partnership, University of British Columbia (funded by the Ministry of Children and Family Development, the Ministry of Education, and the Ministry of Health).

Most commonly, Early Development Instrument (EDI) results are mapped by “average score” and by “percentage vulnerable.”

  • Average score: This method simply takes the average EDI score on a given domain of all of the surveyed children living in a particular area. At an individual level, scores may range from 0 to 10 on any given subscale of the EDI, with higher numbers indicating a higher readiness for school.
  • Percentage vulnerable: The lowest scoring ten per cent of children province-wide are deemed to be vulnerable. The “percentage vulnerable,” then, refers to the percentage of children in a given area that fall into this category. To illustrate: if vulnerable children were spread evenly around the province, every region’s “vulnerability value” would be exactly 10 per cent. Of course, such an even distribution is not the case, and as a result, some regions have nearly half of their children in the bottom 10 per cent, while others have none.
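The two mapping methods above can be sketched as follows. This is a minimal illustration with made-up, uniformly random scores standing in for real EDI data, and the function name pct_vulnerable is invented for this sketch:

```python
import random

# Hypothetical EDI-style scores (0-10) for children province-wide;
# real EDI data come from teacher questionnaires, not random draws.
random.seed(1)
province = [random.uniform(0, 10) for _ in range(10_000)]

# Average score for an area: simply the mean of its children's scores.
avg_score = sum(province) / len(province)

# Vulnerability cutoff: the 10th percentile of all scores province-wide.
cutoff = sorted(province)[len(province) // 10]

def pct_vulnerable(region_scores, cutoff):
    """Share of a region's children scoring below the province-wide cutoff."""
    below = sum(1 for s in region_scores if s < cutoff)
    return 100 * below / len(region_scores)

# A region that were a perfectly even slice of the province would sit at
# 10 per cent; real regions vary widely around that value.
region = province[:1_000]
print(round(pct_vulnerable(region, cutoff), 1))
```

By construction, the whole province scores exactly 10 per cent vulnerable against its own cutoff; the interest is in how far individual regions deviate from that baseline.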

Collection Method: The EDI forms are distributed to Kindergarten teachers who then fill out a questionnaire for each student.

Data Reliability: Although it is not a perfect measure, a great deal of background work has been done on the EDI. It has been validated for a wide range of populations in urban, rural, and remote communities, and communities with particular social and cultural compositions (e.g., Aboriginal communities, inner-city communities, affluent suburban communities, etc.). It has proven to be a useful and reliable instrument, and has since been used in jurisdictions across Canada, the United States, Australia, Chile, and several other countries. British Columbia is the first province or state in the world to have administered the EDI to its entire population of five-year-old Kindergarten students (over a period of three years).37

Reporting Period: The EDI is administered in February of each year and reported publicly on the Human Early Learning Partnership website.

Timeliness: The data meets the criteria for timeliness, in that it is intended to measure the experiences of children upon entering school (mid-way through the Kindergarten year).


37  For more information about EDI data, please visit: http://ecdportal.help.ubc.ca/aboutedidata.htm.

5: Data Confidence — National and International Assessments (SAIP/PCAP and PISA)

Student Achievement Indicators Program (SAIP)

Source: SAIP is coordinated by the Council of Ministers of Education, Canada (CMEC), which oversees the administration, data collection, analysis, storage and reporting of information.

Collection Method: The data are collected from randomly selected samples of 13- and 16-year-olds in B.C. Approximately 25,000 English- and French-speaking Canadian students participated; in B.C., 1,800 students from more than 170 schools participated.

Data Reliability: Results are reported with confidence intervals, which discourage simple “ranking” and instead encourage grouping jurisdictions whose results are not statistically significantly different.

Reporting Period: For SAIP Science III, 2004, the tests were administered over a four-week period in April and May 2004, and the results were reported publicly on the CMEC SAIP38 website in fall 2005.

Timeliness: The turnaround for SAIP data is approximately 18 months, from assessment to publication.

Programme for International Student Assessment (PISA)

Source: PISA is coordinated by the OECD (Organization for Economic Co-operation and Development). In Canada, Statistics Canada and the Council of Ministers of Education, Canada (CMEC) oversee the administration, data collection, analysis, storage and reporting.

Collection Method: The data are collected from a randomly selected sample of B.C. 15-year-olds. Forty-one countries participated in PISA 2003; in Canada, approximately 28,000 students from over 1,000 schools participated.

Data Reliability: The results are reported publicly for all 41 countries. Canadian provinces over-sample in order to report results at the provincial level. Results are reported with confidence intervals, which discourage simple “ranking” and instead encourage grouping jurisdictions whose results are not statistically significantly different.

Reporting Period: The testing takes place in the spring of every third year (2000, 2003, 2006, etc.), and the results are reported approximately a year and a half later (e.g., the PISA 2003 results were published in late 2004).

Timeliness: PISA is administered every three years. Initial results are reported approximately 18 months later, with secondary analysis and additional reports being published throughout the years between administrations.


38  http://www.cmec.ca/saip/indexe.stm.

6: Data Confidence — Rates of Tobacco Use in Youth

Source: Community Health, Education and Social Services Omnibus Survey (CHESS) — BC STATS

Collection Method: The survey is conducted monthly, and administered by telephone.

Data Reliability: The margin of error is ±1.3 per cent for the full sample; it increases as the size of the sub-sample examined decreases.

Reporting Period: Survey results are reported twice a year, in July and December. Beginning in July 2005, two CHESS data files are released each year: one with data collected from January to June, and one in January with the July to December data.

Timeliness: Unlike many other methods of data collection, CHESS data is reported in a “continuous time” series.

7: Data Confidence — Rates of Physical Activity in K–12 students

Source: Satisfaction Survey, Ministry of Education.

Collection Method: Paper or electronic surveys. The data is transferred into the Education Data Warehouse, where it undergoes a final check for accuracy.

Data Reliability: An annual technical analysis of the survey questions is conducted and those analyses have shown the survey instrument to be valid and reliable.

Reporting Period: Satisfaction Surveys are administered on paper and electronically each year from January to March, and the results are released in late May on the Ministry’s website.

Timeliness: The turnaround time from data collection to publication is brief — three months — rendering the data timely and relevant.

8: Data Confidence — Satisfaction

Source: Satisfaction Survey, Ministry of Education.

Collection Method: Paper or electronic surveys. The data is transferred into the Education Data Warehouse, where it undergoes a final check for accuracy.

Data Reliability: An annual technical analysis of the survey questions is conducted and those analyses have shown the survey instrument to be valid and reliable.

Reporting Period: Satisfaction Surveys are administered on paper and electronically each year from January to March, and the results are released in late May on the Ministry’s website.

Timeliness: The turnaround time from data collection to publication is brief — three months — rendering the data timely and relevant.

9: Data Confidence — Transition Rate to Post-Secondary Education

Source: Graduate Transition Survey, Ministry of Education

Collection Method: A telephone survey of 1,000 secondary school graduates is conducted.

Data Reliability: Results can be viewed with considerable confidence. Maximum sampling error is within ±3.1 per cent, 19 times out of 20.
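The quoted figure can be checked against the standard simple-random-sample formula. This is a worked illustration under the worst-case assumption p = 0.5, not necessarily the exact methodology the Ministry used:

```python
import math

# Margin of error for a simple random sample at 95 per cent confidence
# ("19 times out of 20"): z * sqrt(p * (1 - p) / n), with z = 1.96.
# p = 0.5 is the worst case, maximizing p * (1 - p).
n, p, z = 1000, 0.5, 1.96
margin = z * math.sqrt(p * (1 - p) / n)
print(f"±{margin:.1%}")  # ±3.1%
```

A sample of 1,000 respondents thus yields a maximum margin of error of about ±3.1 percentage points, matching the figure stated above.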

Reporting Period: Results are reported annually each Fall, on the Ministry’s website.

Timeliness: The turnaround time from data collection to publication is brief — two months — rendering the data timely and relevant.

10: Data Confidence — Transition Rate to Employment

Source: Graduate Transition Survey, Ministry of Education

Collection Method: A telephone survey of 1,000 secondary school graduates is conducted.

Data Reliability: Results can be viewed with considerable confidence. Maximum sampling error is within ±3.1 per cent, 19 times out of 20.

Reporting Period: Results are reported annually each fall, on the Ministry’s website.

Timeliness: The turnaround time from data collection to publication is brief — two months — rendering the data timely and relevant.

11: Data Confidence — Achieving Results

Source: British Columbia Ministry of Education, System Performance Branch. FSA results, completion rates, and other data collected by the Ministry of Education are also utilized during the review process.

Collection Method: District Review Teams, composed of senior school district administrators, teachers, parents and Ministry staff, review the work of districts in support of student achievement through observations, discussions and inquiry-based conversations. The team completes a report based on the review process and submits it to the Ministry of Education.

Data Reliability: To ensure consistency in the review process, each Review Team member is required to attend a training session once per year, prior to the reviews.

Reporting Period: Twenty districts are reviewed each year, so that all districts are reviewed at least once every three years. The reviews are conducted in the spring of each year, from February through May, and results are reported on the Ministry website shortly thereafter.

Timeliness: The turnaround time from when the review is conducted to submission of the report to the Ministry is less than one month.

12: Data Confidence — Parental Involvement

Source: British Columbia Ministry of Education, System Performance Branch. FSA results, completion rates, and other data collected by the Ministry of Education are also utilized during the review process.

Collection Method: District Review Teams, composed of senior school district administrators, teachers, parents and Ministry staff, review the work of districts in support of student achievement through observations, discussions and inquiry-based conversations. The team completes a report based on the review process and submits it to the Ministry of Education.

Data Reliability: To ensure consistency in the review process, each Review Team member is required to attend a training session once per year, prior to the reviews.

Reporting Period: Twenty districts are reviewed each year, so that all districts are reviewed at least once every three years. The reviews are conducted in the spring of each year, from February through May, and results are reported on the Ministry website shortly thereafter.

Timeliness: The turnaround time from when the review is conducted to submission of the report to the Ministry is less than one month.
