This report presents the findings of an external review and analysis of recent practices, research and data on the delivery of Vocational Education and Training (VET) to secondary students. The review and analysis were commissioned by the NSW Department of Education and conducted by the Centre for Vocational and Educational Policy at the University of Melbourne to identify best practice and make recommendations for future practice. The review addressed the following questions:
• What do the VET programs offered in Australian schools look like?
• Who participates in these VET programs and why?
• What are useful measures of VET program effectiveness?
• What are the strengths and weaknesses of the current VET programs in NSW government schools?
• What recommendations are made for improving VET programs in NSW government schools?
In 2015, the NSW Department of Education introduced the Supported Students, Successful Students funding package. A key initiative within this package was $15 million over four years to support schools to implement Positive Behaviour for Learning (PBL). PBL is a whole school approach that aims to create a positive, safe and supportive school climate in which students can learn and develop. The funding was used to employ 32 PBL coach mentors and four PBL deputy principals.
CESE’s evaluation included:
• two rounds of fieldwork (a survey and in-depth interviews) to examine the experiences and views of PBL and non-PBL schools, PBL coach mentors, PBL deputy principals, and other school services staff
• a review of how some PBL schools use their data to inform decision-making
• development of statistical models to measure the impact of PBL on student attendance and suspensions, as well as student wellbeing measures captured in the department’s Tell Them From Me student survey.
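For illustration only, the sketch below shows the kind of school-level comparison such models could involve: regressing an outcome such as attendance rate on a PBL indicator while adjusting for year and school characteristics. The data file, column names and model specification are hypothetical; they are not the evaluation's actual models.

```python
# Hypothetical sketch only: compare an outcome (attendance rate) between PBL
# and non-PBL schools, adjusting for year and school characteristics.
import pandas as pd
import statsmodels.formula.api as smf

# One row per school per year; file and columns are hypothetical.
# Expected columns: school_id, year, pbl_school (0/1), attendance_rate, enrolment, icsea
schools = pd.read_csv("school_outcomes.csv")

# OLS with year fixed effects; the pbl_school coefficient estimates the average
# difference between PBL and non-PBL schools, conditional on the covariates.
model = smf.ols(
    "attendance_rate ~ pbl_school + C(year) + icsea + enrolment",
    data=schools,
).fit()

print(model.summary())
# In practice, standard errors would be clustered by school and richer controls
# (prior outcomes, student intake) would be included.
```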
We conservatively estimate that 1,138 NSW public schools are implementing PBL and that 67 schools have stopped implementing it. This equates to a retention rate of approximately 94% (1,138 of the 1,205 schools that have implemented PBL at some point).
Almost all schools reported implementing each of the universal school-wide features of PBL, using their data to inform decision making and develop appropriate interventions, and using existing PBL evaluation tools to examine their implementation fidelity.
At the time of data collection, approximately four in ten schools were implementing tier 2 (targeted support) and two in ten were implementing tier 3 (intensive individualised support). The most common targeted intervention was an individual support plan.
Coach mentors provided schools with professional learning, general information about PBL, and support with data and evaluation, and were viewed as a source of expert knowledge and advice.
Using their own internal school data, observations and feedback from parents, nearly nine in ten PBL schools reported that they perceive PBL to have improved student wellbeing. The large majority of PBL schools reported that both major and minor problem behaviour incidents have reduced since implementing PBL. More than half of the schools also perceived that PBL had reduced short suspensions, but only a small proportion of schools reported an improvement in attendance.
These findings are not reflected in the department's centrally recorded data and are not supported by our outcome analyses, which found no meaningful differences between PBL schools and non-PBL schools on attendance, suspensions or student wellbeing. However, we identified a number of limitations in the use of these data sources as outcome measures. Without better data systems in place, we are unable to make a conclusive statement about the effectiveness of PBL.
The What works best: 2020 update summarises some of the most significant research into effective teaching. It outlines eight evidence-based practices that teachers can use in their classrooms to support improved student learning.
Engage students and challenge them to learn new things. Establish clear and consistent expectations for their learning and behaviour, and support them to meet those expectations. Tailor your teaching to meet their needs, and engage with parents and carers to encourage them to hold high expectations of their children.
Make assessment an integral part of your teaching and learning program. Establish learning intentions, create success criteria and provide effective feedback. Teach your students how to peer and self-assess and to set individual goals.
Clearly explain to students why they are learning something, how it connects to what they already know, what they are expected to do, how to do it, and what it looks like when they have succeeded.
Develop high-quality student-teacher relationships. Provide structure, predictability and opportunities for active student participation in the classroom. Actively supervise students to keep them on task, respond to disengagement or disruptive behaviours, and support students to re-engage with learning.
Be detailed and specific. Focus on how students performed on a particular task, where mistakes were made, and what needs to happen to improve in future.
Create a safe environment. Increase students' sense of belonging, value students' opinions and perspectives, encourage interest in learning, and promote social and emotional skills.
Collect data from a wide range of sources, including your observations, class tests, formal exams, student work samples and responses to informal questions.
Connect with colleagues and experts from outside the school. Work together to plan lessons and teaching programs, observe each other's lessons and provide feedback. Engage in professional discussion and reflection.
In 2012, the NSW Department of Education launched the Local Schools, Local Decisions (LSLD) education reform. LSLD aimed to give NSW public schools more authority to make local decisions to best meet the needs of their students.
The reform focused on five interrelated reform areas: making decisions, managing resources, staffing schools, working locally and reducing red tape. In 2014, a new needs-based approach to school funding through the Resource Allocation Model (RAM) was added to the LSLD reform.
The Centre for Education Statistics and Evaluation (CESE) commenced an evaluation of LSLD in 2016. CESE’s final evaluation report is an outcome evaluation aiming to answer three evaluation questions:
The report’s key findings are that:
The report concludes that the department should:
The interim evaluation report was published in 2018.
The Year 1 Phonics Screening Check is a short assessment that takes 5-7 minutes and indicates to classroom teachers how their students are progressing in phonics. The Phonics Screening Check is designed to be administered in Year 1, after students have had time to develop phonic knowledge, but with enough time left to make sure interventions and targeted teaching can still make a difference.
The Phonics Screening Check complements existing school practices used to identify students’ progress in developing foundational literacy skills.
This document provides a summary of information and data from the Phonics Screening Check trial delivered in 2020.
Effective reading programs have six key components: phonemic awareness, phonics, fluency, vocabulary, comprehension and oral language. Reading programs are also most effective when these components are taught explicitly, systematically and sequentially. Based on this evidence, the NSW Department of Education developed an evidence-based two-day professional learning (PL) course on effective reading instruction, with a strong focus on explicit teaching of phonemic awareness and synthetic phonics. The PL was provided in 16 locations in NSW in terms 2 and 3 of 2018. The department funded all NSW government schools with a kindergarten enrolment to send up to two teachers to the PL. In total 2,288 staff from 1,089 schools attended the PL.
The evaluation measures the impact of the PL on teachers' beliefs about the most effective practices for teaching reading, their confidence in implementing these practices, and their practices in the classroom.
While some beliefs about the most effective practices for teaching reading changed after the PL as anticipated, others did not. The largest changes were in beliefs about the explicit and systematic teaching of phonics and reading skills. These beliefs aligned with key concepts that were a focus of the PL.
Other beliefs showed little change after the PL; the report considers two alternative explanations for this.
Participants reported increased confidence in all measured areas of effective reading instruction after the PL, and these changes were maintained over time. There is still room for further improvement in participants' confidence in teaching a comprehensive and effective reading program.
Areas of practice that had the largest positive changes after the PL were the reading of decodable texts, teaching phonic knowledge and reviewing phonemic awareness. In contrast, developing reading fluency and comprehension strategies had the smallest change. This was expected as these components of reading were not a key focus of the PL.
The majority of participants shared what they learnt from the PL with their colleagues. This tended to happen through informal conversations rather than more formal sharing practices.
Our key learning is that the department should continue to offer targeted, engaging, evidence-based PL on learning and teaching topics. This evaluation shows that educators’ beliefs, confidence and practice can be positively changed through high-quality PL.
Since 2017, the department has undertaken a range of strategic activities and developed a suite of new resources to support schools with early literacy instruction:
The Check-in assessments are optional online reading and numeracy assessments designed to assist schools following the disruptions to schooling in 2020. The assessments cover similar aspects of literacy and numeracy to the NAPLAN reading and numeracy tests.
These formative assessments are offered for schools to:
This page provides a summary of information and data from the Check-in assessments delivered in 2020.
Each assessment in 2020 was designed to be quick and easy to administer, consisting of approximately 40 multiple-choice questions. The suggested completion time was 50 minutes; however, teachers could use their discretion based on the needs of their students.
Students in Years 5 and 9 completed the assessments during Term 3, Weeks 5 to 7 (17 August–4 September). Students in Year 3 completed the assessments during Term 3, Week 10 to Term 4, Week 2 (21 September–23 October).
Initial results were available to schools within 48 hours of test completion, enabling teachers to quickly use the results to address learning gaps.
To assist teachers in using the results, test items were aligned to the NSW syllabus, National Literacy and Numeracy Learning Progressions and teaching strategies.
Student assessment feedback and mapping against the syllabus and learning progression indicators were made available in the department’s reporting platform, Scout.
Features of the school reports included:
Records of student achievement of learning progression indicators were also available in the department’s PLAN2 platform, where teachers could monitor student progress and create ‘Areas of Focus’ for targeted teaching and skill development.
Professional learning and assessment support was available to all teachers in participating schools for the 2020 assessments. This included guidance on how best to use the assessment package in each school context, how to administer the assessment, and how to access and use feedback to inform planning and teaching strategies.
As at 10 November, more than 4,700 teachers had accessed:
Participation in the Check-in assessments was high, with 83% (1,775) of department schools with students in Years 3, 5 or 9 participating. Participation was higher among primary schools than secondary schools, with 88% of all Year 3 students, 86% of all Year 5 students, and 61% of all Year 9 students participating in the Check-in assessments.
Participation was largely representative across various student and school groups.
*Note (for tables 1-3): Remoteness area is based on the ASGS 2016 remoteness area classifications. Inner regional and outer regional Australia are combined, as are remote and very remote Australia. Percentages of schools participating are calculated based on the total number of schools with enrolments in the relevant scholastic year, for each school type. Figures are based on the test participation data extracted from the test platforms on 10 November 2020.
For each 2020 assessment, a quarter of the test items were NAPLAN items with known psychometric properties and difficulty estimates on the NAPLAN scales. This made it possible to link the Check-in assessments to these scales to assist with further analysis.
After scaling and equating the available results from the Year 3, Year 5 and Year 9 tests, five assessments (Year 3 reading, Year 3 numeracy, Year 5 reading, Year 5 numeracy and Year 9 numeracy) were equated to the NAPLAN scales. Year 9 reading could not be linked to the NAPLAN scale due to a range of factors, including test design differences between NAPLAN and the Check-in assessment.
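As a rough illustration of common-item linking (not the department's actual equating procedure), the sketch below applies a mean/mean shift under a Rasch-style model, using the embedded NAPLAN items whose difficulties are already known. All item names and numbers are hypothetical.

```python
# Hypothetical sketch only: mean/mean common-item linking under a Rasch-style model.
# The anchor items are the embedded NAPLAN questions whose difficulties (in logits)
# are already known on the reference calibration; the constant shift between those
# values and the freshly estimated Check-in difficulties places Check-in ability
# estimates on the reference scale.

# Known difficulties of the anchor items on the reference (NAPLAN) calibration, in logits.
reference_difficulties = {"item_07": -0.40, "item_18": 0.35, "item_29": 0.90}

# Difficulties of the same items estimated from the Check-in calibration, in logits.
checkin_difficulties = {"item_07": -0.62, "item_18": 0.10, "item_29": 0.71}

common = reference_difficulties.keys() & checkin_difficulties.keys()

# Mean/mean linking constant: average difference over the common items.
shift = sum(reference_difficulties[i] - checkin_difficulties[i] for i in common) / len(common)

# Apply the shift to student ability estimates from the Check-in calibration, then map
# logits onto a reporting scale with a purely hypothetical linear transformation.
checkin_abilities = [-1.2, 0.3, 1.8]  # example student estimates, in logits
linked_abilities = [theta + shift for theta in checkin_abilities]
scaled_scores = [500 + 100 * theta for theta in linked_abilities]
print(scaled_scores)
```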
As the Check-in assessments were optional, student-level results were weighted by prior performance band (in NAPLAN for Years 5 and 9, or in the Best Start Kindergarten Assessment for Year 3) and by remoteness to arrive at population estimates.
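A minimal sketch of this kind of weighting, assuming a simple post-stratification by prior-achievement band and remoteness, is shown below. The files and column names are hypothetical, and the department's actual weighting procedure may differ.

```python
# Hypothetical sketch only: post-stratification-style weighting so that the weighted
# sample of participating students matches the full population on prior band and remoteness.
import pandas as pd

sample = pd.read_csv("checkin_results.csv")        # participants: prior_band, remoteness, score
population = pd.read_csv("population_counts.csv")  # all students: prior_band, remoteness, n_students

# Population share of each prior-band x remoteness cell.
pop_share = population.set_index(["prior_band", "remoteness"])["n_students"]
pop_share = pop_share / pop_share.sum()

# Sample share of the same cells.
samp_share = sample.groupby(["prior_band", "remoteness"]).size() / len(sample)

# Weight each student by population share / sample share for their cell.
weights = (pop_share / samp_share).rename("weight").reset_index()
sample = sample.merge(weights, on=["prior_band", "remoteness"], how="left")

# Weighted mean score as a population estimate.
weighted_mean = (sample["score"] * sample["weight"]).sum() / sample["weight"].sum()
print(round(weighted_mean, 1))
```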
Table 4 presents the estimated proportions of students in NAPLAN bands based on Check-in assessments measured in August-October 2020.
Note: Check-in assessment results were weighted to arrive at population estimates. Results need to be interpreted with caution as they have larger uncertainty than typical NAPLAN results.
Table 5 presents the mean scaled scores for each assessment, for 2019 NAPLAN and as estimated from the Check-in assessments measured in August-October 2020. This table shows that the August-October 2020 results were similar to previous years’ May NAPLAN results for Year 3 reading, Year 5 reading and numeracy, and Year 9 numeracy. In contrast, Year 3 numeracy Check-in results in September/October were substantially higher than previous years’ May NAPLAN results (note that NAPLAN did not take place in 2020 due to COVID-19).
Note: Due to differences between the Check-in assessments and NAPLAN tests (e.g. test design, purpose of tests), caution is needed when comparing Check-in results to NAPLAN results.
The response from schools as to the diagnostic value of the assessments has been overwhelmingly positive. Teachers have commented:
“the rich data gleaned is simply amazing!”
“as a class we use the Check-in assessment feedback to talk about how we solve number problems and what strategies we use”.
The 2020 Check-in assessments demonstrate the feasibility of conducting formative assessments that provide schools with rapid insight and highly targeted support, within a short timeframe and with reduced administrative complexity. The high take-up and strong support across schools demonstrate the willingness and ability of schools to use formative assessment to support their professional judgments in rapidly identifying gaps in student learning. The inability to equate Year 9 reading also demonstrates, to some degree, the limitations of a fast deployment in a single-state context. In the longer term, the availability of pre‑calibrated assessments for use by teachers would further increase the uptake and usability of check-in type assessments.
At the system level, the comparison with 2019 NAPLAN demonstrates that students were generally performing in August-October 2020 at the same levels previously seen in May (with the exception of Year 3 numeracy). Because the Check-in assessments were held several months later in the school year than NAPLAN, this indicates that on average students have fallen approximately 3-4 months behind in Year 3 reading, and 2-3 months behind in Year 5 reading and numeracy and Year 9 numeracy.
This paper aims to support early childhood education (ECE) practitioners and policy-makers by bringing together the available research on formative assessment, contextualised to early childhood education in NSW. Formative assessment is an educational practice that has broad applicability and support.
In this paper, several aspects of formative assessment are discussed:
This podcast is part of an eight-part series. In this podcast, Mark Scott, Secretary of the NSW Department of Education, dives into what teacher collaboration looks like in practice and why it is important at Blue Haven Public School. Mark speaks with Paul McDermott (Substantive Principal and Principal in Residence, Literacy and Numeracy), Dale Edwards (Relieving Principal), and staff at Blue Haven Public School.
Download the transcript (PDF, 237kB)