Thursday, 03 December 2020

What works best: 2020 update poster

What works best: 2020 update poster (PDF, 2.23MB)

The What works best: 2020 update summarises some of the most significant research into effective teaching. It outlines eight evidence-based practices that teachers can use in their classrooms to support improved student learning.

 

How to implement What works best in your classroom

High expectations

Engage students and challenge them to learn new things. Establish clear and consistent expectations for their learning and behaviour, and support them to meet those expectations. Tailor your teaching to meet their needs, and engage with parents and carers to encourage them to hold high expectations of their children.

Assessment

Make assessment an integral part of your teaching and learning program. Establish learning intentions, create success criteria and provide effective feedback. Teach your students how to peer-assess and self-assess, and to set individual goals.

Explicit teaching

Clearly explain to students why they are learning something, how it connects to what they already know, what they are expected to do, how to do it, and what it looks like when they have succeeded.

Classroom management

Develop high-quality student-teacher relationships. Provide structure, predictability and opportunities for active student participation in the classroom. Actively supervise students to keep them on task, respond to disengagement or disruptive behaviours, and support students to re-engage with learning.

Effective feedback

Be detailed and specific. Focus on how students performed on a particular task, where mistakes were made, and what needs to happen to improve in future.

Wellbeing

Create a safe environment. Increase students' sense of belonging, value students' opinions and perspectives, encourage interest in learning, and promote social and emotional skills.

Use of data to inform practice

Collect data from a wide range of sources, including your observations, class tests, formal exams, student work samples and responses to informal questions.

Collaboration

Connect with colleagues and experts from outside the school. Work together to plan lessons and teaching programs, observe each other's lessons and provide feedback. Engage in professional discussion and reflection.

 Local Schools, Local Decisions final evaluation report (PDF, 6MB)

One-page summary (PDF, 183kB)

 

Introduction

In 2012, the NSW Department of Education launched the Local Schools, Local Decisions (LSLD) education reform. LSLD aimed to give NSW public schools more authority to make local decisions to best meet the needs of their students.
The reform focused on five interrelated reform areas: making decisions, managing resources, staffing schools, working locally and reducing red tape. In 2014, a new needs-based approach to school funding through the Resource Allocation Model (RAM) was added to the LSLD reform.

The Centre for Education Statistics and Evaluation (CESE) commenced an evaluation of LSLD in 2016. CESE’s final evaluation report is an outcome evaluation aiming to answer three evaluation questions:

  • What has been the combined impact of LSLD and RAM funding on school and student outcomes?
  • How have schools spent the additional funding they have received since the implementation of LSLD (including RAM and other funding)?
  • What has been the impact of LSLD on school management and local decision-making practices?

 

Findings and conclusions

The report’s key findings are that:

  • Since the introduction of LSLD, there has been no substantial improvement in NAPLAN Reading and Numeracy results, HSC completion and performance have worsened, and Tell Them From Me student wellbeing outcomes have either not changed or have worsened.
  • It is not possible to use current system finance data to identify exactly how schools spent their funding. The LSLD policy documentation did not explicitly ask schools to demonstrate how changes they made under LSLD, or funding decisions they made with RAM funding, improved student outcomes, nor to report on that improvement.
  • LSLD had a positive impact on schools’ ability to make local, context-specific decisions. However, the administrative burden for schools increased during LSLD.

The report concludes that the department should:

  • Ensure that schools are accountable for their decision-making by requiring and supporting schools to report through the school planning tools.
  • Provide further guidance for schools on effective ways to improve school and student outcomes by continuing to identify what is already known about ‘what works best’ for school leadership and decision-making, and by cataloguing and providing guidance on the most effective ways for schools to spend their funding.
  • Ensure policies have clear aims and mechanisms to achieve success in terms of outcomes, and that evaluation is a part of needs-based policy development in future.
  • Develop and support effective financial and administrative management by ensuring that changes to processes and system tools are appropriately piloted, managed and coordinated, and school staff are provided with targeted training.
  • Ensure that financial reporting systems allow the department to track expenditure to the level of detail required to ensure student outcomes are being targeted.

 

Related reports

The interim evaluation report was published in 2018.

 


 

We are committed to providing accessible content for all users. To request an accessible version of this content, please contact us.

What works best summary tiles (PDF, 357.8KB)

 

Practical strategies for embedding high expectations in teaching and learning

  • Consistently challenge all students to learn new things.
  • Establish clear and consistent expectations for learning and behaviour.
  • Guide and support students towards meeting expectations.
  • Engage with parents and carers to encourage them to hold high expectations of their children.

 

Practical strategies for using data in practice in teaching and learning

  • Regularly dedicate time to using data effectively.
  • Collect meaningful data.
  • Analyse the data to monitor student learning and progress.
  • Make teaching decisions based on data analysis.

 

Practical strategies for supporting student wellbeing

  • Select and develop strategies to proactively teach healthy coping mechanisms, resilience and self-regulation.
  • Initiate strategies to build a positive learning environment characterised by supportive relationships and regular contact with each student.
  • Target support for different phases of student development and for students who may be at risk.
  • Use collaborative strategies and share with staff, the school community and other agencies as required, to support the wellbeing of students.

 

Practical strategies for effective teacher collaboration

  • Seek professional learning opportunities to share and gain expertise in evidence-based teaching practices.
  • Regularly participate in structured lesson observations that focus on how different teaching approaches impact on student learning.
  • Regularly dedicate time throughout the school year for working with colleagues to plan, develop and refine teaching and learning programs.
  • Work in partnership with colleagues to achieve shared collaboration goals.

 

Practical strategies for embedding explicit teaching in the classroom

  • Prepare for explicit teaching by planning lesson scope, assessing data, reviewing prior learning and balancing teacher-directed, teacher-guided and student-directed learning.
  • Explain, model and guide learning.
  • Monitor student progress and check for understanding.

 

Practical strategies for embedding effective feedback in teaching and learning

  • Reflect and communicate about the learning task with students.
  • Provide students with detailed and specific feedback about what they need to do to achieve growth as a learner.
  • Encourage students to self-assess, reflect and monitor their work.
  • Ensure that students act on feedback that they receive.

 

Practical strategies for using assessment to improve student learning

  • Make student assessment a part of everyday practice.
  • Use assessment to provide students with learning opportunities.
  • Design and deliver high-quality formal assessment tasks.
  • Carefully structure group assessment activities to ensure that students are supported, challenged and able to work together successfully.

 

Practical strategies to support teachers in managing their classrooms effectively

  • Develop high-quality student-teacher relationships.
  • Provide structure, predictability, and opportunities for active student participation in the classroom.
  • Actively supervise students to keep them on task.
  • Respond to disengagement and disruptive behaviours and support students to re-engage in learning.

Monday, 30 November 2020

Phonics Screening Check

Phonics Screening Check (PDF, 1.3MB)

The Year 1 Phonics Screening Check is a short assessment that takes 5-7 minutes and indicates to classroom teachers how their students are progressing in phonics. The Phonics Screening Check is designed to be administered in Year 1, after students have had time to develop phonic knowledge, but with enough time left to make sure interventions and targeted teaching can still make a difference.

The Phonics Screening Check complements existing school practices used to identify students’ progress in developing foundational literacy skills.
This document provides a summary of information and data from the Phonics Screening Check trial delivered in 2020.

Professional learning – effective reading instruction in the early years (PDF, 872.6KB)

One page summary (PDF, 142.5KB)

 

Background

Effective reading programs have six key components: phonemic awareness, phonics, fluency, vocabulary, comprehension and oral language. Reading programs are also most effective when these components are taught explicitly, systematically and sequentially. Based on this evidence, the NSW Department of Education developed an evidence-based two-day professional learning (PL) course on effective reading instruction, with a strong focus on explicit teaching of phonemic awareness and synthetic phonics. The PL was provided in 16 locations in NSW in Terms 2 and 3 of 2018. The department funded all NSW government schools with a kindergarten enrolment to send up to two teachers to the PL. In total, 2,288 staff from 1,089 schools attended the PL.

The evaluation measures the impact of the PL on teachers’ beliefs about the most effective practices for teaching reading to students, their confidence in implementing these practices, and their practices in the classroom.

Key findings

Beliefs

While some beliefs about the most effective practices for teaching reading changed, as anticipated, after the PL, other beliefs did not show this anticipated change. The largest changes were in beliefs about the explicit and systematic teaching of phonics and reading skills. These beliefs aligned with key concepts that were a focus of the PL.

Other beliefs showed little change after the PL, with two alternative explanations:

  • First, some participant beliefs about effective reading instruction already aligned with the PL content and therefore did not need to change.
  • Second, some beliefs about effective reading instruction, in particular those related to a whole language approach to teaching reading, appear to be deeply entrenched, and more work may be needed to change these beliefs.

Confidence

Participants reported increased confidence for all measured areas of effective reading instruction after the PL and these changes were maintained over time. There is still room for further improvement in participants’ feelings of confidence in teaching a comprehensive and effective reading program.

Practice

Areas of practice that had the largest positive changes after the PL were the reading of decodable texts, teaching phonic knowledge and reviewing phonemic awareness. In contrast, developing reading fluency and comprehension strategies had the smallest change. This was expected as these components of reading were not a key focus of the PL.

The majority of participants shared what they learnt from the PL with their colleagues. This tended to happen through informal conversations rather than more formal sharing practices.

Key considerations

Our key learning is that the department should continue to offer targeted, engaging, evidence-based PL on learning and teaching topics. This evaluation shows that educators’ beliefs, confidence and practice can be positively changed through high-quality PL.

Based on these key findings, we have five key considerations for future professional learning offered by the department on learning and teaching topics:

  • Link the PL more effectively to existing practices, systems and interventions.
  • Use baseline data to more effectively differentiate PL content to the needs of participants.
  • Ensure PL is focused on a smaller number of targeted concepts and a specific audience.
  • Support staff after the initial PL to see long-term changes in practice.
  • Leverage the school executive more effectively to support school-wide changes in practice after PL.

 

Literacy support for schools and related resources

Since 2017, the department has undertaken a range of strategic activities and developed a suite of new resources to support schools with early literacy instruction:

Professional learning

Other

Check-in assessments – Years 3, 5 and 9 (PDF, 600KB)

 

What are the Check-in assessments?

The Check-in assessments are optional online reading and numeracy assessments designed to assist schools following the disruptions to schooling in 2020. The assessments cover similar aspects of literacy and numeracy to the NAPLAN reading and numeracy tests.

These formative assessments are offered for schools to:

  • supplement existing school practices used to identify how students are performing in literacy and numeracy
  • help teachers tailor teaching to meet student needs.

This page provides a summary of information and data from the Check-in assessments delivered in 2020.

Each assessment in 2020 was designed to be quick and easy to administer, consisting of approximately 40 multiple choice questions. Suggested completion time was 50 minutes; however, teachers could use their discretion based on the needs of their students.

Students in Years 5 and 9 completed the assessments during Term 3, Weeks 5 to 7 (17 August–4 September). Students in Year 3 completed the assessments during Term 3, Week 10 to Term 4, Week 2 (21 September–23 October).

Initial results were available to schools within 48 hours of test completion, enabling teachers to rapidly move to use the results in addressing learning gaps.

To assist teachers in using the results, test items were aligned to the NSW syllabus, National Literacy and Numeracy Learning Progressions and teaching strategies.

Student assessment feedback and mapping against the syllabus and learning progressions indicators was made available in the department’s reporting platform, Scout.

Features of the school reports included:

  • information at item-level with links to the questions and strategies related to the skill being assessed
  • information at syllabus stage and progression level for each student
  • feedback on strategies students may have been using if they got the answer correct or incorrect alongside how each student responded.

Records of student achievement of learning progression indicators were also available in the department’s PLAN2 platform, where teachers could monitor student progress and create ‘Areas of Focus’ for targeted teaching and skill development.

Support

Professional learning and assessment support was available to all teachers in participating schools for 2020 assessments. This included how best to make use of the assessment package for each school context, administration of the assessment, how to access and use feedback to help inform planning and strategies for teaching.

As at 10 November, more than 4,700 teachers had accessed:

  • online support, including live chat and sessions with other teachers to ask questions and share ideas
  • a range of online courses including guided professional learning to support the analysis of their students’ assessment information using a ‘data pathway’ (Terms 3 and 4)
  • strategies to identify areas to focus attention in aspects of literacy and numeracy
  • teaching strategies to address these areas specific to Years 3, 5 and 9.

Participation rates

Participation in the Check-in assessments was high, with 83% (1,775) of department schools participating (of schools with students in Years 3, 5 or 9). Participation was higher among primary schools than secondary schools, with 88% of all Year 3 students, 86% of all Year 5 students, and 61% of all Year 9 students participating in the Check-in assessments.
Participation was largely representative across various student and school groups.

 

Table 1: Number of schools and students participating in Year 3 Check-in assessments

 

 

 

Table 2: Number of schools and students participating in Year 5 Check-in assessments

 

 

Table 3: Number of schools and students participating in Year 9 Check-in assessments

 

 

*Note (for tables 1-3): Remoteness area is based on ASGC2016 remoteness area classifications. Inner regional and outer regional Australia are combined, as are remote and very remote Australia. Percentages of schools participating are calculated based on the total number of schools with enrolments in the relevant scholastic year, for each school type. Figures are based on the test participation data extracted from the test platforms on 10 November 2020.

 

Summary results

For each 2020 assessment, a quarter of the test items were NAPLAN items with known psychometric properties and difficulty estimates on the NAPLAN scales. This provided the possibility of linking the Check-in assessments with these scales to assist with further analysis.

After scaling and equating of the available Year 3, Year 5 and Year 9 results, five assessments (Year 3 reading, Year 3 numeracy, Year 5 reading, Year 5 numeracy and Year 9 numeracy) were equated to the NAPLAN scales. Year 9 reading could not be linked to the NAPLAN scale due to a range of factors, including test design differences between NAPLAN and the Check-in assessment.
As the Check-in assessments were optional, results were weighted at the student level to arrive at population estimates. Weights accounted for remoteness and for prior performance band (in NAPLAN for Years 5 and 9, and in the Best Start Kindergarten Assessment for Year 3).
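The weighting described above can be illustrated with a minimal post-stratification sketch. This is not the department's actual methodology: the strata (prior performance band by remoteness), population counts and scores below are all hypothetical.

```python
# Illustrative sketch only: post-stratification weighting of an opt-in sample
# to produce a population estimate. Strata and all figures are hypothetical.
from collections import defaultdict

def poststratify(sample, population_counts):
    """Weight each sampled student by N_stratum / n_stratum, then return
    the weighted mean score as a population estimate."""
    # Count sampled students per stratum
    n = defaultdict(int)
    for s in sample:
        n[s["stratum"]] += 1
    # Weighted mean: each student carries their stratum's weight
    total_w = total_ws = 0.0
    for s in sample:
        w = population_counts[s["stratum"]] / n[s["stratum"]]
        total_w += w
        total_ws += w * s["score"]
    return total_ws / total_w

# Hypothetical example: low-band students are under-represented in the sample
sample = [
    {"stratum": ("low", "metro"), "score": 400.0},
    {"stratum": ("high", "metro"), "score": 520.0},
    {"stratum": ("high", "metro"), "score": 500.0},
]
population_counts = {("low", "metro"): 50, ("high", "metro"): 50}

print(poststratify(sample, population_counts))  # 455.0, vs unweighted ~473.3
```

Each student's weight is the ratio of the stratum's population count to its sample count, so an under-represented group (here, the low band) counts proportionally more in the estimate than in the raw sample.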
Table 4 presents the estimated proportions of students in NAPLAN bands based on Check-in assessments measured in August-October 2020.

 

Table 4: Estimated (weighted) proportion of students by band, Check-in assessment (August-October 2020)

 

Note: Check-in assessment results were weighted to arrive at population estimates. Results need to be interpreted with caution as they have larger uncertainty than typical NAPLAN results.

 

Table 5 presents the mean scaled scores for each assessment, for 2019 NAPLAN and as estimated from Check-in assessments measured in August-October 2020. This table shows the August-October results in 2020 were similar to previous years’ NAPLAN results, assessed in May for Year 3 reading, Year 5 reading and numeracy, and Year 9 numeracy. In contrast, Year 3 numeracy Check-in results in September/October were substantially higher than previous years' NAPLAN results assessed in May (note that NAPLAN did not take place in 2020 due to COVID-19).

 

Table 5: Weighted mean scores of students in the Check-in assessments, compared to NAPLAN 2019

 

Note: Due to differences between the Check-in assessments and NAPLAN tests (e.g. test design, purpose of tests), caution is needed when comparing Check-in results to NAPLAN results.

Feedback

The response from schools as to the diagnostic value of the assessments has been overwhelmingly positive. Teachers have commented:
“the rich data gleaned is simply amazing!”
and
“as a class we use the Check-in assessment feedback to talk about how we solve number problems and what strategies we use”.

Conclusion

The 2020 Check-in assessments demonstrate the feasibility of conducting formative assessments that provide schools with rapid insight and highly targeted support, within a short timeframe and with reduced administrative complexity. The high take-up and strong support across schools demonstrate the willingness and ability of schools to use formative assessment to support their professional judgments in rapidly identifying gaps in student learning. The inability to equate Year 9 reading also demonstrates, to some degree, the limitations of a fast deployment in a single-state context. In the longer term, the availability of pre‑calibrated assessments for use by teachers would further increase the uptake and usability of check-in type assessments.

At the system level, the comparison with 2019 NAPLAN shows that students were generally performing in August-October 2020 at the same levels previously seen in May (with the exception of Year 3 numeracy). Because the Check-in assessments were sat several months later in the school year than NAPLAN, performing at May levels indicates that on average students had fallen approximately 3-4 months behind in Year 3 reading, and 2-3 months behind in Year 5 reading and numeracy and Year 9 numeracy.

Formative assessment practices in early childhood settings: evidence and implementation in NSW (PDF, 5MB)

 

Summary

This paper aims to support early childhood education (ECE) practitioners and policy-makers by bringing together the available research on formative assessment, contextualised to early childhood education in NSW. Formative assessment is an educational practice that has broad applicability and support.

In this paper, several aspects of formative assessment are discussed:

  • What formative assessment is and how it can be used in ECE settings
  • Current and emerging evidence supporting formative assessment practices in these settings
  • How several NSW ECE services have embedded formative assessment in their practices
  • The implications of the research for fostering greater application of evidence-based approaches in the NSW ECE sector.

Related resources

This podcast is part of an eight-part series. In this podcast, Mark Scott, Secretary of the NSW Department of Education, dives into what teacher collaboration looks like in practice and why it is important at Blue Haven Public School. Mark speaks with Substantive Principal and Principal in Residence Literacy and Numeracy, Paul McDermott, Relieving Principal, Dale Edwards, and staff at Blue Haven Public School.

Download the transcript (PDF, 237kB)

Access our other What works best resources
 

This podcast is part of an eight-part series. In this podcast, Mark Scott, Secretary of the NSW Department of Education, discusses what effective classroom management looks like in practice with Strathfield Girls High School Principal, Angela Lyris, and students.

Download the transcript (PDF, 193kB)

Access our other What works best resources
 

This podcast is part of an eight-part series. In this podcast, Mark Scott, Secretary of the NSW Department of Education, speaks with Principal, Bob Willetts, and staff at Berry Public School to explore how they use data to provide students with effective feedback and inform their classroom practice.

Download the transcript (PDF, 176kB)

Access our other What works best resources
 
