
Positive Behaviour for Learning evaluation – final report (PDF, 2.2MB)

Positive Behaviour for Learning evaluation appendices (PDF, 3.7MB)

One-page summary (PDF, 20kB)

Summary

Evaluation background

In 2015, the NSW Department of Education introduced the Supported Students, Successful Students funding package. A key initiative within this package was $15 million over four years to support schools to implement Positive Behaviour for Learning (PBL). PBL is a whole school approach that aims to create a positive, safe and supportive school climate in which students can learn and develop. The funding employed 32 PBL coach mentors and four PBL deputy principals.
CESE’s evaluation included:
• two rounds of fieldwork (a survey and in-depth interviews) to examine the experiences and views of PBL and non-PBL schools, PBL coach mentors, PBL deputy principals, and other school services staff
• a review of how some PBL schools use their data to inform decision-making
• development of statistical models to measure the impact of PBL on student attendance and suspensions, as well as student wellbeing measures captured in the department’s Tell Them From Me student survey.

Main findings

We conservatively estimate that 1,138 NSW public schools are implementing PBL and that 67 schools have stopped implementing PBL. This translates roughly to a 94% retention rate.
Almost all schools reported implementing each of the universal school-wide features of PBL, using their data to inform decision making and develop appropriate interventions, and using existing PBL evaluation tools to examine their implementation fidelity.

At the time of data collection, approximately four in ten schools were implementing tier 2 (targeted support) and two in ten were implementing tier 3 (intensive individualised support). The most common targeted intervention was an individual support plan.
Coach mentors provided schools with professional learning, general information about PBL, and support with data and evaluation, and were viewed as a source of expert knowledge and advice.
Using their own internal school data, observations and feedback from parents, nearly nine in ten PBL schools reported that they perceive PBL to have improved student wellbeing. The large majority of PBL schools reported that both major and minor problem behaviour incidents have reduced since implementing PBL. More than half of the schools also perceived that PBL had reduced short suspensions, but only a small proportion of schools reported an improvement in attendance.
These findings are not reflected in the department's centrally recorded data and are not supported by our outcome analyses, which found no meaningful differences between PBL schools and non-PBL schools on attendance, suspensions or student wellbeing. However, we identified a number of limitations in the use of these data sources as outcome measures. Without better data systems in place, we are unable to make a conclusive statement about the effectiveness of PBL.

Published in Evaluations

 Local Schools, Local Decisions final evaluation report (PDF, 6MB)


One-page summary (PDF, 183kB)

 

Introduction

In 2012, the NSW Department of Education launched the Local Schools, Local Decisions (LSLD) education reform. LSLD aimed to give NSW public schools more authority to make local decisions to best meet the needs of their students.
The reform focused on five interrelated areas: making decisions, managing resources, staffing schools, working locally and reducing red tape. In 2014, a new needs-based approach to school funding through the Resource Allocation Model (RAM) was added to the LSLD reform.

The Centre for Education Statistics and Evaluation (CESE) commenced an evaluation of LSLD in 2016. CESE’s final evaluation report is an outcome evaluation aiming to answer three evaluation questions:

  • What has been the combined impact of LSLD and RAM funding on school and student outcomes?
  • How have schools spent the additional funding they have received since the implementation of LSLD (including RAM and other funding)?
  • What has been the impact of LSLD on school management and local decision-making practices?

 

Findings and conclusions

The report’s key findings are that:

  • Since the introduction of LSLD, there has been no substantial improvement in NAPLAN Reading and Numeracy results, HSC completion and performance have worsened, and Tell Them From Me student wellbeing outcomes have either not changed or have worsened.
  • It is not possible to use current system finance data to identify exactly how schools spent their funding. The LSLD policy documentation did not explicitly ask schools to demonstrate how changes they made under LSLD, or funding decisions they made with RAM funding, improved student outcomes, nor to report on that improvement.
  • LSLD had a positive impact on schools’ ability to make local, context-specific decisions. However, the administrative burden for schools increased during LSLD.

The report concludes that the department should:

  • Ensure that schools are accountable for their decision-making by requiring and supporting schools to report through the school planning tools.
  • Provide further guidance for schools on effective ways to improve school and student outcomes by continuing to identify what is already known about ‘what works best’ for school leadership and decision-making, and by cataloguing and providing guidance on the most effective ways for schools to spend their funding.
  • Ensure policies have clear aims and mechanisms to achieve success in terms of outcomes, and that evaluation is a part of needs-based policy development in future.
  • Develop and support effective financial and administrative management by ensuring that changes to processes and system tools are appropriately piloted, managed and coordinated, and school staff are provided with targeted training.
  • Ensure that financial reporting systems allow the department to track expenditure to the level of detail required to ensure student outcomes are being targeted.

 

Related reports

The interim evaluation report was published in 2018.

 


 

We are committed to providing accessible content for all users. To request an accessible version of this content, please contact us.

Published in Evaluations


Professional learning – effective reading instruction in the early years (PDF, 872.6KB)

One-page summary (PDF, 142.5KB)

 

Background

Effective reading programs have six key components: phonemic awareness, phonics, fluency, vocabulary, comprehension and oral language. Reading programs are also most effective when these components are taught explicitly, systematically and sequentially. Based on this evidence, the NSW Department of Education developed an evidence-based two-day professional learning (PL) course on effective reading instruction, with a strong focus on explicit teaching of phonemic awareness and synthetic phonics. The PL was provided in 16 locations in NSW in terms 2 and 3 of 2018. The department funded all NSW government schools with a kindergarten enrolment to send up to two teachers to the PL. In total, 2,288 staff from 1,089 schools attended the PL.

The evaluation measures the impact of the PL on teachers’ beliefs about the most effective practices for teaching reading, their confidence in implementing these practices, and their practices in the classroom.

Key findings

Beliefs

While some beliefs about the most effective practices for teaching reading changed, as anticipated, after the PL, other beliefs did not show this anticipated change. The largest changes were in beliefs about the explicit and systematic teaching of phonics and reading skills. These beliefs aligned with key concepts that were a focus of the PL.

Other beliefs showed little change after the PL, for which there are two possible explanations:

  • First, some participant beliefs about effective reading instruction already aligned with the PL content and therefore did not need to change.
  • Second, some beliefs about effective reading instruction, in particular those related to a whole language approach to teaching reading, appear to be deeply entrenched, and more work may be needed to change these beliefs.

Confidence

Participants reported increased confidence for all measured areas of effective reading instruction after the PL and these changes were maintained over time. There is still room for further improvement in participants’ feelings of confidence in teaching a comprehensive and effective reading program.

Practice

Areas of practice that had the largest positive changes after the PL were the reading of decodable texts, teaching phonic knowledge and reviewing phonemic awareness. In contrast, developing reading fluency and comprehension strategies had the smallest change. This was expected as these components of reading were not a key focus of the PL.

The majority of participants shared what they learnt from the PL with their colleagues. This tended to happen through informal conversations rather than more formal sharing practices.

Key considerations

Our key learning is that the department should continue to offer targeted, engaging, evidence-based PL on learning and teaching topics. This evaluation shows that educators’ beliefs, confidence and practice can be positively changed through high-quality PL.

Based on these key findings, we have five key considerations for future professional learning offered by the department on learning and teaching topics:

  • Link the PL more effectively to existing practices, systems and interventions.
  • Use baseline data to more effectively differentiate PL content to the needs of participants.
  • Ensure PL is focused on a smaller number of targeted concepts and a specific audience.
  • Support staff after the initial PL to see long-term changes in practice.
  • Leverage the school executive more effectively to support school-wide changes in practice after PL.

 

Literacy support for schools and related resources

Since 2017, the department has undertaken a range of strategic activities and developed a suite of new resources to support schools with early literacy instruction:

Professional learning

Other

Published in Evaluations

Evaluation of the Rural and Remote Education Blueprint - final report (PDF, 2MB)

Authors: Andrew Griffiths, Ian Watkins, Francis Matthew-Simmons, Sasindu Gamage

Evaluator company/business: Centre for Education Statistics and Evaluation

Year: 2020

URL or PDF: Evaluation of the Rural and Remote Education Blueprint

Summary: This final evaluation report examines the implementation and impact of actions contained in the Blueprint. It also examines important education performance indicators to assess any changes in the magnitude of the gaps between rural and remote students and metropolitan students since the launch of the Blueprint. We collected a range of qualitative and quantitative data sources to evaluate the Blueprint, including interviews, surveys and administrative data. This evaluation has found that:

  • Gaps in NAPLAN scores and school attendance between rural and remote students and metropolitan students have not reduced since the introduction of the Blueprint. The gaps between remote students and metropolitan students have narrowed on Best Start and retention to Year 12.
  • The 50% rental subsidy introduced at some four-point schools had no meaningful impact on teacher retention.
Published in Evaluation repository

Evaluation of Flexible Funding for Wellbeing Services (PDF, 4.09MB)


One-page summary (PDF, 119kB)

 

Evaluation background

In 2015, the NSW Department of Education introduced the Supported Students, Successful Students funding package. A key initiative within this package was the commitment of $51.5 million to Flexible Funding for Wellbeing Services for the period 2016 to 2018. This funding was distributed to 381 schools, averaging approximately $45,000 per school per calendar year. The funding allocation methodology took into consideration multiple indicators of need and the School Counselling Service allocation. Schools were instructed to use the funds to purchase wellbeing services specific to their school’s varying needs.

CESE’s evaluation included:

  • two rounds of fieldwork (a survey and in-depth interviews) to gather information on schools’ 2017 and 2018 expenditure as well as perceived outcomes
  • development of statistical models to measure the impact on mean (average) change over time in student wellbeing measures captured in the department’s Tell Them From Me student self-report survey.

 

Main findings

Schools spent their Flexible Funding for Wellbeing Services on up to eight separate services or resources. The two most popular were whole of school wellbeing programs (40%) and employing a Student Support Officer (37%). Other popular options were targeted wellbeing programs/approaches for students who need additional support (35%), professional learning in wellbeing approaches (34%) and employing wellbeing executive/staff (32%). Schools used the funds for new services or resources, for topping up existing ones, or a mix of both.

Decisions on spending were guided by the student profile and the additional needs of specific sub-groups of students. Half of the schools changed the way they spent their funding from 2017 to 2018. These changes were most commonly to meet the changing or emerging needs of a specific sub-group of students, or to shift focus to whole school needs. Schools value this ability to adapt to changing needs over time.

Schools are typically very satisfied that the services they invested in met the wellbeing needs of students. They also believe that this funding, combined with other funding, allows them to provide appropriate wellbeing services and activities. The large majority perceived the services they had funded to have improved student wellbeing at a whole school level. Perceived improvements were generally greatest when a staff member had been employed. Schools reported even stronger positive impacts on the wellbeing of student sub-groups that were specifically targeted for particular types of support.

Contrary to feedback from schools, our outcome analyses found no meaningful differences between Flexible Funding and non-Flexible Funding schools in the average change over time in self-reported student wellbeing. However, we identified a number of limitations that make it difficult for an analysis to detect school-level differences between the two groups.

Published in Evaluations

Supported Students School Counselling Evaluation

Process evaluation of the expansion to the school counselling service (PDF, 1.24MB)

One-page summary (PDF, 120.27KB)

 

Evaluation background

In 2015, the NSW Department of Education introduced the Supported Students, Successful Students (SSSS) funding package. The expansion to the school counselling service is one of its key initiatives and includes $80.7 million to employ an extra 236 full-time equivalent school counselling staff and $8.0 million to expand the graduate scholarship program and workforce development scholarships. To implement the expansion, many concurrent changes took place, including reconfiguring teams and boundaries and the introduction of new roles.

The changes implemented under the SSSS funding package are the most significant that the school counselling service has ever seen. The 236 new positions represent a 30% expansion of the service.

CESE’s process evaluation investigated different aspects of implementation and the perceived impacts on student wellbeing from 2016 to 2019. The methodology included:

  • interviews with 63 school-based staff and members of the school counselling service from Term 4 2017 to Term 2 2019
  • analysis of data on recruitment and separations, scholarships and sponsorships, and the number of new case files.

 

Main findings

Implementation has been a significant undertaking, and challenging in the interim for school counselling service staff and schools. Recruitment has been difficult in some areas and there has been extensive flow-on recruitment activity arising from staffing movements. As a result, many of the schools CESE spoke to in 2017 and 2018 had not yet experienced their expected increase in school counselling service time, and this was a source of frustration.

In schools that had experienced an increase in school counselling staff time, interviewees reported reduced waitlists and wait times, more students being supported, better management of crisis incidents, better follow-up and liaison with external services, and sometimes an increase in early intervention initiatives.

One significant change was the introduction of the new school psychologist role to facilitate recruitment of 236 new positions. Interviewees indicated that school counselling service staff and school principals value the complementary skills and experience school psychologists bring to the school counselling service.

Senior Psychologists Education (SPEs) have effectively navigated a time of unprecedented change, supporting several new school counselling service staff in their teams and managing staffing vacancies. The new role of Leader Psychology Practice has provided valuable support for SPEs and has enhanced the service's strategic planning capacity.

The scholarship funding from SSSS has successfully enabled 94 additional teachers to retrain as school counsellors under the existing sponsorship program, and 40 permanent appointments to be made from an additional scholarship program.

At the end of 2019, Learning and Wellbeing confirmed that all 236 positions had been filled (although some of the occupants may be on leave or relieving elsewhere, creating flow-on vacancies).

The findings from this evaluation will be used to inform ongoing policy development and implementation for the school counselling service.

Published in Evaluations

Process evaluation of the Refugee Student Counselling Support Team

Authors: Rebecca Wilkinson, Jessica Fulcher, Rochelle Cox

Evaluator company/business: Centre for Education Statistics and Evaluation

URL or PDF: Process evaluation of the Refugee Student Counselling Support Team

Summary: The Refugee Student Counselling Support Team (RSCST) provides specialised support to NSW public schools that have refugee students enrolled. The process evaluation comprised 43 in-depth interviews with RSCST team members, school-based staff, Refugee Support Leaders and other providers of refugee services; development of four case studies to illustrate good practice; and review of activity data and self-evaluation data collected by the team. The study found that school staff consistently observed that the RSCST's work has led to improvements in refugee students' social and emotional skills, a reduced incidence and intensity of negative behaviours, and an increased readiness to learn. Further, many school staff felt more confident and supported to put into practice the skills and strategies learnt from the RSCST's capacity-building sessions and side-by-side counselling support. The report includes four case studies that showcase how the RSCST supports NSW public schools, including: complex case support and classroom teacher capacity building; the benefits of play therapy; supporting a new regional refugee settlement area; and RSCST's collaborative working relationship with STARTTS and the benefits for schools.

Published in Evaluation repository

Evaluation of the Rural and Remote Education Blueprint - final report (PDF, 2MB)


 

Background

Research shows that students in rural and remote (non‑metropolitan) areas of NSW tend to underperform on major educational indicators when compared to students in metropolitan locations. To address this disparity, the NSW Minister for Education released Rural and Remote Education: A Blueprint for Action in November 2013. The blueprint committed $80 million over four years to implement a broad set of actions in four focus areas: quality early childhood education, great teachers and school leaders, curriculum access for all, and effective partnerships and connections.

 

Evaluation

This final evaluation report examines the implementation and impact of actions contained in the blueprint, using available data up to and including 2017. It also examines important education performance indicators to assess any changes in the magnitude of the gaps between rural and remote students and metropolitan students since the launch of the blueprint.

 

Main findings

This evaluation has found that:
• Gaps in NAPLAN scores and school attendance between rural and remote students and metropolitan students have not reduced since the introduction of the blueprint. The gaps between remote students and metropolitan students have narrowed on Best Start and retention to Year 12. The gaps between provincial students and metropolitan students have not reduced on these measures.
• The 50% rental subsidy introduced at some four-point schools had no meaningful impact on teacher retention.
• Aurora College provides an important opportunity for gifted and talented students. Enrolments have grown and issues related to timetabling are being addressed.
• Education Networks and Networked Specialist Centres (NSCs) have had little impact. At the time of the evaluation, Education Networks had not been used in the more substantial ways originally envisaged, for example to increase community engagement or share budgets. Some NSC facilitators were unsure of their overall effectiveness or were confused about the scope of the role. Since the evaluation, the role of NSCs has been clarified and the department believes they will demonstrate value into the future.
• Enrolments of 4 and 5 year old Aboriginal children in community preschools in rural and remote areas increased by 45% between 2013 and 2017. Enrolments of non-Aboriginal 4 and 5 year old children from low income families increased by 8%.

 

Related reports

The interim monitoring and evaluation report was published in 2016.

 


 

We are committed to providing accessible content for all users and are working towards providing this PDF in a more accessible format. To request an accessible version, please contact us.

Published in Evaluations

Process evaluation of the Refugee Student Counselling Support Team (PDF, 7MB)


The information on this page is also available as a one-page summary (PDF, 109kB)

 


 

Evaluation background

The Refugee Student Counselling Support Team (RSCST) is a small Sydney-based team that provides specialised support to NSW public schools that have refugee students enrolled. Its main work areas include:

  • tailored professional learning
  • targeted counselling in complex cases and additional support for the school counselling service
  • advice and consultation
  • assistance connecting refugee students and their families to other local supports.

CESE conducted a process evaluation which involved:

  • 43 in-depth interviews with RSCST team members, school based staff, internal and external providers of refugee services
  • development of four case studies to illustrate good practice
  • a review of activity data and self-evaluation data collected by the team.

 

Main findings

The RSCST has a well-established service model that has been refined over time since its inception in 2016. The team’s reach has been broad and it has carried out an increasing volume of work in each of its core areas.

Capacity building has been the key priority from the outset and occurs through an array of professional learning workshops and via side-by-side work with school counselling staff. An increasing proportion of the team’s time has been spent providing targeted counselling support for refugee students with complex needs. The team also conducts group support work that is highly valued by schools. The team has established a contact number that is staffed throughout the week for school enquiries and has developed strong local partnerships with internal and external refugee services.

School staff consistently observed that the team’s work has led to improvements in refugee students’ social and emotional skills, a reduced incidence and intensity of negative behaviours, and an increased readiness to learn. They described improvements to the wellbeing of students’ families, stemming from increased trust and confidence in school staff. Further, many school staff felt more confident and supported to put into practice the skills and strategies learnt from the RSCST’s capacity-building sessions and side-by-side counselling support. RSCST staff are particularly valued for their expertise in trauma-informed practice. The team’s collaboration with other refugee services has improved the set of services available to schools and to refugee students.

Recruitment has been a key challenge, and the team has often operated with less than its full complement of eight staff. The nature of the work requires a combination of specialist skills and personal attributes that are not easily found. The team is also working on increasing schools’ awareness of its responsibilities and range of services, and on managing schools’ expectations of support. An ongoing challenge is deciding how to prioritise the team’s limited time most effectively across the state as demand for its services continues to grow.

Published in Evaluations

Connected Communities Strategy - Final evaluation report (PDF, 1.8MB)


One-page summary (PDF, 186kB)

 

Background

The Connected Communities Strategy commenced implementation in schools in 2013. The strategy aims to improve outcomes for students in 15 schools in some of the most complex and vulnerable communities in NSW. The strategy is underpinned by a commitment to ongoing partnership with Aboriginal communities, supporting Aboriginal people to actively influence and fully participate in social, economic and cultural life, consistent with the NSW Government’s plan for Aboriginal affairs, OCHRE (Opportunity, Choice, Healing, Responsibility, Empowerment). Connected Communities is one initiative under OCHRE.

 

Evaluation

This evaluation assesses the implementation and impact of the strategy, and aims to answer the following questions:

1. How well has the model of the strategy been formed and implemented, and what variation exists across schools?

2. What are the outcomes and impact of the Connected Communities Strategy?

The evaluation of Connected Communities commenced in 2014. The focus of this final evaluation report is on the outcomes and impacts of the strategy.

 

Main findings

Overall, Connected Communities is showing promising results. This evaluation has shown that Connected Communities has had a positive impact in schools, particularly in outcomes for students in their early years. Connected Communities represents a sound policy approach that has the potential to provide further positive outcomes for students and communities, given more time.

The strategy appears to be more effective at the primary level than the secondary level. The primary school cohort of students who have been ‘fully exposed’ to Connected Communities for their entire time at school appear to be showing the greatest benefit from the strategy in terms of NAPLAN results, and appear to be more developmentally ready for school than earlier cohorts.

Further time will be required to see if the results in later years improve as this cohort of students continues through its schooling.

Both implementation and outcomes have varied across individual schools. The buy-in of all staff remains key to the successful implementation of Connected Communities, and it is critical that Executive Principals continue to articulate a clear vision of the strategy, ensure staff support, and prioritise high expectations for all students.

 

Related reports

The Connected Communities interim evaluation report, published in 2016, primarily addressed the implementation of the strategy.

Published in Evaluations
