Notes
Article history
The research reported in this issue of the journal was funded by the PHR programme as project number 13/164/06. The contractual start date was in December 2015. The final report began editorial review in December 2019 and was accepted for publication in September 2020. The authors have been wholly responsible for all data collection, analysis and interpretation, and for writing up their work. The PHR editors and production house have tried to ensure the accuracy of the authors’ report and would like to thank the reviewers for their constructive comments on the final report document. However, they do not accept liability for damages or losses arising from material published in this report.
Disclaimer
This report contains transcripts of interviews conducted in the course of the research and contains language that may offend some readers.
Permissions
Copyright statement
© Queen’s Printer and Controller of HMSO 2021. This work was produced by Kidger et al. under the terms of a commissioning contract issued by the Secretary of State for Health and Social Care. This issue may be freely reproduced for the purposes of private research and study and extracts (or indeed, the full report) may be included in professional journals provided that suitable acknowledgement is made and the reproduction is not associated with any form of advertising. Applications for commercial reproduction should be addressed to: NIHR Journals Library, National Institute for Health Research, Evaluation, Trials and Studies Coordinating Centre, Alpha House, University of Southampton Science Park, Southampton SO16 7NS, UK.
Chapter 1 Introduction
Background
Young people’s mental health
Rates of mental disorder among 11- to 15-year-olds in the UK have risen in recent years, particularly emotional disorders among girls, with 13.6% of this age group identified as having one or more disorders. 1 Cohort studies indicate that levels of distress and difficulty in the early teenage years are increasing,2 with 25.7% of teenagers aged 14–15 years, and as many as 37.4% of girls in this age group, reporting high levels of psychological distress. 3 There is also evidence that self-harm is increasing among young people in the early to mid-teenage years, particularly among girls. 2,4 Intervening early to support young people’s mental health is important to avoid the development of more serious and long-term problems,5 as approximately half of all adult mental disorders begin during adolescence. 6
Given that only 25% of children and young people with a diagnosable condition are able to access Child and Adolescent Mental Health Service (CAMHS) support,7 it is vital that non-specialist settings, including schools, where most children and young people spend a great deal of their time, are equipped to engage in early prevention work and to support those at risk of developing a mental health problem. A large number of school-based studies seeking to improve children and young people’s mental health focus on developing individual coping skills or mental health literacy. The evidence for the effectiveness of such programmes is mixed. A recent systematic literature review8 found some short-term effects on the prevention or reduction of anxiety and depression. However, a recent network meta-analysis9 did not find such positive results and concluded that previously observed effects may have been driven by differential control group effects.
As well as attempting to develop young people’s own skills and knowledge regarding mental health maintenance, it is important to consider changes to the school environment to ensure that it is supportive of mental health. Research has shown that teachers are the professionals that teenagers are most likely to talk to about mental health difficulties,10 and also that perceived supportive relationships with school staff can be protective of future depression for students in secondary schools. 11 Conversely, difficult teacher–student relationships among this age group predict mental health problems and future exclusion. 12 Despite the importance of positive relationships with teachers for student mental health, many teachers report a lack of training in how to support students. 13,14 In a questionnaire commissioned by the Department for Education in 2015, 52% of respondents working in schools identified training for staff as an important strategy to support student mental health. 15 The government’s pledge to train one senior lead on mental health in every secondary school in England16 is unlikely to go far enough in terms of strengthening all teachers’ ability to develop supportive relationships with students, and to know how to help those at risk of poor mental health.
Teachers’ mental health
In addition to a lack of training, another barrier to teachers providing appropriate support to students is their own poor mental health and well-being. 17,18 Health and Safety Executive data for Great Britain consistently show teaching professionals to have higher rates of work-related stress, anxiety and depression (recent figures show a rate of 3020 per 100,000 for teaching and education professionals vs. a rate of 1320 per 100,000 for all industries). 19 Causes of these heightened levels of work-related stress and distress have been identified as excessive workload, lack of autonomy, poor salary, perceived lack of status, challenging student behaviour, difficult relationships with colleagues and managers, and pressure to ‘perform’ in a context in which schools are increasingly regulated and judged against an array of externally determined targets. 20–23 These stressors may be exacerbated by a culture in which teachers feel unable or unwilling to ask for help. 3,24 Failure to support teachers experiencing such difficulties may lead to longer-term mental health problems, presenteeism (i.e. going to work when one is mentally or physically ill) and sickness absence or quitting the profession. 25–28 Presenteeism for teachers may result in difficulty managing classroom behaviour effectively and supporting vulnerable students,29,30 both of which may impact on student mental health. 31
Interventions to improve teacher–student relationships
A small number of previous studies have attempted to enhance teachers’ support of students with the aim of improving student outcomes. The INCLUSIVE study32 found that a whole-school approach to strengthening relationships among students and between students and teachers reduced bullying and psychological difficulties among UK secondary school students. The Supporting Teachers and Children in Schools (STARS) trial33 found that the Incredible Years training, which aims to improve school teachers’ classroom management skills and strengthen teacher–student relationships in UK primary schools, led to a short-term (9-month) improvement in student mental health, although this was not sustained. The Saving and Empowering Young Lives in Europe (SEYLE) trial,34 conducted across 10 European countries, compared the impact of three interventions on suicidal ideation and attempts among secondary school students: (1) training school staff to recognise and support at-risk students, (2) delivering a mental health awareness programme to students and (3) screening and referral of at-risk students by professionals. This trial found that only the awareness raising intervention had an impact on suicide ideation and attempts, although the other interventions showed promise, with the confidence intervals (CIs) overlapping across the three. The authors suggested that the smaller effect of the staff training element may have been because poor teacher well-being reduced teachers' ability to support students. 30
Workplace interventions to support mental health
No previous studies have evaluated the impact of improving support for teachers’ own mental health alongside improving their skills in supporting students. A number of reviews have examined the effectiveness of workplace interventions that aim to reduce stress or improve mental health, and report that approaches such as cognitive–behavioural therapy, self-help, exercise and mindfulness interventions have all been found to be effective at reducing workplace depression and other common mental disorders. 35–38 The only systematic review39 to look specifically at interventions to support teachers’ mental health investigated the impact of organisational-level change. Of the four studies included in this review, one looked at changing teachers’ tasks (e.g. establishing flexible work schedules) and reported a resultant reduction in work stress and a small increase in performance. Two studies evaluated coaching support, but found no effects on anxiety, depression or burnout. The final study found that a multicomponent programme, which included performance bonus pay, job promotion opportunities and mentoring, led to reductions in teacher resignations. However, the review authors noted that the quality of the included studies was poor.
Social support has been found to be protective for mental health in the workplace. 40 Bricheno et al.,41 in their review of teacher well-being, identified evidence from questionnaires and case studies that teachers themselves view support from colleagues, senior staff and head teachers as an important factor. Workplace peer support has been identified as helpful because of its contribution towards establishing a supportive culture, its basis in shared experience and understanding of workplace challenges, and its avoidance of the perceived stigma attached to more formal help sources. 42,43 However, the evidence base is limited and has tended to focus on peer education to engender behavioural change (e.g. increasing screening uptake or healthy eating). 44,45 A number of peer support schemes have been set up in occupations that require emotional labour or potentially involve traumatic experiences, such as health care, but these schemes rarely produce published evaluations. One published example of a peer support scheme for health-care workers43 described the following features as important: credibility of peers, immediate availability, voluntary access, confidentiality, emotional first aid (as opposed to therapy) and facilitated access to the next level of support. Social support is conceptualised as comprising problem-focused and emotion-focused supportive strategies, both of which have been found to have a positive impact on physical and mental health. 46 A review42 of peer support schemes suggests that they provide a combination of informational, emotional and instrumental support.
Mental health first aid training in the workplace
Mental health first aid (MHFA) training has been delivered in the workplace as one way of improving support. 47 MHFA is an internationally recognised training package that is designed to improve lay people’s knowledge of the signs and symptoms of developing mental disorders, and their confidence in helping those in mental health crisis. 48 Course attendees are taught to follow five steps when approaching another person who requires support: approach the person, assess and assist with any crisis; listen non-judgementally; give support and information; encourage the person to get appropriate professional help; encourage other supports (ALGEE). A number of studies have provided evidence that MHFA training is effective in improving knowledge, confidence in helping others and intention to help others, as well as reducing stigmatising attitudes. 49 MHFA delivered in the workplace has been found to provide a number of benefits: encouraging social support,50 increasing self-reported help-seeking from professionals47,50 and improving the mental health of course participants themselves. 47 Participants generally express very positive attitudes towards the training, reporting gains in knowledge and skills, and improved ability to provide help to those around them by using that learning. 51–53 Despite high participant acceptability, and evidence of improved mental health literacy and intention to help, no study has yet been able to show an impact of MHFA training on mental health outcomes. 49 However, a shorter training programme focused on improving knowledge of, and communication around, mental health, delivered to senior managers in a large fire and rescue service, did lead to a decrease in absence rates among the staff whom the trainees managed. 54
Mental health first aid training in schools
A version of MHFA, the youth MHFA, has been designed specifically for adults supporting young people, using the same ALGEE model. Although the full version has not been evaluated, a randomised controlled trial (RCT) tested a shortened version delivered to secondary school teachers in Australia. 55 At 6 months’ follow-up, this study reported improvements to teachers’ knowledge and confidence to help students and colleagues, but there were no changes in the actual help that teachers gave to students or in student mental health. However, the intervention did not attempt to improve support for the teachers’ own mental health. We sought to build on this study by addressing teachers’ own support needs, alongside the training in how to support students.
Rationale for the current study
To date, school intervention studies have focused on improving student mental health, with the majority of studies evaluating classroom-based psychological or educational approaches. Very few studies have focused on training teachers to support vulnerable students through their day-to-day interactions, despite the role of teachers in identifying and attending to students’ mental health problems, as demonstrated in the research literature14,56 and recommended in key policy and guidance documents, such as the Department for Education’s Mental Health and Behaviour in Schools. 57 Furthermore, no studies have developed and evaluated an intervention that focuses on teachers’ own mental health, despite evidence that they are a group at risk of poor mental health, and that this is likely to influence the health and attainment of those they teach. We therefore developed an intervention in consultation with teachers and public health specialists working with schools, which addressed these gaps and which we planned to evaluate. The Wellbeing in Secondary Education (WISE) trial was the result of this process.
The Wellbeing in Secondary Education feasibility and pilot trial
In line with Medical Research Council guidance for the evaluation of complex interventions,58 we undertook a pilot cluster RCT of the WISE intervention in six secondary schools to explore its feasibility and acceptability. 59 The intervention comprised (1) 2-day standard MHFA training for school staff who went on to set up a peer support service for colleagues and (2) 2-day youth MHFA training for school staff to improve the support they were able to provide to students. The pilot trial also enabled us to test the feasibility of key aspects of the trial design, including the collection of outcome measures and the recruitment and retention of schools, and to develop a logic model as to the likely pathways by which the intervention might have an effect on the outcomes of interest (Figure 1). Key findings from this pilot were as follows:
-
It was possible to recruit and retain schools from a range of socioeconomic catchment areas, although personal contact with a relevant member of staff was needed to ensure this.
-
Measures of staff and student mental health, and of staff presenteeism and absence, could be collected at baseline and at 12-month follow-up (T1). Data were collected during meetings rather than left to individuals to complete in their own time.
-
Staff viewed MHFA training as relevant and useful in terms of knowledge gain and opportunities to reflect on mental health, although a shorter version of the youth MHFA course that focused on practical strategies was suggested.
-
Staff peer supporters could be recruited and trained in each school. A mean of at least two support interventions per peer supporter over a 2-week period was reported, and the service was viewed positively by staff. However, barriers to using the service were identified, specifically concerns around confidentiality, a preference for talking to pre-established social networks and a lack of knowledge about the service.
-
Unlike the earlier study of youth MHFA in schools,55 we found that, at follow-up, students in the intervention schools had higher well-being [Warwick–Edinburgh Mental Wellbeing Scale (WEMWBS) score 48.1 vs. 45.8; p = 0.12] and lower levels of mental health difficulties [Strengths and Difficulties Questionnaire (SDQ) total difficulties score 11.2 vs. 13.2; p < 0.01] than those in the control schools, once adjusted for baseline measures.
-
Staff who received the MHFA training had better mental health knowledge, higher well-being and lower depression than all other staff, consistent with findings reported in earlier MHFA studies. 49
-
There were no differences between study arms in staff well-being and depression, but the pilot was not powered to detect such changes, nor were we able to examine sustainability, as the study ran for only 12 months.
We concluded that the results regarding feasibility and acceptability of the pilot intervention and evaluation were sufficiently positive to justify a full RCT, powered to measure the effect of the intervention on outcomes, and this is the trial reported here.
The present trial
The WISE trial is a full cluster RCT with integrated process and economic evaluations. 61 It tested the effectiveness of a complex intervention that delivered mental health support for teachers and training in supporting student mental health, with the aim of improving teacher mental health and performance, and subsequently improving student mental health and attainment, through the mechanisms of change identified in the logic model (see Figure 1).
Objectives
The primary objective was to establish if the WISE intervention leads to improved teacher emotional well-being compared with usual practice.
The secondary objective was to answer the following research questions:
-
Does the WISE intervention lead to lower levels of teacher depression, absence and presenteeism, improved student well-being, attendance and attainment, and reduced student mental health difficulties, compared with usual practice?
-
Do any effects of the intervention differ according to school-level characteristics [the proportion of children receiving free school meals (FSM), an indicator of the socioeconomic catchment area, and geographical area] or individual-level characteristics (baseline mental health, sex, ethnicity and FSM eligibility)?
-
What is the cost of the WISE intervention, and is it justified by improvements to staff and student well-being, and reductions to staff depression and student difficulties?
-
Does the WISE intervention work according to the mechanisms of change hypothesised in the logic model?
-
Is the WISE intervention sustainable?
The intervention: underlying theory
The WISE intervention is informed by social support theory. Social support can offer problem-focused coping strategies and emotion-focused supportive strategies, both of which can have a positive impact on physical and mental health. 46 Based on findings from the pilot trial,59 we hypothesised that peer supporters would provide both emotion-focused support (e.g. by listening non-judgementally) and problem-focused support (e.g. by offering practical suggestions for solutions to work-based difficulties), where appropriate. Perceived availability of social support may be even more important to mental health than actual support,46 and, therefore, the existence of a peer-delivered support service was theorised to have a positive impact on teacher well-being, regardless of actual service utilisation. The programme theory was further informed by an ecological view of school connectedness, which considers the quality of social bonds and interactions within a school to be a characteristic of the whole-school environment or culture. 46 Improvement to teachers’ own mental health and well-being via supportive relationships with peers should lead to more positive teacher–student relationships,61 which is associated with improved student mental health. 12 Therefore, all teachers and students within an intervention school may benefit, regardless of whether or not they themselves directly engage with the intervention. Furthermore, a change made from the pilot to the full intervention was that all teachers were offered a shorter version of mental health training, with the aim of improving interactions between all teachers and students, and enabling all teachers to have a better insight into their own mental health (see Figure 1).
The intervention: description of components
Each intervention school received the following.
Staff peer support service
Following the methods developed and found to be acceptable in the pilot study, all teachers in the intervention schools were invited to nominate colleagues whom they considered would make good peer supporters, via a confidential, anonymous written questionnaire completed at baseline before randomisation. The study population was teaching staff only, because pilot findings showed that this was the group most at risk of poor mental health and because of practical difficulties collecting data from all staff. However, the peer support service was available to all staff, as our patient and public involvement work with school staff identified that a service for teachers only may prove divisive. A list was compiled of the 8% of staff with the most nominations, ensuring a mixture of sex, years of experience and teaching/non-teaching roles. The one exclusion criterion was being a member of the senior leadership team, as pilot findings indicated that staff might not use a support service that included senior leaders because of concerns about performance management. Those on the list were invited to attend the 2-day standard MHFA training and become a staff peer supporter. The MHFA training was generally delivered on school premises (one school had the training at a university, and one school shared the training with another school and attended it there). Schools were offered reimbursement for cover costs for teaching staff. A short session about setting up the peer support service and expectations around this was included in the 2-day training, either delivered by the MHFA trainer (England) or by a study team member (Wales). Following the training, participants were provided with guidelines for setting up the peer support service, developed from patient and public involvement work and pilot study findings (see Report Supplementary Material 1). These guidelines covered confidentiality, advertising the service, support for the peer supporters, communication with the research team and practical considerations, such as where support would be provided. An example policy document including a confidentiality statement drawn up by a pilot school and posters for advertising the service were also provided.
Teacher training in mental health first aid for schools and colleges
At least 8% of all teaching staff who had not received the 2-day peer supporter MHFA training received the 1-day MHFA training for schools and colleges. In our pilot study, findings suggested that teachers who have a support role but who have not received prior mental health training, such as tutors or heads of year, would benefit the most. Schools were, therefore, encouraged to select teachers in such roles to attend, but the ultimate decision about who did attend was made by schools. The aim was to train 8% of the teachers, but schools were free to train up to 16 staff, which is the course maximum. The full 2-day youth MHFA course was made available to staff in our pilot schools, but mainstream teachers were often unable to attend because of its length. Partly in response to these findings, the 1-day MHFA for schools and colleges was developed by MHFA England (London, UK). When possible, this training was delivered during in-service training time to ensure that it was as accessible as possible for teachers. Having completed the training, staff were expected to apply the MHFA learning about how to respond to, and support, a student in distress in their day-to-day interactions with students. The decision to train a different 8% (minimum) of staff in each version of the course meant that 16% of staff were trained in total. A previous study using peer influence to address smoking behaviour found an effect when 16% of students were trained as peer educators. 62
Mental health awareness raising session for all teachers
A 1-hour training session, aimed at raising awareness of mental health and introducing self-help strategies, ways to support other people and the peer support service, was delivered either during in-service training or during staff meeting time. This session was co-produced with a MHFA trainer, drew on core MHFA content and was delivered by the same team of MHFA trainers that delivered the rest of the training. All teaching staff were invited to attend, and schools were also free to invite non-teaching staff. This element of the intervention was added in response to pilot findings, which suggested that some staff were unaware of the peer support service in their schools, and to strengthen the whole-school nature of the intervention.
Detailed information about the content of the MHFA courses can be seen in Report Supplementary Material 2, where we provide an outline of the observation schedule (see also Chapter 2, Data sources). Further information about MHFA courses can be found on the MHFA England website (URL: www.mhfaengland.org).
The control
The comparison schools continued with usual practice in terms of teacher support and training. The details of what this entailed in practice were examined via an audit as part of the process evaluation (see Chapter 2) and results are reported in Chapter 4. Schools signed a research agreement, stating that they would not access MHFA training during the study.
Implementation strategy
Different models for implementing the training were deployed in England and Wales because the intervention funder in Wales (Public Health Wales, Cardiff, UK) wished to make use of existing resources and infrastructure to ensure sustainability if the intervention was found to be effective.
In intervention schools in England, the standard MHFA, the MHFA for schools and colleges, and the 1-hour awareness training were delivered by local independent trainers who had attended MHFA instructor training and received certification. Learning from the pilot indicated that trainers needed to have awareness of the school context and challenges faced by teachers, and two of the three trainers had extensive experience of delivering training to schools. In Wales, seven healthy schools co-ordinators (HSCs) attended a 6-day bespoke MHFA instructor training course, received certification and went on to deliver the three strands of training in the intervention schools. HSCs are employed by local authorities or Public Health Wales and are responsible for monitoring and accrediting schools participating in the Welsh Network of Healthy Schools Scheme. HSCs are experienced in delivering training to schools; however, because they were new to MHFA training, they delivered the training in pairs as per MHFA England protocol.
Originally, the training was intended to be delivered within a single academic term (September–December 2016). However, a small number of schools completed the training in January 2017 because of their staff training schedule, and one school did not receive the 1-day MHFA for schools and colleges training until June 2017 because a change of leadership created a delay. The order in which the training components were delivered was not prespecified, in order to accommodate the constraints and needs of each school. All intervention schools received all three training components.
For the establishment of the peer support services, peer supporters were asked to meet within 3–4 weeks of completing the standard MHFA training course to plan the logistics of running the service, and to develop a confidentiality policy and advertising strategy.
Summary of the remaining document
In the following chapter, we outline the methods used for the main statistical analysis, the economic evaluation and the process evaluation. In Chapter 3 we present the main outcomes and economic evaluation findings. Chapter 4 presents the process evaluation findings, and we discuss the findings and implications in Chapter 5.
Chapter 2 Methods and analyses
Overall study design
This was a cluster RCT with an embedded economic and mixed-methods process evaluation. The unit of recruitment and randomisation was the school, as this was an intervention that operated across the whole school.
The trial was registered with the International Standard Randomised Controlled Trial Number register (reference ISRCTN95909211). We published the protocol on 18 October 2016. 63 Some small changes have been made to the protocol since it was originally submitted; these are summarised in Table 1. The full statistical analysis plan is available at URL: www.bristol.ac.uk/population-health-sciences/projects/wise/publications/ (accessed 5 November 2020).
Change to protocol | Date |
---|---|
We were not able to recruit two schools per FSM stratum as planned; therefore, in both England and Wales, we had to merge strata. Further details on how the final sample fitted the FSM strata are included in Chapter 3 | May 2016 |
We recruited one school more than our original sample size target (in case of drop-out) | June 2016 |
Three schools at baseline and two at both follow-up time points were unable to secure meeting time for the team to administer the teacher questionnaires. These were, therefore, completed in teachers’ own time in these schools | June 2016 |
We changed our safety procedure and participant information to say that if a teacher writes anything that suggests that their life is in danger, the study team will use the list linking participant names to study IDs to identify that individual and write to them at their school advising them to seek help, but that we would not inform anyone at their school about this | July 2016 |
We originally planned to hold feedback meetings with the peer supporters once a term, but to avoid this feedback becoming part of the intervention (which would not be sustainable for any future roll out), we changed this to two feedback meetings with just one or two peer supporters each time to minimise the influence on behaviour | December 2016 |
In one intervention school, the 1-day training was not delivered until the summer term because of difficulties securing a date for this | June 2017 |
Data entry staff were not blinded to study arm as the questionnaires at T1 and at T2 were different | July 2017 |
The statistician and health economist were not blinded at the point of conducting the analysis because of team error in providing unblinded data with regard to study arm | July 2018 |
In addition to asking peer supporters to recruit individuals who had used the service for interviews, we also included a slip of paper in the teacher T2 questionnaires asking for volunteers. In the end, we did not manage to conduct these interviews as we only had one response | July 2018 |
Study population
The population from which the schools in our sample were recruited was secondary schools in two geographical areas: (1) Bristol and surrounds up to a 30-mile radius, comprising 64 mainstream secondary schools and (2) South East and South Central Wales, comprising 88 mainstream secondary schools. Exclusion criteria were as follows:
-
fee-paying schools
-
special schools (e.g. for those with learning disabilities)
-
pupil referral units
-
schools that had been WISE pilot schools
-
schools participating in other, similarly intensive, research studies
-
schools already delivering MHFA or other mental health training
-
schools without available FSM data for stratification purposes
-
schools within the same academy chain and local authority as one that had already been selected for participation.
In total, we aimed to recruit 24 schools (i.e. 12 in each area).
Our study population within each school was all teaching staff and all students in year 8 (aged 12–13 years) at baseline. At 24-month follow-up (T2), the same students were year 10 (aged 14–15 years). We assumed that any effect on this year group would be the same for all students, as the intervention was intended to be at the whole-school level.
Recruitment and randomisation
In both England and Wales, the sampling frames were stratified by the proportion of children in each school who were eligible to receive FSM to ensure that schools from a wide range of socioeconomic catchment areas were included in the study. In Wales, the sample was divided into two administrative regions (educational consortia), South East and South Central Wales, and schools were stratified into three FSM levels (high, medium or low relative to the national average). Two schools were randomly selected from each stratum in each region, and selected schools were approached via a relevant senior leader (e.g. a deputy head in charge of pastoral care) and invited to participate in the study. Schools that declined were replaced by a randomly selected school from the same stratum and region. In England, the study was advertised to head teachers at all eligible schools and invitations were followed up with relevant senior leaders, identified with the help of local public health teams that worked with the schools. Schools that expressed interest in participation were stratified into three levels according to FSM (high, medium or low relative to the national average) and local authority (Bristol/non-Bristol). When more than two schools fitted into one stratum, two were randomly selected.
Within each study area and stratum, selected schools were randomly allocated to a study arm. The allocation sequence was generated by the study statistician, who was blinded to the schools’ identities, using the ralloc command in Stata (version 14, StataCorp LP, College Station, TX, USA). Random allocation to study arm took place after baseline data collection.
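For illustration only, the following sketch shows how a stratified allocation of this kind could be generated from first principles in Stata; it is not the ralloc call used in the trial, and the data set structure, variable names (school_id, area, fsm_stratum) and seed are assumptions.

```stata
* Illustrative sketch of stratified cluster allocation (not the -ralloc-
* syntax used in the trial). Assumes one row per selected school with
* variables school_id, area and fsm_stratum; the seed is hypothetical.
set seed 20160601
gen double u = runiform()
* With two schools per area-by-FSM stratum, order each pair at random and
* allocate the first to the intervention arm and the second to control.
bysort area fsm_stratum (u): gen byte arm = (_n == 1)
label define armlbl 0 "Control" 1 "Intervention"
label values arm armlbl
```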
Main outcomes
Primary outcome
The primary outcome was teacher well-being, measured using the Warwick–Edinburgh Mental Wellbeing Scale (WEMWBS). 62 The WEMWBS is comprehensive (incorporating elements of both subjective and psychological well-being), short enough to be used in population-level questionnaires, responsive to change and validated among community samples of adults in the UK. [Licenses to use the WEMWBS are available at https://warwick.ac.uk/fac/sci/med/research/platform/wemwbs/using/ (accessed 11 February 2021).] We selected well-being rather than a measure of mental health difficulty as the primary outcome because it is a broad enough measure to capture changes among staff with a range of mental health needs, including those who have good well-being, but may still benefit from cultural changes at the whole-school level.
The WEMWBS scores can range from 14 to 70, where a higher score indicates better well-being. This outcome was measured at baseline, T1 and T2.
Secondary outcomes
Teacher depression was measured using the Patient Health Questionnaire-8 items (PHQ-8). 64 The PHQ-8 is suitable for measuring levels of depressive symptoms in population-based studies and is short enough to be used in self-complete questionnaires. It was analysed as a continuous variable, but also as an ordinal variable (0–4 indicating no depressive symptoms, 5–9 indicating mild symptoms, 10–14 indicating moderate symptoms, 15–19 indicating moderately severe symptoms and 20–24 indicating severe symptoms) and a binary variable, with a cut-off point of ≥ 10 indicating depression. 64 These cut-off points were established for the Patient Health Questionnaire-9 items, but are the same for the PHQ-8, with the only difference between the two scales being an additional question about self-harm behaviour, which does not contribute to the overall depression score.
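As an illustration of the categorisations described above, a minimal Stata sketch is given below; the variable name phq8_total is an assumption.

```stata
* Sketch only: derive the ordinal and binary PHQ-8 measures from the total
* score (0-24); the variable name phq8_total is assumed.
gen byte phq8_cat = .
replace phq8_cat = 0 if inrange(phq8_total, 0, 4)    // no depressive symptoms
replace phq8_cat = 1 if inrange(phq8_total, 5, 9)    // mild symptoms
replace phq8_cat = 2 if inrange(phq8_total, 10, 14)  // moderate symptoms
replace phq8_cat = 3 if inrange(phq8_total, 15, 19)  // moderately severe symptoms
replace phq8_cat = 4 if inrange(phq8_total, 20, 24)  // severe symptoms
gen byte phq8_depressed = phq8_total >= 10 if !missing(phq8_total)
```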
Teacher absence was measured by self-report (i.e. the ‘number of days missed from school because of health problems in the past 4 working weeks’) and routine data collected by the schools (i.e. the total number of days of absence over the previous year). Both measures were treated as continuous, but the self-report measure was also treated as binary (any vs. no absence) and ordinal. Based on the distribution of responses in the pilot study, in which the majority of respondents had taken very little absence, categories were 0 days, 1 day, 2–7 days or ≥ 7 days.
Teacher presenteeism was measured by self-report, using the relevant item from the Work Productivity and Activity Impairment questionnaire65 and adapted to fit teachers’ work schedule. Teachers were asked ‘during the last 4 working weeks, how much did health problems affect your productivity while you were working?’ and selected a number between 0 (no effect) and 10 (completely prevented me from working). This measure was treated as continuous, but also as binary (no effect vs. a score of 1–10 over the previous 4 working weeks) and ordinal. Based on the pilot responses to presenteeism, in which roughly one-third of participants scored 0 and a majority scored < 6, the categories were scores of 0, 1–5 or 6–10.
Teacher retirements and leaving for other reasons were collected as total number reported by the schools for the previous year.
Year 8 student well-being was measured using the WEMWBS, which has been validated for use among teenagers from age 13 years. 66 This measure was treated as continuous.
Year 8 student psychological distress was measured using the Strengths and Difficulties Questionnaire (SDQ), a widely used measure among this age group, with well-established norms for a UK population. 67 A total difficulties score was calculated (this can range from 0 to 40, with a higher score indicating greater difficulties) and treated as continuous.
Student attendance was collected as a total proportion for each school from routine data sources for England68 and Wales. 69
Student attainment at the end of year 11 was collected from the same routine data sources as student attendance. 68,69 Exams at the end of statutory school age (key stage 4) are graded and reported differently in the two countries. For English schools, we reported the percentage of students achieving grade 5 or above in both English and Maths General Certificates of Secondary Education (GCSEs), and for Welsh schools we reported the percentage of students achieving grade C or above in English/Welsh, Maths and Science GCSEs.
The baseline questionnaires are included in Appendices 1 and 2.
Outcome data collection process
Teacher self-report outcome questionnaires were completed during staff meetings at baseline (May–July 2016), T1 (May–July 2017) and T2 (May–July 2018). Those who were absent from the data collection session were offered the option of completion via an online survey. Data on teacher absence, retirement and leaving for other reasons were collected via routine data recorded by the schools.
Student mental health outcomes were gathered via self-report questionnaires at baseline and T2, completed by all students in year 8 (aged 12–13 years at baseline), during tutor groups or lesson time. Where fewer than five students were absent, questionnaires were left with the school for those students. Where five or more students were absent, a second data collection visit was arranged with the school. School-level student absence and attainment were collected using government statistics published online.
Teacher routine outcomes (school-reported absence and leavers data) and all student outcomes were measured at baseline and T2 only. This was for practical reasons, and also because the intervention’s logic model implied that teacher mental health would need to improve before changes to student mental health were maximised.
The study team planned to attend all data collections, introduce the questionnaires, answer any questions and take them away at the end. However, three schools at baseline did not manage to secure meeting time to conduct the data collections. In these schools, the team briefly introduced the questionnaires in a meeting, but left them with staff along with envelopes in which questionnaires could be sealed and then either posted or returned via a box left at the school. Two of these schools followed the same process at both T1 and T2, despite attempts by the study team to organise meeting time.
Compliance and loss to follow-up
We encouraged the schools to set the dates for data collection sessions far in advance to try and ensure maximum attendance. We built in time to follow up those who were absent, either in person or by leaving questionnaires in named envelopes, which the research team collected once completed. We also made an online version of the questionnaire available to teachers who missed the data collection session. We incentivised schools to remain part of the study by providing them with their own school-level data from each time point at the end of the study. We also offered control schools a payment at the end of the study, which they could choose to spend on the intervention if it was found to be effective. This payment was contingent on a response rate of at least 80% at follow-up among teachers and students.
Sample size
The study was powered to detect a mean change of 3 points on the WEMWBS (i.e. the primary outcome measure). This difference was chosen as the minimum meaningful change discussed in a WEMWBS user guide. 70
A change of 3 points was also close to the difference in mean WEMWBS scores between staff in the highest- and lowest-ranked schools in the pilot study.
It was estimated that each school would have approximately 60 teachers and 150 year 8 students, depending on size. The sample size took account of clustering. In the pilot data, the teacher WEMWBS intracluster correlation coefficient (ICC) at baseline was 0.01 (95% CI 0.00 to 0.03) and at follow-up it was 0.00 (95% CI 0.0 to 0.02). We assumed a mean of 50 (83%) teachers followed up per school (with a coefficient of variation of sample size of 0.5) and a standard deviation (SD) for WEMWBS of 8.4 (based on the pilot data). A sample size of 24 schools (i.e. 12 intervention schools and 12 control schools) would achieve 83% power for an ICC of 0.05. This would rise to 98% if the ICC is 0.02, and fall to 65% for an ICC of 0.08. Sample sizes were calculated using the clustersampsi command in Stata. For the tests for interactions (see Main outcomes analysis), we would have 80% power to detect an interaction that is twice as big as the main effect, and 30% power to detect an interaction that is the same size as the main effect. 71
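As a rough cross-check of the reported calculation (which used the user-written clustersampsi command), the standard design-effect arithmetic for unequal cluster sizes can be reproduced directly. The sketch below is based on the stated assumptions (50 teachers followed up per school, coefficient of variation 0.5, SD 8.4, ICC 0.05); the simple normal-approximation power it returns is slightly higher than the reported 83%.

```stata
* Back-of-envelope cross-check of the sample size calculation (a sketch,
* not the -clustersampsi- call used in the study).
local m    = 50                                   // mean teachers followed up per school
local cv   = 0.5                                  // coefficient of variation of cluster size
local icc  = 0.05                                 // assumed ICC
local de   = 1 + ((`cv'^2 + 1)*`m' - 1)*`icc'     // design effect, approx. 4.08
local neff = round(12*`m'/`de')                   // effective sample size per arm, approx. 147
display "Design effect = " `de' ", effective n per arm = " `neff'
* Power for a 3-point WEMWBS difference with SD 8.4 at this effective n
power twomeans 0 3, sd(8.4) n1(`neff') n2(`neff')
```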
Blinding and breaking of blind
Allocation to study arm took place after baseline data had been collected to ensure blinding among all parties during this first data collection. It was not possible for the schools, teachers or students to be blind to intervention status (although students were likely to be unaware of the intervention). The research assistants/associates leading outcome data collection also collected the process data, which prevented blinding among the study team.
Clustering
All statistical analyses took account of clustering by school using robust standard errors and, where appropriate, by using random-effects models with schools as a random effect. Analyses were conducted using Stata.
Main outcomes analysis
Primary analysis
The primary analysis was carried out under the intention-to-treat principle, analysing participants as randomised without imputation of missing data. Repeated-measures (random-effects) models were used to examine the pattern of change in the primary outcome across baseline, T1 and T2, adjusted for the stratification variables, sex and years of experience. The model included a random effect for individual participants and another for school, and included every teacher who had at least one measure of the outcome (i.e. at baseline, T1 or T2). Because a maximum likelihood estimator was used, this analysis is valid under the assumption that data are missing at random (MAR). 72,73 Results are presented in Chapter 3 as the mean difference in the primary outcome between the trial arms over the follow-up period, with associated 95% CIs and p-values.
The primary analysis was repeated with a treatment-by-time interaction term added to the model. This allowed estimation of the treatment effect at T1 and T2 separately.
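A minimal sketch of this model specification in Stata is given below; all variable names (wemwbs, arm, timepoint, fsm_stratum, area, sex, years_teaching, school, teacher_id) are assumptions, and the data are assumed to be in long format with one row per teacher per time point.

```stata
* Sketch of the repeated-measures primary analysis (assumed variable names;
* long-format data with one row per teacher per time point).
mixed wemwbs i.arm i.timepoint i.fsm_stratum i.area i.sex c.years_teaching ///
    || school: || teacher_id: , mle

* With an arm-by-time interaction, to estimate the treatment effect at T1 and T2 separately
mixed wemwbs i.arm##i.timepoint i.fsm_stratum i.area i.sex c.years_teaching ///
    || school: || teacher_id: , mle
```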
Secondary analyses
Analysis of secondary outcomes included linear, ordinal and logistic regression models, dependent on the nature of the outcome variable being analysed (continuous, ordinal or binary, respectively).
For secondary individual-level outcomes measured at baseline, T1 and T2 (e.g. the PHQ-8), repeated-measures models were used; these models included random effects for clustering by individual (because of the repeated measures) and by school. All individuals with at least one observation of the outcome measure were included in the model for that outcome, using maximum likelihood under a MAR assumption. For each secondary individual-level outcome measured at baseline, T1 and T2, three models are presented:
-
Unadjusted model (model 1): repeated measures of the outcome regressed on treatment arm, accounting for clustering because of repeated measures and by school (using random effects).
-
Partially adjusted model (model 2): model 1 plus adjustment for stratification variables.
-
Fully adjusted model (model 3): model 2 plus additional adjustment for the covariates sex and years of teaching experience.
For each secondary individual-level outcome measured at baseline and T2 only, three models are presented:
-
Unadjusted model (model 1): outcome at T2 regressed on treatment arm and baseline value of the outcome, accounting for clustering by school (using a random effect).
-
Partially adjusted model (model 2): model 1 plus adjustment for stratification variables.
-
Fully adjusted model (model 3): model 2 plus additional adjustment for sex and ethnicity as covariates.
For each secondary school-level outcome measured at baseline and T2 only, two models are presented:
-
Unadjusted model (model 1): outcome at T2 regressed on treatment arm and baseline value of the outcome (no need for any adjustment for clustering).
-
Adjusted model (model 2): model 1 plus adjustment for stratification variables.
Missing data
We assessed the impact of missing data and non-response on teacher WEMWBS and PHQ-8 outcomes and student WEMWBS and SDQ outcomes using multiple imputation. The multiple imputation model73 included all outcomes at baseline, T1 and T2, variables included in the primary analysis model and baseline characteristics associated with missingness. Analyses were repeated on the imputed data sets and results were combined using Rubin’s rules. We also used sensitivity analyses to examine the impact of potential missing not at random (MNAR) data by incorporating a scaling parameter that was allowed to differ between arms (i.e. allowing the missing data mechanism to differ between arms). 74 This was carried out by imputing under the MAR assumption and then multiplying imputed values of the outcome by a scaling parameter. For example, a scaling parameter of 0.95 in the control arm and 0.90 in the intervention arm implies that missing WEMWBS scores in the control arm are assumed to be 5% lower than estimated under a MAR assumption, whereas scores in the intervention arm are 10% lower.
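The sketch below illustrates how this kind of delta-type MNAR adjustment could be coded in Stata; it is not the exact code used, and the variable names, number of imputations, seed and scaling factors shown are assumptions.

```stata
* Sketch of the MNAR sensitivity analysis (not the exact code used).
* Assumes teacher-level data in wide format, with a missingness indicator
* for the T2 outcome created before imputation.
gen byte miss_t2 = missing(wemwbs_t2)

mi set flong
mi register imputed wemwbs_t1 wemwbs_t2
mi register regular arm fsm_stratum area sex wemwbs_baseline miss_t2
mi impute chained (regress) wemwbs_t1 wemwbs_t2 ///
    = wemwbs_baseline i.arm i.fsm_stratum i.area i.sex, add(20) rseed(12345)

* Scale imputed T2 values by an arm-specific factor (rows with _mi_m > 0
* hold the imputations in flong style): e.g. 5% lower than MAR in the
* control arm and 10% lower in the intervention arm.
replace wemwbs_t2 = wemwbs_t2*0.95 if _mi_m > 0 & miss_t2 & arm == 0
replace wemwbs_t2 = wemwbs_t2*0.90 if _mi_m > 0 & miss_t2 & arm == 1

* Re-run the analysis model on the adjusted imputations (shown here in a
* simplified ANCOVA form) and combine results using Rubin's rules.
mi estimate: mixed wemwbs_t2 i.arm c.wemwbs_baseline i.fsm_stratum i.area ///
    || school:
```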
Sensitivity analyses
We re-ran the primary analysis without identified outliers (defined as a WEMWBS score > 3 SDs from the mean) to assess the effect of outliers on the main outcome.
Subgroup analysis
We used a complier-average causal effect (CACE) approach (using instrumental variable analysis)75 to examine the impact of MHFA training on follow-up WEMWBS and PHQ-8 scores, comparing those who completed the 1- or 2-day training in the intervention schools with those in the control schools who would have completed the training had they been offered it. For each participant, we calculated a summary score of the two follow-up measures of each outcome and used this summary score to account for the repeated measurements (thereby reducing the two measures to a single measure). If only a single measure was available, then we used just that measure. We used robust standard errors to account for clustering at the school level.
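A minimal sketch of this instrumental-variable formulation in Stata is given below; the variable names are assumptions, with randomised arm acting as the instrument for actual receipt of the MHFA training.

```stata
* Sketch of the CACE analysis (assumed variable names): the summary outcome
* averages the available follow-up measures, and randomised arm instruments
* actual receipt of MHFA training.
egen wemwbs_summary = rowmean(wemwbs_t1 wemwbs_t2)   // uses the single measure if only one is present
ivregress 2sls wemwbs_summary (mhfa_trained = i.arm) ///
    i.fsm_stratum i.area, vce(cluster school)
```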
Tests for interactions
The effect of the intervention on teacher WEMWBS and PHQ-8 score was tested for interaction with baseline well-being/depression score (grouped as above or below the bottom quartile of the WEMWBS, or with a score of ≥ 10 on the PHQ-8), geographical area (Wales/England), school-level FSM and sex. The effect of the intervention on student WEMWBS and SDQ score was tested for interaction with baseline well-being/SDQ score (grouped as above or below the bottom quartile of the WEMWBS and a score of ≥ 20 on the SDQ), geographical area, school- and individual-level FSM, sex and ethnicity. p-values were interpreted with caution because of the low power and number of interactions being tested (we used Bonferroni corrected or permutation p-values).
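For illustration, one such interaction test might be coded as in the sketch below (assumed variable names); the resulting interaction p-value would then be compared against a Bonferroni-adjusted threshold or a permutation distribution, as described above.

```stata
* Sketch of a single interaction test (assumed variable names): arm-by-sex
* interaction on teacher WEMWBS, using the repeated-measures model form.
mixed wemwbs i.arm##i.sex i.timepoint i.fsm_stratum i.area c.years_teaching ///
    || school: || teacher_id: , mle
testparm i.arm#i.sex    // joint test of the interaction terms
```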
Exploratory analysis: mechanisms of change
To explore the study’s logic model and hypothesised mechanisms of change, we collected data in the teachers’ questionnaires regarding stress and satisfaction at work, support given/received at school, school’s perceived attitude to staff and student well-being, and perceived quality of relationships in school. These questions were asked with a series of Likert responses (see Appendix 2 for full wording and response options). Logistic regression models were used to compare binary measures of these variables between arms at follow-up adjusted for baseline scores, school-level FSM and geographical area. We examined whether or not baseline measures of these variables, which provide indicators of school psychosocial context, moderate the effect of the intervention by including interaction terms between these baseline variables and intervention arm in the analysis model.
Appropriate mixed models were applied as for the primary analysis, but with these measures additionally included as covariates. We assessed the degree to which the estimated treatment effect was attenuated compared with the primary analysis model (substantial attenuation would indicate that the proposed mechanisms of change are indeed acting as such).
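The sketch below illustrates, with assumed variable names and an assumed example measure, the two steps described in this subsection: a between-arm comparison of a binary logic-model measure, and the attenuation check in which candidate mediators are added to the primary model.

```stata
* Sketch of the mechanism analyses (assumed variable names).
* (1) Between-arm comparison of a binary logic-model measure at follow-up,
*     adjusted for its baseline value, school-level FSM and area.
melogit support_at_work_t2 i.arm i.support_at_work_t0 i.fsm_stratum i.area ///
    || school:

* (2) Attenuation check: re-fit the primary model with candidate mediators
*     as covariates and compare the arm coefficient with the primary analysis.
mixed wemwbs i.arm i.timepoint i.fsm_stratum i.area i.sex c.years_teaching ///
    c.work_stress i.support_at_work || school: || teacher_id: , mle
```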
Exploratory analysis: level of intervention
Data from the process evaluation [i.e. training participant evaluation forms, peer supporter feedback meetings, peer supporter logs of support and follow-up questionnaires (see Process evaluation for more detail)] were used to divide intervention schools into ‘low’ or ‘high’ implementation groups. The factors used for this analysis, each measured as a binary variable, were as follows.
Dosage
-
At least 8% of teachers completed the 1-day MHFA for schools and colleges training (vs. < 8% of teachers).
-
At least 8% of staff in the school completed the 2-day standard MHFA training and went on to become a peer supporter (vs. < 8% of staff).
-
At least 8% of staff in the school were still acting as peer supporters at T2 (vs. < 8% of staff).
Reach
-
At least 75% of teachers attended the 1-hour awareness training (vs. < 75% of teachers).
-
A higher than study-mean proportion of staff selecting ‘staff peer supporter’ in response to the follow-up questionnaire question ‘if a work-related problem was making you stressed or down who would you talk to about it at school?’ (vs. at or below the study mean).
Fidelity
-
100% of course attendees indicated that all topics were covered in the 1-day MHFA for schools and colleges training and 2-day standard MHFA training (vs. < 100%).
-
The peer supporters set up a confidentiality policy for the service (vs. no policy).
-
The peer support service was advertised in three or more ways initially (vs. advertised in two or fewer ways).
-
The peer support service was re-advertised at the beginning of the 2017–18 academic year (vs. not re-advertised).
-
The peer support service had been championed by a member of the senior leadership team (vs. not championed).
Correlation between the implementation measures was poor, so we were unable to derive a single composite binary ‘low/high implementer’ measure from them. Instead, we scored each school on each factor and split the schools based on their total scores, with the six highest-scoring schools classed as high implementers and the six lowest as low implementers. We then compared primary outcomes among intervention schools that were high implementers with those that were low implementers. We repeated this analysis with implementation as a continuous variable. Adjustment was made for both school- and individual-level covariates, as in the primary analysis.
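For illustration, the scoring and split described above might be coded as in the sketch below, using a school-level data set with one row per intervention school; all variable names are assumptions.

```stata
* Sketch of the implementation scoring (assumed variable names; one row per
* intervention school with the ten binary dosage/reach/fidelity indicators).
egen impl_score = rowtotal(dose_1day dose_2day dose_still_t2 ///
    reach_awareness reach_named_peer fid_topics fid_policy ///
    fid_adverts fid_readvert fid_champion)
sort impl_score
gen byte high_implementer = _n > 6   // six highest-scoring schools classed as high implementers
* impl_score itself can also be carried forward as the continuous implementation measure
```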
Economic evaluation
Measurements
The economic analysis took a public sector perspective, calculating the financial and opportunity costs to schools of participating in the WISE intervention. We collected self-reported information on the impact of teachers’ health on absenteeism and presenteeism during the previous 4 weeks at T1 and T2. However, in common with other school-based interventions,76 we elected not to track teacher health-care use at T1 or T2.
Data were collected on the resources used for the following four activities: (1) the HSC MHFA instructor training (which was relevant for Welsh schools only); (2) the 2-day standard MHFA training; (3) the in-service day MHFA training for schools and colleges; and (4) the awareness raising session. In each case, the project team completed a pro forma after the training session, documenting the resources used and expenses claimed.
As noted in Chapter 1, in Wales, Public Health Wales paid for a course to train seven HSCs as MHFA instructors who could then deliver the MHFA training to teachers in Welsh schools. The course included 6 days of training with a MHFA trainer and 2 additional days of independent preparation for the HSC trainees. The costs comprised HSCs’ time (estimated based on salary costs), actual travel expenses incurred, MHFA course fees, MHFA trainer expenses and course refreshments. The venue was provided at no cost by Cardiff University (Cardiff, UK).
The resources used during the 2-day standard MHFA training comprised MHFA trainer fees, trainer travel expenses (in English schools this was sometimes included in the trainer fee), MHFA manuals, refreshments (in most schools) and the time of the teachers being trained as peer supporters (a minimum of five teachers and a maximum of 16 teachers). There were no venue costs for training, as this happened either on the school site or at a university. Several schools provided estimates of the cost of hiring supply teachers to cover the teaching commitments of staff attending the MHFA training. We used these estimates to impute the opportunity cost of staff time at all schools.
The resources used for the 1-day MHFA schools and colleges training comprised MHFA trainer fees, trainer travel expenses (in English schools this was sometimes included in the trainer fee), MHFA manuals and the time of the teachers attending training (a minimum of seven teachers and a maximum of 18 teachers). As training took place during in-service days, no supply teachers were necessary and staff costs were estimated based on anonymised salary information provided by the schools. When salary information was not provided by the school (n = 1), these costs were imputed based on the number of attendees and the mean salaries of attendees from other schools.
The resources used for the 1-hour awareness raising session for all teachers comprised the trainer fees and travel costs, and the staff costs for the teachers and support staff attending. For the English schools, staff costs were estimated from the number of staff attending (a minimum of 31 staff and a maximum of 85 staff), with salaries estimated from salary band information provided by the schools. The mean school staff costs in England were used to impute the staff costs in Welsh schools, where data on staff attendance were not available.
All resource use was valued in monetary terms using published UK unit costs (Table 2) or participant valuations estimated at the time of analysis (2016/17). Support staff salaries were obtained by asking schools/individuals for this information. Teachers’ salaries were estimated using national pay scales after confirming participants’ pay bands.
Description | Unit cost | Source |
---|---|---|
HSCs MHFA instructor training (Wales only) | ||
MHFA course fee | £11,632 | Expenses recorded |
Trainer’s costs | £79.43 per day plus subsistence and parking | Expenses recorded |
HSC time | £18.40–20.47 per hour | Estimate based on HSC salaries |
Travel/parking | Variable | Trainer and trainee expense forms |
Catering | £58.80 per day | Based on expenses recorded |
2-day peer supporter MHFA training | ||
HSC time (Wales only) | £18.40 per hour | Estimate based on HSC salaries |
MHFA manuals (Wales only) | £18–20 per manual | Total cost based on number of peer supporters trained |
MHFA trainer fees including manuals (England only) | £420–650 per day | Based on expenses recorded |
Catering | £60 per day | Based on expenses recorded |
Travel/parking | Variable | Trainer and trainee expense forms |
Teaching cover | Variable | Based on school reports of expenses paid to supply teachers |
In-service day MHFA for schools and colleges training | ||
HSC time (Wales only) | £18.40 per hour | Estimate based on HSC salaries |
MHFA manuals (Wales only) | £18–20 per manual | Total cost based on number of peer supporters trained |
MHFA trainer fees including manuals (England only) | £600–750 per day | Based on expenses recorded |
Travel/parking | Variable | Trainer and trainee expense forms |
Awareness raising session | ||
HSC time (Wales only) | £18.40 per hour | Estimate based on HSC salaries |
MHFA trainer fees (England only) | £260–350 per session | Based on expenses recorded |
Travel/parking | Variable | Trainer expense forms |
Cost–consequence analysis
The economic evaluation is a cost–consequence study that provides evidence on whether or not the incremental costs of the intervention are justified by improved teacher or student outcomes. The potential benefits of the intervention are multifaceted and affect multiple agents (e.g. staff, students and schools), even those not directly observed in the study (e.g. students in other years). Therefore, a cost–consequence framework that tabulates costs against several outcomes for staff and students is appropriate, rather than a more reductive cost-effectiveness or cost–utility analysis that attempts to summarise efficiency in a single ratio. 77 It provides a more complete analysis for policy-makers than a ‘cost per unit improvement in WEMWBS score’, which has no easy interpretation. Previous research has demonstrated limited correlation and overlap between the WEMWBS well-being scores and health-related utility scores (such as the EuroQol-5 Dimensions, three-level version) used to estimate quality-adjusted life-years. 78 As quality-adjusted life-years are unlikely to be sensitive to small changes in well-being, we did not include the EuroQol-5 Dimensions or other preference-based health-related quality-of-life measures.
Subgroup analysis
Subgroup analysis is reported by country to investigate any differences in findings between England and Wales. This is important, given that the costs and, potentially, the outcomes will differ because of the different use of MHFA trainers in England and Wales.
Process evaluation
Aims
A mixed-method process evaluation was integrated into the outcome and economic evaluations,79 following a ‘nested’ design. 80 The primary aim of the process evaluation was to support the interpretation of the outcome data and further refine the programme theory. 81 The process evaluation had five central objectives, which, for the purpose of the study, were categorised as domains of inquiry. These were as follows.
Implementation
The aim was to assess the implementation of the intervention, using a multicomponent framework.
Implementation is assessed separately for the MHFA for schools and colleges and standard MHFA training courses provided for schools, and the delivery of the peer support services. For the training, the aim was to assess (1) reach (i.e. the number who attended), (2) dose (i.e. the completion of the training), (3) fidelity to the course and adaptations undertaken, (4) variations in fidelity between schools in England and Wales, and (5) quality of training delivery. For delivery of the peer support service, the aim was to assess (1) reach (i.e. the proportion of teachers who make use of the peer support service), (2) fidelity to the peer support service model and adaptations undertaken, and (3) contextual characteristics that determined barriers to and facilitators of implementation.
Mechanisms of change
The aim was to explore if the intervention’s hypothesised mechanisms of change were activated as intended, and the extent to which these mechanisms were modified through their interaction with contextual characteristics. Postulated mechanisms, refined through the feasibility and pilot phase of intervention evaluation, are depicted in the logic model (see Figure 1). We also considered unintended consequences and, in particular, unanticipated harms and the pathways by which these may have occurred. 82
Programme differentiation and contamination
The aim was to understand usual practice to assess programme differentiation and ascertain if contamination had occurred in control schools.
Sustainability of the intervention
The aim was to assess the extent to which the intervention is sustainable and its scope to become routinised as part of usual practice outside a trial setting.
Acceptability
The aim was to explore participants’ perceptions of and interactions with the intervention components, and how these differed across school contexts and over the course of delivery. The purpose of exploring intervention acceptability was to understand whether, and in what ways, its acceptability to participants influenced the operationalisation of the mechanisms of change. Therefore, it is explored as an embedded theme throughout the process evaluation results in Chapter 4, rather than as a discrete theme.
Research questions
Table 3 lists the research questions that the process evaluation addressed and outlines the domain of inquiry to which each one was aligned.
RQ | Process evaluation domain |
---|---|
RQ1. Are the intervention’s mechanisms of change operationalised as hypothesised? RQ2. How is the operationalisation of the mechanisms of change influenced by contextual factors? RQ3. Does the interaction of the mechanisms of change with contextual factors give rise to unintended effects? | Mechanisms of change |
RQ4. Is the WISE intervention differentiable from ‘usual practice’ and does this differentiation change during the study? RQ5. Is there contamination of usual practice in control schools by receipt of the WISE intervention or similar approaches? | Programme differentiation and usual practice |
RQ6. What is the reach of the WISE training components (e.g. percentage of staff attending standard MHFA training)? RQ7. How many targeted staff complete the WISE intervention training (dose)? RQ8. Are the WISE training components delivered with fidelity and what is the nature of any modifications undertaken? RQ9. Are there differences in the delivery of the WISE training components between England and Wales, and what gives rise to any differences? RQ10. How well are the WISE training components delivered? | Implementation (WISE training components) |
RQ11. What proportion of teachers receive support from the peer support service? RQ12. Is the peer support service delivered with fidelity and what is the nature of any adaptations undertaken? RQ13. What are the barriers to and facilitators of the implementation of the peer support service? | Implementation (peer support service) |
RQ14. Is the WISE intervention acceptable to funding organisations, intervention trainers, head teachers, teachers and students? | Acceptability |
RQ15. How likely is the WISE intervention to be sustainable and what factors might ensure sustainability? | Sustainability |
Sample
Four intervention schools and four control schools were purposively sampled to serve as ‘case study schools’, in which more extensive process evaluation was undertaken (e.g. focus groups and observations). To sample the case study schools, all study schools were stratified by trial arm allocation (intervention vs. control), site (England vs. Wales), administrative region (educational consortia vs. local authority) and proportion of students eligible for FSM (high vs. low relative to the national average). One intervention school within each stratum was then purposively sampled to achieve variation across the cases in school size and assessment ranking by the educational inspectorate. One control school was selected within each stratum to match the intervention cases as closely as possible. Table 4 presents the final sample of case study schools.
School | Trial status | Site | Administrative region | FSM eligibility | School size | Inspectorate assessment |
---|---|---|---|---|---|---|
1 | Intervention | England | 1 | Low | Large | Good |
2 | Intervention | England | 2 | High | Small | Requires improvement |
3 | Intervention | Wales | 3 | High | Small | Adequate |
4 | Intervention | Wales | 4 | Low | Large | Good with outstanding features |
5 | Control | England | 1 | Low | Large | Requires improvement |
6 | Control | England | 2 | High | Small | Good |
7 | Control | Wales | 3 | High | Small | Adequate |
8 | Control | Wales | 4 | Low | Large | Good |
Data sources
The process evaluation adopted a mixed-methods approach and used both quantitative and qualitative data sources. The different types of data collected from the study schools, and how they relate to the research questions, are summarised in Report Supplementary Material 3.
Teacher and student questionnaires
The teacher and student questionnaires that were designed to collect the main outcome data also contained questions relating to the mechanisms of change domain of inquiry. As already noted in Main outcomes, the teacher questionnaires asked about stress and satisfaction at work, quality of relationships between teachers and students and between staff, whether or not the school cares about teacher well-being and how much individuals had given support to colleagues over the previous year. The questionnaire also asked about perceptions of how much their school cared about student well-being and the extent to which they had supported students in the previous year. Follow-up questionnaires in the intervention group only also examined intervention reach (i.e. the numbers who had completed MHFA training and who had received support from the peer supporters) as part of the implementation domain of inquiry. The student questionnaires asked about perceptions of the extent to which the school and teachers cared about students, how often individuals had asked a teacher for help over the previous year and whether or not this had been useful.
Audit of school policies and interventions
Participating schools were audited at baseline and T2 (see Report Supplementary Material 4 for a copy of the audit questions). The main contact teacher for each school reported on existing policies and interventions within the school in relation to the mental health and well-being of teachers and students. The baseline audit allowed for comparison of pre-existing activities in intervention and control schools to explore possible baseline imbalances and to understand the general context in which the intervention was being delivered. The T2 audit captured relevant policies or interventions that had been introduced during the study period. It therefore enabled exploration of programme differentiation from usual practice in intervention schools and assessment of contamination in the control schools by identifying any new activities that were similar to components of the WISE intervention. An additional question was included in the follow-up audits that asked about any key events that had occurred during the study that were likely to have an impact on mental health and well-being. This, again, enabled consideration of the impact of the context on any intervention effects.
Attendance records for the Wellbeing in Secondary Education intervention training
The MHFA course instructors completed an attendance record for each training course, which enabled an assessment of reach of the training. Instructors also conducted a head count of staff attending the 1-hour awareness raising session; however, some schools opened this session to all teaching and non-teaching staff, and, therefore, it was not possible to calculate the proportion of teaching staff that attended these sessions.
Observations of the Wellbeing in Secondary Education intervention training
In the four intervention case study schools, two members of the research team observed the 1-hour awareness raising session, the 2-day MHFA training course for peer supporters and the 1-day MHFA for schools and colleges course. Standardised observation schedules (see Report Supplementary Material 2 for an example of a template) were completed to quantitatively assess coverage of materials (yes/no for each topic), quality of delivery and participant engagement (5-point Likert scales). Observers also qualitatively documented any information relevant to understanding the quantitative assessment and other issues of importance not covered by the quantitative scale, including course adaptations and general contextual observations. This generated data relating to the implementation and acceptability domains of inquiry.
Post-mental health first aid training course fidelity checklist
In all 12 intervention schools, participants completed a checklist recording whether or not key topics were covered (yes/no), and rating the instructor’s knowledge and skills (5-point Likert scale). An example checklist is included in Report Supplementary Material 5. This enabled further consideration of the training implementation.
Training evaluation forms
Attendees of all MHFA training in the 12 intervention schools completed standardised MHFA evaluation forms. These forms recorded self-assessed knowledge and confidence to support others before and after the course, and views on course quality (5-point Likert scale).
Attendees of the 1-hour awareness raising session completed a study-specific evaluation form (see Report Supplementary Material 6) which asked them to rate how much their knowledge had improved (no impact, a small increase, a large increase) in relation to six topics and how highly they rated the instructor (5-point Likert scale). These data illuminated the mechanisms of change domain of inquiry.
Peer supporter logs and feedback sessions
The peer supporters completed a termly electronic log that documented delivery of support to colleagues during the previous 2 working weeks (see Report Supplementary Material 7). To mitigate the risk of seasonal bias (e.g. stress associated with end of term examinations), peer support logs were issued at different times during the academic term. The log assessed the reach (i.e. how many colleagues received support), the role of the staff members supported, the type of problem addressed (e.g. work or personal life), time of day and length of support episode, and the outcome of the interaction. The logs also invited peer supporters to provide any information about difficulties they had experienced as a peer supporter to allow monitoring of any harms. The log was used to record when peer supporters had left the school.
In addition, one or two peer supporters in all 12 intervention schools were invited to a feedback session approximately 6 months and 18 months after their training had been completed. The purpose of the feedback sessions was to assess the extent to which the peer support service guidelines had been adhered to, again shedding light on the implementation domain of inquiry. The sessions took the form of structured interviews (see Appendix 3 for the questions asked), with responses audio-recorded.
Interviews with mental health first aid instructors
As noted above, the 2-day, 1-day and 1-hour training sessions were delivered by independent MHFA instructors to English schools, and by HSCs employed by Public Health Wales or local authorities to Welsh schools. Semistructured interviews were conducted with the three MHFA instructors in England and a subgroup of three of the seven HSCs in Wales. The HSCs were purposively sampled to ensure that a representative from each of the six intervention schools was interviewed (HSCs delivered the intervention in pairs). Topic guides explored experiences of training delivery, including barriers to and facilitators of delivering a successful session, the extent to which instructors adhered to the course and motivations for any adaptations undertaken. In addition, the perceived acceptability of the training to course participants was also explored.
Interviews with head teachers
Semistructured interviews were conducted with the head teachers from all 25 schools. Topic guides explored the perceived role of schools in addressing the mental health and well-being of staff and students, the acceptability of usual practice, motivations to take part in the study, and the barriers to and facilitators of supporting teacher and student mental health. These data contributed to understanding of the acceptability and sustainability of the intervention.
Interviews with peer support service users
In the four intervention case study schools, we had planned to conduct semistructured interviews with teachers who had utilised the peer support service. However, despite trying a number of methods to recruit volunteers (e.g. the peer supporters asking colleagues they had helped, and reply slips placed in the teacher questionnaires), only one teacher volunteered, and so this part of the data collection was not completed.
Focus groups with peer supporters and recipients of mental health first aid for schools and colleges training
In the four intervention case study schools, focus groups were conducted with peer supporters and with teachers who had attended the MHFA for schools and colleges training. A focus group was conducted with each group within 6 months of training delivery, and a second focus group was conducted with each group approximately 18 months after training delivery. For the selection of participants for the focus groups, convenience sampling was used (i.e. all trainees in the case study schools were invited to attend, and the make-up of each group was determined by availability). Focus groups comprised four to eight participants.
Focus groups with peer supporters explored the acceptability of MHFA training and attendees’ preparedness to become peer supporters, the acceptability of delivering support, barriers to and facilitators of implementation of the peer support service, fidelity to the guidelines and reasons for adaptations undertaken, and sustainability of the intervention. Focus groups with 1-day MHFA for schools and colleges trainees covered the acceptability of and potential improvements to the training, impact of the training on how participants interacted with and supported students, and sustainability of the intervention. At the second time point, 18 months after training delivery, one-to-one interviews were conducted in place of a focus group in one school because of logistical difficulties in organising a focus group, but the same questions were explored.
Focus groups with teachers not in receipt of the mental health first aid training
We planned to conduct one focus group in each of the four intervention and four control case study schools with teachers who had not received any MHFA training, approximately 18 months after training delivery. Participants were selected by random sampling (i.e. all teachers in the school were stratified according to sex and department, and participants were randomly selected from each stratum using random number generation software). If someone declined to participate, a replacement was randomly selected from the same stratum until eight participants had been confirmed. Each focus group comprised six to eight teachers. In one intervention school, a one-to-one interview was held, as a focus group was too difficult to organise. In the intervention schools, focus groups explored participants’ views of the peer support service (including comparison with ‘usual care’), perceived barriers to and facilitators of uptake, and potential service sustainability. In the control schools, focus groups explored evidence of contamination, perceptions of ‘usual practice’ and views on a hypothetical peer support service.
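The report does not name the random number generation software used; the sketch below, using numpy and a hypothetical staff list, illustrates the stratified random draw described above.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(seed=20)

# Hypothetical staff list for one school, to be stratified by sex and department.
staff = pd.DataFrame({
    "teacher_id": range(1, 41),
    "sex": rng.choice(["female", "male"], size=40),
    "department": rng.choice(["English", "Maths", "Science", "Humanities"], size=40),
})

# One random draw per sex-by-department stratum; in the study, anyone who
# declined was replaced by a fresh random draw from the same stratum until
# eight participants had been confirmed.
selected = staff.groupby(["sex", "department"]).sample(n=1, random_state=7)
print(selected[["teacher_id", "sex", "department"]])
```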
Focus groups with year 10 students
In all eight case study schools, one focus group was held with a group of year 10 students (aged 14–15 years), approximately 18 months after training delivery. For selection of students, lists of all students in year 10 split by sex were used. Six names were selected using random number generation (three from each sex). Those students were invited to participate and were each asked to invite a friend to participate. As the target number for each focus group was six to eight, we did not need all students to consent to participate or to bring a friend. This approach was used to help ensure that participants felt comfortable talking during the focus groups. Each focus group comprised 3–12 students (although we had aimed for six to eight attendees, actual numbers were determined by who attended on the day and for whom parental consent had been obtained). In all case study schools, the student focus groups followed a topic guide that explored participants’ views of teacher–student relationships and provision of mental health and well-being support in their schools. Given that the intervention was directed towards staff, students in the intervention schools were unlikely to have known about the intervention, and, therefore, were not asked directly about it. The data generated contributed to understanding of whether or not the mechanisms of change had been activated in the intervention schools and whether or not there was any indication of contamination in control schools.
Interviews with funding organisation representatives
Semistructured interviews were conducted with a representative from each funding organisation that contributed to intervention costs (n = 3). Interviews assessed fit with existing organisational priorities and the feasibility of sustained resource allocation to the WISE intervention, if it was found to be effective.
All topic guides are included in Appendix 4.
Analysis
Process evaluation data were analysed independently of the trial outcome data. Findings were then integrated following the completion of outcome data analysis so that the process evaluation could inform the interpretation of these data, specifically by explaining how the mechanisms of change worked (or did not work) in context and how the intervention had been implemented.
Qualitative analysis of the process data
All interviews and focus groups were audio-recorded, and recordings were transcribed and uploaded to NVivo version 11 (QSR International, Warrington, UK) for data storage and management. We followed Braun and Clarke’s83 thematic analysis approach in which emergent themes (codes) are identified and organised into an analytic framework (a coding frame). Separate frameworks for each data set were initially constructed (e.g. one for peer supporter focus groups, one for service user interviews, one for head teacher interviews, etc.). For each data set, two members of the research team independently coded two transcripts and then discussed and agreed the coding frame. A priori codes that mapped onto the process evaluation domains were included in the initial coding frameworks, along with novel codes that emerged from the data. One of the team then analysed the remaining transcripts within that data set according to the coding framework, with refinements being made to the frameworks as new themes emerged. A second team member coded 10% of the remaining transcripts to ensure intercoder reliability. All discrepancies and refinements to the coding framework were discussed between the two coders until an agreed framework was reached.
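The report describes discussion-based resolution of coding discrepancies rather than a formal agreement statistic. Purely as an illustration of one way agreement on double-coded segments could be quantified, Cohen's kappa between two coders might be computed as below; the codes shown are hypothetical.

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical code assignments by two coders for the same transcript segments.
coder_1 = ["stigma", "workload", "workload", "support", "stigma", "support"]
coder_2 = ["stigma", "workload", "support", "support", "stigma", "workload"]

kappa = cohen_kappa_score(coder_1, coder_2)
print(f"Cohen's kappa across double-coded segments: {kappa:.2f}")
```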
Once the frameworks were complete, common and contrasting themes across the different data sets were examined. The themes pertinent to questions of reach, contamination/differentiation, fidelity, acceptability, mechanisms of change and sustainability were drawn together and used to construct an interpretation of the study’s impact on the main outcomes. Integration and triangulation of data adopted a complementary approach, whereby all participant narratives were equally privileged in the generation of new theoretical and empirical insights. 83
Quantitative analysis of the process data
Descriptive statistics (totals, means and SDs) were produced for all checklists, evaluation forms, records of materials used, peer supporter logbooks and feedback reports, audits and observational data. Teacher questionnaire data regarding use and views of the peer support service and other support services were also summarised descriptively. Where appropriate, logistic or linear regression models were fitted to examine differences between groups.
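As a hedged sketch of these summaries, assuming a tidy teacher-level table with illustrative variable names (not the study dataset), the descriptive and regression steps might look as follows.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Illustrative process data: one row per teacher.
process = pd.DataFrame({
    "arm": ["intervention"] * 4 + ["control"] * 4,
    "used_peer_support": [1, 0, 1, 0, 0, 0, 1, 0],
    "rating": [4, 5, 3, 4, 3, 4, 2, 3],
})

# Descriptive summary (totals, means, SDs) by trial arm.
print(process.groupby("arm")["rating"].agg(["count", "mean", "std"]))

# Logistic regression comparing peer support use between arms.
logit = smf.logit("used_peer_support ~ C(arm)", data=process).fit(disp=False)
print(logit.summary())
```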
Patient and public involvement
During the development of the intervention, senior leaders at schools and public health professionals working with schools were consulted about the intervention design and plans for evaluation. Findings from the pilot trial were discussed at a workshop with school teachers and leaders and, as a result of these discussions, changes were made to the peer supporter guidelines, training delivery was built into in-service time and the target population was changed from all staff to teachers. In the present trial, two senior leaders from local schools were part of the advisory group, and an employee of Education Support (London, UK) was part of the Trial Steering Committee. We consulted the Centre for the Development and Evaluation of Complex Interventions for Public Health Improvement’s (DECIPHer’s) young people’s advisory group twice,84 once to obtain comments on the student questionnaire and topic guide and a second time to discuss implications of the findings from the perspective of young people. We also conducted two workshops at the end of the study, one in England and one in Wales, with representatives from participating schools, local public health teams, Public Health Wales and third-sector organisations to share the findings and discuss implications for practice and future research.
Ethics considerations
Ethics approval for the study was granted by the University of Bristol’s Faculty of Health Sciences Research Ethics Committee (reference 28522).
Informed consent
The head teachers at all participating schools were given an information leaflet and had up to 1 month to confirm participation. At this point, they signed an agreement consenting to the school’s participation (or another senior leader signed with the head teacher’s consent), confirming that they would comply with all intervention and data collection requirements. All teacher and student participants were given information sheets at least 2 weeks before each questionnaire data collection, outlining the right to opt out of participation, the purpose of the study, any potential benefits or harms in taking part, details about data storage and details about what the data would be used for. The information sheets also contained contact information for the study team. Letters containing the same information were posted or e-mailed to parents by schools (in accordance with their usual procedures for communicating with parents) at least 1 week before student data collection, accompanied by opt-out forms. Any student whose parent returned the opt-out form did not take part in the questionnaire data collection. Consent was assumed from participants who completed a questionnaire. Written consent to participate was obtained from all staff peer supporters and from all participants who agreed to take part in an interview or focus group. We attempted to obtain parental written consent for students invited to take part in a focus group, but, where this was not returned, the school contact obtained verbal parental consent over the telephone.
Safety reporting procedures
School contacts and those delivering the intervention (i.e. MHFA trainers, HSCs and those trained as peer supporters) were asked to contact the study team within 2 working days if any untoward incident or adverse event (AE) occurred to a student or member of staff (1) as a direct result of taking part in the WISE study or (2) as a consequence of changes in the school environment resulting from participation in the WISE trial (e.g. heightened awareness among staff of mental health problems, leading to inappropriate referrals to specialist help sources for ‘normal’ student behaviour). In these cases, study-specific AE/incident forms were completed, recording information on the event. Members of the research teams in Bristol and Cardiff were also required to complete a form about any incidents or AEs that they encountered during data collections. All AE/incident report forms were discussed with the principal investigator to assess seriousness and to explore causality. All AEs deemed to be ‘serious’ [i.e. serious adverse events (SAEs)] were to be reported to the sponsor within 24 hours. All AEs and SAEs not deemed to be related to the study were reported to the Trial Steering Committee at the next scheduled meeting. There were no SAEs reported that were suspected to be related to the intervention.
All student questionnaires were checked on return to the university for any indication of serious harm. If a student had written something on their questionnaire indicating that they were at risk of serious physical or emotional harm, the principal investigator (Bristol) or study manager (Cardiff) was to be informed and they would contact the child protection officer at the school to report the concern. No such issues arose, possibly because the questionnaire did not directly ask about things indicating risk of serious harm.
If a teacher wrote something on their questionnaire indicating that their life may be in danger (e.g. ‘I would like to bring an end to it all’), the plan was for the study team to break anonymity and write to the individual at their school, in an envelope marked confidential, expressing concern and recommending seeking help. The teacher participant information sheets made clear the circumstances in which we would break anonymity and the action that would be taken. No such action was needed during the course of the study.
Chapter 3 Main trial results and economic evaluation
Main trial results
Final sample of schools
We recruited one more school than was needed, to allow for school dropout. The final total was, therefore, 25 schools (12 intervention schools and 13 control schools). Not enough schools could be recruited within each FSM stratum at each geographical site, partly because of the distribution of schools by FSM status in each area. We therefore merged strata for recruitment and randomisation, where necessary, to maintain balance across the study overall. In England, the lower two strata were merged and, in Wales, the top two strata were merged.
The final distribution of recruited schools by geographical area and FSM strata is shown in Table 5.
FSM eligibility | Geographical area total (England), n | Geographical area total (Wales), n | Intervention (England), n | Intervention (Wales), n | Control (England), n | Control (Wales), n | Total in the sample, n |
---|---|---|---|---|---|---|---|
Low | 23 | 35 | 3 | 1 | 4 | 1 | 9 |
Medium | 28 | 24 | 2 | 2 | 2 | 2 | 8 |
High | 12 | 29 | 1 | 3 | 1 | 3 | 8 |
Total | 63a | 88 | 6 | 6 | 7 | 6 | 25 |
Participant response rates
All 25 schools remained in the trial until the end and participated in all three data collections. Overall, teacher and student response rates at each time point are shown in Table 6.
Participant | Baseline response rate, % (range) | T1 response rate, % (range) | T2 response rate, % (range) |
---|---|---|---|
Teachers | 87.1 (41.3–100) | 85.0 (45.0–100) | 83.3 (65.8–100) |
Students | 94.8 (79.1–100) | – | 88.1 (63.1–100) |
Participant flow through the trial is shown in the Consolidated Standards of Reporting Trials (CONSORT) flow diagram (Figure 2). As can be seen in Figure 2, teacher response rates were slightly lower in intervention schools at both follow-ups than in control schools. At each time point, a proportion of teachers and students were not eligible for inclusion and, therefore, were not given a questionnaire to complete (e.g. those who were on maternity leave, had left the school, had parental consent withdrawn or were on long-term absence, such that they would not be in school during the intervention period). Reasons for non-completion at each time point were absence from all data collection sessions and returning a blank questionnaire (which those not consenting to participate were asked to do).
The total number of teachers included in the primary analysis was 1722. This was well above the mean of 50 teachers per school assumed within the power calculation. The total number of teachers in any secondary individual-level analysis ranged from 1533 to 1739. The total number of students included in the analysis with well-being as the outcome was 2700. The total number of students included in any individual-level analysis ranged from 2700 to 2703.
Baseline characteristics
Baseline teacher demographic data (Table 7) were similar between study arms, with the majority being white female participants. This reflects the wider teaching workforce. Figures for 2018 in England showed that 64.3% of full-time state-funded secondary school teachers were female and 82.6% were white. 85 Figures for 2019 in Wales showed that 75.5% of all teachers were female and 90.5% were white. 86 Participating teachers had much poorer well-being than the general working population, whose mean score is 51.4. 62 Similarly, participant median scores on the PHQ-8 suggest a higher level of depressive symptoms than population prevalences reported in other studies. 65,87 Around one-third of participants self-reported experiencing a previous mental health problem, and the vast majority had not received previous mental health training. Most participants reported good relationships between teachers and students and between staff at their school, and just under half had provided support to a colleague more than once a month in the previous academic year.
Variable (teacher) | Control | Intervention | ||||
---|---|---|---|---|---|---|
n | Mean/median/% | SD/IQR | n | Mean/median/% | SD/IQR | |
Male, % | 221 | 36 | 205 | 37 | ||
WEMWBS mean score | 619 | 46.8 | 8.2 | 559 | 46.8 | 8.6 |
PHQ-8 median score | 615 | 5 | 3–9 | 556 | 5 | 3–9 |
Presenteeism median score | 508 | 1 | 0–3 | 461 | 1 | 0–3 |
Median days absent in past 4 weeks | 617 | 0 | 0–0 | 554 | 0 | 0–0 |
Ever absent in the previous 4 weeks, % | 516 | 16 | 471 | 15 | ||
Previous mental health problem, % | 223 | 37 | 189 | 34 | ||
Previous mental health training, % | ||||||
No | 597 | 96 | 546 | 98 | ||
Yes | 11 | 2 | 7 | 1 | ||
Cannot remember | 14 | 2 | 7 | 1 | ||
Job satisfaction, % | ||||||
Very satisfied | 70 | 11 | 73 | 13 | ||
Satisfied | 278 | 44 | 243 | 44 | ||
A little dissatisfied | 172 | 28 | 157 | 28 | ||
Dissatisfied | 84 | 14 | 66 | 12 | ||
Highly dissatisfied | 18 | 3 | 19 | 3 | ||
Job stressfulness, % | ||||||
Not at all stressful | 4 | 1 | 11 | 2 | ||
Mildly stressful | 96 | 15 | 93 | 17 | ||
Moderately stressful | 259 | 42 | 232 | 42 | ||
Very stressful | 215 | 35 | 185 | 33 | ||
Extremely stressful | 48 | 8 | 36 | 6 | ||
Experience with school environment (years), % | ||||||
< 1 | 41 | 7 | 24 | 4 | ||
1–2 | 42 | 7 | 30 | 5 | ||
3–5 | 79 | 13 | 68 | 12 | ||
6–10 | 133 | 22 | 122 | 22 | ||
> 10 | 319 | 52 | 315 | 56 | ||
Ethnicity, % | ||||||
White | 591 | 97 | 539 | 97 | ||
Mixed | 9 | 1 | 8 | 1 | ||
Asian or Asian British | 7 | 1 | 2 | < 1 | ||
Black or black British | 2 | < 1 | 5 | 1 | ||
Chinese or other | 0 | 0 | 1 | < 1 | ||
Provided support to a colleague, % | ||||||
Never | 39 | 6 | 39 | 7 | ||
Less than once a month | 289 | 47 | 251 | 45 | ||
More than once a month | 262 | 42 | 246 | 44 | ||
Every day | 26 | 4 | 23 | 4 | ||
Good relationship between students and teachers, % | ||||||
Strongly disagree | 2 | < 1 | 2 | < 1 | ||
Disagree | 14 | 2 | 17 | 3 | ||
Agree | 394 | 64 | 344 | 62 | ||
Strongly agree | 208 | 34 | 194 | 34 | ||
Good relationship between staff, % | ||||||
Strongly disagree | 6 | 1 | 3 | 1 | ||
Disagree | 42 | 7 | 40 | 7 | ||
Agree | 397 | 64 | 371 | 67 | ||
Strongly agree | 173 | 28 | 142 | 26 |
Baseline student characteristics (Table 8) were well balanced between arms, although the mean total difficulties score was slightly higher among control school students. Students were predominantly white, reflecting the ethnic composition of the areas in which the study was conducted. The mean well-being score was lower than the population mean WEMWBS score of 51.7, reported for 11- to 19-year-olds in England in 2017. 1 This difference may reflect poorer well-being among our particular geographic sample, but it may also be partly due to the narrower age range in the current study. The baseline SDQ scores also indicate higher total psychological difficulties than UK norms for 11- to 15-year-olds. 88 Again, this may indicate poorer mental health in this particular sample or it may reflect wider population changes since these norms were established in 2004. The majority of students at baseline reported good relationships between teachers and students in their school, and just under half had never gone to a teacher for help in the previous academic year.
Variable (student) | Control | Intervention | ||||
---|---|---|---|---|---|---|
n | Mean/% | SD | n | Mean/% | SD | |
Male, % | 918 | 47 | 884 | 49 | ||
SDQ mean score | 1753 | 12.7 | 6.1 | 1631 | 12.0 | 6.0 |
Student WEMWBS mean score | 1746 | 47.0 | 9.4 | 1629 | 47.5 | 9.0 |
In past year, regularity of going to a teacher for help with social/personal problem, % | ||||||
Never | 593 | 34 | 505 | 31 | ||
Once or twice | 621 | 36 | 624 | 38 | ||
Once a term | 67 | 4 | 41 | 3 | ||
Once a month | 40 | 2 | 34 | 2 | ||
More than once a month | 104 | 6 | 105 | 6 | ||
I have not had any problems | 318 | 18 | 321 | 20 | ||
Regularity of wanting to ask for help from teacher but unable to in last year, % | ||||||
Never | 580 | 43 | 553 | 45 | ||
Once or twice | 419 | 31 | 389 | 32 | ||
Once a term | 57 | 4 | 54 | 4 | ||
Once a month | 38 | 3 | 34 | 3 | ||
More than once a month | 100 | 7 | 82 | 7 | ||
I have not had any problems | 144 | 11 | 115 | 9 | ||
Good relationship between students and teachers (%) | ||||||
Strongly disagree | 77 | 4 | 62 | 4 | ||
Disagree | 440 | 26 | 345 | 21 | ||
Agree | 1053 | 61 | 1067 | 66 | ||
Strongly agree | 154 | 9 | 142 | 9 | ||
Ethnicity, % | ||||||
White | 1668 | 85 | 1448 | 81 | ||
Mixed | 147 | 8 | 147 | 8 | ||
Asian or Asian British | 69 | 4 | 91 | 5 | ||
Black or black British | 45 | 2 | 78 | 4 | ||
Chinese or other | 24 | 1 | 23 | 1 |
Schools were also well balanced across arms on most baseline characteristics (Table 9). Intervention schools had a considerably higher median percentage of teachers leaving for reasons other than retirement during the year leading up to the baseline measures, and a higher percentage of schools in this group had student attainment at or above the national average.
School-level variable | Control | Intervention | ||||
---|---|---|---|---|---|---|
n | Mean/median/% | SD/IQR | n | Mean/median/% | SD/IQR | |
Teacher–student ratio, median | 13 | 0.33 | 0.31–0.36 | 12 | 0.35 | 0.31–0.39 |
Teachers retired, median % | 12 | 1.7 | 0.7–3.2 | 9 | 0 | 0–1.8 |
Teachers left for other reasons, median % | 12 | 14.6 | 7.3–21.4 | 9 | 21.6 | 11.7–25.0 |
Number of teachers in the whole school, mean | 13 | 60.1 | 33–95 | 12 | 60.0 | 27–101 |
FSM tertile, n (%) | ||||||
Low | 13 | 3 (23) | 12 | 6 (50) | ||
Middle | 6 (46) | 2 (17) | ||||
High | 4 (31) | 4 (33) | ||||
Number of students in the whole school, mean | 13 | 834.3 | 484–1203 | 12 | 907.2 | 389–1584 |
Teacher absence, median % | 12 | 6.1 | 3.7–12.8 | 8 | 4.0 | 3.5–10.1 |
Student attendance, median % | 13 | 93.6 | 93.4–94.8 | 12 | 93.7 | 93.2–94.8 |
Student attainment (N schools ≥ average), n (%) | 13 | 4 (30) | 12 | 5 (42) |
Primary analysis
Table 10 shows the teacher WEMWBS scores by arm at each follow-up time point and the results for the primary analysis. There was no evidence of a difference between arms in mean teacher mental well-being (WEMWBS score) over the course of follow-up (adjusted mean difference –0.90, 95% CI –2.07 to 0.27; p-value 0.130).
Variable (teacher) | Control | Intervention | Adjusteda difference in means (95% CI)b (n = 1722) | p-value | ||||
---|---|---|---|---|---|---|---|---|
n | Mean | SD | n | Mean | SD | |||
WEMWBS score (baseline) | 619 | 46.8 | 8.2 | 559 | 46.8 | 8.6 | –0.90 (–2.07 to 0.27) | 0.130 |
WEMWBS score (T1) | 634 | 48.1 | 8.7 | 556 | 47.4 | 9.4 | ||
WEMWBS score (T2) | 607 | 48.4 | 8.4 | 505 | 47.5 | 8.6 |
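The adjusted difference in Table 10 comes from the trial’s pre-specified repeated-measures multilevel model, the full specification of which is set out in the statistical analysis plan rather than reproduced here. A minimal sketch of a model of that general form, with a school-level random intercept, baseline adjustment and simulated data, is shown below; it is illustrative only.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 400
teachers = pd.DataFrame({
    "school": rng.integers(1, 26, size=n),
    "arm": rng.integers(0, 2, size=n),            # 0 = control, 1 = intervention
    "baseline_wemwbs": rng.normal(47, 8, size=n),
    "timepoint": rng.choice(["T1", "T2"], size=n),
})
teachers["wemwbs"] = 0.5 * teachers["baseline_wemwbs"] + rng.normal(24, 6, size=n)

# School-level random intercept; the study model also accounted for repeated
# measures on the same teacher and for the stratification variables.
model = smf.mixedlm(
    "wemwbs ~ arm + baseline_wemwbs + C(timepoint)",
    data=teachers,
    groups=teachers["school"],
).fit()
print(model.summary())  # the 'arm' coefficient is the adjusted mean difference
```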
Secondary analyses
Primary outcome
The primary analysis was repeated with a time by allocation interaction term included in the analysis model. There was no evidence that treatment effect varied over time (p-value for interaction term 0.654).
Secondary outcomes
Table 11 shows the teacher outcomes at each follow-up by arm.
Variable (teacher) | Control | Intervention | ||||
---|---|---|---|---|---|---|
n | Median/% | IQR | n | Median/% | IQR | |
PHQ-8 median score (baseline) | 615 | 5 | 3–9 | 556 | 5 | 3–9 |
PHQ-8 median score (T1) | 631 | 5 | 2–9 | 550 | 5 | 2–9 |
PHQ-8 median score (T2) | 605 | 5 | 2–8 | 497 | 5 | 2–9 |
Median days absent past 4 weeks (baseline) | 617 | 0 | 0–0 | 554 | 0 | 0–0 |
Median days absent past 4 weeks (T1) | 631 | 0 | 0–0 | 547 | 0 | 0–0 |
Median days absent past 4 weeks (T2) | 604 | 0 | 0–0 | 506 | 0 | 0–0 |
Presenteeism median score (baseline) | 508 | 1 | 0–3 | 461 | 1 | 0–3 |
Presenteeism median score (T1) | 526 | 1 | 0–3 | 432 | 1 | 0–3 |
Presenteeism median score (T2) | 495 | 1 | 0–3 | 427 | 1 | 0–3 |
Absence (ever/never) (baseline), % | 516 | 16.0 | 471 | 15.0 | ||
Absence (ever/never) (T1), % | 631 | 15.4 | 547 | 19.9 | ||
Absence (ever/never) (T2), % | 604 | 17.4 | 506 | 14.8 | ||
Teacher absence (school level) (baseline), median % | 12 | 6.1 | 3.7–12.8 | 8 | 4.0 | 3.5–10.1 |
Teacher absence (school level) (T2), median % | 9 | 4.8 | 3.4–7.6 | 6 | 7.9 | 2.2–9.9 |
Teachers retired (school level) (baseline), median % | 12 | 1.7 | 0.7–3.2 | 9 | 0 | 0–1.8 |
Teachers retired (school level) (T2), median % | 6 | 0.7 | 0.0–2.2 | 2 | 8.4 | 3.6–13.2 |
Teachers left for other reasons (school level) (baseline), median % | 12 | 14.6 | 7.3–21.4 | 9 | 21.6 | 11.7–25.0 |
Teachers left for other reasons (school level) (T2), median % | 6 | 14.0 | 10.4–20.0 | 2 | 43.9 | 3.6–84.2 |
There was no evidence of a difference between intervention and control groups in teacher depression (PHQ-8) or presenteeism over the course of follow-up when these outcomes were treated as continuous, binary or categorical outcomes. There was weak evidence of a difference between groups in self-reported absenteeism over the course of follow-up, whereby the number of days of absence was 4% higher for teachers in the intervention group than for those in the control group (ratio of geometric means 1.04, 95% CI 1.00 to 1.09; p-value 0.042). Results for the fully adjusted models are shown in Tables 12–14. Appendix 5 shows the unadjusted and partially adjusted model results, which follow a similar pattern.
Fully adjusted modela | n | Ratio of geometric meansb intervention/control (95% CI) | p-value |
---|---|---|---|
PHQ-8: continuous | 1719 | 1.00 (0.92 to 1.10) | 0.964 |
PHQ-8: binary | 1719 | 1.27 (0.82 to 1.97) | 0.279 |
PHQ-8: categorical | 1719 | 1.06 (0.70 to 1.60) | 0.790 |
Fully adjusted modela | n | Ratio of geometric meansb intervention/control (95% CI) | p-value |
---|---|---|---|
Absence: continuous | 1717 | 1.04 (1.00 to 1.09) | 0.042 |
Absence: binary | 1717 | 1.22 (0.90 to 1.67) | 0.203 |
Absence: categorical | 1717 | 1.45 (0.98 to 2.14) | 0.063 |
Fully adjusted modela | n | Difference in meansb (95% CI) | p-value |
---|---|---|---|
Presenteeism: continuous | 1539 | 0.12 (–0.13 to 0.37) | 0.361 |
Presenteeism: binary | 1539 | 1.00 (0.67 to 1.49) | 0.988 |
Presenteeism: categorical | 1539 | 1.02 (0.71 to 1.45) | 0.935 |
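For readers unfamiliar with the ratio of geometric means used for the skewed absence outcome in Table 13, the sketch below shows one common way such a ratio can be obtained: model a log-transformed outcome and exponentiate the arm coefficient. The log(y + 1) transformation, the simulated data and the unadjusted model are illustrative assumptions; the trial’s exact specification is given in the statistical analysis plan.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
arm = rng.integers(0, 2, size=300)               # 0 = control, 1 = intervention
days_absent = rng.poisson(lam=0.6 + 0.05 * arm)  # skewed count outcome with many zeros

# Regress log(days + 1) on allocation; exponentiating the coefficient gives a
# ratio of geometric means (of days + 1) between the arms.
X = sm.add_constant(arm)
fit = sm.OLS(np.log1p(days_absent), X).fit()
ratio = float(np.exp(fit.params[1]))
print(f"Ratio of geometric means (intervention/control): {ratio:.2f}")
```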
Table 15 shows the student outcomes at baseline and T2 by arm. As shown in Table 16, there was no evidence of a difference in student mental health outcomes (WEMWBS or SDQ) between intervention and control groups at T2 (see Appendix 6 for unadjusted and partially adjusted models for these outcomes, which show the same null effect).
Variable (student) | Control | Intervention | ||||
---|---|---|---|---|---|---|
n | Mean/median/% | SD/IQR | n | Mean/median/% | SD/IQR | |
SDQ mean score (baseline) | 1753 | 12.7 | 6.1 | 1631 | 12.0 | 6.0 |
SDQ mean score (T2) | 1628 | 13.5 | 6.0 | 1439 | 13.3 | 5.9 |
WEMWBS mean score (baseline) | 1746 | 47.0 | 9.4 | 1629 | 47.5 | 9.0 |
WEMWBS mean score (T2) | 1628 | 44.9 | 9.7 | 1445 | 45.0 | 10.0 |
Attendance (school level) (baseline), median % | 13 | 93.6 | (93.4–94.8) | 12 | 93.7 | (93.2–94.8) |
Attendance (school level) (T2), median % | 13 | 93.5 | (92.8–94.2) | 12 | 93.2 | (92.6–94.5) |
Attainment (N schools ≥ average) (baseline), n (%) | 13 | 4 (30) | 12 | 5 (42) | ||
Attainment (N schools ≥ average) (T2), n (%) | 13 | 3 (23) | 12 | 4 (33) |
Fully adjusted modelsa | n | Difference in means (95% CI)b | p-value |
---|---|---|---|
Student SDQ | 2702 | 0.27 (–0.18 to 0.73) | 0.241 |
Student WEMWBS | 2700 | –0.35 (–1.35 to 0.66) | 0.500 |
There was also no evidence of a difference between intervention and control groups in any of the school-level outcomes (i.e. teacher absenteeism, teacher resignation/redundancy, teacher retirement, student attainment and student attendance) (Table 17), although there was a weak tendency towards greater teacher absence in the intervention schools. There was also a weak tendency towards more teachers retiring in the intervention schools, but also towards participants having taught for longer in intervention schools (p = 0.027 for years spent teaching at T2, with a higher number of years in intervention schools). Therefore, teachers in the intervention group may have been slightly older and nearer retirement age. In addition, data were missing from 17 schools for teacher retirement and for teachers leaving for other reasons, and data were missing from 12 schools for school-level teacher absence, which means that caution is required when interpreting these results. For student attainment, data were available for all 25 schools, but, in this analysis, some observations had to be omitted because of issues of perfect prediction and collinearity. Appendix 7 shows the results for school-level outcomes from the unadjusted models, which showed similar effects.
Fully adjusted modela | n | Difference in means (95% CI)b | p-value |
---|---|---|---|
Teacher absence | 13 | 3.69 (–1.72 to 9.11) | 0.151 |
Teacher retirement | 8 | 10.78 (–2.17 to 23.73) | 0.070 |
Teacher left for other reasons | 8 | 47.36 (–148.39 to 243.11) | 0.407 |
Student attainment | 9 | 1.12 (0.03 to 38.71)c | 0.407 |
Student attendance | 25 | 0.17 (–0.60 to 0.95) | 0.645 |
Subgroup and sensitivity analyses
Outliers
The primary analysis model was repeated after the exclusion of any data points that were identified as potential outliers with respect to the primary outcome (i.e. a WEMWBS score > 3 SD from the mean). Ten potential outliers were removed. The interpretation of the results of this analysis does not differ from that of the primary analysis model (i.e. there is no evidence of a difference in teacher WEMWBS score between the intervention and control groups over the course of follow-up) (Table 18).
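A minimal sketch of the exclusion rule described above, applied to a simple vector of WEMWBS scores (simulated here for illustration), is:

```python
import numpy as np

rng = np.random.default_rng(2)
wemwbs = rng.normal(47, 8, size=1000)

# Flag observations more than 3 standard deviations from the mean and drop them
# before refitting the primary analysis model.
mean, sd = wemwbs.mean(), wemwbs.std()
keep = np.abs(wemwbs - mean) <= 3 * sd
print(f"Removed {(~keep).sum()} potential outliers; {keep.sum()} observations retained")
```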
Missing data
Multiple imputation was carried out under a missing at random (MAR) assumption; the potential for data to be missing not at random (MNAR) was explored by incorporating a scaling parameter that was allowed to differ between arms (i.e. allowing the missing data mechanism to differ between arms). This was done by imputing under the MAR assumption and then multiplying imputed values of the outcome by a scaling parameter (e.g. scaling parameters of 0.95 in the control arm and 0.90 in the intervention arm imply that missing WEMWBS scores in the control arm are assumed to be 5% lower than estimated under MAR, whereas scores in the intervention arm are assumed to be 10% lower).
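The sketch below illustrates the rescaling step under stated simplifications: a single, crude mean imputation stands in for the study’s multiple imputation, and all variable names and parameter values are illustrative.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(3)
df = pd.DataFrame({
    "arm": rng.integers(0, 2, size=500),           # 0 = control, 1 = intervention
    "wemwbs": rng.normal(47, 8, size=500),
})
df.loc[rng.random(500) < 0.15, "wemwbs"] = np.nan  # ~15% missing outcomes

# Step 1: impute under MAR (a crude arm-specific mean imputation for brevity).
imputed = df.copy()
imputed["wemwbs"] = imputed.groupby("arm")["wemwbs"].transform(lambda s: s.fillna(s.mean()))

# Step 2: multiply only the imputed values by arm-specific MNAR scaling parameters
# (e.g. 0.95 in the control arm, 0.90 in the intervention arm), then reanalyse.
scaling = {0: 0.95, 1: 0.90}
was_missing = df["wemwbs"].isna()
imputed.loc[was_missing, "wemwbs"] *= imputed.loc[was_missing, "arm"].map(scaling)
```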
Tables 19 and 20 show the results of the analyses for teacher WEMWBS and PHQ-8.
MNAR rescaling parameters (control, intervention) | Adjusted difference in means (95% CI)a | p-value |
---|---|---|
1.00, 1.00 | –0.79 (–1.96 to 0.39) | 0.190 |
1.00, 0.95 | –1.75 (–2.96 to –0.53) | 0.005 |
0.95, 1.00 | 0.08 (–1.10 to 1.25) | 0.899 |
0.95, 0.95 | –0.88 (–2.09 to 0.32) | 0.152 |
0.95, 0.90 | –1.84 (–3.09 to –0.59) | 0.004 |
0.90, 0.95 | –0.02 (–1.23 to 1.19) | 0.973 |
0.90, 0.90 | –0.98 (–2.23 to 0.27) | 0.125 |
1.00, 1.05 | 0.17 (–0.98 to 1.33) | 0.767 |
1.05, 1.00 | –1.65 (–2.83 to –0.46) | 0.006 |
1.05, 1.05 | –0.69 (–1.85 to 0.47) | 0.246 |
1.05, 1.10 | 0.28 (–0.88 to 1.43) | 0.639 |
1.10, 1.05 | –1.55 (–2.72 to –0.37) | 0.010 |
1.10, 1.10 | –0.59 (–1.75 to 0.57) | 0.322 |
MNAR rescaling parameters (control, intervention) | Ratio of geometric meansa intervention/control (95% CI)b | p-value |
---|---|---|
1.00, 1.00 | 1.00 (0.91 to 1.10) | 0.961 |
1.00, 0.95 | 0.97 (0.88 to 1.06) | 0.494 |
0.95, 1.00 | 1.03 (0.94 to 1.13) | 0.478 |
0.95, 0.95 | 1.00 (0.91 to 1.09) | 0.974 |
0.95, 0.90 | 0.96 (0.88 to 1.06) | 0.431 |
0.90, 0.95 | 1.03 (0.94 to 1.13) | 0.518 |
0.90, 0.90 | 0.99 (0.91 to 1.09) | 0.905 |
1.00, 1.05 | 1.04 (0.94 to 1.14) | 0.448 |
1.05, 1.00 | 0.97 (0.88 to 1.07) | 0.556 |
1.05, 1.05 | 1.01 (0.91 to 1.11) | 0.902 |
1.05, 1.10 | 1.04 (0.94 to 1.15) | 0.424 |
1.10, 1.05 | 0.98 (0.88 to 1.08) | 0.624 |
1.10, 1.10 | 1.01 (0.91 to 1.12) | 0.849 |
Under a MAR assumption, there was no difference in conclusions drawn from the analysis of the imputed data compared with the primary analysis (i.e. no evidence of a difference in outcome between arms), which was as expected, given that the primary analysis model is valid under a MAR assumption. Using scaling parameters to allow for potential MNAR, there was no evidence of a difference between arms when the same scaling parameter was used in each arm. There was evidence of a difference between arms if the scaling parameters differed between arms and the rescaling parameter was ‘worse’ in the intervention arm, whereby WEMWBS score was lower in the intervention group than in the control group over the course of follow-up.
Tables 21 and 22 show the results for the student WEMWBS and SDQ outcomes.
MNAR rescaling parameters (control, intervention) | Adjusted difference in means (95% CI)a | p-value |
---|---|---|
1.00, 1.00 | –0.04 (–1.17 to 1.09) | 0.946 |
1.00, 0.95 | –0.43 (–1.55 to 0.70) | 0.456 |
0.95, 1.00 | 0.29 (–0.83 to 1.41) | 0.613 |
0.95, 0.95 | –0.10 (–1.21 to 1.01) | 0.861 |
0.95, 0.90 | –0.48 (–1.60 to 0.63) | 0.394 |
0.90, 0.95 | 0.23 (–0.88 to 1.33) | 0.684 |
0.90, 0.90 | –0.16 (–1.27 to 0.95) | 0.783 |
1.00, 1.05 | 0.35 (–0.80 to 1.50) | 0.549 |
1.05, 1.00 | –0.37 (–1.52 to 0.78) | 0.530 |
1.05, 1.05 | 0.02 (–1.15 to 1.19) | 0.969 |
1.05, 1.10 | 0.42 (–0.78 to 1.62) | 0.494 |
1.10, 1.05 | –0.31 (–1.50 to 0.89) | 0.613 |
1.10, 1.10 | 0.09 (–1.14 to 1.31) | 0.888 |
MNAR rescaling parameters (control, intervention) | Adjusted difference in means (95% CI)a | p-value |
---|---|---|
1.00, 1.00 | 0.06 (–0.56 to 0.67) | 0.859 |
1.00, 0.95 | –0.08 (–0.68 to 0.52) | 0.800 |
0.95, 1.00 | 0.17 (–0.45 to 0.78) | 0.589 |
0.95, 0.95 | 0.04 (–0.56 to 0.63) | 0.907 |
0.95, 0.90 | –0.10 (–0.68 to 0.48) | 0.739 |
0.90, 0.95 | 0.15 (–0.45 to 0.74) | 0.622 |
0.90, 0.90 | 0.02 (–0.56 to 0.59) | 0.959 |
1.00, 1.05 | 0.19 (–0.45 to 0.83) | 0.562 |
1.05, 1.00 | –0.06 (–0.68 to 0.57) | 0.859 |
1.05, 1.05 | 0.08 (–0.57 to 0.72) | 0.816 |
1.05, 1.10 | 0.21 (–0.48 to 0.87) | 0.540 |
1.10, 1.05 | –0.04 (–0.69 to 0.61) | 0.914 |
1.10, 1.10 | 0.10 (–0.58 to 0.77) | 0.780 |
There was no difference in the conclusions drawn from any of the missing data sensitivity analyses compared with the analysis of the student data before imputation (i.e. there was no evidence of a difference between arms in student WEMWBS or SDQ).
Complier-average causal effect analysis
As shown in Table 23, there was no evidence of a treatment effect in those who attended the MHFA training, although there was a weak effect in the direction favouring participants in the control schools. There was also no evidence of a treatment effect in higher implementing schools, although there was a weak tendency towards higher well-being among lower implementing schools.
Explanatory variable | n | Adjusted mean difference (95% CI)a | p-value |
---|---|---|---|
Attended MHFA trainingb | 940 | –2.26 (–4.93 to 0.40) | 0.096 |
Implementation score: continuousc | 1494 | –0.24 (–0.51 to 0.04) | 0.089 |
Implementation: high/low (binary) | 1494 | –2.04 (–4.87 to 0.78) | 0.156 |
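The complier-average causal effect estimates in Table 23 come from the trial’s pre-specified model. As a simplified illustration of the underlying idea, randomised allocation can be used as an instrument for receipt of training, giving the Wald estimator sketched below with simulated data and no covariate adjustment or clustering.

```python
import numpy as np

rng = np.random.default_rng(4)
arm = rng.integers(0, 2, size=1000)              # randomised allocation
trained = (arm == 1) & (rng.random(1000) < 0.6)  # training only offered to the intervention arm
wemwbs = rng.normal(47, 8, size=1000) - 0.5 * trained

# Intention-to-treat effect scaled by the difference in training uptake between arms.
itt_effect = wemwbs[arm == 1].mean() - wemwbs[arm == 0].mean()
uptake_diff = trained[arm == 1].mean() - trained[arm == 0].mean()
cace = itt_effect / uptake_diff
print(f"ITT effect: {itt_effect:.2f}; CACE (Wald estimator): {cace:.2f}")
```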
Exploratory analyses
Effect modification by demographic variables
There was no evidence of modification of the effect of the intervention on teacher WEMWBS or PHQ-8 by baseline well-being, depression, geographical area, school-level FSM or teacher sex (Table 24).
Teacher outcome | p-value |
---|---|
WEMWBS | |
Baseline well-being | 0.802 |
Baseline depression | 0.603 |
Geographical area | 0.306 |
FSM | 0.932 |
Sex | 0.570 |
PHQ-8 | |
Baseline depression | 0.929 |
Baseline well-being | 0.799 |
Geographical area | 0.540 |
FSM | 0.860 |
Sex | 0.663 |
Similarly, there was no evidence of modification of the effect of the intervention on student SDQ or WEMWBS by baseline well-being, psychological difficulty, geographical area, FSM, or sex. There was, however, evidence of potential treatment effect heterogeneity on student WEMWBS (but not SDQ) by ethnicity, whereby there was a greater reduction in WEMWBS score in black students or mixed-ethnicity students who received the intervention than in white students. These results are shown in Table 25.
Student outcome | p-value |
---|---|
WEMWBS | |
Baseline well-being | 0.152 |
Baseline SDQ | 0.451 |
Geographical area | 0.289 |
FSM | 0.421 |
Sex | 0.676 |
Ethnicity | |
Mixed | 0.076 |
Asian/Asian British | 0.689 |
Black/black British | 0.023 |
Chinese or other ethnic group | 0.692 |
SDQ | |
Baseline well-being | 0.350 |
Baseline SDQ | 0.137 |
Geographical area | 0.695 |
FSM | 0.658 |
Sex | 0.830 |
Ethnicity | |
Mixed | 0.536 |
Asian/Asian British | 0.120 |
Black/black British | 0.895 |
Chinese or other ethnic group | 0.518 |
Intervention effect on self-report measures of school context
There was no evidence of a difference between intervention and control groups with respect to stress and satisfaction with work, school’s perceived attitude towards staff and student well-being, perceived quality of relationships between staff or between staff and students, or the amount of support given (Table 26).
Teacher-reported indicator | n | Adjusted OR (95% CI)a,b | p-value |
---|---|---|---|
How stressful job is | 694 | 1.02 (0.63 to 1.67) | 0.931 |
How satisfied at work | 693 | 0.84 (0.52 to 1.37) | 0.496 |
School cares about teacher well-being | 675 | 0.58 (0.29 to 1.19) | 0.137 |
School cares about student well-being | 689 | 0.48 (0.14 to 1.61) | 0.232 |
Staff–staff relationships | 688 | 0.49 (0.22 to 1.10) | 0.084 |
Staff–student relationships | 657 | 1.17 (0.35 to 3.92) | 0.795 |
Frequency of providing emotional support to a distressed colleague | 687 | 1.06 (0.67 to 1.68) | 0.799 |
Effect modification by self-report measures of school context
The variables stress at work, satisfaction at work, staff relationships, teacher–student relationships, perception of how much the school cares about teacher well-being and perception of how much the school cares about student well-being were included as treatment-by-baseline interaction terms to investigate potential treatment effect modification. The results are shown in Table 27. There was evidence of potential effect modification by perceived school attitude to student well-being. However, the interaction term was negative (–4.45, 95% CI –8.38 to –0.53), suggesting that the effect of the intervention was more negative among teachers who agreed at baseline that their school cared about students.
Interaction term | n | p-value |
---|---|---|
Treatment group by stress interaction | 1174 | 0.092 |
Treatment group by satisfaction interaction | 1175 | 0.362 |
Treatment group by staff–staff relationships interaction | 1169 | 0.414 |
Treatment group by staff–student relationships interaction | 1170 | 0.862 |
Treatment group by school cares about staff well-being interaction | 1156 | 0.752 |
Treatment group by school cares about student well-being interaction | 1168 | 0.026 |
Treatment group by support given to a colleague interaction | 1171 | 0.095 |
The same variables were then included in the primary analysis model as time-varying covariates (Table 28). In some cases this adjustment attenuated the estimated regression coefficient for the intervention. However, as there was no evidence of an intervention effect in the primary analysis, and as the CIs still included zero after adjustment, it is difficult to draw any clear conclusions about potential mediation.
WEMWBS | n | Adjusted mean difference (95% CI)a,b | p-value |
---|---|---|---|
Primary analysis | 1722 | –0.90 (–2.07 to 0.27) | 0.130 |
Effect of allocation after adjusting for stress | 1718 | –0.98 (–2.05 to 0.09) | 0.072 |
Effect of allocation after adjusting for satisfaction | 1719 | –0.74 (–1.65 to 0.17) | 0.111 |
Effect of allocation after adjusting for staff–staff relationships | 1717 | –0.46 (–1.50 to 0.58) | 0.385 |
Effect of allocation after adjusting for staff–student relationships | 1717 | –0.55 (–1.53 to 0.43) | 0.270 |
Effect of allocation after adjusting for attitude to staff well-being | 1715 | –0.29 (–1.29 to 0.70) | 0.563 |
Effect of allocation after adjusting for attitude to student well-being | 1720 | –0.73 (–1.84 to 0.37) | 0.193 |
Effect of allocation after adjusting for support given to a colleague | 1719 | –0.82 (–1.96 to 0.31) | 0.156 |
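To make the interaction and adjustment analyses in Tables 27 and 28 concrete, the following is a minimal illustrative sketch (again, not the trial's code) of fitting a treatment-by-baseline interaction and a covariate-adjusted treatment effect with a school-level random intercept; the file and column names are hypothetical, the treatment indicator is assumed to be coded 0/1, and the adjustment is shown as a simple covariate rather than the time-varying form described above.

```python
# Illustrative sketch only: a treatment-by-baseline interaction test and a
# covariate-adjusted treatment effect for teacher WEMWBS, with a school-level
# random intercept. File and column names are hypothetical placeholders, and
# 'arm' is assumed to be coded 0 = control, 1 = intervention.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("teacher_outcomes.csv")  # hypothetical analysis data set

# Effect modification (cf. Table 27): the arm:cares_student_base coefficient
# estimates how the intervention effect varies with the baseline rating.
interaction_model = smf.mixedlm(
    "wemwbs_t2 ~ arm * cares_student_base + wemwbs_base",
    data=df,
    groups=df["school_id"],
).fit()
print(interaction_model.summary())

# Exploratory adjustment (a simplified, non-time-varying analogue of Table 28):
# compare the 'arm' coefficient with the unadjusted primary analysis estimate.
adjusted_model = smf.mixedlm(
    "wemwbs_t2 ~ arm + wemwbs_base + cares_student_base",
    data=df,
    groups=df["school_id"],
).fit()
print("Adjusted intervention effect:", adjusted_model.params["arm"])
```

In this sketch, the evidence for effect modification corresponds to the arm:cares_student_base term, and the mediation-style comparison in Table 28 corresponds to how the coefficient for arm changes once the school-context variable is added.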
Cost analysis
The average cost of the intervention was £9103 per intervention school (Table 29). The cost was lower in English schools (£8263) than in Welsh schools (£9943), primarily because of the upfront cost of training the HSCs as instructors in Wales. Staff salaries and the cost of supply cover (£5566) accounted for the majority (61%) of the total cost of the intervention. As reported above (see Tables 10, 12–14 and 16–17), there was no evidence that this additional cost was justified by improvements in teacher well-being, depressive symptom scores, presenteeism or absenteeism, or in any of the student-reported outcomes.
Resource use | n | Mean cost (£) | SD (£) | Range (£) |
---|---|---|---|---|
HSCs instructor training | ||||
Training feesa | 12 | 1749.11 | 1826.88 | 0.00–3498.21 |
2-day standard MHFA training | ||||
Training fees and manualsb | 12 | 933.37 | 366.85 | 379.20–1380.00 |
Travel costs | 12 | 50.51 | 33.37 | 0.00–108.30 |
Refreshments | 12 | 34.90 | 48.15 | 0.00–147.24 |
Teacher and support staff cover | 12 | 1351.23 | 473.59 | 568.00–2000.00 |
In-service day MHFA training for schools and colleges | ||||
Training fees and manualsb | 12 | 556.03 | 162.02 | 263.60–750.00 |
Travel costs | 12 | 23.21 | 24.67 | 0.00–73.62 |
Teacher and support staff salaryc | 12 | 2257.68 | 709.43 | 1341.77–3406.01 |
Awareness raising session | ||||
Training fees | 12 | 179.97 | 143.17 | 35.27–350.00 |
Travel costs | 12 | 9.91 | 13.16 | 0.00–36.81 |
Teacher and support staff salaryc | 12 | 1957.54 | 1018.70 | 841.50–4961.20 |
Total cost | 12 | 9103.44 | 1882.69 | 5378.97–12,026.73 |
Total cost: England | 6 | 8263.07 | 2218.52 | 5378.97–12,026.73 |
Total cost: Wales | 6 | 9943.82 | 1086.82 | 8855.81–11,960.32 |
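As a consistency check on the headline figures (all amounts in £ per intervention school), the staff salary and supply cover figure quoted in the text is the sum of the three staff cover/salary rows in Table 29, and dividing it by the mean total cost gives the 61% share reported above:

\[
1351.23 + 2257.68 + 1957.54 = 5566.45, \qquad \frac{5566.45}{9103.44} \approx 0.61 \; (61\%).
\]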
Chapter 4 Process evaluation results
This chapter presents the results of the mixed-methods process evaluation. It begins by reporting how well the intervention was implemented in terms of the training components and the peer support service. This is followed by exploration of the extent to which the mechanisms of change in the logic model were activated, evidence of harm, and the extent of contamination and differentiation from usual practice. Evidence regarding acceptability of the intervention is woven through these sections, as it was integral to understanding implementation and the extent to which the mechanisms of change were activated. The chapter ends with a brief consideration of sustainability, although in the light of the null findings it is unlikely that schools or funders will be looking to roll out the intervention. In quotations throughout this chapter, P[n] indicates which participant is talking and Q indicates that the researcher is talking.
Implementation
Implementation of Wellbeing in Secondary Education training components
To measure the reach and dosage of the MHFA training, the number of teachers and other school staff attending and completing the courses was recorded. Reach of the 1-hour awareness raising course was measured by recording the number of staff who attended it; however, we could not identify how many of these staff were teachers and how many were other staff, such as teaching assistants or learning mentors. Fidelity to the training delivery plan and quality of the training were measured by drawing on the training observations, the checklists and evaluation forms completed by training attendees, and the trainer interviews (see Chapter 2 for a description of these data sources). Response rates to the checklists and feedback forms were high (Table 30).
Course | Checklist response rate, n (%) | Evaluation form response rate, n (%) |
---|---|---|
1-hour awareness session | – | 494 (74.2) |
1-day MHFA for schools and colleges | 118 (80.8) | 142 (97.3) |
2-day MHFA for peer supporters | 108 (95.6) | 110 (97.3) |
Dose
In eight (66.7%) of the 12 intervention schools, the prespecified intervention dose for the 2-day MHFA training (i.e. at least 8% of school staff attending the course and becoming a peer supporter) was achieved. One additional staff member would need to have been trained in each of the remaining four schools to reach that prespecified dose. In nine (75.0%) of the intervention schools, the prespecified 8% of teachers attended the 1-day MHFA for schools and colleges training. Of the three schools not reaching this prespecified dose, an average of two additional teachers would need to have been trained to do so. Reasons for schools not achieving the prespecified dose included researcher error, nominated staff being unable to attend at the last minute and trained staff leaving the school shortly after training and therefore not becoming peer supporters. Of all those trained, 92 (81.4%) peer supporters were still employed by the schools at T2.
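To illustrate the dose criterion (individual schools' staff headcounts are not reproduced here, so the headcount S below is a hypothetical example), the minimum number of staff required to be trained as peer supporters in a school with S staff is:

\[
n_{\min} = \lceil 0.08 \times S \rceil, \qquad \text{e.g. } S = 120 \Rightarrow n_{\min} = \lceil 9.6 \rceil = 10.
\]

A school falling one trained member short of this threshold, as described for four of the schools above, would therefore be classed as not reaching the prespecified dose.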
Reach of training
Across the 12 schools assigned to the intervention arm, 113 teachers and support staff attended and completed the 2-day standard MHFA training, and 146 teachers and support staff completed the 1-day MHFA for schools and colleges training. A total of 666 (54.5%) teachers and other school staff attended the 1-hour awareness-raising session.
Fidelity to training components and adaptations
In all four of the case study intervention schools, observers assessed that the fidelity of training delivery was consistently high for each of the items of assessment (i.e. instructor knowledge of materials, presentation skills, facilitation and support of the learning, interest from the group and coverage of materials). There was little variation in mean scores on these items for the 1-day MHFA for schools and colleges [range 3.9–4.3 (out of 5)] or the 2-day standard MHFA training (range 4.1–4.4), although mean scores were slightly lower for the 1-hour mental health awareness raising session (range 3.4–4.1) (Table 31).
Observer checklist forma | 1-day MHFA for schools and colleges (n = 7), mean (SD) | 1-day, range | 2-day standard MHFA training (n = 8), mean (SD) | 2-day, range | Awareness raising session (n = 7), mean (SD) | Awareness session, range |
---|---|---|---|---|---|---|
Knowledge of materials | 4.2 (0.9) | 2.8–5.0 | 4.4 (0.7) | 3.0–4.9 | 4.1 (0.7) | 3.3–5.0 |
Presentation skills | 4.3 (0.6) | 3.5–5.0 | 4.1 (0.6) | 4.0–5.0 | 3.8 (0.7) | 2.8–4.9 |
Facilitation and support of the learning | 4.0 (0.8) | 2.5–5.0 | 4.1 (0.7) | 3.0–4.9 | 3.6 (0.6) | 3.1–5.0 |
Interest from the group | 3.9 (0.3) | 3.5–4.3 | 4.3 (0.5) | 3.7–5.0 | 3.4 (0.4) | 3.0–4.1 |
Coverage of material (% saying yes) | 94.0 | 85.6–100.0 | 96.3 | 84.2–100.0 | 100.0 | 100.0 |
Turning to the participant checklists, most attendees scored trainers highly for knowledge of materials, presentation skills, diversity of learning materials, communication skills, use of a range of teaching approaches and ability to keep the course focused and relevant. There was little variation in the mean scores for each of the instructor qualities for the 1-day MHFA for schools and colleges (range 4.6–4.7) and 2-day standard MHFA (range 4.6–4.7) training courses (Table 32).
Evaluation item | 1-day MHFA for schools and colleges (n = 146), n | 1-day, mean (SD) | 2-day standard MHFA training (n = 113), n | 2-day, mean (SD) |
---|---|---|---|---|
MHFA training evaluation formsa | ||||
Overall | 138 | 4.6 (0.5) | 110 | 4.5 (0.6) |
Presentation slides | 140 | 4.4 (0.6) | 109 | 4.3 (0.6) |
Video clips | 140 | 4.7 (0.5) | 110 | 4.5 (0.6) |
Manual | 123 | 4.7 (0.5) | 110 | 4.7 (0.5) |
Learning exercises | 136 | 4.4 (0.6) | 110 | 4.5 (0.6) |
Environment | 138 | 4.1 (0.7) | 107 | 3.8 (0.9) |
Structure | 140 | 4.4 (0.6) | 109 | 4.4 (0.7) |
Content | 140 | 4.6 (0.6) | 109 | 4.6 (0.6) |
Participant fidelity checklist formsa | ||||
Knowledge of materials | 106 | 4.6 (0.7) | 103 | 4.7 (0.5) |
Presentation skills | 107 | 4.7 (0.6) | 103 | 4.7 (0.6) |
Diversity of learning materials | 107 | 4.6 (0.7) | 103 | 4.6 (0.6) |
Communication and interaction | 107 | 4.7 (0.6) | 103 | 4.7 (0.6) |
Facilitation and support of the learning | 107 | 4.7 (0.6) | 102 | 4.6 (0.6) |
Relevance of content and discussion | 107 | 4.7 (0.6) | 103 | 4.7 (0.5) |
Flexibility of use of most relevant materials | 107 | 4.7 (0.6) | 103 | 4.7 (0.6) |
Despite their varied experience of delivering MHFA sessions, all six trainers interviewed reported high levels of fidelity in ensuring that the key content was delivered:
No, we didn’t leave bits, we didn’t leave bits out, we had the time for each section we did . . . Yes, trying to think it through, to see where we’ve got, like if it said, optional to show a film clip, then we didn’t show it.
One-to-one interview, trainer 6
I have to make sure that I cover all of those slides that are in there. The only kind of flexibility that there is, is that sometimes there are up to three different videos, which talk about the same subject.
One-to-one interview, trainer 1
But yes, as far as the content and delivering the course was concerned, obviously, that was sort of, you know, drummed into us how important it was to sort of stick to it, so that’s what we did. But it was just little bits here and there and we thought, right, if we can just, rather than about 15 minutes for that activity, we’ll probably get it done in 10, so then we’d got actually 5 minutes to play with, you know, different things like that, but not much. We stuck to pretty much what it told us to do, you know.
One-to-one interview, trainer 2
However, three main factors appeared to present a challenge to that fidelity, requiring trainers to be flexible in course delivery, while still ensuring all key content was covered.
Needs of the group
Trainers discussed the need to re-evaluate choice of materials or ordering of exercises as the course went along, depending on the needs of the group or the dynamics that they had picked up:
You’re not meant to go off the planned route really but I just think, if the room is slumping slightly, you know, if you can kind of get them sort of re-energised for a little while and get them involved in something.
One-to-one interview, trainer 5
I think it’s a general thing about watching your group, seeing how they’re interacting, and making sure that they are interacting about the subject matter.
One-to-one interview, trainer 3
Location of the mental health first aid training delivery
For most schools the training took place on-site, which in some cases led to interruptions to delivery because of competing demands on school staff (such as resolving student incidents), performance management meetings and break duties:
There was an incident in the school that afternoon, which required several members of staff to have to leave in the afternoon and go and do things and come back. I guess that’s just the nature of life inside a school.
One-to-one interview, trainer 2
Frequently I was having to move the day around or rejig, to make sure they covered the most important points.
One-to-one interview, trainer 3
In the MHFA evaluation forms, participants had the opportunity to write open-ended comments about their experience of the training, and a number of them commented that holding the training off-site would have been better.
Scheduling mental health first aid training within the school timetable
The school timetable was the third factor that presented challenges to the trainers maintaining fidelity during training delivery. Some trainers reported a reduction in the time they were expecting to have, with set break and lunchtimes and other scheduled school events being prioritised:
We couldn’t start at eight thirty because it was an [in-service] day and the principal wanted staff to come and join the main assembly for a talk. So that pushed it beyond 9 o’clock.
One-to-one interview, trainer 4
We’re not going to be pedantic about time scales . . . we’ll just go with the flow of the school day and just stop and start when it, you know, sort of automatically fits.
One-to-one interview, trainer 6
When faced with the challenges outlined above, the trainers appeared adept at thinking on their feet and ensuring that they still covered the essentials, which demonstrates the importance of having trainers who are skilled at working flexibly when delivering mental health training in schools.
Quality of training component delivery
Overall mean scores for the MHFA evaluation sheets were generally high (see Table 32). For the 1-day MHFA for schools and colleges training, mean ratings ranged from 4.4 to 4.7, with the exception of the environment rating, which was slightly lower (4.1, SD 0.7). Mean scores for the 2-day standard MHFA course were also high (range 4.3–4.7), with lower scores again observed for environment (3.8, SD 0.9). In the open-ended text space on the evaluation sheets, there were a few suggestions for improvement, including spending less time on the more serious but less common mental disorders, allowing more time for the course and changing the venue. However, the comments were overwhelmingly positive regarding the value and quality of the training, indicating a high level of acceptability for this part of the intervention:
This is the most interesting and engaging training I’ve had in years. One instructor talking all day and holding the audience? Amazing!
1-day evaluation sheet, school 1D
Longer would have been amazing but the instructor tackled some very complex and challenging topics with care and huge knowledge.
1-day evaluation sheet, school 1I
This course has been useful. I feel that all staff could benefit from the training to enable a whole-school approach.
2-day evaluation sheet, school 3P
. . . despite the topics being ‘dark’, I found the course really informative and interesting. I feel more confident at helping and supporting colleagues now, so thank you.
2-day evaluation sheet, school 4V
For the 1-hour awareness raising session, 85% of attendees rated the overall delivery of the session as good or very good on the evaluation sheets, again indicating good acceptability. In the free-text section, comments were generally positive, with words such as ‘enjoyable’, ‘interesting’, ‘informative’ and ‘useful’ being used. However, there were some observations that the session needed to be more interactive, to focus less on statistics and more on strategies, and not to be delivered at the end of the school day. In addition, two participants noted that staff should have been warned about the content, as it could have been difficult for some of them. There was also some confusion among participants about the MHFA training, suggesting that the session’s aim of explaining the study had not always been achieved.
Overall, there were no discernible patterns of difference in dose, fidelity or quality of training between England and Wales, despite the different models of delivery used. This suggests that newly qualified and more experienced MHFA trainers delivered very similar training experiences, and that there were no systematic overall context differences by country that interacted with training implementation.
Implementation: peer support service
Reach of the peer support service was assessed using information from the teacher questionnaires regarding use of the peer support service, the peer supporter logs recording support given and the peer supporter focus group data. As described in Chapter 2, all peer supporters were asked to complete an electronic log each academic term (three times a year). Of the 113 peer supporters trained, a mean of 60.6 (53.6%) completed a log at each of the five time points, and each supporter completed a mean of 3.1 (SD 1.5) logs; 14.5% did not complete a log at any time point.
Fidelity of the peer support service delivery was assessed through the peer supporter feedback meetings (described in Chapter 2). A feedback meeting was successfully held in all 12 intervention schools at 6 months, but was held in only 10 schools at the 18-month follow-up, despite multiple attempts to organise this in all the schools.
Reach of peer support service
At T1, 34 (6.1%) of 557 teachers indicated that they had accessed the peer support service in the previous 12 months. Of those teachers, most indicated that they had used the service once or twice (n = 16, 47.1%) or once a term (n = 9, 26.5%). At T2, a similarly small proportion (n = 30, 5.9%) of the 510 teachers indicated that they had accessed the peer support service in the previous 12 months (Table 33). Of those who did use the service, a large majority reported that it helped them.
Use of staff peer support in previous year | T1 (N = 557), n (%) | T2 (N = 510), n (%) |
---|---|---|
Yes | 34 (6.1) | 30 (5.9) |
No | 523 (93.9) | 480 (94.1) |
Frequency of use | ||
Once or twice | 16 (47.1) | 13 (43.3) |
Once a term | 9 (26.5) | 6 (20.0) |
Once or twice per month | 5 (14.7) | 8 (26.7) |
More than once per week | 4 (11.7) | 3 (10.0) |
Usefulness of service | ||
Not at all helpful | 1 (2.9) | 2 (6.9) |
It helped | 18 (54.6) | 14 (48.3) |
It helped a lot | 14 (42.4) | 13 (43.3) |
Across all time points combined, peer supporters reported that they had supported a mean of 1.7 (SD 1.8) colleagues in the previous 2 weeks, of whom, on average, 0.7 (SD 0.8) were judged to be additional to colleagues whom they may have supported prior to being trained. In fact, in the feedback meetings and focus groups, it was clear that peer supporters found it difficult to distinguish what were additional episodes of support, partly because of the informal way in which support was often given within existing relationships:
So I don’t think it’s a matter of showing them the list [of peer supporters] and going, oh, so and so, I’ll just go and have a chat with him. It’s much more of an organic process than that, where certain people will come, and not necessarily in the context of school, just odd times. It can be a conversation will strike up at an odd moment and it will lead to some sort of offloading of some issue.
Intervention group, peer supporters, phase 2, school 4N
It is possible that those receiving support had a similar difficulty in deciding when something would be ‘counted’ as support given, which may partly explain why there was a discrepancy between the amount of support reported by the peer supporters and the number of teachers reporting receiving such support.
Most often, the support reported in the logs was provided to each person once [on average at each time point, 23.8 colleagues (40.8%) were helped once and 18.8 (32.2%) were helped twice]. Just over half (53.1%) of those supported were teaching staff. Support was most often provided during lunchtime (35.1% of help episodes) and problems were predominantly work related (50.9% of help episodes). Usually, peer supporters spent between 5 and 15 minutes assisting their colleague (42.1% of help episodes), most often through face-to-face contact (68.6% of help episodes). The most common outcomes were no further action (39.5% of help episodes) or the peer supporter checking in with the colleague at a future agreed time (25.0% of help episodes) (Table 34).
Feature | Mean number of colleagues across all time points, n (%) |
---|---|
Colleagues supported | 1.7 (1.8) |
Colleagues supported who would not have been otherwise | 0.7 (0.8) |
Frequency of support provided | |
Once | 23.8 (40.8) |
Twice | 18.8 (32.2) |
Three times | 9.0 (15.4) |
Four times | 3.6 (6.2) |
Five or more times | 3.2 (5.5) |
Characteristics of colleagues supported | |
Teachers | 30.8 (53.1) |
Non-teaching staff | 12.6 (21.7) |
Classroom support | 9.2 (15.9) |
Senior leadership team | 3.4 (5.9) |
When support was provided | |
Lunchtime | 25.0 (35.1) |
Before and after school day | 23.6 (33.1) |
Teaching time | 15.6 (21.9) |
Off-site | 7.0 (9.8) |
Type of problem | |
Work related | 33.6 (50.9) |
Home life | 20.0 (30.3) |
Other | 12.4 (18.8) |
Length of time support provided (minutes) | |
< 5 | 3.8 (6.3) |
5–15 | 25.4 (42.1) |
15–30 | 19.0 (31.5) |
30–60 | 7.8 (12.9) |
> 60 | 4.4 (7.3) |
Type of support offered | |
Face to face | 42.4 (68.6) |
Text messaging | 8.6 (13.9) |
E-mail | 5.0 (8.1) |
Telephone | 5.8 (9.4) |
Actions resulting from support | |
No further action | 24.0 (39.5) |
Discussion of available referral services | 14.2 (23.4) |
Subsequent action in agreement with colleague | 7.4 (12.2) |
Agreement to check-in with colleague | 15.2 (25.0) |
Fidelity to peer support model and adaptations
At the first peer supporter feedback meetings, nine of the groups indicated that support had already been provided to colleagues by peer supporters. Peer supporters at the majority of schools had met as a group to discuss the set-up of the staff peer support service (n = 9), with a smaller number (n = 5) indicating that they had held regular update meetings since then. All schools had set up a ‘buddy’ system, although this was reported to be fairly informal in all but one school. Five of the schools had set up a confidentiality policy for the peer support service. All schools offered service users the choice of which peer supporter they contacted, and senior leaders were perceived to have helped raise the profile of the peer support service in eight schools.
One or two schools considered creating a specific space and time in which staff could access peer support, because of the difficulty of finding such a safe space on an ad hoc basis. However, this proved difficult, as one peer supporter later reflected:
I mean when we first did it, there was talk about having like a drop in, and then we were going to kind of have a rota and do that. But, you know, people are just so busy that it’s hard to ask people to give up their free time.
Intervention group, peer supporters, phase 2, school 1D
All schools had used advertising to launch the peer support service. The methods most commonly used were posters provided by the research team and staff briefings; e-mails and information placed in staff pigeonholes were additional methods used in a few schools.
By the time of the second feedback meeting, conducted in 10 of the 12 intervention schools, none still met as a group; rather, participants said that they discussed any issues with other peer supporters informally, if needed. Three of the schools had re-advertised the service, by e-mail (n = 3), posters (n = 1) and through a staff newsletter (n = 1). At this second feedback meeting, peer supporters at only four schools indicated that they had senior leadership commitment. The need for better formal recognition of the peer supporter work was discussed in some of the case study school focus groups:
I think that maybe needs to be addressed because we want to have more of an impact. Then actually, we need to have that recognition, as to the role that we are playing. And perhaps sitting down with the head [head teacher] and, as a group of people, and actually, this is our plan, like how do you, how will you support us, kind of thing, because it is really important.
Intervention group, peer supporters, phase 2, school 2L
We have had, at the beginning of this year also, performance management targets given that didn’t take into account any of the WISE work . . . So we now, more than any year before, have more time before, after school and during break times taken for meetings and additional work, which means that we’re drowning . . . So it’s a bit like the woodcutter analogy they used in the training, no one’s got time to sharpen the axe and it’s got to the point where this isn’t a, this isn’t an advertised or discussed or talked about initiative anymore, so it’s kind of fallen by the wayside I would argue.
And that’s just because of lack of time?
Yes, lack of time and support by SLT [Senior Leadership Team]. They haven’t argued for it or made it a place or kept it in mind or encouraged those who do it.
Intervention group, peer supporters, phase 2, school 1D
Again, there were no discernible differences by country in the extent to which the guidelines were followed. Looking at the implementation measures used in the exploratory CACE analysis described in Chapter 3, Subgroup and sensitivity analyses, the total implementation score for English schools was 29 and for Welsh schools it was 28, suggesting remarkably little difference in how well dose, reach and fidelity were met.
Summary
The different components of the WISE intervention (i.e. the three training sessions and the setting up of a peer support service) were delivered in all of the intervention schools. Generally, the training was delivered as planned, although the specified dose was not reached in all schools and the trainers experienced challenges in the school context. The peer support services varied in their fidelity to the guidelines, with a majority of schools neither writing a confidentiality policy nor feeling supported by senior leadership in the longer term. The significance of these issues is returned to in the following section.
Mechanisms of change
Having established that the intervention was delivered in all intervention schools, a key question for the process evaluation was whether the intervention worked as hypothesised in the logic model (see Figure 1). In the following sections, we consider the evidence for each hypothesised mechanism of change in turn.
Did teachers have access to support and advice?
As a peer support service was set up in each intervention school, all teachers did, at least in theory, have access to increased support and advice. However, as reported in Chapter 4, Reach of peer support service, the teacher questionnaire data indicated that only a small number of teachers actually accessed the peer support service. The questionnaire also asked those who had not used the service why they had not done so. At both time points, the most frequent reasons given were that help was not needed, a preference for talking to other colleagues and a lack of knowledge about the service (Table 35). Other reasons, mentioned by fewer participants, were lack of time, having access to support elsewhere, not wanting to talk to the particular individuals involved, believing that it would not help and concerns about confidentiality.
Reasona | T1 (N = 506), n (%) | T2 (N = 463), n (%) |
---|---|---|
Did not need to | 230 (38.2) | 205 (36.8) |
Prefer to talk to other colleagues | 89 (14.8) | 89 (16.0) |
Lack of knowledge about the service | 73 (12.1) | 73 (13.1) |
Not enough time at work | 34 (5.6) | 35 (6.3) |
Access to support outside school | 39 (6.5) | 28 (5.0) |
Would not approach those particular people | 35 (5.8) | 22 (3.9) |
Did not think it would help/not professionals | 24 (4.0) | 22 (3.9) |
Concerns about confidentiality | 18 (3.0) | 21 (3.8) |
In keeping with the results from the peer supporter logs reported in the previous section (see Table 34), comments from the peer supporter focus groups did indicate that some teachers were using the service:
I rate the project, I think it’s in the forefront of people’s minds to talk to you. The volume at which they’re talking to you and sharing, I think it’s had an impact.
Intervention group, peer supporters, phase 2, school 4N
Yes, and likewise, the staff room, as you said, is really good. Because I had a member of staff who came and sat and spoke to me about quite a difficult thing that was going on, and would come and sit by me every break to talk about, and really, you know, we got to know each other through that, just when she’d started to talk and felt comfortable. And it went on and I followed that process right through. And I think somebody just listening to you and empathising with you, is often really, really powerful, even though they’re going through the most awful situations. And I think that’s where actually, if there is that opportunity to have tea and cake, yes, it’s great, it’s just currently, I don’t think that that, there is any room for people to manoeuvre within that because of everything that’s going on. Hopefully, when we get to a good place, I think that would be a good thing to try and initiate, you know.
Intervention group, peer supporters, phase 2, school 2L
Furthermore, there was evidence that the 2-day standard MHFA training had improved the quality of support that those acting as peer supporters could provide because of a better awareness of mental health problems and how to respond:
Those high-end issues, the signs to look out for, the symptoms, you know, it’s certainly improved my awareness of lots of different areas.
Intervention group, peer supporters, phase 1, school 1D
And as you were doing the sessions you were kind of thinking, oh that conversation I had with that person the other day, that kind of needs to be followed up because of a reason or something like that.
Intervention group, peer supporters, phase 1, school 1D
I think it has changed my language which is a weird way of describing it because you can be supportive without suggesting and I have stopped necessarily suggesting and it is about – what do you think will help you, what do you think your next step is? It is about not trying to solve the problem and I think just the change in language has helped.
Intervention group, peer supporters, phase 2, school 3P
Participants also highlighted the limitations of the service and the potential difficulty of providing the initial support but then having nowhere to signpost the person on to:
The other thing is have you have got the services that help and support and help people to move on to access the right things. Sometimes the right thing is we have had a whinge and it has made me feel better but sometimes it is not enough and it is about making sure that they can access other things. I have had one person talk to me over a period of time about coming back after a long term absence and he still feels that actually people don’t look at him in the same way because of his issues. But he is struggling and he could do with extra support in terms of school but you can’t really do it as well as perhaps you would like to.
Intervention group, peer supporters, phase 2, school 3P
In addition, comments from both the peer supporters (focus groups and logs) and the other teacher focus groups identified barriers to using the peer support service similar to those identified in the teacher questionnaires (see Table 35). Lack of understanding about how to access support, general concerns about confidentiality, a preference for using other informal or formal sources of support, and a lack of time for both those needing support and the peer supporters themselves were all raised:
I’d struggle to tell you who was involved in it or where to go.
Intervention group, 1-day MHFA for schools and colleges trained teachers, phase 2, school 3P
So when I did have issues with mental health, I thought about all of the people there that I would speak to. So one was my line manager, I didn’t want to speak to her about it because it felt like a conflict of interest. Several of the people on there I think would have told other people, so I didn’t talk to them.
So you didn’t trust the confidentiality?
No. And then, so I did e-mail one person and we were going to meet but then I was actually worried about people seeing me with her and then being like, well you would never, what reason do you have to speak to her, and putting two and two together, so I just didn’t speak to her.
So you were just worried about other people finding out?
Yes, because I think it’s frowned upon that, you know like for me, if I take my mental health issues to my employer and they kind of fob it off and they’re not bothered, it just left me, that’s what happened, and then it left me in quite an insecure place. So I just felt like I was, and I guess, which is typical of mental health, I felt like I was very much on my own with it and I felt quite paranoid when, you know, like actually, now I’m in a little bit of a better place, that seems even silly saying it, but at the time.
Intervention group, 1-day MHFA for schools and colleges trained teachers, phase 2, school 1D
I feel I’ve got people I can talk to about things and be honest about, especially my line manager.
Intervention group, 1-day MHFA for schools and colleges trained teachers, phase 1, school 3P
Yes, like people can’t come because they’re running after school sessions. People are working so hard and I think they just think that’s part of the job now, they just don’t think that coming to talk to us is going to make a lot of difference. I think they all just, people are right up here, aren’t they?
Yes, which is probably the worst time, everyone probably needs more support, but actually, putting your hand up and think, you know, I need that support now but I haven’t got time to do that.
Yes, and I think we haven’t got the time to give them the time that they need really.
Intervention group, peer supporters, phase 2, school 2L
Therefore, although teachers in the intervention schools did have access to a new form of advice and support and some made use of this, there were significant barriers to uptake that appeared to result in relatively few teachers consciously making use of the service.
Did teachers have increased insight into their own mental health and did mental health have a raised profile in schools?
There was evidence of increased mental health awareness and skills among those who had attended the MHFA training. The MHFA evaluation sheets showed improvements following the course in self-assessed personal confidence to help [change in mean score: 2.8 (1-day MHFA for schools and colleges) and 3.7 (2-day standard MHFA); p < 0.001] and in knowledge and understanding [change in mean score: 3.1 (1-day MHFA for schools and colleges) and 4.1 (2-day standard MHFA); p < 0.001] (Table 36).
Key skill | n | Before training, mean (SD) | Before training, range | After training, mean (SD) | After training, range | Change in score, mean (SD) | p-valuea |
---|---|---|---|---|---|---|---|
MHFA for schools and collegesb | |||||||
Personal confidence to help | 134 | 5.3 (1.9) | 1–10 | 8.1 (1.2) | 3–10 | 2.8 (1.8) | < 0.001 |
Knowledge and understanding | 133 | 5.2 (1.9) | 1–10 | 8.3 (1.1) | 5–10 | 3.1 (1.8) | < 0.001 |
2-day standard MHFA | |||||||
Personal confidence to help | 99 | 4.2 (2.1) | 0–10 | 7.9 (1.1) | 4–10 | 3.7 (2.0) | < 0.001 |
Knowledge and understanding | 98 | 4.0 (2.2) | 0–10 | 8.1 (1.1) | 5–10 | 4.1 (2.1) | < 0.001 |
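The exact test behind the p-values in Table 36 is not restated here; as an illustrative sketch only, a paired comparison of attendees' before and after self-ratings could be run as follows (the file and column names are hypothetical, and the paired t-test is shown as one plausible choice of test).

```python
# Illustrative sketch only: paired comparison of self-rated confidence before
# and after MHFA training, producing a change score and p-value of the kind
# shown in Table 36. File and column names are hypothetical placeholders.
import pandas as pd
from scipy import stats

ratings = pd.read_csv("mhfa_evaluation_sheets.csv")
before = ratings["confidence_before"]  # 0-10 self-rating before training
after = ratings["confidence_after"]    # 0-10 self-rating after training

change = after - before
print(f"Mean change = {change.mean():.1f} (SD {change.std():.1f})")

# Paired t-test on the before/after ratings for the same attendees.
t_stat, p_value = stats.ttest_rel(after, before)
print(f"t = {t_stat:.2f}, p = {p_value:.3g}")
```

A Wilcoxon signed-rank test (scipy.stats.wilcoxon) would be a natural non-parametric alternative if the 0–10 ratings are treated as ordinal.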
Many focus group participants who had attended either the 1-day MHFA for schools and colleges or the 2-day standard MHFA training course discussed an increased awareness of the need to look after themselves, how to do this and when to seek support. Some reflected that this was important in ensuring that they were better equipped to support those around them:
I think you’re a bit more aware of like thinking about yourself, making sure you care about yourself. And that we have such roles that are so, like so demanding and there’s so many different things going on, just to make sure that you yourself are OK and you’re in the right place to support others, because otherwise you can’t. I think that’s, I took that away from it.
Intervention group, 1-day MHFA for schools and colleges trained teachers, phase 1, school 2L
There’s so much pressure on you as teachers that I think having that awareness of the mental health training from a personal point of view, as well as being able to support each other as a staff is really important, because I think that if we are anxious and we are stressed, and we are finding it difficult to manage or to cope in some way, or we are experiencing mental health issues, then that does impact on the children as well. So having this training is important not only in the way we deal with the pupils, but the way we manage our jobs for ourselves. Because otherwise, if we let things get on top of us . . . It’s nice to know there are people we can talk to about it or there are things that we can do to try and think actually, I need to handle this situation better.
Intervention group, 1-day MHFA for schools and colleges trained teachers, phase 1, school 3P
The majority of attendees of the 1-hour awareness raising session also reported a small increase in knowledge and understanding in relation to all the topics asked about (Table 37).
Impact of session on knowledge of . . . | (N = 494), n (%) |
---|---|
Magnitude of mental health problems and their impact on young people | |
No impact | 69 (14.0) |
Small increase | 333 (67.7) |
Large increase | 90 (18.3) |
Magnitude of mental health problems and their impact on teachers | |
No impact | 84 (17.1) |
Small increase | 316 (64.4) |
Large increase | 91 (18.5) |
How to improve mental health and well-being | |
No impact | 87 (17.7) |
Small increase | 326 (66.1) |
Large increase | 80 (16.2) |
How to support someone in mental health difficulty | |
No impact | 110 (22.3) |
Small increase | 298 (60.5) |
Large increase | 85 (17.2) |
Knowledge of sources of help | |
No impact | 73 (15.0) |
Small increase | 247 (50.6) |
Large increase | 168 (34.4) |
Knowledge of the WISE intervention | |
No impact | 28 (5.7) |
Small increase | 200 (40.8) |
Large increase | 262 (53.5) |
Some focus group participants perceived a greater openness to talking about mental health in their schools, although it was not always clear how much this was down to the WISE intervention or how much it was influenced by a perceived growing awareness in society:
It is definitely a conversation that we’re all having. Members of staff are not frightened of using the words and having the conversations. Children are more and more asking for help, and maybe that’s because they need help, maybe that’s because we’re raising awareness.
Intervention group, peer supporters, phase 2, school 1D
I think there’s a better understanding of mental health, certainly, amongst the students, and there’s a lot of work on that, in terms of the knowing what it is and perhaps recognising it. As I say, the next steps now are, can we prevent it, and that side of things. I think there’s certainly an increase in the awareness of it and it’s just much more open, isn’t it I think, not just in schools but nationally as well. So that helps and it’s in the media and things like that, like those case studies and examples to be able to use and talk about when you do . . .
Intervention group, 1-day MHFA for schools and colleges trained teachers, phase 2, school 2L
I’d say staff attitudes have changed. I think it’s a lot more acceptable, or accepted from my perspective at any rate. I sort of see some attitudes around the school which before it was like you couldn’t even talk about one individual member of staff and now it’s like, oh he’s doing so much better, he’s come such a long way. From that perspective I think there has been a little bit of a shift.
I’d agree with [P2]. Just in terms of I think you mentioned acceptable versus accepted I think it has become more acceptable to discuss mental health and we are more accepting of mental health and what that involves. I think there is a slight shift that we’re going through that I think is being mirrored to a certain extent in society as well. I think it’s going hand and hand in some ways and at various rates but I think that is happening outside of the school environment as well which will obviously feed into our school culture then.
Intervention group, untrained teachers, school 3P
The teacher questionnaires did not ask about openness to discussing mental health; however, as reported in Chapter 2, they did ask about perceptions regarding how much the school cares about teacher well-being, the quality of relationships among staff and how often participants had given help to a colleague in the past academic year. As Table 23 shows, there was no evidence of a difference in these ratings by study arm at T2. Therefore, it appears that there were no large improvements to the perceived culture around support in the intervention schools as a result of the intervention.
In the student focus groups, participants were asked whether or not there had been any changes in the way that teachers at their school talked about mental health and well-being, to try and capture any increased openness that the students were aware of. Across all case study schools, and regardless of study arm, students reported that little had changed from their perspective:
So have you noticed any change in the way the teachers in this school talk about mental health and well-being over the past sort of year?
[Shaking heads.]
We don’t do much on this.
Intervention group, student focus group, school 1D
. . . Do you think anything’s changed, in terms of things being talked about in school, over the last year? . . .
About the same.
Control group, student focus group, school 2C
Did trained teachers have increased confidence and skills in supporting students?
As already shown in Table 36, the MHFA evaluation sheets showed an increase in self-assessed knowledge and confidence to help among those attending both the 1-day MHFA for schools and colleges and the 2-day standard MHFA training courses. Focus group discussions indicated that teachers who had attended either course found that they had been able to provide better one-to-one support to students and to raise students’ awareness of mental health and well-being (e.g. through using the course materials in assemblies and tutor time). Aspects of the course particularly highlighted as having been used with students were methods for dealing with anxiety, the concept of the ‘stress bucket’ and being able to identify when a student might need further help:
Some of the techniques and things they talked about, especially with the anxiety I found really useful just coming up, especially with some of the change in this school in the past year it’s been quite unsettling to some of the pupils.
Intervention group, 1-day MHFA training for schools and colleges trained teachers, phase 1, school 3P
Using the stress bucket when supporting them [students] one to one, they really find that they can relate to that. That makes a lot of sense for them. So when they’re trying to articulate how they’re thinking and feeling and why they’re thinking and feeling that, when we refer to that, it just helps them, ‘ah yes, no, that’s exactly what I’m feeling.’ ‘I get that now’ . . .
Intervention group, 1-day MHFA trained teachers, phase 1, school 2L
I have used some of the training skills with a number of kids because I would say it is almost like you have opened a wound and the number of children who are now comfortable with saying – I have issues, has gone up I would say. They come and see us when they are feeling a little anxious or a little of this or a little of that rather than letting it build up and up until there is some sort of explosion. We are seeing kids who are willing to say – I can’t cope right now.
Intervention group, peer supporters, phase 2, school 3P
It’s easier to spot signs of people dealing with anxiety and depression. You can refer with a bit more confidence to the pastoral team, you pick up on the nuance I suppose of dealing with those people.
Intervention group, peer supporters, phase 2, school 4N
One teacher from an untrained teacher focus group commented that they were impressed by the skills they had seen in trained colleagues regarding panic attacks:
I’ve seen different members of staff deal with panic attacks. The ones who are trained are phenomenal, I have full admiration, I’m watching and I’m thinking wow, they’re keeping their composure, erm, taking away the emphasis and working on their breathing and that’s fantastic.
Intervention group, untrained teachers, school 4N
Some participants did not feel that the training had given them new knowledge or skills, but generally still felt that it had reinforced what they were already doing, which helped improve confidence:
I think a lot of us, because we work within the pastoral team, are well aware of some of the aspects of the training. And I think the training basically clarified some of the aspects of what we do on a day to day basis. It’s stuff we undertake on a daily basis anyway.
Do others feel that’s a fair reflection or . . .?
Yes, I think it reinforced things we need to be looking out for. I don’t think I learned anything new per se from the training, but it was definitely a reinforcement of things, to support how you go about supporting young people.
Yes, I’d agree with that, yes.
Intervention group, 1-day MHFA for schools and colleges trained teachers, phase 1, school 4N
I found it really useful, in terms of validating what you already do and knowing you’re doing it right, and giving me a bit more confidence as well really in having those conversations. And to know that you are sort of like using the right skills and kind of giving them the right tools, so that was quite nice to have that sort of confirmation.
Intervention group, 1-day MHFA for schools and colleges trained teachers, phase 1, school 2L
Did teachers have trained colleagues to offer advice in supporting students?
Although there was good evidence that those who completed the training had improved skills and confidence, there was little evidence of this learning being shared or of untrained staff asking trained colleagues for advice on supporting students. Only one teacher mentioned being approached by another member of staff for advice on supporting a student, and another described being more proactive in supporting colleagues:
I think it just makes you more aware of, for example I knew when I took a pupil back to a class and the teacher was really struggling, and I was really concerned. Then I checked back and said – I was really concerned are you OK? I think that member of staff was quite shocked that I had actually bothered to go and see him to make sure he was OK.
Intervention group, 1-day MHFA for schools and colleges trained teachers, phase 2, school 3P
The lack of knowledge-sharing may reflect contextual barriers, for example the limited time in schools to share learning and a failure to embed such learning in wider school processes (something that is discussed further in Chapter 4, Contextual challenges to the Wellbeing in Secondary Education intervention).
Did staff and student relationships improve?
Both the teachers and students were asked about teacher–student relationships in the questionnaires. At baseline and T2, the vast majority of teachers in both study arms agreed that their school cares for student well-being and has good teacher–student relationships (Table 38). A similar majority in both study arms reported providing emotional support to a distressed student at least once a month, with this proportion increasing slightly at T2. At T2, there were no strong differences by study arm in these variables.
Questionnaire items | Baseline: control, n (%) | Baseline: intervention, n (%) | T2: control, n (%) | T2: intervention, n (%) | Adjusted ORa,b (95% CI) | p-value |
---|---|---|---|---|---|---|
School cares about student well-being | 587 (95.1) | 522 (93.9) | 350 (93.8) | 294 (91.3) | 0.48 (0.14 to 1.61) | 0.23 |
Good teacher–student relationships | 602 (97.4) | 538 (96.6) | 359 (96.0) | 312 (96.9) | 1.17 (0.35 to 3.92) | 0.80 |
Supported a distressed student at least once a month | 426 (68.7) | 369 (66.4) | 438 (72.5) | 370 (72.7) | 0.90 (0.73 to 1.10) | 0.29 |
In the case study schools, teachers in both the intervention and control arms talked about having good-quality relationships with students, with no obvious differences by study arm and no comments indicating that the intervention had led to improved teacher–student relationships:
I think we’ve got pretty good relationships with most of the students in the school as well, and it’s the sort of nurturing culture that we’ve got here, where we do encourage pupils, if there is an issue, speak up. And I think they’re pretty comfortable in doing that, which I think is the first step towards finding out what the problem is, making that initial contact.
Control group, untrained teachers, school 3O
We’re there for anybody who wants us, we do our job, we listen, and we do our best to help anybody, really. And that’s adults or children, or the pupils, really. And I do feel like they feel comfortable coming to speak to us.
Intervention group, peer supporters, phase 1, school 3P
The logic model was based on the assumption that if teachers have improved support for their own mental health then they will feel better able to maintain positive relationships with students, a logic endorsed by teachers in both intervention and control schools. Given the lack of improvement in overall teacher mental health following the intervention, it is perhaps not surprising that there was no difference by study arm in the perceived quality of teacher–student relationships by the end of the study.
Turning to the students’ perspective, Table 39 shows that at T2 students in the control arm had more favourable perceptions than those in the intervention arm of the quality of teacher–student relationships and of whether or not teachers care about and listen to students. Perceptions generally became more negative across both arms for these variables, but this change was more pronounced in the intervention group. There were no large differences by arm regarding whether or not the school cares for student well-being, whether or not teachers are there when the student needs them and whether or not teachers are interested in the student.
Questionnaire items | Baseline: control, n (%) | Baseline: intervention, n (%) | T2: control, n (%) | T2: intervention, n (%) | Adjusted ORa,b (95% CI) | p-value |
---|---|---|---|---|---|---|
School cares for student well-being | 1213 (85.7) | 1119 (87.4) | 1097 (76.9) | 959 (75.0) | 0.88 (0.72 to 1.08) | 0.22 |
Good teacher–student relationships | 1007 (71.2) | 977 (76.3) | 1023 (72.3) | 854 (67.5) | 0.80 (0.66 to 0.96) | 0.02 |
Teachers care about students | 1211 (85.6) | 1154 (90.0) | 1158 (81.8) | 1016 (79.9) | 0.79 (0.64 to 0.99) | 0.04 |
Teachers listen to the students | 1065 (75.5) | 986 (77.5) | 934 (66.3) | 745 (58.9) | 0.64 (0.53 to 0.77) | < 0.01 |
Teachers there for me when I need them | 1043 (74.7) | 994 (78.6) | 994 (71.5) | 899 (71.3) | 1.02 (0.84 to 1.25) | 0.83 |
Teachers are interested in me and my life | 649 (45.9) | 634 (50.2) | 605 (42.8) | 555 (43.4) | 1.03 (0.86 to 1.23) | 0.74 |
Asked teacher for help with a personal problem | 649 (45.4) | 625 (48.5) | 557 (38.8) | 524 (40.6) | 1.11 (0.92 to 1.33) | 0.28 |
Teacher was helpful when asked for help | 551 (86.2) | 548 (86.1) | 454 (82.1) | 433 (83.4) | 1.21 (0.79 to 1.86) | 0.38 |
In the student focus groups, observations regarding quality of teacher–student relationships were remarkably similar between study arms. Both groups reported that relationships tended to be variable, with good relationships dependent on the individual teachers and the qualities they exhibited:
Do you think that teachers and students at your school have a good relationship?
It depends who the teacher is.
Yes, it’s all about who it is.
Intervention group, student focus group, school 2L
I know some teachers will ask to see you at the end of the lesson and some do ask if you’re like alright, if you’ve been upset in the lesson . . . but it’s not like all teachers. It’s only like certain teachers you get along with.
Control group, student focus group, school 2C
Some students acknowledged that the quality of relationships is shaped by both parties, highlighting poor student behaviour towards teachers and also the way vicious circles are created:
Some of the students like are hard to handle . . . Like some of them just, they just always act up for no reason . . . It’s only sometimes now but when we were younger, it was more often . . . before, when we was in a class, there was the teacher and this kid went mental at her and she started crying.
Control group, student focus group, school 1K
Some students are really disrespectful, they’ll tell them to ‘f off’ and calling them the ‘c word.’ It’s disgusting the way they talk to them, and I feel bad for them in that sense because they can’t really do much, and they’re too soft on them I think. Fair enough they’ll call down on them and call the head [head teacher], they’ll exclude them for a couple of days for it, but I think there should be more in place for that, because it’s not acceptable.
Control group, student focus group, school 3O
I think it’s just whether you have a good relationship with your teacher or not, you have to have a good relationship to be able to work, like I don’t have a good relationship with him [maths teacher], I don’t like maths whatsoever. I do play up in maths I’ll admit that, because I don’t like him. If I had a better relationship with him, I think I’d do better.
Intervention group, student focus group, school 4N
Some students in both arms described extremely negative relationships with teachers and noted that this would put them off seeking help from a teacher:
Most of us, we’re scared of the teachers.
There’s some people that are, like I am scared.
What are you scared of?
Like, I don’t know, like say in a lesson the teacher asks me a question, like in front of the class, I would always say the wrong answer because I’m like, like would she shout? Like the teacher tells us, guess. And when we do guess then she says it’s wrong and then she makes fun out of us.
Intervention group, student focus group, school 1D
. . . some teachers like, I don’t know, like if they’re like having a bad day . . . they’ll just shout at us for not getting it. And there was one teacher who called another student . . . I wasn’t in the lesson, but they called him a . . . douche or something like that.
Control group, student focus group, school 1K
My tutor scares me. No, like honestly, whatever he says, he actually scares me. Like I wouldn’t go to him whatsoever, he’d be the last person to go to.
Intervention group, student focus group, school 2L
In a number of discussions, students noted their belief that teachers cared much more about their academic performance than their well-being:
I don’t think teachers really, I know it sounds really bad, but care, I don’t think they really care . . .
It doesn’t feel like they care anyway.
They would rather you succeed than have a good mind-set on stuff.
Intervention group, student focus group, school 4N
They just generally don’t listen to the students, like they don’t want to, I don’t know how to explain it, they just don’t want to listen. They only tell us, do your work and, you know, the negatives . . . They don’t care about anything else.
Intervention group, student focus group, school 1D
These perceptions of teachers’ priorities are likely to reflect the wider context in which teachers work, which is considered further in Chapter 4, Contextual challenges to the Wellbeing in Secondary Education intervention.
Did students receive improved mental health support?
Chapter 4, Did trained teachers have increased confidence and skills in supporting students? presented evidence that trained staff had improved skills and confidence in supporting students. However, this did not appear to translate into more support being provided in the intervention schools than in the control schools (see Table 38).
Similarly, results from the student questionnaire showed that the number of students asking a teacher for help with a personal problem and the proportion of those students finding the teacher helpful were similar between study arms at T2 (Table 40). In both study arms, around two-fifths of responding students had asked a teacher for help at T2, and a large majority of those who did found the teacher helpful.
Questionnaire item | Baseline: control, n (%) | Baseline: intervention, n (%) | T2: control, n (%) | T2: intervention, n (%) | Adjusted ORa,b (95% CI) | p-value
---|---|---|---|---|---|---
Asked teacher for help with a personal problem | 832 (47.7) | 804 (49.3) | 637 (39.1) | 580 (40.0) | 1.11 (0.92 to 1.33) | 0.28
Teacher was helpful when asked for help | 707 (86.2) | 691 (86.7) | 521 (82.2) | 484 (84.0) | 1.21 (0.79 to 1.86) | 0.38
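For readers less familiar with odds ratios, the size of the between-arm difference in Table 40 can be illustrated with a crude calculation from the T2 percentages for help-seeking. This is a sketch for illustration only; the adjusted odds ratio reported in the table will differ because it adjusts for other factors (plausibly baseline responses and school-level clustering, although the footnotes defining the adjustment are not reproduced in this section):

\[
\mathrm{OR}_{\text{crude}} = \frac{p_{\mathrm{I}}/(1-p_{\mathrm{I}})}{p_{\mathrm{C}}/(1-p_{\mathrm{C}})} = \frac{0.400/0.600}{0.391/0.609} \approx 1.04
\]

where \(p_{\mathrm{I}} = 0.400\) and \(p_{\mathrm{C}} = 0.391\) are the T2 proportions of intervention and control students who had asked a teacher for help. A crude odds ratio this close to 1, together with an adjusted 95% confidence interval spanning 1, is consistent with no meaningful difference between arms.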
For students in the intervention schools to be more likely to seek help from teachers, their perceptions regarding teachers as a good source of support would have needed to change. The student focus group data did not show any evidence of this, with a number of examples shared where students felt that appropriate and timely support had not been given:
I can probably count the amount of teachers that would help on like one hand really, because I don’t really feel like they do anything. Someone in this school . . . was very depressed . . . there was always this constant look on his face every single day, literally, screaming, I want someone to talk to, I want someone to help me . . . it took his best friend to run to him to actually talk to him, from helping him to stop something that he would have done. And instead, it was meant to really be a teacher’s job to really make him sit down and say, are you OK, do you need someone to talk to?
Intervention group, student focus group, school 1D
I think they could do better, in acknowledging the signs of it more. Cos’, like, I saw a year 7 girl who’s mum had to physically pick her up and carry her up the stairs, and this was like a month or two into term, if the teachers had noticed ‘why doesn’t she want to come into school’ before, then they could have avoided all of that. Like we were walking to lessons and all we could hear this girl screaming, I felt really bad.
Control group, student focus group, school 3O
Although some students were able to name staff or processes through which they could seek help, barriers to doing so were noted, and these had similarities to the barriers that teachers cited in relation to using the peer support service:
I’d rather someone who, well who’s never been to the school and no one’s ever really seen because usually, if like a teacher that’s been here for a while decides to take up that job, it would kind of be, people would kind of get the wrong idea and expect, know what they’re going to say, if that teacher has a reputation of maybe shouting at a student. When they sat down there with that teacher, she’ll say, oh you can talk to me. They’ll say, oh obviously not, since, for what you’ve done in the past.
Intervention group, student focus group, school 1D
On the back of everybody, all the teacher’s badges they’ve got numbers and stuff for people but again the people that you’re meant to go to I don’t feel comfortable. There are people, other teachers that I’d feel comfortable going to but not the ones I’m meant to.
Control group, student focus group, school 4W
I wouldn’t talk to my teacher. I feel like they judge you and then they will tell their teacher friends.
Intervention group, student focus group, school 2L
That’s the thing with school I think there’s so many students I don’t think they can just focus on one that has probably the same thing that most students have, because with exams and everything, probably half of the school has like a mental issue, it’s just the way it is these days, most teenagers do.
Intervention group, student focus group, school 4N
On the other hand, students in both the intervention and control groups made positive comments indicating that it could be helpful to talk to teachers in school about mental health, and that they felt they could do so confidentially. Sometimes that person was their tutor, but the most important criterion was that it was someone with whom the student felt they had a good relationship:
Sometimes when you feel sad or something you just want to tell someone to get it out of the way . . . And so telling a teacher is probably better, like it feels better because you know they’re an adult, and you can trust them but if it’s not really something like that bad, it’s something that’s just blown out of proportion.
Yeah, I’d either go to the head of year, or a teacher that I feel like I can trust, that I have a really good relationship with . . . one that you feel like that you can talk to. It doesn’t matter if you have them for tutorial . . .
Intervention group, student focus group, school 3P
Tutors tend to be like the most understanding and the one you trust the most. I mean, you know, we’ve done like the whole school, every morning you see them in school.
Intervention group, student focus group, school 2L
Was teacher stress related to supporting students alleviated?
As noted by students in Chapter 4, Did staff and student relationships improve?, one source of teacher stress is poor student behaviour; therefore, if the intervention strengthened support for students, leading to improved behaviour, this source of work-related stress might be reduced. In addition, feeling better equipped to support students (through skills learnt on the training courses and through having access to better support themselves) might also reduce stress.
The teacher questionnaires did not ask specifically about stress relating to student support, but they did ask more generally about stress and satisfaction at work. As shown in Table 6, at baseline, a small majority of teachers were satisfied with their work, and just under half found their work very or extremely stressful. The exploratory analysis comparing these outcomes by arm found no difference between intervention and control schools at follow-up (see Chapter 3, Table 23). Given that there was no strong evidence that support for students at the whole-school level had materially changed, and that teacher well-being and mental health had not improved, it is unsurprising that stress and satisfaction levels had not improved in the intervention schools.
In the focus groups, there was evidence that teachers found supporting students stressful. Staff talked about a perceived increase in problems among students, which might be expected to lead to increased stress among teachers:
If you’re a counsellor, you have no choice but after a certain amount of time to be supervised by another counsellor in terms of to offload, to clear your own mental health and likewise in the school environment, we’re never given that opportunity and I think especially with the heads of year or pastoral, he does, they can deal with a phenomenal amount of issues ranging from the trivial friendship, to suicidal. Now if it’s a Friday afternoon and they’ve dealt with, I know from experience . . . Friday afternoon they may deal with three suicide attempts, and then it’s right OK, go back to see the kids and off I go, and then all weekend, absolutely fixated on those three students, hoping that in the next 48 hours nothing happens and yet there’s no well-being for those members of staff to actually offload that on someone else, if you see what I mean.
Intervention group, untrained teachers, school 4N
It’s going to become a bigger and bigger and bigger issue in schools for staff and for students. And actually, staff being able to help students is going to become more important because we need students to be more resilient, because what they’re being asked to do is so much more difficult. It’s at a time when curriculums are being squeezed into subjects that are thought of as important and, therefore, you know, expressive arts, things like that, are being squeezed. Everything that’s written in the press. So, you know, I think it’s tough being a student between the ages of 11 and 18, it’s tough being a teacher.
Intervention group, peer supporter, phase 1, school 1D
Yes, and access to people like Off the Record [young person’s mental health charity, Bristol, UK] and stuff, I don’t think it’s that straightforward. I’ve had students that have self-harmed and their parents have tried to contact Off the Record, and there’s another one as well I think, at CAMHS, and they’ve heard nothing back from them. And that’s, I know that it’s, obviously, an organisation that’s voluntary but, you know, there’s only so much that I can talk to a child about that kind of stuff, and they do need some more expert help with that. And sometimes, I think, those organisations, people slip through the cracks I think, and that’s really unfair.
Control group, untrained teachers, school 2C
There was no indication from the data that the WISE intervention was effective in addressing this important source of stress.
Contextual challenges to the Wellbeing in Secondary Education intervention
In understanding why the mechanisms in the logic model had or had not been activated, it is important to consider the context in which the WISE intervention took place. Many of the focus group discussions and head teacher interviews in both intervention and control schools highlighted the difficult job that teachers do, and a perception that the profession is becoming more challenging. In addition to the stress caused by student behaviour and concern about student well-being discussed already, another key issue raised was workload and its consequent impact on teachers’ mental health and well-being:
As times goes on I think people will always say the teaching is becoming more and more stressful I think if you look at lots of public service jobs whether that’s in health care or police or whatever. Resources are being cut, demands are going up, targets are being set left, right and centre. And it’s fair to say that people are finding it more and more difficult.
Intervention group, head teacher one-to-one interview
Workload.
. . . in the last few years we’ve had an increase in workload so we’ve had fewer free lessons. We’ve lost I think three now over 5 or 6 years. Plus more initiatives and bullshit, excuse my French. So it’s just getting worse and worse and worse and next year we’re losing another lesson. Yeah. People are just sort of on the floor.
Intervention group, peer supporters, phase 1, school 4N
Yes, my first year of teaching felt, it was just like, at the end of the day, just like lying on the floor being like, I’ve just had so many battles today, and you don’t battle with anybody anymore. It’s like you put your head on the desk and you collect your pens and now you leave.
Control group, untrained teachers, school 1K
I think the sense of guilt in our job is massive. It’s like we’ve all been there before and dragged ourselves into work barely able to stand and I know for myself I’ve had one period of sickness since I’ve been in this school and that was literally, well my face was out here and there was no way I could come into school, but by the time I came back, the first person I met was like, oh my God, you’re back, thank God for that. You can’t go off on the sick again and I had that all day, because of the kids I work with, that sense of guilt is enormous.
Intervention group, untrained teachers, school 3P
Part of the workload pressure came from teachers being held accountable for students’ results and having to meet unrealistic attainment targets, often without consideration of the broader context. Teachers were aware that this could be to the detriment of student well-being, and this offered an explanation as to why, as discussed above, students perceived teachers to be concerned primarily with their performance rather than their mental health:
I think perhaps in the sort of last 10 years or so there’s been an increase in how, yes, it’s not enough just to be a student and do your best. It’s about actually having this very, sometimes quite, well very challenging in many cases, facilitate unrealistically challenging targets to try and attain. And there’s not that wider appreciation of a particular student’s context that can be applied to that data to make it a bit more real. And I think that’s sometimes very, very difficult to balance off as a teacher as well because you’re kind of, I know this child is doing their best, they may well not get this particular grade, but I know that we’re doing all we can, they’re doing all they can, and yet, there’s that kind of relentless expectation to keep on pushing and keep on pushing. And it’s that danger, kind of being aspirational versus being unrealistically, or being unrealistic, isn’t it?
Control group, untrained teachers, school 2C
As described in this chapter, the WISE intervention had high acceptability in this context. The majority of participants who attended the training were enthusiastic about the MHFA training courses and their potential value. Most participants were also positive about the peer support service in principle. A number of head teachers reflected on the dynamism and enthusiasm displayed by training attendees following course completion:
Yes, I mean staff were really excited, they were really bubbly, they were really enthusiastic about it. And they’ve had a couple of meetings as a group of staff to talk about, right, we’ve done the training, it was brilliant, really useful, how are we going to impact on staff, as well as on students?
Intervention group, head teacher one-to-one interview
This high level of initial acceptability could be explained as schools having reached a ‘tipping point’, where they had become increasingly aware of the need to address staff mental health and well-being. A number of participants reflected that schools’ well-being agenda had historically focused on students, whereas the needs of school staff had remained largely invisible. For example, one teacher made this observation about the school’s pastoral support:
I’d like to have access to a counsellor as well. I mean I’ve actually, I’ve suffered from, I’m not ashamed to say, I’m diagnosed with depression, and I actually went to the school nurse and the school counsellor and said, is there any opportunity for me to have some counselling? They were like, oh you can phone this number or such and such, go and see your GP [general practitioner]. But it was, no, you can’t come and talk to me, only the children can. I thought, that’s a shame.
Intervention group, peer supporters, phase 1, school 2L
The emergent prioritisation of staff well-being reflected not only its inherent value, but also a perceived link to teachers’ ability to undertake their professional role and to the mitigation of burnout and absence:
I think the peer training that took place at the start of this term, to have staff in school who are accessible for other staff within school and not members of the leadership team I think gives us a far better conduit to deal with issues than perhaps otherwise would have been the case. It is about making sure that our staff feel valued really and if your staff feel valued, they tend to be happier, if there is such a thing. It can have an impact on staff absence and that, in turn, has an impact on outcomes. You know I don’t think we can divorce the two. We are an outcomes driven organisation but it is about ensuring that the workforce feel valued and if the workforce feels valued and supported, the likelihood is that they are more likely then to fulfil the goals of the organisation aren’t they.
Intervention group, head teacher one-to-one interview
There was specific acknowledgement of the current sociopolitical context of ‘austerity’ in which schools were operating, where sustained and increasing resource cuts were reducing the availability of external support (e.g. youth workers, educational psychologists). The WISE intervention was considered an innovative and necessary approach to strengthening teachers’ resilience so that they could shoulder this extra burden:
I’m glad it’s being addressed because I think with services being cut and less options to have to refer students, I can see why they’re making, like putting it in schools because they’re kind of replacing services. And then one thing we do need is to have a replacement for it, so I think that’s really good.
Intervention group, 1-day MHFA for schools and colleges trained teachers, phase 1, school 1D
. . . how schools move to up-skill staff now is absolutely vital in terms of what we’re talking about, in terms of the mental health, in terms of the well-being of students. Do you know what I mean? It is, you know . . . we’re not going to be able to employ, because we haven’t got the budget, a raft of educational psychologists. So it’s whether or not there are certain interventions or certain training schemes.
Control group, head teacher one-to-one interview
However, some participants reported what they felt was a tokenistic approach to teacher well-being and mental health, as opposed to genuine investment and commitment from senior leadership, with a few hinting that this may have been the approach taken to the WISE intervention:
The one thing I would say in this school context, is it has to be valued by senior management or it’s not going to go anywhere. Quite often things are tick-box exercises and they’re not committed to and they’re not evaluated and they’re not valued and there’s so many initiatives that they’re like, let’s have a go at this, let’s have a go at this. I would hate something as important as this, or potentially as important as this to just be, oh well there we are, we’ve just ticked another box. It’s got to be meaningful.
Control group, untrained teacher, school 4W
I think a lot of the things we do here is a bottom-up approach, rather than a top-down approach, so it is . . . the initiative is taken from staff and maybe for me, I think that should be more the role of leadership or management, and I don’t think that they are that proactive. It comes from the staff body, so it’s always a bottom up approach.
Intervention group, 1-day MHFA for schools and colleges trained teachers, phase 2, school 4N
Would you recommend other secondary schools to use mental health first aid training?
I would say, yes, I would, but I think you would have to, it would have to be used in such a way that it was something the school genuinely wished to do, rather than something, which might have been just ticking a box and adding on. And I think there’s a danger that, you know, a lot of, not just MHFA but all sorts of things in a similar manner tick a box, which the school management would like to tick and which are under pressure to tick. And whether an initiative is genuinely embraced or not is something quite different to that.
Do you feel that’s the kind of approach this school may have taken, like a tick box?
I’m not really sure I should respond to that.
Intervention group, 1-day MHFA for schools and colleges trained teachers, phase 2, school 1D
Perhaps not surprisingly in the light of such observations, a number of teachers discussed their frustration that the intervention had not become embedded in school life and ongoing processes:
There’s been no, we’ve talked about it again but there’s been, it hasn’t gone into kind of the whole-school ethos. So there’s been no time to reflect, no time to encourage me to look at like the ALGEE model or how I’ve used it. There’s been no, it hasn’t been linked to my performance management, you know, like it hasn’t gone into the ethos of the school, no time’s been given to it and it hasn’t been reflected on, so, therefore, it just seems like it was a thing we did just for doing sake.
Intervention group, 1-day MHFA for schools and colleges trained teachers, phase 2, school 1D
Senior leadership were supportive initially but there has not been much interest from them since the set up. None of them have offered time or leniency with other duties to support the running of the peer support, or spoken to the team about uptake or usage.
Intervention group, peer supporter feedback meeting, phase 2, school 4S
Such comments indicated that, for at least some participants, the WISE intervention was seen as a minimally disruptive approach, a tick-box exercise rather than a serious attempt to address the structural determinants of teacher mental health and well-being. This may have diminished its acceptability over time, replacing initial enthusiasm with disappointment.
This failure to instigate meaningful change at the school level meant that the peer support service struggled to become established, as peer supporters found it difficult to give it the time it needed. Furthermore, a number of participants described their school environment as characterised by a judgemental and punitive culture, which left them reluctant to admit to any personal or professional challenges:
I don’t often like to admit when I’m struggling. I do feel in this job it’s a judge culture, and I don’t like to admit when I’m struggling and when I’m finding things really difficult, or difficult to complete, or whether I’m able to keep up on marking. So whilst it’s there, and as [P3] said, it does raise awareness. Again, I always have that in the back of my mind.
Intervention group, 1-day MHFA for schools and colleges trained teachers, phase 1, school 4N
This directly impinged on the intervention’s acceptability, manifesting in fears about whether or not the confidentiality of the peer support service could be maintained:
I think it can be good. I think maybe in this instance there were too many. I also think it needs to be, I was a bit worried because when I look at that list, I see some of the biggest gossips in school on that list. And I worry, as a member of staff, if it’s meant to be confidential, how confidential would it be? So I would worry to approach some of the members of staff who are on that list. Now I, obviously, assume, in the training, that they probably would have been informed that this is a confidential service and that they should maintain confidentiality. But I think, when I see that list and I know some of the names on that list, I think, I don’t know if I want to go to that person and give out all of my concerns and issues about my mental health.
Intervention group, non-trained teachers, phase 2, school 1D
Summary
Although aspects of the logic model were achieved in the intervention schools (notably staff having access to support and advice, awareness of mental health and self-care being raised and trained staff having improved skills in supporting students), there was little evidence of whole-school change in terms of staff and students feeling more supported and the quality of teacher–student relationships improving. The WISE intervention was delivered against a challenging backdrop of staff feeling overwhelmed by workload and pressure to perform, a perceived lack of genuine care from senior leaders and a culture of fear, and there was little evidence that it had addressed any of these issues. Ultimately, these failures to instigate whole-school cultural change may explain why the intervention did not have a positive effect on the outcomes. They may also explain the finding that, by T2, according to some measures, control school participants actually had a more positive view of their school culture than participants in the intervention schools, because they had not had their expectations for change raised. The next section explores other possible iatrogenic effects of the intervention.
Unintended consequences and potential harms
Did the training cause harm to attendees?
Given that the training covered many common mental health issues and was participatory in nature, it had the potential to raise sensitive subjects for participants. Two comments from study team members during observed sessions indicated that one participant became upset out of frustration at the lack of mental health services and that another became upset during the section focusing on suicide:
During the discussion around psychosis one participant became slightly upset at the lack of appropriate service available to support young people and the school. She gave a personal example when a young person exhibiting sign of psychosis was referred to inappropriate services and the negative impact on the young person and their family.
Intervention group, 1-day MHFA for schools and colleges training, observation comment, school 4N
In 2-day observation, in one school [1D] a participant left during the section around suicide. One participant went out during one of the trainer’s anecdotes about suicide, upset – another participant followed them to check if OK. Participant then left for the day.
Intervention group, 2-day standard training, observation comment, school 1D
In these cases, the standard MHFA protocol was followed (i.e. trainers ask participants leaving the room to give a thumbs up if they are OK, otherwise the trainer checks with them whether or not they need further support).
On the feedback forms, no participants reported detrimental effects as a result of the training. In fact, as noted in Chapter 4, Fidelity to training components and adaptations, the comments on these forms were overwhelmingly positive, in spite of what trainees acknowledged were difficult topics. During the focus groups, a small number of participants reported that the training had brought up past experiences and issues for them, but that they had not been unduly distressed by this.
Did the peer support service cause harm to peer supporters?
In the peer supporter logs, there were a number of comments indicating that being a peer supporter could engender positive feelings of self-worth. Peer supporters did not record any major negative impacts (to themselves or users of the service), but there were a number of comments about the difficulties in finding time and space to provide support. There were also a smaller number of peer supporters who indicated feelings of powerlessness (when they did not feel that they could help or that anything would be changed in school) and upset (because of a very difficult problem or because they were experiencing something similar themselves):
Just the rushed nature of it; for example one discussion was when I was in the toilet! The colleague needed to talk so I ended up talking in there and had no break as a result. I didn’t mind but there seems to be no time to talk properly.
Peer supporter log, school 4V
I felt a bit helpless as I could see this person was stressed. I spoke to them about practical issues and ways that they may cut the workload. However, they could not see the wood for the trees in this situation and it got worse. They are currently off work.
Peer supporter log, school 4Q
Emotional stress and worry for that individual you are concerned about, that come to see you. Lack of time to spend with staff and follow up.
Peer supporter log, school 2F
Although a buddy system was recommended in the peer supporter guidelines, some peer supporters still commented on a lack of emotional support for themselves when carrying out the role:
Some staff were visibly upset when talking about the pressures that being a member of peer support staff puts upon them, especially when considering their own individual situations. The group expressed concern that with no support ‘above’ them in place (e.g. a counsellor, etc.) to offload upon then the role of PS [peer supporter] becomes very emotionally draining.
Field notes from peer supporter feedback meeting, phase 2, school 4S
It’s an added difficulty when you’ve got a five-period day and suffer with mental health problems yourself. It can be a bit overwhelming. It got to the point where I had to avoid someone sometimes just to get a drink at break. It’s also stressful when a member of staff talks about self harm. While the school business manager has been very supportive, a number of HOD [heads of departments] under great pressure themselves have not been.
Intervention group, peer supporter logs, school 2L
The intention was for peer supporters to receive ongoing support from HSCs (Wales) and CAMHS link workers (England). However, this did not happen because the trained HSCs did not cover the relevant schools in their day-to-day work and because of a lack of capacity among local CAMHS teams. If the intervention were to continue, finding a sustainable way for peer supporters to receive support would be important.
Did the peer support service cause harm to users?
Unfortunately, we were not able to interview users of the peer support service as planned, despite attempts to recruit through the peer supporters. In addition, despite including direct requests in the teacher follow-up questionnaires, we received only one volunteer to take part. Therefore, no data were collected from the users’ point of view on the benefits or harms of using the peer support service. As noted above, the peer supporters did not log any episodes of harm that they perceived had happened to users of the service as a result of taking part in the study, although the lack of time and of safe spaces in which to provide support may have meant that the experience of using the peer support service had some negative consequences:
Yes. And it’s like that one where we got interrupted and it was really, that was the end of it that time. So I made a point of finding them early the next day and sort of saying, sorry, we were interrupted, you know, I’ve got some time now or we can arrange another time, sort of thing, because I didn’t, because I felt embarrassed by that, I have to say, because it was just like the moment had gone, so to speak. And I think it had taken them a bit of time to sort of build up to.
Intervention group, peer supporters, phase 1, school 2L
One person was so anxious about being seen talking to me by certain members of staff, we had to come up with a cover story so she could feel more relaxed to talk, and then find a remote space where she felt more comfortable talking.
Peer support log, school 1H
One peer supporter described the negative consequence of a line manager suspecting that support had been given, showing that the concerns about confidentiality described earlier may have been justified, in this school at least:
It was quite a tricky situation with a member of staff, it was only one really particularly, and I was trying to talk to her about possible actions because of, you know, line management roles and all the rest of it. And I was making suggestions, which, and it was, and the line manager had a few pops at me, in terms of, well it’s quite clear you’re supporting that member of staff. And I found that quite difficult, I mean I rode it off and said, well, I just ignored it really.
So the line manager knew that they’d spoken to you?
Yes, it was uncomfortable really, that was uncomfortable. I mean it wouldn’t stop me doing what I was going to do and did, because my, because I felt the member of staff was trusting me and I was trying to advise them, not advise them, trying to say, well you could do this, you could do that, and then it’s up to you really what you want to do. But that was awkward and I didn’t enjoy that much.
Intervention group, peer supporters, phase 2, school 2L
Possibly as a result of such concerns, the peer supporters in one school reported that, since the formalisation of the support, informal supportive conversations had decreased:
They feel this may impact staff feelings re [regarding] confidentiality and they feel the more informal ‘how you doing’ conversations have decreased since introducing the service.
Intervention group, peer supporter feedback meeting, phase 1, school 2A
Formalising support from designated members of staff may therefore have had the iatrogenic effect of making colleagues reluctant to approach them to talk about problems, which may have led to a less supportive environment within the school. However, there were no differences by study arm in perceptions of how much the school cared for teachers, or in the extent of support given to or received by colleagues. Therefore, the data did not indicate that colleagues became less supportive towards each other as a result of taking part in the study.
Programme differentiation and contamination
A final, important set of questions explored in the process evaluation concerned what ‘usual practice’ entailed, whether or not there was contamination in the control schools, and what major events occurred during the study that may have interfered with the intervention.
The audits were intended to be the main data source with regard to these issues. However, there were some difficulties with their completion, as key contacts from the schools were not always able to answer all of the questions without consulting their colleagues. Although we invited them to do this, or to put us in touch with other relevant informants, some data were incomplete. In addition, some of the questions were open to interpretation and we were not always able to gather evidence to support responses. We completed an audit for all schools at baseline, as planned. However, at T2 we managed to collect audit data from only 18 of the schools, despite several attempts to contact the relevant staff members.
Usual practice in all schools at baseline
According to the audits, at baseline none of the schools had a specific well-being policy for students, only one had a policy for staff and another had a ‘stress at work’ policy included in its overall health and safety policy. However, most schools had some initiatives to improve staff and/or student well-being, with similar activities featuring in both study arms. For staff, these included mindfulness training, well-being groups, addressing workloads (e.g. new methods of marking), flexible working, regular social times to share food and drink and formal staff support (e.g. through line management and recognition of good work). For students, initiatives included pastoral care and extra support for vulnerable students, special assemblies promoting well-being and raising awareness of mental health, mindfulness lessons, peer mentoring and personal, social, health and economic education covering well-being issues. One school had a pupil well-being group.
In the majority of the schools, informants reported that both students and staff had access to support services. For staff, this was mostly off-site via a range of providers and included occupational health, employee assistance programmes and local authority-funded support. For students, counsellors were generally available on site, either employed directly or working for local organisations. Several schools reported that these counsellors were available part-time only.
Did the control schools take up similar interventions to the Wellbeing in Secondary Education study during the study period?
There was no strong evidence in follow-up audits of a large increase in support for staff or students in the control schools. Data from the head teacher interviews and staff focus groups revealed a range of initiatives for staff and students that schools had recently implemented, although it was not always clear whether these had started before or after baseline measures were collected. Those that focused directly on students included mindfulness training for vulnerable students, a writing project for year 11 students (aged 15–16 years), activities to help students manage stress (including exam stress), student peer support and homework clubs. One school mentioned that they were co-operating with an external agency, ‘Kooth’ (London, UK), to provide online mental health support for students:
. . . we’ve got the Kooth pilot . . . which is happening, which has been launched with all our year 7/8/9/10 students at the moment.
Control group, head teacher one-to-one interview
Some schools described their recent activities to promote student well-being as big developments:
The well-being facility is the new thing that we’ve got in this school, so we didn’t have it before and its being well used, it’s being well used that it’s a big, big development for us . . .
Control group, head teacher one-to-one interview
It’s increased quite considerably this year . . . we did a big push in terms of assemblies and so on. We had . . . the mental health nurse, talk about the issues surrounding mental health . . . We made the support mechanisms available to pupils and . . . increased the awareness . . .
Control group, head teacher one-to-one interview
There were also new initiatives to promote staff mental health and well-being, although these were smaller in number. Some schools had focused on well-being during in-service training days and some had introduced mindfulness training or activities such as yoga and sports. One had set up a well-being group for staff and conducted a well-being questionnaire to develop an action plan:
. . . there’s a few things that go on, as a result of that [in-service] day . . . it’s good that it’s on the agenda, because it wasn’t on the agenda for, I’ve been here 7 years and you’ve been here longer, and I’m pretty sure it wasn’t on the agenda up until last year, but there’s still quite a lot of work to do with it I think, it’s probably fair to say.
Control group, untrained teachers, school 1K
. . . our well-being group for staff has probably been new, we are about to look at the findings of our well-being survey and look at what we need to do and how the staff action plan related to that.
Control group, head teacher one-to-one interview
One head teacher reported that during the intervention period there had been a shift in the way that the school thought about staff mental health, resulting in greater consideration of teachers’ well-being with regard to workloads:
. . . what I would say is I think it’s been far more intrinsic. We’ve been thinking about staff mental health and well-being in a far more pro-active way, shall we say, in terms of when we’re introducing things, the work/life balance thing, in terms of what we’re expecting them to do.
Control group, head teacher one-to-one interview
Schools’ motivations for implementing the different student and staff initiatives were not clear, beyond an apparent concern to improve the well-being of students and staff, which presumably had also motivated them to sign up to the WISE study.
Is the Wellbeing in Secondary Education intervention differentiable from ‘usual practice’?
As already noted, ‘usual practice’, as identified in the baseline audits, did entail some activities to support staff and student well-being and mental health. However, components of the WISE intervention were quite different from the usual practice reported by schools. Post-intervention audits in the intervention schools revealed some new activities, specifically the employment of a pastoral support officer to work with year 7 and 8 students (aged 11–13 years), a relaunch of the school council for students, a focus on anti-bullying and the introduction of a student peer support service. As in the control schools, some head teachers reported changes that had been instigated recently; in these cases, they explicitly attributed the changes to participation in the WISE trial:
. . . during some of the initial [WISE] meetings, it transpired that well-being, because we had a pupil well-being group but it emerged that there was possibly a staff well-being issue. Issue is perhaps too strong a word but staff well-being perhaps as an area that wasn’t having enough overt focus as perhaps was needed.
Intervention group, head teacher one-to-one interview
We’re trying to do things, so for instance, we had the staff who were involved in the project [WISE], who’ve been trained, we’re now running drop-ins every day after school in the staff room. So there’s free tea and coffee available every day, there’s posters around the school saying, come along and have a chat . . .
Intervention group, head teacher one-to-one interview
. . . out of that group of people that have been trained to be peer mentors [through WISE], one of them came up and said, do you know what, we don’t do enough around staff voice. OK, great, what shall we do? Well could we have a drop box? So we do . . .
Intervention group, head teacher one-to-one interview
What major events occurred in each study arm?
In many of our study schools, major events and changes that had occurred during the course of the study were reported in the follow-up audit. Major events were described on the audit forms as things ‘that could have affected teacher and/or student well-being’. Of the 18 schools that provided data for the follow-up audit, 14 reported at least one major event or change (8/10 control schools and 6/8 intervention schools). The types of incidents are reported in Table 41.
Major event | Number of control schools | Number of intervention schools |
---|---|---|
Student death | 1 | 1 |
Staff death | 3 | 0 |
Student attempted suicide, rescued by staff | 1 | 0 |
Child stopped breathing and staff member performed cardiopulmonary resuscitation | 1 | 0 |
Change in head teacher | 2 | 2 |
Changes in leadership team | 1 | 2 |
Staffing changes and redundancies | 3 | 2 |
School moved premises, renamed and restructured | 1 | 1 |
Funding cuts | 1 | 1 |
Ofsted/Estyn: negative report | 0 | 2 |
Schools in both the control and intervention groups experienced some significant events that will undoubtedly have had an impact on staff and student well-being. Schools in both groups experienced most types of event, including student deaths, changes in management and staff, and changes to school premises and structure. However, some incidents were reported in one arm only. In the control group, three schools reported staff deaths and one school reported two serious incidents in which staff had to intervene to save the life of a student, whereas no such events were reported by intervention schools. In the intervention group, two schools reported negative feedback from the relevant inspectorate [Office for Standards in Education, Children's Services and Skills (Ofsted) or Estyn]: one school was given a ‘requires improvement’ assessment and another was placed in special measures. None of the schools in the control group reported negative assessments during the study period.
Summary of contamination and differentiation from usual practice
Although the audits indicated that ‘usual practice’ already entailed a certain amount of activity to support student and teacher mental health, this was broadly similar across both arms. The components of the WISE intervention were different from usual practice and the control schools did not introduce components similar to the intervention during the study period. There were a similar number of major events reported in both study arms, although there were a small number specific to one or other arm. The possible significance of this is considered in Chapter 5.
Sustainability of the intervention
The findings discussed in previous sections regarding the failure to embed the intervention into ongoing practice in schools raise questions about how sustainable the intervention would be. Among intervention funders, three issues were regarded as important when considering likely sustainability. The first was the extent to which the intervention aligned with the strategic priorities of their respective organisations, which might affect not only any future funding but also the extent to which schools would be encouraged to take it up:
Yes, it fits well. So you’ll probably be aware that, well I mean we’ve already talked about the sort of strategic priorities that, you know, are in place. I guess we have also relaunched our healthy schools programme and the mental health badge, which enables schools to just focus on particularly mental health, so that they can really dedicate their time to that. It does include an element around the staff well-being. It does mention mental health first aid training within it. So I mean it’s very well aligned to that programme of work. And I’d say it’s fairly well aligned to the sort of new policy context and, you know, developing leads in schools around mental health and sort of teams around mental health.
One-to-one interview, funder 1
. . . one because we recognised that mental health was a priority and looking at the school setting as a mean of looking at mental health outcomes, secondly, for us it was consistent with our ways of working in the sense that it was a whole-school model.
One-to-one interview, funder 2
The second was the extent to which the intervention was found to be effective, in a context in which evidence-informed practice was often considered lacking:
My view would always be if we’ve got interventions that we believe we have an evidence base for and we believe have a cost effectiveness and they’re working on topics and health outcomes that are important in a policy and a population perspective, which this one would be, then it’s an easy ask to be honest.
One-to-one interview, funder 2
The third factor was the need to develop an infrastructure so that any future training or rollout could genuinely stand alone, rather than rely on external support:
The way I would see it being sustainable would be that the various partners/agencies on the ground commit the resource. If everybody says OK we can give X number of days of so and so’s time in order to deliver this, then that helps to make it doable. Increasingly now that’s how you make things work. You provide the opportunity but on the ground people will come together in a coalition to deliver something under the auspices of one of their local partnerships.
One-to-one interview, funder 2
However, even if these three requirements were met, one of the funding representatives and some head teachers still expressed concerns about the difficulty of finding the resources needed to ensure that the intervention became routinised, particularly in terms of providing future training to replace peer supporters who leave the school:
Well I think, unfortunately, the first question that any school would be asking at the moment, is one based around finance. So, you know, the commitment to the training, for example. So it depends how that’s funded, how that training is made available. I suppose the other resource cost then is the time that, the time pressure that it puts on colleagues, in terms of making themselves available.
Intervention group, head teacher one-to-one interview
Given that the intervention was not found to be effective, it is unlikely that funders or schools would be interested in exploring further how to make it sustainable.
Chapter 5 Discussion
Main findings
The WISE intervention did not appear to have an effect on the primary outcome of teacher well-being. There was also no evidence of an effect on teacher depression, student well-being or psychological difficulty, or on most other individual- or school-level outcomes. There was one small effect for teacher absence as a continuous variable in the fully adjusted models, but in favour of the control arm. This may indicate that the intervention actually led to poorer teacher well-being and resultant absence, but it could also mean that teachers in the intervention schools had become better at identifying when they needed to take time off because of increased awareness of the importance of self-care. Given the number of analyses that were run, a third explanation could be that this finding was due to chance. There was weak evidence of a change in the direction that favoured the control schools or lower-implementing schools in some of the analyses, which may indicate harm resulting from the intervention, something explored further below. However, it must be emphasised that these observed changes were not large and at least some may have been the result of chance findings due to the number of analyses undertaken. There was also the suggestion of an interaction between perceptions of school attitude towards student well-being and teacher well-being, with teachers who were less likely to believe that their schools cared about student mental health at baseline having higher well-being by the end of the study. Other studies have found universal interventions to have heterogeneous effects in different subpopulations,89 although it should be noted that this analysis was underpowered and therefore exploratory. Furthermore, it could be a chance finding, given that we examined a number of different secondary outcomes.
We found insufficient evidence that the costs of the WISE intervention were justified by any improvements in staff-, student- or school-level outcomes. The total mean cost per school was relatively small (£9103). The model used in Wales, of training HSCs to deliver the training, was slightly more expensive within the context of this trial. However, it has the potential to be cheaper in the long term, as it would mean the MHFA training could be extended to a larger number of schools. It is clear from the process evaluation that the intervention also has opportunity costs for peer supporters. Although the proportion of teachers who reported accessing the support was relatively low, the typical duration of contacts (i.e. 5–30 minutes) and the timing of those contacts (i.e. lunchtime or before and after school) will have an impact on peer supporters’ time for other activities. Furthermore, there may be other potential spillover costs, for example increased mental health awareness leading to staff taking time off for self-care or to seek further help. The intervention also has potential effects on students in other years, and on family members and associates of staff, that were not captured in the trial. It is also possible that the full effect of the intervention might become evident over a longer follow-up period. However, we found no evidence to suggest that either of these factors was likely to have an important impact on our conclusions.

Although there are no direct comparisons with economic evaluations of interventions aiming to improve the mental health and well-being of teaching staff, several recent trials have evaluated interventions aimed at improving student social, emotional and mental health. 33,77,90 The interventions evaluated in these trials are diverse and target different aspects of mental health in different age groups. Nevertheless, some of the economic findings may be generalisable to other school-based interventions to improve mental health. The choice of valuation method for staff time in intervention training and delivery is likely to have a strong influence on cost-effectiveness estimates. Interventions that require teachers to be absent for lengthy training periods will incur costs for the school in obtaining supply teacher cover. Intervention delivery may have no financial costs for the school, but will have an opportunity cost if it displaces other aspects of the curriculum. Effect sizes, where present, tend to be small in public health interventions aimed at the general population of teachers and/or students; therefore, cost-effectiveness may be hard to demonstrate within a trial. In this context, it is particularly important for policy-makers to consider the impact on subgroups of the population (e.g. those with poorer mental health), the sustainability of the intervention (e.g. given high staff and student turnover) and the potential spillover effects across school year groups and beyond the school.
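As a purely illustrative sketch of why the valuation of staff time matters, and using hypothetical figures that are not drawn from the trial, the opportunity cost of peer support contacts can be approximated as contact time multiplied by an assumed hourly staff cost:

\[
\text{cost per contact} \approx t \times w, \quad \text{e.g. } t = 0.5\ \text{hours},\ w = £30\ \text{per hour} \Rightarrow £15\ \text{per 30-minute contact}
\]

Under these hypothetical assumptions, costing the same contacts at a supply-cover rate, at full employment cost or at zero financial cost (if they displace breaks rather than timetabled teaching) would give very different totals, which is the sense in which the choice of valuation method can drive cost-effectiveness estimates.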
It is important to draw on the findings from the mixed-methods process evaluation to understand why the intervention did not have the hypothesised impact on teacher or student outcomes. Regarding how well implementation was achieved, the fidelity and quality of the 1- and 2-day MHFA training were rated as good by research team observers, course participants and trainers alike. As has been found in other studies evaluating MHFA,49 the training improved teachers’ knowledge and confidence to help others. There were challenges in terms of delivery in the school context, with the risk of interruptions and timetabling issues. There was no evidence that this had an impact on fidelity or quality, because of the skills of the trainers in working flexibly, but it may have meant that a few participants were distracted or even had to miss parts of the training. The prespecified dosage of the MHFA training (in total, 16% of all staff in each school) was not achieved in one-quarter to one-third of schools. A previous RCT found an effect when 15% of the student body was trained to be a peer educator. 90 The failure to achieve the target dose in this study may have meant that fewer students or staff received help from a trained individual, thereby limiting any population-level improvement. Fidelity of the 1-hour awareness session delivered to all teachers was also good and most participants reported learning gains. However, this session was more variable in terms of acceptability, with some participants suggesting improvements relating to the focus and delivery approach.
Implementation of the peer support service was more variable. A service was set up in all 12 intervention schools, and there was evidence that each service was used by some staff. However, peer supporter dropout created further dosage reductions in some schools, which may have contributed to the lack of impact. In addition, key aspects of the guidance were not always followed, specifically senior leaders visibly championing the service and supporting its delivery in practical terms, the establishment of a confidentiality policy and the advertising of the service at least once each academic year. These issues are likely to have contributed to the key barriers to service use highlighted by participants, namely concerns about confidentiality, concerns that peer supporters did not have the time or a safe space to provide support, and not knowing about the service. As a result, although all teachers did have increased access to support, which was the first step in the logic model, they did not always consider this an acceptable form of support, which contributed to low service use. This may explain the lack of a population-level impact on staff outcomes. Having an intervention champion and support from senior management have been identified as key to success in the child and adolescent health literature. 91
Next, we turn to whether or not the mechanisms of change theorised in the logic model were successfully activated. As already noted, there was evidence that teachers did have improved access to support if they chose to make use of it, and that they also had increased awareness regarding their own and their students’ mental health and knowledge of how to provide support. According to course attendees, the MHFA training also improved the quality of support that they were able to provide to students. However, there was no evidence that this had an impact on the number of students who reported receiving support from teachers or on their perceptions of how supportive teachers were. This may be because the number of teachers completing the 1- or 2-day training was too low to see this effect, and there was little evidence of other staff making use of their trained colleagues or learning from them, although over half of all teachers received at least 1 hour of awareness-raising. An alternative explanation is that the barriers to seeking help from the student perspective were not addressed; specifically, students discussed concerns about confidentiality and a perception that teachers are interested only in their academic performance.
There was no evidence that the intervention had an impact on the subsequent steps in the logic model, namely improved teacher–student relationships in the schools and a reduction in stress related to supporting students. Qualitative findings from the student focus groups in both study arms included some very negative comments about relationships with some teachers, although other students acknowledged that they were also responsible for poor-quality relationships, and a majority in the student survey agreed that their school and their teachers cared about them. Counterintuitively, some of the measures in the student surveys showed that students in the control schools had a higher opinion of teachers’ attitudes towards them and their well-being by T2. This may indicate an unintended negative effect of the intervention, although the ‘dark logic’ by which this may have come about is not clear. 82 There was some suggestion in the qualitative data of teachers in the intervention schools becoming disillusioned, which may have affected their commitment to supporting the students, although there is no evidence from the process evaluation to support this theory. This sense of disappointment that the intervention did not lead to greater change may also explain the weak tendency towards lower well-being at final follow-up in the intervention schools, and among those trained, relative to the control schools. It would be important in any future study focusing on teacher well-being to explicitly monitor this potential for disillusionment and any subsequent implications for teacher or student outcomes. There was also no evidence that teacher stress had been alleviated or satisfaction at work improved by the intervention. This is not surprising: if the quality of relationships had not improved and large numbers of teachers were not accessing more support, then there would be no reason for stress to have decreased. Given that many of the mechanisms of change were not activated, at least for large numbers of the teacher and student populations, the null effect for the main outcomes is perhaps not surprising.
The final question, which may explain the null finding, concerns the extent to which the WISE intervention differed from usual practice, and whether there was any indication of contamination or of control schools taking up other, equally effective interventions. There was evidence of some new activity in the control schools to support students and staff over the course of the study. These activities were not the same as those within the WISE intervention, but they could have diluted any effects if they were equally effective. However, there was also evidence of similar new activity in the intervention schools, so if the WISE intervention had been effective we might have expected to see its impact over and above that of these other activities. The fact that schools across arms appeared to remain broadly similar in terms of delivering other activities suggests that the WISE intervention did not cause harm by displacing other activities supportive of mental health.
Strengths and limitations
Strengths
To our knowledge, this was the first RCT to evaluate the clinical effectiveness and cost-effectiveness of an intervention aiming to improve teacher mental health. The prospective collection of cost data allowed us to weigh the investment (i.e. of time and money) required of schools against the potential for improved staff and student health and well-being.
The high response rates from both teachers and students and the retention of all recruited schools ensured that the study had sufficient statistical power to identify important differences in mental health outcomes between study arms. In addition, the high response rates and the use of multiple imputation in our analysis helped limit the risk of bias due to missing outcome data. Collecting baseline data before randomisation helped to further mitigate the risk of bias. The stratified approach to recruitment and sampling helped ensure that samples were balanced across study arms at baseline, with the exception of student attainment and teachers leaving for reasons other than retirement, both of which were higher in the intervention schools. An extensive process evaluation enabled us to answer important questions about how the intervention was delivered, participants’ experiences of it and the context in which it was delivered, which helped to shed light on why it did not lead to improvements in teacher or student outcomes. All intervention components were delivered in all intervention schools and, to the best of our knowledge, no control schools took up the intervention during the study.
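To illustrate the general approach of multiple imputation for missing outcome data mentioned above, the sketch below uses Python's statsmodels on synthetic data. The variable names, the 20% missingness rate and the simple ordinary least squares analysis model are assumptions for illustration only; it is not a reproduction of the trial's own analysis, which additionally accounted for school-level clustering and covariate adjustment.

```python
# A minimal sketch (not the trial's actual analysis) of multiple imputation
# for a partially missing continuous outcome, followed by a pooled regression.
# All data below are synthetic and purely illustrative.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.imputation import mice

rng = np.random.default_rng(2024)
n = 400
arm = rng.integers(0, 2, n).astype(float)               # hypothetical 1:1 allocation
baseline = rng.normal(48, 8, n)                         # hypothetical baseline well-being score
followup = baseline + 0.0 * arm + rng.normal(0, 6, n)   # simulated null intervention effect
followup[rng.random(n) < 0.2] = np.nan                  # roughly 20% of outcomes missing

df = pd.DataFrame({"arm": arm, "baseline": baseline, "followup": followup})

# Impute the missing outcome by chained equations, fit the analysis model to
# each completed data set and pool the estimates using Rubin's rules.
imputer = mice.MICEData(df)
analysis = mice.MICE("followup ~ arm + baseline", sm.OLS, imputer)
pooled = analysis.fit(n_burnin=10, n_imputations=20)
print(pooled.summary())
```

A more faithful reanalysis would replace the OLS step with a model that includes a school-level random effect and the covariates used in the fully adjusted models.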
Limitations
The lack of blinding of participants was a limitation, as we cannot be sure that knowing they were in an intervention school did not affect their responses. Indeed, there were signs that some teachers in the intervention schools became frustrated by the lack of whole-school change, which may have led to reporting bias, particularly given that a self-report questionnaire was the main source of outcome data. A second problem with using self-report measures of well-being and mental health is the risk that gaining knowledge about mental health may affect how individuals then report their own mental health. This problem has been discussed in the mindfulness literature, in which those who are more experienced at mindfulness may paradoxically report themselves as less mindful because of this greater insight. 92
An important potential limitation of our economic analysis is that we did not directly collect information on school staff or student use of mental health services during the follow-up period. Direct measurement at 12 months (teachers) and 24 months (teachers and students) would probably have resulted in recall bias,93 and potentially differential bias between study arms in an unblinded study. An effective peer support intervention might either increase (because of signposting) or reduce (because of addressing distress early) health-care use. We know from the log reports that peer supporters discussed available services in approximately 23% of cases (see Table 31). However, we do not know how many of these discussions were acted on or which help sources were involved. Even if the trial had demonstrated that the intervention was effective and efficient from the school’s perspective, it might have been less efficient from an NHS perspective if the peer supporters had signposted many more teachers on to general practitioner and other NHS services. Given that we found no improvement in staff well-being, schools are unlikely to invest in or sustain the intervention regardless of any possible impact on health services.
Regarding limitations of the process evaluation, it was difficult to measure use of the peer support service with confidence because of the informal way in which it operated, in the context of pre-existing relationships. It is, therefore, hard to be sure how extensively the service was used. Furthermore, because we did not manage to identify enough users of the peer support service willing to take part in an interview, we were unable to evaluate the quality of the support provided beyond a closed question in the teacher questionnaire. We did not manage to collect all intended data from all schools; specifically, we did not collect peer supporter feedback from all schools 1 year after service set-up and we did not manage to complete a T2 audit for all schools.
There were aspects of the intervention that did not go to plan. We did not manage to train the specified number of teachers in all schools, and numbers reduced further in some cases because of staff leaving. We were unable to build in more formal external support for the peer supporters from local CAMHS link workers or HSCs, which may have led to the peer supporters finding the role more challenging and may also have contributed to the sense that this new role had not been properly acknowledged. Finally, as discussed above, we were also unable to ensure that the peer support service was delivered as intended, including, crucially, with support from senior leadership, and with a clear confidentiality policy.
Links to the wider literature
Despite the lack of impact on student mental health, MHFA was viewed as valuable by participants in terms of improving their knowledge, confidence and skills in responding to the needs of students that they encountered in their day-to-day roles. It is therefore important to continue to explore how mental health training might be delivered effectively. Building mental health training into initial teacher training might be one way to ensure that all teachers are equipped to respond appropriately to vulnerable students in their care, which may lead to greater and more sustainable changes relating to teacher–student relationships and mental health outcomes. Most MHFA evaluations have not measured impact beyond 6 months. 49 One evaluation of the impact of training parents of teenagers found that improvements in knowledge and confidence were still visible at 2 years, although the effect sizes had decreased. 93 Therefore, examining the possibility of regular refresher sessions would be important for any training delivered to all teachers.
By comparison, the peer support service, although viewed positively as an idea, was not implemented with the same consistently high fidelity and, partly as a consequence of this, was not well used. Other evaluations of peer support work have uncovered similar challenges in terms of implementation and identified the need for such services to have clear expectations around the role and how it fits within the wider organisation, something that was not always achieved in the WISE intervention. 94,95 For such a service to work well in the school context, it is clear that more work needs to be carried out with schools and senior leaders to explore where they see this service fitting into the broader context, and how the role can be more formally recognised as part of the wider work of staff within the school. In addition, if such an intervention were to be pursued in schools, evidence from the process evaluation indicates that peer supporters would benefit from more formal supervision, similar to that received by counsellors. This would of course increase the costs of the intervention.
The failure of the intervention to address barriers to help-seeking expressed by both teachers and students ultimately meant that there had been no significant change to whole-school culture in relation to the stigma of seeking help, something reported elsewhere by teachers. 24 A theme running throughout the process data was the difficulty of the job that teachers do and the sense that the situation is worsening. Teachers in our sample had poor well-being and a high level of depressive symptoms compared with the general population, and 85% felt that their job was moderately to extremely stressful at baseline. A recent survey of educational professionals published by Education Support painted a similar picture, with 72% of respondents describing themselves as stressed and the sample having a mean well-being score on the WEMWBS of 44.7. 95 Furthermore, resonating again with themes from this study, 60% of respondents said that they would not feel confident disclosing stress or mental health issues to their employer and 43% felt that their employer did not support staff mental health adequately. Evaluations of public health interventions need to take into account the ways in which context may constrain or enable delivery. 96 This wider picture, although confirming the need for interventions to improve support for teachers, also created a challenging context in which to implement the WISE intervention and for it to have a significant and sustainable impact.
Against this backdrop of challenges within the teaching profession, the WISE intervention had high initial acceptability, as it fitted a wider developing agenda around staff mental health. However, this acceptability waned over time, as participants realised that it had been minimally disruptive to the wider, deeply entrenched culture in which they worked,97 characterised by demanding workloads, high levels of accountability and a fear of asking for support. Hawe suggests that the success of an intervention delivered into a complex system, such as a school, depends on the extent to which that intervention harnesses and ultimately reconfigures the dynamic properties of that system, including the social networks therein. 98 Schools are complex systems in the sense that they are made up of a series of interacting subsystems, including the teacher system, the senior leader system, the non-teaching staff system and the student system. In the case of the WISE intervention, those interactions were not sufficiently changed to create new customs, norms and expectations within the whole system. Previous studies have reported that school culture (namely the extent of accountability, how far senior leaders are seen to care about teacher welfare and connectedness among teachers) can be more influential on teacher stress and satisfaction than discrete interventions aiming to support them and, indeed, that these school-level factors can interfere with the successful implementation of such interventions. 99–101 In the present trial, the key initial gains of the WISE intervention (i.e. learning from the MHFA training and provision of a peer support service) failed to become embedded into ongoing practice in any meaningful way. Therefore, they did not lead to changes at the system level in terms of the norms regarding expectations made of teachers and how far mental health is prioritised. The evidence suggested that this led to a certain amount of disillusionment among teachers who had initially invested willingly and enthusiastically in the intervention, leaving them with the view that the WISE intervention had been treated as a tick-box exercise. The difficulties experienced by the peer supporters, particularly regarding role credibility and feeling that what they were doing was a core part of the organisation, have been reported elsewhere. 102 The perceived failure of senior leadership to prioritise and embed the intervention was raised as a particular sticking point by participating teachers, and, therefore, future work exploring how to better engage educational leaders in the task of supporting teachers’ and students’ mental health would be valuable. One recent workplace RCT delivered mental health training to senior managers within the fire and rescue service, and reported a positive effect in terms of reduced levels of sickness absence among the staff that they managed. 54 Another important avenue to explore is support for senior leaders themselves: the Education Support survey found that 74% of all education professionals described themselves as stressed, rising to 84% for senior leaders. 95
It was clear from the audit data that a large number of potentially distressing events and changes can happen in schools over the course of 2 years. However, in our sample, negative inspections at T2 were reported in intervention schools only. Despite regular media reports of the stressful impact of Ofsted inspections, there is very little research examining them as a determinant of stress or poor mental health. Two small-scale studies103,104 have shown associations with increased levels of stress and sickness absence among teachers during the period just before and just after an inspection. In addition, one of these studies104 showed that staff morale generally suffers after any inspection, regardless of the outcome. It is, therefore, possible that this stressor contributed to a particularly difficult culture in the intervention schools affected, which may have overridden any potential impact of the intervention.
Summary of research recommendations
Future school-based research to improve mental health should focus on the following.
How to fully engage senior leaders in mental health interventions
This may include focusing on training for the leaders themselves, addressing the pressures that they experience and ways in which these pressures can be alleviated, and identifying facilitators that would enable them to prioritise mental health. The inclusion of assessment of mental health support for staff and students in Ofsted inspections may be one way to incentivise senior leaders to give such interventions the status and investment they require.
How to address the structural determinants of poor teacher mental health, such as high workload and pressure to perform
The application of systems thinking to public health improvement, in which stakeholders map the aspects of a system that influence the outcome of interest and identify where change is required, has become increasingly popular. Such an approach could be applied to the school system, with subsequent evaluation of system-level changes. One characteristic of the system identified in this study is the culture of silence and stigma around mental health that still pervades much of teachers’ working lives. Research to understand how this culture is perpetuated and how it can be challenged would be an important aspect of such system-level work.
How to strengthen relationships within schools (between staff, staff and students, and staff and senior leaders)
Introducing mental health training to a small group of staff was not enough to change these relationships. It may be that whole-school changes involving policy and practice are needed to make more positive and supportive interactions the norm.
How to build support for schools within the wider mental health services system
In this study, we explicitly planned to set up support for the peer supporters via CAMHS and HSC teams that were already working in schools, but this proved too challenging in practice. The introduction of education mental health practitioners to certain local authorities as a consequence of the government’s Transforming Children and Young People’s Mental Health Provision: A Green Paper16 may provide a model in which this can be achieved, although the focus here is on student mental health and not staff mental health. In the absence of sufficient specialist provision for children and young people’s mental health, expectations are likely to continue to fall on schools to support ‘low-level’ mental health needs, creating further stress for busy teachers. One important avenue for future research could be to examine the feasibility of external services providing formal supervision for school staff to support them in this role.
Whether mental health first aid or other mental health training does have a place within schools alongside system-level change
The training delivered as part of the WISE intervention was well received by teachers, but there was evidence that not all knowledge was retained and that the dosage may not have been sufficient. Future studies could investigate whether or not the impact increases with a higher dosage and regular top-up sessions. Consideration could also be given to whether or not particular teachers would benefit most, for example those who are still in training or newly qualified, and whether or not learning should be more formally cascaded to colleagues. In addition, ways to more formally recognise the contribution that trained staff make to supporting mental health should be explored, for example releasing staff for a set amount of time each week to engage in support work, although this would add to the costs of the intervention.
Conclusions
Schools are at a ‘tipping point’. Teacher mental health is becoming increasingly acknowledged as an important focus for interventions because of high levels of stress and distress, and poor well-being in this population. The WISE intervention was acceptable, those who engaged with it found it useful and there was some evidence of a general raising of awareness of the importance of mental health and its impact in schools. However, it did not have an impact on teacher or student mental health. This is likely to be because, following the intervention, there was not enough of an increase in support accessed by staff or students, and there was no evidence of better-quality relationships, reduction in work-related stress or changes to a school culture characterised by high workload, accountability and fear of asking for help. Ultimately, the WISE intervention did not create a system-level change. 105 Future studies aiming to support teacher or student mental health need to take a more radical approach that seeks to understand how to be more disruptive of these systemic, deeply embedded cultural norms. This needs to include further exploration of how to reduce the stresses faced by senior leaders themselves, and how to better engage them in supporting sustained and meaningful work towards improved teacher and student mental health.
Acknowledgements
This study was funded by the National Institute for Health Research Public Health Research programme (13/164/06). We are very grateful for the intervention costs being covered by Public Health England, Public Health Wales and Bristol City Council. The work was also undertaken with the support of DECIPHer, a UK Clinical Research Collaboration Public Health Research Centre of Excellence. This study was designed and delivered in collaboration with the Bristol Randomised Trials Collaboration, a UK Clinical Research Collaboration Registered Clinical Trials Unit in receipt of National Institute for Health Research Clinical Trials Unit support funding. In addition, we gratefully acknowledge support from Jodi Taylor in providing ongoing trials advice.
We would like to thank our Trial Steering Committee members [Professor Laurence Moore (chairperson), University of Glasgow; Professor Paul Stallard, University of Bath; Professor Karla Hemming, University of Birmingham; Dr Ann Lendrum, University of Manchester; and Sandra Taylor, Education Support] for their invaluable support and advice throughout the study. We also extend our thanks to the Schools Health Research Network in Wales and local public health partners in England [Steve Spiers and Sarah Godsell, South Gloucestershire; Julie Coulthard, Bristol; Judy Allies, Bath and North East Somerset; Shaun Cheesman, North Somerset; and Teresa Day, Somerset] for their support in recruiting schools. We gratefully acknowledge the invaluable contribution of Hannah Baber as trial manager from 2018 to the study end and the excellent support from study administrators Odell Harriss and Camilla Sapsworth (University of Bristol) and Alison Evans, Danielle Couturiaux, Amy Bond and Amy Edwards (Cardiff University). We thank Aideen Ahern for contributing to the design and conduct of the economic analysis, Dr Mari-Rose Kennedy for support with data analysis and report formatting, and Professor Kate Tilling for early advice regarding study design and statistical analysis plan.
Finally, our thanks go to all the staff and student participants of the study, and the MHFA trainers and HSCs for their contributions to intervention delivery.
Contributions of authors
Judi Kidger (https://orcid.org/0000-0002-1054-6758) (Lecturer, Public Health) was the principal investigator of the study, conceived the study, oversaw all aspects of study delivery, data collection and analysis, and wrote the report.
Rhiannon Evans (https://orcid.org/0000-0002-0239-6331) (Senior Lecturer, Social Sciences) was a co-applicant, led the process evaluation, led study delivery at the Welsh site and contributed to writing the report.
Sarah Bell (https://orcid.org/0000-0003-1181-9591) (Senior Research Associate, Public Health) assisted with study delivery at the English site, contributed to all data collection and analysis, and contributed to writing the report.
Harriet Fisher (https://orcid.org/0000-0002-5639-0955) (Senior Research Associate, Public Health) led analysis of implementation data, contributed to all other data analysis and contributed to writing the report.
Nicholas Turner (https://orcid.org/0000-0003-1591-6997) (Research Fellow, Statistics) wrote the statistical analysis plan, conducted the statistical analysis and contributed to writing the report.
William Hollingworth (https://orcid.org/0000-0002-0840-6254) (Professor, Health Economics) was a co-applicant, planned and oversaw the economic evaluation, and contributed to economic analysis and to writing the report.
Sarah Harding (https://orcid.org/0000-0001-9385-6307) (Research Associate, Public Health) assisted with all aspects of study delivery, and contributed to data collection and analysis.
Jillian Powell (https://orcid.org/0000-0002-1538-9405) (Research Associate, Social Sciences) assisted with study delivery at the Welsh site, and contributed to all data collection and analysis.
Rowan Brockman (https://orcid.org/0000-0002-0342-2124) (Senior Research Associate, Public Health) assisted with study delivery at the English site, and contributed to data collection and analysis.
Lauren Copeland (https://orcid.org/0000-0003-0387-9607) (Research Associate, Social Sciences) contributed to data analysis and to writing the report.
Ricardo Araya (https://orcid.org/0000-0002-0420-5148) (Professor, Global Mental Health) was a co-applicant, contributed to study design and oversight, and to the interpretation of findings.
Rona Campbell (https://orcid.org/0000-0002-1099-9319) (Professor, Public Health) was a co-applicant, contributed to study design and oversight, and to the interpretation of findings.
Tamsin Ford (https://orcid.org/0000-0001-5295-4904) (Professor, Child and Adolescent Psychiatry) was a co-applicant, contributed to study design and oversight, and to the interpretation of findings.
David Gunnell (https://orcid.org/0000-0002-0829-6470) (Professor, Epidemiology) was a co-applicant, contributed to study design and oversight, and to the interpretation of findings.
Richard Morris (https://orcid.org/0000-0001-7240-4563) (Professor, Statistics) oversaw the statistical plan and analysis, and contributed to the interpretation of findings.
Simon Murphy (https://orcid.org/0000-0003-3589-3681) (Professor, Social Interventions and Health) was a co-applicant, oversaw the Welsh site, and contributed to study design and to the interpretation of findings.
All authors made critical comments on drafts of the monograph and approved the final submission.
Publications
Kidger J, Evans R, Tilling K, Hollingworth W, Campbell R, Ford T, et al. Protocol for a cluster randomised controlled trial of an intervention to improve the mental health support and training available to secondary school teachers – the WISE (Wellbeing in Secondary Education) study. BMC Public Health 2016;16:1089.
Evans R, Brockman R, Grey J, Bell S, Harding S, Campbell R, et al. A cluster randomised controlled trial of the Wellbeing in Secondary Education (WISE) project – an intervention to improve the mental health support and training available to secondary school teachers: protocol for an integrated process evaluation. Trials 2018;19:270.
Harding S, Evans R, Morris R, Gunnell D, Ford T, Hollingworth W, et al. Is teachers’ mental health and wellbeing associated with students’ mental health and wellbeing? J Affect Disord 2018;242:180–7.
Kidger J, Turner N, Hollingworth W, Evans R, Bell S, Brockman R, et al. An intervention to improve teacher well-being support and training to support students in UK high schools (the WISE study): a cluster randomised controlled trial. PLoS Med 2021;18:e1003847.
Data-sharing statement
All available data can be obtained by contacting the corresponding author.
Disclaimers
This report presents independent research funded by the National Institute for Health Research (NIHR). The views and opinions expressed by authors in this publication are those of the authors and do not necessarily reflect those of the NHS, the NIHR, NETSCC, the PHR programme or the Department of Health and Social Care. If there are verbatim quotations included in this publication the views and opinions expressed by the interviewees are those of the interviewees and do not necessarily reflect those of the authors, those of the NHS, the NIHR, NETSCC, the PHR programme or the Department of Health and Social Care.
References
- NHS Digital. Mental Health of Children and Young People in England, 2017. n.d. http://digital.nhs.uk/pubs/mhcypsurvey17 (accessed 5 November 2020).
- Patalay P, Gage SH. Changes in millennial adolescent mental health and health-related behaviours over 10 years: a population cohort comparison study. Int J Epidemiol 2019;48:1650-64. https://doi.org/10.1093/ije/dyz006.
- Lessof C, Ross A, Brind R, Bell E, Newton S. Longitudinal Study of Young People in England Cohort 2: Health and Wellbeing at Wave 2. London: Department for Education; 2016.
- Morgan C, Webb RT, Carr MJ, Kontopantelis E, Green J, Chew-Graham CA, et al. Incidence, clinical management, and mortality risk following self harm among children and adolescents: cohort study in primary care. BMJ 2017;359. https://doi.org/10.1136/bmj.j4351.
- Dekker MC, Ferdinand RF, van Lang ND, Bongers IL, van der Ende J, Verhulst FC. Developmental trajectories of depressive symptoms from early childhood to late adolescence: gender differences and adult outcome. J Child Psychol Psychiatry 2007;48:657-66. https://doi.org/10.1111/j.1469-7610.2007.01742.x.
- Kessler RC, Demler O, Frank RG, Olfson M, Pincus HA, Walters EE, et al. Prevalence and treatment of mental disorders, 1990 to 2003. N Engl J Med 2005;352:2515-23. https://doi.org/10.1056/NEJMsa043266.
- Morse A. Improving Children and Young People’s Mental Health Services. London: National Audit Office; 2018.
- Werner-Seidler A, Perry Y, Calear AL, Newby JM, Christensen H. School-based depression and anxiety prevention programs for young people: a systematic review and meta-analysis. Clin Psychol Rev 2017;51:30-47. https://doi.org/10.1016/j.cpr.2016.10.005.
- Caldwell DM, Davies SR, Hetrick SE, Palmer JC, Caro P, López-López JA, et al. School-based interventions to prevent anxiety and depression in children and young people: a systematic review and network meta-analysis. Lancet Psychiatry 2019;6:1011-20. https://doi.org/10.1016/S2215-0366(19)30403-1.
- Ford T, Hamilton H, Meltzer H, Goodman R. Child mental health is everybody’s business: the prevalence of contact with public sector services by type of disorder among British school children in a three-year period. Child Adolesc Ment Health 2007;12:13-20. https://doi.org/10.1111/j.1475-3588.2006.00414.x.
- Kidger J, Araya R, Donovan J, Gunnell D. The effect of the school environment on the emotional health of adolescents: a systematic review. Pediatrics 2012;129:925-49. https://doi.org/10.1542/peds.2011-2248.
- Lang IA, Marlow R, Goodman R, Meltzer H, Ford T. Influence of problematic child-teacher relationships on future psychiatric disorder: population survey with 3-year follow-up. Br J Psychiatry 2013;202:336-41. https://doi.org/10.1192/bjp.bp.112.120741.
- Kidger J, Gunnell D, Biddle L, Campbell R, Donovan J. Part and parcel of teaching? Secondary school staff’s views on supporting student emotional health and well-being. Br Educ Res J 2009;36:919-35. https://doi.org/10.1080/01411920903249308.
- Rothì DM, Leavey G, Best R. On the front-line: teachers as active observers of pupils’ mental health. Teach Teach Educ 2008;24:1217-31. https://doi.org/10.1016/j.tate.2007.09.011.
- Harland J, Dawson A, Rabiasz A, Sims D. NFER Teacher Voice Omnibus: Questions for the Department for Education. London: Department for Education; 2015.
- Department for Education. Transforming Children and Young People’s Mental Health Provision: A Green Paper. 2017.
- Stansfeld SA, Rasul FR, Head J, Singleton N. Occupation and mental health in a national UK survey. Soc Psychiatry Psychiatr Epidemiol 2011;46:101-10. https://doi.org/10.1007/s00127-009-0173-7.
- Johnson S, Cooper C, Cartwright S, Donald I, Taylor P, Millet C. The experience of work-related stress across occupations. J Manag Psychol 2005;20:178-87. https://doi.org/10.1108/02683940510579803.
- Health and Safety Executive. Work-Related Stress, Anxiety or Depression Statistics in Great Britain, 2020. n.d. www.hse.gov.uk/statistics/causdis/stress.pdf (accessed 5 November 2020).
- Ball SJ. The teacher’s soul and the terrors of performativity. J Educ Policy 2003;18:215-28. https://doi.org/10.1080/0268093022000043065.
- Barmby P. Improving teacher recruitment and retention: the importance of workload and pupil behaviour. Educ Res 2006;48:247-65. https://doi.org/10.1080/00131880600732314.
- Chaplain RP. Stress and psychological distress among trainee secondary teachers in England. Educ Psychol 2008;28:195-209. https://doi.org/10.1080/01443410701491858.
- Smithers A, Robinson P. Factors Affecting Teachers’ Decisions to Leave the Profession. Nottingham: Department for Education and Skills Publications; 2003.
- Davies C. The Wellbeing of Teachers in Wales. Nantgarw: Teacher Support Cymru; 2007.
- Melchior M, Caspi A, Milne BJ, Danese A, Poulton R, Moffitt TE. Work stress precipitates depression and anxiety in young, working women and men. Psychol Med 2007;37:1119-29. https://doi.org/10.1017/S0033291707000414.
- Henderson M, Harvey SB, Øverland S, Mykletun A, Hotopf M. Work and common psychiatric disorders. J R Soc Med 2011;104:198-207. https://doi.org/10.1258/jrsm.2011.100231.
- Bowers T, McIver M. Ill Health Retirement and Absenteeism Amongst Teachers. Nottingham: Department for Education and Employment Publications; 2000.
- Kidger J, Brockman R, Tilling K, Campbell R, Ford T, Araya R, et al. Teachers’ wellbeing and depressive symptoms, and associated risk factors: a large cross sectional study in English secondary schools. J Affect Disord 2016;192:76-82. https://doi.org/10.1016/j.jad.2015.11.054.
- Jennings PA, Snowberg KE, Coccia MA, Greenberg MT. Improving classroom learning environments by cultivating awareness and resilience in education (CARE): results of two pilot studies. JCI 2011:37-48.
- Sisask M, Värnik P, Värnik A, Apter A, Balazs J, Balint M, et al. Teacher satisfaction with school and psychological well-being affects their readiness to help children with mental health problems. Health Educ J 2014;73:382-93. https://doi.org/10.1177/0017896913485742.
- Harding S, Morris R, Gunnell D, Ford T, Hollingworth W, Tilling K, et al. Is teachers’ mental health and wellbeing associated with students’ mental health and wellbeing?. J Affect Disord 2019;242:180-7. https://doi.org/10.1016/j.jad.2018.08.080.
- Bonell C, Allen E, Warren E, McGowan J, Bevilacqua L, Jamal F, et al. Effects of the learning together intervention on bullying and aggression in English secondary schools (INCLUSIVE): a cluster randomised controlled trial. Lancet 2018;392:2452-64. https://doi.org/10.1016/S0140-6736(18)31782-3.
- Ford T, Hayes R, Byford S, Edwards V, Fletcher M, Logan S, et al. The effectiveness and cost-effectiveness of the Incredible Years® teacher classroom management programme in primary school children: results of the STARS cluster randomised controlled trial. Psychol Med 2019;49:828-42. https://doi.org/10.1017/S0033291718001484.
- Wasserman D, Hoven CW, Wasserman C, Wall M, Eisenberg R, Hadlaczky G, et al. School-based suicide prevention programmes: the SEYLE cluster-randomised, controlled trial. Lancet 2015;385:1536-44. https://doi.org/10.1016/S0140-6736(14)61213-7.
- Nigatu YT, Huang J, Rao S, Gillis K, Merali Z, Wang J. Indicated prevention interventions in the workplace for depressive symptoms: a systematic review and meta-analysis. Am J Prev Med 2019;56:e23-e33. https://doi.org/10.1016/j.amepre.2018.08.027.
- Bartlett L, Martin A, Neil AL, Memish K, Otahal P, Kilpatrick M, et al. A systematic review and meta-analysis of workplace mindfulness training randomized controlled trials. J Occup Health Psychol 2019;24:108-26. https://doi.org/10.1037/ocp0000146.
- Joyce S, Modini M, Christensen H, Mykletun A, Bryant R, Mitchell PB, et al. Workplace interventions for common mental disorders: a systematic meta-review. Psychol Med 2016;46:683-97. https://doi.org/10.1017/S0033291715002408.
- Wan Mohd Yunus WMA, Musiat P, Brown JSL. Systematic review of universal and targeted workplace interventions for depression. Occup Environ Med 2018;75:66-75. https://doi.org/10.1136/oemed-2017-104532.
- Naghieh A, Montgomery P, Bonell CP, Thompson M, Aber JL. Organisational interventions for improving wellbeing and reducing work-related stress in teachers. Cochrane Database Syst Rev 2015;4. https://doi.org/10.1002/14651858.CD010306.pub2.
- Laine H, Saastamoinen P, Lahti J, Rahkonen O, Lahelma E. The associations between psychosocial working conditions and changes in common mental disorders: a follow-up study. BMC Public Health 2014;14. https://doi.org/10.1186/1471-2458-14-588.
- Bricheno P, Brown S, Lubansky R. Teacher Wellbeing: A Review of the Evidence. London: Teacher Support Network; 2009.
- Linnan L, Fisher EB, Hood S. The power and potential of peer support in workplace interventions. Am J Health Promot 2013;28:TAHP2-10.
- van Pelt F. Peer support: healthcare professionals supporting each other after adverse medical events. Qual Saf Health Care 2008;17:249-52. https://doi.org/10.1136/qshc.2007.025536.
- Allen JD, Stoddard AM, Mays J, Sorensen G. Promoting breast and cervical cancer screening at the workplace: results from the Woman to Woman Study. Am J Public Health 2001;91:584-90. https://doi.org/10.2105/AJPH.91.4.584.
- Buller DB, Morrill C, Taren D, Aickin M, Sennott-Miller L, Buller MK, et al. Randomized trial testing the effect of peer education at increasing fruit and vegetable intake. J Natl Cancer Inst 1999;91:1491-500. https://doi.org/10.1093/jnci/91.17.1491.
- Thoits PA. Mechanisms linking social ties and support to physical and mental health. J Health Soc Behav 2011;52:145-61. https://doi.org/10.1177/0022146510395592.
- Kitchener BA, Jorm AF. Mental health first aid training in a workplace setting: a randomized controlled trial. BMC Psychiatry 2004;4. https://doi.org/10.1186/1471-244X-4-23.
- Jorm AF, Kitchener BA. Noting a landmark achievement: mental health first aid training reaches 1% of Australian adults. Aust N Z J Psychiatry 2011;45:808-13. https://doi.org/10.3109/00048674.2011.594785.
- Hadlaczky G, Hökby S, Mkrtchian A, Carli V, Wasserman D. Mental health first aid is an effective public health intervention for improving knowledge, attitudes, and behaviour: a meta-analysis. Int Rev Psychiatry 2014;26:467-75. https://doi.org/10.3109/09540261.2014.924910.
- Bovopoulos N, LaMontagne A, Martin A, Jorm A. Delivering mental health first aid training in Australian workplaces: exploring instructors’ experiences. Int J Ment Health Promot 2016;18:65-82. https://doi.org/10.1080/14623730.2015.1122658.
- Kelly CM, Mithen JM, Fischer JA, Kitchener BA, Jorm AF, Lowe A, et al. Youth mental health first aid: a description of the program and an initial evaluation. Int J Ment Health Syst 2011;5. https://doi.org/10.1186/1752-4458-5-4.
- Jorm AF, Kitchener BA, Mugford SK. Experiences in applying skills learned in a mental health first aid training course: a qualitative study of participants’ stories. BMC Psychiatry 2005;5. https://doi.org/10.1186/1471-244X-5-43.
- Borrill J, Kuczynska P. Evaluation of Youth Mental Health First Aid Training in the North-East of England. London: Mental Health First Aid Youth; 2013.
- Milligan-Saville JS, Tan L, Gayed A, Barnes C, Madan I, Dobson M, et al. Workplace mental health training for managers and its effect on sick leave in employees: a cluster randomised controlled trial. Lancet Psychiatry 2017;4:850-8. https://doi.org/10.1016/S2215-0366(17)30372-3.
- Jorm AF, Kitchener BA, Sawyer MG, Scales H, Cvetkovski S. Mental health first aid training for high school teachers: a cluster randomized trial. BMC Psychiatry 2010;10. https://doi.org/10.1186/1471-244X-10-51.
- Langeveld J, Joa I, Larsen TK, Rennan JA, Cosmovici E, Johannessen JO. Teachers’ awareness for psychotic symptoms in secondary school: the effects of an early detection programme and information campaign. Early Interv Psychiatry 2011;5:115-21. https://doi.org/10.1111/j.1751-7893.2010.00248.x.
- Department for Education. Mental Health and Behaviour in Schools. 2018.
- Craig P, Dieppe P, Macintyre S, Michie S, Nazareth I, Petticrew M, Medical Research Council Guidance. Developing and evaluating complex interventions: the new Medical Research Council guidance. BMJ 2008;337. https://doi.org/10.1136/bmj.a1655.
- Kidger J, Stone T, Tilling K, Brockman R, Campbell R, Ford T, et al. A pilot cluster randomised controlled trial of a support and training intervention to improve the mental health of secondary school teachers and students – the WISE (Wellbeing in Secondary Education) study. BMC Public Health 2016;16. https://doi.org/10.1186/s12889-016-3737-y.
- Kidger J, Turner N, Hollingworth W, Evans R, Bell S, Brockman R, et al. An intervention to improve teacher well-being support and training to support students in UK high schools (the WISE study): a cluster randomised controlled trial. PLoS Med 2021;18. https://doi.org/10.1371/journal.pmed.1003847.
- Kidger J, Evans R, Tilling K, Hollingworth W, Campbell R, Ford T, et al. Protocol for a cluster randomised controlled trial of an intervention to improve the mental health support and training available to secondary school teachers – the WISE (Wellbeing in Secondary Education) study. BMC Public Health 2016;16. https://doi.org/10.1186/s12889-016-3756-8.
- Campbell R, Starkey F, Holliday J, Audrey S, Bloor M, Parry-Langdon N, et al. An informal school-based peer-led intervention for smoking prevention in adolescence (ASSIST): a cluster randomised trial. Lancet 2008;371:1595-602. https://doi.org/10.1016/S0140-6736(08)60692-3.
- Jennings PA, Greenberg MT. The prosocial classroom: teacher social and emotional competence in relation to student and classroom outcomes. Rev Educ Res 2009;79:491-525. https://doi.org/10.3102/0034654308325693.
- Tennant R, Hiller L, Fishwick R, Platt S, Joseph S, Weich S, et al. The Warwick–Edinburgh Mental Well-being Scale (WEMWBS): development and UK validation. Health Qual Life Outcomes 2007;5. https://doi.org/10.1186/1477-7525-5-63.
- Kroenke K, Strine TW, Spitzer RL, Williams JB, Berry JT, Mokdad AH. The PHQ-8 as a measure of current depression in the general population. J Affect Disord 2009;114:163-73. https://doi.org/10.1016/j.jad.2008.06.026.
- Reilly MC, Zbrozek AS, Dukes EM. The validity and reproducibility of a work productivity and activity impairment instrument. PharmacoEconomics 1993;4:353-65. https://doi.org/10.2165/00019053-199304050-00006.
- Clarke A, Friede T, Putz R, Ashdown J, Martin S, Blake A, et al. Warwick–Edinburgh Mental Well-being Scale (WEMWBS): validated for teenage school students in England and Scotland. A mixed methods assessment. BMC Public Health 2011;11. https://doi.org/10.1186/1471-2458-11-487.
- GOV.UK. Find and Compare Schools in England. n.d. www.compare-school-performance.service.gov.uk (accessed 5 November 2020).
- My Local School. n.d. www.mylocalschool.wales.gov.uk.
- Goodman R. Psychometric properties of the strengths and difficulties questionnaire. J Am Acad Child Adolesc Psychiatry 2001;40:1337-45. https://doi.org/10.1097/00004583-200111000-00015.
- Putz R, O’Hara K, Taggart F, Stewart-Brown S. Using WEMWBS to Measure the Impact of Your Work on Mental Wellbeing: A Practice-Based User Guide Health Scotland. Coventry: Warwick Medical School; 2012.
- Shrier I, Steele RJ, Verhagen E, Herbert R, Riddell CA, Kaufman JS. Beyond intention to treat: what is the right question?. Clin Trials 2014;11:28-37. https://doi.org/10.1177/1740774513504151.
- Twisk J, de Boer M, de Vente W, Heymans M. Multiple imputation of missing values was not necessary before performing a longitudinal mixed-model analysis. J Clin Epidemiol 2013;66:1022-8. https://doi.org/10.1016/j.jclinepi.2013.03.017.
- Royston P. Multiple imputation of missing values: further update of ice, with an emphasis on categorical variables. Stata J 2009;9:466-77. https://doi.org/10.1177/1536867X0900900308.
- Brookes ST, Whitely E, Egger M, Smith GD, Mulheran PA, Peters TJ. Subgroup analyses in randomized trials: risks of subgroup-specific analyses; power and sample size for the interaction test. J Clin Epidemiol 2004;57:229-36. https://doi.org/10.1016/j.jclinepi.2003.08.009.
- White IR, Horton NJ, Carpenter J, Pocock SJ. Strategy for intention to treat analysis in randomised trials with missing outcome data. BMJ 2011;342. https://doi.org/10.1136/bmj.d40.
- Turner AJ, Sutton M, Harrison M, Hennessey A, Humphrey N. Cost-effectiveness of a school-based social and emotional learning intervention: evidence from a cluster-randomised controlled trial of the promoting alternative thinking strategies curriculum. Appl Health Econ Health Policy 2020;18:271-85. https://doi.org/10.1007/s40258-019-00498-z.
- National Institute for Health and Care Excellence (NICE). Methods for the Development of NICE Public Health Guidance (Third Edition). 2012.
- Johnson R, Jenkinson D, Stinton C, Taylor-Phillips S, Madan J, Stewart-Brown S, et al. Where’s WALY?: a proof of concept study of the ‘wellbeing adjusted life year’ using secondary analysis of cross-sectional survey data. Health Qual Life Outcomes 2016;14. https://doi.org/10.1186/s12955-016-0532-5.
- Evans R, Brockman R, Grey J, Bell S, Harding S, Gunnell D, et al. A cluster randomised controlled trial of the Wellbeing in Secondary Education (WISE) Project – an intervention to improve the mental health support and training available to secondary school teachers: protocol for an integrated process evaluation. Trials 2018;19. https://doi.org/10.1186/s13063-018-2617-4.
- Shorten A, Smith J. Mixed methods research: expanding the evidence base. Evid Based Nurs 2017;20:74-5. https://doi.org/10.1136/eb-2017-102699.
- Moore GF, Audrey S, Barker M, Bond L, Bonell C, Hardeman W, et al. Process evaluation of complex interventions: Medical Research Council guidance. BMJ 2015;350. https://doi.org/10.1136/bmj.h1258.
- Braun V, Clarke V. Using thematic analysis in psychology. Qual Res Psychol 2006;3:77-101. https://doi.org/10.1191/1478088706qp063oa.
- DECIPHer. Young People in Wales Now Substantially More Likely to Try E-Cigarettes Than Tobacco, New Cardiff University Research Shows. n.d. https://decipher.uk.net/public-health-improvement-research-networks-phirns/public-involvement-alpha/ (accessed 5 November 2020).
- GOV.UK. School Workforce in England: November 2018. n.d. www.gov.uk/government/statistics/school-workforce-in-england-november-2018 (accessed 10 December 2019).
- Education Workforce Council. EWC Annual Education Workforce Statistics for Wales 2020. n.d. www.ewc.wales/site/index.php/en/statistics-and-research/education-workforce-statistics.html (accessed 10 December 2019).
- Kocalevent RD, Hinz A, Brähler E. Standardization of the depression screener patient health questionnaire (PHQ-9) in the general population. Gen Hosp Psychiatry 2013;35:551-5. https://doi.org/10.1016/j.genhosppsych.2013.04.006.
- SDQ. British Means and Standard Deviations for the Sample Split by Age Band. n.d. www.sdqinfo.org/norms/UKNorm3.pdf (accessed 10 December 2019).
- UK Parliament. Teachers: Pay: Written Question – HL7985. n.d. www.parliament.uk/business/publications/written-questions-answers-statements/written-question/Lords/2016-04-26/HL7985/ (accessed 15 November 2019).
- Stallard P, Skryabina E, Taylor G, Phillips R, Daniels H, Anderson R, et al. Classroom-based cognitive behaviour therapy (FRIENDS): a cluster randomised controlled trial to Prevent Anxiety in Children through Education in Schools (PACES). Lancet Psychiatry 2014;1:185-92. https://doi.org/10.1016/S2215-0366(14)70244-5.
- Durlak JA, DuPre EP. Implementation matters: a review of research on the influence of implementation on program outcomes and the factors affecting implementation. Am J Community Psychol 2008;41:327-50. https://doi.org/10.1007/s10464-008-9165-0.
- Goodman MS, Madni LA, Semple RJ. Measuring mindfulness in youth: review of current assessments, challenges, and future directions. Mindfulness 2017;8:1409-20. https://doi.org/10.1007/s12671-017-0719-9.
- Dalziel K, Li J, Clarke P. Accuracy of patient recall for self-reported doctor visits: is shorter recall better?. Health Econ 2018;27:1684-98. https://doi.org/10.1002/hec.3794.
- Moran GS, Russinova Z, Gidugu V, Gagne C. Challenges experienced by paid peer providers in mental health recovery: a qualitative study. Community Ment Health J 2013;49:281-91. https://doi.org/10.1007/s10597-012-9541-y.
- Mancini MA. An exploration of factors that effect the implementation of peer support services in community mental health settings. Community Ment Health J 2018;54:127-37. https://doi.org/10.1007/s10597-017-0145-4.
- Education Support. Teacher Wellbeing Index 2019. 2019.
- Craig P, Di Ruggiero E, Frohlich KL, Mykhalovskiy E, White M. Taking Account of Context in Population Health Intervention Research: Guidance For Producers, Users and Funders of Research. Southampton: NIHR Evaluation, Trials and Studies Coordinating Centre; 2018.
- Hawe P. Minimal, negligible and negligent interventions. Soc Sci Med 2015;138:265-8. https://doi.org/10.1016/j.socscimed.2015.05.025.
- Hawe P, Shiell A, Riley T. Theorising interventions as events in systems. Am J Community Psychol 2009;43:267-76. https://doi.org/10.1007/s10464-009-9229-9.
- Allen K, Hansford L, Hayes R, Allwood M, Byford S, Longdon B, et al. Teachers’ perceptions of the impact of the Incredible Years® Teacher Classroom Management programme on their practice and on the social and emotional development of their pupils. J Educ Psychol 2020;90:75-90. https://doi.org/10.1111/bjep.12306.
- Wilde S, Sonley A, Crane C, Ford T, Raja A, Robson J, et al. Mindfulness training in UK secondary schools: a multiple case study approach to identification of cornerstones of implementation. Mindfulness 2019;10:376-89. https://doi.org/10.1007/s12671-018-0982-4.
- Ouellette RR, Frazier SL, Shernoff ES, Cappella E, Mehta TG, Maríñez-Lora A, et al. Teacher job stress and satisfaction in urban schools: disentangling individual-, classroom-, and organizational-level influences. Behav Ther 2018;49:494-508. https://doi.org/10.1016/j.beth.2017.11.011.
- Case P, Case S, Catling S. Please show you’re working: a critical assessment of the impact of OFSTED inspection on primary teachers. Br J Sociol Educ 2000;21:605-21. https://doi.org/10.1080/713655370.
- Page F. A review of mental health morbidity associated with OFSTED inspections of schools in one metropolitan local authority. Occup Med 1999;49:534-5. https://doi.org/10.1093/occmed/49.8.534.
- Scanlon M. The Impact of OFSTED Inspections: National Foundation for Educational Research for the National Union of Teachers. Slough: National Foundation For Educational Research; 1999.
Appendix 1 Teacher questionnaire
Appendix 2 Student questionnaire
Appendix 3 Questions for peer supporter feedback meetings
Appendix 4 Topic guides
Appendix 5 Unadjusted, partially adjusted and fully adjusted results for teacher secondary outcomes
PHQ-8^a,b | n | Ratio of geometric means^c intervention/control (95% CI) | p-value |
---|---|---|---|
Continuous | | | |
Unadjusted model | 1733 | 0.97 (0.98 to 1.07) | 0.556 |
Partially adjusted model | 1733 | 1.00 (0.92 to 1.10) | 0.947 |
Fully adjusted model | 1719 | 1.00 (0.92 to 1.10) | 0.964 |
Binary | | | |
Unadjusted model | 1733 | 1.02 (0.66 to 1.57)^d | 0.934 |
Partially adjusted model | 1733 | 1.29 (0.83 to 2.00)^d | 0.255 |
Fully adjusted model | 1719 | 1.27 (0.82 to 1.97)^d | 0.279 |
Categorical | | | |
Unadjusted model | 1733 | 0.95 (0.63 to 1.43)^d | 0.802 |
Partially adjusted model | 1733 | 1.07 (0.70 to 1.63)^d | 0.746 |
Fully adjusted model | 1719 | 1.06 (0.70 to 1.60)^d | 0.790 |
Absence^a,b | n | Ratio of geometric means^c intervention/control (95% CI) | p-value |
---|---|---|---|
Continuous | | | |
Unadjusted model | 1732 | 1.03 (0.98 to 1.08) | 0.308 |
Partially adjusted model | 1732 | 1.04 (1.00 to 1.09) | 0.040 |
Fully adjusted model | 1717 | 1.04 (1.00 to 1.09) | 0.042 |
Binary | | | |
Unadjusted model | 1732 | 1.05 (0.77 to 1.44)^d | 0.746 |
Partially adjusted model | 1732 | 1.21 (0.90 to 1.63)^d | 0.210 |
Fully adjusted model | 1717 | 1.22 (0.90 to 1.67)^d | 0.203 |
Categorical | | | |
Unadjusted model | 1732 | 1.23 (0.85 to 1.79)^d | 0.272 |
Partially adjusted model | 1732 | 1.43 (0.98 to 2.10)^d | 0.066 |
Fully adjusted model | 1717 | 1.45 (0.98 to 2.14)^d | 0.063 |
Presenteeism^a,b | n | Difference in means^c (95% CI) | p-value |
---|---|---|---|
Continuous | | | |
Unadjusted model | 1550 | 0.02 (–0.27 to 0.31) | 0.870 |
Partially adjusted model | 1550 | 0.12 (–0.14 to 0.38) | 0.359 |
Fully adjusted model | 1539 | 0.12 (–0.13 to 0.37) | 0.361 |
Binary | | | |
Unadjusted model | 1550 | 0.95 (0.62 to 1.47)^d | 0.831 |
Partially adjusted model | 1550 | 1.00 (0.67 to 1.50)^d | 0.982 |
Fully adjusted model | 1539 | 1.00 (0.67 to 1.49)^d | 0.988 |
Categorical | | | |
Unadjusted model | 1550 | 0.93 (0.64 to 1.37)^d | 0.732 |
Partially adjusted model | 1550 | 1.02 (0.71 to 1.46)^d | 0.919 |
Fully adjusted model | 1539 | 1.02 (0.71 to 1.45)^d | 0.935 |
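To aid interpretation of the ratios of geometric means reported above for PHQ-8 and absence, the short LaTeX sketch below states the standard relationship between a log-scale regression coefficient and an adjusted ratio of geometric means; the model form shown (including the school-level term u_j) is an illustrative assumption, not the trial's exact specification.

```latex
% Illustrative sketch, assuming the outcome Y was analysed on the log scale:
% the exponentiated intervention coefficient is the adjusted ratio of geometric means.
\[
  \log Y_{ij} = \alpha + \beta\,\mathrm{arm}_j + \gamma^{\top} x_{ij} + u_j + \varepsilon_{ij}
\]
\[
  \exp(\hat{\beta}) =
  \frac{\text{geometric mean (intervention)}}{\text{geometric mean (control)}}
\]
% A ratio of 1 therefore indicates no between-arm difference, and the 95% CI is
% obtained by exponentiating the confidence limits for \hat{\beta}.
```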
Appendix 6 Unadjusted, partially adjusted and fully adjusted results for student secondary outcomes
Outcome^a,b | n | Difference in means (95% CI)^c | p-value |
---|---|---|---|
SDQ | | | |
Unadjusted model | 2703 | 0.31 (–0.21 to 0.83) | 0.249 |
Partially adjusted model | 2703 | 0.26 (–0.23 to 0.74) | 0.300 |
Fully adjusted model | 2702 | 0.27 (–0.18 to 0.73) | 0.241 |
WEMWBS | | | |
Unadjusted model | 2701 | –0.39 (–1.40 to 0.61) | 0.445 |
Partially adjusted model | 2701 | –0.36 (–1.42 to 0.70) | 0.502 |
Fully adjusted model | 2700 | –0.35 (–1.35 to 0.66) | 0.500 |
Appendix 7 Unadjusted and adjusted results for school-level outcomes
Outcome^a | n | Difference in means (95% CI)^b | p-value |
---|---|---|---|
School-level teacher absenteeism | | | |
Unadjusted model | 13 | 4.40 (–0.29 to 9.10) | 0.063 |
Adjusted model | 13 | 3.69 (–1.72 to 9.11) | 0.151 |
School-level teacher retirement | | | |
Unadjusted model | 8 | 8.31 (–1.89 to 18.51) | 0.090 |
Adjusted model | 8 | 10.78 (–2.17 to 23.73) | 0.070 |
School-level teacher left for other reasons | | | |
Unadjusted model | 8 | 19.08 (–74.71 to 112.87) | 0.623 |
Adjusted model | 8 | 47.36 (–148.39 to 243.11) | 0.407 |
School-level student attainment | | | |
Unadjusted model | 9 | 0.75 (0.03 to 17.51)^c | 0.858 |
Adjusted model | 9 | 1.12 (0.03 to 38.71)^c | 0.407 |
School-level student attendance | | | |
Unadjusted model | 25 | 0.31 (–0.37 to 0.98) | 0.357 |
Adjusted model | 25 | 0.17 (–0.60 to 0.95) | 0.645 |
List of abbreviations
- AE: adverse event
- ALGEE: approach the person, assess and assist with any crisis; listen non-judgementally; give support and information; encourage the person to get appropriate professional help; encourage other supports
- CACE: complier-average causal effect
- CAMHS: Child and Adolescent Mental Health Service
- CI: confidence interval
- CONSORT: Consolidated Standards of Reporting Trials
- DECIPHer: Centre for the Development and Evaluation of Complex Interventions for Public Health Improvement
- FSM: free school meals
- HSC: healthy schools co-ordinator
- ICC: intracluster correlation coefficient
- MAR: missing at random
- MHFA: mental health first aid
- MNAR: missing not at random
- Ofsted: Office for Standards in Education, Children’s Services and Skills
- PHQ-8: Patient Health Questionnaire-8 items
- RCT: randomised controlled trial
- SAE: serious adverse event
- SD: standard deviation
- SDQ: Strengths and Difficulties Questionnaire
- T1: 12-month follow-up
- T2: 24-month follow-up
- WEMWBS: Warwick–Edinburgh Mental Wellbeing Scale
- WISE: Wellbeing in Secondary Education
Notes
Supplementary material can be found on the NIHR Journals Library report page (https://doi.org/10.3310/phr09120).
Supplementary material has been provided by the authors to support the report and any files provided at submission will have been seen by peer reviewers, but not extensively reviewed. Any supplementary material provided at a later stage in the process may not have been peer reviewed.