
How hospital leaders implemented a safe surgery protocol in Australian hospitals

Judith Mary Healy
DOI: http://dx.doi.org/10.1093/intqhc/mzr078 | pp. 88–94 | First published online: 3 December 2011

Abstract

Objective To analyse the strategies used by hospital leaders to improve compliance with the ‘ensuring correct patient, correct site and correct procedure protocol’. While following such a protocol saves lives according to an international study of the World Health Organization safe surgery checklist, promoting compliance in hospitals has proved to be a regulatory challenge.

Design, Setting and Participants Using a qualitative research design and ‘responsive regulation’ theory, this study explored implementation strategies used by hospital leaders in 20 Australian public hospitals. Semi-structured interviews were conducted with 72 informants to analyse how front-line leaders improved compliance with the safe surgery protocol in their hospitals.

Interventions Implementation analysis of the safe surgery protocol.

Main Outcome Measures The use of implementation strategies located on a ‘responsive regulation’ pyramid.

Results Informants identified many strategies used to improve protocol compliance typically beginning with persuasion. Supportive strategies were located on a regulatory pyramid beginning with softer interventions: persuade, enlist leaders, train, remind, relax protocol requirements, redesign hospital systems and reward compliance. In response to low and slow compliance, many hospital leaders switched to a pyramid of escalating sanctions: direct, delegate, monitor, publicly report, reprimand and penalize.

Conclusions A multiplex problem requires graduated and multiplex regulation. Hospital leaders proved to be responsive regulators in applying both multiple supports and sanctions that improved compliance over 3 years. These experiences with protocol implementation illustrate the multifaceted challenge of health sector regulation and offer lessons for embedding future patient safety solutions.

  • patient safety
  • guidelines
  • hospital care

Introduction

There is compelling evidence that the World Health Organization's (WHO) 19-step safe surgery checklist [1] reduces complications and saves lives. An eight-country study found a drop in complications among surgical patients from 11 to 7% and a drop in mortality from 1.5 to 0.8% [2]; a Netherlands study found a drop in complications among patients from 15.4 to 10.6% and a drop in mortality from 1.5 to 0.8% [3]. While the checklist has been shown to be effective, low-cost and non-intrusive, these and other studies have found that it is not easy to implement new guidelines [4–7], and that intensive efforts are needed to embed a checklist as a standard procedure in hospitals [8–10]. The WHO 2009 checklist manual recognizes that implementation is a major challenge and offers suggestions in a ‘how-to’ guide, such as build a team, meet with hospital leaders, run a campaign and track progress on goals [11]. The WHO checklist is now widely used: by November 2010, 25 countries reported that they had ‘mobilized resources’ to implement it and 1788 hospitals reported that they were actively using it [12].

This paper reports on the Australian experience in implementing a shorter version of a ‘safe surgery’ checklist. It explores how hospital leaders went about the notoriously difficult task of implementing change in clinical procedures and staff behaviours. In the Australian federal system of government, public hospitals are funded by national and by state and territory governments but are primarily the administrative responsibility of the six states and two territories, although hospitals have considerable clinical autonomy. The Australian Health Ministers in April 2004 called for all public hospitals to use the ‘ensuring correct patient, correct site, correct procedure protocol’ [13]. Dubbed the 3Cs protocol, it set out five steps: (i) check consent form or procedure request; (ii) mark the site; (iii) confirm identification with patient; (iv) take ‘team time out’ to verbally confirm all is correct before commencing the procedure and (v) check diagnostic images. Most states were still using the shorter five-step protocol in November 2010, although the Australian Health Ministers have since called for the WHO checklist to be implemented across jurisdictions by 1 July 2011.

Policy-makers expected the 2004 checklist to be readily adopted, but compliance in operating theatres initially was low and slow. As part of a larger study on the regulation of patient safety and quality [14], hospital leaders were interviewed in order to analyse front-line experiences in implementing the protocol. How did they steer clinicians towards following the protocol? What strategies were used to promote compliance? Is soft regulation, the traditional approach to medical governance, sufficient to ensure compliance with evidence-based patient safety solutions?

The ‘responsive regulation’ framework was used to explore these implementation strategies. ‘Regulation’ in this model means ‘steering the flow of events’ and so takes a much broader view than rule enforcement: it encompasses ‘steering’ actions ranging from persuasion to enforcement that typically involve multiple actors and multiple mechanisms [15]. These multiple mechanisms can be mapped on twin complementary pyramids of supports and sanctions [16]. Responsive regulators give softer mechanisms of trust and respect a chance to work first, rather than opting immediately for enforcement, although it is crucial to have the capacity to escalate upwards to enforcement if necessary. This model, with its nuanced approach beginning with ‘soft regulation’, is a good fit for the health sector, which traditionally relies on clinicians to change their behaviour voluntarily and on self-regulation by professional groups [17]. The concepts of responsive regulation complement the literature on change management in organizational and professional cultures [18].

Methods

Questions on the ‘three Cs’ protocol were included in 72 interviews (only 4 people refused an interview) with health sector leaders (departmental administrators, hospital managers, clinical directors and safety and quality staff) during the period 2007–08 (Table 1). The interviews covered 20 hospitals in Australian state capitals (mostly large teaching hospitals) selected as active protocol promoters. The aim of this ethnographic study was to explore the range of mechanisms used by leaders to embed the safety protocol in their hospital. These in-depth interviews were conducted by the author (and a small number by N.D.); interviews were mostly in person, with a few conducted by telephone for geographic reasons, and took 40 min on average. A semi-structured questionnaire is an appropriate format for conversations with senior health professionals; the topic list was drawn from the implementation literature (Table 2). About half the interviews were digitally recorded and later professionally transcribed; for the remainder, where the environment was not conducive to recording, extensive notes were taken and written up in full afterwards.

Table 1

Types and numbers of interview respondents

                                                    | Interviews recorded | Interview notes | Total
Managers (health departments, hospitals, agencies) |          6          |       19        |  25
Safety and quality professionals                    |         12          |        8        |  20
Surgeons                                            |         11          |        3        |  14
Nurses                                              |          6          |        2        |   8
Other hospital professionals                        |          –          |        5        |   5
Total                                               |         35          |       37        |  72
Table 2

Primary questions asked during interviews (questions asked of managers and clinicians, grouped by context)

Policy instrument
  • Was the protocol a recommendation or directive from state (or hospital) administrators?
  • Have there been protocol revisions since it was first introduced?

Implementation strategies
  • How did the hospital go about implementing the protocol? What strategies were tried (e.g. information sessions, training, guidelines, directives)?
  • Which strategies worked? Which strategies did not?
  • What issues arose during protocol implementation (e.g. importance of champions, professional resistance, non-compliance)?

Professional views
  • Were some professional groups more accepting of the protocol than others?
  • Who were the champions for change? Why?
  • Were some individuals opposed to its introduction? Why?

Context
  • Has the hospital had a recent adverse event or wrong site surgery event? Did that make a difference to implementation and compliance?
  • How are the protocol steps followed in your operating theatre?
  • How is the patient's identity established? What validation occurs? How does the marking occur? Who calls for a time out? What documentation is there of the process?

Monitoring compliance
  • Are there reporting mechanisms in place (e.g. checklists, independent audits)?
  • What is the estimated level of compliance with the protocol?
The full interview records were read and analysed according to keywords and themes using qualitative analysis methods [19]. The data set was revisited several times to analyse responses fully, both within each interview and across interviews, according to sets of headings. The analytic framework was responsive regulation, with the regulatory pyramids used to locate the strategies adopted in these hospitals.

Results

The experiences recounted by respondents with the 3Cs protocol illustrate the multifaceted challenge of health sector regulation but also how health professionals use a learning model perspective to improve their practice. Informants commented that they intended to apply the lessons learned in implementing the safe surgery protocol to the implementation of future patient safety solutions: ‘We have learned a lot from this experience on how to go about putting a procedure in place in the hospital, and we will definitely do better next time’ (Interviewee #27). The results of the interviews with the 72 respondents are reported below under the headings of support strategies and sanction strategies.

Regulatory support strategies

Figure 1 depicts a pyramid of regulatory support strategies under broad headings in order of strength and frequency, beginning with softer and ranging upwards to stronger supports. As expected from the literature [20], hospital leaders generally began with softer strategies from the base of the pyramid in the hope that professionals would voluntarily adopt patient safety practices.

Figure 1

Pyramid of regulatory supports. Source: Adapted from Braithwaite et al. [16].

Consult widely

The Australian Council (now Commission) on Safety and Quality in Health Care distributed the protocol with ministerial endorsement to all public hospitals. Leaders in the 20 hospitals embarked on an information strategy, and clinical leaders and safety and quality coordinators undertook a round of consultations to persuade clinicians to adopt the protocol: ‘You have to talk the right talk to your audience. If you talk to surgeons you need statistics, with administrators you need a financial message’ (Interviewee #8). Surgeons in particular took some persuasion. A patient safety professional said: ‘Some surgeons don't understand the number of risk points leading up to the person being rolled onto the operating table … That time out check is the last line of defence against mistakes being made’. She pointed out to the surgeons that ‘if something goes wrong it is the surgeon who gets their photo in the paper and the headlines … this is about protecting you as well as protecting patients’ (#24). Many surgeons objected to ‘more bureaucracy’ and regarded wrong surgery as a rare event, although a director of surgery commented: ‘It is arrogance when surgeons say that nothing goes wrong for them, or they haven't done enough operations, or they are lying’ (#25).

Enlist leaders

The Commission's 3Cs poster was ‘displayed everywhere’ and many hospitals also posted the Royal Australasian College of Surgeons endorsement in their operating theatres: ‘The surgeons then could not come back with whose stupid idea is this?’ (#11). Hospitals also enlisted clinical leaders to promulgate the protocol within their specialty groups, since a top-down edict does not necessarily work in hospitals where take-up depends upon multiple professional groups.

Educate staff

Hospitals ran training programmes, explained the protocol in meetings and posted anonymous wrong surgery cases on internal websites. Hospital leaders pointed out that patient identification checks are essential as modern hospitals are busier and more complex places than previously: ‘Everything has sped up and that is why we need better checks’ (#4). Most hospitals included the protocol in staff induction programmes and continuing professional development courses. Informants stressed that training must be geared to particular groups and regularly repeated, and that simulation scenarios should be devised: ‘You play out a script so team time out gets missed and then ask how it could be done better’ (#27). But change is slow in entrenched professional cultures. A director of surgery commented: ‘Younger surgeons do patient safety checks routinely, whereas the older ones are harder to train’ (#12).

Remind staff

A tap on the shoulder was all that many people needed to remind them to comply with an agreed procedure. A theatre manager said: ‘Every few months I do a run-around and talk to the surgeons and registrars’ (#18). Peer pressure often was invoked. A hospital manager expected most surgeons eventually would comply: ‘If nine out of ten surgeons follow the protocol, there is peer pressure on the tenth to do it’ (#12). Where recalcitrant surgeons ignored the protocol, senior clinicians had ‘a little fireside chat’ (#13).

Relax procedure

The WHO checklist stresses that ‘additions and modifications to fit local practice are encouraged’. Most state health departments also allowed variations to the 3Cs protocol, since ‘one size does not fit all’, in order to improve compliance. Hospitals had their own procedures on informed consent, marking the site and patient identification. These procedures also varied within a hospital, however: some hospitals allowed surgical units, and even individual surgeons, to decide how to perform the protocol, while others insisted on a standard protocol: ‘we couldn't have half the staff doing one thing and others doing something else’ (#12). A nurse manager stated: ‘I can't say to my nurses that ortho and general surgery mark the site, but plastics, ENT and neuro don't. We need one rule in a big place like this’ (#18).

Redesign hospital systems

Clinical leaders wanted hospitals to invest more in patient identification technology, such as patient identification bracelets/bar codes, and to install forcing functions that make people think before taking the next step. One hospital set up a faux forcing function with a red key over the instrument tray to indicate that the scrub nurse would not unlock the tray until team time out was done. Another hospital programmed its operating theatre computer system not to proceed until the time out check was verified: ‘It prompts everybody as an automatic thing and has saved us several times from making an error’ (#35).

Reward

Rewards can be in-kind or financial. One hospital, as a morale-boosting experiment, awarded chocolate bar prizes for time out to the surgical team with the highest protocol compliance (#5). While financial incentives were not used to promote compliance by individuals or hospital units, a high-performing hospital potentially could be rewarded through hospital payment mechanisms. Some suggested that health departments should make more use of public recognition, such as public awards for high-performing hospitals.

Regulatory sanction strategies

Hospital leaders added stronger strategies from a sanction pyramid after variable compliance became apparent. Figure 2 depicts a pyramid of regulatory sanctions. Informants identified levels of sanctions that ranged upwards from a directive to moderately stronger sanctions such as audits and reprimands.

Figure 2

Pyramid of regulatory sanctions. Source: Adapted from Braithwaite et al. [16].

Direct

The Australian Health Ministers had required all public hospitals to adopt the protocol, but the eight health departments attached varying authority to the protocol, depending upon their public sector cultures and administrative structures: one state issued a guideline, four a policy and three a directive. But compliance by clinicians does not necessarily follow, since governance structures within hospitals are complex and involve both vertical and horizontal accountability. One hospital medical director pointed out: ‘If clinicians don't buy a protocol, they are not going to do it’ (#11). Many hospital managers avoided beginning with a top-down edict and escalated from a guideline to a directive only when persuasion failed: ‘We spent 18 months fluffing around before we issued a directive … We said this is the way we do business when the new interns began and it went like a dream’ (#5). Informants said that surgeons, unlike nurses, tend to interpret a directive as discretionary rather than mandatory. Two hospital managers sent a directive personally to each clinician stating that ‘this is the way we will work in this hospital’. A director of surgery said surgeons would only take notice of a professional standard (#4); another commented that ‘a hard-core responds only to threats’ (#8); another said it should be made clearer that disciplinary action would follow non-compliance (#3).

Delegate authority

A health authority CEO said: ‘It's not a matter of hierarchy in a hospital it's a matter of assigning authority … so you know who to ask why the time out form wasn't signed’ (#10). Clinical governance policies in several states assigned responsibility for the protocol to hospital clinical directors. One state directive required hospital clinical directors in turn to assign responsibility for each protocol step: who will do it, what will be done and when will it be done? This state also made hospital chief executive officers responsible in performance agreements for reducing patient/site/procedure incidents by 50%. Two hospitals empowered the senior theatre nurse to halt the operation until all checks were done, and backed the nurse in several cases where surgeons protested. Another hospital contemplated delegating veto power, but nurses were reluctant to incur the wrath of the surgeons: ‘If the nurse says, no, I can't give you the knife, the surgeon will demand it’ (#5).

Monitor adverse events and report progress

Many hospitals discuss serious adverse events as part of their peer review procedures and most do a ‘root cause analysis’ on serious adverse events. An analysis of 31 cases in one state of procedures involving the wrong patient or body part found low protocol compliance: for example, the site was marked in only 6% and a final check done in 3% of cases [21]. ‘A big fright’ was said to greatly improve compliance: ‘The protocol gained traction in our hospital because three near-miss clinical incidents made three groups of surgeons take notice’ (#5). A hospital CEO said: ‘We had an actual event that shook people up since it got through all the safety layers … which is why that lesson was instructive’ (#13).

Some hospitals undertook regular audits of protocol compliance and posted graphs for staff to compare surgical units, in the expectation that leaders would pull up the laggards. A hospital manager said: ‘One thing I find really potent is to give clinicians data about their own performance in relation to everyone else’ (#13). Hospital leaders were dismayed by early poor audit results, with only 10–30% compliance across specialities, but compliance improved over time with regular audits: ‘Our first audit showed 32% compliance, the next 62% and the last one 94%’ (#11). Observational audits in 2007 of a sample of operating theatres in two states found the site not marked in over 30% of cases, team time out not performed in 20% of cases, and considerable variation across hospitals and specialities. Another state reported near universal compliance by 2008, but this was based on a documentation audit rather than an observational audit [22].

Report on progress

While all state health departments have adverse event reporting systems for their public hospitals, the Health Ministers required public hospitals to report nationally on ‘sentinel’ events including ‘procedures involving the wrong patient or body part’. The subsequent national report published in 2007, revealing 53 such events in 2004–05 [23], received extensive publicity that prompted greater efforts to ensure protocol compliance. But such events have continued to occur; for example, one state reported 37 wrong patient/body part events in 2007–08 and another over 80 in 2007. Larger numbers do not necessarily indicate more incidents occurring, however, but rather that hospitals identify more incidents once they look.

Reprimand staff

Some clinical directors avoided a formal reprimand for fear their surgeons would take umbrage and resign—an unwelcome risk given surgeon shortages. But one director did not hesitate to reprimand surgeons because ‘if we haven't got a safety ethos then we should not be doing surgery’ (#5), and another warned a surgeon that ‘either you work here and abide by a hospital directive or you don't’ (#12). While the ethos is to avoid ‘a blame culture’, an investigation could take a disciplinary path if a hospital had issued a clear directive, which a staff member had deliberately flouted, and an adverse incident occurred. A health authority CEO said: ‘If the protocol is there, it should not happen, and we should investigate whether there was a breach’ (#10).

Penalise staff

Clinical directors did not interpret non-compliance as a deliberate violation of an agreed standard. No cases were known of serious disciplinary sanctions being applied, such as suspending (pending a review) or dismissing staff for refusal to follow the protocol directive. Accreditation is an external regulatory strategy and while the US Joint Commission made its ‘universal protocol’ mandatory for hospitals seeking accreditation [24], the Australian hospital accreditation agency so far has a non-mandatory standard [25]. Financial sanctions are attached to adverse incidents in some countries [26], but have not been used in Australia. Medical indemnity insurance is another potential regulatory mechanism but informants knew of no wrong patient/site/procedure litigation where the doctor was refused cover. A patient complaint to a medical tribunal also is a powerful sanction. For example, the New South Wales Medical Tribunal in 2006 reprimanded and awarded costs against a surgeon for removing the wrong breast of a patient [27].

Discussion

Safe surgery protocols are part of a worldwide campaign to improve patient safety. Hospital leaders proved to be responsive regulators in applying multiple regulatory supports and sanctions to produce improved compliance with the 3Cs protocol. They mostly applied supports, and softer rather than stronger sanctions, to improve compliance, given the pressures upon over-worked clinicians in public hospitals, the professional self-regulation ethos and the delicate relationship between managers and clinicians. They found that a single one-off strategy was not sufficient to embed a new procedure in a complex and high-pressure environment. An information strategy resonated well in hospitals, as did enlisting clinical leaders. Although the 3Cs protocol began as a standardisation strategy, allowing variation (as the WHO checklist recommends) often was the trade-off for gaining acceptance, despite the danger that variation within a hospital may cause confusion and hence error.

Safety and quality professionals believe that a patient safety culture entails compliance with a safe surgery protocol, and that soft regulation should escalate to hard regulation in cases of recalcitrance or cover-up. On the sanction pyramid, monitoring compliance proved very effective as did reporting. Health sector regulators did escalate to stronger sanctions over time but had not applied penalties from the apex of the regulatory pyramid, such as fines, suspension and dismissal. There are disciplinary and legal implications, however, if the protocol is ignored and an adverse event ensues. Pressure is building in the Australian health sector for more public reporting given evidence of its positive impact on healthcare quality [28], and the Australian government intends to set up a national authority to monitor the performance of public hospitals.

Funding

This study was supported by the Australian Research Council LP0455448.

Acknowledgements

My thanks are due to Nicola Dunbar, John Braithwaite and Bruce Barraclough for their helpful comments on this paper.

References
