
Does setting adolescent-friendly standards improve the quality of care in clinics? Evidence from South Africa

Kim Eva Dickson, Joanne Ashton, Judy-Marie Smith
DOI: http://dx.doi.org/10.1093/intqhc/mzl070. Pages 80–89. First published online: 2 February 2007

Abstract

Objective To determine whether setting and implementing adolescent-friendly standards improves the quality of adolescent services in clinics.

Design The evaluation used a quasi-experimental case–control design.

Setting/participants Eleven public health clinics involved in the adolescent-friendly program [The National Adolescent Friendly Clinic Initiative (NAFCI)] and 11 control clinics.

Intervention This included implementation of a set of 10 adolescent-friendly standards and 41 corresponding criteria.

Main outcome measures Percentage scores were calculated for each standard and criterion. Clinics were awarded a Gold Star if they achieved an overall clinic score (average standard score) of ≥90%, a Silver Star for a score between 60 and 89% and a Bronze Star for a score between 30 and 59%.

Results The NAFCI clinics performed better than the control clinics on most criteria. The combined average overall clinic score of all the NAFCI clinics (79.9%) was significantly higher (P = 0.005) than the overall score for the control group clinics (60.9%). Results showed that the longer NAFCI was implemented at a clinic, the higher the score and the more likely that clinic would be accredited as an ‘adolescent friendly’ clinic. NAFCI clinics performed significantly better than the control clinics on criteria specific to the provision of adolescent-friendly services including knowledge of adolescent rights and non-judgmental attitudes of staff.

Conclusion Setting and implementing standards and criteria improves the quality of adolescent services in clinics. The standards and criteria should be set on the basis of the characteristics of adolescent-friendly services and quality of care indicators. Best results are achieved when a facilitator trained in quality improvement methodologies supports clinics.

Keywords
  • adolescents
  • quality of care
  • adolescent-friendly services
  • South Africa

Despite international consensus regarding adolescents' right to reproductive health services and information [1–3], young people face many barriers to accessing services [4, 5]. A number of organizations have sought to provide adolescent-friendly services to improve access to health care [6, 7]; however, only a few programs have devised ways to standardize and systematically assess the quality of the services [8, 9].

Over the past decade, there has been a surge of interest in establishing programs for assessing and accrediting health-care facilities, as evidenced by the number of hospitals seeking accreditation internationally [10]. In Africa, hospital accreditation programs have been established in Zambia [11] and South Africa [12]. Although it has not been proven that inspection or accreditation results in better clinical outcomes, there is evidence that health facilities can increase their compliance with standards and improve quality of care if these standards are made explicit [13]. These efforts to improve the quality of health care have not been applied to specific clinic programs such as adolescent-friendly services.

The National Adolescent Friendly Clinic Initiative (NAFCI) was initiated in 1999 as an integral component of loveLife, a national multi-dimensional HIV/AIDS youth program [14]. The NAFCI approach was developed on the basis of the Quality Triangle (Fig. 1) conceptualized by the Quality Assurance Project [15]. First, quality was defined as standards of adolescent-friendly services; the standards and criteria were developed on the basis of established characteristics and attributes of adolescent-friendly services, including having adolescent-specific policies and non-judgmental staff, ensuring privacy and confidentiality and having an attractive environment [16]. In addition, the standards and criteria included more generic quality of care indicators such as having clinical guidelines, developing service plans and ensuring cleanliness and infection control [17, 18]. Second, a methodology for measuring whether the standards were being met was designed: NAFCI uses both a self-audit process and an external assessment process for clinics to work toward achieving 10 standards and 41 criteria that lead to Bronze, Silver and Gold Star levels of accreditation [19]. Third, a means for improving quality when the standards were not being met was developed. The quality improvement process included forming teams in each clinic, with an externally trained NAFCI facilitator to support each team. Values clarification workshops were conducted to help all staff, including nurses, security guards and clerks, explore their attitudes toward youth seeking services. Further, staff were trained on the standards, and a problem-solving methodology was introduced to assist staff in finding solutions to barriers to implementing the standards. Action plans were used as a key means to achieve team goals. The NAFCI program revolves around four main elements of quality improvement: focus on the client, effective systems/processes, use of data and a team approach [20].

Figure 1

Quality assurance triangle.

This study measured the achievement of clinics (NAFCI and control) toward meeting the standards and criteria set forth by the program.

Methods

Study design

A quasi-experimental case–control design was used as part of a large community-based cross-sectional survey to evaluate the impact of the loveLife program. Eleven study and 11 control clinics were chosen on the following basis:

  1. eleven communities that had a loveLife Y-Centre (a multi-purpose youth center) were selected, because of the broader aims of the loveLife evaluation;

  2. eleven NAFCI clinics were selected within the same health district as the Y-Centre;

  3. control communities were selected within the same health district as the Y-Centre;

  4. a control clinic was randomly selected within the same community.

The study clinics were chosen from the 57 clinics participating in the NAFCI program throughout the country; these clinics had previously been chosen to implement the NAFCI program in consultation with the respective provincial and district health authorities. At least one clinic was selected in each of the nine provinces in South Africa. A variety of factors had been considered in selecting clinics for the NAFCI program, including poor youth utilization of clinic services and a high prevalence of sexually transmitted infections (STIs) and teenage pregnancies in the community. However, these factors differed per province and were not well documented at the time. The assessments were conducted between June 2002 and March 2003.

Ethics approval was obtained from the Committee for Research on Human Subjects of the University of the Witwatersrand and Uppsala University Ethics Committee. In addition, permission to work in the clinics was obtained from the relevant provincial and district authorities. Informed consent was obtained from all young people and clinic staff interviewed.

Measurement procedures and sources of data

Eight assessment tools (Table 1) were used to collect information on each clinic's quality of adolescent services and to determine the level of accreditation. The tools were developed on the basis of the NAFCI standards and criteria and had been pre-tested in 10 pilot clinics. This study considered four main questions.

  1. Did the NAFCI clinics attain higher overall scores than the control clinics?

  2. Did the NAFCI clinics attain higher standard and criteria scores than the control clinics?

  3. Did the duration of implementation of the NAFCI program make a difference to the clinic scores attained on external assessment?

  4. Did the clinic size make a difference to the scores attained?

A team of four people, which had to include a professional nurse or doctor and a youth representative, conducted the assessments over 1 day. The doctor or nurse was responsible for conducting the health-care provider interviews and the client–provider interaction observations/simulations. The youth representative was responsible for the adolescent client exit and key informant interviews. At the end of the assessment day, the team met to review all the data collected, to ensure that the data sets were complete and to reach consensus on observations where necessary.

Table 1

Instruments and methods for external assessments

Data collection instrument | Method of data collection | N
Interview with clinic manager | A standardized questionnaire was administered to examine the management systems in place to support the provision of adolescent-friendly services | 1
Document review | Review of clinic documents and files important to clinic function, such as the community health profile, service plan, staff training plan, clinical guidelines and client records | 19 clinic documents; 10 patient files
Inventory of the clinic and immediate surroundings | The clinic and immediate surroundings were assessed to determine the cleanliness, infection control practices and the general state of the environment. Inventories were also done to determine the availability and storage of drugs, equipment and supplies |
Health-care provider interview | Questionnaires were administered to a random sample of health-care providers (mainly nurses) present on the day of the assessment to evaluate their knowledge of pertinent adolescent health issues and to assess whether the health-care providers had the necessary training to provide all the Essential Service Package services^a | Facilities with <5 health-care providers: all were interviewed; >5 but <10 providers: at least 5 were interviewed; >10 providers: 50% were interviewed
Non-clinical support staff interview | Questionnaires were administered to a random sample of non-clinical support staff to assess the levels of adherence to adolescent rights and any barriers to accessing the clinic | Same as for clinical staff
Client–provider interaction observations and simulations | Observations of client consultations to determine whether adolescent clients received an accurate assessment and care based on standard case management guidelines. Simulations were also used as a means of assessing clinical practice where there were no actual cases to observe | Five interactions and/or simulations
Adolescent client exit interviews | Exit interviews with adolescents after they had received services to assess client satisfaction with the services | 5
Key informant interview | Interview with young people who had been involved with clinic activities. The key informants were expected to provide valuable insight into the clinic's activities toward providing adolescent-friendly services. Each clinic was asked to select its own key informants | 5
^a The NAFCI Essential Service Package (ESP) included:

  1. Information, education and counseling on sexual and reproductive health.
  2. Information, counseling and appropriate referral for sexual violence/abuse and mental health problems.
  3. Contraceptive information and counseling, and provision of methods including oral contraceptive pills, emergency contraception, injectables and condoms.
  4. Pregnancy testing and counseling, antenatal and post-natal care.
  5. Pre- and post-abortion counseling and referral.
  6. STI information, including information on dual protection strategies.
  7. Syndromic management of STIs.
  8. HIV information, pre- and post-test counseling and appropriate referral for voluntary testing if services are not available.

Data analysis

Original data were captured in an Access database programmed to calculate, as standard percentage scores, the extent to which each of the 10 standards and 41 criteria had been met. The scores of responses from the different tools were aggregated according to standard and criterion. Each question was weighted on a scale of 1–3 as follows:

  • 1—important, questions that referred to the availability of documents and signage;

  • 2—very important, for proper management planning, assessing and identifying the needs of staff, perceptions of team involvement, good supervision and training of staff;

  • 3—critical, adolescents' perceptions and opinions of care received, activities to improve accessibility, remove barriers and promote services, activities to assess and identify adolescent health needs, quality improvement methodologies implemented, good clinical practice and adherence to adolescents' rights.

Although each standard had a different number of corresponding criteria and questions, all the standards were weighted equally in the data analysis. Clinics were awarded a Gold Star if they achieved an overall clinic score of ≥90%, a Silver Star for a score between 60 and 89% and a Bronze Star for a score between 30 and 59%. Calculated scores were then exported from Access to the Statistical Package for the Social Sciences for further statistical analysis. An independent-sample t-test was used to determine whether the differences between the control and research sites were significant, as the Shapiro–Wilk test (appropriate for samples of fewer than 50 cases) indicated that the scores were normally distributed.
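To make the scoring and comparison steps concrete, the sketch below illustrates the logic in Python. It is a minimal illustration under stated assumptions, not the study's actual Access/SPSS implementation: the record structure, field order and function names are hypothetical, and only the rules given above (question weights of 1–3, equally weighted standards, the Gold/Silver/Bronze thresholds and the Shapiro–Wilk check followed by an independent-sample t-test) are taken from the text.

```python
# Minimal sketch of the scoring and comparison logic described above.
# The data layout and names are hypothetical; the study used an Access
# database and SPSS rather than Python.
from collections import defaultdict
from scipy import stats

def clinic_scores(question_records):
    """question_records: iterable of (standard_no, weight, score) tuples,
    where weight is 1-3 and score is the fraction of points achieved (0-1).
    Returns (overall clinic score %, {standard_no: standard score %})."""
    weighted_sum = defaultdict(float)
    weight_total = defaultdict(float)
    for standard_no, weight, score in question_records:
        weighted_sum[standard_no] += weight * score
        weight_total[standard_no] += weight
    standard_pct = {s: 100 * weighted_sum[s] / weight_total[s] for s in weighted_sum}
    # Every standard contributes equally to the overall clinic score.
    overall = sum(standard_pct.values()) / len(standard_pct)
    return overall, standard_pct

def star_level(overall):
    """Gold >= 90%, Silver 60-89%, Bronze 30-59%, otherwise no star."""
    if overall >= 90:
        return "Gold"
    if overall >= 60:
        return "Silver"
    if overall >= 30:
        return "Bronze"
    return None

def compare_groups(nafci, control, alpha=0.05):
    """Shapiro-Wilk normality check, then an independent-sample t-test
    on the overall clinic scores of the two groups."""
    for group in (nafci, control):
        if stats.shapiro(group).pvalue < alpha:
            raise ValueError("scores not normally distributed; a non-parametric test would be needed")
    return stats.ttest_ind(nafci, control)
```

Under these thresholds, a group average of 79.9% (the combined NAFCI score reported below) falls in the Silver Star band, consistent with the Results section.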

Results

Comparison of overall and standard scores for NAFCI and control clinics

Taking the NAFCI clinics as a group, the combined average overall clinic score of all the NAFCI clinics, 79.9%, was significantly higher (P = 0.005) than the average overall score of the control group clinics, 60.9%. Although both groups achieved Silver Star status, NAFCI clinics performed better on each of the standards, and the difference in standard scores between the NAFCI and control clinics was significant for all standards except Standards 5 and 10 (Table 2). The highest scoring NAFCI clinic achieved a Gold Star at 92.5%, and the highest scoring control clinic achieved a Silver Star at 74.7%. The lowest score for an NAFCI clinic was 63.8% (Silver Star), whereas the lowest scoring control clinic achieved a Bronze Star at 42.7% (Table 3).

Table 2

NAFCI–control clinics comparison of overall average clinic and standard scores

Standard | NAFCI clinics, N = 11 (%) | Control clinics, N = 11 (%) | % Difference (NAFCI − control) | P-value
Standard 1. Management systems are in place to support the effective provision of the Essential Service Package (ESP) for adolescent-friendly services | 78.1 | 55.5 | 22.6 | 0.001
Standard 2. The clinic has policies and processes that support the rights of adolescents | 91.0 | 69.0 | 22.1 | 0.001
Standard 3. Appropriate adolescent health services are available and accessible | 90.0 | 73.1 | 16.6 | 0.005
Standard 4. The clinic has a physical environment conducive to the provision of adolescent-friendly health services | 85.9 | 70.9 | 15.0 | 0.006
Standard 5. The clinic has the drugs, supplies and equipment necessary to provide the Essential Service Package for adolescent-friendly health care | 81.5 | 75.1 | 6.4 | 0.19
Standard 6. Information, education and counseling consistent with the Essential Service Package are provided | 77.3 | 36.9 | 40.4 | 0.004
Standard 7. Systems are in place to train staff to provide effective adolescent-friendly services | 63.8 | 36.0 | 27.8 | 0.01
Standard 8. Adolescents receive an accurate psychosocial and physical assessment | 73.6 | 53.4 | 20.3 | 0.004
Standard 9. Adolescents receive individualized care on the basis of standard case management guidelines/protocols | 77.4 | 63.5 | 13.9 | 0.004
Standard 10. The clinic provides continuity of care for adolescents | 80.6 | 76.0 | 4.5 | 0.32
Table 3

NAFCI and control clinics individual clinic scores

Province | NAFCI clinic | NAFCI clinic score (%) | Control clinic | Control clinic score (%)
Free State | Chief Albert Luthuli | 76.0 | Hoopstad | 66.8
Gauteng | Empilisweni CHC | 82.0 | Eric Ndaleni | 63.9
Kwa Zulu Natal | Gamalakhe | 92.5 | Kwa Jali | 56.9
Kwa Zulu Natal | Kwa Mbonambi | 77.9 | Dokodweni | 59.8
North West | Makapanstad | 86.5 | Pudimong | 55.2
Eastern Cape | Mvubukazi | 63.8 | Erholweni | 42.7
Eastern Cape | Kwa Nomzamo | 75.7 | New Brighton | 74.7
Limpopo | Nkowankowa | 90.4 | Makushane | 56.1
Mpumalanga | Vlak 2 CHC | 82.2 | Vaalbank | 49.5
Western Cape | Site B Youth Clinic | 76.8 | Bloekombos | 71.6
Northern Cape | Steynville/Hopetown | 75.0 | Victoria West | 73.1

Comparison of criteria scores for NAFCI and control clinics

The NAFCI clinics performed better than the control clinics on most of the criteria and significantly better on criteria specific to the provision of adolescent-friendly services rather than the more generic ‘good quality of care’ criteria. Significant differences were found in the criteria that related to determining adolescent health needs in a community, knowledge of adolescent rights, availability of adolescent-specific information and non-judgmental attitudes of staff (Table 4).

Table 4

NAFCI clinics/control group comparison of criteria scores

NAFCI criterion (grouped by standard) | NAFCI (%) | Control (%) | P-value

Standard 1. Management systems are in place to support the effective provision of the Essential Service Package (ESP) for adolescent-friendly services
Data are collected to determine the adolescent health needs in the community | 72.9 | 21.4 | <0.001
The clinic has a service plan that addresses the need for adolescent health services and a process to implement the plan | 68.2 | 19.7 | 0.01
Staff receive support and supervision on an on-going basis | 78.2 | 83.3 | 0.20
The clinic has a regular process for improving the quality of adolescent services | 83.6 | 49.6 | <0.001
The clinic has a system to assure adolescent and community participation in the planning and provision of care | 65.9 | 59.1 | 0.60
The clinic has an adequate client record system | 54.5 | 32.6 | 0.15

Standard 2. The clinic has policies and processes that support the rights of adolescents
Clinic staff know the sexual and reproductive health (SRH) rights of adolescents | 89.9 | 80.0 | 0.006
The clinic pro-actively promotes the SRH rights and responsibilities of adolescents | 94.6 | 24.4 | <0.001
Clinic staff provide services taking into account the rights of adolescents | 90.5 | 88.0 | 0.84
Providers and staff maintain confidentiality of adolescent clients | 89.2 | 83.6 | 0.37

Standard 3. Appropriate adolescent health services are available and accessible
The scheduling, location and scope of adolescent services provided by the clinic are clearly visible and communicated to the community | 76.5 | 39.8 | 0.007
The clinic actively promotes adolescent health services within the community | 91.4 | 35.3 | <0.001
Services are provided within time frames convenient for adolescents in the community | 85.2 | 81.1 | 0.38
All staff, including reception, clerical and housekeeping staff, are able to assist youth to access care in an informed, non-judgmental manner | 93.8 | 87.0 | 0.06
Syndromic management of STIs is provided | 100 | 100 | 1
A high quality voluntary counseling and testing service is provided | 90.9 | 81.8 | 0.54
An HIV programme is provided | 86.4 | 77.3 | 0.40
Contraceptive information, counseling and methods are provided | 100 | 100 | 1
Services are provided for pregnancy | 90.9 | 83.6 | 0.17
Information, counseling and appropriate referral for violence/abuse and mental health problems are provided | 81.8 | 45.5 | 0.08

Standard 4. The clinic has a physical environment conducive to the provision of adolescent-friendly health services
Consultations with clients occur in a place that assures privacy | 97.7 | 79.9 | 0.02
The clinic is clean and comfortable for adolescents | 89.4 | 74.5 | 0.01
Appropriate infection control procedures are practiced | 70.5 | 58.1 | 0.02

Standard 5. The clinic has the drugs, supplies and equipment necessary to provide the Essential Service Package for adolescent-friendly health care
Necessary drugs and contraceptives are regularly available for ESP case management | 84.4 | 83.4 | 0.95
Supplies are available for ESP case management | 79.4 | 66.1 | 0.03
Working equipment is available for the provision of the ESP | 80.8 | 75.8 | 0.44

Standard 6. Information, education and counseling consistent with the Essential Service Package are provided
The clinic has accurate, easily understandable information and education materials appropriate for adolescents | 79.3 | 30.7 | <0.001
Health-care workers provide information and education activities at the clinic and in the community | 60.0 | 38.7 | 0.004
Adolescents are involved in the provision of IEC activities at the clinic and in the community | 95.0 | 41.1 | 0.001

Standard 7. Systems are in place to train staff to provide effective adolescent-friendly services
The clinic has a training plan to meet the needs of its staff to provide the ESP, using the standard case management guidelines | 59.1 | 31.8 | 0.22
The staff is trained in providing the ESP, using the standard case management guidelines | 69.5 | 58.3 | 0.21
Staff are trained and developed to assist and serve youth in a non-judgmental manner | 62.8 | 17.9 | <0.001

Standard 8. Adolescents receive an accurate psychosocial and physical assessment
Health-care providers take an appropriate history | 66.0 | 57.7 | 0.01
Health-care providers perform appropriate physical examination and investigations according to the case management guidelines/protocols | 71.0 | 52.3 | 0.03
Assessments are undertaken with consideration being given to the comfort, dignity and modesty of the adolescent | 95.8 | 94.8 | 0.67
Health-care providers ensure that no opportunity is missed to comprehensively assess adolescents' health needs and risks | 61.8 | 21.0 | 0.006

Standard 9. Adolescents receive individualized care based on standard case management guidelines/protocols
Case management guidelines for the ESP are available and used | 71.2 | 56.0 | 0.008
Adolescents are encouraged to express their concerns, ask questions and discuss treatment options | 65.7 | 48.9 | 0.07
Health-care providers use effective counseling skills based on the ESP | 95.4 | 85.6 | 0.04

Standard 10. The clinic provides continuity of care for adolescents
Adolescents are given clear and understandable follow-up information | 77.2 | 73.9 | 0.70
An adequate referral system for adolescent health care exists | 83.9 | 78.2 | 0.22

Duration of NAFCI implementation

The duration of NAFCI implementation ranged from 0 to 22 months. Two NAFCI clinics were considered to have 0 months of implementation because, after they were selected for the program, no facilitator was available to work with them to implement the standards (one element of the NAFCI program is a trained facilitator who makes weekly support visits to the clinic).

Correlation analysis indicated a moderately high correlation (r = 0.69) between the number of months of NAFCI implementation and the clinic score. This suggests that clinic scores increase with the implementation period: the longer NAFCI was implemented at a clinic, the higher the score and the more likely that the clinic would be accredited as an ‘adolescent-friendly’ clinic.
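The correlation reported here corresponds to a simple Pearson calculation over paired per-clinic values of implementation duration and overall clinic score. The sketch below shows that computation under the assumption that the values are available as two parallel lists; the study data are not reproduced here, and the function name is illustrative only.

```python
# Sketch of the duration-vs-score analysis; the per-clinic values are not
# reproduced here, so the two lists are supplied by the caller.
from scipy import stats

def duration_score_correlation(months_implemented, overall_scores):
    """Pearson correlation between months of NAFCI implementation and the
    overall clinic score for each clinic; returns (r, p-value)."""
    return stats.pearsonr(months_implemented, overall_scores)
```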

NAFCI Clinic pre- and post-intervention scores

Baseline assessments were conducted by the NAFCI facilitator and clinic staff in all the NAFCI clinics before the implementation of the program; however, the data were not systematically maintained. The four clinics where baseline data were available for comparison with external assessment scores showed a marked difference in score: Chief Albert Luthuli, 21 vs. 76.0%; Makapanstad, 36 vs. 86.5%; Kwa Nomzamo, 29 vs. 75.7% and Site B Youth Clinic, 19 vs. 76.8%. However, owing to the small sample size, significance tests were not performed.

Clinic size

Clinics were classified as small or large according to the number of professional nurses employed by the clinic. Clinics with fewer than six professional nurses were classified as small, whereas clinics with more than 10 professional nurses were classified as large. Medium clinics were those with 6–10 professional nurses; however, there was only one such clinic, with six professional nurses, and it was accordingly classified as a small clinic. The control clinics were intended to be similar in size and staffing to the NAFCI clinics, but this was not the case in a number of instances: six NAFCI clinics were classified as large, whereas only two of the control clinics were large.

There was a statistically significant difference (P = 0.03) between the average overall clinic scores of the small NAFCI clinics (75.2%) and the large NAFCI clinics (85.5%). The larger NAFCI clinics performed better than the small ones on all standards except Standard 4 (84.8 vs. 86.8%), and significantly better on Standards 3, 8, 9 and 10. Clinic size was also moderately correlated with the overall clinic score; larger clinics were more likely to receive high scores.

Comparison of small NAFCI clinics with small control clinics

Most of the control clinics were classified as small clinics. As the smaller NAFCI clinics performed significantly worse than the large NAFCI clinics, a further analysis compared the small NAFCI clinics with the small control group clinics to establish the effect of this variable. Small NAFCI clinics had a significantly higher overall average clinic score than small control group clinics, and scored significantly higher on Standards 1, 2, 4, 6 and 8.

Discussion

The study findings suggest that setting and implementing standards and criteria does improve the quality of adolescent services in clinics. The use of a quality approach was felt to contribute to these results, as it provided a systematic way for the staff to achieve their goals. For instance, the results show that NAFCI clinics were more likely to conduct a community assessment and to develop a service plan based on this assessment. This concurs with the experiences of other quality improvement programs [15]. The NAFCI facilitator was also felt to be an important factor in the success of the NAFCI clinics; the facilitators developed a team at each facility focused on the task and built the capacity of staff in quality methodology to address service delivery issues important to young people as well as generic quality issues such as privacy, cleanliness and infection control. Study results showed that NAFCI clinics scored higher on these criteria, particularly those specific to adolescent-friendly services. The results also show that compliance with the adolescent-friendly standards improved over time and that the performance of NAFCI clinics with no facilitator support (0 months of implementation) was similar to that of the control clinics. It is therefore important to ensure that clinics are supported by a trained facilitator over a sustained period. Providing support for quality improvement is consistent with other study findings, which show that one of the key determinants of unsuccessful continuous quality improvement efforts in health-care organizations is a lack of support to employees [21].

The findings indicate that the control clinics met some of the standards and criteria without implementing the program and would have attained a level of accreditation: the highest scoring control clinic attained a Silver Star level and the lowest a Bronze Star level. This was felt to be because the control clinics met the ‘generic’ quality of care indicators that were not restricted to adolescent-friendly services, e.g. infection control or having stock available. It could also be argued that better performing clinics had been selected to implement NAFCI; however, none of the NAFCI clinics had any specific policies or services for adolescents, nor had their staff received adolescent-friendly services training prior to NAFCI. Where baseline data were available, large differences were found between pre- and post-intervention scores. Thus, the fact that NAFCI clinics performed significantly better on the criteria specific to adolescent-friendliness was most likely a result of the program.

The finding that the larger NAFCI clinics performed better than the smaller ones came as a surprise to the program managers, as the NAFCI facilitators had reported that it was easier to work with the small clinics. However, this finding could be attributed to the larger clinics having more staff available to devote time to identifying gaps in quality and leading the improvement efforts. This variable needs to be investigated further.

This study focused on assessing whether standards established to provide adolescent-friendly services had been met. Although improving the quality of adolescent services has not been conclusively proven to improve utilization rates, a review of studies that measured the impact of adolescent reproductive health programs suggests that although the evidence does not point to a single most effective intervention, a combination of interventions is most likely to be effective [22]. An important element of adolescent-friendly clinic programs, and a determinant of adolescent health-seeking behavior, is thought to be community support and acceptance of the intervention [7]. The NAFCI standards and approach define quality not only in terms of clinic-based improvements such as clinic management, the availability of drugs, and staff competency and attitudes toward caring for youth, but also in terms of the involvement of youth and the community as members of the quality improvement team, which is central to the intervention. Additional studies are being conducted to evaluate whether this approach increases the utilization of services and decreases the incidence of HIV/AIDS, STIs and teenage pregnancy.

The data suggest that the Gold, Silver and Bronze levels for clinic accreditation could be more rigorously designed. Given that the NAFCI clinics performed consistently better on all the standards and criteria, and in many instances significantly better than the control clinics, it seems unreasonable that the combined overall scores of the two groups should be awarded the same ‘adolescent-friendliness’ status of Silver Star.

Limitations

Selection bias cannot be ruled out because non-probability sampling methods were used to select the NAFCI clinics. The statistical power is also limited because of the small number of clinics and respondents included in the study. The ‘Hawthorne effect’ (whatever is observed, changes) cannot be ruled out for the client–provider observations, as some providers may not have behaved as they would have in the absence of an observer. In addition, some clinics did not have sufficient adolescent clients attending on the day of the assessment; therefore, actual client–provider interactions could not be observed and simulations were performed instead.

Conclusion

Providing health-care services that meet the needs of adolescents is crucial to the fight against HIV/AIDS, STIs and teenage pregnancy. The adolescent-friendly clinic initiative was created to address these issues in South Africa. Developing adolescent-friendly standards does appear to improve the quality of care provided to young people at public health clinics. To achieve the best results, clinics need to be supported over a period of time by a facilitator trained in quality improvement methodologies.

Acknowledgements

The authors would like to express their appreciation to Winnie Moleko, Marriam Mangochi, Nokuthula Mfaku and Prof. Helen Rees of the Reproductive Health Research Unit, and Prof. Gunilla Lindmark of Uppsala University. Financial support was provided by the Henry J. Kaiser Foundation and the South African Government.

References
