
Technical quality control practices in mammography screening programs in 22 countries

R. EDWARD HENDRICK, CARRIE KLABUNDE, ANDRE GRIVEGNEE, GONZALO POU, RACHEL BALLARD-BARBASH
pp. 219–226. First published online: 1 June 2002

Abstract

Objective. To assess current technical quality control (QC) practices within breast cancer screening or surveillance programs internationally.

Materials and methods. The International Breast Cancer Screening Network (IBSN) conducted an extensive survey of quality assurance (QA) activities in developed countries known to have population-based breast cancer screening or surveillance programs in place. Twenty-three countries were sent questionnaires that included items about QA and QC requirements at screening sites, the minimum frequencies of QC test performance, and the personnel responsible for performing QC tests.

Results. All 23 countries in the IBSN completed general information on their QA practices. Twenty-two countries responded with complete details on their technical QC practices. The responses indicated a pattern of consistently high-quality control practices among population-based breast cancer screening and surveillance programs. Most programs performed the great majority of QC tests. Variations were observed in the performance frequencies of QC tests and in the personnel responsible for performing QC tests.

Conclusion. QC practices among population-based breast cancer screening and surveillance programs are highly evolved, with the great majority of responding countries following prescribed QC protocols. Further research is needed on appropriate performance frequencies for mammography QC tests.

  • mammography screening
  • quality assurance
  • quality control practices

Quality assurance in mammography has received increasing attention as an essential element of a successful breast cancer screening program. The European Commission developed and published the European Protocol for the Quality Control of the Technical Aspects of Mammography Screening and the European Guidelines for Quality Assurance in Mammography Screening in 1993 [1, 2]. The American College of Radiology developed and published Mammography Quality Control Manuals in 1990 [3] and the Breast Imaging Reporting and Data System (BIRADS) in 1993 [4]. Largely due to the influence of the early screening programs implemented in Europe and the documents listed above, breast cancer screening programs in developed countries have been implemented with extensive quality assurance (QA) and quality control (QC) components.

The International Breast Cancer Screening Network (IBSN) is a consortium of representatives from 23 countries with population-based breast cancer screening or surveillance programs. Administrative support for the IBSN is provided by the United States National Cancer Institute. The IBSN first met in 1988, with 11 countries participating. The sixth meeting of the IBSN was in November 1997, with 17 countries participating. At that meeting, a decision was made to learn more about the QA practices within known screening and surveillance programs. A QA Working Group was appointed, and the Working Group began developing a survey instrument designed to characterize the policies and procedures that IBSN countries have in place to assure high-quality screening mammography.

This paper focuses on international practices related to the technical aspects of quality assurance within those programs, which we refer to as mammography ‘QC’ [5]. These consist primarily of regular equipment performance measurements and regular steps to ensure quality within mammography programs. The goal of this paper is to provide an accurate assessment of QC practices in population-based mammography screening or surveillance programs [6]. Separate papers will be forthcoming on the organization of QA activities, QA in follow-up and treatment, and QA for mammography screening data collection systems [7].

Methods

An 11-member working group of IBSN representatives from eight countries began development of a quality assurance survey in October 1997. The working group constructed a comprehensive questionnaire designed to assess the size and structure of organized breast cancer screening programs, along with the organization of QA, QC, and data collection systems [7]. The survey asked detailed questions about the QC practices in place at screening sites, their frequencies of performance, and the personnel responsible for performing each of the QC tests.

Questionnaire drafts were reviewed by experts in screening mammography, QA and QC, data collection systems, and questionnaire design. On May 26, 1998, the finalized survey instrument was mailed to IBSN program representatives from 23 countries. Countries with both organized screening and opportunistic screening (screening occurring outside the organized program) were asked to respond for the quality assurance measures within their organized program. Countries with multiple organized screening programs were asked to provide a summary response that reflected the majority of those programs. The USA response is based on the QA activities within the National Breast Cancer Consortium, an NCI-sponsored mammography registry and surveillance program focusing on community-based screening mammography in eight states. The QC responses, however, reflect those required of all mammography sites in the USA under Mammography Quality Standards Act (MQSA) Interim Rules [8].

Responses were received between July and October 1998 from 22 countries, and an additional response was received in early 1999. Written survey responses were compiled and analyzed by the QA Working Group in October 1998. Completeness of survey responses was assessed, and missing or ambiguous data elements were resubmitted to the country’s representative for review and/or completion. Final responses were compiled between February and July 1999.

Twenty-two countries completed detailed responses about the QC practices within their organized screening or surveillance programs. One country (Ireland) did not complete the detailed QC items because, at the time of the survey, it was in transition from a pilot screening program to a national screening program whose QC practices had not yet been fully defined. The survey response from another country (Japan) was based on a pilot program in the Miyagi prefecture; a national screening program was being planned in Japan. Survey responses from a third country (Germany) represent a population-based screening program that was being planned for implementation in 1999.

Results

Respondents to the quality assurance survey operated 21 population-based screening programs (eight organized on a national basis, 13 organized on a sub-national or provincial basis) and one surveillance program (Table 1). The oldest of these population-based screening programs, in Finland and Sweden, began in 1986. Five of the respondent programs (Germany, Greece, Hungary, Japan, and Uruguay) were self-identified as pilot programs. In 2000, Japan’s program became a nationally based screening program with required QA and QC.

Table 1:

Summary of screening or surveillance programs in 22 countries (1998 data)

| Country | Program type | Year established | No. of programs | No. of facilities | No. of units | % target pop. served | QA organisation | QA required by law? (since) | Follow established QC guidelines? | QC guidelines followed | QC required by |
|---|---|---|---|---|---|---|---|---|---|---|---|
| Australia | National screen | 1991 | 1 | 35 | Unknown | 54 | National | No, voluntary | Yes | National Accreditation Requirements | Screening Program |
| Belgium | State screen | 1992 | 2 | 35 | 40 | Unknown | Neither | No, voluntary | Yes | European Guidelines for QA in Screening Mammography | Screening Program |
| Canada | State screen | 1988 | 10 | 163 | 179 | 65 | Regional | No, voluntary | Yes | Canadian Association of Radiologists | Screening Program |
| Denmark | State screen | 1991 | 1 | 1 | 1 | 18 | Regional | No, voluntary | Yes | National Board of Radiation Protection | Law |
| Finland | National screen | 1986 | 1 | 11 | 97 | 100 | National | Yes, national ('57) | Yes | Finnish Centre for Radiation Safety, European Guidelines | Law |
| France | State screen | 1989 | 26 | 900 | 1000 | 25 | National | Yes, national ('99) | Yes | European Guidelines for QA in Screening Mammography | Law |
| Germany | State screen¹,² | 1999 | 3 | 8 | 8 | 2 | Both | No, voluntary | Yes | European Guidelines for QA in Screening Mammography | Screening Program |
| Greece | State screen¹ | 1989 | 2 | 6 | 6 | 25 | Regional | No, voluntary | Yes | European Guidelines for QA in Screening Mammography | Screening Program |
| Hungary | State screen¹ | 1991 | 1 | 6 | 6 | Unknown | Regional | Yes, state ('99) | Yes | Technical Standards | Practice Standard |
| Iceland | National screen | 1987 | 1 | 3 | 5 | 100 | National | No, voluntary | Yes | Nordic Guidelines (Scandinavian) | Law |
| Ireland | National screen | 1999 | 1 | 2 | 6 | 50 | Both | No, voluntary | Yes | National (based on European and UK Guidelines) | Law |
| Israel | National screen | 1997 | 1 | 38 | 40 | 100 | National | No, voluntary | Yes | National adaptation of European Guidelines | Law |
| Italy | State screen | 1990 | 22 | 26 | 43 | 10 | Regional | No, voluntary | Yes | European Guidelines for QA in Screening Mammography | Screening Program |
| Japan | State screen¹ | 1989 | 1 | 2 | 3 | 30 | Regional | No, voluntary³ | No |  | Voluntary³ |
| Luxembourg | National screen | 1992 | 1 | 10 | 10 | 98 | National | No, voluntary³ | Yes | European Guidelines for QA in Screening Mammography | Screening Program |
| Netherlands | National screen | 1989 | 1 | 55 | 62 | 100 | National | Yes, national ('97) | Yes | Dutch Technical Protocol for QC | Screening Program |
| Norway | State screen | 1995 | 1 | 7 | 13 | 40 | National | No, voluntary | Yes | European Guidelines for QA in Screening Mammography | Screening Program |
| Portugal | State screen | 1990 | 1 | 7 | 7 | 20 | Regional | No, voluntary | Yes | QA in Mammography Screening - 1998 | Screening Program |
| Spain | State screen | 1990 | 1 | 2 | 2 | 60 | Regional | Yes, national ('96) | Yes | European Guidelines for QA in Screening Mammography | Law |
| Sweden | State screen | 1986 | 26 | 26 | 60 | 100 | National | Yes, national ('90) | Yes | European Guidelines for QA in Screening Mammography | Law |
| UK | National screen | 1988 | 1 | 94 | 315 | 100 | Both | No, voluntary | Yes | Radiographic QC Manual for Mammography | Screening Program |
| Uruguay | State screen¹ | 1996 | 1 | 1 | 2 | 20 | Neither | No, voluntary | Yes | ACR Mammography QC Manual | Screening Program |
| USA | Surveillance | 1994 | 8 | 128 | 166 | 3 | Both | Yes, national ('94) | Yes | ACR Mammography QC Manual | Law |
  • ¹ Pilot screening programs at the time of the survey.

  • ² Program in implementation at the time of the survey.

  • ³ A national law on mammography QC was implemented in the year 2000.

Responding programs varied widely in size, from one unit at one facility (Denmark) to an estimated 1000 units at 900 facilities (France). Coverage of the targeted screening population also varied widely: screening programs in Finland, Iceland, Israel, Luxembourg, the Netherlands, Sweden, and the UK reached 80–100% of the targeted screening population, whereas countries such as Denmark, Germany, Italy, Portugal, Uruguay, and the USA described programs covering 20% or less of the target population. In some sub-nationally organized programs, however, the targeted population was a single region rather than the entire country. Sub-nationally organized countries with a single program (such as Spain) targeted only a fraction of the national population of screening-eligible women, whereas those with multiple programs (such as Canada, France, and Sweden) targeted larger portions of the national population.

The organization of QA activities did not necessarily match the organization of the screening program itself (Table 1). France, Norway, Spain, and Sweden had sub-nationally organized screening programs, but had national organization of QA activities. Nine of the responding countries had a national organization for QA activities, and eight countries had a sub-national organization for QA activities. Four countries had both sub-national and national QA organizations, and two countries had neither.

Equipment QC was required by law in nine countries and by practice standards or by the screening program in 13 countries (Table 1). All but one of the countries indicated that established QC guidelines were followed, and that country began requiring QC in 2000. Nine countries reported following the QC components of the European Guidelines [2], whereas two countries reported following the American College of Radiology’s QC Manual [3]. Twelve countries had developed country-specific QC guidelines.

Tables 2 and 3 list the detailed QC tests performed in different countries. Table 2 gives the minimum performance frequency reported for each QC test within the organized screening programs of the 22 countries that provided detailed responses about their QC practices. Most countries reported performing most QC tests. All countries reported performing processor sensitometry, screen-film contact, beam collimation assessment, and automatic exposure control (AEC) tests. For each of the following tests, only one population-based screening program reported not performing it on a regular basis: phantom image, cassette cleaning, kVp accuracy/reproducibility, and beam quality measurement. Two countries reported not routinely performing developer temperature, compression force, focal spot size/spatial resolution, and beam entrance exposure tests. Several countries indicated that they did not perform the darkroom fog test (three countries), average glandular dose measurement (four), film viewbox illumination measurement (four), mammography-unit safety test (four), and repeat analysis (eight). Only half of the 22 countries surveyed regularly performed the fixer retention test.

Table 2:

Minimum quality control test frequencies for population-based screening programs in 22 countries

| Country | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 | 11 | 12 | 13 | 14 | 15 | 16 | 17 | 18 | 19 | 20 | 21 | 22 | 23 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Australia | D | D | W | B | Q | W | D | B | M | M | B | M | Q | Q | D | Q | Q | B | Q | Q | A | M | M |
| Belgium | D | B | No | A | No | D | W | B | A | No | A | No | A | W | B | B | B | A | B | W | No | D | A |
| Canada | D | D | D | B | Q | D | D | B | B | Q | W | B | B | B | B | B | B | B | B | B | B | M | No |
| Denmark | D | D | W | A | No | D | M | B | A | No | A | A | A | A | A | A | A | A | A | No | A | D | A |
| Finland | D | D | D | A | W | W | W | A | B | A | A | A | A | A | A | A | A | A | B | A | A | A | A |
| France | D | No | B | B | No | W | W | B | A | A | B | B | A | A | B | A | Q | A | A | A | No | D | B |
| Germany | D | D | A | A | No | D | No | W | B | No | A | No | B | W | B | B | W | A | B | A | A | W | A |
| Greece | D | W | D | B | No | B | D | B | B | No | B | M | B | B | B | B | B | B | B | B | No | No | B |
| Hungary | D | D | W | Q | M | D | W | W | W | No | Q | A | A | A | A | A | A | A | A | A | A | A | Q |
| Iceland | D | D | No | NA | No | No | M | A | A | No | A | No | A | No | A | A | A | No | A | A | M | No | A |
| Israel | D | D | D | Q | D | W | W | B | B | M | W | B | B | B | B | B | B | Q | B | B | B | W | B |
| Italy | D | D | M | B | B | D | W | B | B | B | B | No | B | A | B | B | B | B | B | B | B | D | B |
| Japan | D | D | D | B | A | D | W | A | No | D | No | D | A | No | A | A | M | A | A | A | A | A | No |
| Luxembourg | D | D | A | B | No | D | D | A | B | B | A | No | A | W | B | A | B | A | B | A | No | W | B |
| Netherlands | D | D | No | A | No | D | W | A | B | No | No | No | A | W | B | A | B | B | B | B | B | W | A |
| Norway | D | Q | –¹ | A | No | B | W | B | B | W | B | Q | A | A | A | A | D | B | A | A | A | A | B |
| Portugal | D | D | D | Q | Q | D | W | Q | Q | Q | Q | Q | Q | Q | Q | Q | Q | Q | Q | Q | Q | Q | Q |
| Spain | W | W | D | B | No | D | W | B | A | Q | Q | A | A | Q | Q | B | Q | A | A | A | A | D | A |
| Sweden | D | D | D | A | Q | M | W | A | A | A | A | D | B | B | B | B | B | A | B | B | B | M | A |
| UK | D | D | –¹ | No | No | D | D | M | No | M | No | D | B | B | B | B | B | B | B | B | B | A | Q |
| Uruguay | D | D | D | No | M | W | W | W | W | No | Q | Q | Q | Q | –¹ | No | Q | No | –¹ | No | M | M | No |
| USA | D | No | D | B | Q | M² | W | B | B | Q | W | M/A | A | A | A | A | A | A | No³ | A | A | A | No³ |
  • 1, processor sensitometry; 2, developer temperature; 3, darkroom cleanliness; 4, darkroom fog; 5, fixer retention; 6, phantom image; 7, cassette cleaning; 8, screen-film contact; 9, compression force; 10, repeat analysis; 11, film viewing conditions; 12, mammography unit safety; 13, unit collimation/beam alignment; 14, focal spot size/spatial resolution; 15, kVp accuracy and reproducibility; 16, beam quality/HVL measurement; 17, automatic exposure control; 18, cassette screen speed uniformity; 19, X-ray output; 20, beam entrance exposure; 21, average glandular dose; 22, artefact evaluation; 23, film viewbox illumination.

  • D, daily; W, weekly; M, monthly; Q, quarterly; B, biannually; A, annually; NA, not applicable due to use of daylight processing only; HVL, half-value layer; kVp, peak kilovoltage.

  • ¹ No response given.

  • ² Required weekly after April 28, 1999 under MQSA Final Rules [9].

  • ³ Required after April 28, 1999 under MQSA Final Rules [9].

Table 3:

Personnel performing QC tests in population-based screening programs in 22 countries

| Country | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 | 11 | 12 | 13 | 14 | 15 | 16 | 17 | 18 | 19 | 20 | 21 | 22 | 23 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Australia | RT,MP | RT,MP | RT | –¹ | SP | RT | RT | –¹ | –¹ | –¹ | RT | RT | SP | SP | RT | SP | RT | RT | SP | SP | –¹ | MP | RT |
| Belgium | RT,MP | MP | RT,MP | MP | –¹ | RT,MP | RT | MP | MP | –¹ | MP | –¹ | MP | MP | MP | MP | MP | MP | MP | MP | –¹ | RT,MP | MP |
| Canada | RT | RT | RT | RT | RT | RT | RT | RT | RT | RT | RT | RT,MP,SP | MP | MP | MP | MP | MP | RT | MP | MP | MP | RT,MP | –¹ |
| Denmark | RT | RT | RT | –¹ | –¹ | RT | RT | RT | RT | –¹ | MP | MP | MP | MP | MP | SP | MP | MP | MP | –¹ | MP | RT,MD | MP |
| Finland | RT | RT | RT | RT | RT | RT | RT | RT | SP | RT | SP | SP | SP | SP | SP | MP | SP | RT | SP | SP | SP | SP | SP |
| France | RT | RT | RT | RT,MP | –¹ | RT,MD | RT | MP | MP | MP | MP | SP | MP | MP | MP | MP | MP | MP | MP | MP | –¹ | RT,MD | MP |
| Germany | RT,MP | RT,MP | RT,MP | MP | –¹ | RT | No | RT | MP | –¹ | MP | –¹ | MP | RT | MP | MP | RT | MP | MP | MP | MP | MP | MP |
| Greece | RT,MP | RT,MP | RT,MP | MP | –¹ | MP | RT | MP | MP | –¹ | MP | SP | MP | MP | MP | MP | MP | MP | MP | MP | –¹ | –¹ | MP |
| Hungary | RT | RT | RT | MP | MP | MP | RT | RT | MP | –¹ | MP | SP | SP | SP | SP | SP | SP | SP | SP | SP | SP | SP | MP |
| Iceland | SP | SP | –¹ | –¹ | –¹ | –¹ | RT | RT | SP | –¹ | SP | –¹ | SP | –¹ | SP | SP | SP | –¹ | SP | SP | RT,MD | –¹ | SP |
| Israel | RT | RT | RT | RT,MP | RT | RT,MP | RT | RT,MP | MP | RT,MP | RT,MP | MP | RT,MP | MP | MP | MP | MP | RT,MP | MP | MP | MP | RT,MP | MP |
| Italy | RT | RT | RT | RT | MP | RT | –¹ | RT | MP | MP | MP | –¹ | MP | MP | MP | MP | MP | MP | MP | MP | MP | MP | MP |
| Japan | RT | RT | RT | RT | RT | RT | RT | RT | –¹ | RT | RT | RT | RT | –¹ | RT | RT | RT | RT | RT | RT | RT | RT | –¹ |
| Luxembourg | RT,MP | RT | MP | MP | –¹ | RT,MP | RT | MP | MP | MP | RT | –¹ | MP | MP | MP | MP | MP | MP | MP | MP | –¹ | MP | MP |
| Netherlands | RT,MP | RT | –¹ | MP | –¹ | RT,MP | RT | MP | MP | –¹ | –¹ | –¹ | MP | MP | MP | MP | MP | MP | MP | MP | MP | MP | MP |
| Norway | RT | RT | RT | RT,MP | –¹ | RT,MP | RT | RT | RT,MP | MD | RT | RT | MP | MP | MP | MP | RT | RT | MP | MP | MP | MP | RT |
| Portugal | RT | RT | RT | MP | MP | RT | RT | MP | MP | MP | MP | MP | MP | MP | MP | MP | MP | MP | MP | MP | MP | MP | MP |
| Spain | RT,MP | RT,MP | RT,MP | MP | –¹ | RT | RT | MP | MP | MP | MD | MP | MP | MP | MP | MP | MP | MP | MP | MP | MP | RT | MD |
| Sweden | RT,MP | RT,MP | RT,MP | MP | SP | RT | RT | MP | MP | MP | MP,MD | RT | MP | MP | MP | MP | MP | MP | MP | MP | MP | MP | MP,MD |
| UK | RT | RT | –¹ | –¹ | –¹ | RT | RT | RT | –¹ | RT | –¹ | RT | MP | MP | MP | MP | MP | MP | MP | MP | MP | RT,MP | RT,MP |
| Uruguay | RT | RT | RT | –¹ | SP | RT,MD | RT | RT,MD | RT,MD,SP | –¹ | SP | SP | SP | SP | SP | No | SP | –¹ | SP | –¹ | RT,SP,MD | RT,SP,MD | –¹ |
| USA | RT | –¹ | RT | RT | RT | RT | RT | RT | RT | RT | RT | RT,MP | MP | MP | MP | MP | MP | MP | –² | MP | MP | MP | –² |
  • 1, processor sensitometry; 2, developer temperature; 3, darkroom cleanliness; 4, darkroom fog; 5, fixer retention; 6, phantom image; 7, cassette cleaning; 8, screen-film contact; 9, compression force; 10, repeat analysis; 11, film viewing conditions; 12, mammography unit safety; 13, unit collimation/beam alignment; 14, focal spot size/spatial resolution; 15, kVp accuracy and reproducibility; 16, beam quality/HVL measurement; 17, automatic exposure control; 18, cassette screen speed uniformity; 19, X-ray output; 20, beam entrance exposure; 21, average glandular dose; 22, artefact evaluation; 23, film viewbox illumination.

  • RT, radiographer/radiologic technologist; MP, medical physicist; MD, physician/radiologist; SP, service person; HVL, half-value layer; kVp, peak kilovoltage.

  • ¹ Test not performed.

  • ² Required after April 28, 1999 under MQSA Final Rules [9].

All screening sites performed processor sensitometry daily, except one, which performed the test weekly. Other QC tests were performed across a wider range of frequencies. For example, the minimum frequency for developer temperature and phantom image quality tests ranged from daily to biannually. The reported minimum frequencies of darkroom cleaning, repeat analysis, mammography unit safety, kVp accuracy/reproducibility, AEC performance, and artefact evaluation ranged from daily to annually, and some countries did not perform these tests routinely.

Table 3 summarizes the survey responses indicating the personnel responsible for performing each QC test in each of the 22 countries. Cassette cleaning was the only test for which all responding countries agreed on the personnel performing it (the radiographer/radiologic technologist, RT). Every other test had a range of personnel responsible for its performance. For example, processor sensitometry was performed by the RT alone in 13 countries, by either the RT or a medical physicist (MP) in eight countries, and by a service person (SP) in one country. Screen-film contact was tested by the RT in 11 countries, by the MP in eight countries, by the RT or MP in one country, and by the RT and interpreting physician (MD) in one country. Focal spot size/spatial resolution testing was performed by the MP in 15 countries, by the SP in four countries, and by the RT in one country. Similarly, average glandular dose was measured by the MP in 12 countries, by an SP in two countries, by an RT alone in one country, by either the RT or MD in one country, and by the RT, SP, or MD in one country.

Discussion

IBSN survey results indicate that all population-based screening or surveillance programs operate under highly evolved technical QC standards. Of the 23 QC tests listed in Tables 2 and 3, all countries reported performing at least 14 of these tests and more than half of the countries reported performing at least 21 of the 23 tests. A surprising result of this survey is that only nine countries had legally required QC standards. QC was required by practice standards or by the screening program in 13 countries. All but one country surveyed had developed country-specific QC guidelines or had adopted existing international QC guidelines, and the remaining country planned to do so in the year 2000.

The survey showed the effect of guidance documents such as the European Guidelines for Quality Assurance in Screening Mammography on mammography QA/QC practices [2]. Nine countries stated that they had adopted the European Guidelines as their QC standards. Other European countries with their own QA/QC standards had participated in the development of the European Guidelines or had adapted their country-specific standards from the European Guidelines.

The European Guidelines take a broad view of QA, including not only technical QC of mammography, but also QA for cytopathology, histopathology, reporting, and data system management [2]. The ACR Mammography QC Manuals [3], on the other hand, focus on technical QC, but do not include QA in non-radiological areas of breast cancer diagnosis. The ACR-BIRADS lexicon provides reporting terminology standards [4].

Another interesting result of the survey was the variation in minimum test frequencies of many QC tests from country to country (Table 2). While the great majority of countries performed processor QC tests daily and cassette cleaning daily to weekly, other QC tests were reported to be performed with a much wider range of minimum frequencies. These tests included: phantom image quality (daily to biannually among 21 countries, not routinely performed in one country), screen-film contact (weekly to annually), fixer retention (daily to annually in half the countries), compression force test (weekly to annually in 20 countries, not routinely performed in two countries), and artefact evaluation (daily to annually in 20 countries, not routinely performed in two countries).

These results suggest that more attention should be paid to standards for the minimum frequencies of required QC tests. Performing QC tests more often than necessary adds cost to mammography screening; performing them too infrequently fails to identify problems in a timely manner. Test frequency should be tied to the likely failure rate of the system component being tested. For example, it is an unwise allocation of personnel time and resources to test focal spot size/system resolution daily or weekly when the focal spot size changes little over the years between tube installation and tube failure. More research on the failure rates of mammography system components, and firmer grounding of specific QC test frequencies in those rates, could lead to a more reasonable allocation of resources at mammography facilities.

With regard to personnel performing QC tests, most countries had RTs performing daily processor tests and MPs performing more technical measurements such as collimation, focal spot, kVp, beam quality, and AEC performance tests. There were some notable exceptions, however. In Japan, RTs performed all QC tests. In Finland, Hungary, Iceland, and Uruguay, service personnel performed most of the technical tests typically performed by an MP in other countries.

A question that was not addressed in this survey was the comprehensiveness of QA and QC guidelines beyond the population-based screening programs in each country. In some countries, such as Finland, Israel, the Netherlands, Sweden, and the UK, the screening program and its QA/QC requirements exist throughout the country. In other countries, such as the USA, the QC standards are national requirements of every mammography facility. In some countries, however, the screening program and its prescribed QA/QC standards apply only to a fraction of the mammography performed in the country. This survey did not determine the QA and QC standards that apply to mammography performed outside the surveyed national or sub-national screening programs.

Conclusion

The great majority of QC tests are performed routinely in the breast cancer screening programs in each of the 22 respondent countries, demonstrating that equipment QC testing is highly evolved among population-based screening and surveillance programs. Test frequencies and personnel performing each QC test are less standardized from country to country, suggesting that some countries might benefit by comparing details of their QC practices and personnel assignments with those in other countries. Further research is needed to provide more evidence-based recommendations of appropriate test frequencies for most mammography QC tests.

Acknowledgements

The following IBSN members and collaborators contributed data to this study: P. Jha, B. Chapple, Australia; A. Grivegnee, Belgium; F. Bouchard, Canada; E. Lynge, Denmark; M. Hakama, Finland; H. Sancho-Garnier, J. Stines, France; L. von Karsa, Germany; I. Garas, A. Linos, E. Riza, Greece; F. Szabo, A. Petranyi, Hungary; B.F. Sigfusson, Iceland; J. Buttimer, Ireland; G. Rennert, Israel; E. Paci, E. Gentile, M. Roselli del Turco, Italy; N. Ohuchi, Japan; A. Scharpantgen, Luxembourg; M. Broeders, R. Holland, J. Hendriks, K. Siekman, J. Fracheboud, H. de Koning, Netherlands; G. Skare, Norway; V. Rodrigues, Portugal; N. Ascunce, Spain; H. Malmquist, L. Nystrom, G. Svane, Sweden; S. Moss, J. Cooke, J. Patnick, UK; G. Pou, Uruguay; R. Ballard-Barbash, S. Taplin, B. Yankaskas, W. Barlow, E. Hendrick, USA. IBSN Quality Assurance Committee Members are: Rachel Ballard-Barbash (NCI, IBSN Chair), Carrie Klabunde (NCI, Study Coordinator), Francoise Bouchard (Canada), Mary Codd (Ireland), Andre Grivegnee (Belgium), Edward Hendrick (USA), Gonzalo Pou (Uruguay), Vitor Rodrigues (Portugal), Helene Sancho-Garnier (France), Astrid Scharpantgen (Luxembourg), and Stephen Taplin (USA). This work was sponsored in part by funding from the National Cancer Institute, Applied Research Branch.

References
