In today’s digital environment, a radiologist only sees images saved and shared to the PACS, so a firm understanding of X-ray reject rates is crucial for high image quality and good workflow.
X-rays were the first medical imaging technology to be invented, and they remain one of the most commonly performed exams worldwide today. The technology has evolved immensely since it was first discovered in 1895, progressing from the original screen film to computed radiography (CR) to today’s digital radiography (DR) units. While the technology has changed, the need for high-quality images at the lowest possible dose to the patient has not; if anything, that need has intensified as medicine shifts toward value-based care. For DR, this means optimizing protocols and techniques through a robust system of reject rate analysis.
Defining Rejects and Reject Analysis
Before deciding whether an X-ray needs to be retaken, a department must first define what constitutes a “rejected” image. According to Ingrid Reiser, Ph.D., DABR, a clinical diagnostic physicist and associate professor of radiology at the University of Chicago, rejects are patient images that are discarded by the technologist without being presented to the radiologist. She shared the definition as part of a presentation on reject rate analysis at the 2018 Radiological Society of North America (RSNA) annual meeting.
In today’s digital environment, the second part of that definition — without being presented to the radiologist — is the key consideration. A radiologist will only see images that are saved and shared to the picture archiving and communication system (PACS), so they likely have no idea how many images are actually acquired during an exam. In this scenario, having a firm understanding of reject rates is crucial for maintaining high image quality and good workflow. Furthermore, Reiser noted, each image not sent to the radiologist’s workstation represents wasted radiation dose to the patient — a cardinal sin in radiology.
Reiser stressed, however, that the goal of reject rate analysis is not to drive the rate to zero. Poor-quality images are an unavoidable part of diagnostic imaging, and an image that is genuinely non-diagnostic should be rejected rather than passed along for interpretation.
Digital Challenges of Reject Analysis
Examining rejects and reject rates tells a story, according to Alisa Walz-Flannigan, Ph.D., diagnostic medical physicist at the Mayo Clinic in Rochester, Minn. Reviewing rejected images (and rejected image data) provides information about image quality standards, inherent imaging problems and modes of failure, as well as how technologists troubleshoot these types of issues.
While technological advances have eliminated many of the challenges of screen-film and computed radiography, digital X-rays come with their own set of difficulties:
- Many hospitals and imaging centers use imaging systems from multiple vendors, and each vendor may collect different data elements using different methods;
- As a result, information retrieval may be cumbersome, and some desired information may not be retrievable at all;
- Reject analysis may require a software add-on at additional cost; or
- The reject analysis software could interfere with clinical operation.
Quality Control for DR
While imaging standards and practices are unique to every radiology department, there are resources available for practitioners seeking guidance. The American Association of Physicists in Medicine (AAPM) Task Group 151 published a report in 2015 on quality control procedures in digital radiography, with the goal of recommending consistency tests for optimum image acquisition. The task group said the tests should be performed by a medical physicist, or a radiology technologist under the supervision of a medical physicist.
One of the main procedures recommended by the task group is defining a fault tree of actions that need to be taken when certain fault conditions are met. The diagram should begin with a particular problem, such as an artifact in a patient image, and run through possible corrective actions based on the source and severity of the artifact.
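Such a fault tree can be thought of as a mapping from a fault condition to a corrective action. The sketch below illustrates the idea in Python; the specific fault categories, severities and actions are hypothetical examples for illustration, not the task group's actual recommendations.

```python
# Minimal sketch of a fault-tree lookup for DR quality control.
# The (problem, severity) keys and corrective actions below are
# hypothetical; a real tree would come from the department's QC program.

FAULT_TREE = {
    ("artifact", "minor"): "Clean detector surface and recheck on next exam",
    ("artifact", "severe"): "Remove unit from service; schedule physicist review",
    ("exposure_error", "minor"): "Verify technique chart and correct factors",
    ("exposure_error", "severe"): "Escalate to medical physicist for calibration check",
}

def corrective_action(problem: str, severity: str) -> str:
    """Look up the recommended action for a given fault condition."""
    return FAULT_TREE.get(
        (problem, severity),
        "Unrecognized fault: escalate to medical physicist",
    )

print(corrective_action("artifact", "severe"))
# Remove unit from service; schedule physicist review
```

A full fault tree would branch further (for example, distinguishing detector artifacts from grid or processing artifacts), but the structure is the same: each observed problem leads deterministically to a documented action.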
The AAPM report also reinforces the importance of performing reject rate analysis: The task group cites a prior study of 18 radiology departments in which 14 percent of patient exposure in projection radiography was attributed to repeated images.
Ultimately, the task group recommends a target reject rate of 8 percent for DR units. A literature review found reject rates for screen-film radiography hovered around 10 percent, with nearly half of all rejects due to exposure errors. The task group arrived at the 8 percent figure for digital on the assumption that exposure errors will be fewer with DR. Ten percent is the recommended threshold for further investigation and possible corrective action.
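The arithmetic behind these thresholds is straightforward: the reject rate is the number of rejected images divided by the total number of images acquired. A minimal sketch, using the 8 percent target and 10 percent investigation threshold from the report (the image counts are invented for illustration):

```python
# Reject rate = rejected images / total images acquired.
# Thresholds from AAPM TG-151: 8% target for DR, 10% triggers investigation.
TARGET = 0.08
INVESTIGATE = 0.10

def reject_rate(rejected: int, total_acquired: int) -> float:
    """Fraction of acquired images that were rejected."""
    if total_acquired == 0:
        raise ValueError("no images acquired")
    return rejected / total_acquired

def assessment(rate: float) -> str:
    """Classify a reject rate against the TG-151 thresholds."""
    if rate >= INVESTIGATE:
        return "investigate: possible corrective action needed"
    if rate > TARGET:
        return "above target: monitor"
    return "within target"

rate = reject_rate(rejected=45, total_acquired=500)  # hypothetical counts
print(f"{rate:.1%} -> {assessment(rate)}")
# 9.0% -> above target: monitor
```

Note that the denominator must be total images acquired, not images sent to PACS; as discussed above, the images that never reach the radiologist are exactly the ones being counted.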
Starting a Reject Rate Analysis Project
While the principles of reject rate analysis may be straightforward, starting a reject rate analysis (RRA) project is a major undertaking. Reiser shared her experience leading an RRA project for digital X-ray at the University of Chicago beginning in 2014.
Reiser acknowledged that prior to 2014, all reject rates were self-reported by technologists. The reasons for this were twofold and largely practical: Reject analysis software was an optional add-on for many of the department’s DR units, and the feature was turned off on systems that did support it. This was done, Reiser said, to avoid storage problems on systems with limited hard drive space.
Quarterly reviews were conducted for each X-ray tech’s workload, and each tech was assigned a quality score for eight randomly chosen exams. (Approximately three exams were chosen per month to ensure a wide range of images were reviewed.) An in-person meeting was conducted with the technologist and the imaging technology coordinator or manager to discuss the findings. Those discussions included:
- Review of the technologist’s reject rate by category (anatomy, clinical area, artifacts, motion, etc.);
- Review of any quality improvement (QI) tickets issued and tracked through the PACS by radiologists. Tickets are assigned by improvement category, such as Joint Commission patient safety goals or image quality reasons. Reiser said the QI tickets were kept as an ongoing quality review measure;
- Changes in reject rates; and
- Ideas or feedback for future quality improvements.
Any corrective actions, if required, were also identified during the in-person meetings.
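A review like the one described above is driven by a per-technologist tally of rejects in each category. The sketch below shows one way such a tally could be computed; the log format, technologist identifiers and category names are invented for illustration.

```python
from collections import Counter

# Hypothetical reject log: (technologist, reject category) per rejected image.
reject_log = [
    ("tech_a", "positioning"),
    ("tech_a", "motion"),
    ("tech_a", "positioning"),
    ("tech_b", "artifact"),
    ("tech_b", "positioning"),
]

def rejects_by_category(log, technologist):
    """Tally one technologist's rejects by category for a quarterly review."""
    return Counter(cat for tech, cat in log if tech == technologist)

print(rejects_by_category(reject_log, "tech_a"))
# Counter({'positioning': 2, 'motion': 1})
```

In practice these counts would be pulled from the reject analysis software or PACS records rather than a hand-built list, then compared quarter over quarter to spot trends.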
Reiser concluded by offering advice for department heads looking to design and implement their own reject rate analysis interventions.
At the University of Chicago, the radiology department conducts a series of in-service meetings to teach the specific elements of reject analysis. The meetings cover classification of reject categories, a review of current reject rates and discussion of when to reject an image.
To focus on image review, separate in-services are conducted for technologists targeting specific exam types. Example focus areas include portable X-ray exams, wrist imaging, chest X-rays and lumbar spine exams. One rejected image is chosen randomly for each rejection category and compared with the “approved” diagnostic image.
The goal of these meetings, according to Reiser, is to develop image critique skills as a group and to encourage ownership of and accountability for the exams performed. For this reason, she recommended attaching technologists’ actual names to each image set during discussions, rather than hiding them behind coded usernames.
Walz-Flannigan shared key questions in her RSNA presentation that should be considered during image review:
- Is the accumulated data accurate?
- Are standards being followed (with the system setup and technologist use)?
- Do standards need to be adjusted?
- Are good images being rejected?
- What challenges are techs facing?
Reiser concluded that while interventions may not always lead to better reject rates, they may still improve image quality, making them a critical component of patient care.