Feature | Digital Radiography (DR) | January 29, 2019 | By Jeff Zagoudis

Achieving Optimum Reject Rate in Digital Radiography

Improved analytics and technology advancements can help minimize the need for X-ray retakes

In today’s digital environment, a radiologist only sees images saved and shared to the PACS, so a firm understanding of X-ray reject rates is crucial for high image quality and good workflow.

X-rays were the first medical imaging technology to be invented, and they remain one of the most commonly performed exams worldwide today. The technology has evolved immensely since its discovery in 1895, progressing from the original screen-film systems to computed radiography (CR) to today’s digital radiography (DR) units. While the technology has changed, the need for high-quality images at the lowest possible dose to the patient has not; if anything, it has grown as medicine shifts toward value-based care. For DR, this means optimizing protocols and techniques through a robust system of reject rate analysis.


Defining Rejects and Reject Analysis

Before deciding whether an X-ray needs to be retaken, a department must define what constitutes a “rejected” image. According to Ingrid Reiser, Ph.D., DABR, a clinical diagnostic physicist and associate professor of radiology at the University of Chicago, rejects are patient images that are discarded by the technologist without being presented to the radiologist. She shared the definition as part of a presentation on reject rate analysis at the 2018 Radiological Society of North America (RSNA) annual meeting.

In today’s digital environment, the second part of that definition — without being presented to the radiologist — is the key consideration. A radiologist will only see images that are saved and shared to the picture archiving and communication system (PACS), so they likely have no idea how many images are actually acquired during an exam. In this scenario, having a firm understanding of reject rates is crucial for maintaining high image quality and good workflow. Furthermore, Reiser noted, each image not sent to the radiologist’s workstation represents wasted radiation dose to the patient — a cardinal sin in radiology.
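To make the definition concrete, here is a minimal sketch, in Python, of how a reject rate could be computed from an acquisition log. The record fields and sample data are hypothetical; as discussed later in this article, each vendor exposes this information differently.

```python
# Minimal sketch: computing a reject rate from an acquisition log.
# The fields (exam_id, sent_to_pacs, reject_reason) are hypothetical;
# real DR systems record this data in vendor-specific formats.

from dataclasses import dataclass

@dataclass
class Acquisition:
    exam_id: str
    sent_to_pacs: bool       # False means the tech discarded the image
    reject_reason: str = ""  # e.g. "positioning", "motion", "artifact"

def reject_rate(log: list[Acquisition]) -> float:
    """Fraction of acquired images never presented to the radiologist."""
    if not log:
        return 0.0
    rejected = sum(1 for a in log if not a.sent_to_pacs)
    return rejected / len(log)

log = [
    Acquisition("E1", True),
    Acquisition("E1", False, "positioning"),
    Acquisition("E2", True),
    Acquisition("E3", False, "motion"),
]
print(f"Reject rate: {reject_rate(log):.0%}")  # Reject rate: 50%
```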

Reiser stressed, however, that the goal of reject rate analysis is not to reduce the rate to zero. Poor-quality images are an unavoidable part of diagnostic imaging, and images that are genuinely non-diagnostic should be rejected rather than sent to the radiologist.


Digital Challenges of Reject Analysis

Examining rejects and reject rates tells a story, according to Alisa Walz-Flannigan, Ph.D., diagnostic medical physicist at the Mayo Clinic in Rochester, Minn. Reviewing rejected images (and rejected image data) provides information about image quality standards, inherent imaging problems and modes of failure, as well as how technologists troubleshoot these types of issues.

While technological advances have eliminated many of the challenges of screen-film and computed radiography, digital X-rays come with their own set of difficulties:

  • Many hospitals and imaging centers use imaging systems from multiple vendors, and each vendor may collect different bits of data using different methods;
  • As a result, information retrieval may be cumbersome, and some desired information may not be retrievable at all;
  • Reject analysis may require a software add-on at additional cost; and
  • The reject analysis software could interfere with clinical operation.


Quality Control for DR

While imaging standards and practices are unique to every radiology department, there are resources available for practitioners seeking guidance. The American Association of Physicists in Medicine (AAPM) Task Group 151 published a report in 2015 on quality control procedures in digital radiography, with the goal of recommending consistency tests for optimum image acquisition. The task group said the tests should be performed by a medical physicist, or by a radiologic technologist under the supervision of a medical physicist.

One of the main procedures recommended by the task group is defining a fault tree of actions that need to be taken when certain fault conditions are met. The diagram should begin with a particular problem, such as an artifact in a patient image, and run through possible corrective actions based on the source and severity of the artifact.
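The report does not prescribe a particular data structure for such a tree, but one simple way to sketch it is as nested question/branch nodes traversed until a corrective action is reached. The artifact questions and actions below are illustrative assumptions, not TG151’s actual diagrams.

```python
# One way to encode a simple fault tree: each node pairs a question about
# the fault with the branches (or corrective actions) that follow from the
# answer. The questions and actions here are illustrative only.

fault_tree = {
    "question": "Does the artifact persist in a flat-field test image?",
    "yes": {
        "question": "Is the artifact severe enough to mimic or mask pathology?",
        "yes": "Remove detector from service; contact vendor service.",
        "no": "Recalibrate detector; recheck with a flat-field exposure.",
    },
    "no": "Artifact is patient- or positioning-related; review technique with the technologist.",
}

def walk(node, answers):
    """Traverse the tree with a list of yes/no answers until an action is reached."""
    for ans in answers:
        if not isinstance(node, dict):
            break
        node = node["yes"] if ans else node["no"]
    return node

# Example: the artifact persists in a flat-field image but is not severe.
print(walk(fault_tree, [True, False]))
# -> "Recalibrate detector; recheck with a flat-field exposure."
```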

The AAPM report also reinforces the importance of performing reject rate analysis: The task group cites a prior study of 18 radiology departments in which 14 percent of patient exposure in projection radiography was attributed to repeated images.

Ultimately, the task group recommends a target reject rate of 8 percent for DR units. A literature review found that reject rates for screen-film radiography hovered around 10 percent, with nearly half of all rejects due to exposure errors. The task group arrived at the 8 percent figure for digital on the assumption that exposure errors would be fewer with DR. Ten percent is the recommended threshold for further investigation and possible corrective action.
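As a rough illustration of how those two figures could be applied in practice, the following sketch flags units whose reject rates exceed the investigation threshold. The unit names and rates are invented for the example.

```python
# Sketch: applying the TG151 figures -- an 8% target and a 10% threshold
# for further investigation -- to per-unit reject rates (hypothetical data).

TARGET = 0.08       # AAPM TG151 recommended target for DR
INVESTIGATE = 0.10  # threshold for investigation and possible corrective action

unit_rates = {"DR-Room-1": 0.06, "DR-Room-2": 0.09, "Portable-3": 0.13}

for unit, rate in sorted(unit_rates.items(), key=lambda kv: -kv[1]):
    if rate > INVESTIGATE:
        status = "INVESTIGATE"
    elif rate > TARGET:
        status = "above target"
    else:
        status = "ok"
    print(f"{unit}: {rate:.0%} [{status}]")
```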


Starting a Reject Rate Analysis Project

While the principles of reject rate analysis may be straightforward, launching a reject rate analysis (RRA) project is a major undertaking. Reiser shared her experience leading an RRA project for digital X-ray at the University of Chicago beginning in 2014.

Reiser acknowledged that prior to 2014, all reject rates were self-reported by technologists. The reasons were twofold and largely practical: Reject analysis software was optional for many of the department’s DR units, and the feature was turned off on the systems that were capable of RRA. This was done, Reiser said, to avoid file storage problems given the systems’ limited hard drive space.

Quarterly reviews were conducted of each X-ray technologist’s workload, and each tech was assigned a quality score based on eight randomly chosen exams. (Approximately three exams were chosen per month to ensure a wide range of images was reviewed.) An in-person meeting was then held between the technologist and the imaging technology coordinator or manager to discuss the findings (the sampling and record-keeping behind these reviews are sketched in code after this list). Those discussions included:

  • Review of the technologist’s reject rate by category (anatomy, clinical area, artifacts, motion, etc.);
  • Review of any quality improvement (QI) tickets issued and tracked through the PACS by radiologists. Tickets are assigned an improvement category, such as Joint Commission patient safety goals or image quality reasons. Reiser said the QI tickets were kept as an ongoing quality review measure;
  • Changes in reject rates; and
  • Ideas or feedback for future quality improvements.


Any corrective actions, if required, were also identified during the in-person meetings.
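As a minimal sketch of the bookkeeping behind such a review, the following Python assumes simple in-memory records for a technologist’s quarterly workload and rejects. The data structures, field names and category labels are illustrative, not the University of Chicago’s actual system.

```python
# Sketch of the quarterly-review bookkeeping described above: randomly
# sample eight exams from a technologist's workload for review, and break
# that tech's rejects down by category. All data here is hypothetical.

import random
from collections import Counter

def sample_exams_for_review(exam_ids: list[str], n: int = 8) -> list[str]:
    """Pick up to n exams at random from a technologist's quarterly workload."""
    return random.sample(exam_ids, min(n, len(exam_ids)))

def reject_breakdown(rejects: list[dict]) -> Counter:
    """Count a technologist's rejects by category (anatomy, motion, artifact...)."""
    return Counter(r["category"] for r in rejects)

quarter_exams = [f"EX{i:04d}" for i in range(1, 121)]  # ~120 exams in a quarter
rejects = [{"category": "positioning"}, {"category": "motion"},
           {"category": "positioning"}, {"category": "artifact"}]

print("Exams to review:", sample_exams_for_review(quarter_exams))
for category, count in reject_breakdown(rejects).most_common():
    print(f"  {category}: {count}")
```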


Designing Interventions

Reiser concluded by offering advice for department heads looking to design and implement their own reject rate analysis interventions.

At the University of Chicago, the radiology department conducts a series of in-service meetings to teach the specific elements of reject analysis. The meetings cover classification of reject categories, a review of current reject rates and discussion of when to reject an image.

To focus on image review, separate in-services are conducted for technologists targeting specific exam types. Example focus areas include portable X-ray exams, wrist imaging, chest X-rays and lumbar spine exams. One rejected image is chosen randomly for each rejection category and compared with the “approved” diagnostic image.

The goal of these meetings, according to Reiser, is to develop image critique skills as a group and to encourage ownership of and accountability for the exams performed. For this reason, she recommended keeping technologists’ names attached to each image set during discussions, rather than masking them with coded usernames.

Walz-Flannigan shared key questions in her RSNA presentation that should be considered during image review:

  • Is the accumulated data accurate?
  • Are standards being followed, both in system setup and in technologist use?
  • Do standards need to be adjusted?
  • Are good images being rejected?
  • What challenges are techs facing?


Reiser concluded that while interventions may not always lead to better reject rates, they may still improve image quality, making them a critical component of patient care. 
