"I strongly believe that the Game Changers in the field of AI and Healthcare are the ones who will Master the Art of seamless integration into current clinical and radiological workflows, or … total replacement & disruption of how we do it today!"

I previously described how AI could transform the Traumatic Musculoskeletal X-Ray Workflow, and today I will describe a second Conventional Radiography Workflow which could be disrupted by AI: the Chest X-ray Inpatient Follow-up Pathway.

Chest X-rays are the most frequently performed medical imaging exams worldwide and are ordered by a diverse range of healthcare providers, from General Practitioners to highly specialised Oncologists. In the UK alone, between April 2012 and March 2013, approximately 7.7 million Chest X-rays were performed.

Many hospitalised patients in organ-specific departments or intensive care units (ICU) are followed up regularly with Chest X-rays during their hospital stay. These routine radiographs are intended to monitor the evolution of the thoracic disease and the patient’s state, and to detect complications of bed-rest immobilisation (broncho-aspiration and infection) and of indwelling devices (malposition). Some patients can have two or more Chest X-rays per day, depending on the severity of the disease, local habits and the anxiety level of the clinician in charge. This generates an enormous number of repetitive Chest X-rays, usually only slightly different from one another, which are analysed at the bedside by clinicians without radiological expertise.

It is perfectly fine and legal for the non-radiologist physician (the clinician) to check these X-rays at the bedside and to give immediate feedback and intervention to the patient if needed. Usually (based on my personal experience), clinicians tend to be focused on the principal disease of the patient and to make a very “selective” analysis of the Chest X-ray. They look for an answer to their specific question and might overlook additional findings not directly related to the patient’s main problem. By contrast, radiologists make a more all-inclusive analysis of the radiograph, using checklists and structured interpretation & reporting frameworks to scrutinize every organ and part of the radiograph, ultimately getting the most out of the image.

Let’s take the example of Pneumothorax.

Pneumothorax is a condition in which air is abnormally located between the lung and the chest wall; it compresses the lung and compromises breathing and the circulation of blood through the heart. Pneumothorax affects approximately 7.4–18 people per 100,000 per year.

Pneumothorax can be managed conservatively (a do-not-touch approach), by a small puncture to evacuate the abnormal air collection, or through a larger tube inserted into the chest, specifically into the pleural space between the lung and the chest wall, and maintained for a few days. If these methods are not enough, surgery with pleurodesis (permanent attachment of the lung to the chest wall) is the ultimate option.

A patient hospitalized for pneumothorax will have repeated Chest X-rays to follow up the evolution of the abnormal thoracic air collection, each X-ray slightly different from the previous one, until near-resolution.

The non-radiologist clinicians in charge of the patient will look at these Chest X-rays at the bedside, assess the evolution of the pneumothorax and make a decision based on their interpretation. Again, decisions are usually made without any expert radiologist validation, and patients are discharged when the clinicians judge that the time is right.

But a Chest X-ray does not only inform about the pneumothorax and its complications. Many other chest structures are imaged and could be abnormal. This is where the Radiologist comes in.

In one of my previous Academic Hospitals, one senior and one junior radiologist were in charge of reviewing these inpatient Chest X-rays on top of the clinician’s first evaluation at the bedside. They performed a complete analysis of the image and created a formal report. This radiologist assessment is coded and billed on top of the first clinician’s interpretation.

When radiologists’ eyes are confronted with an X-ray, they usually discover lesions not seen by clinicians, some of them potentially lethal for the patient.

Again, and similarly to the Musculoskeletal Traumatic X-ray Pathway which I described previously, by the time the radiologist analyses the X-ray, the patient may already have been discharged home, put on antibiotics, or sent back to the operating room or the ICU, based only on the clinician’s interpretation.

When a radiologist reviews these images and finds something unusual, he immediately calls the clinician to report the finding. When the busy clinician has already noticed the abnormality, he is bothered by the call and tensions can arise; but when he is not aware of the finding, he is usually very collaborative and thankful, and this “annoying” call from the radiologist can ultimately save a life.

I am not sure whether radiologists review these inpatient routine follow-up Chest X-rays in all hospitals. Whether they do depends on multiple factors, including Radiology staffing, institutional organisation, teaching purposes, the anxiety/confidence level of the non-radiologists who read these X-rays at the bedside, the reimbursement model (whether reimbursement is higher when the X-ray is read by a radiologist) and the interest of the radiologists themselves.

Despite the heterogeneity in practices and workflows across the Globe, I believe every X-ray performed should be reviewed by a trained medical imaging expert on top of the first, focused bedside interpretation by the clinician. We should remember that a missed 3 mm lung nodule on an X-ray could turn into an incurable cancer a few months or years later…

From the radiologist’s perspective, this inpatient routine Chest X-ray review is perceived as repetitive, boring, inaccurate, “old-fashioned” 2D imaging (compared to low-dose chest CT), and frequently frustrating: when the radiologist finds nothing more than the clinician did, or when the patient is already running the NY Marathon after total recovery while the radiologist is just opening the X-ray to analyse it. Usually, radiologists review these X-rays with delay, far from where the action takes place and with limited clinical information.

Despite this radiologist’s pet peeve, even if radiologists were enthusiastic about reading exclusively routine Chest X-rays to unveil every single incidental small lung nodule or bone metastasis, there are simply not enough of them to achieve this task.

To illustrate the importance of this phenomenon, let’s take the example of a UK National Health Service (NHS) Hospital in Portsmouth. At this NHS site, 23,000 Chest X-rays acquired over a 12-month period were simply not reviewed by a trained physician, and the UK Care Quality Commission asked the Hospital to take “immediate action” to review this backlog and to put in place “robust processes to ensure that any images are reported on and risk-assessed.”

From my perspective, this does not necessarily have to be done by Humans.

Here again I see a Clinical & Radiological Workflow that can be positively impacted by AI and Deep Learning.

AI could act on at least 5 levels...

1 - First and foremost, classification between normal and abnormal X-Rays:

This classification would be more valuable in emergency, outpatient or screening settings than in inpatient ecosystems, where patients are already hospitalized for a disease.

2 - Detection and Classification of lesions on Chest X-rays:

These include major findings such as pneumonia, pneumothorax and masses, as well as more subtle findings such as millimetric lung nodules and fibrosis.

The ultimate goal would be to automatically produce a full radiologic analysis and report of the Chest X-ray, integrated within the clinical workflow.

Current AI projects try to replicate narrow tasks with variable accuracy. For example, the Stanford team has developed an AI algorithm for Chest X-ray analysis of 14 diseases: the CheXNet algorithm. This 121-layer convolutional neural network, trained on ChestX-ray14, the largest publicly available database, was able to “detect pneumonia from Chest X-rays at a level exceeding practicing radiologists”. I expect AI algorithms to cover a broader spectrum of diseases and findings with greater accuracy, and to be integrated into holistic solutions offering strong support to healthcare providers.
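To make the multi-label idea concrete, here is a minimal sketch of how the outputs of a CheXNet-style classifier could be turned into a list of probable findings: one sigmoid probability per label, kept when it exceeds a decision threshold. Only the 14 ChestX-ray14 label names come from the published dataset; the logits below are placeholders, not the output of any real model.

```python
import math

# The 14 pathology labels of the ChestX-ray14 dataset
LABELS = ["Atelectasis", "Cardiomegaly", "Effusion", "Infiltration",
          "Mass", "Nodule", "Pneumonia", "Pneumothorax", "Consolidation",
          "Edema", "Emphysema", "Fibrosis", "Pleural_Thickening", "Hernia"]

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def findings_from_logits(logits, threshold=0.5):
    """Turn 14 raw network scores into (label, probability) pairs
    whose sigmoid probability exceeds the decision threshold."""
    probs = [sigmoid(z) for z in logits]
    return [(label, round(p, 3))
            for label, p in zip(LABELS, probs) if p >= threshold]

# Placeholder logits standing in for a model's output on one image
example_logits = [-2.1, -1.5, 0.8, -0.3, -2.8, -1.9, 1.4,
                  -2.2, -0.9, -2.5, -3.0, -2.7, -1.8, -4.0]
print(findings_from_logits(example_logits))
```

With these invented logits, only “Effusion” and “Pneumonia” clear the 0.5 threshold; in a real deployment the threshold itself would be tuned per finding against clinical priorities.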

3 - Comparison with previous X-rays and assessment of evolution:

For routine Chest X-rays performed once or twice a day, an algorithm could compare each new radiograph with the previous ones, and detect & highlight subtle changes which might not be detectable by human vision. It might also be better suited to quantify changes over time for an accurate follow-up of the patient. The clinician in charge would be alerted only if something is unusual, if an intervention is needed, or when it is time to discharge the patient. The algorithm could also adjust the frequency of X-ray controls depending on the evolution, thus optimizing workflow and radiation exposure.
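At its very simplest, this comparison idea is a pixel-wise difference between two aligned radiographs. The toy sketch below assumes the images are already registered and normalised to [0, 1]; a real system would need spatial registration and, most likely, learned change detection rather than a raw intensity threshold.

```python
import numpy as np

def change_map(prev, curr, threshold=0.1):
    """Flag pixels whose normalised intensity changed by more than
    `threshold` between two registered radiographs of equal size."""
    diff = np.abs(curr.astype(float) - prev.astype(float))
    mask = diff > threshold
    return mask, mask.mean()  # changed-pixel map and changed fraction

# Toy 4x4 "radiographs": the follow-up brightens one corner region
prev = np.zeros((4, 4))
curr = prev.copy()
curr[0:2, 0:2] = 0.5  # simulated change, e.g. a growing air collection

mask, fraction = change_map(prev, curr)
print(f"{fraction:.0%} of pixels changed")  # 4 of 16 pixels -> 25%
```

The changed fraction (or a more anatomically informed measure) is the kind of quantity that could drive the alerting and X-ray-frequency logic described above.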

4 - Predictive analytics:

Based on the routine Chest X-rays (and many other parameters such as bedside monitoring, clinical notes, biological data…), algorithms could predict complications, outcomes, morbidity and mortality risks, the need for a prolonged length of stay, the best time to discharge, or the best time for an actionable intervention such as the introduction of a specific antibiotic, assisted respiration or fluid perfusion.
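Conceptually, such a predictive model combines several clinical parameters into a single risk probability. The sketch below uses a logistic model whose feature names and weights are invented purely for illustration; in practice both would be learned from outcome data, and imaging features would come from the X-rays themselves.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def complication_risk(pneumothorax_size_cm, days_with_chest_tube,
                      age, on_mechanical_ventilation):
    """Combine a few illustrative features into a 0-1 risk score.
    All weights below are invented for this sketch, not clinical values."""
    z = (-4.0
         + 0.6 * pneumothorax_size_cm
         + 0.3 * days_with_chest_tube
         + 0.02 * age
         + 1.5 * (1 if on_mechanical_ventilation else 0))
    return sigmoid(z)

low = complication_risk(1.0, 1, 40, False)    # mild, short course
high = complication_risk(4.0, 7, 75, True)    # severe, prolonged course
print(round(low, 3), round(high, 3))
```

The point is only the shape of the computation: heterogeneous inputs reduced to one calibrated probability that can trigger an alert or a discharge suggestion.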

5 - Unveiling new imaging biomarkers:

Deep Learning algorithms have the ability to process images and find patterns which are not visible to human eyes. Images are a constellation of shapes, borders, colors, shades of gray, noise, contrasts, pixel values, artifacts and patterns. New-generation AI algorithms are able to notice radiologic features without human intervention, including features not visible to human eyes, thus extracting new knowledge from medical images. Google and Verily have already paved the way for such sophistication in image analysis by revealing their work on retinal scans for the prediction of cardiovascular risk factors.

The Google algorithm was able to predict the level of blood pressure, where medical doctors could only classify a retinal scan as normal or suggestive of high blood pressure (on a four-grade classification). Moreover, the algorithm predicted risk factors that were “not thought to be present or quantifiable on retinal images”, such as age, gender, smoking status and major adverse cardiac events. Even more impressive, the Google team showed that the algorithm based its predictions on relevant anatomical structures in the image, such as blood vessels and the optic disc. This study “provides evidence that deep learning may uncover additional novel signals in retinal images” and, hopefully, in other types of medical images.

Yesterday (April 11, 2018), the FDA approved marketing of the first AI system for autonomous detection of diabetic retinopathy. The IDx-DR device is the first tool authorized for marketing that screens for diabetic retinopathy without the need for a clinician to interpret the image. This AI-based tool classifies retinal scans into two categories:

- “more than mild diabetic retinopathy detected: refer to an eye care professional” or

- “negative for more than mild diabetic retinopathy; rescreen in 12 months.”

This will allow non-ophthalmologists to screen for diabetic retinopathy on a larger scale and hopefully prevent further eye damage.

“IDx-DR was able to correctly identify the presence of more than mild diabetic retinopathy 87.4 percent of the time and was able to correctly identify those patients who did not have more than mild diabetic retinopathy 89.5 percent of the time.” 
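The two figures quoted are, in statistical terms, the sensitivity and specificity of the device. The sketch below shows how they fall out of a confusion matrix; the counts are hypothetical, chosen only so the resulting rates match the quoted percentages, and are not the actual trial numbers.

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity: diseased patients correctly flagged / all diseased.
    Specificity: healthy patients correctly cleared / all healthy."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity

# Hypothetical counts (per 1,000 in each group) matching the quoted rates
sens, spec = sensitivity_specificity(tp=874, fn=126, tn=895, fp=105)
print(f"sensitivity={sens:.1%}, specificity={spec:.1%}")
# -> sensitivity=87.4%, specificity=89.5%
```

Note that neither number alone tells the whole story: how useful the device is in practice also depends on how common the disease is in the screened population.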

The main difference I see between retinal scans and radiological images (X-ray, CT, MRI, US) is that retinal scans are direct photographs of an anatomical structure, whereas Chest X-rays are an indirect image: a 2D representation of a 3D volume (the chest), based on the degree of attenuation of X-rays travelling through the body. Thus, it might be more complex to extract accurate biomarkers from Chest X-rays because of the indirect nature of this type of image.

Finally, AI, when wisely integrated into clinical and radiological workflows, will not only alleviate the burden of radiologists and the uncertainty of clinicians, but will also open new frontiers in image analysis and predictive analytics, contributing to better care for our patients.

Stay tuned for the next chapters!

Dr. Amine Korchi is a Swiss Board Certified Radiologist and Neuroradiologist and Associate Partner (Europe) at HS.