Basics of Image Processing: The Facts and Challenges of Data Harmonization to Improve Radiomics Reproducibility

This book, endorsed by EuSoMII, provides clinicians, researchers and scientists with a useful handbook for navigating the intricate landscape of data harmonization on the journey to improve the reproducibility, robustness and generalizability of multi-centric, real-world radiomic studies. In these pages, the authors delve into the foundational principles of radiomics and its far-reaching implications for precision medicine. They describe the different methodologies used to extract quantitative features from medical images, the building blocks that enable the transformation of images into actionable predictions.

The book sweeps from the fundamentals of harmonization to the implementation of the knowledge acquired to date, with the aim of conveying the importance of harmonizing medical data and providing useful guidance to enable its applicability and the future use of advanced radiomics-based models in routine clinical practice. With this exploration of data harmonization in radiomics, the authors hope to ignite discussions, foster new ideas, and inspire researchers, clinicians, and scientists alike to embrace the challenges and opportunities that lie ahead. Together, they aim to elevate radiomics into a reproducible technology and establish it as an indispensable, actionable tool in the quest for improved cancer diagnosis and treatment.

Author(s): Ángel Alberich-Bayarri, Fuensanta Bellvís Bataller
Publisher: Springer
Year: 2024

Language: English
Pages: 218

Preface
Contents
1: Era of AI Quantitative Imaging
1.1 Precision Medicine Needs Precision Imaging
1.2 Transforming Clinical Care from Qualitative to Quantitative
1.2.1 Automated Methods Capable of Quantifying Imaging Features Related to Clinical Endpoints
1.2.2 AI-Based Methods as Gold-Standard for Imaging Biomarkers
Image Acquisition and Reconstruction
Image Harmonization
Image Synthesis for Data Augmentation
Image Segmentation
Extraction of Deep Features
AI Models for Prediction of Clinical Endpoints
Integration of Imaging, Clinical, Biological and Pathology Data
References
2: Principles of Image Formation in the Different Modalities
2.1 Ionizing Radiation Imaging
2.1.1 X-Ray Beam Generation
X-Ray Tube
Generator
2.1.2 Radiation-Patient Interaction
Photoelectric Effect
Compton Effect
2.1.3 Image Acquisition and Reconstruction
Computed Tomography
Reconstruction Algorithms
2.1.4 Image Quality
Spatial Resolution
Contrast Resolution
Image Noise
Artifacts
2.2 Nuclear Medicine Imaging
2.2.1 Radiopharmaceuticals
2.2.2 Physics Concepts in PET: Decay, Annihilation, and Coincidences
Radioactive Decay
Electron–Positron Annihilation
Scattered Coincidences
Random Coincidences
Multiple Coincidences
2.2.3 PET Detector Materials
2.2.4 Image Acquisition and Reconstruction
2.2.5 Image Quality
2.3 Magnetic Resonance Imaging
2.3.1 Hardware Components of MR
Magnetic Field Magnet
Magnetic Field Gradient Magnets
Radiofrequency Coils
2.3.2 Physical Basis
Nuclear Spin and Magnetic Moment
Precession and Larmor Frequency
Parallel or Antiparallel Alignment
Resonance and Nutation Motion
Longitudinal Relaxation: T1
Transverse Relaxation: T2 and T2*
Proton Density Image (PD)
T1, T2 or PD Weighted Images
2.3.3 Image Acquisition and Reconstruction
K-Space
2.3.4 Image Quality
Signal-to-Noise Ratio (SNR)
Spatial Resolution
Contrast-to-Noise Ratio (CNR)
Image Acquisition Time
References
3: How to Extract Radiomic Features from Imaging
3.1 Introduction to Radiomic Analysis
3.2 Deep Learning vs. Traditional Machine Learning
3.3 Radiomic Features Extraction Process
3.3.1 Image Preprocessing
3.3.2 Image Segmentation
3.3.3 Feature Extraction and Selection
3.3.4 Standardization
3.4 Deep Learning Radiomic Features
3.4.1 Deep Learning Radiomics and Hand-Crafted Radiomics
References
4: Facts and Needs to Improve Radiomics Reproducibility
4.1 Introduction
4.2 Factors Influencing Reproducibility
4.2.1 Acquisition
4.2.2 Segmentation
4.2.3 Radiomic Features Extraction
4.2.4 Model Construction
4.3 How to Improve Reproducibility
4.3.1 Guidelines and Checklists
4.3.2 Code and Development Platforms
4.4 Recommendations for Achieving Clinical Adoption of Radiomics
References
5: Data Harmonization to Address the Non-biological Variances in Radiomic Studies
5.1 Non-biological Variances in Radiomic Analysis
5.2 Data Harmonization
5.2.1 Data Harmonization in Radiomics Studies
5.2.2 Automatic Harmonization Schemes
5.2.3 Automatic Harmonization Approaches
Location and Scale Methods
Clustering Methods
Matching Methods
Synthesis Methods
Invariant Representation Learning Methods
5.3 Challenges for Data Harmonization
References
6: Harmonization in the Image Domain
6.1 The Need for Image Harmonization
6.2 Image Variability Sources
6.2.1 Image Acquisition
6.3 Harmonization Techniques
6.3.1 Non-AI Methods
Intensity Scaling
Z-Score Normalization
Histogram Equalization
Histogram Matching
6.3.2 AI Methods
Autoencoders
Generative Adversarial Networks (GANs)
Applications and Other Approaches
6.4 Conclusions
References
7: Harmonization in the Features Domain
7.1 Introduction
7.2 Reproducibility of Radiomic Features
7.2.1 Imaging Data Reproducibility
Image Acquisition and Reconstruction Parameters
CT Scans
PET Scans
MRI Sequences
Intra-individual Test-Retest Repeatability
Multi-scanner Reproducibility
7.2.2 Segmentation Reproducibility
7.2.3 Post-processing and Feature Extraction
7.2.4 Reporting Reproducibility
7.3 Normalization Techniques
7.3.1 Statistical Normalization
7.3.2 ComBat
7.3.3 Deep Learning Approaches
7.4 Strategies Overview
References
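To give a concrete flavour of the non-AI harmonization techniques listed in Chapter 6, the following is a minimal NumPy sketch of z-score normalization (Sect. 6.3.1). It is an illustrative example, not code from the book: the function name `z_score_normalize` and the simulated two-scanner scenario are assumptions for demonstration. The key property it shows is that rescaling each scan to zero mean and unit variance removes affine (offset and gain) intensity differences between scanners, leaving the underlying anatomical contrast intact.

```python
import numpy as np

def z_score_normalize(image, mask=None):
    """Rescale voxel intensities to zero mean and unit variance.

    If a mask is given, the statistics are computed only inside it
    (e.g. within an organ or tumour segmentation) but applied to the
    whole image, which is common practice in MRI harmonization.
    """
    image = np.asarray(image, dtype=np.float64)
    region = image[mask] if mask is not None else image
    mu, sigma = region.mean(), region.std()
    if sigma == 0:
        raise ValueError("Constant image: standard deviation is zero")
    return (image - mu) / sigma

# Simulate the same anatomy acquired on two scanners with different
# intensity offsets and gains (a purely affine, non-biological variance).
rng = np.random.default_rng(0)
anatomy = rng.random((4, 4))
scan_a = 100.0 + 50.0 * anatomy   # scanner A: offset 100, gain 50
scan_b = 300.0 + 20.0 * anatomy   # scanner B: offset 300, gain 20

norm_a = z_score_normalize(scan_a)
norm_b = z_score_normalize(scan_b)
print(np.allclose(norm_a, norm_b))  # → True: affine scanner differences removed
```

Techniques such as ComBat (Sect. 7.3.2) generalize this location-and-scale idea by estimating per-scanner shift and scale parameters jointly across features, with empirical Bayes shrinkage; deep learning approaches instead learn scanner-invariant representations, as surveyed in Chapters 5–7.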