Microscope Image Processing

Microscope Image Processing, Second Edition, introduces the fundamentals of image formation in microscopy, including image digitization and display, which are key to quality visualization. Image processing and analysis are discussed in detail to give readers the tools needed to improve the visual quality of images and to extract quantitative information. Basic techniques such as image enhancement, filtering, segmentation, object measurement, and pattern recognition cover concepts integral to image processing. In addition, chapters on specific modern microscopy techniques such as fluorescence imaging, multispectral imaging, three-dimensional imaging, and time-lapse imaging introduce these key areas with emphasis on the differences among the various techniques.

The new edition discusses recent developments in microscopy such as light sheet microscopy, digital microscopy, and whole slide imaging, as well as the use of deep learning techniques for image segmentation and analysis, along with big-data image informatics and management.

Microscope Image Processing, Second Edition, is suitable for engineers, scientists, clinicians, postgraduate fellows, and graduate students in bioengineering, biomedical engineering, biology, medicine, chemistry, pharmacology, and related fields who use microscopes in their work and want to understand the methodologies and capabilities of the latest digital image processing techniques, or who wish to develop their own image processing algorithms and software for specific applications.

Authors: Fatima Merchant, Kenneth Castleman
Edition: 2
Publisher: Academic Press
Year: 2022

Language: English
Pages: 526
City: London

Front Cover
Microscope Image Processing
Copyright
Contents
Foreword to the First Edition
Reference
Foreword to the Second Edition
Preface to the First Edition
Preface to the Second Edition
Acknowledgments
Chapter One: Introduction
1.1. The Microscope and Image Processing
1.2. The Scope of This Book
1.3. Our Approach
1.3.1. The Four Types of Images
1.3.1.1. The Optical Image
1.3.1.2. The Continuous Image
1.3.1.3. The Digital Image
1.3.1.4. The Displayed Image
1.3.2. The Result
1.3.2.1. Analytic Functions
1.3.3. The Sampling Theorem
1.4. The Challenge
1.5. Modern Microscopy
1.6. Nomenclature
1.7. Summary of Important Points
References
Chapter Two: Fundamentals of Microscopy
2.1. The Origins of the Microscope
2.2. Optical Imaging
2.2.1. Image Formation by a Lens
2.2.1.1. Imaging a Point Source
2.2.1.2. Focal Length
2.2.1.3. Magnification
2.2.1.4. Numerical Aperture
2.2.1.5. Lens Shape
2.3. Diffraction Limited Optical Systems
2.3.1. Linear System Analysis
2.4. Incoherent Illumination
2.4.1. The Point Spread Function
2.4.2. The Optical Transfer Function
2.5. Coherent Illumination
2.5.1. The Coherent Point Spread Function
2.5.2. The Coherent Optical Transfer Function
2.6. Resolution
2.6.1. The Abbe Distance
2.6.2. The Rayleigh Distance
2.6.3. Size Calculations
2.7. Aberration
2.8. Calibration
2.8.1. Spatial Calibration
2.8.2. Photometric Calibration
2.9. Summary of Important Points
References
Chapter Three: Image Digitization and Display
3.1. Introduction
3.2. Digitizing Images
3.2.1. Resolution
3.2.2. Sampling
3.2.3. Interpolation
3.2.4. Aliasing
3.2.5. Noise
3.2.6. Shading
3.2.7. Photometry
3.2.8. Geometric Distortion
3.3. Overall System Design
3.3.1. Cumulative Resolution
3.3.2. Design Rules of Thumb
3.3.2.1. Pixel Spacing
3.3.2.2. Resolution
3.3.2.3. Noise
3.3.2.4. Photometry
3.3.2.5. Distortion
3.4. Image Display
3.4.1. Volatile Displays
3.4.2. Displayed Image Size
3.4.3. Aspect Ratio
3.4.4. Photometric Resolution
3.4.5. Grayscale Linearity
3.4.6. Low-frequency Response
3.4.7. High-frequency Response
3.4.7.1. Sampling for Display Purposes
3.4.7.2. Oversampling
3.4.7.3. Resampling
3.4.8. Noise
3.5. Summary of Important Points
References
Chapter Four: Geometric Transformations
4.1. Introduction
4.2. Implementation
4.3. Gray Level Interpolation
4.3.1. Nearest Neighbor Interpolation
4.3.2. Bilinear Interpolation
4.3.3. Bicubic Interpolation
4.3.4. Higher-order Interpolation
4.4. The Spatial Transformation
4.4.1. Control Grid Mapping
4.5. Applications
4.5.1. Distortion Removal
4.5.2. Image Registration
4.5.3. Stitching
4.6. Summary of Important Points
References
Chapter Five: Image Enhancement
5.1. Introduction
5.2. Spatial Domain Enhancement Methods
5.2.1. Contrast Stretching
5.2.2. Clipping and Thresholding
5.2.3. Image Subtraction and Averaging
5.2.4. Histogram Equalization
5.2.5. Histogram Specification
5.2.6. Spatial Filtering
5.2.7. Directional and Steerable Filtering
5.2.8. Median Filter
5.2.9. Anisotropic Diffusion Filter
5.3. Fourier Transform Methods
5.3.1. Wiener Filtering and Wiener Deconvolution
5.3.2. Deconvolution Using a Least Squares Approach
5.3.3. Low-Pass Filtering
5.3.4. High-pass and Band-pass Filtering
5.4. Wavelet Transform Methods
5.4.1. Wavelet Thresholding
5.4.2. Differential Wavelet Transform and Multiscale Pointwise Product
5.5. Color Image Enhancement
5.5.1. Pseudo-Color Transformations
5.5.2. Color Image Smoothing
5.5.3. Color Image Sharpening
5.6. Summary of Important Points
References
Chapter Six: Morphological Image Processing
6.1. Introduction
6.2. Binary Morphology
6.2.1. Binary Erosion and Dilation
6.2.2. Binary Opening and Closing
6.2.3. Binary Morphological Reconstruction From Markers
6.2.3.1. Connectivity
6.2.3.2. Markers
6.2.3.3. A Priori Selection Using the Image Border for Marker Placement
6.2.3.4. Reconstruction From Opening
6.2.4. Reconstruction Using Area Opening and Closing
6.2.5. Skeletonization
6.3. Grayscale Operations
6.3.1. Threshold Sets and Level Sets
6.3.2. Grayscale Erosion and Dilation
6.3.2.1. Morphological Gradient
6.3.3. Grayscale Opening and Closing
6.3.3.1. The Top-Hat Concept
6.3.3.2. Grayscale Image Filtering
6.3.4. Component Filters and Grayscale Morphological Reconstruction
6.3.4.1. The Reconstruction Process
6.3.4.2. Grayscale Area Opening and Closing
6.3.4.3. Edge-Off Operators
6.3.4.4. h-Maxima and h-Minima Operations
6.3.4.5. Regional Maxima
6.3.4.6. Marker Extraction
6.4. Watershed Segmentation
6.4.1. The Classical Watershed Transform
6.4.2. Filtering the Minima
6.4.3. Texture Detection
6.4.4. Watershed From Markers
6.4.5. Segmentation of Overlapped Convex Cells
6.4.6. Inner and Outer Markers
6.5. Summary of Important Points
References
Chapter Seven: Image Segmentation
7.1. Introduction
7.1.1. Pixel Connectivity
7.2. Region-Based Segmentation
7.2.1. Thresholding
7.2.1.1. Global Thresholding
7.2.1.2. Adaptive Thresholding
7.2.1.3. Threshold Selection
Histogram Smoothing
The ISODATA Algorithm
The Background Symmetry Algorithm
The Triangle Algorithm
Gradient-Based Algorithms
7.2.1.4. Thresholding Circular Spots
7.2.1.5. Thresholding Noncircular and Noisy Spots
Noncircular Spots
Objects of General Shape
7.2.2. Morphological Processing
7.2.2.1. Hole Filling
7.2.2.2. Border Object Removal
7.2.2.3. Separation of Touching Objects
7.2.2.4. The Watershed Algorithm
7.2.3. Region Growing
7.2.4. Region Splitting
7.3. Boundary-Based Segmentation
7.3.1. Boundaries and Edges
7.3.2. Boundary Tracking Based on Maximum Gradient Magnitude
7.3.3. Boundary Finding Based on Gradient Image Thresholding
7.3.4. Boundary Finding Based on Laplacian Image Thresholding
7.3.5. Boundary Finding Based on Edge Detection and Linking
7.3.5.1. Edge Detection
The Roberts Edge Detector
The Sobel Edge Detector
The Prewitt Edge Detector
The Canny Edge Detector
7.3.5.2. Edge Linking and Boundary Refinement
Heuristic Search
Curve Fitting
The Hough Transform
Active Contours
7.3.6. Encoding Segmented Images
7.3.6.1. The Object Label Map
7.3.6.2. The Boundary Chain Code
7.4. Summary of Important Points
References
Chapter Eight: Object Measurement
8.1. Introduction
8.2. Measures for Binary Objects
8.2.1. Size Measures
8.2.1.1. Area
8.2.1.2. Perimeter
8.2.1.3. Area and Perimeter of a Polygon
8.2.2. Pose Measures
8.2.2.1. Centroid
8.2.2.2. Orientation
8.2.3. Shape Measures
8.2.3.1. Thinness Ratio
8.2.3.2. Rectangularity
8.2.3.3. Circularity
8.2.3.4. Euler Number
8.2.3.5. Moments
Central Moments
Object Dispersion
Rotationally Invariant Moments
Zernike Moments
8.2.3.6. Elongation
8.2.4. Shape Descriptors
8.2.4.1. The Differential Chain Code
8.2.4.2. Fourier Descriptors
8.2.4.3. The Medial Axis Transform
8.2.4.4. Graph Representations
Minimum Spanning Tree
Delaunay Triangulation
8.3. Distance Measures
8.3.1. Euclidean Distance
8.3.2. City-Block Distance
8.3.3. Chessboard Distance
8.4. Gray Level Object Measures
8.4.1. Intensity Measures
8.4.1.1. Integrated Optical Density
8.4.1.2. Average Optical Intensity
8.4.1.3. Contrast
8.4.2. Histogram Measures
8.4.2.1. Mean Gray Level
8.4.2.2. Standard Deviation of Gray Levels
8.4.2.3. Skew
8.4.2.4. Entropy
8.4.2.5. Energy
8.4.3. Texture Measures
8.4.3.1. Statistical Texture Measures
The Gray Level Co-Occurrence Matrix
8.4.3.2. Power Spectrum Features
8.5. Object Measurement Considerations
8.6. Summary of Important Points
References
Chapter Nine: Object Classification
9.1. Introduction
9.2. The Classification Process
9.2.1. Bayes Rule
9.3. The Single-Feature, Two-Class Case
9.3.1. A Priori Probabilities
9.3.2. Conditional Probabilities
9.3.3. Bayes Theorem
9.4. The Three-Feature, Three-Class Case
9.4.1. The Bayes Classifier
9.4.1.1. Prior Probabilities
9.4.1.2. Classifier Training
9.4.1.3. The Mean Vector
9.4.1.4. Covariance
9.4.1.5. Variance and Standard Deviation
9.4.1.6. Correlation
9.4.1.7. The pdf
9.4.1.8. Classification
9.4.1.9. Log Likelihoods
9.4.1.10. The Mahalanobis Distance Classifier
9.4.1.11. Uncorrelated Features
9.4.2. A Numerical Example
9.5. Classifier Performance
9.5.1. The Confusion Matrix
9.6. Bayes Risk
9.6.1. The Minimum-Risk Classifier
9.7. Relationships Among Bayes Classifiers
9.8. The Choice of a Classifier
9.8.1. Subclassing
9.8.2. Feature Normalization
9.9. Nonparametric Classifiers
9.9.1. Nearest-Neighbor Classifiers
9.10. Feature Selection
9.10.1. Feature Reduction
9.10.1.1. Principal Component Analysis
9.10.1.2. Linear Discriminant Analysis
9.11. Neural Networks
9.12. Summary of Important Points
References
Chapter Ten: Multispectral Fluorescence Imaging
10.1. Introduction
10.2. Basics of Fluorescence Imaging
10.2.1. Image Formation in Fluorescence Imaging
10.3. Optics in Fluorescence Imaging
10.4. Limitations in Fluorescence Imaging
10.4.1. Instrumentation-Based Aberrations
10.4.1.1. Photon Shot Noise
10.4.1.2. Dark Current
10.4.1.3. Auxiliary Noise Sources
10.4.1.4. Quantization Noise
10.4.1.5. Other Noise Sources
10.4.2. Sample-Based Aberrations
10.4.2.1. Photobleaching
10.4.2.2. Autofluorescence
10.4.2.3. Absorption and Scattering of the Medium
10.4.3. Sample and Instrumentation Handling-Based Aberrations
10.5. Image Corrections in Fluorescence Microscopy
10.5.1. Background Shading Correction
10.5.2. Correction Using the Recorded Image
10.5.3. Correction Using Calibration Images
10.5.3.1. Two-Image Calibration
10.5.3.2. Background Subtraction
10.5.4. Correction Using Surface Fitting
10.5.5. Autofluorescence Correction
10.5.6. Spectral Overlap Correction
10.5.7. Photobleaching Correction
10.6. Quantifying Fluorescence
10.6.1. Fluorescence Intensity and Fluorophore Concentration
10.7. Fluorescence Imaging Techniques
10.7.1. Immunofluorescence
10.7.2. Fluorescence In Situ Hybridization (FISH)
10.7.3. Quantitative Colocalization Analysis
10.7.4. Fluorescence Ratio Imaging (RI)
10.7.5. Fluorescence Resonance Energy Transfer (FRET)
10.7.6. Fluorescence Lifetime Imaging (FLIM) FRET
10.7.6.1. Time Correlated Single Photon Counting (TCSPC) FLIM-FRET
10.7.7. Fluorescence Recovery After Photobleaching (FRAP)
10.7.8. Total Internal Reflectance Fluorescence Microscopy (TIRFM)
10.7.9. Fluorescence Correlation Spectroscopy (FCS)
10.8. Summary of Important Points
References
Chapter Eleven: Three-Dimensional Imaging
11.1. Introduction
11.2. Image Acquisition
11.2.1. Wide-Field 3D Microscopy
11.2.2. Confocal Microscopy
11.2.3. Multiphoton Microscopy
11.2.4. Microscope Configuration
11.2.5. Other 3D Microscopy Techniques
11.3. 3D Image Data
11.3.1. 3D Image Representation
11.3.1.1. 3D Image Notation
11.4. Image Restoration and Deblurring
11.4.1. The Point Spread Function
11.4.1.1. Theoretical Model of the PSF
11.4.1.2. Approximate Methods
11.4.2. Models for Microscope Image Formation
11.4.2.1. Poisson Noise
11.4.2.2. Gaussian Noise
11.4.3. Algorithms for Deblurring and Restoration
11.4.3.1. No-Neighbor Methods
11.4.3.2. Nearest-Neighbor Method
11.4.3.3. Linear Methods
Inverse Filtering
Wiener Deconvolution
Linear Least Squares
Regularization
Tikhonov Regularization
11.4.3.4. Nonlinear Methods
Jansson-van Cittert Method
The Nonlinear Constrained Least Squares Method
The Carrington Algorithm
The Iterative Constrained Tikhonov-Miller Algorithm
11.4.3.5. Maximum Likelihood Restoration
The EM-ML Algorithm
The Richardson-Lucy Algorithm
Maximum Penalized Likelihood Method
Maximum A Posteriori Method
11.4.3.6. Blind Deconvolution
11.4.3.7. Space-Variant Deconvolution
11.4.3.8. Interpretation of Deconvolved Images
11.4.3.9. Commercial and Free Deconvolution Packages
11.5. Image Fusion
11.6. Three-Dimensional Image Processing
11.7. Geometric Transformations
11.8. Pointwise Operations
11.9. Histogram Operations
11.10. Filtering
11.10.1. Linear Filters
11.10.1.1. Finite Impulse Response Filters
11.10.2. Nonlinear Filters
11.10.2.1. Median Filter
11.10.2.2. Weighted Median Filter
11.10.2.3. Minimum and Maximum Filters
11.10.2.4. α-Trimmed Mean Filters
11.10.3. Edge Detection Filters
11.11. Morphological Operators
11.11.1. Binary Morphology
11.11.2. Grayscale Morphology
11.12. Segmentation
11.12.1. Point-Based Segmentation
11.12.2. Edge-Based Segmentation
11.12.3. Region-Based Segmentation
11.12.3.1. Connectivity
11.12.3.2. Region Growing
11.12.3.3. Region Splitting and Merging
11.12.4. Deformable Models
11.13. Comparing 3D Images
11.14. Registration
11.15. Object Measurements in 3D
11.15.1. Euler Number
11.15.2. Bounding Box
11.15.3. Center of Mass
11.15.4. Surface Area Estimation
11.15.4.1. Surface Estimation Using Superquadric Primitives
11.15.4.2. Surface Estimation Using Spherical Harmonics
11.15.5. Length Estimation
11.15.6. Curvature Estimation
11.15.6.1. The Surface Triangulation Method
11.15.6.2. The Cross Patch Method
11.15.7. Volume Estimation
11.15.8. Texture
11.16. 3D Image Display
11.16.1. Montage
11.16.2. Projected Images
11.16.2.1. Voxel Projection
11.16.2.2. Ray Casting
11.16.3. Surface and Volume Rendering
11.16.3.1. Surface Rendering
11.16.3.2. Volume Rendering
11.16.4. Stereo Pairs
11.16.5. Color Anaglyphs
11.16.6. Animations
11.17. Summary of Important Points
References
Chapter Twelve: Superresolution Image Processing
12.1. Introduction
12.2. The Diffraction Limit
12.3. Deconvolution
12.3.1. Signals and Noise
12.3.2. Extrapolating Beyond the Diffraction Limit
12.3.2.1. Statistical Methods
12.3.2.2. Machine Learning Methods
12.4. Superresolution Imaging Techniques
12.4.1. Analytic Continuation
12.4.2. Stimulated Emission Depletion Microscopy
12.4.3. Expansion Microscopy
12.4.4. Single Molecule Localization Microscopy
12.4.5. Structured Illumination Microscopy
12.4.6. Synthetic Superresolution with Machine Learning
12.5. Summary of Important Points
References
Chapter Thirteen: Localization Microscopy
13.1. Introduction
13.1.1. A Brief History of Localization Microscopy
13.2. Overcoming the Diffraction Limit
13.2.1. Diffraction-Limited Resolution
13.2.2. Photoswitching Mechanisms
13.3. Localizing Molecular Position
13.3.1. Spot Candidate Selection
13.3.1.1. Local Intensity Maxima
13.3.1.2. Nonmaximum Suppression
13.3.1.3. Centroid Estimation
13.3.1.4. The Intensity Threshold
13.3.2. Gaussian Model Fitting
13.3.2.1. Least Squares Fitting
13.3.2.2. The Method of Steepest Descent
13.3.2.3. Newton's Method
13.3.2.4. The Levenberg-Marquardt Method
13.3.2.5. Maximum Likelihood Fitting
13.3.3. Localization Methods
13.3.3.1. Spot Centroid Calculation
13.3.3.2. The Radial Symmetry Method
13.3.3.3. Spline and Complex Model Fitting
13.3.4. Visualization of Localization Data
13.3.4.1. Scatterplots
13.3.4.2. Two-dimensional Histograms
Jittering
13.3.4.3. Intensity Interpolation to Neighboring Pixels
Averaged Shifted Histograms
13.3.4.4. Gaussian Rendering
13.3.5. Localization and Image Artifacts in SMLM
13.4. Three-Dimensional Localization Microscopy
13.4.1. Calibration Measurements
13.4.2. Multiplane Imaging
13.4.3. Point Spread Function Engineering
13.4.4. Intensity-Based Approaches
13.4.4.1. Supercritical Angle Localization
13.4.4.2. Photometric Localization
13.5. Quantitative Localization Microscopy
13.5.1. Quality Control of Localization Data
13.5.1.1. Temporal Drift Correction
Fiducial Markers
Self-Alignment
Cross-Correlation Analysis
13.5.2. Localization Precision and Image Resolution
13.5.2.1. Theoretical Localization Precision
13.5.2.2. Experimental Precision and Resolution
Analyzing Isolated Emitter Spots
Tracing and Tracking
Localization Precision, Resolution, and Sampling
Fourier-Ring Correlation
13.5.3. Localization-Based Cluster Analysis
13.5.3.1. Statistical SMLM Cluster Analysis
Ripley's Functions
Correlation-Based Clustering
13.5.3.2. Density-Based Clustering (DBSCAN)
13.5.3.3. K-means Clustering
13.5.3.4. Voronoi Tessellation
13.5.3.5. Bayesian Cluster Analysis
13.5.4. Particle Averaging
13.6. Implementation and Applications of SMLM
13.6.1. Machine and Deep Learning for SMLM
13.6.2. MINFLUX
13.6.3. Applications of SMLM
13.7. Summary of Important Points
References
Chapter Fourteen: Motion Tracking and Analysis
14.1. Introduction
14.2. Image Acquisition
14.2.1. Microscope Setup
14.2.2. Spatial Dimensionality
14.2.3. Temporal Resolution
14.3. Image Preprocessing
14.3.1. Image Denoising
14.3.2. Image Deconvolution
14.3.3. Image Registration
14.4. Image Analysis
14.4.1. Cell Tracking
14.4.1.1. Cell Segmentation
14.4.1.2. Cell Association
14.4.2. Particle Tracking
14.4.2.1. Particle Detection
14.4.2.2. Particle Association
14.5. Trajectory Analysis
14.5.1. Geometry Measurements
14.5.2. Diffusivity Measurements
14.5.3. Velocity Measurements
14.6. Sample Algorithms
14.6.1. Cell Tracking
14.6.2. Particle Tracking
14.7. Summary of Important Points
References
Chapter Fifteen: Deep Learning
15.1. Introduction
15.1.1. Basic Components of Neural Networks
15.1.2. A Timeline of Convolutional Neural Network Development
15.1.3. A Timeline of Deep Learning in Microscopy
15.2. Deep Learning Concepts
15.2.1. Training
15.2.2. Activation Functions
15.2.3. Cost Functions
15.2.4. Convolutional Neural Networks
15.3. Practical Applications
15.3.1. Classification
15.3.2. Detection
15.3.3. Segmentation
15.4. Software Frameworks
15.5. Training Deep Learning Networks
15.5.1. Data Augmentation
15.5.2. Transfer Learning
15.6. Application of Deep Learning for Cell Nuclei Detection
15.7. Challenges
15.8. Summary of Important Points
References
Chapter Sixteen: Image Informatics
16.1. Introduction
16.2. Open-source Software Ecosystems
16.2.1. Java Libraries and Tools
16.2.2. Python Tools
16.2.3. C++ Tools
16.2.4. Tool Interoperation
16.3. Image Acquisition
16.3.1. Image Processing and Analysis
16.3.2. Machine Learning Platforms
16.4. Image Storage and Curation
16.4.1. Data Curation
16.4.2. Storage Backend
16.5. Visualization
16.6. Community
16.7. Conclusion
16.8. Summary of Important Points
References
Glossary
Further reading
Index
Back Cover