Incorporating domain knowledge for tubule detection in breast histopathology using O'Callaghan neighborhoods
Ajay Basavanhally, Elaine Yu, Jun Xu, Shridar Ganesan, Michael Feldman, John Tomaszewski, Anant Madabhushi
Abstract
An important criterion for identifying complex objects with multiple attributes is the use of domain knowledge that reflects the precise spatial linking of the constituent attributes. Hence, simply detecting the presence of the low-level attributes that constitute an object, even when these attributes are found in spatial proximity to one another, is usually not a robust strategy. The O'Callaghan neighborhood is an ideal vehicle for characterizing objects composed of multiple attributes spatially connected in a precise fashion, because it allows for modeling and imposing spatial distance and directional constraints on the object attributes. In this work we apply the O'Callaghan neighborhood to the problem of tubule identification in hematoxylin and eosin (H&E) stained breast cancer (BCa) histopathology, where a tubule is characterized by a central lumen surrounded by cytoplasm and a ring of nuclei around the cytoplasm. The detection of tubules is important because tubular density is a key predictor in cancer grade determination, and in the context of ER+ BCa, grade has been shown to be strongly linked to disease aggressiveness and patient outcome. Standard pattern recognition approaches to the detection of complex objects typically involve training classifiers for the low-level attributes individually. For tubule detection, the spatial proximity of lumen, cytoplasm, and nuclei might suggest the presence of a tubule; however, such an approach could suffer from false positives due to fat, stroma, and other lumen-like areas that could be mistaken for tubules. Here, tubules are identified by using O'Callaghan neighborhoods to impose distance and directional constraints between each lumen and its surrounding ring of nuclei. Cancer nuclei in each image are found via a color deconvolution scheme that isolates the hematoxylin stain, thereby enabling automated detection of individual cell nuclei. Potential lumen areas are segmented using a Hierarchical Normalized Cut (HNCut) initialized Color Gradient based Active Contour model (CGAC). The HNCut algorithm detects lumen-like areas within the image via pixel clustering across multiple image resolutions; this clustering provides initial contours, from which the CGAC evolves to segment the boundaries of the potential lumen areas. A set of 22 graph-based image features characterizing the spatial linking of the tubular attributes is extracted from the O'Callaghan neighborhood in order to distinguish true from false lumens. Evaluation on 1226 potential lumen areas from 14 patient studies yields an area under the receiver operating characteristic curve (AUC) of 0.91 along with the ability to classify true lumens with 86% accuracy. In comparison to manual grading of tubular density over 105 images, our method distinguishes histopathology images with low and high tubular density with 89% accuracy (AUC = 0.94).
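The abstract does not specify the stain vectors used in the color deconvolution step; the sketch below is a minimal stand-in that uses scikit-image's built-in H&E stain matrix (rgb2hed) to isolate the hematoxylin channel and extract candidate nucleus centroids. The function name and the area cutoff (min_area) are illustrative assumptions.

```python
from skimage.color import rgb2hed
from skimage.filters import threshold_otsu
from skimage.measure import label, regionprops

def detect_nuclei_centroids(rgb_image, min_area=30):
    """Return (row, col) centroids of candidate nuclei in an RGB H&E tile."""
    hed = rgb2hed(rgb_image)               # unmix hematoxylin/eosin/DAB channels
    hematoxylin = hed[:, :, 0]             # nuclei take up the hematoxylin stain
    mask = hematoxylin > threshold_otsu(hematoxylin)  # global threshold
    labeled = label(mask)
    # min_area (pixels) is an assumed, magnification-dependent cutoff
    return [r.centroid for r in regionprops(labeled) if r.area >= min_area]
```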
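The paper's HNCut-initialized CGAC is not reproduced here. As a rough stand-in, the sketch below evolves scikit-image's morphological geodesic active contour from a user-supplied binary mask of candidate lumen regions, which plays the role of the HNCut initialization; the edge-stopping parameters are assumptions.

```python
from skimage.color import rgb2gray
from skimage.segmentation import (inverse_gaussian_gradient,
                                  morphological_geodesic_active_contour)

def segment_lumen(rgb_image, init_mask, iterations=120):
    """Evolve a contour from an initial binary mask toward lumen boundaries."""
    gray = rgb2gray(rgb_image)
    # Edge-stopping image: values drop near strong intensity gradients
    gimage = inverse_gaussian_gradient(gray, alpha=100.0, sigma=2.0)
    # balloon=1 lets the contour expand from the seed mask until edges stop it
    return morphological_geodesic_active_contour(
        gimage, iterations, init_level_set=init_mask, balloon=1)
```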
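A minimal sketch of an O'Callaghan-style neighborhood around a lumen centroid, under a simplified reading of the constraints: a nucleus is accepted as a neighbor if it falls within a distance bound and is not directionally shadowed by an already-accepted, closer neighbor. The thresholds d_max and angle_min and the exact shadowing test are assumptions; O'Callaghan's original definition is more involved.

```python
import numpy as np

def ocallaghan_neighbors(center, points, d_max=100.0, angle_min=np.pi / 6):
    """Indices of points accepted as O'Callaghan-style neighbors of center."""
    center = np.asarray(center, dtype=float)
    pts = np.asarray(points, dtype=float)
    d = np.linalg.norm(pts - center, axis=1)
    kept = []
    for i in np.argsort(d):                  # consider closest points first
        if d[i] > d_max:                     # distance constraint
            break
        if d[i] == 0:                        # skip a point coincident with center
            continue
        v = (pts[i] - center) / d[i]
        # direction constraint: reject if within angle_min of the direction
        # to a closer, already-accepted neighbor
        if any(np.arccos(np.clip(np.dot((pts[j] - center) / d[j], v), -1, 1))
               < angle_min for j in kept):
            continue
        kept.append(i)
    return kept
```

Simple statistics over the accepted neighbors (neighbor count, mean and variance of edge lengths, angular spread) are natural candidates for the kind of graph-based features the abstract describes, though the 22 actual features are not enumerated there.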
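The abstract names neither the 22 graph-based features nor the classifier used for the true-versus-false-lumen decision. The sketch below uses placeholder feature vectors and a random forest purely to illustrate the evaluation protocol (AUC and accuracy over 1226 candidate lumens); every value here is synthetic.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score, roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(1226, 22))        # placeholder for 22 graph features
y = rng.integers(0, 2, size=1226)      # placeholder true/false lumen labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("AUC:", roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1]))
print("accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```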
© 2011 Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Ajay Basavanhally, Elaine Yu, Jun Xu, Shridar Ganesan, Michael Feldman, John Tomaszewski, and Anant Madabhushi "Incorporating domain knowledge for tubule detection in breast histopathology using O'Callaghan neighborhoods", Proc. SPIE 7963, Medical Imaging 2011: Computer-Aided Diagnosis, 796310 (4 March 2011); https://doi.org/10.1117/12.878092
CITATIONS
Cited by 38 scholarly publications and 2 patents.
KEYWORDS
Cancer, Image segmentation, Spatial analysis, RGB color model, Feature extraction, Deconvolution, Breast cancer
