The NIH/NIGMS
Center for Integrative Biomedical Computing

SCI Publications

2014


S. Kurugol, K. Kose, B. Park, J.G. Dy, D.H. Brooks, M. Rajadhyaksha. “Automated Delineation of Dermal-Epidermal Junction In Reflectance Confocal Microscopy Image Stacks Of Human Skin,” In Journal of Investigative Dermatology, September, 2014.
DOI: 10.1038/jid.2014.379
PubMed ID: 25184959

ABSTRACT

Reflectance confocal microscopy (RCM) images skin non-invasively, with optical sectioning and nuclear-level resolution comparable to that of pathology. Based on assessment of the dermal-epidermal junction (DEJ) and morphologic features in its vicinity, skin cancer can be diagnosed in vivo with high sensitivity and specificity. However, the current visual, qualitative approach for reading images leads to subjective variability in diagnosis. We hypothesize that machine learning-based algorithms may enable a more quantitative, objective approach. Testing and validation were performed with two algorithms that can automatically delineate the DEJ in RCM stacks of normal human skin. The test set was composed of 15 fair and 15 dark skin stacks (30 subjects) with expert labelings. In dark skin, in which the contrast is high due to melanin, the algorithm produced an average error of 7.9±6.4 μm. In fair skin, the algorithm delineated the DEJ as a transition zone, with an average error of 8.3±5.8 μm for the epidermis-to-transition zone boundary and 7.6±5.6 μm for the transition zone-to-dermis boundary. Our results suggest that automated algorithms may quantitatively guide the delineation of the DEJ, to assist in objective reading of RCM images. Further development of such algorithms may guide assessment of abnormal morphological features at the DEJ.
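As an illustration of how the reported per-boundary errors could be computed, the following minimal sketch (not the authors' code) assumes the algorithm and the expert each provide a DEJ depth value, in micrometers, at every lateral tile position; all names are hypothetical.

```python
# Minimal sketch (not from the paper): mean +/- std absolute depth error
# between an algorithm's DEJ boundary and an expert labeling, assuming both
# are given as depth maps (in micrometers) over the same lateral tile grid.
import numpy as np

def boundary_error(algorithm_depth, expert_depth):
    """Return (mean, std) of the absolute depth difference in micrometers."""
    diff = np.abs(np.asarray(algorithm_depth, float) - np.asarray(expert_depth, float))
    return diff.mean(), diff.std()

# Example with synthetic 10x10 tile grids for the epidermis-to-transition zone boundary.
rng = np.random.default_rng(0)
expert_ez = 50 + 5 * rng.standard_normal((10, 10))   # expert-labeled depths (um)
algo_ez = expert_ez + 8 * rng.standard_normal((10, 10))  # algorithm output (um)
print("E/TZ boundary error: %.1f +/- %.1f um" % boundary_error(algo_ez, expert_ez))
```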



J. Sourati, D. Erdogmus, J.G. Dy, D.H. Brooks. “Accelerated learning-based interactive image segmentation using pairwise constraints,” In IEEE Transactions on Image Processing, Vol. 23, No. 7, pp. 3057-3070. July, 2014.
DOI: 10.1109/TIP.2014.2325783
PubMed ID: 24860031
PubMed Central ID: PMC4096329

ABSTRACT

Algorithms for fully automatic segmentation of images are often not sufficiently generic or accurate, and fully manual segmentation is not practical in many settings. There is a need for semiautomatic algorithms that are capable of interacting with the user and taking into account the collected feedback. Typically, such methods have simply incorporated user feedback directly. Here, we employ active learning of optimal queries to guide user interaction. Our work in this paper is based on constrained spectral clustering that iteratively incorporates user feedback by propagating it through the calculated affinities. The original framework does not scale well to large data sets, and hence is not straightforward to apply to interactive image segmentation. In order to address this issue, we adopt advanced numerical methods for eigen-decomposition implemented over a subsampling scheme. Our key innovation, however, is an active learning strategy that chooses pairwise queries to present to the user in order to increase the rate of learning from the feedback. Performance evaluation is carried out on the Berkeley segmentation and Graz-02 image data sets, confirming that convergence to high accuracy levels is realizable in relatively few iterations.
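A toy sketch of one round of such an interactive loop is given below, under simplifying assumptions: the query heuristic (closest cross-cluster pair) and the direct write of the answer into the affinity matrix are stand-ins, not the constraint propagation or query-selection rule developed in the paper.

```python
# Illustrative sketch only (not the authors' algorithm): one round of an
# interactive constrained-spectral-clustering loop in which a pairwise query
# is chosen actively and the (simulated) user answer is written directly into
# the affinity matrix.
import numpy as np
from sklearn.cluster import SpectralClustering
from sklearn.metrics.pairwise import rbf_kernel

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (30, 2)), rng.normal(3, 1, (30, 2))])  # toy "pixels"
true_labels = np.repeat([0, 1], 30)

A = rbf_kernel(X, gamma=0.5)                      # affinity matrix
labels = SpectralClustering(n_clusters=2, affinity="precomputed",
                            random_state=0).fit_predict(A)

# Active query heuristic: the closest pair currently placed in different
# clusters is likely near a decision boundary, hence informative to ask about.
dists = np.linalg.norm(X[:, None] - X[None, :], axis=-1)
dists[labels[:, None] == labels[None, :]] = np.inf   # keep cross-cluster pairs only
i, j = np.unravel_index(np.argmin(dists), dists.shape)

# Simulated oracle answer; a real system would ask the user.
must_link = true_labels[i] == true_labels[j]
A[i, j] = A[j, i] = 1.0 if must_link else 0.0

labels = SpectralClustering(n_clusters=2, affinity="precomputed",
                            random_state=0).fit_predict(A)
```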


2013


J. Sourati, D.H. Brooks, J.G. Dy, D. Erdogmus. “Constrained Spectral Clustering for Image Segmentation,” In IEEE International Workshop on Machine Learning for Signal Processing, pp. 1--6. 2013.
DOI: 10.1109/MLSP

ABSTRACT

Constrained spectral clustering with affinity propagation in its original form is not practical for large-scale problems like image segmentation. In this paper we employ a novelty selection sub-sampling strategy, along with efficient numerical eigen-decomposition methods, to make this algorithm work efficiently for images. In addition, entropy-based active learning is employed to select the queries posed to the user more wisely in an interactive image segmentation framework. We evaluate the algorithm on general and medical images to show that the segmentation results improve using constrained clustering even if one works with a subset of pixels. Furthermore, this happens more efficiently when the pixels to be labeled are selected actively.
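A minimal sketch of novelty-selection sub-sampling in its generic form (an assumption about the general idea, not the exact procedure used in the paper): a sample is kept as a representative only if it lies farther than a threshold from every representative selected so far.

```python
# Generic novelty-selection sub-sampling sketch: retain a pixel's feature
# vector as a representative only if it is sufficiently far from every
# representative already selected, so the retained subset covers the data.
import numpy as np

def novelty_select(features, threshold):
    """Return indices of the selected representative samples."""
    selected = [0]
    for idx in range(1, len(features)):
        d = np.linalg.norm(features[selected] - features[idx], axis=1)
        if d.min() > threshold:
            selected.append(idx)
    return np.array(selected)

rng = np.random.default_rng(0)
pixel_features = rng.random((5000, 5))          # e.g. color + position per pixel
reps = novelty_select(pixel_features, threshold=0.35)
print(f"kept {len(reps)} of {len(pixel_features)} pixels")
```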



J. Sourati, K. Kose, M. Rajadhyaksha, J.G. Dy, D. Erdogmus, D.H. Brooks. “Automated localization of wrinkles and the dermo-epidermal junction in obliquely oriented reflectance confocal microscopic images of human skin,” In Proc. SPIE 8565, Photonic Therapeutics and Diagnostics IX, Vol. 8565, 2013.
DOI: 10.1117/12.2006489

ABSTRACT

Reflectance Confocal Microscopic (RCM) imaging of obliquely-oriented optical sections, rather than traditional z-stacks, shows depth information that more closely mimics the appearance of skin in orthogonal sections of histology. This approach may considerably reduce the amount of data that must be acquired and processed. However, as with z-stacks, purely visual detection of the dermal-epidermal junction (DEJ) in oblique images remains challenging. Here, we have extended our original algorithm for localization of the DEJ in z-stacks to oblique images. In addition, we developed an algorithm for detecting wrinkles, which, in addition to its intrinsic merit, gives useful information for DEJ detection.


2012


S. Kurugol, M. Rajadhyaksha, J.G. Dy, D.H. Brooks. “Validation study of automated dermal/epidermal junction localization algorithm in reflectance confocal microscopy images of skin,” In Proceedings of SPIE Photonic Therapeutics and Diagnostics VIII, Vol. 8207, No. 1, pp. 820702-820711. 2012.
DOI: 10.1117/12.909227
PubMed ID: 24376908
PubMed Central ID: PMC3872972

ABSTRACT

Reflectance confocal microscopy (RCM) has seen increasing clinical application for noninvasive diagnosis of skin cancer. Identifying the location of the dermal-epidermal junction (DEJ) in the image stacks is key for effective clinical imaging. For example, one clinical imaging procedure acquires a dense stack of 0.5x0.5mm FOV images and then, after manual determination of DEJ depth, collects a 5x5mm mosaic at that depth for diagnosis. However, especially in lightly pigmented skin, RCM images have low contrast at the DEJ which makes repeatable, objective visual identification challenging. We have previously published proof of concept for an automated algorithm for DEJ detection in both highly- and lightly-pigmented skin types based on sequential feature segmentation and classification. In lightly-pigmented skin the change of skin texture with depth was detected by the algorithm and used to locate the DEJ. Here we report on further validation of our algorithm on a more extensive collection of 24 image stacks (15 fair skin, 9 dark skin). We compare algorithm performance against classification by three clinical experts. We also evaluate inter-expert consistency among the experts. The average correlation across experts was 0.81 for lightly pigmented skin, indicating the difficulty of the problem. The algorithm achieved epidermis/dermis misclassification rates smaller than 10% (based on 25x25 mm tiles) and average distance from the expert labeled boundaries of ~6.4 μm for fair skin and ~5.3 μm for dark skin, well within average cell size and less than 2x the instrument resolution in the optical axis.
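For illustration only (not the study's evaluation code), the sketch below computes the two kinds of figures reported above: a Pearson correlation between two experts' depth labels and a tile-wise epidermis/dermis misclassification rate; the toy data are made up.

```python
# Hedged sketch of the evaluation quantities mentioned in the abstract:
# inter-expert Pearson correlation of DEJ depth labels and the tile-wise
# epidermis/dermis misclassification rate of the algorithm vs. an expert.
import numpy as np

def pearson(a, b):
    a, b = np.asarray(a, float).ravel(), np.asarray(b, float).ravel()
    return np.corrcoef(a, b)[0, 1]

def misclassification_rate(pred, truth):
    """Fraction of tiles whose epidermis/dermis label disagrees with the expert."""
    return np.mean(np.asarray(pred) != np.asarray(truth))

# Toy example: depth labels (um) from two experts and tile-wise class labels.
expert1 = np.array([48.0, 52.5, 60.0, 55.0])
expert2 = np.array([50.0, 51.0, 62.5, 54.0])
print("inter-expert correlation:", round(pearson(expert1, expert2), 2))

algo_tiles   = np.array([0, 0, 1, 1, 1, 0])   # 0 = epidermis, 1 = dermis
expert_tiles = np.array([0, 0, 1, 1, 0, 0])
print("misclassification rate:", misclassification_rate(algo_tiles, expert_tiles))
```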


2011


S. Kurugol, E. Bas, D. Erdogmus, J.G. Dy, G.C. Sharp, D.H. Brooks. “Centerline extraction with principal curve tracing to improve 3D level set esophagus segmentation in CT images,” In Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBS), pp. 3403--3406. 2011.
DOI: 10.1109/IEMBS.2011.6090921
PubMed ID: 2225507
PubMed Central ID: PMC3349355

ABSTRACT

For radiotherapy planning, contouring of the target volume and healthy structures at risk in CT volumes is essential. To automate this process, one of the available segmentation techniques can be used for many thoracic organs except the esophagus, which is very hard to segment due to low contrast. In this work we propose to initialize our previously introduced model-based 3D level set esophagus segmentation method with a principal curve tracing (PCT) algorithm, which we adapted to solve the esophagus centerline detection problem. To address challenges due to low intensity contrast, we enhanced the PCT algorithm by learning spatial and intensity priors from a small set of annotated CT volumes. To locate the esophageal wall, the model-based 3D level set algorithm, which includes a shape model representing the variance of the esophagus wall around the estimated centerline, is utilized. Our results show improvement in esophagus segmentation when initialized by PCT compared to our previous work, in which an ad hoc centerline initialization was performed. Unlike previous approaches, this work achieves similar performance without needing a very large set of annotated training images.
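As a rough, hypothetical illustration of how learned priors can drive a simple centerline estimate, the sketch below computes a slice-by-slice centroid of a probability map built from a spatial prior and an intensity-likelihood term. It is not the principal curve tracing algorithm used in the paper; the function names and priors are assumptions.

```python
# Simplified, hypothetical sketch (the paper uses principal curve tracing with
# learned priors; this is a stand-in): estimate a rough esophagus centerline
# slice by slice as the centroid of a probability map formed by multiplying a
# learned spatial prior with an intensity-likelihood term.
import numpy as np

def rough_centerline(ct_volume, spatial_prior, intensity_likelihood):
    """ct_volume: (Z, Y, X) array; spatial_prior: (Y, X) probability map learned
    from annotated volumes; intensity_likelihood: callable mapping HU values to
    likelihoods. Returns one (y, x) centerline point per slice."""
    points = []
    ys, xs = np.indices(ct_volume.shape[1:])
    for z in range(ct_volume.shape[0]):
        prob = spatial_prior * intensity_likelihood(ct_volume[z])
        prob = prob / prob.sum()
        points.append((np.sum(ys * prob), np.sum(xs * prob)))
    return np.array(points)

# Toy usage with a Gaussian intensity likelihood around a soft-tissue HU value.
vol = np.random.default_rng(0).normal(40, 30, (20, 64, 64))
prior = np.ones((64, 64)) / (64 * 64)
likelihood = lambda hu: np.exp(-((hu - 40.0) ** 2) / (2 * 25.0 ** 2))
print(rough_centerline(vol, prior, likelihood)[:3])
```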



S. Kurugol, J.G. Dy, M. Rajadhyaksha, K.W. Gossage, J. Weissman, D.H. Brooks. “Semi-automated Algorithm for Localization of Dermal/Epidermal Junction in Reflectance Confocal Microscopy Images of Human Skin,” In Proceedings of SPIE, Vol. 7904, pp. 79041A-79041A-10. 2011.
DOI: 10.1117/12.875392
PubMed ID: 21709746
PubMed Central ID: PMC3120112

ABSTRACT

The examination of the dermal/epidermal junction (DEJ) is clinically important for skin cancer diagnosis. Reflectance confocal microscopy (RCM) is an emerging tool for detection of skin cancers in vivo. However, visual localization of the DEJ in RCM images, with high accuracy and repeatability, is challenging, especially in fair skin, due to low contrast, heterogeneous structure, and high inter- and intra-subject variability. We recently proposed a semi-automated algorithm to localize the DEJ in z-stacks of RCM images of fair skin, based on feature segmentation and classification. Here we extend the algorithm to dark skin. The extended algorithm first decides the skin type and then applies the appropriate DEJ localization method. In dark skin, strong backscatter from the pigment melanin causes the basal cells above the DEJ to appear with high contrast. To locate those high-contrast regions, the algorithm operates on small tiles (regions) and finds the peaks of the smoothed average intensity depth profile of each tile. However, for some tiles, due to heterogeneity, multiple peaks exist in the depth profile and the strongest peak might not be the basal layer peak. To select the correct peak, basal cells are represented with a vector of texture features. The peak with features most similar to this feature vector is selected. The results show that the algorithm detected the skin types correctly for all 17 stacks tested (8 fair, 9 dark). The DEJ detection algorithm achieved an average distance from the ground truth DEJ surface of around 4.7 μm for dark skin and around 7-14 μm for fair skin.

Keywords: confocal reflectance microscopy, image analysis, skin, classification
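The peak-selection step described in the abstract lends itself to a brief illustration. The sketch below smooths a tile's average-intensity depth profile, finds candidate peaks, and keeps the one whose texture features are closest to a reference basal-cell feature vector; the specific features and names are stand-ins, not the paper's implementation.

```python
# Minimal sketch of the peak-selection idea (illustrative; feature choices are
# assumptions): per tile, smooth the average-intensity depth profile, find its
# peaks, and keep the peak whose local texture features are closest to a
# reference basal-cell feature vector.
import numpy as np
from scipy.ndimage import uniform_filter1d
from scipy.signal import find_peaks

def select_basal_peak(tile_stack, basal_feature_vector, smooth=5):
    """tile_stack: (depth, h, w) RCM intensities for one tile.
    Returns the depth index of the peak most similar to basal-cell texture."""
    profile = uniform_filter1d(tile_stack.mean(axis=(1, 2)), size=smooth)
    peaks, _ = find_peaks(profile)
    if len(peaks) == 0:
        return int(np.argmax(profile))
    # Simple stand-in texture features at each peak: mean and std of the slice.
    feats = np.array([[tile_stack[p].mean(), tile_stack[p].std()] for p in peaks])
    dists = np.linalg.norm(feats - np.asarray(basal_feature_vector), axis=1)
    return int(peaks[np.argmin(dists)])

# Toy usage: a synthetic tile with a bright layer around depth index 30.
rng = np.random.default_rng(0)
tile = rng.random((60, 25, 25))
tile[28:33] += 1.0
print(select_basal_peak(tile, basal_feature_vector=[1.5, 0.3]))
```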



S. Kurugol, J.G. Dy, D.H. Brooks, M. Rajadhyaksha. “Pilot study of semiautomated localization of the dermal/epidermal junction in reflectance confocal microscopy images of skin,” In Journal of Biomedical Optics, Vol. 16, No. 3, pp. 036005. 2011.
DOI: 10.1117/1.3549740

ABSTRACT

Reflectance confocal microscopy (RCM) continues to be translated toward the detection of skin cancers in vivo. Automated image analysis may help clinicians and accelerate clinical acceptance of RCM. For screening and diagnosis of cancer, the dermal/epidermal junction (DEJ), at which melanomas and basal cell carcinomas originate, is an important feature in skin. In RCM images, the DEJ is marked by optically subtle changes and features and is difficult to detect purely by visual examination. Challenges for automation of DEJ detection include heterogeneity of skin tissue, high inter- and intra-subject variability, and low optical contrast. To cope with these challenges, we propose a semiautomated hybrid sequence segmentation/classification algorithm that partitions z-stacks of tiles into homogeneous segments by fitting a model of skin layer dynamics and then classifies tile segments as epidermis, dermis, or transitional DEJ region using texture features. We evaluate two different training scenarios: (1) training and testing on portions of the same stack; (2) training on one labeled stack and testing on one from a different subject with similar skin type. Initial results demonstrate the detectability of the DEJ in both scenarios, with epidermis/dermis misclassification rates smaller than 10% and an average distance from the expert-labeled boundaries of around 8.5 μm.
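As a hedged sketch of only the texture-based classification stage (the paper's algorithm also fits a model of skin-layer dynamics to partition each z-stack before classifying), the example below trains a generic SVM on simple stand-in texture features; it is illustrative, not the published method.

```python
# Illustrative classification-stage sketch: toy texture descriptors per tile
# (intensity and gradient-magnitude statistics) fed to an SVM that labels
# tiles as epidermis or dermis. Feature set and data are stand-ins.
import numpy as np
from sklearn.svm import SVC

def tile_features(tile):
    """Simple texture descriptors for one image tile."""
    gy, gx = np.gradient(tile.astype(float))
    grad = np.hypot(gy, gx)
    return [tile.mean(), tile.std(), grad.mean(), grad.std()]

rng = np.random.default_rng(0)
# Synthetic training tiles: "epidermis" = smoother, "dermis" = more textured.
epidermis = [rng.normal(0.5, 0.05, (16, 16)) for _ in range(40)]
dermis = [rng.normal(0.5, 0.25, (16, 16)) for _ in range(40)]
X = np.array([tile_features(t) for t in epidermis + dermis])
y = np.array([0] * 40 + [1] * 40)          # 0 = epidermis, 1 = dermis

clf = SVC(kernel="rbf").fit(X, y)
new_tile = rng.normal(0.5, 0.22, (16, 16))
print("predicted class:", clf.predict([tile_features(new_tile)])[0])
```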


2010


S. Kurugol, N. Ozay, J.G. Dy, G.C. Sharp, D.H. Brooks. “Locally Deformable Shape Model to Improve 3D Level Set based Esophagus Segmentation,” In Proceedings of the IAPR International Conference on Pattern Recognition, pp. 3955--3958. August, 2010.
PubMed ID: 21731883
PubMed Central ID: PMC3127393

ABSTRACT

In this paper we propose a supervised 3D segmentation algorithm to locate the esophagus in thoracic CT scans using a variational framework. To address challenges due to low contrast, several priors are learned from a training set of segmented images. Our algorithm first estimates the centerline based on a spatial model learned at a few manually marked anatomical reference points. Then an implicit shape model is learned by subtracting the centerline from the training shapes and applying PCA. To allow local variations in the shapes, we propose to use nonlinear smooth local deformations. Finally, the esophageal wall is located within a 3D level set framework by optimizing a cost function including terms for appearance, the shape model, smoothness constraints, and an air/contrast model.
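The implicit PCA shape-model step can be illustrated with a small sketch under simplifying assumptions: training shapes are taken to be centerline-aligned signed distance maps, and the nonlinear smooth local deformations proposed in the paper are omitted.

```python
# Minimal sketch of an implicit PCA shape model (illustration only): aligned
# signed distance maps are stacked, PCA extracts the main modes of variation,
# and a new shape is represented by a few coefficients along those modes.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
yy, xx = np.mgrid[-32:32, -32:32]

# Synthetic training set: signed distance maps of circles of varying radius,
# standing in for centerline-aligned esophagus wall cross-sections.
shapes = [np.hypot(yy, xx) - r for r in rng.uniform(8, 14, size=25)]
X = np.array([s.ravel() for s in shapes])

pca = PCA(n_components=3).fit(X)
mean_shape = pca.mean_.reshape(64, 64)

# A modeled shape is the mean plus a few coefficients along the main modes;
# its zero level set gives the modeled wall contour.
coeffs = np.array([1.5, 0.0, 0.0])
modeled = (pca.mean_ + coeffs @ pca.components_).reshape(64, 64)
print("interior pixel counts (modeled vs. mean):",
      (modeled < 0).sum(), (mean_shape < 0).sum())
```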