Vessel bend-based cup segmentation in retinal images
We proposed a method for cup boundary detection from monocular colour fundus images to help quantify cup changes. The method is based on anatomical evidence, such as vessel bends at the cup boundary, considered relevant by glaucoma experts. Vessels are modeled and detected in a curvature space to better handle inter-image variations. Bends in a vessel are robustly detected using a region-of-support concept, which automatically selects the right scale for analysis. A reliable subset of bends, called r-bends, is derived using a multi-stage strategy, and local spline fitting is used to obtain the desired cup boundary. The method has been successfully tested on 133 images, comprising 32 normal and 101 glaucomatous images, against three glaucoma experts. The proposed method shows high sensitivity in cup-to-disk ratio-based glaucoma detection, and local assessment of the detected cup boundary shows good consensus with the expert markings.
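As a rough illustration of the final spline-fitting step described above, the Python sketch below fits a closed smoothing spline through a set of candidate bend points using SciPy. It assumes the reliable r-bend points have already been selected and ordered data are not required; the function name and parameter choices are illustrative, not the authors' implementation.

# Illustrative sketch: fit a closed spline through candidate vessel-bend points
# to approximate a cup boundary. Not the authors' implementation; bend points
# are assumed to be given as (x, y) coordinates from the r-bend subset above.
import numpy as np
from scipy.interpolate import splprep, splev

def fit_cup_boundary(bend_points, num_samples=200):
    """Fit a periodic (closed) smoothing spline through bend points.

    bend_points : (N, 2) array of (x, y) coordinates, N > 4.
    Returns a (num_samples, 2) array of boundary coordinates.
    """
    pts = np.asarray(bend_points, dtype=float)
    # Order the points by angle around their centroid so the spline
    # traces the boundary once without self-crossings.
    centre = pts.mean(axis=0)
    angles = np.arctan2(pts[:, 1] - centre[1], pts[:, 0] - centre[0])
    pts = pts[np.argsort(angles)]

    # Periodic parametric spline; s > 0 allows smoothing over noisy bend locations.
    tck, _ = splprep([pts[:, 0], pts[:, 1]], per=True, s=len(pts))
    u = np.linspace(0.0, 1.0, num_samples)
    x, y = splev(u, tck)
    return np.column_stack([x, y])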
Optic disk and cup boundary detection using regional information
Shape deformation within the optic disk (OD) is an important indicator for the detection of glaucoma. In this paper, relevant disk parameters are estimated using the OD and cup boundaries. A deformable model guided by regional statistics is used to detect the OD boundary. A cup boundary detection scheme is presented based on the appearance of pallor in Lab colour space and the expected cup symmetry. The proposed scheme is tested on 170 images comprising 40 normal and 130 glaucomatous images. The proposed method gives a mean cup-to-disk ratio estimation error of 0.030 for normal and 0.121 for glaucomatous images, which compares well with figures reported in the literature.
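The sketch below illustrates, in simplified form, the idea of locating the cup via pallor in Lab colour space: the palest pixels inside a given OD mask are retained. The percentile threshold is an assumption of ours, and the symmetry constraint of the scheme is not reproduced here.

# Illustrative sketch of pallor-based cup localisation in Lab colour space.
# A simplified stand-in for the scheme above, not the paper's implementation.
import numpy as np
from skimage import color

def rough_cup_mask(rgb_image, od_mask, pallor_percentile=80):
    """Return a binary mask of pale pixels inside the optic disk.

    rgb_image : (H, W, 3) RGB fundus image.
    od_mask   : (H, W) boolean mask of the segmented optic disk.
    """
    lab = color.rgb2lab(rgb_image)
    lightness = lab[..., 0]  # L channel: perceptual brightness (pallor)
    threshold = np.percentile(lightness[od_mask], pallor_percentile)
    return od_mask & (lightness >= threshold)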
Optic disk and cup segmentation for glaucoma assessment
We proposed an automatic OD parameterization technique based on segmented OD and cup regions obtained from monocular retinal images. A novel OD segmentation method is proposed that integrates the local image information around each point of interest in a multidimensional feature space, providing robustness against variations found in and around the OD region. We also propose a novel cup segmentation method based on anatomical evidence, such as vessel bends at the cup boundary, considered relevant by glaucoma experts. Bends in a vessel are robustly detected using a region-of-support concept, which automatically selects the right scale for analysis. A multi-stage strategy is employed to derive a reliable subset of vessel bends, called r-bends, followed by local spline fitting to derive the desired cup boundary. The method has been evaluated on 138 images, comprising 33 normal and 105 glaucomatous images, against three glaucoma experts. The obtained segmentation results show consistency in handling the various geometric and photometric variations found across the dataset. The estimation error for the vertical cup-to-disk diameter ratio is 0.09/0.08 (mean/standard deviation), while for the cup-to-disk area ratio it is 0.12/0.10. Overall, the qualitative and quantitative results show the method's effectiveness in both segmentation and subsequent OD parameterization for glaucoma assessment.
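For reference, the two reported parameters can be computed from binary disk and cup masks as sketched below; the definitions (vertical diameter ratio and area ratio) are standard, while the helper name and mask conventions are ours.

# Illustrative computation of the two OD parameters reported above, assuming
# binary masks of the segmented disk and cup are available.
import numpy as np

def cup_to_disk_ratios(disk_mask, cup_mask):
    """Return (vertical CDR, area CDR) from boolean masks of equal shape."""
    disk_rows = np.where(disk_mask.any(axis=1))[0]
    cup_rows = np.where(cup_mask.any(axis=1))[0]

    disk_height = disk_rows.max() - disk_rows.min() + 1  # vertical disk diameter (pixels)
    cup_height = cup_rows.max() - cup_rows.min() + 1     # vertical cup diameter (pixels)

    vertical_cdr = cup_height / disk_height
    area_cdr = cup_mask.sum() / disk_mask.sum()
    return vertical_cdr, area_cdr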
Joint optic disc and cup boundary extraction from monocular fundus images
We proposed a novel boundary-based Conditional Random Field formulation that extracts both the optic disc and cup boundaries in a single optimization step. In addition to color gradients, the proposed method explicitly models depth, which is estimated from the fundus image itself using a coupled sparse dictionary trained on a set of image and depth-map pairs, with the depth maps derived from Optical Coherence Tomography. The estimated depth achieved a correlation coefficient of 0.80 with respect to the ground truth. The proposed segmentation method outperformed several state-of-the-art methods on five public datasets. The average Dice coefficient was in the range of 0.87–0.97 for disc segmentation across three datasets and 0.83 for cup segmentation on the DRISHTI-GS1 test set. The method achieved good glaucoma classification performance, with an average AUC of 0.85 for five-fold cross-validation on RIM-ONE v2.
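The sketch below shows one common way the test-time step of a coupled sparse-dictionary depth estimator can be realised: image patches are sparse-coded against the image dictionary and the same codes are applied to the paired depth dictionary. This is a hedged illustration only; dictionary training, patch extraction and patch aggregation are omitted, and the function and variable names are ours, not the paper's.

# Illustrative sketch of depth reconstruction with a coupled sparse dictionary.
# Assumes paired dictionaries D_image / D_depth that share sparse codes.
import numpy as np
from sklearn.decomposition import sparse_encode

def estimate_depth_patches(image_patches, D_image, D_depth, n_nonzero=5):
    """image_patches : (N, p) flattened patches from the fundus image.
    D_image, D_depth : (K, p) paired dictionaries sharing sparse codes.
    Returns (N, p) reconstructed depth patches.
    """
    # Sparse codes of the image patches w.r.t. the image dictionary (OMP).
    codes = sparse_encode(image_patches, D_image,
                          algorithm='omp', n_nonzero_coefs=n_nonzero)
    # Reuse the same codes with the depth dictionary to synthesise depth.
    return codes @ D_depth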