Publica
Segmentation-based partial volume correction for volume estimation of solid lesions in CT
 IEEE Transactions on Medical Imaging 33 (2014), No. 2, pp. 462-480. ISSN: 0278-0062; ISSN: 1558-254X 

 English 
 Journal article 
 Fraunhofer MEVIS 
Abstract
In oncological chemotherapy monitoring, the change of a tumor's size is an important criterion for assessing cancer therapeutics. Measuring the volume of a tumor requires its delineation in 3D. This is called segmentation, which is an intensively studied problem in medical image processing. However, simply counting the voxels within a binary segmentation result can lead to significant differences in the volume if the lesion has been segmented slightly differently by various segmentation procedures or in different scans, for example due to the limited spatial resolution of computed tomography (CT) or partial volume effects. This variability limits the sensitivity of size measurements and thus of therapy response assessments, and it can even lead to misclassifications. We present a fast, generic algorithm for measuring the volume of solid, compact tumors in CT that considers partial volume effects at the border of a given segmentation result. The algorithm is an extension of the segmentation-based partial volume analysis proposed by Kuhnigk for the volumetry of solid lung lesions, such that it can be applied to inhomogeneous lesions and lesions with inhomogeneous surroundings. Our generalized segmentation-based partial volume correction is based on a spatial subdivision of the segmentation result, from which the fraction of tumor for each voxel is computed. It has been evaluated on phantom data, 1516 lesion segmentation pairs (lung nodules, liver metastases and lymph nodes), as well as 1851 lung nodules from the LIDC-IDRI database. The evaluations of our algorithm show a more accurate estimation of the real volume and its ability to reduce inter- and intra-observer variability significantly for each entity. Overall, the variability (interquartile range) for phantom data is reduced by 49% (p ≪ 0.001), and the variability between different readers is reduced by 28% (p ≪ 0.001). The average computation time is 0.2 s.
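The core idea of the abstract — summing per-voxel tumor fractions instead of counting binary-segmented voxels — can be illustrated with a minimal NumPy sketch. This is not the paper's actual algorithm (which derives local tumor and background intensities from a spatial subdivision of the segmentation result); the function names and the simple two-class linear mixture model below are illustrative assumptions only.

```python
import numpy as np

def linear_tumor_fraction(hu, mean_background, mean_tumor):
    # Toy two-class mixture model (assumption, not the paper's method):
    # a voxel whose intensity lies between the background and tumor mean
    # intensities is treated as a linear mixture of both tissues.
    # Values outside the range are clipped to valid fractions [0, 1].
    frac = (hu - mean_background) / (mean_tumor - mean_background)
    return np.clip(frac, 0.0, 1.0)

def pvc_volume_mm3(fractions, voxel_size_mm):
    # Partial-volume-corrected volume: sum of per-voxel tumor fractions
    # times the physical voxel volume, rather than a binary voxel count.
    return float(fractions.sum() * np.prod(voxel_size_mm))

# Tiny example: one background voxel, one border voxel halfway between
# the class means, and two pure tumor voxels (intensities in HU).
hu = np.array([-800.0, -375.0, 50.0, 50.0])
frac = linear_tumor_fraction(hu, mean_background=-800.0, mean_tumor=50.0)
volume = pvc_volume_mm3(frac, (1.0, 1.0, 1.0))
```

With 1 mm isotropic voxels this yields 2.5 mm³ (fractions 0.0, 0.5, 1.0, 1.0), whereas binary counting with a midpoint threshold would give either 2 or 3 mm³ depending on which side of the threshold the border voxel falls — exactly the kind of variability the paper's correction is designed to reduce.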