Retinex theory addresses the problem of separating the illumination from the reflectance in a given image and thereby compensating for non-uniform lighting. This is in general an ill-posed problem. In this paper we propose a variational model for the Retinex problem that unifies previous methods. Similar to previous algorithms, it assumes spatial smoothness of the illumination field. In addition, knowledge of the limited dynamic range of the reflectance is used as a constraint in the recovery process. A penalty term is also included, exploiting a priori knowledge of the nature of the reflectance image. The proposed formulation adopts a Bayesian viewpoint of the estimation problem, which leads to an algebraic regularization term that contributes to better conditioning of the reconstruction problem. Based on the proposed variational model, we show that the illumination estimation problem can be formulated as a quadratic programming problem. An efficient multi-resolution algorithm is proposed. It exploits the spatial correlation in the reflectance and illumination images. Applications of the algorithm to various color images yield promising results.
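To make the variational formulation concrete, the following is a minimal sketch (not the paper's implementation) of estimating a smooth log-illumination l bounded below by the log-image s. It uses projected gradient descent rather than the quadratic programming solver and multi-resolution scheme described above; the weights alpha and beta and the step size are illustrative assumptions.

```python
import numpy as np

def laplacian(u):
    # 5-point Laplacian with replicated borders
    p = np.pad(u, 1, mode='edge')
    return p[:-2, 1:-1] + p[2:, 1:-1] + p[1:-1, :-2] + p[1:-1, 2:] - 4.0 * u

def estimate_illumination(s, alpha=0.05, beta=0.1, iters=300, step=0.1):
    """Minimize |grad l|^2 + alpha*(l - s)^2 + beta*|grad(l - s)|^2  s.t.  l >= s
    (log domain) by projected gradient descent; alpha, beta, step are illustrative."""
    l = s.copy()
    for _ in range(iters):
        grad = -2.0 * laplacian(l) + 2.0 * alpha * (l - s) - 2.0 * beta * laplacian(l - s)
        l = np.maximum(l - step * grad, s)   # gradient step, then project onto l >= s
    return l

# Usage on one channel I in (0, 1]:
#   s = np.log(I + 1e-6); l = estimate_illumination(s); R = np.exp(s - l)
```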
KEYWORDS: 3D modeling, Medical imaging, 3D metrology, Algorithm development, Tumors, 3D image processing, Motion models, Detection and tracking algorithms
A novel scheme for performing detection and measurements in medical images is presented. The technique is based on active contours evolving in time according to intrinsic geometric measures of the image. The evolving contours naturally split and merge, allowing the simultaneous detection of several objects and of both interior and exterior boundaries. The proposed approach is based on the relation between active contours and the computation of geodesics, or minimal-distance curves. The minimal-distance curve lies in a Riemannian space whose metric is defined by the image content. Measurements are performed after the object is detected. Due to the high accuracy achieved by the proposed geodesic approach, it is natural to use it to compute the area or length of the detected object, which are of great value for diagnosis. Open curves with fixed boundaries are computed as well. This addition to the deformable model adds flexibility, allowing the user to choose guiding points in the image or to select regions for measurements. Experimental results of applying the scheme to real medical images demonstrate its potential. The results may be extended to 3D object segmentation as well.
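As a rough illustration of contours evolving according to intrinsic geometric measures of the image, the sketch below runs a level-set evolution of the standard geodesic active contour form phi_t = g(I)(kappa + c)|grad phi| + grad g . grad phi. The edge-stopping function g, the balloon constant c, and the simple central-difference numerics are assumptions for illustration, not the paper's exact scheme.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def edge_stopping(image, sigma=2.0):
    # g(I) = 1 / (1 + |grad(G_sigma * I)|^2): small near edges, ~1 in flat regions
    gy, gx = np.gradient(gaussian_filter(image, sigma))
    return 1.0 / (1.0 + gx**2 + gy**2)

def evolve(phi, g, c=1.0, dt=0.1, iters=500):
    gy_g, gx_g = np.gradient(g)
    for _ in range(iters):
        phi_y, phi_x = np.gradient(phi)
        norm = np.sqrt(phi_x**2 + phi_y**2) + 1e-8
        # curvature kappa = div(grad phi / |grad phi|)
        kappa = np.gradient(phi_x / norm, axis=1) + np.gradient(phi_y / norm, axis=0)
        phi = phi + dt * (g * (kappa + c) * norm + gx_g * phi_x + gy_g * phi_y)
    return phi   # the zero level set of phi traces the detected boundaries
```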
Finding the shortest path between points on a surface is a challenging global optimization problem. It is difficult to devise an algorithm that is computationally efficient, locally accurate, and guaranteed to converge to the globally shortest path. In this paper a two-stage, coarse-to-fine approach to finding shortest paths is suggested. In the first stage a fast algorithm is used to obtain an approximation to the globally shortest path. In the second stage the approximation is refined into a locally optimal path. The first stage uses the fast algorithm introduced by Kiryati and Szekely, which combines a 3D length estimator with graph search. This path is then refined into a shorter geodesic curve by an algorithm that deforms an arbitrary initial curve ending at two given surface points via the geodesic curvature shortening flow. The 3D curve shortening flow is transformed into an equivalent 2D one that is implemented using an efficient numerical algorithm for curve evolution with fixed end points, introduced by Kimmel and Sapiro.
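The coarse stage can be illustrated with a plain Dijkstra search over the mesh edge graph (a stand-in for the Kiryati-Szekely length-estimator search, not the paper's algorithm): it returns an edge-restricted path that over-estimates the true geodesic and serves only as the initialization that the curvature-flow refinement then shortens. The mesh representation assumed below (vertex array plus adjacency lists) is hypothetical.

```python
import heapq
import numpy as np

def coarse_shortest_path(vertices, adjacency, src, dst):
    """Dijkstra on the mesh edge graph.
    vertices: (n, 3) array; adjacency: dict vertex index -> list of neighbor indices."""
    dist, prev, heap = {src: 0.0}, {}, [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == dst:
            break
        if d > dist.get(u, np.inf):
            continue          # stale heap entry
        for v in adjacency[u]:
            nd = d + np.linalg.norm(vertices[u] - vertices[v])
            if nd < dist.get(v, np.inf):
                dist[v], prev[v] = nd, u
                heapq.heappush(heap, (nd, v))
    path = [dst]              # backtrack from the destination
    while path[-1] != src:
        path.append(prev[path[-1]])
    return path[::-1]
```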
The medial axis transform (MAT) of a shape, better known as its skeleton, is frequently used in shape analysis and related areas. In this paper a new approach for determining the skeleton of an object is presented. The boundary is segmented at points of maximal positive curvature, and a distance map from each of the segments is calculated. The skeleton is then located by applying simple rules to the zero sets of the distance-map differences. A framework is proposed for numerical approximation of distance maps that is consistent with the continuous case and hence does not suffer from digitization bias due to metrication errors of the implementation on the grid. Subpixel accuracy in the distance map calculation is obtained by using gray-level information along the boundary of the shape in the numerical scheme. The accuracy of the resulting efficient skeletonization algorithm is demonstrated by several examples.
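The segment-distance idea can be sketched as follows: given boolean masks of the boundary segments, compute a distance map from each and mark as skeleton the pixels where the two smallest distances nearly coincide, i.e. near the zero set of the distance-map differences. The curvature-based boundary segmentation and the sub-pixel numerical scheme of the paper are not reproduced here; distance_transform_edt is used as a pixel-accurate stand-in.

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def skeleton_from_segments(segments, tol=1.0):
    """segments: list of boolean masks of equal shape, one per boundary segment."""
    # distance from every pixel to its nearest point on each boundary segment
    maps = np.stack([distance_transform_edt(~seg) for seg in segments])
    two_nearest = np.sort(maps, axis=0)[:2]   # per pixel, the two smallest distances
    # skeleton: pixels (nearly) equidistant from two different boundary segments
    return (two_nearest[1] - two_nearest[0]) < tol
```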
An algorithm for computing the Euclidean distance from the boundary of a given digitized shape is presented. The distance is calculated with sub-pixel accuracy. The algorithm is based on an equal-distance contour evolution process. The moving contour is embedded as a level set in a time-varying function of higher dimension. This representation of the evolving contour makes possible the use of an accurate and stable numerical scheme, due to Osher and Sethian.
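A minimal sketch of the idea, assuming the standard first-order upwind (Osher-Sethian) discretization of phi_t + |grad phi| = 0: the embedding function is moved outward at unit speed, and the time at which it crosses zero at a pixel approximates that pixel's Euclidean distance from the initial boundary. The paper's sub-pixel accuracy would additionally require interpolating the crossing time, which this sketch omits.

```python
import numpy as np

def upwind_step(phi, dt=0.5):
    # first-order upwind approximation of |grad phi| for outward (unit-speed) motion
    p = np.pad(phi, 1, mode='edge')
    dxm, dxp = phi - p[1:-1, :-2], p[1:-1, 2:] - phi   # backward / forward x-differences
    dym, dyp = phi - p[:-2, 1:-1], p[2:, 1:-1] - phi   # backward / forward y-differences
    grad = np.sqrt(np.maximum(dxm, 0)**2 + np.minimum(dxp, 0)**2 +
                   np.maximum(dym, 0)**2 + np.minimum(dyp, 0)**2)
    return phi - dt * grad

def distance_map(phi0, t_max, dt=0.5):
    """phi0: negative inside the shape, positive outside; returns outward arrival
    times (exterior distances; run again on -phi0 for interior distances)."""
    phi, dist, t = phi0.copy(), np.full(phi0.shape, np.inf), 0.0
    while t <= t_max:
        newly_reached = (phi <= 0) & np.isinf(dist)
        dist[newly_reached] = t           # arrival time ~ distance from the boundary
        phi, t = upwind_step(phi, dt), t + dt
    return dist
```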
Conference Committee Involvement (1)
Parallel Processing for Imaging Applications
24 January 2011 | San Francisco Airport, California, United States