This paper proposes a greedy algorithm for automated reconstruction of neural arbors from light microscopy stacks of
images. The algorithm is based on the minimum cost path method. While the minimum cost path, obtained using the
Fast Marching Method, results in a trace with the least cumulative cost between the start and the end points, it is not
sufficient for the reconstruction of neural trees. This is because sections of the minimum cost path can erroneously travel
through the image background with almost no detectable increase in the cumulative cost. To circumvent this problem, we
propose an algorithm that grows a neural tree from a specified root by iteratively re-initializing the Fast Marching fronts.
The speed image used in the Fast Marching Method is generated by computing the average outward flux of the gradient
vector flow field. Each iteration of the algorithm produces a candidate extension by allowing the front to travel a
specified distance and then tracking from the farthest point of the front back to the tree. A robust likelihood ratio test is
used to evaluate the quality of the candidate extension by comparing voxel intensities along the extension to those in the
foreground and the background. The qualified extensions are appended to the current tree, the front is re-initialized, and
Fast Marching is continued until the stopping criterion is met. To evaluate the performance of the algorithm, we reconstructed six stacks of two-photon microscopy images and compared the results to the ground-truth reconstructions
using the DIADEM metric. The average comparison score was 0.82 out of 1.0, which is on par with the performance
achieved by expert manual tracers.
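As a concrete illustration of the loop described above, here is a minimal Python sketch. It assumes the scikit-fmm and SciPy packages; the speed image is taken as given (the paper derives it from the average outward flux of the gradient vector flow field), the discrete backtracking and the Gaussian log-likelihood-ratio test are simplified stand-ins for the paper's components, and all function and parameter names are illustrative rather than the authors' code.

```python
import numpy as np
import skfmm                    # pip install scikit-fmm
from scipy.stats import norm

def log_likelihood_ratio(vals, fg, bg):
    # Gaussian log-likelihood ratio of the extension's voxel intensities
    # under a foreground model versus a background model; a simplified
    # stand-in for the paper's robust likelihood ratio test.
    # fg and bg are (mean, std) pairs estimated from the image.
    return np.sum(norm.logpdf(vals, *fg) - norm.logpdf(vals, *bg))

def backtrack(t, end, tree):
    # Track from `end` back to the current tree by repeatedly stepping to
    # the 26-neighbor with the smallest travel time (discrete descent on t).
    path, cur = [end], end
    while cur not in tree:
        nbrs = []
        for dz in (-1, 0, 1):
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    n = (cur[0] + dz, cur[1] + dy, cur[2] + dx)
                    if n != cur and all(0 <= c < s for c, s in zip(n, t.shape)):
                        nbrs.append(n)
        cur = min(nbrs, key=lambda n: t[n])
        path.append(cur)
    return path

def grow_tree(stack, speed, root, step, fg, bg, max_iters=10000):
    # Grow a tree of voxel positions from `root` by iteratively
    # re-initializing the Fast Marching front on the current tree.
    # `speed` is a positive speed image; `stack` holds raw intensities.
    tree = {root}
    for _ in range(max_iters):
        phi = np.ones(speed.shape)
        for v in tree:
            phi[v] = -1.0                  # zero level set sits on the tree
        t = np.ma.filled(skfmm.travel_time(phi, speed), np.inf)
        reach = np.where(t <= step, t, -np.inf)
        # Candidate endpoint: farthest point the front reaches within the
        # specified travel distance `step`.
        end = tuple(int(i) for i in np.unravel_index(np.argmax(reach), t.shape))
        if not np.isfinite(reach[end]) or end in tree:
            break                          # front cannot advance any further
        ext = backtrack(t, end, tree)
        vals = np.array([stack[v] for v in ext])
        if log_likelihood_ratio(vals, fg, bg) <= 0:
            break                          # stopping criterion: extension rejected
        tree.update(ext)                   # append the qualified extension
    return tree
```

Note that re-initializing the level set on the entire current tree at every iteration is what forces each candidate extension to connect back to the existing reconstruction rather than to the root alone.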
This paper presents an algorithm for automated extraction of interest points (IPs) in multispectral and hyperspectral images. Interest points are image features that capture information from their neighbourhoods and are distinctive and stable under transformations such as translation and rotation. Interest-point operators for
monochromatic images were proposed more than a decade ago and have since been studied extensively. IPs have
been applied to diverse problems in computer vision, including image matching, recognition, registration, 3D
reconstruction, change detection, and content-based image retrieval. Interest points aid data reduction and lessen the computational burden of algorithms such as registration, object detection, and 3D reconstruction by replacing an exhaustive search over the entire image domain with a probe into a concise set of highly informative points. An interest operator seeks out points in an image that are structurally distinct, invariant to imaging conditions, stable under geometric transformation, and interpretable; such points are good candidates for interest points. Our approach builds on Lowe's keypoint operator, which uses local extrema of the Difference of Gaussian (DoG) operator at multiple scales to detect interest points in gray-level images. The proposed approach extends Lowe's method by directly converting scalar operations, such as scale-space generation and extreme-point detection, into operations that take the vector nature of the image into account. Experimental results with RGB and hyperspectral images demonstrate the potential of the method and the improvements that a fully vectorial approach can offer over the band-by-band approaches described in the literature.
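To make the vectorization concrete, the sketch below builds a per-band Gaussian scale space and reduces the Difference-of-Gaussian response to a single scalar per pixel via the Euclidean norm over the bands before locating scale-space maxima. It assumes SciPy; the norm-based reduction, the scale ladder, and the contrast threshold are illustrative choices, not necessarily those of the paper.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, maximum_filter

def vector_dog_interest_points(image, sigmas=(1.0, 1.6, 2.6, 4.1), thresh=0.03):
    # `image` is an H x W x B array with B spectral bands, scaled to [0, 1].
    image = np.asarray(image, dtype=float)
    # Per-band Gaussian scale space: blur only the two spatial axes.
    blurred = [gaussian_filter(image, sigma=(s, s, 0)) for s in sigmas]
    # Vector DoG: difference of adjacent scales, reduced to one scalar
    # response per pixel by the Euclidean norm over the band axis.
    dog = np.stack([np.linalg.norm(b1 - b0, axis=-1)
                    for b0, b1 in zip(blurred, blurred[1:])])
    # Interest points: local maxima of the response magnitude within a
    # 3 x 3 x 3 scale-space neighborhood, above a contrast threshold.
    peaks = (dog == maximum_filter(dog, size=3)) & (dog > thresh)
    return np.argwhere(peaks)              # rows of (scale_index, row, col)
```

Because the bands enter the response jointly through the norm, a detected extremum reflects coordinated variation across the spectrum, which is the kind of information that band-by-band detection followed by merging can miss.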