mindboggle.shapes package

Submodules

mindboggle.shapes.laplace_beltrami module

Compute the Laplace-Beltrami spectrum using a linear finite element method.

We follow the definitions and steps given in Reuter et al.’s 2009 paper, “Discrete Laplace-Beltrami Operators for Shape Analysis and Segmentation.”

References (please cite when using for publication):

Martin Reuter et al. Discrete Laplace-Beltrami Operators for Shape Analysis and Segmentation. Computers & Graphics 33(3):381-390, 2009.

Martin Reuter et al. Laplace-Beltrami spectra as “Shape-DNA” of surfaces and solids. Computer-Aided Design 38(4):342-366, 2006.

Dependency:
SciPy 0.10 or later, to solve the generalized eigenvalue problem. For information about using SciPy to solve a generalized eigenvalue problem, see: http://docs.scipy.org/doc/scipy/reference/tutorial/arpack.html
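
A minimal sketch of solving the generalized eigenvalue problem A x = λ B x with SciPy's eigsh, using the computeAB() matrices documented below; the shift sigma=-0.01 is an illustrative choice to keep ARPACK away from the near-zero first eigenvalue, not necessarily what fem_laplacian() uses internally:

    import numpy as np
    from scipy.sparse.linalg import eigsh
    from mindboggle.shapes.laplace_beltrami import computeAB

    # Unit cube mesh (same as in the examples below):
    points = np.array([[0,0,0], [0,1,0], [1,1,0], [1,0,0],
                       [0,0,1], [0,1,1], [1,1,1], [1,0,1]])
    faces = np.array([[0,1,2], [2,3,0], [4,5,6], [6,7,4], [0,4,7], [7,3,0],
                      [0,4,5], [5,1,0], [1,5,6], [6,2,1], [3,7,6], [6,2,3]])

    A, B = computeAB(points, faces)

    # Solve A x = eigenvalue * B x in shift-invert mode; the small negative
    # shift steers ARPACK away from the (near-)zero first eigenvalue:
    eigenvalues, eigenvectors = eigsh(A, k=3, M=B, sigma=-0.01)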

NOTE

For "points," include only the coordinates of vertices belonging to the
3-D structure whose spectrum is to be calculated. For example, do not pass
all POINTS of a VTK file as "points" while passing only a subset of its
faces as "faces." Otherwise, the matrices will contain all-zero rows for
the unused vertices, causing a singular matrix error when they are inverted.
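
To restrict a mesh to one labeled patch correctly, drop the faces outside the patch and renumber the remaining vertices; a sketch using the same mesh utilities and data-fetching pattern that appear in the examples below:

    from mindboggle.mio.vtks import read_vtk
    from mindboggle.mio.fetch_data import prep_tests
    from mindboggle.guts.mesh import keep_faces, reindex_faces_points
    from mindboggle.shapes.laplace_beltrami import fem_laplacian

    urls, fetch_data = prep_tests()
    label_file = fetch_data(urls['left_freesurfer_labels'], '', '.vtk')
    points, f1, f2, faces, labels, f3, f4, f5 = read_vtk(label_file)

    # Keep only vertices of one labeled region (1022 = left postcentral),
    # drop faces that reference any other vertex, and renumber the rest so
    # that "points" contains exactly the vertices used by "faces":
    indices = [i for i, x in enumerate(labels) if x == 1022]
    faces = keep_faces(faces, indices)
    faces, points, original_indices = reindex_faces_points(faces, points)

    spectrum = fem_laplacian(points, faces, spectrum_size=6,
                             normalization=None)
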
Acknowledgments:
  • Dr. Martin Reuter, Harvard Medical School, provided his MATLAB code, and contributed code and bug fixes. He also assisted us in understanding his articles about the Laplace-Beltrami operator.
  • Dr. Eric You Xu, Google (http://www.youxu.info/), explained how eigenvalue problems are solved numerically.
Authors:

Copyright 2016, Mindboggle team (http://mindboggle.info), Apache v2.0 License

area_normalize(points, faces, spectrum)

Normalize a spectrum using areas as suggested in Reuter et al. (2006).

Parameters:
  • points (list of lists of 3 floats) – x,y,z coordinates for each vertex of the structure
  • faces (list of lists of 3 integers) – 3 indices to vertices that form a triangle on the mesh
  • spectrum (list of floats) – LB spectrum of a given shape defined by points and faces
Returns:

new_spectrum – LB spectrum normalized by area

Return type:

list of floats

Examples

>>> import numpy as np
>>> from mindboggle.shapes.laplace_beltrami import area_normalize
>>> from mindboggle.shapes.laplace_beltrami import fem_laplacian
>>> # Define a cube:
>>> points = [[0,0,0], [0,1,0], [1,1,0], [1,0,0],
...           [0,0,1], [0,1,1], [1,1,1], [1,0,1]]
>>> faces = [[0,1,2], [2,3,0], [4,5,6], [6,7,4], [0,4,7], [7,3,0],
...          [0,4,5], [5,1,0], [1,5,6], [6,2,1], [3,7,6], [6,2,3]]
>>> spectrum = fem_laplacian(points, faces, spectrum_size=3,
...                          normalization=None)
>>> [float("{0:.{1}f}".format(x, 5)) for x in spectrum[1::]]
[4.58359, 4.8]
>>> new_spectrum = area_normalize(points, faces, spectrum)
>>> [float("{0:.{1}f}".format(x, 5)) for x in new_spectrum[1::]]
[27.50155, 28.8]
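
The normalization itself amounts to multiplying each eigenvalue by the total surface area of the mesh (6 for the unit cube above, hence 4.58359 × 6 ≈ 27.50155), which makes the spectrum invariant to uniform scaling. A minimal sketch of this step, not the package's exact implementation:

    import numpy as np

    def area_normalize_sketch(points, faces, spectrum):
        """Multiply each eigenvalue by the total mesh area (sketch)."""
        points = np.asarray(points, dtype=float)
        total_area = 0.0
        for a, b, c in faces:
            # Triangle area is half the norm of the edge cross product:
            total_area += 0.5 * np.linalg.norm(
                np.cross(points[b] - points[a], points[c] - points[a]))
        return [x * total_area for x in spectrum]
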
computeAB(points, faces)

Compute matrices for the Laplace-Beltrami operator.

The matrices correspond to A and B from Reuter’s 2009 article.

Note: All points must be on faces. Otherwise, a singular matrix error is generated when inverting D.
Parameters:
  • points (list of lists of 3 floats) – x,y,z coordinates for each vertex
  • faces (list of lists of 3 integers) – each list contains indices to vertices that form a triangle on a mesh
Returns:

  • A (csr_matrix)
  • B (csr_matrix)

Examples

>>> # Define a cube, then compute A and B on a selection of faces.
>>> import numpy as np
>>> from mindboggle.shapes.laplace_beltrami import computeAB
>>> points = [[0,0,0], [1,0,0], [0,0,1], [0,1,1],
...           [1,0,1], [0,1,0], [1,1,1], [1,1,0]]
>>> points = np.array(points)
>>> faces = [[0,2,4], [0,1,4], [2,3,4], [3,4,5], [3,5,6], [0,1,7]]
>>> faces = np.array(faces)
>>> A, B = computeAB(points, faces)
>>> print(np.array_str(A.toarray(), precision=5, suppress_small=True))
[[ 1.5     -1.      -0.5      0.       0.       0.       0.       0.     ]
 [-1.       2.       0.       0.      -0.5      0.       0.      -0.5    ]
 [-0.5      0.       2.      -0.5     -1.       0.       0.       0.     ]
 [ 0.       0.      -0.5      2.56066 -0.35355 -1.20711 -0.5      0.     ]
 [ 0.      -0.5     -1.      -0.35355  1.85355  0.       0.       0.     ]
 [ 0.       0.       0.      -1.20711  0.       1.20711  0.       0.     ]
 [ 0.       0.       0.      -0.5      0.       0.       0.5      0.     ]
 [ 0.      -0.5      0.       0.       0.       0.       0.       0.5    ]]
>>> print(np.array_str(B.toarray(), precision=5, suppress_small=True))
[[0.25    0.08333 0.04167 0.      0.08333 0.      0.      0.04167]
 [0.08333 0.16667 0.      0.      0.04167 0.      0.      0.04167]
 [0.04167 0.      0.16667 0.04167 0.08333 0.      0.      0.     ]
 [0.      0.      0.04167 0.28452 0.10059 0.10059 0.04167 0.     ]
 [0.08333 0.04167 0.08333 0.10059 0.36785 0.05893 0.      0.     ]
 [0.      0.      0.      0.10059 0.05893 0.20118 0.04167 0.     ]
 [0.      0.      0.      0.04167 0.      0.04167 0.08333 0.     ]
 [0.04167 0.04167 0.      0.      0.      0.      0.      0.08333]]
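
Per triangle, A accumulates the standard linear finite-element cotangent weights and B the linear-element mass matrix (area/6 on the diagonal, area/12 off the diagonal). A dense numpy sketch of this assembly, which reproduces the matrices printed above (computeAB() itself returns sparse csr matrices):

    import numpy as np

    def computeAB_sketch(points, faces):
        """Dense stiffness (A) and mass (B) matrices for a triangle mesh."""
        points = np.asarray(points, dtype=float)
        n = len(points)
        A = np.zeros((n, n))
        B = np.zeros((n, n))
        for tri in faces:
            p0, p1, p2 = points[tri[0]], points[tri[1]], points[tri[2]]
            area = 0.5 * np.linalg.norm(np.cross(p1 - p0, p2 - p0))
            for k in range(3):
                i, j, opp = tri[k], tri[(k + 1) % 3], tri[(k + 2) % 3]
                u, v = points[i] - points[opp], points[j] - points[opp]
                # Cotangent of the angle opposite edge (i, j):
                cot = np.dot(u, v) / np.linalg.norm(np.cross(u, v))
                A[i, j] -= 0.5 * cot
                A[j, i] -= 0.5 * cot
                A[i, i] += 0.5 * cot
                A[j, j] += 0.5 * cot
                # Linear-element mass: area/6 diagonal, area/12 off-diagonal:
                B[i, i] += area / 6.0
                B[i, j] += area / 12.0
                B[j, i] += area / 12.0
        return A, B
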
fem_laplacian(points, faces, spectrum_size=10, normalization='areaindex', verbose=False)

Compute linear finite-element method Laplace-Beltrami spectrum after Martin Reuter’s MATLAB code.

Comparison of fem_laplacian() with Martin Reuter’s Matlab eigenvalues:

fem_laplacian() results for Twins-2-1 left hemisphere (6 values): [4.829758648026221e-18, 0.0001284173002467199, 0.0002715181572272745, 0.0003205150847159417, 0.0004701628070486448, 0.0005768904023010318]

Martin Reuter’s shapeDNA-tria Matlab code: {-4.7207711983791511358e-18 ; 0.00012841730024672144738 ; 0.00027151815722727096853 ; 0.00032051508471592313632 ; 0.0004701628070486902353 ; 0.00057689040230097490998 }

fem_laplacian() results for Twins-2-1 left postcentral (1022): [6.3469513010430304e-18, 0.0005178862383467463, 0.0017434911095630772, 0.003667561767487686, 0.005429017880363784, 0.006309346984678924]

Martin Reuter’s Matlab code: {-2.1954862991027e-18 ; 0.0005178862383468 ; 0.0017434911095628 ; 0.0036675617674875 ; 0.0054290178803611 ; 0.006309346984678 }

Julien Lefevre, regarding comparison with Spangy results: “I have done some comparisons between my Matlab codes and yours on python and everything sounds perfect: The evaluation has been done only for one mesh (about 10000 vertices). - L2 error between your A and B matrices and mine are about 1e-16. - I have also compared eigenvalues of the generalized problem; even if the error is slightly increasing, it remains on the order of machine precision. - computation time for 1000 eigenvalues was 67s with python versus 63s in matlab. And it is quite the same behavior for lower orders. - Since the eigenvalues are increasing with order, it is also interesting to look at the relative error… high frequencies are not so much perturbed.”

Parameters:
  • points (list of lists of 3 floats) – x,y,z coordinates for each vertex of the structure
  • faces (list of lists of 3 integers) – 3 indices to vertices that form a triangle on the mesh
  • spectrum_size (integer) – number of eigenvalues to be computed (the length of the spectrum)
  • normalization (string) – the method used to normalize eigenvalues: if None, no normalization is used; if “area”, use the area of the 2D structure as in Reuter et al. (2006); if “index”, divide each eigenvalue by its index to account for the linear trend; if “areaindex”, do both (default)
  • verbose (bool) – print statements?
Returns:

spectrum – first spectrum_size eigenvalues for Laplace-Beltrami spectrum

Return type:

list

Examples

>>> import numpy as np
>>> from mindboggle.shapes.laplace_beltrami import fem_laplacian
>>> # Define a cube:
>>> points = [[0,0,0], [0,1,0], [1,1,0], [1,0,0],
...           [0,0,1], [0,1,1], [1,1,1], [1,0,1]]
>>> faces = [[0,1,2], [2,3,0], [4,5,6], [6,7,4], [0,4,7], [7,3,0],
...          [0,4,5], [5,1,0], [1,5,6], [6,2,1], [3,7,6], [6,2,3]]
>>> spectrum = fem_laplacian(points, faces, spectrum_size=3,
...                          normalization=None, verbose=False)
>>> [float("{0:.{1}f}".format(x, 5)) for x in spectrum[1::]]
[4.58359, 4.8]
>>> spectrum = fem_laplacian(points, faces, spectrum_size=3,
...                          normalization="area", verbose=False)
>>> [float("{0:.{1}f}".format(x, 5)) for x in spectrum[1::]]
[27.50155, 28.8]
>>> # Spectrum for entire left hemisphere of Twins-2-1:
>>> from mindboggle.mio.vtks import read_vtk
>>> from mindboggle.mio.fetch_data import prep_tests
>>> urls, fetch_data = prep_tests()
>>> label_file = fetch_data(urls['left_freesurfer_labels'], '', '.vtk')
>>> points, f1,f2, faces, labels, f3,f4,f5 = read_vtk(label_file)
>>> spectrum = fem_laplacian(points, faces, spectrum_size=6,
...                          normalization=None, verbose=False)
>>> [float("{0:.{1}f}".format(x, 5)) for x in spectrum[1::]]
[0.00013, 0.00027, 0.00032, 0.00047, 0.00058]
>>> # Spectrum for Twins-2-1 left postcentral pial surface (22):
>>> from mindboggle.guts.mesh import keep_faces, reindex_faces_points
>>> I22 = [i for i,x in enumerate(labels) if x==1022] # postcentral
>>> faces = keep_faces(faces, I22)
>>> faces, points, o1 = reindex_faces_points(faces, points)
>>> spectrum = fem_laplacian(points, faces, spectrum_size=6,
...                          normalization=None, verbose=False)
>>> [float("{0:.{1}f}".format(x, 5)) for x in spectrum[1::]]
[0.00057, 0.00189, 0.00432, 0.00691, 0.00775]
>>> # Area-normalized spectrum for a single label (postcentral):
>>> spectrum = fem_laplacian(points, faces, spectrum_size=6,
...                          normalization="area", verbose=False)
>>> [float("{0:.{1}f}".format(x, 5)) for x in spectrum[1::]]
[2.69259, 8.97865, 20.44857, 32.74477, 36.739]
index_normalize(spectrum)

Normalize a spectrum by dividing each eigenvalue by its index, to account for the linear increase of eigenvalue magnitude (Weyl’s law in 2D), as suggested in Reuter et al. (2006) and used in BrainPrint (Wachinger et al. 2015).

Parameters:
  • spectrum (list of floats) – LB spectrum of a given shape
Returns:

new_spectrum – linearly re-weighted LB spectrum

Return type:

list of floats

Examples

>>> import numpy as np
>>> from mindboggle.shapes.laplace_beltrami import index_normalize
>>> from mindboggle.shapes.laplace_beltrami import fem_laplacian
>>> # Define a cube:
>>> points = [[0,0,0], [0,1,0], [1,1,0], [1,0,0],
...           [0,0,1], [0,1,1], [1,1,1], [1,0,1]]
>>> faces = [[0,1,2], [2,3,0], [4,5,6], [6,7,4], [0,4,7], [7,3,0],
...          [0,4,5], [5,1,0], [1,5,6], [6,2,1], [3,7,6], [6,2,3]]
>>> spectrum = fem_laplacian(points, faces, spectrum_size=3,
...                          normalization=None)
>>> [float("{0:.{1}f}".format(x, 5)) for x in spectrum[1::]]
[4.58359, 4.8]
>>> new_spectrum = index_normalize(spectrum)
>>> [float("{0:.{1}f}".format(x, 5)) for x in new_spectrum[1::]]
[4.58359, 2.4]
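
Consistent with the example above (the eigenvalue at index 1 is unchanged and the one at index 2 is halved), a one-line sketch that assumes the near-zero first eigenvalue is kept as-is:

    def index_normalize_sketch(spectrum):
        """Divide the nth eigenvalue by n; keep the first one (sketch)."""
        return [spectrum[0]] + [x / i for i, x in enumerate(spectrum[1:], 1)]
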
spectrum_from_file(vtk_file, spectrum_size=10, exclude_labels=[-1], normalization='areaindex', area_file='', verbose=False)

Compute Laplace-Beltrami spectrum of a 3D shape in a VTK file.

Parameters:
  • vtk_file (string) – the input vtk file
  • spectrum_size (integer) – number of eigenvalues to be computed (the length of the spectrum)
  • exclude_labels (list of integers) – labels to be excluded
  • normalization (string) – the method used to normalize eigenvalues: if None, no normalization is used; if “area”, use the area of the 2D structure as in Reuter et al. (2006); if “index”, divide each eigenvalue by its index to account for the linear trend; if “areaindex”, do both (default)
  • area_file (string) – name of VTK file with surface area scalar values
  • verbose (bool) – print statements?
Returns:

spectrum – first spectrum_size eigenvalues of the Laplace-Beltrami spectrum

Return type:

list of floats

Examples

>>> # Spectrum for entire left hemisphere of Twins-2-1:
>>> import numpy as np
>>> from mindboggle.shapes.laplace_beltrami import spectrum_from_file
>>> from mindboggle.shapes.laplace_beltrami import spectrum_per_label
>>> from mindboggle.mio.fetch_data import prep_tests
>>> urls, fetch_data = prep_tests()
>>> vtk_file = fetch_data(urls['left_freesurfer_labels'], '', '.vtk')
>>> spectrum = spectrum_from_file(vtk_file, spectrum_size=6,
...     exclude_labels=[-1], normalization=None, area_file="", verbose=False)
>>> [float("{0:.{1}f}".format(x, 5)) for x in spectrum[1::]]
[0.00013, 0.00027, 0.00032, 0.00047, 0.00058]
>>> spectrum = spectrum_from_file(vtk_file, spectrum_size=6,
...     exclude_labels=[-1], normalization="areaindex", area_file="",
...     verbose=False)
>>> [float("{0:.{1}f}".format(x, 5)) for x in spectrum[1::]]
[14.12801, 14.93573, 11.75397, 12.93141, 12.69348]
spectrum_of_largest(points, faces, spectrum_size=10, exclude_labels=[-1], normalization='areaindex', areas=None, verbose=False)

Compute Laplace-Beltrami spectrum on largest connected segment.

In case a surface patch is fragmented, we select the largest fragment, remove extraneous triangular faces, and reindex the remaining vertices.
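
A sketch of the fragment selection using SciPy's connected components on the vertex adjacency graph; the package's own segmentation utilities may differ in detail:

    import numpy as np
    from scipy.sparse import coo_matrix
    from scipy.sparse.csgraph import connected_components
    from mindboggle.guts.mesh import keep_faces, reindex_faces_points

    def largest_fragment_sketch(points, faces):
        """Keep only the largest connected component of a triangle mesh."""
        F = np.asarray(faces)
        # Vertex adjacency graph from the three edges of every face:
        rows = np.concatenate([F[:, 0], F[:, 1], F[:, 2]])
        cols = np.concatenate([F[:, 1], F[:, 2], F[:, 0]])
        graph = coo_matrix((np.ones(len(rows)), (rows, cols)),
                           shape=(len(points), len(points)))
        nfragments, fragment = connected_components(graph, directed=False)
        # Vertices of the most populous fragment; then trim and renumber:
        indices = np.where(fragment == np.argmax(np.bincount(fragment)))[0]
        faces = keep_faces(faces, indices.tolist())
        faces, points, original_indices = reindex_faces_points(faces, points)
        return points, faces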

Parameters:
  • points (list of lists of 3 floats) – x,y,z coordinates for each vertex of the structure
  • faces (list of lists of 3 integers) – 3 indices to vertices that form a triangle on the mesh
  • spectrum_size (integer) – number of eigenvalues to be computed (the length of the spectrum)
  • exclude_labels (list of integers) – background values to exclude
  • normalization (string) – the method used to normalize eigenvalues: if None, no normalization is used; if “area”, use the area of the 2D structure as in Reuter et al. (2006); if “index”, divide each eigenvalue by its index to account for the linear trend; if “areaindex”, do both (default)
  • areas (numpy array or list of floats (or None)) – surface area scalar values for all vertices
  • verbose (bool) – print statements?
Returns:

spectrum – first spectrum_size eigenvalues for Laplace-Beltrami spectrum

Return type:

list

Examples

>>> # Spectrum for left postcentral + pars triangularis pial surfaces:
>>> import numpy as np
>>> from mindboggle.mio.vtks import read_scalars, read_vtk, write_vtk
>>> from mindboggle.guts.mesh import keep_faces, reindex_faces_points
>>> from mindboggle.shapes.laplace_beltrami import spectrum_of_largest
>>> from mindboggle.mio.fetch_data import prep_tests
>>> urls, fetch_data = prep_tests()
>>> label_file = fetch_data(urls['left_freesurfer_labels'], '', '.vtk')
>>> area_file = fetch_data(urls['left_area'], '', '.vtk')
>>> spectrum_size = 6
>>> exclude_labels = [-1]
>>> normalization = None
>>> points, indices, lines, faces, labels, f1, npoints, f2 = read_vtk(label_file,
...     return_first=True, return_array=True)
>>> I20 = [i for i,x in enumerate(labels) if x==1020] # pars triangularis
>>> I22 = [i for i,x in enumerate(labels) if x==1022] # postcentral
>>> I22.extend(I20)
>>> faces = keep_faces(faces, I22)
>>> faces, points, o1 = reindex_faces_points(faces, points)
>>> areas, u1 = read_scalars(area_file, True, True)
>>> verbose = False
>>> spectrum = spectrum_of_largest(points, faces, spectrum_size,
...     exclude_labels, normalization, areas, verbose)
>>> [float("{0:.{1}f}".format(x, 5)) for x in spectrum[1::]]
[0.00057, 0.00189, 0.00432, 0.00691, 0.00775]

View both segments (skip test):

>>> from mindboggle.mio.plots import plot_surfaces
>>> scalars = np.zeros(np.shape(labels))
>>> scalars[I22] = 1
>>> vtk_file = 'test_two_labels.vtk'
>>> write_vtk(vtk_file, points, indices, lines, faces,
...           scalars, scalar_names='scalars', scalar_type='int')
>>> plot_surfaces(vtk_file) # doctest: +SKIP
spectrum_per_label(vtk_file, spectrum_size=10, exclude_labels=[-1], normalization='areaindex', area_file='', largest_segment=True, verbose=False)

Compute Laplace-Beltrami spectrum per labeled region in a file.

Parameters:
  • vtk_file (string) – name of VTK surface mesh file containing index scalars (labels)
  • spectrum_size (integer) – number of eigenvalues to be computed (the length of the spectrum)
  • exclude_labels (list of integers) – labels to be excluded
  • normalization (string) – the method used to normalize eigenvalues: if None, no normalization is used; if “area”, use the area of the 2D structure as in Reuter et al. (2006); if “index”, divide each eigenvalue by its index to account for the linear trend; if “areaindex”, do both (default)
  • area_file (string (optional)) – name of VTK file with surface area scalar values
  • largest_segment (bool) – compute spectrum only for largest segment with a given label?
  • verbose (bool) – print statements?
Returns:

  • spectrum_lists (list of lists) – first eigenvalues for each label’s Laplace-Beltrami spectrum
  • label_list (list of integers) – list of unique labels for which spectra are obtained

Examples

>>> # Uncomment "if label==22:" below to run example:
>>> # Spectrum for Twins-2-1 left postcentral (22) pial surface:
>>> import numpy as np
>>> from mindboggle.shapes.laplace_beltrami import spectrum_per_label
>>> from mindboggle.mio.fetch_data import prep_tests
>>> urls, fetch_data = prep_tests()
>>> vtk_file = fetch_data(urls['left_freesurfer_labels'], '', '.vtk')
>>> area_file = fetch_data(urls['left_area'], '', '.vtk')
>>> spectrum_size = 6
>>> exclude_labels = [0]  #[-1]
>>> largest_segment = True
>>> verbose = False
>>> spectrum_lists, label_list = spectrum_per_label(vtk_file,
...     spectrum_size, exclude_labels, None, area_file, largest_segment,
...     verbose)
>>> [float("{0:.{1}f}".format(x, 5)) for x in spectrum_lists[0][1::]]
[0.00054, 0.00244, 0.00291, 0.00456, 0.00575]
>>> label_list[0:10]
[1029, 1005, 1011, 1021, 1008, 1025, 999, 1013, 1007, 1022]
wesd(EVAL1, EVAL2, Vol1, Vol2, show_error=False, N=3)

Weighted Spectral Distance. See Konukoglu et al. (2012).

Note: At present, the algorithm does not return a normalized result, and therefore does not require the volumes to be computed.
Parameters:
  • EVAL1 (numpy array of floats) – LB spectrum from the 1st shape
  • EVAL2 (numpy array of floats) – LB spectrum from the 2nd shape
  • Vol1 (float) – Volume (area for 2D) of the 1st shape
  • Vol2 (float) – Volume (area for 2D) of the 2nd shape
  • show_error (bool) – display the error of WESD using Eqs. (9) and (10) in Konukoglu et al. (2012)? (default: False)
  • N (integer) – the length of the spectrum used (N>=3; default: 3)
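
No doctest accompanies wesd here; the following usage sketch passes the true volumes of the two solids, although, per the note above, they are not used in the unnormalized result:

    import numpy as np
    from mindboggle.shapes.laplace_beltrami import fem_laplacian, wesd

    # Two simple shapes: a unit cube and the same cube stretched along z:
    points1 = [[0,0,0], [0,1,0], [1,1,0], [1,0,0],
               [0,0,1], [0,1,1], [1,1,1], [1,0,1]]
    points2 = [[x, y, 2 * z] for x, y, z in points1]
    faces = [[0,1,2], [2,3,0], [4,5,6], [6,7,4], [0,4,7], [7,3,0],
             [0,4,5], [5,1,0], [1,5,6], [6,2,1], [3,7,6], [6,2,3]]

    EVAL1 = np.array(fem_laplacian(points1, faces, spectrum_size=6,
                                   normalization=None))
    EVAL2 = np.array(fem_laplacian(points2, faces, spectrum_size=6,
                                   normalization=None))
    # Volumes of the two solids (1 and 2):
    distance = wesd(EVAL1, EVAL2, Vol1=1.0, Vol2=2.0, show_error=False, N=3)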

mindboggle.shapes.likelihood module

Compute (fundus) likelihood values that highlight deep and highly curved portions of a surface mesh.

Compute likelihood values for a VTK surface mesh:

compute_likelihood():
Compute likelihood values for a given VTK surface mesh. This is run after training on the distributions of depth and curvature values across multiple VTK surface mesh files in the functions below.

Learn distributions from training data (see Examples below):

estimate_depth_curvature_distributions():
Estimate distribution means, sigmas (standard deviations), and weights for VTK surface mesh depth and curvature scalars along and outside sulcus label borders within folds (as defined by a labeling protocol).

estimate_depth_curvature_distributions() calls the following functions:

concatenate_sulcus_scalars():
Prepare data for estimating scalar distributions along and outside fundi. Extract (e.g., depth, curvature) scalar values in folds, along sulcus label boundaries as well as outside the sulcus label boundaries. Concatenate these scalar values across multiple files.
fit_normals_to_histogram():
This expectation-maximization method returns estimated means, sigmas (standard deviations), and weights from a distribution of values.

Authors:
  • Yrjo Hame, 2012-2013 (yrjo.hame@gmail.com)
  • Arno Klein, 2012-2016 (arno@mindboggle.info) http://www.binarybottle.com

Copyright 2016, Mindboggle team (http://mindboggle.info), Apache v2.0 License

compute_likelihood(trained_file, depth_file, curvature_file, folds, save_file=False, background_value=-1)

Compute likelihoods based on input values, folds, estimated parameters.

Compute likelihood values for a given VTK surface mesh file, after training on distributions of depth and curvature values from multiple files.

Parameters:
  • trained_file (pickle compressed file) – depth_border, curv_border, depth_nonborder, and curv_nonborder are dictionaries containing lists of floats (estimates of depth or curvature means, sigmas, and weights trained on fold vertices either on or off sulcus label borders)
  • depth_file (string) – VTK surface mesh file with depth values in [0,1] for all vertices
  • curvature_file (string) – VTK surface mesh file with curvature values in [-1,1] for all vertices
  • folds (list of integers) – fold number for all vertices (-1 for non-fold vertices)
  • save_file (bool) – save output VTK file?
  • background_value (integer or float) – background value
Returns:

  • likelihoods (list of floats) – likelihood values for all vertices (0 for non-fold vertices)
  • likelihoods_file (string (if save_file)) – name of output VTK file with likelihood scalars (-1 for non-fold vertices)

Notes

The depth_curv_border_nonborder_parameters.pkl file needs to be updated.

Examples

>>> from mindboggle.mio.vtks import read_scalars
>>> from mindboggle.shapes.likelihood import compute_likelihood
>>> from mindboggle.mio.fetch_data import prep_tests
>>> urls, fetch_data = prep_tests()
>>> depth_file = fetch_data(urls['left_travel_depth'], '', '.vtk')
>>> curvature_file = fetch_data(urls['left_mean_curvature'], '', '.vtk')
>>> folds_file = fetch_data(urls['left_folds'], '', '.vtk')
>>> trained_file = fetch_data(urls[
...     'depth_curv_border_nonborder_parameters'], '', '.pkl') # doctest: +SKIP
>>> folds, name = read_scalars(folds_file)
>>> save_file = True
>>> background_value = -1
>>> likelihoods, likelihoods_file = compute_likelihood(trained_file,
...     depth_file, curvature_file, folds, save_file, background_value) # doctest: +SKIP

View result (skip test):

>>> from mindboggle.mio.plots import plot_surfaces # doctest: +SKIP
>>> plot_surfaces('likelihoods.vtk', folds_file) # doctest: +SKIP
concatenate_sulcus_scalars(scalar_files, fold_files, label_files, background_value=-1)

Prepare data for estimating scalar distributions along and outside fundi.

Extract (e.g., depth, curvature) scalar values in folds, along sulcus label boundaries as well as outside the sulcus label boundaries. Concatenate these scalar values across multiple files.

Parameters:
  • scalar_files (list of strings) – names of surface mesh VTK files with scalar values to concatenate
  • fold_files (list of strings (corr. to each list in scalar_files)) – VTK files with fold numbers as scalars (-1 for non-fold vertices)
  • label_files (list of strings (corr. to fold_files)) – VTK files with label numbers (-1 for unlabeled vertices)
  • background_value (integer or float) – background value
Returns:

  • border_scalars (list of floats) – concatenated scalar values within folds along sulcus label boundaries
  • nonborder_scalars (list of floats) – concatenated scalar values within folds outside sulcus label boundaries

Examples

>>> # Concatenate (duplicate) depth scalars:
>>> import numpy as np
>>> from mindboggle.shapes.likelihood import concatenate_sulcus_scalars
>>> from mindboggle.mio.fetch_data import prep_tests
>>> urls, fetch_data = prep_tests()
>>> depth_file = fetch_data(urls['left_travel_depth'], '', '.vtk')
>>> labels_file = fetch_data(urls['left_freesurfer_labels'], '', '.vtk')
>>> folds_file = fetch_data(urls['left_folds'], '', '.vtk')
>>> scalar_files = [depth_file, depth_file]
>>> fold_files = [folds_file, folds_file]
>>> label_files = [labels_file, labels_file]
>>> background_value = -1
>>> border, nonborder = concatenate_sulcus_scalars(scalar_files,
...     fold_files, label_files, background_value)
>>> [float("{0:.{1}f}".format(x, 5)) for x in border[0:5]]
[3.48284, 2.57157, 4.27596, 4.56549, 3.84881]
>>> [float("{0:.{1}f}".format(x, 5)) for x in nonborder[0:5]]
[2.87204, 2.89388, 3.55364, 2.81681, 3.70736]
estimate_distribution(scalar_files, scalar_range, fold_files, label_files, background_value=-1, verbose=False)

Estimate sulcus label border scalar distributions from VTK files.

Learn distributions from training data (different surface meshes). Estimate distribution means, sigmas (standard deviations), and weights for VTK surface mesh scalars (e.g., depth, curvature) along and outside sulcus label borders within folds.

Note: The number of classes, k, is currently hard-coded.

Parameters:
  • scalar_files (list of strings) – names of VTK files with scalar values for all surface vertices
  • scalar_range (list of floats) – range of values to estimate distribution
  • fold_files (list of strings) – names of VTK files with fold numbers for scalar values
  • label_files (list of strings) – names of VTK files with label numbers for scalar values
  • background_value (integer or float) – background value
  • verbose (bool) – print statements?
Returns:

  • border_parameters (dictionary containing lists of floats) – means, sigmas, weights
  • nonborder_parameters (dictionary containing lists of floats) – means, sigmas, weights

Examples

>>> import numpy as np
>>> import pickle
>>> from mindboggle.shapes.likelihood import estimate_distribution
>>> from mindboggle.mio.fetch_data import prep_tests
>>> # Train on a single surface mesh (using FreeSurfer vs. manual labels):
>>> urls, fetch_data = prep_tests()
>>> depth_file = fetch_data(urls['left_travel_depth'], '', '.vtk')
>>> curv_file = fetch_data(urls['left_mean_curvature'], '', '.vtk')
>>> folds_file = fetch_data(urls['left_folds'], '', '.vtk')
>>> labels_file = fetch_data(urls['left_freesurfer_labels'], '', '.vtk')
>>> depth_files = [depth_file]
>>> curv_files = [curv_file]
>>> fold_files = [folds_file]
>>> label_files = [labels_file]
>>> #
>>> # # Train on many Mindboggle-101 surface meshes:
>>> # import os
>>> # mindboggle_path = '../../Mindboggle101_mindboggle_results'
>>> # label_path = os.environ['SUBJECTS_DIR']
>>> # x_path = os.path.join(os.environ['MINDBOGGLE'], 'x')
>>> # atlas_list_file = os.path.join(x_path, 'mindboggle101_atlases.txt')
>>> # depth_files = []
>>> # curv_files = []
>>> # fold_files = []
>>> # label_files = []
>>> # atlas_list = [a.strip() for a in open(atlas_list_file).readlines()]
>>> # for atlas in atlas_list:
>>> #  if 'OASIS' in atlas or 'NKI' in atlas or 'MMRR-21' in atlas:
>>> #   print(atlas)
>>> #   for h in ['lh','rh']:
>>> #     depth_file = os.path.join(mindboggle_path, 'shapes',
>>> #         '_hemi_'+h+'_subject_'+atlas, h+'.pial.travel_depth.vtk')
>>> #     curv_file = os.path.join(mindboggle_path, 'shapes',
>>> #         '_hemi_'+h+'_subject_'+atlas, h+'.pial.mean_curvature.vtk')
>>> #     folds_file = os.path.join(mindboggle_path, 'features',
>>> #         '_hemi_'+h+'_subject_'+atlas, 'folds.vtk')
>>> #     labels_file = os.path.join(label_path, atlas, 'label',
>>> #         h+'.labels.DKT25.manual.vtk')
>>> #     depth_files.append(depth_file)
>>> #     curv_files.append(curv_file)
>>> #     fold_files.append(folds_file)
>>> #     label_files.append(labels_file)
>>> #
>>> scalar_files = depth_files
>>> scalar_range = np.linspace(0, 1, 51, endpoint=True) # (0 to 1 by 0.02)
>>> background_value = -1
>>> verbose = False
>>> depth_border, depth_nonborder = estimate_distribution(scalar_files,
...     scalar_range, fold_files, label_files, background_value, verbose)
>>> scalar_files = curv_files
>>> scalar_range = np.linspace(-1, 1, 101, endpoint=True) # (-1 to 1 by 0.02)
>>> curv_border, curv_nonborder = estimate_distribution(scalar_files,
...     scalar_range, fold_files, label_files, background_value, verbose)
>>> [float("{0:.{1}f}".format(x, 5)) for x in depth_border['means']]
[13.0869, 0.0, 0.0]
>>> [float("{0:.{1}f}".format(x, 5)) for x in depth_nonborder['means']]
[14.59311, 6.16008, 0.0]
>>> [float("{0:.{1}f}".format(x, 5)) for x in curv_border['means']]
[3.06449, -0.76109, -3.43184]
>>> [float("{0:.{1}f}".format(x, 5)) for x in curv_nonborder['means']]
[0.62236, -1.55192, -5.19359]
>>> pickle.dump([depth_border, curv_border, depth_nonborder, curv_nonborder],
...     open("depth_curv_border_nonborder_parameters.pkl", "wb"))
fit_normals_to_histogram(data, x, verbose=False)

This expectation-maximization method returns estimated means, sigmas (standard deviations), and weights, each of length k (number of classes).

Parameters:
  • data (list of floats) – data to estimate distribution means, sigmas, and weights
  • x (list of floats) – range of values used to initialize distribution means and sigmas
  • verbose (bool) – print statements?
Returns:

  • means (list of floats) – estimated mean for each class
  • sigmas (list of floats) – estimated standard deviation for each class
  • weights (list of floats) – weight for each class

Examples

>>> import numpy as np
>>> from mindboggle.shapes.likelihood import fit_normals_to_histogram
>>> from mindboggle.mio.vtks import read_scalars
>>> from mindboggle.mio.fetch_data import prep_tests
>>> urls, fetch_data = prep_tests()
>>> depth_file = fetch_data(urls['left_travel_depth'], '', '.vtk')
>>> scalars, name = read_scalars(depth_file)
>>> x = np.linspace(0, 1, 51, endpoint=True)
>>> verbose = False
>>> means, sigmas, weights = fit_normals_to_histogram(scalars, x, verbose)
>>> [float("{0:.{1}f}".format(x, 5)) for x in means]
[13.38742, 3.90712, 0.10946]
>>> [float("{0:.{1}f}".format(x, 5)) for x in sigmas]
[5.80721, 2.58297, 0.10209]
>>> [float("{0:.{1}f}".format(x, 5)) for x in weights]
[0.43959, 0.39286, 0.16755]
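
For reference, a generic sketch of the expectation-maximization updates for a one-dimensional Gaussian mixture; the package's implementation may differ in initialization, stopping rule, and regularization:

    import numpy as np

    def em_gaussian_mixture_sketch(data, x, k=3, n_iter=100):
        """1-D Gaussian-mixture EM: means, sigmas, weights (sketch)."""
        data = np.asarray(data, dtype=float)
        # Initialize means across the given range, equal weights:
        means = np.linspace(np.min(x), np.max(x), k)
        sigmas = np.full(k, np.std(data) / k + 1e-6)
        weights = np.full(k, 1.0 / k)
        for _ in range(n_iter):
            # E-step: responsibility of each class for each data point
            # (the common 1/sqrt(2*pi) factor cancels in the row sums):
            resp = weights * np.exp(-0.5 * ((data[:, None] - means)
                                            / sigmas) ** 2) / sigmas
            resp = np.maximum(resp, 1e-300)
            resp /= resp.sum(axis=1, keepdims=True)
            # M-step: re-estimate parameters from the responsibilities:
            totals = resp.sum(axis=0)
            means = (resp * data[:, None]).sum(axis=0) / totals
            sigmas = np.sqrt((resp * (data[:, None] - means) ** 2).sum(axis=0)
                             / totals) + 1e-6
            weights = totals / len(data)
        return list(means), list(sigmas), list(weights)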

mindboggle.shapes.surface_shapes module

Shape calculations.

Authors:

Copyright 2013, Mindboggle team (http://mindboggle.info), Apache v2.0 License

area(command, surface_file, verbose=False)

Measure area of each vertex in a surface mesh. (Calls Joachim Giard’s C++ code)

Parameters:
  • command (string) – Voronoi-based surface area C++ executable command
  • surface_file (string) – vtk file with surface mesh
  • verbose (bool) – print statements?
Returns:

area_file – vtk file with surface area per vertex of mesh

Return type:

string

Examples

>>> import os
>>> import numpy as np
>>> from mindboggle.shapes.surface_shapes import area
>>> from mindboggle.mio.vtks import read_scalars
>>> from mindboggle.mio.fetch_data import prep_tests
>>> urls, fetch_data = prep_tests()
>>> surface_file = fetch_data(urls['left_pial'], '', '.vtk')
>>> verbose = False
>>> ccode_path = os.environ['vtk_cpp_tools']
>>> command = os.path.join(ccode_path, 'area', 'PointAreaMain')
>>> area_file = area(command, surface_file, verbose)
>>> scalars, name = read_scalars(area_file)
>>> np.allclose(scalars[0:8],
...             [0.48270401731, 0.39661528543, 0.57813454792, 0.70574099571,
...              0.84318527207, 0.57642554119, 0.66942016035, 0.70629953593])
True
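
The executable computes a Voronoi-based area per vertex; a rough numpy stand-in that assigns each vertex one third of each adjacent triangle's area gives similar, though not identical, values:

    import numpy as np

    def vertex_area_sketch(points, faces):
        """Per-vertex area: one third of each adjacent triangle's area."""
        points = np.asarray(points, dtype=float)
        vertex_areas = np.zeros(len(points))
        for a, b, c in faces:
            # Triangle area from the edge cross product:
            area = 0.5 * np.linalg.norm(np.cross(points[b] - points[a],
                                                 points[c] - points[a]))
            vertex_areas[[a, b, c]] += area / 3.0
        return vertex_areas
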
curvature(command, method, arguments, surface_file, verbose=False)

Measure curvature values of each vertex in a surface mesh (-m 0). (Calls Joachim Giard’s C++ code)

Command line usage: CurvatureMain [Options] InputVTKMesh MeanCurvatureOutput

Command line options:
  -m Method: set the method used to compute curvature(s) (default 0):
     0: Use ComputePrincipalCurvatures() function to compute both mean and Gaussian curvatures based on the relative direction of the normal vectors in a small neighborhood
     1: Use ComputeBothCurvatures() function to compute both mean and Gaussian curvatures based on the local ratios between a filtered surface and the original surface area
     2: Use ComputeCurvature() function to compute the mean curvature based on the direction of the displacement vectors during a Laplacian filtering
  -n Neighborhood: neighborhood size (default 0.7)
  -g GaussianCurvVTK: save Gaussian curvature (for -m 0 or 1)
  -x MaxCurvVTK: save maximum curvature (for -m 0)
  -i MinCurvVTK: save minimum curvature (for -m 0)
  -d DirectionVTK: save minimal curvature’s direction (for -m 0)

Command line example: CurvatureMain -m 2 -n 0.7 lh.pial.vtk lh.pial.mean_curvature.vtk

Command line example: CurvatureMain -m 0 -n 2 -i lh.min_curv.vtk -x lh.max_curv.vtk -g lh.gaussian_curv.vtk -d lh.min_dir.vtk lh.pial.vtk lh.mean_curv.vtk

Notes:

-m 0 is best if you have low resolution or want local peaks, but can be too sensitive to the local linear geometry of the mesh, unless the neighborhood parameter is set high enough (like 2).

-m 1 is not well tested and the filtering is done using Euclidean distances, so it’s only good for incorrect but fast visualization.

-m 2 is a good approximation based on the Laplacian, but very large curvatures (negative or positive) are underestimated (saturation).

Parameters:
  • command (string) – C++ executable command for computing curvature
  • method (integer {0,1,2}) – method number
  • arguments (string) – additional arguments, such as neighborhood parameter
  • surface_file (string) – name of VTK surface mesh file
  • verbose (bool) – print statements?
Returns:

  • mean_curvature_file (string) – mean curvature file
  • gauss_curvature_file (string) – gauss curvature file
  • max_curvature_file (string) – max curvature file
  • min_curvature_file (string) – min curvature file
  • min_curvature_vector_file (string) – min curvature vector file

Examples

>>> import os
>>> import numpy as np
>>> from mindboggle.shapes.surface_shapes import curvature
>>> from mindboggle.mio.vtks import read_scalars
>>> from mindboggle.mio.fetch_data import prep_tests
>>> urls, fetch_data = prep_tests()
>>> surface_file = fetch_data(urls['left_pial'], '', '.vtk')
>>> method = 2
>>> arguments = '-n 0.7'
>>> verbose = False
>>> ccode_path = os.environ['vtk_cpp_tools']
>>> command = os.path.join(ccode_path, 'curvature', 'CurvatureMain')
>>> mean_curvature_file, f1,f2,f3,f4 = curvature(command, method,
...     arguments, surface_file, verbose)
>>> scalars, name = read_scalars(mean_curvature_file)
>>> np.allclose(scalars[0:8], [-5.8136068088, -5.9312990469, -6.2805500474, -5.6210018286, -5.6963067208, -5.8039874097, -5.8726460688, -5.7106966401])
True
geodesic_depth(command, surface_file, verbose=False)

Estimate geodesic depth of each vertex in a surface mesh. (Calls Joachim Giard’s C++ code)

Parameters:
  • command (string) – geodesic depth C++ executable command
  • surface_file (string) – vtk file with surface mesh
  • verbose (bool) – print statements?
Returns:

depth_file – vtk file with geodesic depth per vertex of mesh

Return type:

string

Examples

>>> import os
>>> import numpy as np
>>> from mindboggle.shapes.surface_shapes import geodesic_depth
>>> from mindboggle.mio.vtks import read_scalars
>>> from mindboggle.mio.fetch_data import prep_tests
>>> urls, fetch_data = prep_tests()
>>> surface_file = fetch_data(urls['left_pial'], '', '.vtk')
>>> verbose = False
>>> ccode_path = os.environ['vtk_cpp_tools']
>>> command = os.path.join(ccode_path, 'geodesic_depth', 'GeodesicDepthMain')
>>> depth_file = geodesic_depth(command, surface_file, verbose)
>>> scalars, name = read_scalars(depth_file)
>>> np.allclose(scalars[0:8], [0.020259869839, 0.06009166489, 0.12858575442, 0.045639221313, 0.007742772964, 0.052839111255, 0.053538904296, 0.013158746337])
True
travel_depth(command, surface_file, verbose=False)

Measure “travel depth” of each vertex in a surface mesh. (Calls Joachim Giard’s C++ code)

Parameters:
  • command (string) – travel depth C++ executable command
  • surface_file (string) – vtk file
  • verbose (bool) – print statements?
Returns:

depth_file – vtk file with travel depth per vertex of mesh

Return type:

string

Examples

>>> import os
>>> import numpy as np
>>> from mindboggle.shapes.surface_shapes import travel_depth
>>> from mindboggle.mio.vtks import read_scalars
>>> from mindboggle.mio.fetch_data import prep_tests
>>> urls, fetch_data = prep_tests()
>>> surface_file = fetch_data(urls['left_pial'], '', '.vtk')
>>> verbose = False
>>> ccode_path = os.environ['vtk_cpp_tools']
>>> command = os.path.join(ccode_path, 'travel_depth', 'TravelDepthMain')
>>> depth_file = travel_depth(command, surface_file, verbose)
>>> scalars, name = read_scalars(depth_file)
>>> np.allclose(scalars[0:8], [0.020259869839, 0.06009166489, 0.12858575442, 0.045639221313, 0.007742772964, 0.052839111255, 0.053538904296, 0.013158746337])
True

mindboggle.shapes.volume_shapes module

Compute shape measures from volume images.

Authors:

Copyright 2016, Mindboggle team (http://mindboggle.info), Apache v2.0 License

thickinthehead(segmented_file, labeled_file, cortex_value=2, noncortex_value=3, labels=[], names=[], propagate=False, output_dir='', save_table=False, output_table='', verbose=False)

Compute a simple thickness measure for each labeled cortex region volume.

Since Mindboggle accepts FreeSurfer data as input, we include FreeSurfer cortical thickness estimates with Mindboggle’s shape measures. However, surface mesh reconstruction from MRI data does not always produce favorable results. For example, in at least a quarter of the more than one hundred EMBARC brain images we processed through FreeSurfer, ventral cortical regions were clipped, resulting in bad surface patches in those regions. As an alternative, we built this function, thickinthehead, which computes a simple thickness measure for each cortical region using a segmentation volume rather than surfaces.

We have revised this algorithm from the original published version. We removed upsampling to reduce memory issues for large image volumes, and replaced the estimated volume of the middle cortical layer with an estimate of its surface area. These revisions make the measure less susceptible to deviations in voxel size from the isotropic 1 mm^3 voxels for which thickinthehead was originally built.

Steps

1. Extract noncortex and cortex into separate files.
2. Either mask labels with cortex or fill cortex with labels.
3. Extract outer and inner boundary voxels of the cortex,
   by morphologically eroding the cortex (=2) by one voxel bordering
   the outside of the brain (=0) and bordering the inside of the brain
   (non-cortex=3).
4. Estimate middle cortical layer's surface area by the average
   surface area of the outer and inner boundary voxels of the cortex,
   where surface area is roughly estimated as the average face area
   of a voxel times the number of voxels.
5. Compute the volume of a labeled region of cortex.
6. Estimate the thickness of the labeled cortical region as the
   volume of the labeled region (#5) divided by the
   estimate of the middle cortical surface area of that region (#4).
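
A simplified numpy/scipy sketch of steps 3-6 above (the actual function shells out to ANTs; the voxel dimensions and the label-masking variant of step 2 are illustrative assumptions):

    import numpy as np
    from scipy.ndimage import binary_dilation

    def thickness_sketch(segmented, labeled, label, voxdims=(1.0, 1.0, 1.0),
                         cortex_value=2, noncortex_value=3):
        """Thickness of one labeled cortex region: volume / middle area."""
        cortex = segmented == cortex_value
        region = cortex & (labeled == label)   # step 2 (masking variant)
        # Step 3: cortex voxels bordering the outside of the brain (0) and
        # those bordering noncortex form the outer and inner boundary layers:
        outer = region & binary_dilation(segmented == 0)
        inner = region & binary_dilation(segmented == noncortex_value)
        # Step 4: surface area ~ average voxel face area times voxel count,
        # averaged over the outer and inner layers:
        dx, dy, dz = voxdims
        face_area = (dx * dy + dx * dz + dy * dz) / 3.0
        middle_area = face_area * (outer.sum() + inner.sum()) / 2.0
        # Steps 5-6: volume of the labeled region over middle-layer area:
        volume = dx * dy * dz * region.sum()
        return volume / middle_area if middle_area else -1.0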

Note:

- Cortex, noncortex, & label files are from the same coregistered brain.
- Calls ANTs functions: ImageMath and Threshold
- There may be slight discrepancies between volumes computed by
  thickinthehead() and volumes computed by volume_per_label();
  in 31 of 600+ ADNI 1.5T images, some volume_per_label() volumes
  were slightly larger (in the third decimal place), presumably due to
  label propagation through the cortex in thickinthehead().
  This is more pronounced in ANTs vs. FreeSurfer-labeled volumes.
Parameters:
  • segmented_file (string) – image volume with cortex and noncortex (and any other) labels
  • labeled_file (string) – corresponding image volume with index labels
  • cortex_value (integer) – cortex label value in segmented_file
  • noncortex_value (integer) – noncortex label value in segmented_file
  • labels (list of integers) – label indices
  • names (list of strings) – label names
  • propagate (bool) – propagate labels through cortex (or mask labels with cortex)?
  • output_dir (string) – output directory
  • save_table (bool) – save output table file with label volumes and thickness values?
  • output_table (string) – name of output table file with label volumes and thickness values
  • verbose (bool) – print statements?
Returns:

  • label_volume_thickness (list of lists of integers and floats) – label indices, volumes, and thickness values (default -1)
  • output_table (string) – name of output table file with label volumes and thickness values

Examples

>>> # Example using ANTs segmentation and labels (vs. hybrid segmentation):
>>> import os
>>> import numpy as np
>>> from mindboggle.shapes.volume_shapes import thickinthehead
>>> from mindboggle.mio.fetch_data import prep_tests
>>> urls, fetch_data = prep_tests()
>>> segmented_file = fetch_data(urls['ants_segmentation'], '', '.nii.gz')
>>> labeled_file = fetch_data(urls['ants_labels'], '', '.nii.gz')
>>> cortex_value = 2
>>> noncortex_value = 3
>>> #labels = [2]
>>> labels = list(range(1002,1036)) + list(range(2002,2036))
>>> labels.remove(1004)
>>> labels.remove(2004)
>>> labels.remove(1032)
>>> labels.remove(2032)
>>> labels.remove(1033)
>>> labels.remove(2033)
>>> names = []
>>> propagate = False
>>> output_dir = ''
>>> save_table = True
>>> output_table = ''
>>> verbose = False

Skip online test because it requires installation of ANTs:

>>> label_volume_thickness, output_table = thickinthehead(segmented_file,
...     labeled_file, cortex_value, noncortex_value, labels, names,
...     propagate, output_dir, save_table, output_table, verbose) # doctest: +SKIP
>>> [int(x) for x in label_volume_thickness[0][0:10]] # doctest: +SKIP
>>> [float("{0:.{1}f}".format(x, 5)) for x in label_volume_thickness[1][0:5]] # doctest: +SKIP
>>> [float("{0:.{1}f}".format(x, 5)) for x in label_volume_thickness[2][0:5]] # doctest: +SKIP
volume_per_brain_region(input_file, include_labels=[], exclude_labels=[], label_names=[], save_table=False, output_table='', verbose=False)

Compute volume per labeled region in a nibabel-readable image.

Computing the volume per labeled region is very straightforward: this function simply multiplies the volume per voxel by the number of voxels per region.

Note: Results are truncated at three decimal places because we found that volume label propagation led to differences in the third decimal place.
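
Since the computation is just voxel counting, here is a minimal nibabel/numpy sketch using a synthetic image (file I/O and table output omitted):

    import numpy as np
    import nibabel as nib

    # Synthetic labeled image: 1 mm^3 voxels, two labels of 27 voxels each:
    data = np.zeros((10, 10, 10), dtype=np.int16)
    data[2:5, 2:5, 2:5] = 1002
    data[6:9, 6:9, 6:9] = 1003
    img = nib.Nifti1Image(data, affine=np.eye(4))

    # Volume per voxel from the header zooms, then count voxels per label:
    volume_per_voxel = float(np.prod(img.header.get_zooms()))
    labels, counts = np.unique(data[data != 0], return_counts=True)
    volumes = counts * volume_per_voxel   # [27.0, 27.0]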

Parameters:
  • input_file (string) – name of image file, consisting of index-labeled pixels/voxels
  • include_labels (list of integers) – labels to include (if empty, use unique numbers in image volume file)
  • exclude_labels (list of integers) – labels to be excluded
  • label_names (list of strings) – label names corresponding to labels to include
  • save_table (bool) – save output table file with labels and volume values?
  • output_table (string) – name of output table file with labels and volume values
  • verbose (bool) – print statements?
Returns:

  • unique_labels (list of integers) – unique label numbers (default -1)
  • volumes (list of floats) – volume for each label (default -1)
  • output_table (string) – name of output volume table file (if output_table not empty)

Examples

>>> import os
>>> import numpy as np
>>> from mindboggle.mio.labels import DKTprotocol
>>> from mindboggle.shapes.volume_shapes import volume_per_brain_region
>>> from mindboggle.mio.fetch_data import prep_tests
>>> urls, fetch_data = prep_tests()
>>> input_file = fetch_data(urls['freesurfer_labels'], '', '.nii.gz')
>>> dkt = DKTprotocol()
>>> include_labels = dkt.label_numbers
>>> exclude_labels = []
>>> label_names = dkt.label_names
>>> save_table = True
>>> output_table = 'volumes.csv'
>>> verbose = False
>>> unique_labels, volumes, table = volume_per_brain_region(input_file,
...     include_labels, exclude_labels, label_names, save_table,
...     output_table, verbose)
>>> [float("{0:.{1}f}".format(x, 5))
...  for x in [y for y in volumes if y > 0][0:5]]
[971.99797, 2413.99496, 2192.99543, 8328.98262, 2940.99386]
>>> [float("{0:.{1}f}".format(x, 5))
...  for x in [y for y in volumes if y > 0][5:10]]
[1997.99583, 10905.97725, 11318.97639, 10789.97749, 2700.99437]

Module contents