
From a Non-Local Ambrosio-Tortorelli Phase Field to a Randomized Part Hierarchy Tree


Abstract

In its most widespread imaging and vision applications, the Ambrosio and Tortorelli (AT) phase field is a technical device for applying gradient descent to the Mumford and Shah simultaneous segmentation and restoration functional or its extensions. As such, it forms a diffuse alternative to sharp interfaces or level sets and parametric techniques. The functionality of the AT field, however, is not limited to segmentation and restoration applications. We demonstrate the possibility of coding parts—features that are higher level than edges and boundaries—after incorporating higher level influences via distances and averages. The parts, extracted iteratively using level curves with double point singularities, are organized as a proper binary tree. Inconsistencies due to non-generic configurations for level curves as well as due to visual changes such as occlusion are successfully handled once the tree is endowed with a probabilistic structure. As a proof of concept, we present (1) the most probable configurations from our randomized trees; and (2) correspondence matching results between illustrative shape pairs.

The work is a significant step towards establishing exponentially decaying diffuse distance fields as bridges between low level visual processing and shape computations.


References

  1. Ambrosio, L., Tortorelli, V.: On the approximation of functionals depending on jumps by elliptic functionals via Γ-convergence. Commun. Pure Appl. Math. 43(8), 999–1036 (1990)
  2. Aslan, C., Tari, S.: An axis-based representation for recognition. In: ICCV, pp. 1339–1346 (2005)
  3. Aslan, C., Erdem, A., Erdem, E., Tari, S.: Disconnected skeleton: shape at its absolute scale. IEEE Trans. Pattern Anal. 30(12), 2188–2203 (2008)
  4. Aubry, M., Schlickewei, U., Cremers, D.: The wave kernel signature: a quantum mechanical approach to shape analysis. In: ICCV—Workshop on Dynamic Shape Capture and Analysis (2011)
  5. Bai, X., Wang, B., Yao, C., Liu, W., Tu, Z.: Co-transduction for shape retrieval. IEEE Trans. Image Process. 21(5), 2747–2757 (2012)
  6. Bajaj, C.L., Pascucci, V., Schikore, D.R.: The contour spectrum. In: Proceedings of the 8th Conference on Visualization (1997)
  7. Ballester, C., Caselles, V., Igual, L., Garrido, L.: Level lines selection with variational models for segmentation and encoding. J. Math. Imaging Vis. 27(1), 5–27 (2007)
  8. Bar, L., Sochen, N., Kiryati, N.: Image deblurring in the presence of impulsive noise. Int. J. Comput. Vis. 70(3), 279–298 (2006)
  9. Biasotti, S., Cerri, A., Frosini, P., Giorgi, D., Landi, C.: Multidimensional size functions for shape comparison. J. Math. Imaging Vis. 32(2), 161–179 (2008)
  10. Braides, A.: Approximation of Free-Discontinuity Problems. Lecture Notes in Mathematics, vol. 1694. Springer, Berlin (1998)
  11. Buades, A., Coll, B., Morel, J.M.: A non-local algorithm for image denoising. In: CVPR, pp. 60–65 (2005)
  12. Burgeth, B., Weickert, J., Tari, S.: Minimally stochastic schemes for singular diffusion equations. In: Tai, X.C., Lie, K.A., Chan, T.F., Osher, S. (eds.) Image Processing Based on Partial Differential Equations, Mathematics and Visualization, pp. 325–339. Springer, Berlin (2006)
  13. Chan, T., Vese, L.: Active contours without edges. IEEE Trans. Image Process. 10(2), 266–277 (2001)
  14. Cremers, D., Tischhäuser, F., Weickert, J., Schnörr, C.: Diffusion snakes: introducing statistical shape knowledge into the Mumford-Shah functional. Int. J. Comput. Vis. 50(3), 295–313 (2002)
  15. Dimitrov, P., Lawlor, M., Zucker, S.: Distance images and intermediate-level vision. In: SSVM, pp. 653–664. Springer, Berlin (2011)
  16. Droske, M., Rumpf, M.: Multi scale joint segmentation and registration of image morphology. IEEE Trans. Pattern Anal. 29(12), 2181–2194 (2007)
  17. Edelsbrunner, H., Letscher, D., Zomorodian, A.: Topological persistence and simplification. Discrete Comput. Geom. 28, 511–533 (2002)
  18. Erdem, E., Tari, S.: Mumford-Shah regularizer with contextual feedback. J. Math. Imaging Vis. 33(1), 67–84 (2009)
  19. Erdem, E., Sancar-Yilmaz, A., Tari, S.: Mumford-Shah regularizer with spatial coherence. In: SSVM, pp. 545–555. Springer, Berlin (2007)
  20. Gebal, K., Bærentzen, J.A., Aanæs, H., Larsen, R.: Shape analysis using the auto diffusion function. Comput. Graph. Forum 28, 1405–1413 (2009)
  21. Gilboa, G., Darbon, J., Osher, S., Chan, T.: Nonlocal convex functionals for image regularization. UCLA CAM Report 06-57 (2006)
  22. Gorelick, L., Galun, M., Sharon, E., Basri, R., Brandt, A.: Shape representation and classification using the Poisson equation. IEEE Trans. Pattern Anal. 28(12), 1991–2005 (2006)
  23. Jin, Y., Jost, J., Wang, G.: A nonlocal version of the Osher-Solé-Vese model. J. Math. Imaging Vis. 44, 99–113 (2012)
  24. Jung, M., Vese, L.: Nonlocal variational image deblurring models in the presence of Gaussian or impulse noise. In: SSVM, pp. 401–412. Springer, Berlin (2009)
  25. Jung, M., Bresson, X., Chan, T., Vese, L.: Color image restoration using nonlocal Mumford-Shah regularizers. In: EMMCVPR, pp. 373–387. Springer, Berlin (2009)
  26. Kontschieder, P., Donoser, M., Bischof, H.: Beyond pairwise shape similarity analysis. In: ACCV 2009. Lecture Notes in Computer Science, vol. 5996, pp. 655–666. Springer, Berlin (2010)
  27. Lee, T.S., Yuille, A.: Efficient coding of visual scenes by grouping and segmentation. In: Doya, K., Ishii, S., Pouget, A., Rao, R. (eds.) Bayesian Brain: Probabilistic Approaches to Neural Coding, pp. 141–185. MIT Press, New York (2007)
  28. Lee, T.S., Mumford, D., Romero, R., Lamme, V.A.: The role of the primary visual cortex in higher level vision. Vis. Res. 38(15–16), 2429–2454 (1998)
  29. March, R., Dozio, M.: A variational method for the recovery of smooth boundaries. Image Vis. Comput. 15(9), 705–712 (1997)
  30. Meyer, F.: Topographic distance and watershed lines. Signal Process. 38, 113–125 (1994)
  31. Morse, S.P.: Concepts of use in contour map processing. Commun. ACM 12(3), 147–152 (1969)
  32. Mumford, D., Shah, J.: Optimal approximations by piecewise smooth functions and associated variational problems. Commun. Pure Appl. Math. 42, 577–685 (1989)
  33. Patz, T., Preusser, T.: Ambrosio-Tortorelli segmentation of stochastic images. In: ECCV, pp. 254–267. Springer, Berlin (2010)
  34. Patz, T., Kirby, R., Preusser, T.: Ambrosio-Tortorelli segmentation of stochastic images: model extensions, theoretical investigations and numerical methods. Int. J. Comput. Vis. (2012). doi:10.1007/s11263-012-0578-8
  35. Pelillo, M., Siddiqi, K., Zucker, S.: Matching hierarchical structures using association graphs. IEEE Trans. Pattern Anal. 21(11), 1105–1120 (1999)
  36. Peng, T., Jermyn, I., Prinet, V., Zerubia, J.: Extended phase field higher-order active contour models for networks. Int. J. Comput. Vis. 88(1), 111–128 (2010)
  37. Pien, H., Desai, M., Shah, J.: Segmentation of MR images using curve evolution and prior information. Int. J. Pattern Recognit. 11(8), 1233–1245 (1997)
  38. Preußer, T., Droske, M., Garbe, C., Rumpf, M., Telea, A.: A phase field method for joint denoising, edge detection and motion estimation. SIAM J. Appl. Math. 68(3), 599–618 (2007)
  39. Proesman, M., Pauwels, E., van Gool, L.: Coupled geometry-driven diffusion equations for low-level vision. In: Romeny, B. (ed.) Geometry Driven Diffusion in Computer Vision. Lecture Notes in Computer Science. Kluwer, Amsterdam (1994)
  40. Reuter, M.: Hierarchical shape segmentation and registration via topological features of Laplace-Beltrami eigenfunctions. Int. J. Comput. Vis. 89(2), 287–308 (2010)
  41. Rosin, P.L., West, G.: Salience distance transforms. Graph. Models Image Process. 57(6), 483–521 (1995)
  42. Rosman, G., Bronstein, M.M., Bronstein, A.M., Kimmel, R.: Nonlinear dimensionality reduction by topologically constrained isometric embedding. Int. J. Comput. Vis. 89(1), 56–68 (2010)
  43. Shah, J.: Segmentation by nonlinear diffusion. In: CVPR, pp. 202–207 (1991)
  44. Shah, J.: A common framework for curve evolution, segmentation and anisotropic diffusion. In: CVPR, pp. 136–142 (1996)
  45. Shah, J.: Skeletons and segmentation of shapes. Tech. rep., Northeastern University (2005). See http://www.math.neu.edu/~shah/publications.html
  46. Shah, J., Pien, H., Gauch, J.: Recovery of shapes of surfaces with discontinuities by fusion of shading and range data within a variational framework. IEEE Trans. Image Process. 5(8), 1243–1251 (1996)
  47. Sun, J., Ovsjanikov, M., Guibas, L.: A concise and provably informative multi-scale signature based on heat diffusion. Comput. Graph. Forum (2009)
  48. Tari, S.: Hierarchical shape decomposition via level sets. In: ISMM, pp. 215–225. Springer, Berlin (2009)
  49. Tari, S.: Fluctuating distance fields. In: Breuss, M., Bruckstein, A., Maragos, P. (eds.) Innovations in Shape Analysis—Proceedings of Dagstuhl Workshop, Mathematics and Visualization. Springer, Berlin (2013)
  50. Tari, S., Genctav, M.: From a modified Ambrosio-Tortorelli to a randomized part hierarchy tree. In: SSVM, pp. 267–278. Springer, Berlin (2011)
  51. Tari, S., Shah, J.: Local symmetries of shapes in arbitrary dimension. In: ICCV, pp. 1123–1128 (1998)
  52. Tari, S., Shah, J., Pien, H.: Extraction of shape skeletons from grayscale images. Comput. Vis. Image Underst. 66(2), 133–146 (1997)
  53. Teboul, S., Blanc-Féraud, L., Aubert, G., Barlaud, M.: Variational approach for edge preserving regularization using coupled PDE's. IEEE Trans. Image Process. 7, 387–397 (1998)
  54. Yang, X., Bai, X., Koknar-Tezel, S., Latecki, L.J.: Densifying distance spaces for shape and image retrieval. J. Math. Imaging Vis. (2012). doi:10.1007/s10851-012-0363-x
  55. Zhu, S.C., Yuille, A.L.: FORMS: a flexible object recognition and modeling system. Int. J. Comput. Vis. 20(3), 187–212 (1996)
  56. Zucker, S.: Distance images and the enclosure field: applications in intermediate-level computer and biological vision. In: Breuss, M., Bruckstein, A., Maragos, P. (eds.) Innovations in Shape Analysis—Proceedings of Dagstuhl Workshop, Mathematics and Visualization. Springer, Berlin (2013)


Acknowledgements

This work has been partially funded by TUBITAK grant 112E208, the Alexander von Humboldt Foundation, and a TUBITAK-BIDEB fellowship.

Author information


Correspondence to Sibel Tari.

Additional information

A preliminary conference version introducing the randomized part hierarchy tree appeared in SSVM 2011 [50]. The non-local field was first presented in [48].

Appendix

Field Computation

To keep the implementation simple, we combine (9a) and (9b) and multiply the inhomogeneity function f by \(\rho^2\), since scaling affects neither the geometric nor the topological features of the level curves. In the discrete setting this gives

$$ \mathbb{L}_* \big( \omega_{i,j} \big) \,-\, \frac{1}{ \rho^2 } \omega _{i,j} \, - \, \left( \frac{1}{|\varOmega|}\sum_{(k,l) \in\varOmega} \omega _{k,l} \right) + f_{i,j} \,=\, 0 $$
(10)

where \(\mathbb{L}_{*}\) denotes the discrete Laplace operator and \(\omega_{i,j} \equiv \omega(x = i h_x,\, y = j h_y)\), with \(h_x\) and \(h_y\) the spatial discretization step sizes, both taken as the pixel width. Based on our discussion in Sect. 2, we set \(\rho^2 = |\varOmega|\). Next, we define a relaxed scheme:

$$\omega_{i,j}^{n+1} = \omega_{i,j}^{n} + \tau A $$

where A denotes the left-hand side of (10) and τ is a relaxation parameter chosen smaller than \(\frac{|\varOmega|}{4|\varOmega|+2}\). When the scheme is implemented in parallel, the \(\omega_{i,j}\) values required for A are taken from the n-th step; if the values are updated sequentially, the already updated values are used for faster convergence.
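For concreteness, the following is a minimal NumPy sketch of this relaxation. It fixes details the text leaves open—a 5-point Laplacian on a unit-spaced grid, ω held at zero outside the shape mask, and a zero initial field—and all function and variable names are illustrative only.

```python
import numpy as np

def relax_field(f, mask, n_iter=5000, tau=None):
    """Jacobi-style relaxation of (10): L*(w) - w/rho^2 - mean(w) + f = 0,
    iterated as w <- w + tau * A, where A is the left-hand side of (10)."""
    mask = mask.astype(bool)
    area = float(mask.sum())                   # |Omega|
    rho2 = area                                # rho^2 = |Omega| (Sect. 2)
    if tau is None:
        tau = 0.9 * area / (4.0 * area + 2.0)  # just below the stated bound
    w = np.zeros_like(f, dtype=float)
    for _ in range(n_iter):
        # 5-point discrete Laplacian via shifts; the array border wraps, so
        # the shape is assumed to be padded away from the image border.
        lap = (np.roll(w, 1, axis=0) + np.roll(w, -1, axis=0) +
               np.roll(w, 1, axis=1) + np.roll(w, -1, axis=1) - 4.0 * w)
        avg = w[mask].sum() / area             # global average term of (10)
        A = lap - w / rho2 - avg + f
        w = np.where(mask, w + tau * A, 0.0)   # keep w = 0 outside the shape
    return w
```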

Saddle Point Detection

For locating saddle points, we do not rely on the indefiniteness of the Hessian. Instead, we find watershed regions by calling Matlab's watershed routine, which uses Meyer's method [30]. We eliminate all watershed boundaries that do not neighbor \(\varOmega^{+}\). On each of the remaining watershed boundaries, the saddle point is the minimum of the restriction of ω to that boundary. Typically, the considered watershed boundaries extend from \(\varOmega^{+}\) to the shape boundary. It may happen, however, that a watershed boundary touching \(\varOmega^{+}\) bifurcates before reaching the shape boundary. In this case there are in fact two watershed boundaries; the respective saddle points are given by the respective minima after the bifurcation.
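A rough Python counterpart of this procedure is sketched below. It substitutes skimage's flood-based watershed for Matlab's routine and, since the definition of \(\varOmega^{+}\) is not restated here, takes it as a given binary mask; the function name and interface are illustrative only.

```python
import numpy as np
from scipy import ndimage as ndi
from skimage.segmentation import watershed

def detect_saddles(omega, mask, omega_plus):
    # Watershed of -omega, so that basins grow around the maxima of omega;
    # watershed_line=True labels the separating boundaries with 0.
    labels = watershed(-omega, mask=mask, watershed_line=True)
    lines = mask & (labels == 0)

    # Split the watershed lines into connected pieces and keep only those
    # neighboring Omega^+.
    pieces, n_pieces = ndi.label(lines)
    near_plus = ndi.binary_dilation(omega_plus)

    saddles = []
    for k in range(1, n_pieces + 1):
        piece = pieces == k
        if not np.any(piece & near_plus):
            continue
        # Saddle point: minimum of omega restricted to this boundary piece.
        flat = np.argmin(np.where(piece, omega, np.inf))
        saddles.append(np.unravel_index(flat, omega.shape))
    return saddles
```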

Tree Matching

Let \((V_i, E_i)\), i=1,2, be two rooted trees. Let \(k, l \in V_1\) and \(m, n \in V_2\) be distinct nodes of the respective trees. The tree association graph of the two trees is the graph (V,E) with \(V := V_1 \times V_2\), where the graph nodes \((k,m) \in V\) and \((l,n) \in V\) are adjacent when the connectivity between k and l is equivalent to that between m and n. Specifically, we say \((k,m) \in V\) and \((l,n) \in V\) are adjacent if level(k)−level(l)=level(m)−level(n) and the length of the path from k to l in the first tree equals the length of the path from m to n in the second tree.

Defining the equivalence between pairs of nodes in the respective trees by comparing levels and path lengths, there is a bijection between maximal subtree isomorphisms and maximal cliques of the association graph of the two trees; i.e., tree matching is equivalent to finding the maximal clique in the association graph. If the trees are attributed—e.g., in our case \((V_i, E_i, \alpha)\), where α is a function that assigns an attribute vector \([\alpha^{(1)}(u), \alpha^{(2)}(u)]^T\) to each node u in either tree—then the subtree isomorphism with the largest similarity is called the maximum similarity subtree isomorphism. In this case, the weighted association graph is the weighted graph (V,E,c) such that c(z), for \(z \equiv (u,v)\) with \(z \in V\), \(u \in V_1\) and \(v \in V_2\), is defined via a similarity measure \(\operatorname{sim}(\cdot,\cdot)\) in the attribute space: \(c(z) = \operatorname{sim}(\alpha(u), \alpha(v))\). The attributes and the similarity measure are computed as described in Sect. 4.1.
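As a sketch of the adjacency test, the Python fragment below builds the association graph from two trees; the parent-array encoding of the trees and the helper names (tree_levels, path_length, association_graph) are assumptions for illustration, not part of the paper.

```python
import itertools

def tree_levels(parent):
    # parent[u] is the parent index of node u; the root has parent -1.
    level = [0] * len(parent)
    for u in range(len(parent)):
        v, d = u, 0
        while parent[v] != -1:
            v, d = parent[v], d + 1
        level[u] = d
    return level

def path_length(parent, level, u, v):
    # Tree distance between u and v: repeatedly climb the deeper node.
    d = 0
    while u != v:
        if level[u] >= level[v]:
            u = parent[u]
        else:
            v = parent[v]
        d += 1
    return d

def association_graph(parent1, parent2):
    lev1, lev2 = tree_levels(parent1), tree_levels(parent2)
    nodes = list(itertools.product(range(len(parent1)), range(len(parent2))))
    edges = set()
    for (k, m), (l, n) in itertools.combinations(nodes, 2):
        if k == l or m == n:
            continue  # the paired nodes must be distinct within each tree
        same_level_diff = (lev1[k] - lev1[l]) == (lev2[m] - lev2[n])
        same_path_len = (path_length(parent1, lev1, k, l) ==
                         path_length(parent2, lev2, m, n))
        if same_level_diff and same_path_len:
            edges.add(((k, m), (l, n)))
    return nodes, edges
```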

Suitably defining a weight matrix M from the node weights c(⋅), the global maximizer of \(x^T M x\) gives the maximum weight clique; the maximization is carried out iteratively:

$$ x_i^{n+1} = x_i^{n} \frac{\left(M \, x^n \right)_i}{(x^n)^T \, M x^n} $$
(11)

where n is the iteration index. The matrix \(M = (m_{ij})\) is given via a matrix \(B = (b_{ij})\) as follows:

$$ m_{ij} = \max_{i,j} (b_{ij}) - b_{ij} $$

where

$$ b_{ij} = \begin{cases} 0 & \text{if } i \neq j \text{ and node } i \text{ is adjacent to node } j \\ \frac{1}{2 c(u_i)} & \text{if } i = j \\ \frac{1}{2 c(u_i)} + \frac{1}{2 c(u_j)} & \text{otherwise} \end{cases} $$

Let the maximum weight clique be \(C \subseteq V\). The solution x to the maximization problem (via the iterative scheme (11)) is expected to be

$$ x_i^* = \begin{cases} \frac{c(u_i)}{\sum_{u_j \in C} c(u_j)} & \text{if } u_i \in C \\ 0 & \text{otherwise} \end{cases} $$
(12)

The iterative scheme (11) returns an approximation to the limit vector \(x^*\) in (12). What remains is how to interpret this vector; the paper [35] offers no suggestion on this point. Instead of simply thresholding, we adopt the following strategy.

We start with an empty clique. Then, beginning with the node with the highest x value, we gradually add nodes to the clique in order of decreasing x value. After each inclusion we compute the expected vector using (12) and measure the difference between this estimate and the actual vector returned by the iterative scheme (11). We keep adding nodes until an inclusion no longer decreases the difference.
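Putting the pieces together, a compact Python sketch of the iteration (11) and of this greedy read-out might look as follows. The weight construction mirrors the B and M matrices above; starting the iteration from the barycenter of the simplex and the function name are assumptions for illustration.

```python
import numpy as np

def match_trees(nodes, edges, c, n_iter=2000):
    """Weighted clique extraction on the association graph (sketch).
    nodes : list of association-graph nodes (u, v)
    edges : set of adjacent node pairs, as built above
    c     : array of node weights c(z), one per association-graph node
    """
    N = len(nodes)
    index = {z: i for i, z in enumerate(nodes)}
    adj = np.zeros((N, N), dtype=bool)
    for z1, z2 in edges:
        i, j = index[z1], index[z2]
        adj[i, j] = adj[j, i] = True

    # Matrix B as in the text, then M = max(B) - B.
    B = np.add.outer(0.5 / c, 0.5 / c)   # default case: 1/(2 c_i) + 1/(2 c_j)
    B[adj] = 0.0                         # i != j and adjacent
    np.fill_diagonal(B, 0.5 / c)         # i == j
    M = B.max() - B

    # Iteration (11), started from the barycenter of the simplex.
    x = np.full(N, 1.0 / N)
    for _ in range(n_iter):
        Mx = M @ x
        x = x * Mx / (x @ Mx)

    # Greedy read-out: add nodes in decreasing order of x while the distance
    # to the expected limit vector (12) keeps decreasing.
    order = np.argsort(-x)
    best_err, clique = np.inf, []
    for i in order:
        trial = clique + [i]
        expected = np.zeros(N)
        expected[trial] = c[trial] / c[trial].sum()
        err = np.linalg.norm(expected - x)
        if err >= best_err:
            break
        best_err, clique = err, trial
    return [nodes[i] for i in clique]
```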


Cite this article

Tari, S., Genctav, M. From a Non-Local Ambrosio-Tortorelli Phase Field to a Randomized Part Hierarchy Tree. J Math Imaging Vis 49, 69–86 (2014). https://doi.org/10.1007/s10851-013-0441-8
