Robust surface reconstruction via dictionary learning

Shiyao Xiong, Juyong Zhang, Jianmin Zheng, Jianfei Cai, Ligang Liu

Research output: Contribution to journal › Article › Research › peer-review

54 Citations (Scopus)

Abstract

Surface reconstruction from point clouds is of great practical importance in computer graphics. Existing methods often realize reconstruction via a few phases with separate goals, whose integration may not give an optimal solution. In this paper, to avoid the inherent limitations of multi-phase processing in the prior art, we propose a unified framework that treats geometry and connectivity construction as one joint optimization problem. The framework is based on dictionary learning in which the dictionary consists of the vertices of the reconstructed triangular mesh and the sparse coding matrix encodes the connectivity of the mesh. The dictionary learning is formulated as a constrained ℓ2,q-optimization (0 < q < 1), aiming to find the vertex positions and triangulation that minimize an energy function composed of a point-to-mesh metric and regularization. Our formulation takes many factors into account within the same framework, including the distance metric, noise/outlier resilience, sharp feature preservation, and no need for normal estimation, thus providing a global and robust algorithm that is able to efficiently recover a piecewise smooth surface from dense data points with imperfections. Extensive experiments using synthetic models, real-world models, and a publicly available benchmark show that our method outperforms the state of the art in terms of accuracy, robustness to noise and outliers, geometric feature and detail preservation, and mesh connectivity.
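The abstract's core idea — mesh vertices as a dictionary V, triangle connectivity as a sparse coding matrix B, and a robust ℓ2,q point-to-mesh energy — can be illustrated with a minimal sketch. The function name, shapes, and toy data below are illustrative assumptions, not the paper's implementation; in the actual method each column of B is additionally constrained to barycentric weights supported on one triangle.

```python
import numpy as np

def l2q_energy(P, V, B, q=0.5):
    """Sum_i ||p_i - V b_i||_2^q for 0 < q < 1.

    P : 3 x n input points, V : 3 x m vertex dictionary,
    B : m x n sparse coding matrix (columns are barycentric weights).
    The sub-quadratic exponent q down-weights large residuals,
    which is what gives the formulation its outlier robustness.
    """
    residuals = P - V @ B                     # 3 x n residual vectors
    norms = np.linalg.norm(residuals, axis=0) # per-point distances
    return np.sum(norms ** q)

# Toy example (hypothetical data): two vertices, two points coded exactly.
V = np.array([[0.0, 1.0],
              [0.0, 0.0],
              [0.0, 0.0]])   # 3 x 2 vertex dictionary
B = np.array([[1.0, 0.0],
              [0.0, 1.0]])   # 2 x 2 codes, columns sum to 1
P = V @ B                    # points lying exactly on the mesh vertices
print(l2q_energy(P, V, B))   # -> 0.0 (perfect reconstruction)
```

The joint optimization alternates between updating vertex positions (the dictionary) and the sparse codes (the connectivity), minimizing this energy plus regularization terms.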

Original language: English
Article number: 201
Number of pages: 12
Journal: ACM Transactions on Graphics
Volume: 33
Issue number: 6
DOIs
Publication status: Published - Nov 2014
Externally published: Yes

Keywords

  • Dictionary learning
  • Distance metric
  • Point cloud
  • Sparse coding
  • Surface reconstruction
  • ℓ2,q-optimization
