Evolving mazes from images

Liang Wan, Xiaopei Liu, Tien-Tsin Wong, Chi-Sing Leung

Research output: Contribution to journal › Article › Research › peer-review

12 Citations (Scopus)

Abstract

We propose a novel reaction-diffusion (RD) simulator to evolve image-resembling mazes. The evolved mazes faithfully preserve the salient interior structures of the source images. Since it is difficult to control the generation of desired patterns with traditional reaction diffusion, we develop our RD simulator on a different computational platform, cellular neural networks. Based on the proposed simulator, we can generate mazes that exhibit both regular and organic appearances, with uniform and/or spatially varying passage spacing. Our simulator also provides high controllability of maze appearance: users can directly and intuitively paint with a set of brushes to modify the appearance of mazes in a spatially varying manner. In addition, the evolutionary nature of our method naturally produces mazes without any obvious seams, even when the input image is a composite of multiple sources. The final maze is obtained by determining a solution path that follows a user-specified guiding curve. We validate our method by evolving several interesting mazes from different source images.
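The paper's own simulator is built on cellular neural networks and is not reproduced here. As a rough, hedged illustration of the kind of iterative reaction-diffusion evolution the abstract refers to, the sketch below runs a standard Gray-Scott RD system on a grid and lets a source image modulate the feed rate; the function name `evolve_rd`, the parameter values, and the image coupling are assumptions for illustration only, not the authors' method.

```python
# Minimal Gray-Scott reaction-diffusion sketch (NOT the paper's CNN-based
# simulator): it only illustrates iterative pattern evolution in which a
# source image biases the emerging stripe/maze-like structure.
import numpy as np

def laplacian(Z):
    # 5-point stencil with wrap-around boundaries.
    return (np.roll(Z, 1, 0) + np.roll(Z, -1, 0) +
            np.roll(Z, 1, 1) + np.roll(Z, -1, 1) - 4.0 * Z)

def evolve_rd(image, steps=5000, Du=0.16, Dv=0.08, F=0.035, k=0.065):
    """Evolve a labyrinth-like pattern. `image` (values in [0, 1]) modulates
    the feed rate so structures in the source bias the pattern (an assumed,
    simplistic coupling)."""
    h, w = image.shape
    U = np.ones((h, w))
    V = np.zeros((h, w))
    # Seed a small noisy patch so stripes can nucleate.
    rng = np.random.default_rng(0)
    V[h//2-5:h//2+5, w//2-5:w//2+5] = 0.25 + 0.05 * rng.random((10, 10))
    feed = F * (0.8 + 0.4 * image)            # spatially varying feed rate
    for _ in range(steps):
        UVV = U * V * V
        U += Du * laplacian(U) - UVV + feed * (1.0 - U)
        V += Dv * laplacian(V) + UVV - (feed + k) * V
    return V
```

Thresholding the returned field (e.g. `V > 0.2`) yields a binary wall/passage map. The actual method additionally offers brush-based spatial control and extracts a solution path along a guiding curve, neither of which this sketch attempts.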

Original language: English
Pages (from-to): 287-297
Number of pages: 11
Journal: IEEE Transactions on Visualization and Computer Graphics
Volume: 16
Issue number: 2
DOIs
Publication status: Published - Mar 2010
Externally published: Yes

Keywords

  • Cellular neural networks
  • Intuitive user controls
  • Maze
  • Multiscale RD simulator
