Geometric enclosing networks

Research output: Chapter in Book/Report/Conference proceeding › Conference Paper › Research › peer-review

Abstract

Training models to generate data has increasingly attracted research attention and has become important in modern real-world applications. In this paper we propose a new geometry-based optimization approach to address this problem. Orthogonal to current state-of-the-art density-based approaches, most notably VAE and GAN, we present a fresh idea that borrows the principle of the minimal enclosing ball to train a generator G(z) in such a way that both training and generated data, after being mapped to the feature space, are enclosed in the same sphere. We develop theory to guarantee that the mapping is bijective, so that its inverse from feature space to data space yields expressive nonlinear contours that describe the data manifold, thereby ensuring that generated data also lie on the data manifold learned from the training data. Our model enjoys a natural geometric interpretation, hence the name Geometric Enclosing Networks (GEN), and possesses key advantages over its rivals: a simple and easy-to-control optimization formulation, avoidance of mode collapse, and efficient learning of the data manifold representation in a completely unsupervised manner. We conducted extensive experiments on synthetic and real-world datasets to illustrate the behavior, strengths and weaknesses of the proposed GEN, in particular its ability to handle multi-modal data and the quality of the generated data.
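The enclosing-sphere idea described in the abstract can be sketched concretely. The PyTorch snippet below is a hypothetical illustration based only on the abstract, not the authors' implementation: a generator and a feature network (the names Generator, FeatureNet, sphere_loss, and all hyperparameters are invented here) are trained so that features of both real and generated samples fall inside one learned sphere, in the spirit of a minimal enclosing ball. The bijectivity guarantee that is central to GEN is omitted in this toy.

# Hypothetical sketch of the enclosing-sphere objective (not the authors' code).
import torch
import torch.nn as nn
import torch.nn.functional as F

class Generator(nn.Module):
    """Maps noise z to data space."""
    def __init__(self, z_dim=16, x_dim=2):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(z_dim, 64), nn.ReLU(), nn.Linear(64, x_dim))
    def forward(self, z):
        return self.net(z)

class FeatureNet(nn.Module):
    """Maps data to the feature space in which the enclosing sphere lives."""
    def __init__(self, x_dim=2, f_dim=8):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(x_dim, 64), nn.ReLU(), nn.Linear(64, f_dim))
    def forward(self, x):
        return self.net(x)

def sphere_loss(feats, center, radius, lam=10.0):
    # Penalise features lying outside the ball around `center` while keeping the
    # radius small (minimal-enclosing-ball flavour).
    dist2 = ((feats - center) ** 2).sum(dim=1)
    return radius ** 2 + lam * F.relu(dist2 - radius ** 2).mean()

# Toy training data: a 2-D Gaussian blob standing in for the real dataset.
x_train = torch.randn(256, 2)

G, phi = Generator(), FeatureNet()
center = torch.zeros(8, requires_grad=True)
radius = torch.tensor(1.0, requires_grad=True)
opt = torch.optim.Adam(
    list(G.parameters()) + list(phi.parameters()) + [center, radius], lr=1e-3)

for step in range(200):
    z = torch.randn(256, 16)
    f_real = phi(x_train)          # features of training data
    f_fake = phi(G(z))             # features of generated data
    # Real and generated features should be enclosed by the same sphere.
    loss = sphere_loss(f_real, center, radius) + sphere_loss(f_fake, center, radius)
    opt.zero_grad()
    loss.backward()
    opt.step()

In the paper proper, the feature mapping is additionally constructed to be bijective, which is what allows the single sphere in feature space to translate back into expressive nonlinear contours around the data manifold in input space; the sketch above leaves that constraint out.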

Original language: English
Title of host publication: Proceedings of the 27th International Joint Conference on Artificial Intelligence, IJCAI 2018
Editors: Jerome Lang
Place of Publication: California USA
Publisher: International Joint Conferences on Artificial Intelligence
Pages: 2355-2361
Number of pages: 7
ISBN (Electronic): 9780999241127
ISBN (Print): 9780999241127
Publication status: Published - 2018
Event: International Joint Conference on Artificial Intelligence 2018 - Stockholm, Sweden
Duration: 13 Jul 2018 - 19 Jul 2018
https://www.ijcai.org/proceedings/2018/

Conference

Conference: International Joint Conference on Artificial Intelligence 2018
Abbreviated title: IJCAI 2018
Country: Sweden
City: Stockholm
Period: 13/07/18 - 19/07/18
Internet address: https://www.ijcai.org/proceedings/2018/

Cite this

Le, T., Vu, H., Nguyen, T. D., & Phung, D. (2018). Geometric enclosing networks. In J. Lang (Ed.), Proceedings of the 27th International Joint Conference on Artificial Intelligence, IJCAI 2018 (pp. 2355-2361). California USA: International Joint Conferences on Artificial Intelligence.