Unsupervised learning of eye gaze representation from the web

Neeru Dubey, Shreya Ghosh, Abhinav Dhall

Research output: Chapter in Book/Report/Conference proceeding › Conference Paper › Research › peer-review

2 Citations (Scopus)


Automatic eye gaze estimation has interested researchers for some time. In this paper, we propose an unsupervised learning based method for estimating the eye gaze region. To train the proposed network "Ize-Net" in a self-supervised manner, we collect a large 'in the wild' dataset containing 154,251 images from the web. For the images in the dataset, we divide the gaze into three regions using an automatic technique based on pupil-center localization, and then use a feature-based technique to determine the gaze region. The performance is evaluated on the Tablet Gaze and CAVE datasets by fine-tuning Ize-Net for the task of eye gaze estimation. The learned feature representation is also used to train traditional machine learning algorithms for eye gaze estimation. The results demonstrate that the proposed method learns a rich data representation, which can be efficiently fine-tuned for any eye gaze estimation dataset.
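The coarse three-region labelling described above could, for instance, be derived from the pupil's horizontal position within the detected eye box. The sketch below is a hypothetical illustration of that idea, not the authors' actual pipeline; the `margin` threshold and function name are assumptions.

```python
def gaze_region(pupil_x, eye_left_x, eye_right_x, margin=0.35):
    """Classify gaze as 'left', 'center', or 'right' from the pupil
    center's relative horizontal position inside the eye bounding box.

    Note: a simplified stand-in for the paper's automatic labelling;
    the 0.35 margin is an assumed threshold.
    """
    width = eye_right_x - eye_left_x
    if width <= 0:
        raise ValueError("invalid eye bounding box")
    # Normalize pupil position: 0.0 = inner-left edge, 1.0 = inner-right edge
    rel = (pupil_x - eye_left_x) / width
    if rel < margin:
        return "left"
    if rel > 1.0 - margin:
        return "right"
    return "center"
```

For example, with an eye box spanning pixels 0–100, a pupil center at x = 30 would be labelled "left", at x = 50 "center", and at x = 80 "right". Such weak labels are what make web-scale self-supervised training possible without manual annotation.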

Original language: English
Title of host publication: International Joint Conference on Neural Networks (IJCNN) 2019
Editors: Plamen Angelov, Manuel Roveri
Place of Publication: Piscataway NJ USA
Publisher: IEEE, Institute of Electrical and Electronics Engineers
Number of pages: 7
ISBN (Electronic): 9781728119854
ISBN (Print): 9781728119861
Publication status: Published - 2019
Externally published: Yes
Event: IEEE International Joint Conference on Neural Networks 2019 - Budapest, Hungary
Duration: 14 Jul 2019 - 19 Jul 2019
https://ieeexplore.ieee.org/xpl/conhome/8840768/proceeding (Proceedings)


Conference: IEEE International Joint Conference on Neural Networks 2019
Abbreviated title: IJCNN 2019
