Condensing class diagrams with minimal manual labeling cost

Xinli Yang, David Lo, Xin Xia, Jianling Sun

Research output: Chapter in Book/Report/Conference proceeding › Conference Paper › Research › peer-review

8 Citations (Scopus)

Abstract

Traditionally, to better understand the design of a project, developers can reconstruct a class diagram from source code using reverse engineering techniques. However, the raw diagram is often perplexing because it contains too many classes. Condensing the reverse-engineered class diagram into a compact diagram that contains only the important classes would improve the understandability of the corresponding project. A number of recent works have proposed supervised machine learning solutions for condensing reverse-engineered class diagrams, given a set of classes manually labeled as important or not. However, one challenge limits the practicality of these solutions: the high cost of manually labeling training samples. More training samples lead to better performance but require more manual labeling, and too much manual labeling defeats the purpose, since the aim is to identify important classes automatically. In this paper, to bridge this research gap, we propose a novel approach, MCCondenser, which requires only a small amount of training data yet still achieves reasonably good performance. MCCondenser first selects a small, most representative proportion of all data as training data in an unsupervised way using k-means clustering. Next, it uses ensemble learning to handle the class imbalance problem so that a suitable classifier can be built from the limited training data. To evaluate the performance of MCCondenser, we use datasets from nine open source projects, i.e., ArgoUML, JavaClient, JGAP, JPMC, Mars, Maze, Neuroph, Wro4J and xUML, containing a total of 2640 classes. We compare MCCondenser with two baseline approaches proposed by Thung et al., both of which are state-of-the-art approaches aimed at reducing manual labeling cost. The experimental results show that MCCondenser achieves an average AUC score of 0.73, improving on the two baselines by nearly 20% and 10%, respectively.
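The abstract describes two steps: unsupervised selection of a representative subset of classes for manual labeling (via k-means), followed by an ensemble classifier trained on that small, typically imbalanced labeled set. The sketch below shows one way these steps could be wired together with scikit-learn; the feature matrix, the roughly 10% labeling budget, and the bagged decision trees are illustrative assumptions, not details taken from the paper.

# Illustrative sketch only: hypothetical features, labeling budget, and ensemble choice.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.ensemble import BaggingClassifier
from sklearn.tree import DecisionTreeClassifier

def select_representatives(X, budget):
    """Pick the sample closest to each k-means centroid as the set to label manually."""
    km = KMeans(n_clusters=budget, n_init=10, random_state=0).fit(X)
    reps = []
    for c in range(budget):
        members = np.where(km.labels_ == c)[0]
        dists = np.linalg.norm(X[members] - km.cluster_centers_[c], axis=1)
        reps.append(members[np.argmin(dists)])
    return np.array(reps)

rng = np.random.default_rng(0)
X = rng.random((200, 6))                            # placeholder per-class design metrics
train_idx = select_representatives(X, budget=20)    # ~10% of classes receive manual labels (assumed budget)
y_train = rng.integers(0, 2, size=train_idx.size)   # stand-in for manual important/unimportant labels

# Ensemble of decision trees to cope with the small, imbalanced labeled set
# (the paper's exact ensemble method may differ).
clf = BaggingClassifier(DecisionTreeClassifier(class_weight="balanced"),
                        n_estimators=25, random_state=0)
clf.fit(X[train_idx], y_train)
scores = clf.predict_proba(X)[:, 1]                 # rank every class by predicted importance

In the actual approach, labels for the selected representatives would come from manual inspection, and performance would be measured by AUC over the remaining classes.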

Original language: English
Title of host publication: Proceedings - 2016 IEEE 40th Annual Computer Software and Applications Conference, COMPSAC 2016
Subtitle of host publication: 10-14 June 2016, Atlanta, Georgia
Editors: Sorel Reisman, Sheikh Iqbal Ahamed, Ling Liu, Dejan Milojicic, William Claycomb, Mihhail Matskin, Hiroyuki Sato, Zhiyong Zhang
Place of publication: Piscataway, NJ, USA
Publisher: IEEE, Institute of Electrical and Electronics Engineers
Pages: 22-31
Number of pages: 10
Volume: 1
ISBN (Electronic): 9781467388450
DOIs
Publication status: Published - 2016
Externally published: Yes
Event: International Computer Software and Applications Conference 2016 - Atlanta, United States of America
Duration: 10 Jun 2016 - 14 Jun 2016
Conference number: 40th
https://www.computer.org/web/compsac2016
https://ieeexplore.ieee.org/xpl/conhome/7551592/proceeding (Proceedings)

Conference

Conference: International Computer Software and Applications Conference 2016
Abbreviated title: COMPSAC 2016
Country: United States of America
City: Atlanta
Period: 10/06/16 - 14/06/16
Internet address

Keywords

  • Class Diagram
  • Cost Saving
  • Ensemble Learning
  • Manual Labeling
  • Unsupervised Learning
