Abstract
Machine learning has reached a point where many probabilistic methods can be understood as variations, extensions, and combinations of a much smaller set of abstract themes, e.g., as different instances of the EM algorithm. This enables the systematic derivation of algorithms customized for different models. Here, we describe the AUTOBAYES system, which takes a high-level statistical model specification, uses powerful symbolic techniques based on schema-based program synthesis and computer algebra to derive an efficient specialized algorithm for learning that model, and generates executable code implementing that algorithm. This capability is far beyond that of code collections such as MATLAB toolboxes or even tools for model-independent optimization such as BUGS for Gibbs sampling: complex new algorithms can be generated without new programming, algorithms can be highly specialized and tightly crafted for the exact structure of the model and data, and efficient and commented code can be generated for different languages or systems. We present automatically derived algorithms ranging from closed-form solutions of Bayesian textbook problems to recently proposed EM algorithms for clustering, regression, and a multinomial form of PCA.
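For readers unfamiliar with the class of algorithms the abstract refers to, the following is a minimal hand-written sketch of EM for a one-dimensional Gaussian mixture, i.e., the kind of clustering algorithm the paper reports deriving automatically from a model specification. It is not AUTOBAYES output or its specification language; all names and the choice of a 1-D model are illustrative.

```python
# Illustrative sketch (not AUTOBAYES output): EM for a k-component
# 1-D Gaussian mixture, the textbook instance of the "EM algorithm"
# theme that AUTOBAYES specializes for a given model.
import numpy as np

def em_gaussian_mixture(x, k, n_iter=100, seed=0):
    """Fit a k-component 1-D Gaussian mixture to data x via EM."""
    rng = np.random.default_rng(seed)
    n = x.shape[0]
    # Initialize mixing weights, means, and variances.
    w = np.full(k, 1.0 / k)
    mu = rng.choice(x, size=k, replace=False)
    var = np.full(k, x.var())
    for _ in range(n_iter):
        # E-step: log joint density of each point under each component.
        logp = (-0.5 * (x[:, None] - mu) ** 2 / var
                - 0.5 * np.log(2 * np.pi * var) + np.log(w))
        logp -= logp.max(axis=1, keepdims=True)   # numerical stability
        r = np.exp(logp)
        r /= r.sum(axis=1, keepdims=True)         # posterior responsibilities
        # M-step: closed-form updates for weights, means, and variances.
        nk = r.sum(axis=0)
        w = nk / n
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return w, mu, var

# Example: recover two well-separated clusters from synthetic data.
rng = np.random.default_rng(1)
data = np.concatenate([rng.normal(-3, 1, 500), rng.normal(4, 0.5, 500)])
print(em_gaussian_mixture(data, k=2))
```

The point of the paper is that the E- and M-step updates above, which here were written by hand from the mixture model's likelihood, are exactly what AUTOBAYES derives symbolically, and specializes, for whatever model the user specifies.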
Original language | English |
---|---|
Title of host publication | Advances in Neural Information Processing Systems 15 - Proceedings of the 2002 Conference, NIPS 2002 |
Publisher | Neural Information Processing Systems (NIPS) |
ISBN (Print) | 0262025507, 9780262025508 |
Publication status | Published - 1 Jan 2003 |
Event | Advances in Neural Information Processing Systems 2002 (15th), Vancouver, Canada, 9 Dec 2002 → 14 Dec 2002 |
Conference
Conference | Advances in Neural Information Processing Systems 2002 |
---|---|
Abbreviated title | NIPS 2002 |
Country/Territory | Canada |
City | Vancouver |
Period | 9/12/02 → 14/12/02 |