Bayesian Spiking Neurons (BSNs) provide a probabilistic interpretation of how neurons can perform inference and learning. Learning in a single BSN can be formulated as an online maximum-likelihood expectation-maximisation (ML-EM) algorithm. This form of learning is quite slow. Here, an alternative to this learning algorithm, called Fast Learning (FL), is presented. The FL algorithm is shown to have acceptable convergence performance when compared to the ML-EM algorithm. Moreover, for our implementations the FL algorithm is approximately 25 times faster than the ML-EM algorithm. Although only approximate, the FL algorithm therefore makes learning in hierarchical BSN networks much more tractable.
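The abstract does not reproduce the update equations, so the following is only a minimal, self-contained sketch of the kind of comparison being described: an exact online EM update for a toy hidden-cause neuron model versus a cheaper approximate update. The generative model, the threshold-based surrogate for the posterior, and all parameter names are illustrative assumptions, not the BSN formulation or the Fast Learning rule used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy generative model (an assumption for illustration, not the paper's BSN model):
# a hidden binary cause x_t with prior p, and N binary inputs y_t that fire with
# probability w_i when x_t = 1 and with a small baseline rate when x_t = 0.
N, T = 20, 5000
p_true, w_true, baseline = 0.3, rng.uniform(0.2, 0.9, N), 0.05
x = rng.random(T) < p_true
y = np.where(x[:, None], rng.random((T, N)) < w_true, rng.random((T, N)) < baseline)

def online_em(y, eta=0.01, baseline=0.05):
    """Online (stochastic) EM: E-step = exact posterior over the hidden cause,
    M-step = running-average updates of the sufficient statistics."""
    N = y.shape[1]
    p, w = 0.5, np.full(N, 0.5)
    s1, s2 = 1e-3, np.full(N, 5e-4)          # running averages of q and q*y
    for y_t in y:
        # E-step: posterior P(x_t = 1 | y_t) under the current parameters
        log_on  = np.log(p)    + np.sum(y_t * np.log(w) + (1 - y_t) * np.log1p(-w))
        log_off = np.log1p(-p) + np.sum(y_t * np.log(baseline) + (1 - y_t) * np.log1p(-baseline))
        q = 1.0 / (1.0 + np.exp(log_off - log_on))
        # M-step: exponential running averages of the sufficient statistics
        s1 = (1 - eta) * s1 + eta * q
        s2 = (1 - eta) * s2 + eta * q * y_t
        p, w = s1, np.clip(s2 / s1, 1e-3, 1 - 1e-3)
    return p, w

def fast_approx(y, eta=0.05, theta=5.0):
    """A cruder 'fast' update (purely illustrative): replace the exact E-step with
    a hard threshold on the input count, then nudge the weights toward the inputs."""
    N = y.shape[1]
    w = np.full(N, 0.5)
    for y_t in y:
        q = float(y_t.sum() > theta)   # crude surrogate for the posterior
        w += eta * q * (y_t - w)       # move w toward y_t on 'active' steps
    return w

p_hat, w_em = online_em(y)
w_fl = fast_approx(y)
print("online EM weight error:", np.abs(w_em - w_true).mean())
print("fast approx weight error:", np.abs(w_fl - w_true).mean())
```

The point of the sketch is structural: the exact E-step requires evaluating the full posterior at every time step, whereas the approximate rule replaces it with a much cheaper per-step computation at some cost in accuracy, which is the trade-off the abstract reports for FL versus ML-EM.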
Title of host publication: 2012 International Joint Conference on Neural Networks, IJCNN 2012
Publication status: Published - 22 Aug 2012
Event: 2012 Annual International Joint Conference on Neural Networks, IJCNN 2012, Part of the 2012 IEEE World Congress on Computational Intelligence, WCCI 2012 - Brisbane, QLD, Australia
Duration: 10 Jun 2012 → 15 Jun 2012