This package reproduces the results of the paper *Simple and Scalable Predictive Uncertainty Estimation using Deep Ensembles* (Lakshminarayanan, Pritzel, & Blundell, 2017).
This package builds on code taken and adapted from other GitHub repositories:
- Deep Ensemble, by mpritzkoleit
This package follows the loss-landscape analysis of Fort, S., Hu, H., & Lakshminarayanan, B. (2019). Deep ensembles: A loss landscape perspective. arXiv preprint arXiv:1912.02757. Important properties of SGD optimization and its escape behaviour are analysed here, following Wu, L., & Ma, C. (2018). How SGD selects the global minima in over-parameterized learning: A dynamical stability perspective. Advances in Neural Information Processing Systems, 31.

The experiments cover the following datasets:
- Boston Housing
- Concrete Strength
- Energy Efficiency
- Kin8nm
- Naval Propulsion Plant
- Power Plant
- Protein Structure
- Wine Quality Red
- Yacht Hydrodynamics
- Year Prediction MSD
- MNIST
- SVHN
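For the regression benchmarks above, the deep-ensemble predictive distribution is formed by treating the ensemble as a uniform mixture of the per-network Gaussians and reporting the mixture's mean and variance (Lakshminarayanan et al., 2017). A minimal numpy sketch of that combination step (the function name is our own, not from the repositories this package adapts):

```python
import numpy as np

def combine_gaussian_ensemble(means, variances):
    """Combine M per-network Gaussian predictions (mean, variance)
    into the moments of the uniform mixture over the ensemble.

    means, variances: arrays of shape (M, N) for M networks, N test points.
    """
    means = np.asarray(means, dtype=float)
    variances = np.asarray(variances, dtype=float)
    # Mixture mean: average of the per-network means.
    mu_star = means.mean(axis=0)
    # Mixture variance: average second moment minus squared mixture mean.
    var_star = (variances + means**2).mean(axis=0) - mu_star**2
    return mu_star, var_star

# Example: three networks predicting a single test point.
mu, var = combine_gaussian_ensemble([[1.0], [2.0], [3.0]],
                                    [[0.5], [0.5], [0.5]])
```

Note that the combined variance exceeds each individual network's variance whenever the networks disagree: it adds the spread of the means (epistemic disagreement) to the average predicted variance (aleatoric noise).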
References:
- Fort, S., Hu, H., & Lakshminarayanan, B. (2019). Deep ensembles: A loss landscape perspective. arXiv preprint arXiv:1912.02757.
- Gal, Y., & Ghahramani, Z. (2016). Dropout as a Bayesian approximation: Representing model uncertainty in deep learning. In International Conference on Machine Learning (pp. 1050-1059). PMLR.
- Hernández-Lobato, J. M., & Adams, R. (2015). Probabilistic backpropagation for scalable learning of Bayesian neural networks. In International Conference on Machine Learning (pp. 1861-1869). PMLR.
- Lakshminarayanan, B., Pritzel, A., & Blundell, C. (2017). Simple and scalable predictive uncertainty estimation using deep ensembles. Advances in Neural Information Processing Systems, 30.
- Wu, L., & Ma, C. (2018). How SGD selects the global minima in over-parameterized learning: A dynamical stability perspective. Advances in Neural Information Processing Systems, 31.
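The SGD escape property analysed following Wu & Ma (2018) can be illustrated in one dimension (this sketch is our own illustration, not code from the adapted repositories): on a quadratic loss f(x) = a·x²/2 with sharpness a, gradient descent with step size η is stable only if η < 2/a, so the iterates escape minima that are too sharp for the chosen step size.

```python
import numpy as np

def gd_trajectory(a, eta, x0=1.0, steps=50):
    """Run gradient descent on the quadratic f(x) = a * x**2 / 2.

    The update is x <- (1 - eta * a) * x, so the iterates shrink
    iff |1 - eta * a| < 1, i.e. eta < 2 / a: the dynamical-stability
    condition analysed by Wu & Ma (2018).
    """
    x = x0
    for _ in range(steps):
        x -= eta * a * x  # gradient of f is a * x
    return x

eta = 0.25
flat = gd_trajectory(a=1.0, eta=eta)    # eta < 2/a = 2.0: converges toward 0
sharp = gd_trajectory(a=10.0, eta=eta)  # eta > 2/a = 0.2: diverges (escapes)
```

With the same step size, the flat minimum (a = 1) is retained while the sharp minimum (a = 10) is escaped, which is the mechanism by which SGD's step size and noise bias it toward flatter minima.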