Uncertainty Ensembles Of Deep Neural Networks As Predictive Distribution For Regression

Issue Date

2019-12-01

Language

en

Abstract

Recent years have seen the rise of Deep Neural Networks (DNNs) and Bayesian approaches in machine learning. Combining the mathematical expressiveness of DNNs with the quantification of the reliability of their predictions through the Bayesian approach into Bayesian Neural Networks (BNNs) promises a revolution in decision making, both by humans and by artificial agents. However, certain theoretical and practical hurdles stand in the way of the reliable use of BNNs. This work provides a primer on the theoretical problems encountered when building fully Bayesian Neural Networks and argues that ensembles of DNNs can serve as a simple, practical substitute. To that end, we compare six popular approaches to explicit and implicit ensembling of DNNs from the literature in the context of regression problems. We evaluate them on two synthetic and one real-life data set with respect to the common metrics mean squared error (mse) and negative log predictive density (nlpd). Additionally, we introduce a metric that captures how strongly the uncertainty of the predictive distribution correlates with its error ('correlation between error and uncertainty', cobeau). We ensure comparability between the methods by forcing each of them to ensemble a shared, independently determined network architecture with a predetermined training schedule in order to obtain its predictive distribution.
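As a rough illustration of the evaluation described above (not the thesis code), the sketch below shows how an ensemble's per-member Gaussian predictions might be combined into a single predictive distribution and scored with mse, nlpd, and a cobeau-style metric. The moment-matched combination, the Gaussian form of the nlpd, and the use of a Pearson correlation between squared error and predictive variance for cobeau are assumptions made for this example; all function names are hypothetical.

```python
import numpy as np
from scipy.stats import pearsonr


def ensemble_predictive_distribution(member_means, member_vars):
    """Combine per-member Gaussian predictions (shape [M, N]) into one
    Gaussian predictive distribution per test point via moment matching
    (an assumption; the thesis may combine members differently)."""
    mean = member_means.mean(axis=0)
    # total variance = average per-member variance + spread of member means
    var = member_vars.mean(axis=0) + member_means.var(axis=0)
    return mean, var


def mse(y_true, mean):
    """Mean squared error of the predictive mean."""
    return np.mean((y_true - mean) ** 2)


def nlpd(y_true, mean, var):
    """Negative log predictive density under a Gaussian predictive
    distribution, averaged over test points."""
    return np.mean(0.5 * np.log(2 * np.pi * var)
                   + 0.5 * (y_true - mean) ** 2 / var)


def cobeau(y_true, mean, var):
    """Correlation between error and uncertainty: here taken as the
    Pearson correlation of squared error with predictive variance
    (the exact definition used in the thesis may differ)."""
    errors = (y_true - mean) ** 2
    r, p_value = pearsonr(errors, var)
    return r, p_value
```

A higher cobeau value under this reading would indicate that the ensemble's predictive variance is larger exactly where its predictions are worse, which is the behaviour one wants from a useful uncertainty estimate.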

Faculty

Faculteit der Sociale Wetenschappen