Deeply uncertain: comparing methods of uncertainty quantification in deep learning algorithms

Caldeira, João and Nord, Brian (2020) Deeply uncertain: comparing methods of uncertainty quantification in deep learning algorithms. Machine Learning: Science and Technology, 2 (1). 015002. ISSN 2632-2153

Caldeira_2021_Mach._Learn.__Sci._Technol._2_015002.pdf - Published Version (Download, 571 kB)

Abstract

We present a comparison of methods for uncertainty quantification (UQ) in deep learning algorithms in the context of a simple physical system. Three of the most common UQ methods, Bayesian neural networks (BNNs), concrete dropout (CD), and deep ensembles (DEs), are compared to standard analytic error propagation. We discuss this comparison in terms endemic to both machine learning ('epistemic' and 'aleatoric') and the physical sciences ('statistical' and 'systematic'). The comparisons are presented in terms of simulated experimental measurements of a single pendulum, a prototypical physical system for studying measurement and analysis techniques. Our results highlight some pitfalls that may occur when using these UQ methods. For example, when the variation of noise in the training set is small, all methods predicted the same relative uncertainty independent of the inputs; this issue is particularly hard to avoid in BNNs. On the other hand, when the test set contains samples far from the training distribution, we found that none of the methods sufficiently increased the uncertainties associated with their predictions. This problem was particularly pronounced for CD. In light of these results, we make some recommendations for the usage and interpretation of UQ methods.
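For context on the analytic baseline mentioned in the abstract, standard error propagation applied to a simple pendulum takes the following form. This is an illustrative sketch consistent with the physical system described above, not a formula quoted from the paper; it assumes the gravitational acceleration g is inferred from a measured length L and period T with independent uncertainties sigma_L and sigma_T:

    T = 2\pi\sqrt{L/g}
    \quad\Rightarrow\quad
    g = \frac{4\pi^2 L}{T^2},
    \qquad
    \frac{\sigma_g}{g}
      = \sqrt{\left(\frac{\sigma_L}{L}\right)^2
            + \left(2\,\frac{\sigma_T}{T}\right)^2}.

The factor of 2 on the period term arises because g depends on T^{-2}, so a relative error in T contributes twice over to the relative error in g.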

Item Type: Article
Subjects: STM Academic > Multidisciplinary
Date Deposited: 17 Jul 2023 06:07
Last Modified: 30 Oct 2023 05:20
URI: http://article.researchpromo.com/id/eprint/1203
