One World ABC Seminar: David Nott
on November 12, 2020
at 11:30 am [UK time]
For this fourteenth session of the One World ABC Seminar, David Nott from the National University of Singapore will talk about "Marginally-calibrated deep distributional regression".
Inspired by the "One World Probability Seminar", we decided to run the One World ABC Seminar, a weekly/fortnightly series of seminars that will take place on Blackboard Collaborate on Thursdays at 11.30am [UK time]. The idea is to gather members of the community and disseminate results and innovations during these weeks and months under lockdown.
David Nott
Abstract
Deep neural network (DNN) regression models are widely used in applications requiring state-of-the-art predictive accuracy. However, until recently there has been little work on accurate uncertainty quantification for predictions from such models. We add to this literature by outlining an approach to constructing predictive distributions that are 'marginally calibrated'. This is where the long run average of the predictive distributions of the response variable matches the observed empirical margin. Our approach considers a DNN regression with a conditionally Gaussian prior for the final layer weights, from which an implicit copula process on the feature space is extracted. This copula process is combined with a non-parametrically estimated marginal distribution for the response. The end result is a scalable distributional DNN regression method with marginally calibrated predictions, and our work complements existing methods for probability calibration. The approach is first illustrated using two applications of dense layer feed-forward neural networks. However, our main motivating applications are in likelihood-free inference, where distributional deep regression is used to estimate marginal posterior distributions. In two complex ecological time series examples we employ the implicit copulas of convolutional networks, and show that marginal calibration results in improved uncertainty quantification. Our approach also avoids the need for manual specification of summary statistics, a requirement that is burdensome for users and typical of competing likelihood-free inference methods. This is joint work with Nadja Klein and Michael Smith.
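To make the marginal-calibration idea concrete, here is a minimal numerical sketch (not the authors' implementation). We assume a latent Gaussian regression whose conditional means come from features but whose implied marginal is standard normal, play the role of the implicit copula with a simple Gaussian copula, and use an Exp(1) distribution as a stand-in for the non-parametric empirical margin. The check at the end illustrates the calibration property: the long-run average of the conditional predictive CDFs matches the marginal of the response.

```python
import numpy as np
from scipy.stats import norm, expon

rng = np.random.default_rng(0)
n, rho = 20000, 0.8

# Latent Gaussian "regression": conditional mean m_i varies with the
# features; the conditional variance is chosen so that marginally
# z ~ N(0, 1), as required for the copula construction.
m = rng.normal(0.0, rho, size=n)
s = np.sqrt(1.0 - rho**2)
z = rng.normal(m, s)

# Copula transform: push the standard-normal margin through a target
# marginal F (here Exp(1), standing in for the empirical margin).
y = expon.ppf(norm.cdf(z))

# Conditional predictive CDF of Y given the features:
#   G_i(t) = Phi((Phi^{-1}(F(t)) - m_i) / s)
grid = np.linspace(0.05, 5.0, 50)
u = norm.ppf(expon.cdf(grid))                # thresholds on latent scale
G = norm.cdf((u[None, :] - m[:, None]) / s)  # (n x grid) predictive CDFs

# Marginal calibration: averaging the predictive CDFs over the feature
# distribution recovers the marginal F of the response.
avg_cdf = G.mean(axis=0)
print(np.max(np.abs(avg_cdf - expon.cdf(grid))))  # close to zero
```

The conditional distributions here vary genuinely with the features (their means differ), yet every predictive distribution, averaged over the long run, reproduces the response margin exactly; this is the sense of calibration the abstract describes.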
References
[1] N. Klein, D. J. Nott, M. S. Smith. Marginally-calibrated deep distributional regression, J. Comput. Graph. Stat., 2020. arXiv:2006.14126, doi:10.1080/10618600.2020.1807996.
Published on October 7, 2020