Charpentier, B., Zügner, D., and Günnemann, S. (2020). Posterior Network: Uncertainty Estimation without OOD Samples via Density-Based Pseudo-Counts. arXiv:2006.09239.
Cramer, E., Witthaut, D., Mitsos, A., and Dahmen, M. (2022). Multivariate Probabilistic Forecasting of Intraday Electricity Prices using Normalizing Flows. arXiv:2205.13826.
Dabney, W., Ostrovski, G., Silver, D., and Munos, R. (2018a). Implicit Quantile Networks for Distributional Reinforcement Learning. Proceedings of the 35th International Conference on Machine Learning, Stockholm, Sweden, PMLR 80. arXiv:1806.06923.
Dabney, W., Rowland, M., Bellemare, M. G., and Munos, R. (2018b). Distributional Reinforcement Learning with Quantile Regression. The Thirty-Second AAAI Conference on Artificial Intelligence (AAAI-18). arXiv:1710.10044.
Duan, T., Anand, A., Ding, D. Y., Thai, K. K., Basu, S., Ng, A., and Schuler, A. (2020). NGBoost: Natural Gradient Boosting for Probabilistic Prediction. Proceedings of the 37th International Conference on Machine Learning, PMLR, pages 2690–2700. arXiv:1910.03225.
Dumas, J., Wehenkel, A., Lanaspeze, D., Cornélusse, B., and Sutera, A. (2021). A Deep Generative Model for Probabilistic Energy Forecasting in Power Systems: Normalizing Flows. arXiv:2106.09370.
Endres, D. M. and Schindelin, J. E. (2003). A New Metric for Probability Distributions. IEEE Transactions on Information Theory, 49:1858–1860.
Gal, Y. and Ghahramani, Z. (2016). Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning. Proceedings of The 33rd International Conference on Machine Learning, PMLR, 48:1050–1059. arXiv:1506.02142.
Gawlikowski, J., Tassi, C. R. N., Ali, M., Lee, J., Humt, M., Feng, J., Kruspe, A., Triebel, R., Jung, P., Roscher, R., Shahzad, M., Yang, W., Bamler, R., and Zhu, X. X. (2021). A Survey of Uncertainty in Deep Neural Networks. arXiv:2107.03342.
Gneiting, T. and Raftery, A. E. (2007). Strictly Proper Scoring Rules, Prediction, and Estimation. Journal of the American Statistical Association, 102:359–378.
Gneiting, T., Stanberry, L. I., Grimit, E. P., Held, L., and Johnson, N. A. (2008). Assessing Probabilistic Forecasts of Multivariate Quantities, with an Application to Ensemble Predictions of Surface Winds. Test, 17:211–235.
Gretton, A., Borgwardt, K. M., Rasch, M. J., Schölkopf, B., and Smola, A. (2012). A Kernel Two-Sample Test. Journal of Machine Learning Research, 13:723–773.
Harakeh, A. and Waslander, S. L. (2021). Estimating and Evaluating Regression Predictive Uncertainty in Deep Object Detectors. In ICLR 2021. arXiv:2101.05036.
Huang, G., Li, Y., Pleiss, G., Liu, Z., Hopcroft, J. E., and Weinberger, K. Q. (2017). Snapshot Ensembles: Train 1, Get M for Free. In 5th International Conference on Learning Representations, ICLR 2017. arXiv:1704.00109.
Huang, G., Sun, Y., Liu, Z., Sedra, D., and Weinberger, K. Q. (2016). Deep Networks with Stochastic Depth. In Leibe, B., Matas, J., Sebe, N., and Welling, M., editors, ECCV 2016, volume 9908 of Lecture Notes in Computer Science. arXiv:1603.09382.
Hüllermeier, E. and Waegeman, W. (2021). Aleatoric and Epistemic Uncertainty in Machine Learning: An Introduction to Concepts and Methods. Machine Learning, 110:457–506. arXiv:1910.09457.
Jamgochian, A., Wu, D., Menda, K., Jung, S., and Kochenderfer, M. J. (2022). Conditional Approximate Normalizing Flows for Joint Multi-Step Probabilistic Forecasting with Application to Electricity Demand. arXiv:2201.02753.
Ke, G., Meng, Q., Finley, T., Wang, T., Chen, W., Ma, W., Ye, Q., and Liu, T.-Y. (2017). LightGBM: A Highly Efficient Gradient Boosting Decision Tree. Advances in Neural Information Processing Systems 30 (NIPS 2017), pages 3149–3157.
Klambauer, G., Unterthiner, T., Mayr, A., and Hochreiter, S. (2017). Self-Normalizing Neural Networks. 31st Conference on Neural Information Processing Systems (NIPS 2017). arXiv:1706.02515.
Kobyzev, I., Prince, S. J., and Brubaker, M. A. (2021). Normalizing Flows: An Introduction and Review of Current Methods. IEEE Transactions on Pattern Analysis and Machine Intelligence, 43(11):3964–3979. arXiv:1908.09257.
Kullback, S. and Leibler, R. A. (1951). On Information and Sufficiency. Annals of Mathematical Statistics, 22:79–86.
Lahlou, S., Jain, M., Nekoei, H., Butoi, V., Bertin, P., Rector-Brooks, J., Korablyov, M., and Bengio, Y. (2021). DEUP: Direct Epistemic Uncertainty Prediction. arXiv:2102.08501.
Lakshminarayanan, B., Pritzel, A., and Blundell, C. (2017). Simple and Scalable Predictive Uncertainty Estimation using Deep Ensembles. Advances in Neural Information Processing Systems 30 (NIPS 2017). arXiv:1612.01474.
Lampinen, J. and Vehtari, A. (2001). Bayesian Approach for Neural Networks—Review and Case Studies. Neural Networks, 14:257–274.
Lázaro-Gredilla, M., Quiñonero-Candela, J., Rasmussen, C. E., and Figueiras-Vidal, A. R. (2010). Sparse Spectrum Gaussian Process Regression. Journal of Machine Learning Research, 11:1865–1881.
Ma, X., Xia, L., Zhou, Z., Yang, J., and Zhao, Q. (2020). DSAC: Distributional Soft Actor Critic for Risk-Sensitive Reinforcement Learning. arXiv:2004.14547.
Maddox, W. J., Izmailov, P., Garipov, T., Vetrov, D. P., and Wilson, A. G. (2019). A Simple Baseline for Bayesian Uncertainty in Deep Learning. In Advances in Neural Information Processing Systems 32 (NeurIPS 2019). arXiv:1902.02476.
März, A. and Kneib, T. (2022). Distributional Gradient Boosting Machines. arXiv:2204.00778.
Meinshausen, N. (2006). Quantile Regression Forests. Journal of Machine Learning Research, 7:983–999.
Nguyen-Tang, T., Gupta, S., and Venkatesh, S. (2021). Distributional Reinforcement Learning via Moment Matching. The Thirty-Fifth AAAI Conference on Artificial Intelligence (AAAI-21). arXiv:2007.12354.