16. Model averaging and mixing
In this chapter we discuss the challenge of combining the insights from a number of individual physics models to produce inferences that draw on the models’ collective wisdom. Section Bayesian inference in the multi-model setting provides the general setup for this problem and introduces the crucial distinction between \({\cal M}\)-closed and \({\cal M}\)-open settings. Section Bayesian model averaging describes the standard Bayesian solution, Bayesian Model Averaging (BMA), and explains why BMA can resolve the challenge only in the \({\cal M}\)-closed context. Section Using Bayesian model mixing to open the model space then articulates paths to generalize BMA to the more flexible Bayesian Model Mixing (BMM), in which information from different models is combined in a more textured way than BMA allows. We end with Section A tale of two models: contrasting BMA with BMM, which gives an example where BMM improves on BMA by leveraging information about the local performance of two different models across the input domain.
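The distinction previewed above can be sketched numerically: BMA combines model predictions with a single global weight per model, whereas BMM lets the weight attached to each model vary across the input domain. The following is a minimal illustrative sketch, not an implementation from this chapter; the function names, the toy predictions, and the step-like weight function are all hypothetical choices made here for illustration.

```python
import numpy as np


def bma_prediction(model_preds, weights):
    """BMA-style combination: one global weight per model.

    model_preds: array of shape (n_models, n_points), each row a
    model's predictions across the input grid.
    weights: unnormalized per-model weights (e.g. proportional to
    posterior model probabilities); normalized here to sum to 1.
    """
    preds = np.asarray(model_preds, dtype=float)
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    return w @ preds  # same global weights at every input point


def bmm_prediction(model_preds, weight_fn, x):
    """BMM-style combination: input-dependent weights w_k(x).

    weight_fn(k, x) returns model k's (unnormalized) local weight
    at each input in x; weights are normalized pointwise.
    """
    preds = np.asarray(model_preds, dtype=float)
    w = np.asarray([weight_fn(k, x) for k in range(len(preds))])
    w = w / w.sum(axis=0)  # normalize across models at each x
    return (w * preds).sum(axis=0)


# Toy example: model 0 is trusted at small x, model 1 at large x.
x = np.array([0.2, 0.8])
preds = np.array([[1.0, 1.0],   # model 0's predictions
                  [3.0, 3.0]])  # model 1's predictions


def step_weight(k, x):
    # Hypothetical local-performance weights: 0.9/0.1 split that
    # flips at x = 0.5 in favor of the locally better model.
    low_side = 0.9 if k == 0 else 0.1
    return np.where(x < 0.5, low_side, 1.0 - low_side)


bma = bma_prediction(preds, [0.5, 0.5])       # [2.0, 2.0] everywhere
bmm = bmm_prediction(preds, step_weight, x)   # [1.2, 2.8]: tracks the
                                              # locally favored model
```

The BMA result is the same convex combination at every input, while the BMM result moves toward model 0 where it is locally weighted more heavily and toward model 1 elsewhere, which is the behavior the final section of this chapter contrasts.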