In this talk, we will present a new statistical principle that combines the most desirable aspects of parameter inference and density estimation. It leads us to the PrO posterior, which expresses uncertainty as a consequence of predictive ability. The resulting inferences predictively dominate both classical and generalised Bayes posterior predictive distributions. PrO posteriors adapt to the level of model misspecification: they concentrate around the true model in the same way as Bayes and Gibbs posteriors when the model can recover the data-generating distribution, but they do not concentrate toward a single parameter under non-trivial forms of model misspecification. Instead, they stabilise towards a predictively optimal posterior whose degree of irreducible uncertainty admits an interpretation as the degree of model misspecification, in sharp contrast to the behaviour of Bayesian uncertainty and its existing extensions.
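
To fix intuition, here is a minimal sketch of the distinction under purely illustrative notation (a model p_theta, a prior pi, a candidate posterior q, and the log score); the talk's exact construction may differ. A Gibbs (generalised Bayes) posterior penalises the average loss of individual parameters, whereas a predictively oriented objective scores the posterior predictive mixture directly:

% Illustrative sketch only: notation (p_theta, pi, q, log score) is ours, not the talk's.
% Gibbs / generalised Bayes: average the per-parameter log loss under q.
\[
  q_{\mathrm{Gibbs}} \;=\; \operatorname*{arg\,min}_{q}\; \mathbb{E}_{q(\theta)}\Big[-\sum_{i=1}^{n} \log p_\theta(x_i)\Big] \;+\; \mathrm{KL}(q \,\|\, \pi).
\]
% Predictively oriented alternative: score the predictive mixture itself,
% so spreading mass over theta is rewarded exactly insofar as it helps prediction.
\[
  q_{\mathrm{PrO}} \;=\; \operatorname*{arg\,min}_{q}\; -\sum_{i=1}^{n} \log \int p_\theta(x_i)\, q(\mathrm{d}\theta) \;+\; \mathrm{KL}(q \,\|\, \pi).
\]

Heuristically, Jensen's inequality gives -log ∫ p_theta(x_i) q(d theta) ≤ E_q[-log p_theta(x_i)], with a strict gap whenever q mixes over parameters that predict complementary parts of the data. This is one way to read the claims above: if the model contains the data-generating distribution, spreading mass buys nothing asymptotically and the optimum concentrates; under misspecification, retaining uncertainty strictly improves prediction, so the optimum stabilises at a non-degenerate, predictively optimal posterior.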