As a paradigm in statistics, the
‘Bayesian choice’ goes back to Thomas Bayes in the 18th century, but is often
contrasted with ‘classical’ statistics as developed in the 20th century. In
recent decades its popularity has risen, partly due to increasing computational
power and the invention of new algorithms, but also due to the need to
model high-dimensional data sets.

‘Nonparametrics’
refers to the use of functions as parameters, rather than Euclidean vectors.
Bayesian nonparametrics was long thought to be
problematic, because inference requires a prior probability distribution on the
parameter set, which in nonparametric situations is a subset of an
infinite-dimensional space. Not only was it difficult to come up with
computationally tractable proposals for such priors, but by their nature prior
probability measures are supported on small (sigma-compact) sets and hence were
thought to add too much ‘prior information’ (prior to any observed data) to
lead to useful statistical inference.
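As an illustration (not taken from the report itself), a canonical example of a computationally tractable nonparametric prior is the Dirichlet process, which can be sampled via Sethuraman's stick-breaking construction. The sketch below uses an illustrative concentration parameter, standard normal base measure, and truncation level:

```python
import random

def stick_breaking_dp(alpha, base_sampler, truncation=100):
    """Draw an approximate sample from a Dirichlet process DP(alpha, H)
    via the stick-breaking construction, truncated at `truncation` atoms.

    Returns atom locations (drawn from the base measure H) and their
    random weights; the weights sum to just under 1, with the deficit
    shrinking as the truncation level grows.
    """
    atoms, weights = [], []
    remaining = 1.0  # length of stick not yet broken off
    for _ in range(truncation):
        b = random.betavariate(1.0, alpha)  # proportion broken off
        weights.append(remaining * b)
        remaining *= 1.0 - b
        atoms.append(base_sampler())  # atom location drawn from H
    return atoms, weights

# Illustrative draw: concentration alpha = 2, base measure N(0, 1)
atoms, weights = stick_breaking_dp(2.0, lambda: random.gauss(0.0, 1.0))
```

The resulting random discrete probability measure places weight `weights[i]` at `atoms[i]`; smaller `alpha` concentrates mass on fewer atoms.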

Mathematical and practical insights of
the last decade have shown that these difficulties can be overcome. The development of
new computational methods and the theoretical (mathematical) investigation of the
properties of Bayesian methods go hand in hand with the application of
nonparametric Bayesian methodology in many areas of science.

The 25 participants investigated current
challenges and solutions in a very interactive environment, spending about 60% of the
time in plenary discussion and the remainder in smaller, specialised groups.
For the plenary discussions, a topic was presented by a specialist in an
informal manner, always also involving the ‘blackboard’. This invariably led
to many comments and questions from the audience, to the benefit of audience
and presenter alike.

New insights were obtained regarding
Bayesian uncertainty quantification, either through global measures or through functionals, or by the use of a different topology. There
was special interest in species sampling priors, Bayesian sparse modelling, and
applications in biostatistical modelling and
causality. The work in smaller groups consisted of collaborations on ongoing research as well as new projects,
which will eventually lead to tangible output in the form of research papers.