Author's School

Graduate School of Arts & Sciences


Date of Award

January 2011

Degree Name

Doctor of Philosophy (PhD)

Chair and Committee

Jefferson Gill

Abstract


Model comparison and hypothesis testing are integral parts of all data analyses. In this thesis, I present two new families of information criteria that can be used to perform model comparison. In Chapter 1, I review the background needed to motivate the thesis, with particular attention to the role of priors in estimation and model comparison and to the role that information theory can play in the latter. As we will see, many existing forms of model comparison can be viewed in an information-theoretic manner, which motivates defining new families of criteria. In Chapter 2, I present the two new criteria and discuss their properties. The first criterion is based purely on posterior predictive densities and Kullback-Leibler divergences and decomposes into terms that describe the fit and the complexity of the model; in this respect, it behaves similarly to popular criteria such as the AIC or the DIC. I then present the second family of criteria, which modify the marginal distribution by an appropriate Rényi divergence. This modification of the marginal allows the investigator to use priors that reflect vague prior knowledge without suffering the paradoxes that can arise from such priors. One particularly nice aspect of this family of criteria is that it subsumes the Bayes factor as a special case and produces an infinite family of criteria that are asymptotically equivalent to the Bayes factor. In this manner, the criteria can be tuned to achieve certain goals in small samples while maintaining asymptotic consistency. I conclude the thesis with a short discussion of the computational difficulties that arise when using the criteria and explore possible ways to overcome them.
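As background for the divergences named in the abstract (the thesis itself is not reproduced here, so this is a standard-definition sketch, not the author's criteria): the Rényi divergence of order α between densities p and q is D_α(p‖q) = (α − 1)⁻¹ log ∫ p(x)^α q(x)^{1−α} dx, and it recovers the Kullback-Leibler divergence as α → 1. The snippet below, with illustrative names chosen here, estimates it numerically for two unit-variance Gaussians and checks against the known closed form.

```python
import numpy as np

def gaussian_pdf(mu, sigma):
    """Return the density function of a Normal(mu, sigma^2)."""
    return lambda x: np.exp(-(x - mu) ** 2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))

def renyi_divergence(p, q, alpha, x):
    """Estimate D_alpha(p || q) on the grid x via the trapezoid rule."""
    y = p(x) ** alpha * q(x) ** (1 - alpha)
    integral = np.sum((y[1:] + y[:-1]) / 2 * np.diff(x))
    return np.log(integral) / (alpha - 1)

x = np.linspace(-12.0, 13.0, 20001)
p = gaussian_pdf(0.0, 1.0)  # N(0, 1)
q = gaussian_pdf(1.0, 1.0)  # N(1, 1)

# For equal-variance Gaussians, D_alpha = alpha * (mu_p - mu_q)^2 / (2 * sigma^2),
# so D_0.5 = 0.25 here, and as alpha -> 1 the value approaches KL = 0.5.
d_half = renyi_divergence(p, q, 0.5, x)
d_near_kl = renyi_divergence(p, q, 0.999, x)
```

The α → 1 limit recovering KL is what lets a family of Rényi-based criteria contain a KL-based quantity as a special case, mirroring the abstract's point that the second family subsumes the Bayes factor.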


Permanent URL: