Module Information

Module Identifier: MA36010
Module Title: Comparative Statistical Inference
Academic Year: 2018/2019
Semester: Semester 2

Course Delivery

Lecture: 22 x 1 Hour Lectures

Assessment

Semester Exam: 2 Hours (Written Examination), 100%
Supplementary Exam: 2 Hours (Written Examination), 100%

Learning Outcomes

On completion of this module, a student should be able to:
1. construct and interpret Classical confidence intervals and tests of hypotheses for a population mean (Normal) and a probability parameter (a minimal Classical sketch follows this list);
2. set up a Bayesian analysis of the same situations;
3. interpret prior and posterior distributions for parameters and construct Bayesian confidence intervals;
4. explain the differences between Classical and Bayesian analyses;
5. extend the ideas to other distributional families.
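
The following is a minimal sketch of the Classical side of outcome 1, assuming an invented sample of eight observations, a hypothesised mean of 5.0 and a 95% level; it uses standard SciPy routines and is an illustration, not part of the module specification.

```python
import numpy as np
from scipy import stats

# Hypothetical sample from a Normal population (illustrative values only).
data = np.array([4.8, 5.2, 5.1, 4.9, 5.4, 5.0, 4.7, 5.3])

n = len(data)
mean = data.mean()
se = data.std(ddof=1) / np.sqrt(n)

# Classical 95% confidence interval for the population mean
# (t-interval, population variance unknown).
ci_low, ci_high = stats.t.interval(0.95, df=n - 1, loc=mean, scale=se)

# Classical two-sided test of H0: mu = 5.0.
t_stat, p_value = stats.ttest_1samp(data, popmean=5.0)

print(f"95% CI for the mean: ({ci_low:.3f}, {ci_high:.3f})")
print(f"t statistic = {t_stat:.3f}, p-value = {p_value:.3f}")
```

Both the interval and the test rely on the Normality assumption named in the outcome.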

Brief description

This module re-examines the ideas of likelihood, confidence intervals and hypothesis testing in Classical Inference and considers their interpretation more deeply. An alternative approach, Bayesian Inference, is introduced, in which prior information is modelled as a distribution and updated in the light of data using Bayes' Theorem. The concepts of prior, posterior, predictive and preposterior distributions are introduced. Applications to inference about a (Normal) population mean, a (Binomial) probability parameter and other distributional families are treated in detail, and the meanings and interpretations of the two approaches are discussed at length.
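
As a concrete illustration of the updating described above, the following minimal sketch assumes a Beta(2, 2) prior for a Binomial probability parameter and 7 successes observed in 20 trials, both invented for this example; it uses SciPy's beta and betabinom distributions (the latter, available in recent SciPy versions, gives the predictive distribution of future successes).

```python
from scipy import stats

# Invented prior and data: Beta(2, 2) prior for the probability parameter
# theta, and 7 successes observed in n = 20 Binomial trials.
a, b = 2.0, 2.0
x, n = 7, 20

# Conjugacy: the posterior is again a Beta distribution, Beta(a + x, b + n - x).
posterior = stats.beta(a + x, b + n - x)
print("posterior mean:", posterior.mean())
print("95% equal-tailed posterior interval:", posterior.interval(0.95))

# Predictive distribution for the number of successes in m future trials:
# Beta-Binomial with the updated parameters.
m = 10
predictive = stats.betabinom(m, a + x, b + n - x)
print("predictive mean for", m, "future trials:", predictive.mean())
```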

Aims

To introduce the basic ideas and concepts of statistical inference.

Content

1. CLASSICAL INFERENCE Basic ideas. Likelihood and maximum likelihood estimation. Point estimators, bias, mean squared error. Consistency, relative efficiency. The Cramér-Rao Theorem and the minimum variance bound. Efficiency. MVBUEs (minimum variance bound unbiased estimators) and their existence. Sufficiency. Properties of maximum likelihood estimators.
2. BAYESIAN INFERENCE Bayes' Theorem. Prior and posterior odds. Prior and posterior distributions. Conjugate families. Prior knowledge and prior ignorance. Quantification of knowledge. Predictive distributions. Preposterior distributions. Bayesian point estimation, loss functions.
3. CONFIDENCE STATEMENTS Classical: pivotal functions, confidence intervals. Bayesian: highest density intervals, predictive intervals. Interpretation of relative likelihood intervals.
4. HYPOTHESIS TESTING Classical: null and alternative hypotheses. Neyman-Pearson theory. Uniformly most powerful (UMP) tests.
5. OVERVIEW Comparisons between Classical and Bayesian approaches (a short comparative sketch follows this list).
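
The following is a minimal comparative sketch in the spirit of sections 1, 3 and 5, assuming 12 successes in 30 Binomial trials, a uniform Beta(1, 1) prior and a 95% level, all invented for illustration; the Classical interval shown is the large-sample Wald interval based on the maximum likelihood estimate, and the Bayesian interval is the equal-tailed posterior interval rather than the highest-density interval.

```python
import numpy as np
from scipy import stats

# Invented data: 12 successes in 30 Binomial trials.
x, n = 12, 30

# Classical: maximum likelihood estimate of p and the large-sample (Wald)
# 95% interval based on the asymptotic Normality of the MLE.
p_hat = x / n
se = np.sqrt(p_hat * (1 - p_hat) / n)
wald = (p_hat - 1.96 * se, p_hat + 1.96 * se)

# Bayesian: uniform Beta(1, 1) prior, so the posterior is Beta(1 + x, 1 + n - x);
# report the equal-tailed 95% posterior interval.
posterior = stats.beta(1 + x, 1 + n - x)
credible = posterior.interval(0.95)

print(f"MLE: {p_hat:.3f}, Wald 95% CI: ({wald[0]:.3f}, {wald[1]:.3f})")
print(f"posterior mean: {posterior.mean():.3f}, "
      f"95% credible interval: ({credible[0]:.3f}, {credible[1]:.3f})")
```

With a flat prior and a moderate sample the two intervals are numerically close, but their interpretations differ in exactly the way section 5 addresses: the Classical statement refers to repeated sampling, while the Bayesian statement is a direct probability statement about the parameter.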

Notes

This module is at CQFW Level 6