Statistics 1000-116bST
This course gives an introduction to classical statistics, covering both theoretical statistics and its applications to data analysis. The topics are:
1) Statistical Models: non-parametric, semi-parametric and parametric models; the empirical distribution; the Kolmogorov-Smirnov test.
2) Parameters and Sufficiency: Sufficient statistics, minimal sufficient statistics, complete statistics, factorisation theorem.
3) Exponential families and their parametrisations
4) Parameter Estimation: Minimum contrast, estimating equation method, maximum likelihood, method of moments, least squares. Kullback-Leibler divergence, maximum likelihood as a minimum contrast method.
5) The information inequality, linear predictors.
6) Complete Sufficiency and UMVU (Uniform Minimum Variance Unbiased) estimators.
7) Asymptotic results for estimators, consistency, the Delta method.
8) Confidence Intervals: Pivot method. Hypothesis Testing: Likelihood Ratio Test, Neyman-Pearson lemma, Monotone Likelihood Ratio, Karlin-Rubin theorem, p-values, confidence intervals by inverting a test statistic.
9) Gaussian Linear Models
10) Asymptotic Likelihood Ratio test, chi-squared tests, the Wald statistic, logistic regression.
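As a flavour of topic 1), the empirical distribution and the Kolmogorov-Smirnov statistic can be sketched as follows. This is an illustrative Python sketch rather than the R used in the laboratories, and the helper name `ks_statistic` is ours, not part of any course material:

```python
# Empirical distribution and the Kolmogorov-Smirnov statistic
# D_n = sup_x |F_n(x) - F(x)| against a continuous reference CDF F.
# For a sorted sample x_(1) <= ... <= x_(n), the supremum is attained
# at a sample point, so D_n = max_i max(i/n - F(x_i), F(x_i) - (i-1)/n).

def ks_statistic(sample, cdf):
    """One-sample KS statistic for a continuous reference CDF."""
    xs = sorted(sample)
    n = len(xs)
    return max(
        max(i / n - cdf(x), cdf(x) - (i - 1) / n)
        for i, x in enumerate(xs, start=1)
    )

# Test a small sample against the Uniform(0, 1) null distribution.
sample = [0.1, 0.2, 0.45, 0.6, 0.95]
d = ks_statistic(sample, lambda x: min(max(x, 0.0), 1.0))
print(round(d, 3))  # largest gap between empirical and reference CDF
```

The test itself then compares D_n against the quantiles of its null distribution, which is the subject of the lectures.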
There are also computer laboratories (15 hours) where the modelling techniques are applied using R.
Social Skills
The student should understand the principles of data analysis and should be able, using R, to carry out statistical tests, analyse data using Gaussian linear models, and use these models for prediction.
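The skills above can be illustrated with a minimal least-squares fit and prediction. This is a self-contained Python sketch of the underlying computation (the helper name `fit_least_squares` is ours); in the course itself this would be done in R with `lm` and `predict`:

```python
# Fit the simple Gaussian linear model y = a + b*x by least squares
# using the closed-form estimates, then use the fitted model to predict.

def fit_least_squares(xs, ys):
    """Closed-form least-squares estimates (a_hat, b_hat) for y = a + b*x."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return a, b

xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.1, 3.9, 6.1, 7.9]        # roughly y = 2x with noise
a, b = fit_least_squares(xs, ys)
print(round(a, 3), round(b, 3))  # fitted intercept and slope
print(round(a + b * 5.0, 3))     # prediction at x = 5
```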
Type of course
Prerequisites (description)
Course coordinators
Learning outcomes
Analyse data, construct statistical models, estimate parameters, use models for prediction with the R programming language, and present conclusions clearly.
Assessment criteria
1) A written examination
2) Tutorial participation
3) Laboratory work.
The final grade is determined by a combination of the grades from the components above.
Bibliography
[1] P. J. Bickel and K. A. Doksum, Mathematical Statistics: Basic Ideas and Selected Topics, Vol. 1, 2001.
[2] J. Noble, Lecture notes for the Statistics course (in English):
www.mimuw.edu.pl/~noble/courses/Statistics
Additional information
Information on the level of this course, the year of study and the semester in which it is delivered, and the types and number of class hours can be found in the course structure diagrams of the appropriate study programmes. This course is related to the following study programmes:
Additional information (registration calendar, class conductors, location and schedule of classes) may be available in the USOSweb system: