Syllabus for Probability and Statistics Comprehensive, Part 1
Linear Algebra:
Matrices and systems of linear equations. Vector spaces over
fields, subspaces, linear independence, basis, dimension. Determinants.
Linear transformations, associated matrices, change of basis, dimension
of vector spaces. Eigenvalues, eigenspaces, diagonalization, Jordan
canonical form. Inner
product spaces, bilinear, quadratic, and Hermitian forms. Adjoint,
orthogonal, and unitary operators. Diagonalization in Euclidean and
unitary spaces. The spectral theorem.
References: Schaum's Outlines: Linear Algebra: Chapters 1--13.
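For illustration only (not part of the syllabus), the diagonalization and spectral-theorem topics above can be checked numerically; a minimal sketch using numpy, with an arbitrary symmetric matrix:

```python
import numpy as np

# Spectral theorem in action: a real symmetric matrix is orthogonally
# diagonalizable, A = P D P^T with P orthogonal and D diagonal.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigvals, P = np.linalg.eigh(A)  # eigh is specialized for symmetric matrices
D = np.diag(eigvals)

# P is orthogonal, so its inverse is its transpose.
assert np.allclose(P @ P.T, np.eye(2))
# Reconstruct A from its spectral decomposition.
assert np.allclose(P @ D @ P.T, A)
```

Here the eigenvalues come out as 1 and 3, the roots of the characteristic polynomial of the chosen matrix.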
Basic Real Analysis:
Real numbers: Infimum and supremum, limits of sequences, monotone
sequences, Cauchy sequences.
Continuity: limits of functions, continuous functions, the intermediate
value theorem, maxima and minima, uniform continuity, monotone
functions, inverse functions.
Differentiation: the derivative, mean value theorem, l’Hospital’s rule,
Taylor expansion with remainder.
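As a concrete instance of expansion with remainder (an illustrative sketch; the expansion point and order are arbitrary choices):

```python
import math

# Second-order Taylor polynomial of exp at 0, with the Lagrange remainder
# bound |R_2(x)| <= e^{|x|} |x|^3 / 3!.
x = 0.1
taylor = 1 + x + x**2 / 2
remainder = math.exp(x) - taylor
bound = math.exp(abs(x)) * abs(x) ** 3 / math.factorial(3)

# The actual error is within the Lagrange bound.
assert abs(remainder) <= bound
```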
Integration: Riemann integrals, the fundamental theorem of calculus.
Sequences of functions: pointwise and uniform convergence, continuity
and convergence, interchange of limit with derivatives and integrals,
Arzelà-Ascoli theorem, Weierstrass and
Stone-Weierstrass approximation theorems. Differentiation of integrals.
Infinite series: series of numbers and functions, absolute convergence.
Elementary functions: rigorous introduction of the exponential,
logarithmic, trigonometric and inverse trigonometric functions.
Functions of several variables: the derivative as a linear map, the
inverse and implicit function theorems.
Vector calculus: multiple integrals, path and surface integrals, change
of variables theorem for integrals, calculation of areas, volumes and
arc-lengths, the integral theorems of vector analysis (Green’s,
Stokes’, and Gauss’ theorems).
Metric spaces: basic topology, compactness, connectedness.
References: “Vector Calculus”, Marsden and Tromba
“Principles of Mathematical Analysis”, Walter Rudin
“Elementary Classical Analysis”, J. Marsden and M. Hoffman
Complex Analysis:
Analytic functions, Cauchy-Riemann equations, entire functions, the
exponential, trigonometric, and logarithmic functions, Euler’s formula.
Line integrals, Cauchy’s theorem, Cauchy’s integral formula, power
series representation and consequences, uniqueness theorem, mean value
property, maximum modulus principle, open mapping theorem.
Morera’s theorem, Liouville’s theorem and applications, meromorphic
functions, Laurent expansions, residue theorem and applications,
fractional linear transformations.
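The residue theorem admits a quick numerical sanity check; a sketch (midpoint-rule discretization of the unit circle, with an arbitrary grid size):

```python
import cmath
import math

# Residue theorem check: f(z) = 1/z has residue 1 at 0, so its integral
# over the unit circle equals 2*pi*i.
N = 100_000
integral = 0j
for k in range(N):
    t0 = 2 * math.pi * k / N
    t1 = 2 * math.pi * (k + 1) / N
    z0, z1 = cmath.exp(1j * t0), cmath.exp(1j * t1)
    zm = cmath.exp(1j * (t0 + t1) / 2)  # midpoint of the arc segment
    integral += (z1 - z0) / zm

# The discretization error for this periodic integrand is tiny.
assert abs(integral - 2j * math.pi) < 1e-6
```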
References: “Function Theory of One Complex Variable”, R. Greene and S. Krantz.
Probability:
Basic concepts: random variables, mass and density functions,
combinatorial analysis, conditional probability, Bayes' formula.
Expectation: moments, generating functions (moment, probability,
factorial), characteristic function, Markov's inequality, Chebyshev's
inequality, conditional expectation, independence, correlation.
Special distributions and their generating functions: binomial,
negative binomial, Poisson, hypergeometric, multinomial, gamma, sums of
independent gamma variables, beta, relationships between gamma and
beta, normal, linear combinations of normal variables, exponential,
Cauchy, Rayleigh, Weibull, extreme value.
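Relationships among the special distributions above can be verified directly; for instance, a sketch of the classical Poisson approximation to the binomial (the values of n and p are arbitrary illustrative choices):

```python
import math

# Binomial(n, p) pmf and its Poisson(lambda = n*p) approximation,
# accurate when n is large and p is small.
def binom_pmf(k, n, p):
    return math.comb(n, k) * p**k * (1 - p) ** (n - k)

def poisson_pmf(k, lam):
    return math.exp(-lam) * lam**k / math.factorial(k)

n, p = 1000, 0.003  # lambda = n*p = 3
for k in range(10):
    # Pointwise agreement to within about n*p^2.
    assert abs(binom_pmf(k, n, p) - poisson_pmf(k, n * p)) < 1e-3
```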
Functions of random variables and transformations.
Limit theorems: types of convergence (almost sure, in probability, in
distribution, Lp), continuity theorem, central limit
theorem, law of large numbers.
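The law of large numbers and the central limit theorem are easy to see by simulation; a sketch with uniform draws (sample sizes and tolerances are arbitrary):

```python
import random
import statistics

# Means of n Uniform(0,1) draws: LLN says they concentrate near 1/2;
# CLT says their spread is approximately sigma/sqrt(n), sigma^2 = 1/12.
random.seed(0)
n, trials = 400, 2000
means = [statistics.fmean(random.random() for _ in range(n))
         for _ in range(trials)]

grand_mean = statistics.fmean(means)
assert abs(grand_mean - 0.5) < 0.01           # LLN: close to the true mean

sd = statistics.stdev(means)
assert abs(sd - (1 / 12) ** 0.5 / n**0.5) < 0.005  # CLT scale sigma/sqrt(n)
```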
Sampling distributions: independent random sampling, Chi-square, t and
F distributions, order statistics, independence between X̄ and S²,
noncentral Chi-square, t and F.
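One sampling-distribution fact above can be checked by Monte Carlo; a sketch (the normal model, sample size, and tolerance are arbitrary choices):

```python
import random
import statistics

# For X_1, ..., X_n iid N(0, 1), the statistic (n - 1) * S^2 follows a
# Chi-square distribution with n - 1 degrees of freedom, hence mean n - 1.
random.seed(2)
n, trials = 10, 5000
stats = [
    (n - 1) * statistics.variance([random.gauss(0, 1) for _ in range(n)])
    for _ in range(trials)
]
mean_stat = statistics.fmean(stats)
# Monte Carlo mean should be near the Chi-square(n-1) mean of n - 1 = 9.
assert abs(mean_stat - (n - 1)) < 0.3
```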
References: “A First Course in Probability” by Sheldon Ross, 6th edition,
Prentice Hall, Upper Saddle River, NJ, 2002.
Statistics:
Multivariate normal distribution: properties, moment generating
function, marginal and conditional densities, Cochran's theorem.
Basic concepts: sufficiency, completeness and ancillarity.
Point estimation: Fisher's information, Cramer-Rao lower bound, minimum
variance unbiased estimation, Rao-Blackwell theorem.
Methods of estimation: moment, least-squares, likelihood function and
maximum likelihood. Interval estimation: pivotal quantities.
Asymptotic inference: large-sample properties of MLEs, likelihood ratio
statistics.
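A worked instance of maximum likelihood and its large-sample behavior (a sketch; the exponential model, true rate, and sample size are arbitrary illustrative choices):

```python
import random
import statistics

# For an iid exponential(rate) sample, setting the score to zero gives the
# closed-form MLE: rate_hat = 1 / sample mean. Consistency (a large-sample
# property of MLEs) means it lands near the true rate for large n.
random.seed(1)
true_rate = 2.0
sample = [random.expovariate(true_rate) for _ in range(20000)]
rate_hat = 1 / statistics.fmean(sample)

assert abs(rate_hat - true_rate) < 0.1
```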
Hypothesis testing: Neyman-Pearson fundamental lemma, uniformly most
powerful tests, likelihood ratio test.
Nonparametric inference: Kolmogorov-Smirnov, Pearson chi-square,
contingency tables, Wilcoxon and permutation tests.
Bayesian inference: prior and posterior distributions, Bayesian
intervals, improper priors.
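The prior-to-posterior step has a closed form in conjugate families; a sketch with a Beta prior on a Bernoulli success probability (the counts are made up for illustration):

```python
# A Beta(a, b) prior on a Bernoulli parameter p is conjugate: observing
# s successes and f failures yields the posterior Beta(a + s, b + f).
a, b = 1.0, 1.0   # Beta(1, 1), the uniform prior
s, f = 7, 3       # hypothetical data
post_a, post_b = a + s, b + f

# Posterior mean of p is (a + s) / (a + s + b + f) = 8/12.
post_mean = post_a / (post_a + post_b)
assert abs(post_mean - 8 / 12) < 1e-12
```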
References: “Introduction to Mathematical Statistics” by Robert V. Hogg and
Allen T. Craig, 5th edition, Macmillan, New York, 1995.
“Statistical Inference” by S.D. Silvey, Chapman & Hall, London, 1991.