Syllabus for Probability and Statistics Qualifying Exam
Matrices and systems of linear equations. Vector spaces over general fields, subspaces, linear independence, basis, dimension. Determinants. Linear transformations, associated matrices, change of basis, dimension formula. Dual vector spaces. Eigenvalues, eigenspaces, diagonalization, Jordan canonical form. Inner product spaces; bilinear, quadratic and Hermitian forms. Adjoint, self-adjoint, orthogonal and unitary operators. Diagonalization in Euclidean and unitary spaces. The spectral theorem.
References: Schaum's Outlines: “Linear Algebra”, Chapters 1–13.
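The diagonalization and spectral theorem topics above can be checked numerically; a minimal sketch (the 2×2 symmetric matrix here is an arbitrary example):

```python
import numpy as np

# A real symmetric matrix; by the spectral theorem it is orthogonally
# diagonalizable: A = Q diag(w) Q^T with Q orthogonal.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

w, Q = np.linalg.eigh(A)   # eigenvalues in ascending order, orthonormal eigenvectors

# Check both claims of the theorem: the diagonalization and the
# orthogonality of the eigenvector matrix.
assert np.allclose(Q @ np.diag(w) @ Q.T, A)   # A = Q diag(w) Q^T
assert np.allclose(Q.T @ Q, np.eye(2))        # Q^T Q = I
# The characteristic polynomial (2-t)^2 - 1 gives eigenvalues 1 and 3.
assert np.allclose(w, [1.0, 3.0])
```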
Real numbers: Infimum and supremum, limits of sequences, monotone sequences, Cauchy sequences. Continuity: limits of functions, continuous functions, the intermediate value theorem, maxima and minima, uniform continuity, monotone functions, inverse functions. Differentiation: the derivative, mean value theorem, l’Hospital’s rule, Taylor’s expansion with remainder. Integration: Riemann integrals, the fundamental theorem of calculus, improper integrals. Sequences of functions: pointwise and uniform convergence, continuity and convergence, interchange of limit with derivatives and integrals, Arzela-Ascoli theorem, Weierstrass and Stone-Weierstrass approximation theorems. Differentiation of integrals with parameters. Infinite series: series of numbers and functions, absolute convergence, power series. Elementary functions: rigorous introduction of the exponential, logarithmic, trigonometric and inverse trigonometric functions. Functions of several variables: the derivative as a linear transformation, Taylor’s theorem, the inverse and implicit function theorems. Vector calculus: multiple integrals, path and surface integrals, change of variables theorem for integrals, calculation of areas, volumes and arc-lengths, the integral theorems of vector analysis (Green’s, Stokes’, and Gauss’ theorems). Metric spaces: basic topology, compactness, connectedness, completeness.
References: “Vector Calculus”, Marsden and Tromba
“Principles of Mathematical Analysis”, Walter Rudin
“Elementary Classical Analysis”, J. Marsden and M. Hoffman
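As a small illustration of the Riemann integral topic above, a left Riemann sum converging to the exact value of an integral (the function x² on [0, 1], with exact integral 1/3, is chosen here purely as an example):

```python
# Left Riemann sum of f over [a, b] with n equal subintervals; as n grows
# this converges to the Riemann integral of f for any Riemann-integrable f.
def riemann_sum(f, a, b, n):
    h = (b - a) / n
    return sum(f(a + i * h) for i in range(n)) * h

exact = 1.0 / 3.0                                   # integral of x^2 over [0, 1]
approx = riemann_sum(lambda x: x * x, 0.0, 1.0, 100000)
assert abs(approx - exact) < 1e-4                   # error is O(1/n) here
```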
Analytic functions, Cauchy-Riemann equations, entire functions, the exponential, trigonometric, and logarithmic functions, Euler’s formula. Line integrals, Cauchy’s theorem, Cauchy’s integral formula, power series representation and consequences, uniqueness theorem, mean value theorem, maximum modulus principle, open mapping theorem. Morera’s theorem, Liouville’s theorem and applications, meromorphic functions, Laurent expansions, residue theorem and applications, fractional linear transformations.
References: “Function Theory of One Complex Variable”, R. Greene and S. Krantz.
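The residue theorem above can also be verified numerically; a sketch, with the pole location 0.5 chosen arbitrarily inside the unit circle:

```python
import numpy as np

# Residue theorem check: f(z) = 1/(z - 0.5) has a simple pole at z = 0.5
# (inside the unit circle) with residue 1, so the counterclockwise contour
# integral of f over the unit circle equals 2*pi*i.
n = 4096
theta = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
z = np.exp(1j * theta)               # unit circle, counterclockwise
dz = 1j * z * (2.0 * np.pi / n)      # dz = i z dtheta (rectangle rule)
integral = np.sum(dz / (z - 0.5))
assert np.allclose(integral, 2j * np.pi)
```

The rectangle rule is spectrally accurate for smooth periodic integrands, so a few thousand sample points already reproduce 2πi to machine precision.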
Basic concepts: random variables, mass and density functions, combinatorial analysis, conditional probability, Bayes' formula, independence.
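Bayes' formula above can be made concrete with the standard false-positive computation; the prevalence, sensitivity, and specificity figures below are hypothetical numbers chosen for illustration:

```python
# Bayes' formula: P(D | +) = P(+ | D) P(D) / P(+), where by the law of
# total probability P(+) = P(+ | D) P(D) + P(+ | not D) P(not D).
# Hypothetical numbers: 1% prevalence, 95% sensitivity, 90% specificity.
p_d = 0.01
p_pos_given_d = 0.95
p_pos_given_not_d = 0.10     # 1 - specificity

p_pos = p_pos_given_d * p_d + p_pos_given_not_d * (1 - p_d)
p_d_given_pos = p_pos_given_d * p_d / p_pos
print(round(p_d_given_pos, 3))   # → 0.088
```

Even with an accurate test, most positives are false positives at low prevalence, which is why conditioning on the correct event matters.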
Expectation: moments, generating functions (moment, probability, factorial), characteristic function, Markov's inequality, Chebyshev's inequality, conditional expectation, independence, correlation.
Special distributions and their generating functions: binomial, negative binomial, Poisson, hypergeometric, multinomial, gamma, sums of independent gamma variables, beta, relationships between gamma and beta, normal, linear combinations of normal variables, exponential, Cauchy, Rayleigh, Weibull, extreme value.
Functions of random variables and transformations.
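A standard example of the transformation topic above: if U is Uniform(0, 1), then X = −log(U) is Exponential(1) (the inverse-CDF method). A simulation sketch, with the sample size chosen arbitrarily:

```python
import random, math, statistics

# Transformation of a random variable: if U ~ Uniform(0, 1), then
# X = -log(U) has the Exponential(1) distribution (inverse-CDF method).
random.seed(2)
# Using 1 - random.random() keeps the argument of log in (0, 1].
xs = [-math.log(1.0 - random.random()) for _ in range(100000)]

# Exponential(1) has mean 1 and variance 1; the sample moments agree.
assert abs(statistics.mean(xs) - 1.0) < 0.05
assert abs(statistics.pvariance(xs) - 1.0) < 0.1
```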
Limit theorems: types of convergence (almost sure, in probability, in distribution, Lp), continuity theorem, central limit theorem, law of large numbers.
References: “A First Course in Probability” by Sheldon Ross, 6th edition, Prentice-Hall, Upper Saddle River, 2002.
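The central limit theorem listed above can be illustrated by simulation: standardized means of i.i.d. Uniform(0, 1) draws (mean 1/2, variance 1/12) are approximately standard normal. A sketch, with sample size and replication count chosen arbitrarily:

```python
import random, math, statistics

# CLT by simulation: for large n, (Xbar - mu) / (sigma / sqrt(n)) is
# approximately N(0, 1) when the X_i are i.i.d. with mean mu, variance sigma^2.
random.seed(0)
n, reps = 400, 2500
mu, sigma = 0.5, math.sqrt(1.0 / 12.0)   # Uniform(0, 1) moments

zs = []
for _ in range(reps):
    xbar = sum(random.random() for _ in range(n)) / n
    zs.append((xbar - mu) / (sigma / math.sqrt(n)))

# The standardized means should have sample mean near 0 and variance near 1.
assert abs(statistics.mean(zs)) < 0.1
assert abs(statistics.pvariance(zs) - 1.0) < 0.2
```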
Sampling distributions: independent random sampling, Chi-square, t and F distributions, order statistics, independence of the sample mean X̄ and sample variance S², noncentral Chi-square, t and F.
Multivariate normal distribution: properties, moment generating function, marginal and conditional densities, Cochran's theorem.
Basic concepts: sufficiency, completeness and ancillarity.
Point estimation: Fisher's information, Cramer-Rao lower bound, minimum variance unbiased estimation, Rao-Blackwell theorem.
Methods of estimation: moment, least-squares, likelihood function and maximum likelihood. Interval estimation: pivotal quantities, conditional inference.
Asymptotic inference: large-sample properties of MLEs, likelihood ratio statistic.
Hypothesis testing: Neyman-Pearson fundamental lemma, uniformly most powerful tests, likelihood ratio test.
Nonparametric inference: Kolmogorov-Smirnov, Pearson chi-square, contingency tables, Wilcoxon and permutation tests.
Bayesian inference: prior and posterior distributions, Bayesian intervals, improper priors.
References: “Introduction to Mathematical Statistics” by Robert V. Hogg and Allen T. Craig, 5th edition, Macmillan, New York, 1995.
“Statistical Inference” by S.D. Silvey, Chapman & Hall, London, 1991.
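The maximum likelihood and asymptotic inference topics above can be sketched with the Exponential(λ) model, where the log-likelihood n log λ − λ Σxᵢ is maximized in closed form at λ̂ = 1/x̄; the true rate 2.0 and sample size below are arbitrary illustration values:

```python
import random

# Maximum likelihood for an Exponential(rate lambda) sample: the MLE is
# lam_hat = 1 / xbar, obtained by setting d/dlam [n log(lam) - lam * sum(x)] = 0.
random.seed(1)
true_lam = 2.0
data = [random.expovariate(true_lam) for _ in range(10000)]

xbar = sum(data) / len(data)
lam_hat = 1.0 / xbar            # closed-form MLE

# Consistency, a large-sample property of MLEs: lam_hat is near true_lam
# (its asymptotic standard error is lam / sqrt(n), about 0.02 here).
assert abs(lam_hat - true_lam) < 0.1
```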