\appendix
\chapter{Algebra and calculus basics}

%\section{Exponentials}
%Exponential curves:
\section{Exponentials and logarithms}

Exponentials are written as $e^x$ or $\exp(x)$, where $e=2.718\ldots$.
By definition $\exp(-\infty)=0$, $\exp(0)=1$, $\exp(1)=e$, and
$\exp(\infty)=\infty$. In \R, $e^x$ is \code{exp(x)}; if you want the
value of $e$ use \code{exp(1)}.

Logarithms are the solutions to exponential or power equations like
$y=e^x$ or $y=10^x$. \emph{Natural} logs, $\ln$ or $\log_e$, are
logarithms base $e$; \emph{common} logs, $\log_{10}$, are typically
logarithms base 10. When you see just ``$\log$'' it's usually in a
context where the difference doesn't matter (although in \R\
$\log_{10}$ is \code{log10} and $\log_e$ is \code{log}).

<<>>=
curve(log(x),from=0,to=10)
abline(h=0,lty=2)
abline(v=1,lty=2)
@

\begin{enumerate}
\item{$\log(1)=0$. If $x>1$ then $\log(x)>0$, and vice versa.
$\log(0) = -\infty$ (more or less); logarithms are undefined for
$x<0$.}
\item{Logarithms convert products to sums:
$\log(ab) = \log(a)+\log(b)$.}
\item{Logarithms convert powers to multiplication:
$\log(a^n) = n \log(a)$.}
\item{You can't do anything with $\log(a+b)$.}
\item{Converting bases: $\log_x(a) = \log_y(a)/\log_y(x)$. In
particular, $\log_{10}(a) = \log_e(a)/\log_e(10) \approx
\log_e(a)/2.3$ and $\log_e(a) = \log_{10}(a)/\log_{10}(e) \approx
\log_{10}(a)/0.434$. This means that converting between log bases
just means multiplying or dividing by a constant. Here's the proof:
\begin{eqnarray*}
y & = & \log_{10}(x) \\
10^y & = & x \\
\log_e(10^y) & = & \log_e(x) \\
y \log_e(10) & = & \log_e(x) \\
y & = & \log_e(x)/\log_e(10)
\end{eqnarray*}
(compare the first and last lines).}
\item{The derivative of the logarithm, $d(\log x)/dx$, equals $1/x$.
This is always positive for $x>0$ (which are the only values for
which the logarithm is defined anyway).}
\item{The fact that $d(\log x)/dx>0$ means the function is
\emph{monotonic} (always either increasing or decreasing), which
means that if $x>y$ then $\log(x)>\log(y)$ and if $x<y$ then
$\log(x)<\log(y)$.}
\end{enumerate}

\section{Factorials and the gamma function}

The factorial of a non-negative integer $n$, written $n!$
(\code{factorial} in \R), is $n \times (n-1) \times \cdots \times 2
\times 1$ (with $0!=1$). The \emph{gamma function}, $\Gamma(x)$
(\code{gamma} in \R), generalizes the factorial to non-integer
values: for positive integers, $\Gamma(n) = (n-1)!$, so the gray
horizontal lines in the figure below ($1$, $2$, $6$, $24$, $120$)
show where $\Gamma$ passes through the factorials.

<<>>=
curve(gamma(x),from=0.1,to=5,ylab="")
mtext(expression(Gamma(x)),side=2,at=12,line=3)
abline(h=c(1,2,6,24,120),col="gray")
@

Factorials and gamma functions get very large, and you often have to
compute ratios of factorials or gamma functions (e.g. the binomial
coefficient, $N!/(k! (N-k)!)$). Numerically, it is more efficient and
accurate to compute the logarithms of the factorials first, add and
subtract them, and then exponentiate the result:
$\exp(\log N! - \log k! - \log(N-k)!)$. \R\ provides the
log-factorial (\code{lfactorial}) and log-gamma (\code{lgamma})
functions for this purpose. (Actually, \R\ also provides
\code{choose} and \code{lchoose} for the binomial coefficient and the
log-binomial coefficient, but the log-gamma is more generally
useful.)

About the only reason that the gamma function (as opposed to
factorials) ever comes up in ecology is that it is the
\emph{normalizing constant} (see ch.~4) for the gamma
\emph{distribution}, which is usually denoted as Gamma (not
$\Gamma$): $\mbox{Gamma}(x,a,s) = \frac{1}{s^a \Gamma(a)} x^{a-1}
e^{-x/s}$.

\section{Probability}

\begin{enumerate}
\item{Probability distributions always add or integrate to 1 over
all possible values.}
\item{Probabilities of independent events are multiplied:
$p(A \mbox{ and } B) = p(A) p(B)$.}
\item{The \emph{binomial coefficient},
\begin{equation}
{N \choose k} = \frac{N!}{k! (N-k)!},
\end{equation}
is the number of different ways of choosing $k$ objects out of a set
of $N$, without regard to order. $!$ denotes a factorial:
$n! = n \times (n-1) \times \cdots \times 2 \times 1$. (Proof: think
about picking $k$ objects out of $N$, without replacement but
keeping track of order.
The number of different ways to pick the first object is $N$. The
number of different ways to pick the second object is $N-1$, the
third $N-2$, and so forth, so the total number of choices is
$N \times (N-1) \times \cdots \times (N-k+1) = N!/(N-k)!$. The
number of possible orders for this set (permutations) is $k!$ by the
same argument ($k$ choices for the first element, $k-1$ for the next
\ldots). Since we don't care about the order, we divide the number
of ordered ways ($N!/(N-k)!$) by the number of possible orders
($k!$) to get the binomial coefficient.)
}
\end{enumerate}

\section{The delta method: formula and derivation}

The formula for the delta method of approximating variances is:
\begin{equation}
\mbox{Var}(f(x,y)) \approx
\left(\frac{\partial f}{\partial x} \right)^2 \mbox{Var}(x) +
\left(\frac{\partial f}{\partial y} \right)^2 \mbox{Var}(y) +
2 \left(\frac{\partial f}{\partial x}\frac{\partial f}{\partial y}
\right) \mbox{Cov}(x,y)
\end{equation}
\cite{Lyons1991} gives a very readable alternative description of
the delta method; \cite{Oehlert1992} gives a short technical
description of the formal assumptions necessary for the delta method
to apply.

This formula is exact in a bunch of simple cases:
\begin{itemize}
\item{Multiplying by a constant:
$\mbox{Var}(ax) = a^2 \mbox{Var}(x)$}
\item{Sum or difference of independent variables:
$\mbox{Var}(x\pm y) = \mbox{Var}(x)+ \mbox{Var}(y)$}
\item{Product or ratio of independent variables:
$\mbox{Var}(x \cdot y) = y^2 \mbox{Var}(x)+x^2 \mbox{Var}(y)
= x^2 y^2 \left( \frac{\mbox{Var}(x)}{x^2} +
\frac{\mbox{Var}(y)}{y^2}\right)$: this also implies that
$(\mbox{CV}(x \cdot y))^2 = (\mbox{CV}(x))^2 +(\mbox{CV}(y))^2$}
\item{The formula is exact for linear functions of normal or
multivariate normal variables.}
\end{itemize}
You can also extend the formula to more than two variables if you
like.
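As a quick numerical check of the ratio case, we can simulate
independent normal $x$ and $y$ (the means and standard deviations
below are arbitrary illustrative values) and compare the simulated
variance of $x/y$ with the delta-method prediction, using the
derivatives $\partial(x/y)/\partial x = 1/y$ and
$\partial(x/y)/\partial y = -x/y^2$ evaluated at the means:

<<>>=
## delta-method check for f(x,y)=x/y with independent normal x and y
set.seed(101)
xbar <- 10; xsd <- 0.5   ## arbitrary illustrative values
ybar <- 5;  ysd <- 0.2
x <- rnorm(1e5, mean=xbar, sd=xsd)
y <- rnorm(1e5, mean=ybar, sd=ysd)
## delta method: (df/dx)^2 Var(x) + (df/dy)^2 Var(y); Cov(x,y)=0 here
delta.var <- (1/ybar)^2*xsd^2 + (xbar/ybar^2)^2*ysd^2
sim.var <- var(x/y)
c(delta=delta.var, simulated=sim.var)  ## both close to 0.0164
@

Because the coefficients of variation are small (5\% and 4\%), the
linear (first-order) approximation is very accurate here; the
agreement degrades as the variation in the denominator grows.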
Derivation: use the (multivariable) Taylor expansion of $f(x,y)$
including \emph{linear terms only}:
$$
f(x,y) \approx f(\bar x,\bar y) +
\frac{\partial f}{\partial x}(x-\bar x) +
\frac{\partial f}{\partial y}(y-\bar y)
$$
where the derivatives are evaluated at $(\bar x,\bar y)$.
Substitute this into the formula for the variance of $f(x,y)$:
\begin{eqnarray}
\mbox{Var}(f(x,y)) & = &
\int P(x,y) (f(x,y) - f(\bar x,\bar y))^2 \, dx \, dy \\
& = & \int P(x,y) \left(f(\bar x,\bar y) +
\frac{\partial f}{\partial x}(x-\bar x) +
\frac{\partial f}{\partial y}(y-\bar y)
- f(\bar x,\bar y) \right)^2 \, dx \, dy \\
& = & \int P(x,y) \left(\frac{\partial f}{\partial x}(x-\bar x) +
\frac{\partial f}{\partial y}(y-\bar y)\right)^2 \, dx \, dy \\
& = & \int P(x,y) \left(
\left(\frac{\partial f}{\partial x}\right)^2 (x-\bar x)^2 +
\left(\frac{\partial f}{\partial y}\right)^2(y-\bar y)^2 +
2 \frac{\partial f}{\partial x} \frac{\partial f}{\partial y}
(x-\bar x) (y - \bar y) \right) \, dx \, dy \nonumber \\
& & \\
& = & \int P(x,y)
\left(\frac{\partial f}{\partial x}\right)^2 (x-\bar x)^2
\, dx \, dy \nonumber \\
& & \quad \mbox{} + \int P(x,y)
\left(\frac{\partial f}{\partial y}\right)^2 (y-\bar y)^2
\, dx \, dy \nonumber \\
& & \quad \mbox{} + \int P(x,y) \,
2 \frac{\partial f}{\partial x} \frac{\partial f}{\partial y}
(x-\bar x) (y - \bar y) \, dx \, dy \\
& = & \left(\frac{\partial f}{\partial x}\right)^2
\int P(x,y) (x-\bar x)^2 \, dx \, dy \nonumber \\
& & \quad \mbox{} + \left(\frac{\partial f}{\partial y}\right)^2
\int P(x,y) (y-\bar y)^2 \, dx \, dy \nonumber \\
& & \quad \mbox{} + 2 \frac{\partial f}{\partial x}
\frac{\partial f}{\partial y}
\int P(x,y) (x-\bar x) (y - \bar y) \, dx \, dy \\
& = & \left(\frac{\partial f}{\partial x}\right)^2 \mbox{Var}(x)
+\left(\frac{\partial f}{\partial y}\right)^2 \mbox{Var}(y)
+ 2 \frac{\partial f}{\partial x} \frac{\partial f}{\partial y}
\mbox{Cov}(x,y)
\end{eqnarray}

\section{Linear algebra basics}