$\newcommand{\v}{\vec}\newcommand{\h}{\hat}\newcommand{\hv}[1]{\hat{\vec{#1}}}\newcommand{\col}{\mathop{\rm col}}$

Exercise set 1

  1. Do all the exercises at the end of chapter 2 of the textbook.
  2. Solve the following approximate system using R (one way to set it up is sketched after this list): $$\begin{eqnarray*} 3a + 4b + c & \approx & 3.4\\ 3a + 4b + c & \approx & 3.5\\ 4a + 3b + 2c & \approx & 10.1\\ 4a + 3b + 2c & \approx & 9.8\\ 6a + 5b + 2c & \approx & 5.6 \end{eqnarray*}$$
  3. Let's see how R tackles a linear model whose design matrix is not of full column rank: $$ X = \left[\begin{array}{ccc} 1 & 1 & 0\\ 1 & 1 & 0\\ 1 & 0 & 1\\ 1 & 0 & 1\\ \end{array}\right],\quad \v y = \left[\begin{array}{c}3.4\\3.5\\10.5\\10.3 \end{array}\right]. $$ Here the first column of $X$ is a column of $1$'s, so you may type only the last two columns in R and omit the -1; R then supplies the column of $1$'s as the intercept. (See the sketch after this list.)
  4. In the problem above R produced one least squares solution. But we know that there are infinitely many. Write down two more solutions. Can you write a general form for all least squares solutions here?
  5. R automatically stores various quantities computed by lm. We shall explore some of them here. Let's work with the linear model from the last exercise. Create the full design matrix (including its first column) and type:
    myfit = lm(y~X-1)
    
    The variable myfit now contains a lot of information about the fit. You may extract the computed least squares solution $\hv \beta $ as
    myfit$coef
    
    This may be used in future computations. Compute $\hv y = X\hv \beta.$ Remember that %*% is the R notation for matrix multiplication. (If lm reported an NA coefficient, replace it by 0 before multiplying.) This $\hv y$ is the foot of the perpendicular dropped from $\v y$ onto $\col(X).$ Usually $\hv y$ is called the fitted vector. R has already computed it (a consolidated sketch of these steps appears after this list):
    myfit$fitted
    
    The vector $\v y - \hv y$ is called the residual vector:
    myfit$resid
    
    There are many other pieces of information packed in myfit:
    names(myfit)
    
  6. Consider a linear model $\v y = X \v\beta +\v\epsilon,$ where $X$ is not of full column rank. Pick any basis of $\col(X).$ Stack these vectors side by side as columns to get a matrix $B.$ Let $\v w = B(B'B) ^{-1} B' \v y.$ Show that $\v w = \hv y$ irrespective of the choice of $B.$ (A numerical check is sketched after this list.)
  7. Consider a linear model with design matrix $$ X = \left[\begin{array}{ccc} 1 & 1 & 0\\ 1 & 1 & 0\\ 1 & 0 & 1\\ 1 & 0 & 1\\ \end{array}\right]. $$ If $\v \beta = (\beta_1, \beta_2, \beta_3)',$ then show that whatever least squares solution $\hv \beta $ you take, $\h \beta_2-\h \beta_3$ is always the same. Characterise all vectors $\v \ell\in{\mathbb R}^3$ such that $\v \ell' \hv \beta$ does not depend on the choice of the least squares solution. (A numerical illustration is given after this list.)
  8. Generalise the characterisation from the last problem to an arbitrary design matrix.
  9. Redo the above problem with the extra condition: $\beta_0-\alpha_0 = (\beta_1-\alpha_1) x_0.$
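
The sketches below show possible ways to set up some of these exercises in R; they are illustrations under my own choice of names, not the official solutions. For Exercise 2, enter the coefficients as a matrix (called A here) and the right-hand sides as a vector (called rhs here); lm with -1 then returns the least squares values of a, b and c.

    # Exercise 2 (sketch): least squares solution of the approximate system.
    # A and rhs are my own names for the coefficient matrix and the right-hand side.
    A = matrix(c(3, 4, 1,
                 3, 4, 1,
                 4, 3, 2,
                 4, 3, 2,
                 6, 5, 2), ncol = 3, byrow = TRUE)
    rhs = c(3.4, 3.5, 10.1, 9.8, 5.6)
    lm(rhs ~ A - 1)$coef   # -1 drops the intercept, since the system has no constant term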
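
For Exercise 3, a sketch assuming you type only the last two columns of $X$ (called X23 here, my own name) and omit the -1, so that R adds the column of $1$'s as the intercept.

    # Exercise 3 (sketch): a design matrix that is not of full column rank.
    X23 = matrix(c(1, 0,
                   1, 0,
                   0, 1,
                   0, 1), ncol = 2, byrow = TRUE)   # last two columns of X
    y = c(3.4, 3.5, 10.5, 10.3)
    lm(y ~ X23)   # R reports NA for one coefficient because X is rank deficient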
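
For Exercise 5, a consolidated sketch of the steps described above, now with the full design matrix and the -1. The names betahat and yhat are mine; replacing the aliased (NA) coefficient by 0 before forming $X\hv\beta$ is one way to reproduce the fitted vector.

    # Exercise 5 (sketch): recomputing the fitted and residual vectors by hand.
    X = matrix(c(1, 1, 0,
                 1, 1, 0,
                 1, 0, 1,
                 1, 0, 1), ncol = 3, byrow = TRUE)
    y = c(3.4, 3.5, 10.5, 10.3)
    myfit = lm(y ~ X - 1)
    betahat = myfit$coef
    betahat[is.na(betahat)] = 0   # treat the aliased (NA) coefficient as 0
    yhat = X %*% betahat          # foot of the perpendicular from y onto col(X)
    cbind(yhat, myfit$fitted)     # the two columns should agree
    y - yhat                      # the residual vector, same as myfit$resid
    names(myfit)                  # everything else packed in myfit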
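
For Exercise 6, a numerical check (not a proof) that $\v w$ does not depend on the basis, using two different bases of $\col(X)$ for the design matrix of Exercise 3. The function name proj is my own.

    # Exercise 6 (sketch): the same projection from two different bases of col(X).
    y  = c(3.4, 3.5, 10.5, 10.3)
    B1 = matrix(c(1, 1,
                  1, 1,
                  1, 0,
                  1, 0), ncol = 2, byrow = TRUE)   # basis: columns 1 and 2 of X
    B2 = matrix(c(1, 0,
                  1, 0,
                  0, 1,
                  0, 1), ncol = 2, byrow = TRUE)   # basis: columns 2 and 3 of X
    proj = function(B, y) B %*% solve(t(B) %*% B) %*% t(B) %*% y
    cbind(proj(B1, y), proj(B2, y))   # both columns equal the fitted vector from Exercise 5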
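
For Exercise 7, a numerical illustration (again not a proof): the vector $(1,-1,-1)'$ spans the null space of $X,$ so adding any multiple of it to one least squares solution gives another, and $\h \beta_2-\h \beta_3$ is unchanged.

    # Exercise 7 (sketch): beta2 - beta3 agrees across two least squares solutions.
    X = matrix(c(1, 1, 0,
                 1, 1, 0,
                 1, 0, 1,
                 1, 0, 1), ncol = 3, byrow = TRUE)
    y = c(3.4, 3.5, 10.5, 10.3)
    b1 = coef(lm(y ~ X - 1))
    b1[is.na(b1)] = 0                 # one least squares solution
    b2 = b1 + 5 * c(1, -1, -1)        # another one, since X %*% c(1, -1, -1) = 0
    c(b1[2] - b1[3], b2[2] - b2[3])   # the two differences agree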
