Problem 1. Consider the simple linear regression model in which \( Y_{i}=\beta_{0}+\beta_{1} x_{i}+\varepsilon_{i} \), under our standard assumptions that the \( \varepsilon_{i} \) are i.i.d. \( N\left(0, \sigma^{2}\right) \), \( 1 \leq i \leq n \).

a) Show that this can be expressed in matrix notation as \[ \mathbf{Y}=\mathbf{X} \boldsymbol{\beta}+\boldsymbol{\varepsilon} \] where \( \boldsymbol{\varepsilon} \) is the column vector of \( \varepsilon_{i} \)'s, \( \boldsymbol{\beta} \) is the vector of coefficients, \( \mathbf{X} \) is the design matrix (with \( n \) rows and 2 columns), and \( \mathbf{Y} \) is the vector of response variables.

b) Assuming that \( \mathbf{X} \) is of full column rank, compute \[ \mathbf{b}=\left(\mathbf{X}^{\top} \mathbf{X}\right)^{-1} \mathbf{X}^{\top} \mathbf{Y} \] explicitly, and show that this corresponds to the least-squares estimates \( b_{0}, b_{1} \) we have obtained before.

c) Use this to write \( \hat{\mathbf{Y}}=\mathbf{X}\left(\mathbf{X}^{\top} \mathbf{X}\right)^{-1} \mathbf{X}^{\top} \mathbf{Y}=\mathbf{P} \mathbf{Y} \), where \( \mathbf{P}=\mathbf{X}\left(\mathbf{X}^{\top} \mathbf{X}\right)^{-1} \mathbf{X}^{\top} \).
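A sketch of the explicit computation in part b), under the usual shorthand \( \bar{x}=\frac{1}{n}\sum_i x_i \), \( S_{xx}=\sum_i (x_i-\bar{x})^2 \), and \( S_{xY}=\sum_i (x_i-\bar{x})(Y_i-\bar{Y}) \) (this notation is assumed, not defined in the excerpt). With the design matrix whose \(i\)-th row is \( (1, x_i) \),
\[
\mathbf{X}^{\top} \mathbf{X}=\begin{pmatrix} n & \sum_i x_i \\ \sum_i x_i & \sum_i x_i^{2} \end{pmatrix},
\qquad
\mathbf{X}^{\top} \mathbf{Y}=\begin{pmatrix} \sum_i Y_i \\ \sum_i x_i Y_i \end{pmatrix},
\]
and since \( \det\left(\mathbf{X}^{\top} \mathbf{X}\right)=n \sum_i x_i^{2}-\left(\sum_i x_i\right)^{2}=n S_{xx} \),
\[
\mathbf{b}=\frac{1}{n S_{xx}}\begin{pmatrix} \sum_i x_i^{2} & -\sum_i x_i \\ -\sum_i x_i & n \end{pmatrix}\begin{pmatrix} \sum_i Y_i \\ \sum_i x_i Y_i \end{pmatrix}
=\begin{pmatrix} \bar{Y}-b_{1} \bar{x} \\ S_{xY}/S_{xx} \end{pmatrix},
\]
which matches the familiar scalar formulas \( b_{1}=S_{xY}/S_{xx} \) and \( b_{0}=\bar{Y}-b_{1}\bar{x} \).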
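The identities in parts b) and c) can also be checked numerically. The sketch below (using NumPy on simulated data; the choice of \( \beta_0=2 \), \( \beta_1=3 \), and the sample size are arbitrary assumptions for illustration) compares \( \mathbf{b}=(\mathbf{X}^{\top}\mathbf{X})^{-1}\mathbf{X}^{\top}\mathbf{Y} \) with the scalar least-squares formulas, and verifies that \( \mathbf{P} \) is symmetric and idempotent with \( \mathbf{P}\mathbf{Y}=\hat{\mathbf{Y}} \):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
x = rng.uniform(0, 10, n)
# Simulate Y_i = 2 + 3 x_i + eps_i with i.i.d. N(0, 1.5^2) noise
y = 2.0 + 3.0 * x + rng.normal(0, 1.5, n)

# Design matrix: column of ones, then the x_i's (n rows, 2 columns)
X = np.column_stack([np.ones(n), x])

# Matrix form: solve the normal equations (X^T X) b = X^T Y
b = np.linalg.solve(X.T @ X, X.T @ y)

# Classical scalar formulas: b1 = S_xY / S_xx, b0 = ybar - b1 * xbar
b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()
print(np.allclose(b, [b0, b1]))  # True

# Hat matrix P = X (X^T X)^{-1} X^T: symmetric, idempotent, and P Y = Yhat
P = X @ np.linalg.inv(X.T @ X) @ X.T
print(np.allclose(P, P.T))       # True
print(np.allclose(P @ P, P))     # True
print(np.allclose(P @ y, X @ b)) # True
```

Using `np.linalg.solve` on the normal equations (rather than forming the inverse explicitly) is the numerically preferable way to compute \( \mathbf{b} \); the explicit inverse is formed here only to exhibit \( \mathbf{P} \).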