Pset 3 (spring 2020)
(Optional: if you wish to download this notebook as an .ipynb file, use the download button in the upper right and Option-click (Mac) or Alt-click (Linux and Windows), then "Save Link As..." or "Download Linked File As..." or something similar in your browser.)
Problem 1. Let x̄ $= \frac{1}{n}\sum_{i=1}^n x_i$ denote the mean of x. What is sum(x .- x̄)? (Remember x .- x̄ is the vector x with x̄ subtracted from each element.) Explain your answer.
If you wish to try some examples in Julia (optional):
using Statistics
n = 8
x = rand(n)
x̄ = mean(x)
sum(x .- x̄)
2.220446049250313e-16
Problem 2. (We may supply some hints as we go, so check back in.)
This problem shows how the QR idea can derive the formulas for best fit lines.
a) Suppose (xᵢ,yᵢ) for i=1,...,m are data which we will fit with a best least squares line. Let
a = x .- x̄ and b = y .- ȳ.
Is the slope of the best line through the (aᵢ,bᵢ) 1) the same as, 2) bigger than, or 3) smaller than
the slope of the best line through the (xᵢ,yᵢ) for i=1,...,m? Explain.
b) In terms of a and b, write an m x 2 matrix A and a right-hand side expressing the least squares system $A\,[\text{slope};\ \text{intercept}] = $ right-hand side.
c) What is the dot product of the first column of A with the second? (Hint: this is easy)
d) Use the result in c to derive the QR factorization of $A$ without too much hard work.
e) What is a formula for the slope of the best least squares line? (Hint: we are heading towards the formula for β̂ in the simple linear regression article on Wikipedia.)
f) The intercept is very simple. Why is this result obvious?
g) What are the slope and intercept for the original data (xᵢ,yᵢ) without any hard work?
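If you wish to experiment in Julia (optional), the sketch below builds data near a line and compares the least squares fit through the original points with the fit through the centered points (a,b); the line y = 2x + 1 and the noise level are arbitrary choices for the experiment, not part of the problem.
using Statistics
m = 10
x = rand(m)
y = 2 .* x .+ 1 .+ 0.1 .* randn(m)   # data scattered around the line y = 2x + 1
a = x .- mean(x)
b = y .- mean(y)
[x ones(m)] \ y    # least squares [slope; intercept] through the (xᵢ,yᵢ)
[a ones(m)] \ b    # least squares [slope; intercept] through the (aᵢ,bᵢ)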
Problem 3. Suppose A=QR is square, where Q is orthogonal and R is upper triangular and invertible. Write the solution to Ax=b in terms of b and possibly Q, Qᵀ, and R.
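If you'd like to check your formula numerically (optional), here is a minimal sketch: factor a random square A with qr, solve with the backslash, and compare with your expression. The variable names below are just local choices for this experiment.
using LinearAlgebra
n = 4
A = rand(n, n)
b = rand(n)
F = qr(A)          # A = Q*R with Q orthogonal and R upper triangular
Q = Matrix(F.Q)
R = F.R
x = A \ b          # the solution to Ax = b
# compare x with your formula written in terms of b, Q, Qᵀ (Q' in Julia), and R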
Problem 4. Set up a matrix least squares problem if we are interested in taking n data points $(x_i,y_i)$ and we wish to find the best function $f(x) = c_1 e^x + c_2 e^{-x}$ through the data points.
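Optionally, you can test your setup in Julia on synthetic data; the coefficients 3 and -2 and the noise level below are arbitrary choices for the experiment.
n = 20
x = range(-1, 1, length=n)
y = 3 .* exp.(x) .- 2 .* exp.(-x) .+ 0.05 .* randn(n)   # data near c₁ = 3, c₂ = -2
# Build your n x 2 matrix A from the two basis functions, then
# c = A \ y    # the least squares coefficients, which should come out close to (3, -2)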
Problem 5. Find $A^T$ and $A^{-1}$ and $(A^{-1})^T$ and $(A^T)^{-1}$ for $A=\left( \begin{matrix} 1 & 0 \\ 9 & 3 \end{matrix}\right)$.
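If you wish to check your hand computation in Julia (optional):
using LinearAlgebra
A = [1 0; 9 3]
A'         # transpose
inv(A)     # inverse
inv(A)'    # transpose of the inverse
inv(A')    # inverse of the transpose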
Problem 6. A matrix $A$ is symmetric if $A=A^T$. Which of these are true?
6a. The block matrix $\begin{pmatrix} 0 & A\\ A & 0 \end{pmatrix}$ is automatically symmetric.
6b. If A and B are symmetric then their product AB is symmetric.
6c. If A is not symmetric then $A^{-1}$ is not symmetric.
6d. When A, B, C are symmetric, the transpose of ABC is CBA.
If $A=A^T$ and $B=B^T$, which of these matrices are certainly symmetric?
6e. $A^2 - B^2$
6f. $(A+B)(A-B)$
6g. $ABA$
6h. $ABAB$
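Optionally, random examples in Julia can help build intuition: a random example cannot prove a statement true, but a single counterexample disproves one. Here issym is a small helper defined only for this experiment.
using LinearAlgebra
issym(M) = M ≈ M'                    # symmetric up to roundoff
A = let M = rand(3,3); M + M' end    # a random symmetric A
B = let M = rand(3,3); M + M' end    # a random symmetric B
issym(A*B), issym(A^2 - B^2), issym((A+B)*(A-B)), issym(A*B*A), issym(A*B*A*B)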
Problem 7 (GS p.106 13)
Compute U and L for the symmetric matrix A:
$\left( \begin{matrix}
a & a & a & a \\
a & b & b & b \\
a & b & c & c \\
a & b & c & d
\end{matrix} \right) $.
Find four conditions on a,b,c,d to get A=LU with four pivots.
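If you wish to experiment (optional), plug in numbers for a, b, c, d and inspect the pivots; lu(A, NoPivot()) asks for an LU factorization with no row exchanges and is available in recent versions of Julia's LinearAlgebra.
using LinearAlgebra
a, b, c, d = 1, 2, 3, 4                     # try other values too
A = [a a a a; a b b b; a b c c; a b c d]
F = lu(A, NoPivot())                        # A = L*U with no row exchanges
F.L, F.U                                    # the diagonal of F.U holds the four pivots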
Problem 8. First computation of a singular value decomposition.
Here we wish for you to become familiar with an SVD by changing the m and n from what appears here and testing that the results have the required properties. You merely need to execute and turn in a screenshot. Describe in a few words what kind of matrices U, V, and diagm(Σ) are for your chosen m and n. Give their sizes, and use words like orthogonal, tall skinny orthogonal, diagonal, triangular, etc.
One version of the SVD produces $$ A = U * Diagonal(\Sigma) * V',$$ where $U'U=I$, $V'V=I$, $\Sigma=(\sigma_1,\sigma_2,\ldots)$ with $\sigma_1 \ge \sigma_2 \ge \cdots > 0$.
$U$ is $m \times n$ and $V$ is $n \times n$ if $m \ge n$, and
$U$ is $m \times m$ and $V$ is $n \times m$ if $m < n$.
using LinearAlgebra
A = rand(5,3)
U,Σ,V = svd(A)
display(Σ)
3-element Array{Float64,1}:
 2.0908607204660132
 0.5438073613767322
 0.1985541583756679
U'U ≈ I
true
V'V ≈ I
true
# A ≈ U * diagm(Σ) * V'
A ≈ U * Diagonal(Σ) * V'
true
Problem 9. Suppose a square $A$ has an LU factorization $A=LU$ where $L$ and $U$ are invertible. If $A=QR$, what is $r_{11}$, possibly in terms of elements of $L$ and $U$?
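Optionally, a numerical experiment may suggest the relationship; both factorizations of the same random A are computed below so you can compare r₁₁ with the entries of L and U (NoPivot() keeps the LU factorization free of row exchanges and requires a recent version of LinearAlgebra).
using LinearAlgebra
A = rand(4, 4)
Flu = lu(A, NoPivot())     # A = L*U with no row exchanges
Fqr = qr(A)                # A = Q*R
Flu.L, Flu.U, Fqr.R        # how is R[1,1] related to L and U?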
Problem 10. Suppose an n x 2 matrix $A$ is written as $QR$, where $Q$ is a tall-skinny orthogonal matrix and is also n x 2, and $R = \left( \begin{matrix} 1 & 3 \\ 0 & 4 \end{matrix} \right)$.
What is the norm of the second column of $A$? Briefly explain.
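If you want to check your answer numerically (optional), the sketch below builds a random n x 2 Q with orthonormal columns from the QR factorization of a random matrix and forms A = QR with the given R; the choice n = 6 is arbitrary.
using LinearAlgebra
n = 6
Q = Matrix(qr(randn(n, 2)).Q)    # a random n x 2 matrix with orthonormal columns
R = [1 3; 0 4]
A = Q * R
norm(A[:, 2])                    # compare with your answer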