MAT247H1 Chapter Notes - Chapter 2: Linear Map, Linear Independence, Orthogonal Complement


Department: Mathematics
Course Code: MAT247H1
Professor: Fiona T Rahman
Chapter: 2

MAT 247S - Orthogonal projections
Let $V$ be an inner product space. Recall that if $S$ is a nonempty subset of $V$, then we define the orthogonal complement of $S$ in $V$ to be the set $S^{\perp} = \{x \in V \mid \langle x, y\rangle = 0 \text{ for all } y \in S\}$. It is an easy exercise to prove that $S^{\perp}$ is a subspace of $V$.
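As a concrete numerical sketch (my own example, not from the notes): for $V = \mathbb{R}^3$ with the standard dot product, the orthogonal complement of the span of a finite set of vectors is the null space of the matrix whose rows are those vectors, which can be computed via the SVD.

```python
import numpy as np

# Hypothetical example: S = {(1,0,0), (0,1,1)} in R^3 with the standard
# dot product. S-perp is the null space of the matrix with the vectors
# of S as rows.
A = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 1.0]])

# Null space via SVD: right singular vectors for (near-)zero singular values.
_, s, Vt = np.linalg.svd(A)
rank = int(np.sum(s > 1e-10))
null_basis = Vt[rank:]          # rows form a basis of the complement

print(null_basis)               # one basis vector, proportional to (0, 1, -1)
print(np.allclose(A @ null_basis.T, 0))  # every row is orthogonal to all of S
```

Note that $\dim W + \dim W^{\perp} = 2 + 1 = 3 = \dim V$ here, consistent with part (2) of the theorem below.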
Lemma. Let $S$ be a subset of $V$ that contains $0$. Then $S \cap S^{\perp} = \{0\}$.

Proof. Let $x \in S \cap S^{\perp}$. Then $\langle x, x\rangle = 0$. According to the fourth property of inner products, we must have $x = 0$.
Lemma. Let $S$ be a subset of $V$. Let $W = \operatorname{span}(S)$. Then $S^{\perp} = W^{\perp}$.

Proof. Since $S \subseteq W$, it follows from the definitions of $S^{\perp}$ and $W^{\perp}$ that $W^{\perp} \subseteq S^{\perp}$.

Let $y \in W$. Then there exist vectors $y_1, \dots, y_n \in S$ and scalars $c_1, \dots, c_n$ such that $y = c_1 y_1 + \cdots + c_n y_n$. Then, using the properties of inner products, we have, for $x \in S^{\perp}$,
$$\langle y, x\rangle = \sum_{j=1}^{n} \langle c_j y_j, x\rangle = \sum_{j=1}^{n} c_j \langle y_j, x\rangle = 0,$$
since $y_j \in S$ and $x \in S^{\perp}$. Therefore we have $\langle x, y\rangle = \overline{\langle y, x\rangle} = \bar{0} = 0$ for all $x \in S^{\perp}$ and all $y \in W$. This tells us that $S^{\perp} \subseteq W^{\perp}$. Since we already had the reverse inclusion, the lemma follows.
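The practical content of this lemma is that to test membership in $W^{\perp}$ it suffices to check orthogonality against a spanning set of $W$, not against every vector of $W$. A small sketch with my own numbers, assuming the standard dot product on $\mathbb{R}^3$:

```python
import numpy as np

# S spans a plane W in R^3; checking <x, y> = 0 for the two generators
# in S is enough to conclude x is orthogonal to all of W = span(S).
S = [np.array([1.0, 1.0, 0.0]), np.array([0.0, 0.0, 2.0])]

def in_S_perp(x):
    """Check orthogonality against the generators only."""
    return all(abs(np.dot(x, y)) < 1e-12 for y in S)

x = np.array([1.0, -1.0, 0.0])       # orthogonal to both generators
w = 3.0 * S[0] - 0.5 * S[1]          # an arbitrary element of W

print(in_S_perp(x))                  # True
print(abs(np.dot(x, w)) < 1e-12)     # True: x is orthogonal to w as well
```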
Corollary. If $W$ is a finite-dimensional subspace of an inner product space $V$ and $\{y_1, \dots, y_d\}$ is a basis of $W$, then $W^{\perp} = \{x \in V \mid \langle x, y_j\rangle = 0 \text{ for } 1 \le j \le d\}$.
If $W$ is a finite-dimensional subspace of an inner product space $V$, the linear operator $T \in \mathcal{L}(V)$ described in the next theorem will be called the orthogonal projection of $V$ on $W$ (see the first paragraph on page 399 of the text, and also Theorem 6.6 on page 350).
Theorem. Let $W$ be a finite-dimensional subspace of an inner product space $V$.
(1) There exists a unique $T \in \mathcal{L}(V)$ such that $T(x) = x$ for all $x \in W$ and $W^{\perp} = N(T)$.
(2) Suppose that $V$ is finite-dimensional. Then $\dim W + \dim W^{\perp} = \dim V$.
(3) Let $\{y_1, \dots, y_d\}$ be an orthonormal basis for $W$ ($d = \dim W$). Let $T$ be as in part (1). Then
$$T(x) = \sum_{j=1}^{d} \langle x, y_j\rangle\, y_j, \quad x \in V.$$
Proof. Let $\{y_1, \dots, y_d\}$ be an orthonormal basis of $W$. (Note that such a basis must exist, according to Theorem 6.5.) Define $T(x) = \sum_{j=1}^{d} \langle x, y_j\rangle\, y_j$. First, we show that $T$ is linear: for $x, z \in V$ and $c \in F$,
$$T(x+z) = \sum_{j=1}^{d} \langle x+z, y_j\rangle\, y_j = \sum_{j=1}^{d} \bigl(\langle x, y_j\rangle + \langle z, y_j\rangle\bigr)\, y_j = \sum_{j=1}^{d} \bigl(\langle x, y_j\rangle\, y_j + \langle z, y_j\rangle\, y_j\bigr) = T(x) + T(z),$$
$$T(cx) = \sum_{j=1}^{d} \langle cx, y_j\rangle\, y_j = \sum_{j=1}^{d} c\,\langle x, y_j\rangle\, y_j = c \sum_{j=1}^{d} \langle x, y_j\rangle\, y_j = c\,T(x).$$
Now suppose that $x \in W$. Then, according to Theorem 6.5, we must have $x = \sum_{j=1}^{d} \langle x, y_j\rangle\, y_j$. This is the same as $x = T(x)$. Therefore $T(x) = x$ for all $x \in W$.
Next, let $x \in W^{\perp}$. Then $\langle x, y_j\rangle = 0$ for $1 \le j \le d$, and so $T(x) = \sum_{j=1}^{d} 0 \cdot y_j = 0$. Therefore $W^{\perp} \subseteq N(T)$.
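The formula in part (3) is directly computable. A minimal numerical sketch (my own example, assuming $V = \mathbb{R}^3$ with the standard dot product and $W$ the plane $z = 0$, whose orthonormal basis is $\{e_1, e_2\}$):

```python
import numpy as np

# Orthonormal basis of W = span{e1, e2}, the plane z = 0 in R^3.
y = [np.array([1.0, 0.0, 0.0]),
     np.array([0.0, 1.0, 0.0])]

def proj(x):
    """T(x) = sum_j <x, y_j> y_j, the orthogonal projection onto W."""
    return sum(np.dot(x, yj) * yj for yj in y)

x = np.array([3.0, -2.0, 5.0])
Tx = proj(x)
print(Tx)                                    # [ 3. -2.  0.]
print(np.allclose(proj(Tx), Tx))             # T(x) = x for x in W
print(np.allclose(proj(np.array([0.0, 0.0, 7.0])), 0))  # W-perp lies in N(T)
```

The last two checks mirror the two halves of the proof above: $T$ fixes $W$ pointwise, and $W^{\perp}$ is contained in the null space of $T$.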