# MAT247H1 Chapter Notes - Chapter 2: Linear Map, Linear Independence, Orthogonal Complement

by OC118869

Department: Mathematics
Course Code: MAT247H1
Professor: Fiona T Rahman
Chapter: 2

## MAT 247S - Orthogonal projections

Let $V$ be an inner product space. Recall that if $S$ is a nonempty subset of $V$, then we define the orthogonal complement of $S$ in $V$ to be the set $S^\perp = \{x \in V \mid \langle x, y \rangle = 0 \text{ for all } y \in S\}$. It is an easy exercise to prove that $S^\perp$ is a subspace of $V$.
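As a concrete illustration (not from the notes), here is a small NumPy sketch in $\mathbb{R}^3$ with the standard dot product: the condition $\langle x, y \rangle = 0$ for all $y \in S$ is a homogeneous linear system, so $S^\perp$ can be computed as the null space of the matrix whose rows are the vectors of $S$. The particular set $S$ and the SVD-based null-space computation are choices made for this example.

```python
import numpy as np

# Example set S = {(1, 0, 0), (0, 1, 0)} in R^3; its orthogonal complement
# should be the z-axis.  <x, y> = 0 for every y in S is exactly A @ x = 0,
# where the rows of A are the vectors of S.
A = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])

# Null space via SVD: the right-singular vectors belonging to (near-)zero
# singular values span the null space of A.
_, s, Vt = np.linalg.svd(A)
rank = np.sum(s > 1e-10)
perp_basis = Vt[rank:]          # rows form a basis of S-perp

print(perp_basis)               # one basis vector, proportional to (0, 0, 1)
```

Because the null space of any matrix is a subspace, this also mirrors the fact stated above that $S^\perp$ is a subspace of $V$.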

**Lemma.** Let $S$ be a subset of $V$ that contains $0$. Then $S \cap S^\perp = \{0\}$.

*Proof.* Let $x \in S \cap S^\perp$. Then $\langle x, x \rangle = 0$. According to the fourth property of inner products ($\langle x, x \rangle = 0$ only if $x = 0$), we must have $x = 0$. $\square$

**Lemma.** Let $S$ be a subset of $V$. Let $W = \operatorname{span}(S)$. Then $S^\perp = W^\perp$.

*Proof.* Since $S \subset W$, it follows from the definitions of $S^\perp$ and $W^\perp$ that $W^\perp \subset S^\perp$.

Let $y \in W$. Then there exist vectors $y_1, \ldots, y_n \in S$ and scalars $c_1, \ldots, c_n$ such that $y = c_1 y_1 + \cdots + c_n y_n$. Then, using the properties of inner products, we have, for $x \in S^\perp$,

$$\langle y, x \rangle = \sum_{j=1}^{n} \langle c_j y_j, x \rangle = \sum_{j=1}^{n} c_j \langle y_j, x \rangle = 0,$$

since $y_j \in S$ and $x \in S^\perp$. Therefore we have $\langle x, y \rangle = \overline{\langle y, x \rangle} = \bar{0} = 0$ for all $x \in S^\perp$ and all $y \in W$. This tells us that $S^\perp \subset W^\perp$. Since we already had the reverse inclusion, the lemma follows. $\square$

**Corollary.** If $W$ is a finite-dimensional subspace of an inner product space $V$ and $\{y_1, \ldots, y_d\}$ is a basis of $W$, then $W^\perp = \{x \in V \mid \langle x, y_j \rangle = 0 \text{ for } 1 \le j \le d\}$.
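The corollary says that orthogonality to a basis already forces orthogonality to all of $W$. A small numerical check (the vectors $y_1, y_2$ and $x$ below are hypothetical choices, not from the notes) illustrates this in $\mathbb{R}^4$:

```python
import numpy as np

rng = np.random.default_rng(0)

# W = span{y1, y2} in R^4.  Per the corollary, to test membership in W-perp
# it suffices to check <x, y_j> = 0 against the basis vectors alone.
y1 = np.array([1.0, 1.0, 0.0, 0.0])
y2 = np.array([0.0, 0.0, 1.0, -1.0])

# x is orthogonal to both basis vectors...
x = np.array([1.0, -1.0, 2.0, 2.0])
assert np.dot(x, y1) == 0 and np.dot(x, y2) == 0

# ...and is therefore orthogonal to every element c1*y1 + c2*y2 of W,
# exactly as in the span lemma above.
for _ in range(5):
    c1, c2 = rng.standard_normal(2)
    w = c1 * y1 + c2 * y2
    print(abs(np.dot(x, w)) < 1e-12)   # True each time
```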

If $W$ is a finite-dimensional subspace of an inner product space $V$, the linear operator $T \in \mathcal{L}(V)$ described in the next theorem will be called the orthogonal projection of $V$ on $W$ (see the first paragraph on page 399 of the text, and also Theorem 6.6 on page 350).

**Theorem.** Let $W$ be a finite-dimensional subspace of an inner product space $V$.

(1) There exists a unique $T \in \mathcal{L}(V)$ such that $T(x) = x$ for all $x \in W$ and $W^\perp = N(T)$.

(2) Suppose that $V$ is finite-dimensional. Then $\dim W + \dim W^\perp = \dim V$.

(3) Let $\{y_1, \ldots, y_d\}$ be an orthonormal basis for $W$ ($d = \dim W$). Let $T$ be as in part (1). Then
$$T(x) = \sum_{j=1}^{d} \langle x, y_j \rangle y_j, \qquad x \in V.$$

*Proof.* Let $\{y_1, \ldots, y_d\}$ be an orthonormal basis of $W$. (Note that such a basis must exist, according to Theorem 6.5.) Define $T(x) = \sum_{j=1}^{d} \langle x, y_j \rangle y_j$. First, we show that $T$ is linear: for $x, z \in V$ and $c \in F$,

$$T(x + z) = \sum_{j=1}^{d} \langle x + z, y_j \rangle y_j = \sum_{j=1}^{d} \bigl(\langle x, y_j \rangle + \langle z, y_j \rangle\bigr) y_j = \sum_{j=1}^{d} \bigl(\langle x, y_j \rangle y_j + \langle z, y_j \rangle y_j\bigr) = T(x) + T(z),$$

$$T(cx) = \sum_{j=1}^{d} \langle cx, y_j \rangle y_j = \sum_{j=1}^{d} c \langle x, y_j \rangle y_j = c \sum_{j=1}^{d} \langle x, y_j \rangle y_j = c\,T(x).$$

Now suppose that $x \in W$. Then, according to Theorem 6.5, we must have $x = \sum_{j=1}^{d} \langle x, y_j \rangle y_j$. This is the same as $x = T(x)$. Therefore $T(x) = x$ for all $x \in W$.

Next, let $x \in W^\perp$. Then $\langle x, y_j \rangle = 0$ for $1 \le j \le d$, and so $T(x) = \sum_{j=1}^{d} 0 \cdot y_j = 0$. Therefore $W^\perp \subset N(T)$.
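The formula in part (3) is easy to test numerically. The sketch below (an illustration under assumed data, not the text's construction) builds an orthonormal basis of a random subspace $W \subset \mathbb{R}^5$ via a QR factorization, which plays the role of Gram-Schmidt / Theorem 6.5, then checks the two defining properties of the projection: $T$ fixes vectors of $W$, and $x - T(x)$ lands in $W^\perp$.

```python
import numpy as np

rng = np.random.default_rng(1)

# W = span of two random vectors in R^5.  The columns of Q from a reduced QR
# factorization give an orthonormal basis y_1, y_2 of W.
B = rng.standard_normal((5, 2))        # columns span W
Q, _ = np.linalg.qr(B)                 # Q has orthonormal columns

def T(x):
    # T(x) = sum_j <x, y_j> y_j, as in part (3) of the theorem.
    return sum(np.dot(x, Q[:, j]) * Q[:, j] for j in range(Q.shape[1]))

x = rng.standard_normal(5)
Tx = T(x)

# T fixes W: T(x) already lies in W, so applying T again changes nothing.
print(np.allclose(T(Tx), Tx))          # True

# x - T(x) lies in W-perp: it is orthogonal to each basis vector of W.
print(np.allclose(Q.T @ (x - Tx), 0))  # True
```

Writing $x = T(x) + (x - T(x))$ then exhibits the decomposition $V = W \oplus W^\perp$ behind part (2) of the theorem.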
