My Math Forum  

January 4th, 2013, 12:08 PM   #1
Newbie
 
Joined: Feb 2012

Posts: 6
Thanks: 0

Orthogonal projection of eigenvector

Hi All,

I'm studying for my Linear Algebra exam, but there is a part of the solution manual that I don't get.

The problem is as follows: we have a symmetric matrix A which is
Code:
[  5  -4  -2 ]
[ -4   5   2 ]
[ -2   2   2 ]
and they have given two eigenvectors, v1 = (-2,2,1) and v2 = (1,1,1). Now the question is to orthogonally diagonalize A.

The first thing I did was to determine the eigenvalues, which are 10 (with multiplicity 1) and 1 (with multiplicity 2). v1 corresponds to 10 and v2 corresponds to 1. The next thing to do is to find a second eigenvector for the basis of the eigenspace corresponding to eigenvalue 1.
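
(A quick way to double-check those eigenvalues numerically - just a sketch using numpy, not from the manual:)
Code:
import numpy as np

A = np.array([[ 5, -4, -2],
              [-4,  5,  2],
              [-2,  2,  2]], dtype=float)

# eigh is intended for symmetric matrices; eigenvalues are returned in ascending order
eigenvalues, eigenvectors = np.linalg.eigh(A)
print(eigenvalues)   # approximately [ 1.  1. 10.]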

Computations led to the vector v3 = (1,0,2), just as the solution manual said. However, to eventually get to the matrix P (to form A = PDP^(-1) ), they convert v3 via an orthogonal projection to (1,-1,4). My question is how they get there. Obviously the reason is that v2 and v3 are not orthogonal, but how do they arrive at the (1,-1,4) vector?

I hope everything is clear, any help would be appreciated!
Robertoo is offline  
 
January 4th, 2013, 02:28 PM   #2
Math Team
 
Joined: Sep 2007

Posts: 2,409
Thanks: 6

Re: Orthogonal projection of eigenvector

Quote:
Originally Posted by Robertoo
Hi All,

I'm studying for my Linear Algebra exam, but there is a part of the solution manual that I don't get.

The problem is as follows: we have a symmetric matrix A which is
Code:
[  5  -4  -2 ]
[ -4   5   2 ]
[ -2   2   2 ]
and they have given two eigenvectors, v1 = (-2,2,1) and v2 = (1,1,1). Now the question is to orthogonally diagonalize A.
How nice of them to give you those eigenvectors - just don't believe everything you are told! Yes,

$$A\begin{pmatrix}-2\\2\\1\end{pmatrix}=\begin{pmatrix}5&-4&-2\\-4&5&2\\-2&2&2\end{pmatrix}\begin{pmatrix}-2\\2\\1\end{pmatrix}=\begin{pmatrix}-20\\20\\10\end{pmatrix}=10\begin{pmatrix}-2\\2\\1\end{pmatrix},$$

so (-2, 2, 1) is an eigenvector corresponding to eigenvalue 10.

However,

$$A\begin{pmatrix}1\\1\\1\end{pmatrix}=\begin{pmatrix}5-4-2\\-4+5+2\\-2+2+2\end{pmatrix}=\begin{pmatrix}-1\\3\\2\end{pmatrix},$$

but that is NOT a multiple of <1, 1, 1>, so <1, 1, 1> is NOT an eigenvector. But it is easy to see that 1 is an eigenvalue.

In order that (x, y, z) be an eigenvector corresponding to eigenvalue 1, we must have

$$\begin{pmatrix}5&-4&-2\\-4&5&2\\-2&2&2\end{pmatrix}\begin{pmatrix}x\\y\\z\end{pmatrix}=\begin{pmatrix}x\\y\\z\end{pmatrix}.$$

That gives the equations 5x - 4y - 2z = x, -4x + 5y + 2z = y, -2x + 2y + 2z = z, or 4x - 4y - 2z = 0, -4x + 4y + 2z = 0, -2x + 2y + z = 0, all of which are equivalent to z = 2x - 2y. That is, we have (x, y, z) = (x, y, 2x - 2y) = (x, 0, 2x) + (0, y, -2y) = x(1, 0, 2) + y(0, 1, -2). That tells us that (1, 0, 2) and (0, 1, -2) span the space of all eigenvectors corresponding to eigenvalue 1. That does NOT include (1, 1, 1)!
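
(A quick check of that arithmetic in numpy - just a sketch: (1, 0, 2) and (0, 1, -2) come back unchanged under A, while (1, 1, 1) does not.)
Code:
import numpy as np

A = np.array([[ 5, -4, -2],
              [-4,  5,  2],
              [-2,  2,  2]], dtype=float)

for v in ([1, 0, 2], [0, 1, -2], [1, 1, 1]):
    v = np.array(v, dtype=float)
    print(v, "->", A @ v)
# (1, 0, 2) -> (1, 0, 2) and (0, 1, -2) -> (0, 1, -2): eigenvalue 1
# (1, 1, 1) -> (-1, 3, 2): not a multiple of (1, 1, 1), so not an eigenvector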

Quote:
The first thing I did was to determine the eigenvalues, which are 10 (with multiplicity 1) and 1 (with multiplicity 2). v1 corresponds to 10 and v2 corresponds to 1. The next thing to do is to find a second eigenvector for the basis of the eigenspace corresponding to eigenvalue 1.

Computations led to the vector v3 = (1,0,2), just as the solution manual said. However, to eventually get to the matrix P (to form A = PDP^(-1) ), they convert v3 via an orthogonal projection to (1,-1,4). My question is how they get there. Obviously the reason is that v2 and v3 are not orthogonal, but how do they arrive at the (1,-1,4) vector?

I hope everything is clear, any help would be appreciated!
Once you have three eigenvectors, to "orthogonally diagonalize" A, you need three orthonormal eigenvectors. It is easy to see that (-2, 2, 1) is orthogonal to both (1, 0, 2) and (0, 1, -2), so you can just divide (-2, 2, 1) by its length, 3. (1, 0, 2) has length $\sqrt{5}$, so you can divide by that. To find a vector in the same space as (1, 0, 2) and (0, 1, -2) that is perpendicular to (1, 0, 2), find the projection of (0, 1, -2) on (1, 0, 2) and then subtract that from (0, 1, -2). And, of course, divide the result by its length.
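
(Putting that recipe together - a minimal numpy sketch using the eigenvectors found above; P comes out orthogonal and P^T A P is diagonal.)
Code:
import numpy as np

A = np.array([[ 5, -4, -2],
              [-4,  5,  2],
              [-2,  2,  2]], dtype=float)

v1 = np.array([-2., 2., 1.])   # eigenvalue 10
v2 = np.array([ 1., 0., 2.])   # eigenvalue 1
v3 = np.array([ 0., 1., -2.])  # eigenvalue 1

# Gram-Schmidt within the eigenvalue-1 eigenspace:
# remove from v3 its projection onto v2, then normalize all three vectors.
w3 = v3 - (v3 @ v2) / (v2 @ v2) * v2

P = np.column_stack([v1 / np.linalg.norm(v1),
                     v2 / np.linalg.norm(v2),
                     w3 / np.linalg.norm(w3)])

print(np.round(P.T @ A @ P, 10))   # diag(10, 1, 1)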
HallsofIvy is offline  
January 5th, 2013, 04:24 AM   #3
Newbie
 
Joined: Feb 2012

Posts: 6
Thanks: 0

Re: Orthogonal projection of eigenvector

Oops, I have made a mistake; the eigenvector that the book gave should be (1,1,0), which is an eigenvector corresponding to the eigenvalue 1. So, for eigenvalue 1, the eigenspace would be spanned by (1,1,0) and (1,0,2) according to my computations. However, I'm still not convinced how they can convert (1,0,2) into (1,-1,4). It seems that the eigenspace would change dramatically when doing this, right?
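
(For what it's worth, (1, -1, 4) is just (1, 0, 2) minus its orthogonal projection onto (1, 1, 0), rescaled by a factor of 2, so the eigenspace itself does not change. A small numeric sketch, assuming numpy:)
Code:
import numpy as np

u = np.array([1., 1., 0.])   # the book's eigenvector for eigenvalue 1
v = np.array([1., 0., 2.])   # second basis vector of the same eigenspace

# Gram-Schmidt step: subtract from v its orthogonal projection onto u
w = v - (v @ u) / (u @ u) * u
print(w)       # [ 0.5 -0.5  2. ]
print(2 * w)   # [ 1. -1.  4.] - the book's vector, still in the eigenvalue-1 eigenspace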

I do understand the part about normalizing the eigenvectors though!

Thanks for all the help though!
BTW, is there a guide on how to type in matrix notation, like you do?
Robertoo is offline  
January 5th, 2013, 07:58 AM   #4
Newbie
 
Joined: Feb 2012

Posts: 6
Thanks: 0

Re: Orthogonal projection of eigenvector

Never mind about the matrix code; I see you can use LaTeX code here!

About the problem at hand, here is the full solution from the manual. The underlined part is the part I don't fully understand.
Attached Images
File Type: png Schermafbeelding 2013-01-05 om 17.56.13.png (82.5 KB, 435 views)
Robertoo is offline  