Eigenvalue proof!


 September 17th, 2013, 11:13 AM #1 Newbie   Joined: Sep 2013 Posts: 2 Thanks: 0 Eigenvalue proof! Suppose that A is an n x n matrix with eigenvalues λ_1, ..., λ_n. Let B = A + rI, where r is an arbitrary scalar. Prove that the eigenvalues of B are (λ_1 + r), ..., (λ_n + r). If anyone has ideas, any help is appreciated.
 September 17th, 2013, 11:27 AM #2 Senior Member   Joined: Jun 2013 From: London, England Posts: 1,316 Thanks: 116 Re: Eigenvalue proof! For matrices A and B and vector v, you have: (A+B)v = Av + Bv Maybe you can take it from there.
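[Editor's note: spelling out where the hint leads, as a sketch. If v is an eigenvector of A with eigenvalue λ, i.e. Av = λv, then applying B = A + rI to that same v gives:]

```latex
\begin{aligned}
Bv &= (A + rI)v \\
   &= Av + rIv \\
   &= \lambda v + r v \\
   &= (\lambda + r)\,v,
\end{aligned}
```

so v is also an eigenvector of B, with eigenvalue λ + r. Running this for each eigenvalue λ_1, ..., λ_n of A gives the claimed eigenvalues of B.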
 September 17th, 2013, 11:36 AM #3 Newbie   Joined: Sep 2013 Posts: 2 Thanks: 0 Re: Eigenvalue proof! Just to be clear: in rI, I is the identity matrix. Thank you for your fast reply. What is v?
September 17th, 2013, 11:42 AM   #4
Senior Member

Joined: Jun 2013
From: London, England

Posts: 1,316
Thanks: 116

Re: Eigenvalue proof!

Quote:
 Originally Posted by Pero For matrices A and B and vector v, you have: (A+B)v = Av + Bv Maybe you can take it from there.
v is a vector. And A and B are any matrices. Maybe I should have used X and Y to avoid confusion.

I assume you know what eigenvalues and eigenvectors are?
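[Editor's note: a quick numerical sanity check of the claim, using NumPy. The matrix A below is an arbitrary example chosen for illustration; any square matrix works.]

```python
import numpy as np

# Example matrix (upper triangular, so its eigenvalues 2 and 3
# can be read off the diagonal) and an arbitrary shift r.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
r = 5.0

# B = A + rI
B = A + r * np.eye(2)

# Sort so corresponding eigenvalues line up.
eig_A = np.sort(np.linalg.eigvals(A).real)
eig_B = np.sort(np.linalg.eigvals(B).real)

# Each eigenvalue of B equals the corresponding eigenvalue of A plus r.
print(eig_A)  # [2. 3.]
print(eig_B)  # [7. 8.]
```

This only checks one instance, of course; the proof above is what shows it holds for every square A and every scalar r.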
