My Math Forum: What are eigenvalues and eigenvectors?


 March 27th, 2018, 02:24 AM #1 Banned Camp   Joined: Nov 2017 From: India Posts: 204 Thanks: 2

Hello, what are the applications of eigenvalues and eigenvectors?
 March 27th, 2018, 04:20 AM #2 Math Team   Joined: Jan 2015 From: Alabama Posts: 3,264 Thanks: 902

Well, what math do you know, so that an answer will make sense to you? Do you know what "eigenvalues" and "eigenvectors" are? Have you taken a Linear Algebra course?

Linear algebra, in its most general sense, studies the very concept of "linearity" and "linear" problems. Specifically, linear algebra defines a "vector space" as a set in which we have a notion of adding vectors and of multiplying vectors by numbers: the basic "linear" operations. A "linear transformation" is a function, A, that maps members of one vector space to another such that A(u + v) = Au + Av, where u and v are vectors, and such that $\displaystyle A(\mu u)= \mu Au$, where u is a vector and $\displaystyle \mu$ is a number.

Vectors in finite-dimensional vector spaces can be written as vertical lists of numbers, and linear transformations as matrices, so a lot of linear algebra turns out to deal with matrices. In fact, students often (incorrectly) think of linear algebra as "matrices".

Given a linear transformation A, $\displaystyle \lambda$ is said to be an "eigenvalue" of A if there exists a non-zero vector v such that $\displaystyle Av= \lambda v$; that is, on v, A acts just like the much simpler operation of multiplying by the number $\displaystyle \lambda$. In that case, $\displaystyle \lambda$ is an eigenvalue corresponding to the eigenvector v, and v is an eigenvector corresponding to the eigenvalue $\displaystyle \lambda$. Eigenvalues and eigenvectors occur in all types of applications.
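To make the definition concrete, here is a short NumPy check of the defining property $\displaystyle Av= \lambda v$; the matrix is just a convenient example, and the library choice is for illustration only:

```python
import numpy as np

# A simple symmetric matrix whose eigenvalues are easy to check by hand:
# the characteristic polynomial is (2 - lambda)^2 - 1, so lambda = 1 or 3.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# np.linalg.eig returns the eigenvalues and a matrix whose columns
# are the corresponding eigenvectors.
eigenvalues, eigenvectors = np.linalg.eig(A)

# Verify the defining property A v = lambda v for each pair.
for lam, v in zip(eigenvalues, eigenvectors.T):
    assert np.allclose(A @ v, lam * v)
```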
For example, we can write systems of equations in matrix form, writing, say, the two equations in two unknowns $\displaystyle ax+ by= c$, $\displaystyle dx+ ey= f$ as $\displaystyle \begin{pmatrix}a & b \\ d & e \end{pmatrix}\begin{pmatrix}x \\ y \end{pmatrix}= \begin{pmatrix}c \\ f\end{pmatrix}$ or, symbolically, as "Ax = B", where A is the two-by-two matrix $\displaystyle \begin{pmatrix}a & b \\ d & e \end{pmatrix}$, x is the column vector $\displaystyle \begin{pmatrix}x \\ y \end{pmatrix}$, and B is the column vector $\displaystyle \begin{pmatrix} c \\ f \end{pmatrix}$. If A has a "complete set of eigenvectors", meaning that there exists a basis for the vector space consisting of eigenvectors of A, then A can be "diagonalized": there exists a matrix C, the matrix having the eigenvectors of A as columns, such that $\displaystyle C^{-1}AC= D$, where D is the diagonal matrix having the eigenvalues of A on its main diagonal and zeros everywhere else.

Given the matrix equation (system of equations) Ax = B, multiply both sides of the equation by $\displaystyle C^{-1}$ to get $\displaystyle C^{-1}Ax= C^{-1}B$. Then let $\displaystyle y= C^{-1}x$, so that $\displaystyle x= Cy$, and the equation becomes $\displaystyle C^{-1}ACy= Dy= C^{-1}B$. But now D is a diagonal matrix, so the equations are "uncoupled". That is, instead of a system of n equations with perhaps every equation involving all n unknowns, we have n equations each involving a single unknown, so that each can be solved separately: $\displaystyle \begin{pmatrix}a & 0 \\ 0 & b \end{pmatrix}\begin{pmatrix}x \\ y \end{pmatrix}= \begin{pmatrix}e \\ f \end{pmatrix}$ is just the two equations $\displaystyle ax= e$ and $\displaystyle by= f$, which have the solutions $\displaystyle x= \frac{e}{a}$ and $\displaystyle y= \frac{f}{b}$.
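The uncoupling trick can be sketched in NumPy; the matrix and right-hand side below are made-up illustrative values, and the sketch assumes A has a complete set of eigenvectors, as in the discussion above:

```python
import numpy as np

# An example system A x = b; trace 7, determinant 10, so the
# eigenvalues of A are 5 and 2 (both nonzero).
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
b = np.array([1.0, 5.0])

# Columns of C are eigenvectors of A, so D = C^{-1} A C is diagonal.
eigenvalues, C = np.linalg.eig(A)
C_inv = np.linalg.inv(C)

# Change variables: with x = C y, the system becomes D y = C^{-1} b,
# which is n independent one-unknown equations, solved by division.
y = (C_inv @ b) / eigenvalues
x = C @ y

# The result solves the original coupled system.
assert np.allclose(A @ x, b)
```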
One result of that is that any solution to a "linear problem" can be written as a "linear combination" of basic solutions (again, multiplying solutions by numbers and adding them). Those "basic solutions" are the eigenvectors of the linear operator. For example, the differential equation $\displaystyle \frac{d^2y}{dx^2}+ 3\frac{dy}{dx}+ 2y= 0$ can be written as two separate first-order equations by letting $\displaystyle y_1= y(x)$ and $\displaystyle y_2= \frac{dy}{dx}$. Then $\displaystyle \frac{dy_2}{dx}+ 3y_2+ 2y_1= 0$, or $\displaystyle \frac{dy_2}{dx}= -2y_1- 3y_2$. That is, the second-order equation can be written as the two first-order equations $\displaystyle \frac{dy_1}{dx}= y_2$ and $\displaystyle \frac{dy_2}{dx}= -2y_1- 3y_2$, which can, in turn, be written as the single matrix equation $\displaystyle \frac{d}{dx}\begin{pmatrix}y_1 \\ y_2\end{pmatrix}= \begin{pmatrix}0 & 1 \\ -2 & -3\end{pmatrix}\begin{pmatrix}y_1 \\ y_2\end{pmatrix}$.

The matrix $\displaystyle \begin{pmatrix}0 & 1 \\ -2 & -3\end{pmatrix}$ has "eigenvalue equation" $\displaystyle \left|\begin{array}{cc}-\lambda & 1 \\ -2 & -3-\lambda\end{array}\right|= \lambda(3+ \lambda)+ 2= \lambda^2+ 3\lambda+ 2= (\lambda+ 1)(\lambda+ 2)= 0$, so it has eigenvalues $\displaystyle \lambda= -1$ and $\displaystyle \lambda= -2$. If $\displaystyle \begin{pmatrix}x \\ y \end{pmatrix}$ is an eigenvector of that matrix corresponding to eigenvalue -1, then we must have $\displaystyle \begin{pmatrix}0 & 1 \\ -2 & -3 \end{pmatrix}\begin{pmatrix}x \\ y \end{pmatrix}= -1\begin{pmatrix}x \\ y \end{pmatrix}$, or y = -x and -2x - 3y = -y. The second equation reduces to y = -x as well, so any multiple of $\displaystyle \begin{pmatrix}1 \\ -1 \end{pmatrix}$ is an eigenvector corresponding to eigenvalue -1.
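The eigenvalue computation above can be checked with a few lines of NumPy (again, the library is just a convenient way to verify the hand calculation):

```python
import numpy as np

# The matrix of the first-order system obtained from y'' + 3y' + 2y = 0
# in the derivation above.
M = np.array([[ 0.0,  1.0],
              [-2.0, -3.0]])

eigenvalues, eigenvectors = np.linalg.eig(M)
# The characteristic polynomial lambda^2 + 3*lambda + 2 factors as
# (lambda + 1)(lambda + 2), so the eigenvalues are -1 and -2.
assert np.allclose(sorted(eigenvalues), [-2.0, -1.0])

# Each eigenvector is a multiple of (1, lambda): (1, -1) and (1, -2).
for lam, v in zip(eigenvalues, eigenvectors.T):
    assert np.allclose(M @ v, lam * v)
```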
Similarly, if $\displaystyle \begin{pmatrix}x \\ y \end{pmatrix}$ is an eigenvector of that matrix corresponding to eigenvalue -2, then we must have $\displaystyle \begin{pmatrix}0 & 1 \\ -2 & -3 \end{pmatrix}\begin{pmatrix}x \\ y \end{pmatrix}= -2\begin{pmatrix}x \\ y \end{pmatrix}$, and both equations reduce to y = -2x. That is, any multiple of $\displaystyle \begin{pmatrix}1 \\ -2\end{pmatrix}$ is an eigenvector corresponding to eigenvalue -2. That means that, using $\displaystyle C= \begin{pmatrix}1 & 1 \\ -1 & -2\end{pmatrix}$, the matrix having those eigenvectors as columns, we can write this system of equations as $\displaystyle \frac{d}{dx}\begin{pmatrix}z_1 \\ z_2\end{pmatrix}= \begin{pmatrix}-1 & 0 \\ 0 & -2\end{pmatrix}\begin{pmatrix}z_1 \\ z_2\end{pmatrix}$, which is the same as the two "uncoupled" differential equations $\displaystyle \frac{dz_1}{dx}= -z_1$ and $\displaystyle \frac{dz_2}{dx}= -2z_2$. It is easy to see that those have solutions $\displaystyle z_1(x)= e^{-x}$ and $\displaystyle z_2(x)= e^{-2x}$ (up to constant multiples), which is enough to tell us that the general solution to the original equation is $\displaystyle y(x)= Ae^{-x}+ Be^{-2x}$, where A and B are any two numbers. That's longer than I had intended to write but, hey, whole books have been written on "eigenvalues" and "eigenvectors"! Last edited by Country Boy; March 27th, 2018 at 04:25 AM.
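As a quick numerical sanity check (the sample constants A and B below are arbitrary), the general solution $\displaystyle y(x)= Ae^{-x}+ Be^{-2x}$ really does satisfy the original differential equation:

```python
import numpy as np

# Check that y(x) = A e^{-x} + B e^{-2x} solves y'' + 3y' + 2y = 0
# for arbitrary sample constants A and B.
A, B = 2.5, -1.3
x = np.linspace(0.0, 5.0, 101)

y   =  A * np.exp(-x) +     B * np.exp(-2 * x)   # y
dy  = -A * np.exp(-x) - 2 * B * np.exp(-2 * x)   # y'
d2y =  A * np.exp(-x) + 4 * B * np.exp(-2 * x)   # y''

# The residual of the differential equation should vanish identically.
residual = d2y + 3 * dy + 2 * y
assert np.allclose(residual, 0.0)
```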


