My Math Forum Lorentz matrix problem


 February 29th, 2012, 04:26 PM #1 Newbie   Joined: Feb 2012 Posts: 4 Thanks: 0 Lorentz matrix problem I looked at a copy of "Problem Book in Quantum Field Theory" by Voja Radovanovic, and it has a problem where you need to prove properties of a matrix based on the property that it doesn't change the "spacetime interval" of a vector. There's an argument in the solution of the problem that I don't understand, and I'm hoping someone can help me with it. I'll summarize the problem and proof in matrix language instead of index notation. Given the matrix $g= \left( \begin{matrix} 1 & 0 & 0 & 0 \\ 0 & -1 & 0 & 0 \\ 0 & 0 & -1 & 0 \\ 0 & 0 & 0 & -1 \end{matrix} \right)$ and a vector $x$, define $x^2= gx \cdot x$. Show that if $(\Lambda x)^2= x^2$ for every $x$, then $\Lambda^{\top} g \Lambda= g$. The proof is short: he just substitutes the definition, then does a little algebra to get $\Lambda^{\top} g \Lambda x \cdot x= g x \cdot x$. Then he claims that since this is true for *every* $x$, it follows that $\Lambda^{\top} g \Lambda= g$. The last part I don't understand. For example, $\left( \begin{matrix} 0 & -1 & 0 & 0 \\ 1 & 0 & 0 & 0 \\ 0 & 0 & 0 & -1 \\ 0 & 0 & 1 & 0 \end{matrix} \right) x \cdot x= \left( \begin{matrix} 0 & 1 & 0 & 0 \\ -1 & 0 & 0 & 0 \\ 0 & 0 & 0 & 1 \\ 0 & 0 & -1 & 0 \end{matrix} \right) x \cdot x$ for *every* $x$ (both sides vanish, since both matrices are antisymmetric), but certainly those matrices are not equal. I don't know if I caught an error in the proof or if I'm just not understanding something about it. Does anyone have a more detailed proof of this that shows how you get from $\forall x:\ \Lambda^{\top} g \Lambda x \cdot x= g x \cdot x$ to $\Lambda^{\top} g \Lambda= g$ ?
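The point of the counterexample is that a quadratic form cannot see the antisymmetric part of a matrix: for any antisymmetric $A$, $(Ax) \cdot x = x^{\top} A x = -x^{\top} A x = 0$. A minimal numerical sketch (assuming NumPy; the specific matrices are just one antisymmetric matrix and its transpose):

```python
import numpy as np

# Antisymmetric matrix: x.T @ A @ x = 0 for every x, because the scalar
# x.T @ A @ x equals its own transpose x.T @ A.T @ x = -x.T @ A @ x.
A = np.array([[0., -1., 0., 0.],
              [1.,  0., 0., 0.],
              [0.,  0., 0., -1.],
              [0.,  0., 1., 0.]])
B = A.T  # also antisymmetric, but B != A

rng = np.random.default_rng(0)
for _ in range(1000):
    x = rng.standard_normal(4)
    # The two quadratic forms agree (both are zero) on every sample...
    assert abs((A @ x) @ x - (B @ x) @ x) < 1e-12

# ...yet the matrices differ.
assert not np.array_equal(A, B)
```

So "equal quadratic forms for all x" alone only pins down the symmetric part of a matrix, which is exactly the gap the original poster spotted.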
 February 29th, 2012, 04:37 PM #2 Senior Member   Joined: Feb 2012 Posts: 628 Thanks: 1 Re: Lorentz matrix problem In the notation $gx \cdot x$, is that g times the dot product $x \cdot x$, or is it the dot product of $gx$ and $x$, or is it something else entirely? Obviously the interpretation g times x times x makes no sense (under matrix multiplication, anyway).
 February 29th, 2012, 08:41 PM #3 Newbie   Joined: Feb 2012 Posts: 4 Thanks: 0 Re: Lorentz matrix problem It is (gx) * x.
 February 29th, 2012, 08:50 PM #4 Newbie   Joined: Feb 2012 Posts: 4 Thanks: 0 Re: Lorentz matrix problem

Here's the original text for those who enjoy index notation or want to see it in situ.

Quote:
 The square of the length of a four-vector $x$ is $x^2= g_{\mu \nu} x^{\mu} x^{\nu}$. By substituting $x'^{\mu} = \Lambda^{\mu}_{\rho}x^{\rho}$ into the condition $x'^2 = x^2$ one obtains: $g_{\mu\nu}\Lambda^{\mu}_{\rho}\Lambda^{\nu}_{\sigma} x^{\rho}x^{\sigma}= g_{\rho\sigma}x^{\rho}x^{\sigma} \quad (1.1)$ Since (1.1) is valid for any vector $x \in M_4$, we get $\Lambda^{\mu}_{\rho}g_{\mu\nu}\Lambda^{\nu}_{\sigma}= g_{\rho\sigma}$.
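For comparison, the same substitution written out in the matrix language of the first post (a restatement, not from the book):

```latex
x'^2 = g(\Lambda x) \cdot (\Lambda x)
     = (\Lambda x)^{\top} g \, (\Lambda x)
     = x^{\top} \left( \Lambda^{\top} g \Lambda \right) x
     \;\overset{!}{=}\; x^{\top} g \, x = x^2
     \qquad \text{for all } x,
```

which is exactly the condition $\Lambda^{\top} g \Lambda x \cdot x = g x \cdot x$ for all $x$ from which the book then concludes $\Lambda^{\top} g \Lambda = g$.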

 March 1st, 2012, 11:17 AM #5 Newbie   Joined: Feb 2012 Posts: 4 Thanks: 0 Re: Lorentz matrix problem Well, I managed to figure this out after a bit more pondering. It's necessary to use the properties of g on both sides of the equation. For arbitrary matrices A and B with $(Ax) \cdot x= (Bx) \cdot x$ for all x, you can show that their entries on the main diagonal must be equal by substituting each unit vector in the basis for x. For example, picking $x= \left( \begin{matrix} 1 \\ 0 \\ 0 \\ 0 \end{matrix} \right)$ in $A_{ij} x_i x_j= B_{ij} x_i x_j$ yields $A_{00}= B_{00}$. Now that we know the diagonals are equal, you can show by picking each basis-sum vector with two 1's in it (e.g. (0, 1, 1, 0)) that $A_{ij} + A_{ji}= B_{ij} + B_{ji}$. For example, picking $x= \left( \begin{matrix} 1 \\ 0 \\ 1 \\ 0 \end{matrix} \right)$ in $A_{ij} x_i x_j= B_{ij} x_i x_j$ yields $A_{00} + A_{02} + A_{20} + A_{22}= B_{00} + B_{02} + B_{20} + B_{22}$, which implies $A_{02} + A_{20}= B_{02} + B_{20}$ since $A_{nn}= B_{nn}$ for all indices n. Now, specializing the general case to B = g with $i \neq j$, we get $A_{ij} + A_{ji}= g_{ij} + g_{ji} = 0$, so the off-diagonal part of A is antisymmetric. Finally, we know that $\Lambda^{\top} g \Lambda$ is symmetric, since g is symmetric and $(\Lambda^{\top} g \Lambda)^{\top}= \Lambda^{\top} g^{\top} \Lambda = \Lambda^{\top} g \Lambda$. The only way the off-diagonal entries can be both symmetric and antisymmetric is if they are zero, and the on-diagonal entries must be equal, so $\Lambda^{\top} g \Lambda= g$ as expected. The same proof seems like it would work for any diagonal matrix substituted for g, so there's probably some slightly more sophisticated linear algebra (a polarization-style argument, perhaps) that would make short work of this through some decomposition or something. I think the more general statement would be: for any matrix A and diagonal matrix D, if $(A^{\top} D A x) \cdot x= (D x) \cdot x$ for all x, then $A^{\top} D A= D$.
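The whole argument can be sanity-checked numerically on a concrete Lorentz transformation. A minimal sketch (assuming NumPy; the matrix used is the standard boost along the x-axis with rapidity $\varphi$, which is a known Lorentz transformation):

```python
import numpy as np

g = np.diag([1.0, -1.0, -1.0, -1.0])  # Minkowski metric, signature (+,-,-,-)

# Standard boost along the x-axis with rapidity phi.
phi = 0.7
L = np.eye(4)
L[0, 0] = L[1, 1] = np.cosh(phi)
L[0, 1] = L[1, 0] = -np.sinh(phi)

# The hypothesis of the problem: the quadratic forms of L.T @ g @ L and g
# agree on every vector x (i.e. the boost preserves the interval)...
rng = np.random.default_rng(1)
for _ in range(1000):
    x = rng.standard_normal(4)
    assert np.isclose((L.T @ g @ L @ x) @ x, (g @ x) @ x)

# ...and, because L.T @ g @ L is symmetric (so its antisymmetric part is
# zero), the diagonal/off-diagonal argument above forces full equality:
assert np.allclose(L.T @ g @ L, g)
```

This doesn't prove anything, of course, but it illustrates why symmetry is the missing ingredient: the quadratic forms alone fix only the symmetric part, and here the symmetric part is the whole matrix.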
Is this an obvious linear algebra thing? Because that proof in the book glossed over it like it was nothing.
