My Math Forum  

July 23rd, 2017, 03:04 AM   #1
Newbie
 
Joined: May 2017
From: Germany

Posts: 5
Thanks: 0

Need help with three (small) problems

Hello,

I've been preparing for an upcoming exam for a first-semester linear algebra lecture this week, and for that I've worked through all of the old assignments again. There are three tasks in total that I've been struggling with. It's probably all relatively simple stuff.

1. Two matrices A, B are called similar if there exist invertible matrices S, T with B = SAT. Now I have to prove that a) this is an equivalence relation (that's no problem), and that b) two matrices are similar iff they have the same rank; also, each class has a distinct representative C which has rank(C) many 1's on the diagonal and otherwise only 0 entries.

I totally get why this is true, but I'm not sure how you're supposed to prove it formally. Also, perhaps more relevantly, it seems to me that you can bring any matrix into such a form using elementary elimination or permutation row operations (add a multiple of one row to another, or swap two rows (but do you have to prove this or can you just state it?)), and that each operation is equivalent to multiplication by an elimination/permutation matrix. Each of those matrices is invertible, therefore their product is invertible ... but then why do you need both S and T? Wouldn't S alone do the job?
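To see why one-sided multiplication isn't enough: row operations alone only reach reduced row echelon form, where the pivots need not sit on the diagonal, so you also need column operations (the T on the right) to move them there. A minimal numpy sketch (my own example, not from the assignment):

```python
import numpy as np

# A is already in reduced row echelon form, but not in the canonical
# diag(1, 0) form: its only pivot sits in column 2. Row operations
# preserve the zero first column, so no S alone can fix this; a column
# swap (right multiplication by an invertible T) is needed.
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])

S = np.eye(2)                # no row operations needed in this example
T = np.array([[0.0, 1.0],    # swap the two columns (invertible)
              [1.0, 0.0]])

C = S @ A @ T
print(C)                     # [[1. 0.]
                             #  [0. 0.]] -- rank(A) = 1 ones on the diagonal
```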

2. Let U = <(1, 2, -1)^T, (2, 1, 2)^T> and let Û = <(1, 0, -1)^T>.

Let p be the projection from V = R^3 onto U along Û, so im(p) = U, ker(p) = Û.

Prove that for all x ∈ V, p(x) = A(B^-1)x, where

$\text{A} = \begin{pmatrix}
1 & 2 & 0\\
2 & 1 & 0\\
-1 & 2 & 0
\end{pmatrix} \\
\text{B} = \begin{pmatrix}
1 & 2 & 1\\
2 & 1 & 0\\
-1 & 2 & -1
\end{pmatrix}$

This one is confusing me a bit. It's possible that the problem isn't stated properly. Wouldn't the matrix representation of p depend on the basis of V? The assignment does not state that the basis of V consists of the three vectors that describe U and Û. But if it did, wouldn't every x ∈ U just have the form (a, b, 0)? Even then, I can't quite prove that A(B^-1)x = x... I know that Ax = Bx would hold, and so x = (B^-1)Ax, but that's not sufficient.

I think the proof works without really using the specific numbers.
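Not a proof, but a quick numerical sanity check (numpy, my own variable names) of what A(B^-1) does: the columns of B are the two spanning vectors of U followed by the spanning vector of Û, so B^-1 sends that basis to the standard basis, and A (which is B with the Û-column zeroed) sends e1, e2 back to the U-vectors and e3 to 0 — exactly the projection:

```python
import numpy as np

# Columns of B: u1, u2 span U; uhat spans the kernel direction.
B = np.array([[ 1.0, 2.0,  1.0],
              [ 2.0, 1.0,  0.0],
              [-1.0, 2.0, -1.0]])
A = B.copy()
A[:, 2] = 0.0                       # A is B with the uhat column zeroed

P = A @ np.linalg.inv(B)            # candidate matrix of the projection p

u1, u2, uhat = B[:, 0], B[:, 1], B[:, 2]
print(np.allclose(P @ u1, u1))      # True: p fixes U
print(np.allclose(P @ u2, u2))      # True
print(np.allclose(P @ uhat, 0))     # True: p kills the kernel direction
print(np.allclose(P @ P, P))        # True: p is idempotent
```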

3. (This one in particular ought to be easy because it's from a very early assignment)

Let A be an upper triangular matrix with diagonal entries ≠ 0. Prove by induction that for each vector b there exists exactly one vector x such that Ax = b (so basically a system of equations that is already in a nice form), and that x_n = … and, for k < n, x_k = … .

I don't see how induction would work in principle over multiple systems, and I can't get it to work over the entries of one system. I remember that I already failed at this one when I first had to do it.
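The formulas that didn't render above are presumably the usual back-substitution expressions: x_n = b_n / a_nn, and x_k = (b_k − Σ_{j>k} a_kj x_j) / a_kk for k < n. A small numpy sketch of them (my own function name, an assumption about the missing formulas, not from the assignment):

```python
import numpy as np

def back_substitute(A, b):
    """Solve Ax = b for upper triangular A with nonzero diagonal.

    x_n = b_n / a_nn, then each earlier x_k is forced uniquely once
    x_{k+1}, ..., x_n are already determined.
    """
    n = len(b)
    x = np.zeros(n)
    for k in range(n - 1, -1, -1):
        # A[k, k+1:] @ x[k+1:] is the empty sum 0.0 when k = n-1
        x[k] = (b[k] - A[k, k + 1:] @ x[k + 1:]) / A[k, k]
    return x

A = np.array([[2.0, 1.0, -1.0],
              [0.0, 3.0,  2.0],
              [0.0, 0.0,  4.0]])
b = np.array([1.0, 8.0, 8.0])
x = back_substitute(A, b)
print(np.allclose(A @ x, b))   # True
```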

Okay, those are the three. Any solutions are appreciated!

Last edited by skipjack; July 23rd, 2017 at 03:16 PM.
sty silver is offline  
 
July 23rd, 2017, 05:30 AM   #2
Newbie
 
Joined: May 2017
From: Germany

Posts: 5
Thanks: 0

Edit: since they don't show up properly (anymore), for 2. that's

A = ((1, 2, 0), (2, 1, 0), (-1, 2, 0))
B = ((1, 2, 1), (2, 1, 0), (-1, 2, -1))
sty silver is offline  
July 23rd, 2017, 10:20 AM   #3
Senior Member
 
Joined: Dec 2012
From: Hong Kong

Posts: 853
Thanks: 311

Math Focus: Stochastic processes, statistical inference, data mining, computational linguistics
1. Are you familiar with coordinate systems? This would be much, much easier to get intuitively if you are.

The first part of b) should be relatively simple to show - separately show that rank B = rank AT and rank AT = rank SAT. The invertibility of S and T should be a big hint.
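Spelled out, the reason invertibility is the hint (a standard sandwich argument, not stated explicitly here): using $\operatorname{rank}(XY) \le \min(\operatorname{rank} X, \operatorname{rank} Y)$, for invertible $S$ we get $\operatorname{rank}(SM) \le \operatorname{rank}(M) = \operatorname{rank}(S^{-1}(SM)) \le \operatorname{rank}(SM)$, so left-multiplication by an invertible matrix preserves rank; the same works for right-multiplication by $T$. Applied twice: $\operatorname{rank}(SAT) = \operatorname{rank}(AT) = \operatorname{rank}(A)$.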

3. Induce on n. The basis step is trivial. For the inductive step, partition the matrix so that you 'solve' the system A' x = b' where A' and b' are of size n - 1. Then you can show that the last component of x is also unique, and you're done.
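The partition idea can be sketched as a recursive solver (numpy, my own naming; I determine the last component first and then recurse on the leading block, but the order in the hint works just as well):

```python
import numpy as np

def solve_upper_rec(A, b):
    """Recursively solve Ax = b for upper triangular A with nonzero
    diagonal, mirroring the induction on n: the last row alone forces
    x_n, and the remaining (n-1)-sized system is again upper triangular."""
    n = len(b)
    if n == 1:
        return np.array([b[0] / A[0, 0]])      # basis step: a_11 x_1 = b_1
    x_last = b[-1] / A[-1, -1]                 # last row reads a_nn x_n = b_n
    # substitute x_n into the first n-1 equations and recurse
    head = solve_upper_rec(A[:-1, :-1], b[:-1] - A[:-1, -1] * x_last)
    return np.append(head, x_last)

A = np.array([[1.0, 2.0, 3.0],
              [0.0, 4.0, 5.0],
              [0.0, 0.0, 6.0]])
b = np.array([6.0, 9.0, 6.0])
x = solve_upper_rec(A, b)
print(np.allclose(A @ x, b))   # True
```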
123qwerty is offline  
July 23rd, 2017, 11:44 AM   #4
Newbie
 
Joined: May 2017
From: Germany

Posts: 5
Thanks: 0

Quote:
Originally Posted by 123qwerty View Post
1. Are you familiar with coordinate systems? This would be much, much easier to get intuitively if you are.
I am, but only because I already had the second part of the lecture this semester. However, they have separate exams, so I wouldn't be allowed to use coordinate systems in the exam for this one. It has to work without them.

I'll look into the other two and see whether I can apply what you said.

Edit for OP:
in 2.

A = ((1, 2, 0), (2,1,0), (-1,2,0))
B = ((1, 2, 1), (2,1,0), (-1,2,-1))

Edit2: Yeah, the similar ⇒ same rank implication isn't hard; the converse is the problem.

Last edited by sty silver; July 23rd, 2017 at 11:58 AM.
sty silver is offline  
July 24th, 2017, 10:41 AM   #5
Senior Member
 
Joined: Dec 2012
From: Hong Kong

Posts: 853
Thanks: 311

Math Focus: Stochastic processes, statistical inference, data mining, computational linguistics
2. $p(x) = \begin{bmatrix} 1 & 2 & 1 \\
2 & 1 & 0 \\
-1 & 2 & 1 \\ \end{bmatrix}\begin{bmatrix} 1 & 0 & 0 \\
0 & 1 & 0 \\
0 & 0 & 0 \\ \end{bmatrix}\begin{bmatrix} 1 & 2 & 1 \\
2 & 1 & 0\\
-1 & 2 & -1 \end{bmatrix}^{-1}x$

I think you can fill in the rest (i.e. multiply the first two matrices).
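A quick numpy check (my own variable names) that multiplying the first two matrices indeed gives the A from the original post — note the third column of the first matrix doesn't matter, since the diagonal matrix zeroes it:

```python
import numpy as np

M = np.array([[ 1.0, 2.0,  1.0],     # first matrix in the factorisation
              [ 2.0, 1.0,  0.0],
              [-1.0, 2.0,  1.0]])
D = np.diag([1.0, 1.0, 0.0])         # kills the third column of M

A = np.array([[ 1.0, 2.0, 0.0],      # A from the original post
              [ 2.0, 1.0, 0.0],
              [-1.0, 2.0, 0.0]])

print(np.array_equal(M @ D, A))      # True
```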

Note that there is no 'the basis of V'; every vector space has an infinite number of bases.
123qwerty is offline  

Copyright © 2017 My Math Forum. All rights reserved.