My Math Forum  

December 12th, 2014, 07:19 PM   #1
Joined: Dec 2014
From: Ohio

Posts: 1
Thanks: 0

Relationship between singular matrices and linear dependency?

I know that when a matrix is singular, it has no inverse and its determinant is 0.
How does that relate to linear dependence?
How can I prove this to myself?
And is it the columns or the rows that are linearly dependent?

Last edited by skipjack; December 12th, 2014 at 09:07 PM.
vyoung831
December 12th, 2014, 09:20 PM   #2
Global Moderator
Joined: Dec 2006

Posts: 18,235
Thanks: 1437

For a simple example, consider the 2×2 matrix

[ a  b ]
[ c  d ]

where a is non-zero.

If the determinant is zero, then ad − bc = 0, i.e. ad = bc, so the second row is c/a times the first row and the second column is b/a times the first column. In other words, a zero determinant forces the rows to be linearly dependent, and likewise the columns.
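A quick numerical check of the example above (a sketch using NumPy; the particular matrix entries are chosen for illustration so that ad = bc):

```python
import numpy as np

# Illustrative 2x2 matrix with ad = bc (2*6 == 3*4), hence singular.
A = np.array([[2.0, 3.0],
              [4.0, 6.0]])

# Determinant is zero (up to floating-point rounding), so A has no inverse.
print(np.linalg.det(A))

# Rank 1 < 2 confirms the rows (and columns) are linearly dependent.
print(np.linalg.matrix_rank(A))

# The second row is c/a = 4/2 = 2 times the first row,
# and the second column is b/a = 3/2 times the first column.
print(np.allclose(A[1], (A[1, 0] / A[0, 0]) * A[0]))
print(np.allclose(A[:, 1], (A[0, 1] / A[0, 0]) * A[:, 0]))
```

Note that row dependence and column dependence go together here: for a square matrix, row rank equals column rank, so if one set is dependent, so is the other.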
skipjack





Copyright © 2017 My Math Forum. All rights reserved.