
Jumping steps in matrices and linear equations

leehuan

Well-Known Member
Joined
May 31, 2014
Messages
5,805
Gender
Male
HSC
2015
Suppose a homogeneous system of linear equations Ax = 0 is represented as the augmented matrix [A|0].

But then suppose that the matrix of coefficients A is a square matrix.
(This implies that the number of variables we had at the start is the same as the number of equations we were given.)

Would it be safe to say that x = 0 is the only solution to this system?
 

InteGrand

Well-Known Member
Joined
Dec 11, 2014
Messages
6,109
Gender
Male
HSC
N/A
Suppose a homogeneous system of linear equations Ax = 0 is represented as the augmented matrix [A|0].

But then suppose that the matrix of coefficients A is a square matrix.

Would it be safe to say that x = 0 is the only solution to this system?
No.

E.g. If the system was:

x + y = 0

2x + 2y = 0

(So the matrix is:

[1 1]
[2 2] )

there are clearly infinitely many solutions.
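
(A quick computational check, my own addition rather than part of the post: assuming SymPy is available, the null space of that coefficient matrix is one-dimensional, so the homogeneous system has a whole line of solutions.)

from sympy import Matrix

# Coefficient matrix of the system x + y = 0, 2x + 2y = 0
A = Matrix([[1, 1],
            [2, 2]])

print(A.det())         # 0, so A is not invertible
print(A.nullspace())   # [Matrix([[-1], [1]])]: every (x, y) = t(-1, 1) solves Ax = 0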
 

leehuan

Well-Known Member
Joined
May 31, 2014
Messages
5,805
Gender
Male
HSC
2015
Oh of course.

In that case, what if

(a_11, a_12, a_13, ..., a_1n) ≠ λ(a_m1, a_m2, a_m3, ..., a_mn) for any scalar λ

(sorry for the laziness)
 

seanieg89

Well-Known Member
Joined
Aug 8, 2006
Messages
2,662
Gender
Male
HSC
2007
Still untrue. Given a square matrix A, there is a unique solution to Ax=y IFF A is invertible. (Prove this!)

A matrix can fail to be invertible even if no two of its rows are scalar multiples of each other.

Eg/

 
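(A small NumPy sketch of the invertibility criterion above; this is my own illustration, assuming NumPy is available, and not the example seanieg89 posted. With an invertible coefficient matrix Ax = y has exactly one solution; with a singular one it does not.)

import numpy as np

A_invertible = np.array([[1.0, 2.0],
                         [3.0, 4.0]])   # det = -2, so invertible
A_singular   = np.array([[1.0, 1.0],
                         [2.0, 2.0]])   # det = 0, so not invertible
y = np.array([1.0, 0.0])

print(np.linalg.solve(A_invertible, y))  # exactly one solution: [-2.   1.5]
try:
    np.linalg.solve(A_singular, y)       # no unique solution exists
except np.linalg.LinAlgError as err:
    print("Singular coefficient matrix:", err)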
Last edited:

InteGrand

Well-Known Member
Joined
Dec 11, 2014
Messages
6,109
Gender
Male
HSC
N/A
Oh of course.

In that case, what if

(a_11, a_12, a_13, ..., a_1n) ≠ λ(a_m1, a_m2, a_m3, ..., a_mn) for any scalar λ

(sorry for the laziness)
In general, what we need (and what is sufficient) to conclude that 0 is the only solution is that the columns of the matrix are what is called linearly independent (in the case of a square matrix, we can also replace the word 'columns' above with 'rows').

Basically, this means that all the equations of our square system need to be truly 'different': none of them should be obtainable from the others using linear combinations (i.e. by adding the others together, or adding scalar multiples of them).

E.g. Consider this homogeneous system:

x + y + z = 0

2x + y + z = 0

3x + 2y + 2z = 0.

Here, equation (row) 3 is simply the sum of the previous two. It turns out that this implies there will be infinitely many solutions, even though no row is a multiple of another: it is linear combinations, rather than multiples, that matter. (In the case of two rows, 'linear combination' becomes equivalent to 'multiple'; that is not the case for larger numbers of rows.)

The last equation is essentially not a 'different' equation. It is 'redundant', as we can 'get' it from the previous two equations (and by 'get', we mean it is a linear combination of the others). So this system is really like having just two equations, and it is not surprising that there would be infinitely many solutions. This idea generalises to higher numbers of rows too.
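
(Again my own check, assuming SymPy: the rank of this 3×3 matrix is 2, so there are only two genuinely independent equations and the null space is non-trivial.)

from sympy import Matrix

# Rows of the homogeneous system above; row 3 = row 1 + row 2
A = Matrix([[1, 1, 1],
            [2, 1, 1],
            [3, 2, 2]])

print(A.rank())        # 2: only two genuinely different equations
print(A.nullspace())   # [Matrix([[0], [-1], [1]])]: (x, y, z) = t(0, -1, 1) for any t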

Edit: answered above.
 
Last edited:

leehuan

Well-Known Member
Joined
May 31, 2014
Messages
5,805
Gender
Male
HSC
2015
Ah ok thanks. Haven't been taught this yet.
 
