
Posts: 70
0 votes RE: Linear Algebra. Shilov

1.9 Linear Dependence between Columns

-----------------------------------------------------------
Linear Combinations : :
-----------------------------------------------------------

Suppose we have m columns containing n elements each.

A₁ = (a₁₁, a₂₁, . . . , aₙ₁),   A₂ = (a₁₂, a₂₂, . . . , aₙ₂),   . . . ,   Aₘ = (a₁ₘ, a₂ₘ, . . . , aₙₘ)

We multiply all elements of the first column by some number λ₁, all elements of the second column by λ₂, and so on.

The sum λ₁A₁ + λ₂A₂ + . . . + λₘAₘ is then a new column C with elements c₁, c₂, . . . , cₙ, where

cᵢ = λ₁aᵢ₁ + λ₂aᵢ₂ + . . . + λₘaᵢₘ   (i = 1, 2, . . . , n)

The column C is known as a linear combination of the columns A₁, A₂, . . . , Aₘ with coefficients λ₁, λ₂, . . . , λₘ.
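
A quick numpy sketch of this definition (the matrix and coefficients below are made up for illustration):

import numpy as np

# m = 3 columns A1, A2, A3 with n = 4 elements each, stored as the columns of A
A = np.array([[1., 0., 2.],
              [0., 1., 1.],
              [3., 1., 0.],
              [2., 2., 5.]])

lam = np.array([2., -1., 0.5])   # coefficients λ1, λ2, λ3

# C = λ1*A1 + λ2*A2 + λ3*A3, summed column by column...
C = sum(lam[j] * A[:, j] for j in range(A.shape[1]))

# ...which is the same thing as the matrix-vector product A @ lam
assert np.allclose(C, A @ lam)
print(C)   # [ 3.  -0.5  5.   4.5]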

Now suppose the columns making up a determinant D of order n are not chosen independently.

If one of the columns of the determinant D is a linear combination of the other columns, then D = 0.

Consider the qth column of the determinant D as a linear combination of its jth, kth, . . . , pth columns with coefficients λⱼ, λₖ, . . . , λₚ respectively.

The value of a determinant is not changed by adding the elements of one column, multiplied by an arbitrary number, to the corresponding elements of another column.

Hence, by subtracting from the qth column first the jth column multiplied by λⱼ, then the kth column multiplied by λₖ, and so on up to the pth column multiplied by λₚ, we do not change the value of the determinant D.

However, since the qth column is a linear combination of the others, these subtractions reduce it to a column of zeros, and therefore D = 0.
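
A quick numerical check of this result (the numbers are arbitrary): make the third column a linear combination of the first two, and the determinant comes out zero up to round-off.

import numpy as np

D = np.array([[1., 4., 0.],
              [2., 5., 0.],
              [3., 6., 0.]])
D[:, 2] = 2 * D[:, 0] - D[:, 1]   # third column = 2*(first) - 1*(second)

print(np.linalg.det(D))           # 0.0 up to floating-point round-off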

-----------------------------------------------------------
Rank of a Matrix A : :
-----------------------------------------------------------

Consider a matrix A with m columns and n rows.

If k columns and k rows of this matrix are held fixed, then the elements appearing at the intersections of these columns and rows form a square matrix of order k.

The determinant of this new matrix is a minor of order k of the original matrix A.
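
In numpy terms (the matrix and the row/column choices are arbitrary), np.ix_ picks out the elements at the intersections, and np.linalg.det of that submatrix is the minor:

import numpy as np

A = np.arange(1., 13.).reshape(3, 4)   # a 3 x 4 matrix

rows = [0, 2]                          # k = 2 fixed rows
cols = [1, 3]                          # k = 2 fixed columns
sub = A[np.ix_(rows, cols)]            # the 2 x 2 matrix at the intersections

minor = np.linalg.det(sub)             # a minor of order 2 of A
print(sub, minor)                      # [[2. 4.] [10. 12.]]  -16.0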

If not all aᵢₖ are zero, then we can always find an integer r with the following two properties:
(1) The matrix A has a minor of order r which does not vanish.
(2) Every minor of the matrix A of order r+1 and higher vanishes.

The number r with these properties is called the rank of the matrix A.

If all aᵢₖ vanish, then the rank of the matrix A is r=0.
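
A small illustration (matrix made up): numpy's np.linalg.matrix_rank estimates r numerically, and for a matrix this small, properties (1) and (2) can be checked directly on the minors.

import numpy as np

A = np.array([[1., 2., 3.],
              [2., 4., 6.],    # second row = 2 * first row
              [1., 0., 1.]])

print(np.linalg.matrix_rank(A))                    # 2

# property (1): some minor of order 2 does not vanish
print(np.linalg.det(A[np.ix_([0, 2], [0, 1])]))    # |1 2; 1 0| = -2
# property (2): the only minor of order 3 is det(A) itself, and it vanishes
print(np.linalg.det(A))                            # 0 up to round-off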

-----------------------------------------------------------
Basis Minor : :
-----------------------------------------------------------

A nonvanishing minor of order r is called a basis minor of the matrix A.

The columns which contain the basis minor are called the basis columns.

Any column of the matrix A is a linear combination of its basis columns (the basis minor theorem).

To prove this, consider a matrix A and assume that its basis minor is located in the first r rows and first r columns.

Let s be any integer from 1 to m and let k be any integer from 1 to n. Border the basis minor with the corresponding elements of the kth row and the sth column, forming the following determinant D of order r+1:

    | a₁₁  a₁₂  . . .  a₁ᵣ  a₁ₛ |
    | a₂₁  a₂₂  . . .  a₂ᵣ  a₂ₛ |
D = | . . . . . . . . . . . . . |
    | aᵣ₁  aᵣ₂  . . .  aᵣᵣ  aᵣₛ |
    | aₖ₁  aₖ₂  . . .  aₖᵣ  aₖₛ |

If k <= r, then the determinant is obviously zero, since it then has two identical rows.

If s <= r, then the determinant is obviously zero, since it then has two identical columns.

If k > r and s > r, then the determinant again equals zero, because it is then a minor of the matrix A of order r+1, and all such minors vanish by property (2).

Thus D = 0 in every case. Expanding the determinant D with respect to its last row then gives

aₖ₁Aₖ₁ + aₖ₂Aₖ₂ + . . . + aₖᵣAₖᵣ + aₖₛAₖₛ = 0

Since the cofactors Aₖ₁, Aₖ₂, . . . , Aₖᵣ, Aₖₛ are formed from elements aᵢⱼ with i <= r, they do not depend on the index k. We can therefore write

Aₖ₁ = c₁, Aₖ₂ = c₂, ..., Aₖᵣ = cᵣ, Aₖₛ = cₛ

Since k can be any integer from 1 to n, this gives the system

c₁a₁₁ + c₂a₁₂ + . . . + cᵣa₁ᵣ + cₛa₁ₛ = 0
c₁a₂₁ + c₂a₂₂ + . . . + cᵣa₂ᵣ + cₛa₂ₛ = 0
     .        .          .        .
c₁aₙ₁ + c₂aₙ₂ + . . . + cᵣaₙᵣ + cₛaₙₛ = 0


Here cₛ = Aₖₛ is the following cofactor:

           | a₁₁  a₁₂  . . .  a₁ᵣ |
cₛ = Aₖₛ = | a₂₁  a₂₂  . . .  a₂ᵣ |
           | . . . . . . . . . . |
           | aᵣ₁  aᵣ₂  . . .  aᵣᵣ |

Hence cₛ = Aₖₛ is just the basis minor of the matrix A, and it does not vanish.

Divide each equation by cₛ,

(c₁/cₛ)a₁₁ + (c₂/cₛ)a₁₂ + . . . + (cᵣ/cₛ)a₁ᵣ + (cₛ/cₛ)a₁ₛ = 0
(c₁/cₛ)a₂₁ + (c₂/cₛ)a₂₂ + . . . + (cᵣ/cₛ)a₂ᵣ + (cₛ/cₛ)a₂ₛ = 0
       .           .            .            .           .           .
(c₁/cₛ)aₙ₁ + (c₂/cₛ)aₙ₂ + . . . + (cᵣ/cₛ)aₙᵣ + (cₛ/cₛ)aₙₛ = 0

We now set λⱼ = -cⱼ/cₛ (j = 1, 2, . . . , r) and move all terms but the last to the right side of each equation (the minus sign from the transposition is absorbed into λⱼ):

a₁ₛ = λ₁a₁₁ + λ₂a₁₂ + . . . + λᵣa₁ᵣ
a₂ₛ = λ₁a₂₁ + λ₂a₂₂ + . . . + λᵣa₂ᵣ
          .          .          .
aₙₛ = λ₁aₙ₁ + λ₂aₙ₂ + . . . + λᵣaₙᵣ

This shows that the sth column of the matrix A is a linear combination of its first r columns, i.e., of its basis columns, which proves the basis minor theorem.
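
A numerical sketch of this conclusion (matrix invented for the example): with the basis minor sitting in the first r rows and columns, the coefficients λ₁, . . . , λᵣ for any non-basis column can be recovered by least squares against the basis columns.

import numpy as np

# a rank-2 matrix whose third column = 1*(first column) + 2*(second column)
A = np.array([[1., 0., 1.],
              [0., 1., 2.],
              [2., 1., 4.],
              [1., 3., 7.]])

r = 2                              # rank; basis minor in the top-left corner
s = 2                              # index of a non-basis column
lam, *_ = np.linalg.lstsq(A[:, :r], A[:, s], rcond=None)
print(lam)                         # ~ [1. 2.]
assert np.allclose(A[:, :r] @ lam, A[:, s])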

-----------------------------------------------------------
Determinants that Vanish : :
-----------------------------------------------------------

If the determinant D vanishes, then it has at least one column which is a linear combination of the other columns.

Since D = 0, the basis minor of its matrix must be of some order r < n.

After specifying the r basis columns, we can find at least one column that is not a basis column (since r < n).

By the basis minor theorem this column is a linear combination of the basis columns.
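
One way to exhibit such a column numerically (example matrix made up): a null-space vector of the singular matrix supplies the coefficients of the dependence, and any column with a nonzero coefficient is a linear combination of the others.

import numpy as np

D = np.array([[1., 2., 3.],
              [4., 5., 9.],
              [7., 8., 15.]])   # third column = first + second, so det D = 0

# the right-singular vector of the smallest singular value spans the null space
_, sigma, Vt = np.linalg.svd(D)
x = Vt[-1]                      # D @ x ≈ 0
print(sigma[-1], x)             # smallest singular value ~ 0; x proportional to (1, 1, -1)

# any column j with x[j] != 0 can be solved for in terms of the others
assert np.allclose(D @ x, 0, atol=1e-9)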

-----------------------------------------------------------
Linear Dependence : :
-----------------------------------------------------------

If the coefficients λ₁, λ₂, . . . , λₘ of a linear combination of m columns are all zero, then the linear combination is obviously the zero column.

If, however, there exist coefficients λ₁, λ₂, . . . , λₘ, not all zero, for which the linear combination λ₁A₁ + λ₂A₂ + . . . + λₘAₘ is the zero column, we say that the columns
A₁, A₂, . . . , Aₘ are linearly dependent.

Hence, the determinant D vanishes if and only if there is a linear dependence between its columns.

Posts: 2377
0 votes RE: Linear Algebra. Shilov

"Hence, the determinant D vanishes if and only if there is a linear dependence between its columns." columns or rows

 

Now, how to work the problem backwards? Given a singular n x n square matrix, which columns or rows are linearly dependent?

How do I find the offending columns or rows? Once I find them, I can just add a small white-noise error and, BOOOMMMM, the solution is satisfactory. After all, this is real data from the real world. A small error means nothing.

Posts: 70
0 votes RE: Linear Algebra. Shilov
LiYang said: 

"Hence, the determinant D vanishes if and only if there is a linear dependence between its columns." columns or rows

Earlier in the text, and in these notes, it has been stated that everything said about columns is equally true of rows.

Now, how to work the problem backwards? Given a singular n x n square matrix, which columns or rows are linearly dependent?

How do I find the offending columns or rows? Once I find them, I can just add a small white-noise error and, BOOOMMMM, the solution is satisfactory. After all, this is real data from the real world. A small error means nothing.

We can generalize "Hence, the determinant D vanishes if and only if there is a linear dependence between its columns or rows" to the statement: vectors x1, x2, . . . , xk are linearly dependent if and only if one of the vectors can be expressed as a linear combination of the others.

As such, identifying the offending columns or rows means identifying those columns or rows that are linear combinations of the others. I am unaware of special tricks for doing this, but going by what has been stated in this thread, finding a basis minor would identify them: every column or row outside the basis columns and rows is a linear combination of them.

Interestingly, the above process is effectively finding the rows and columns that are linearly independent. Hence, to be a maximal linearly independent set is to be a basis, I assume.
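
Not from Shilov, but a sketch of one standard numerical approach to your question, assuming scipy is available: QR factorization with column pivoting orders the columns so that the first r pivots play the role of basis columns, and the remaining pivot indices point at the (numerically) dependent ones. For a 1000 x 1000 matrix this is far cheaper than inspecting minors.

import numpy as np
from scipy.linalg import qr

rng = np.random.default_rng(0)
n = 1000
M = rng.standard_normal((n, n))
M[:, 7] = 2 * M[:, 3] - M[:, 50]   # plant one dependent column

Q, R, piv = qr(M, pivoting=True)   # column-pivoted QR
tol = n * np.finfo(float).eps * abs(R[0, 0])
r = int(np.sum(np.abs(np.diag(R)) > tol))   # numerical rank, here n - 1

# piv[:r] are the 'basis' columns; piv[r:] are dependent on them.
# The pivoting decides which member of the dependent group {3, 7, 50}
# gets reported, so the 'offending' column is not unique.
print(r, "independent columns; dependent:", sorted(piv[r:]))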

Posts: 2377
0 votes RE: Linear Algebra. Shilov

Yeah, thanks for looking; my problem is not trivial. n x n is on the order of 1000 x 1000, so it is difficult to find the dependencies. Adding white noise to the whole matrix seems to help but pisses me off, being such a caveman method.
