Entry | Rank factorization |
Definition |
Given an $m \times n$ matrix $A$ of rank $r$, a rank decomposition or rank factorization of $A$ is a factorization $A = CF$, where $C$ is an $m \times r$ matrix and $F$ is an $r \times n$ matrix.

Existence

Every finite-dimensional matrix has a rank decomposition: Let $A$ be an $m \times n$ matrix whose column rank is $r$. Therefore, there are $r$ linearly independent columns in $A$; equivalently, the dimension of the column space of $A$ is $r$. Let $c_1, c_2, \ldots, c_r$ be any basis for the column space of $A$ and place them as column vectors to form the $m \times r$ matrix $C = \begin{bmatrix} c_1 & c_2 & \cdots & c_r \end{bmatrix}$. Therefore, every column vector of $A$ is a linear combination of the columns of $C$. To be precise, if $A = \begin{bmatrix} a_1 & a_2 & \cdots & a_n \end{bmatrix}$ is an $m \times n$ matrix with $a_j$ as the $j$-th column, then

$$a_j = f_{1j} c_1 + f_{2j} c_2 + \cdots + f_{rj} c_r,$$

where the $f_{ij}$ are the scalar coefficients of $a_j$ in terms of the basis $c_1, c_2, \ldots, c_r$. This implies that $A = CF$, where $f_{ij}$ is the $(i, j)$-th element of $F$.

Non-uniqueness

If $A = C_1 F_1$ is a rank factorization, then taking $C_2 = C_1 R$ and $F_2 = R^{-1} F_1$ gives another rank factorization for any invertible matrix $R$ of compatible dimensions. Conversely, if $A = C_1 F_1 = C_2 F_2$ are two rank factorizations of $A$, then there exists an invertible matrix $R$ such that $C_1 = C_2 R$ and $F_1 = R^{-1} F_2$.[1]

Construction

Rank factorization from row echelon forms

In practice, we can construct one specific rank factorization as follows: compute $B$, the reduced row echelon form of $A$. Then $C$ is obtained by removing from $A$ all non-pivot columns, and $F$ by eliminating all zero rows of $B$.

Example

Consider the matrix

$$A = \begin{bmatrix} 1 & 3 & 1 & 4 \\ 2 & 7 & 3 & 9 \\ 1 & 5 & 3 & 1 \\ 1 & 2 & 0 & 8 \end{bmatrix} \sim \begin{bmatrix} 1 & 0 & -2 & 0 \\ 0 & 1 & 1 & 0 \\ 0 & 0 & 0 & 1 \\ 0 & 0 & 0 & 0 \end{bmatrix} = B.$$

$B$ is in reduced echelon form. Then $C$ is obtained by removing the third column of $A$, the only one which is not a pivot column, and $F$ by getting rid of the last row of zeroes of $B$, so

$$C = \begin{bmatrix} 1 & 3 & 4 \\ 2 & 7 & 9 \\ 1 & 5 & 1 \\ 1 & 2 & 8 \end{bmatrix}, \qquad F = \begin{bmatrix} 1 & 0 & -2 & 0 \\ 0 & 1 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}.$$

It is straightforward to check that $A = CF$.

Proof

Let $P$ be an $n \times n$ permutation matrix such that $AP = \begin{pmatrix} C & D \end{pmatrix}$ in block partitioned form, where the columns of $C$ are the $r$ pivot columns of $A$. Every column of $D$ is a linear combination of the columns of $C$, so there is a matrix $G$ such that $D = CG$, where the columns of $G$ contain the coefficients of each of those linear combinations. So $AP = \begin{pmatrix} C & CG \end{pmatrix} = C \begin{pmatrix} I_r & G \end{pmatrix}$, $I_r$ being the $r \times r$ identity matrix. We will show now that $\begin{pmatrix} I_r & G \end{pmatrix} = FP$.

Transforming $A$ into its reduced row echelon form $B$ amounts to left-multiplying by a matrix $E$ which is a product of elementary matrices, so $EAP = BP = EC \begin{pmatrix} I_r & G \end{pmatrix}$, where $EC = \begin{pmatrix} I_r \\ 0 \end{pmatrix}$. We then can write $BP = \begin{pmatrix} I_r & G \\ 0 & 0 \end{pmatrix}$, which allows us to identify $\begin{pmatrix} I_r & G \end{pmatrix} = FP$, i.e.
the nonzero $r$ rows of the reduced echelon form, with the same permutation on the columns as we did for $A$. We thus have $AP = C \begin{pmatrix} I_r & G \end{pmatrix} = CFP$, and since $P$ is invertible this implies $A = CF$, and the proof is complete.

Singular value decomposition

One can also construct a full rank factorization of $A$ by using its singular value decomposition

$$A = U \Sigma V^* = \begin{bmatrix} U_1 & U_2 \end{bmatrix} \begin{bmatrix} \Sigma_r & 0 \\ 0 & 0 \end{bmatrix} \begin{bmatrix} V_1^* \\ V_2^* \end{bmatrix} = U_1 \left( \Sigma_r V_1^* \right).$$

Since $U_1$ is a full column rank matrix and $\Sigma_r V_1^*$ is a full row rank matrix, we can take $C = U_1$ and $F = \Sigma_r V_1^*$.

Consequences

rank(A) = rank(A^T)

An immediate consequence of rank factorization is that the rank of $A$ is equal to the rank of its transpose $A^\mathsf{T}$. Since the columns of $A$ are the rows of $A^\mathsf{T}$, the column rank of $A$ equals its row rank.

Proof: To see why this is true, let us first define rank to mean column rank. Since $A = CF$, it follows that $A^\mathsf{T} = F^\mathsf{T} C^\mathsf{T}$. From the definition of matrix multiplication, this means that each column of $A^\mathsf{T}$ is a linear combination of the columns of $F^\mathsf{T}$. Therefore, the column space of $A^\mathsf{T}$ is contained within the column space of $F^\mathsf{T}$ and, hence, $\operatorname{rank}(A^\mathsf{T}) \le \operatorname{rank}(F^\mathsf{T})$. Now, $F^\mathsf{T}$ is $n \times r$, so there are $r$ columns in $F^\mathsf{T}$ and, hence, $\operatorname{rank}(A^\mathsf{T}) \le r = \operatorname{rank}(A)$. This proves that $\operatorname{rank}(A^\mathsf{T}) \le \operatorname{rank}(A)$. Now apply this result to $A^\mathsf{T}$ to obtain the reverse inequality: since $(A^\mathsf{T})^\mathsf{T} = A$, we can write $\operatorname{rank}(A) = \operatorname{rank}((A^\mathsf{T})^\mathsf{T}) \le \operatorname{rank}(A^\mathsf{T})$. We have, therefore, proved both $\operatorname{rank}(A^\mathsf{T}) \le \operatorname{rank}(A)$ and $\operatorname{rank}(A) \le \operatorname{rank}(A^\mathsf{T})$, so $\operatorname{rank}(A) = \operatorname{rank}(A^\mathsf{T})$. (See also the first proof of column rank = row rank under rank.)

Notes

1. Piziak, R.; Odell, P. L. (1 June 1999). "Full Rank Factorization of Matrices". Mathematics Magazine. 72 (3): 193. doi:10.2307/2690882.
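Returning to the worked example, the echelon-form construction can be sketched in Python with SymPy (a sketch, not part of the original article; `Matrix.rref` returns the reduced row echelon form together with the tuple of pivot column indices):

```python
import sympy as sp

# The matrix A from the worked example above.
A = sp.Matrix([[1, 3, 1, 4],
               [2, 7, 3, 9],
               [1, 5, 3, 1],
               [1, 2, 0, 8]])

B, pivots = A.rref()        # reduced row echelon form B and pivot column indices
C = A[:, list(pivots)]      # C: keep only the pivot columns of A
F = B[:len(pivots), :]      # F: keep only the nonzero rows of B

assert pivots == (0, 1, 3)  # the third column (index 2) is the only non-pivot column
assert C * F == A           # A = CF is a rank factorization
```

Here `len(pivots)` is the rank $r = 3$, so $C$ is $4 \times 3$ and $F$ is $3 \times 4$, matching the matrices displayed in the example.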
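The non-uniqueness statement can also be checked numerically. In the sketch below (not from the original article), `R` is an arbitrary invertible $3 \times 3$ matrix chosen for illustration:

```python
import numpy as np

# The same A as in the worked example, with the rank factorization
# obtained from its reduced row echelon form.
A = np.array([[1., 3., 1., 4.],
              [2., 7., 3., 9.],
              [1., 5., 3., 1.],
              [1., 2., 0., 8.]])
C1 = A[:, [0, 1, 3]]                 # pivot columns of A
F1 = np.array([[1., 0., -2., 0.],
               [0., 1.,  1., 0.],
               [0., 0.,  0., 1.]])   # nonzero rows of rref(A)
assert np.allclose(C1 @ F1, A)

# Any invertible r x r matrix R yields another rank factorization:
# C2 = C1 R, F2 = R^{-1} F1.
R = np.array([[2., 0., 1.],
              [0., 1., 0.],
              [1., 0., 1.]])         # det(R) = 1, so R is invertible
C2, F2 = C1 @ R, np.linalg.inv(R) @ F1
assert np.allclose(C2 @ F2, A)       # a second, different rank factorization of A
```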
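The SVD construction can be sketched with NumPy. The tolerance used to decide the numerical rank below is an implementation choice, not part of the theorem:

```python
import numpy as np

A = np.array([[1., 3., 1., 4.],
              [2., 7., 3., 9.],
              [1., 5., 3., 1.],
              [1., 2., 0., 8.]])

U, s, Vt = np.linalg.svd(A)
r = int(np.sum(s > 1e-10 * s[0]))   # numerical rank: non-negligible singular values
C = U[:, :r]                        # U_1: full column rank, m x r
F = s[:r, None] * Vt[:r, :]         # Sigma_r V_1^*: full row rank, r x n

assert r == 3
assert np.allclose(C @ F, A)        # A = U_1 (Sigma_r V_1^*)
```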
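Finally, the consequence rank(A) = rank(A^T) can be observed directly on the example matrix (a quick numerical check, assuming NumPy):

```python
import numpy as np

A = np.array([[1., 3., 1., 4.],
              [2., 7., 3., 9.],
              [1., 5., 3., 1.],
              [1., 2., 0., 8.]])

# Column rank of A equals column rank of A^T, i.e. the row rank of A.
assert np.linalg.matrix_rank(A) == np.linalg.matrix_rank(A.T) == 3
```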
Categories: Matrix decompositions | Linear algebra |