Matrix decomposition, also called matrix factorization, is the process of splitting a matrix into constituent parts. There are several methods for matrix decomposition; in machine learning, the Singular Value Decomposition, or SVD, is one of the most frequently used due to its simplicity.
Mathematically, SVD can be described as:
Let A be an m × n matrix with singular values s1 ≥ s2 ≥ … ≥ sn ≥ 0, and let r be the number of nonzero singular values of A, or equivalently the rank of A.
Then a singular value decomposition of A is the factorization A = UΣVᵀ, where:
U is an m × m orthogonal matrix, whose columns are called the left-singular vectors of A,
Σ is an m × n matrix with σi,i = si for i = 1, …, r and σi,j = 0 in all other cases, whose diagonal entries are known as the singular values of A; and
V is an n × n orthogonal matrix, whose columns are called the right-singular vectors of A.
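As an illustration, the factorization above can be computed numerically with NumPy's `numpy.linalg.svd`; the 3 × 2 matrix used here is a hypothetical example, not from the original text:

```python
import numpy as np

# A hypothetical 3 x 2 matrix (m = 3, n = 2)
A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])

# full_matrices=True returns U as m x m and Vt as n x n
U, s, Vt = np.linalg.svd(A, full_matrices=True)

# Rebuild the m x n Sigma matrix with the singular values on its diagonal
Sigma = np.zeros(A.shape)
Sigma[:len(s), :len(s)] = np.diag(s)

# U @ Sigma @ Vt reconstructs the original matrix
print(np.allclose(A, U @ Sigma @ Vt))  # True
```

Note that NumPy returns the singular values as a vector `s`, already sorted in decreasing order, rather than as the full Σ matrix.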
The SVD is calculated through iterative numerical methods, and it is one of the most widely used decomposition methods because all matrices have an SVD.
SVD has a wide range of applications, such as least-squares linear regression, compression, denoising, and data reduction. It is also applied in fundamental matrix algebra calculations, such as finding the inverse of a matrix.
In machine learning, it is used for dimensionality reduction: a dataset with many features (columns) is projected onto a smaller set of derived features that retain most of the information relevant to the problem.
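A minimal sketch of this idea, using a truncated SVD to project hypothetical data onto its top k components (the data and the choice k = 2 are illustrative assumptions, not from the original text):

```python
import numpy as np

# Hypothetical data: 5 observations, 10 features
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 10))

# Compact SVD: U is 5 x 5, s has 5 values, Vt is 5 x 10
U, s, Vt = np.linalg.svd(X, full_matrices=False)

# Keep only the k largest singular values and their vectors
k = 2
X_reduced = U[:, :k] * s[:k]  # each row is an observation in k dimensions

print(X_reduced.shape)  # (5, 2)
```

Because the singular values are sorted in decreasing order, keeping the first k columns retains the directions of greatest variance in the data.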
It is also used to calculate a generalized inverse when a matrix is not square, that is, when the number of rows and columns differ. This generalized inverse was developed independently by E. H. Moore and Roger Penrose and is called the pseudoinverse or Moore-Penrose inverse.
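The pseudoinverse can be built directly from the SVD by inverting the nonzero singular values; the sketch below checks this against NumPy's built-in `numpy.linalg.pinv` on a hypothetical non-square matrix:

```python
import numpy as np

# Hypothetical non-square matrix: 3 rows, 2 columns
A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])

# Pseudoinverse from the SVD: A+ = V Sigma+ U^T,
# where Sigma+ holds the reciprocals of the nonzero singular values
U, s, Vt = np.linalg.svd(A, full_matrices=False)
A_pinv = Vt.T @ np.diag(1.0 / s) @ U.T

# NumPy's pinv uses the same construction internally
print(np.allclose(A_pinv, np.linalg.pinv(A)))  # True
```

This sketch assumes all singular values are nonzero (A has full column rank); a robust implementation, like `pinv` itself, discards singular values below a tolerance before inverting.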
Many of the methods included in LogicPlum’s platform are based on matrix operations. Therefore, SVD is frequently employed by the platform to simplify tasks and perform calculations.
The advantage for the users is that the system automatically handles all these operations, reducing their need for advanced mathematical knowledge.