The multivariate normal distribution is a generalization of the univariate (one-dimensional) normal distribution to higher dimensions. It describes a probability distribution over vectors in a d-dimensional space, where d is a positive integer.
A random vector X = (X1, X2, …, Xd) follows a multivariate normal distribution if its density function can be written as:
f(x) = (1 / sqrt((2π)^d * |Σ|)) * exp(-0.5 * (x - μ)' * Σ^-1 * (x - μ))
where:
* x is a column vector representing a specific point in the d-dimensional space;
* μ is a column vector of means, one for each dimension;
* Σ is the covariance matrix, which encodes the variances of, and covariances among, the different dimensions;
* |Σ| denotes the determinant of the covariance matrix;
* ' denotes the transpose operator;
* exp() is the exponential function.
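As a concrete illustration, the density formula above can be evaluated directly with NumPy. This is a minimal sketch; the helper name `mvn_pdf` and the example values for μ, Σ, and x are assumptions chosen for demonstration.

```python
import numpy as np

def mvn_pdf(x, mu, sigma):
    """Evaluate the multivariate normal density at the point x.

    Implements f(x) = exp(-0.5 * (x - mu)' Σ^-1 (x - mu)) / sqrt((2π)^d |Σ|).
    """
    d = len(mu)
    diff = x - mu
    norm_const = 1.0 / np.sqrt((2 * np.pi) ** d * np.linalg.det(sigma))
    exponent = -0.5 * diff @ np.linalg.inv(sigma) @ diff
    return norm_const * np.exp(exponent)

# Example values (assumptions, for illustration only)
mu = np.array([0.0, 0.0])
sigma = np.array([[1.0, 0.5],
                  [0.5, 2.0]])
x = np.array([1.0, -1.0])

print(mvn_pdf(x, mu, sigma))
```

Note that the density is largest at x = μ, where the exponent is zero and f(μ) reduces to the normalizing constant 1 / sqrt((2π)^d |Σ|).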
The multivariate normal distribution is completely specified by its mean vector μ and covariance matrix Σ. Intuitively, the mean vector gives the center of mass of the distribution, while the covariance matrix captures the shape and orientation of the ellipsoidal contours of equal density. In particular, the eigenvectors of the covariance matrix determine the directions of the principal axes of these ellipsoids, while the corresponding eigenvalues give the variances along those axes, so the axis lengths are proportional to the square roots of the eigenvalues.
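The eigendecomposition described above can be computed with `np.linalg.eigh`, which is the standard routine for symmetric matrices such as covariance matrices. The example covariance matrix here is an assumption chosen so the eigenstructure is easy to verify by hand.

```python
import numpy as np

# Covariance matrix of a 2-D normal distribution (illustrative values)
sigma = np.array([[2.0, 1.0],
                  [1.0, 2.0]])

# eigh returns eigenvalues in ascending order for a symmetric matrix
eigvals, eigvecs = np.linalg.eigh(sigma)

# Columns of eigvecs are the directions of the principal axes of the
# elliptical density contours; the semi-axis lengths (at one standard
# deviation) are proportional to the square roots of the eigenvalues.
axis_lengths = np.sqrt(eigvals)

print(eigvals)        # variances along the principal axes
print(axis_lengths)   # proportional semi-axis lengths
```

For this Σ the eigenvalues are 1 and 3, with the principal axes along the directions (1, -1) and (1, 1): the contours are ellipses stretched along the diagonal, reflecting the positive correlation between the two dimensions.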
One important property of the multivariate normal distribution is that linear transformations preserve normality. That is, if X follows a multivariate normal distribution and A is a constant matrix, then the transformed random vector Y = AX also follows a multivariate normal distribution, with mean vector μ_Y = Aμ and covariance matrix Σ_Y = AΣA'. This property makes the multivariate normal distribution particularly useful in many areas of statistics, including hypothesis testing, regression analysis, and machine learning.