Factor analysis addresses the problem of testing whether n observed random variables are conditionally independent given k hidden variables, called the factors. In the case where the joint distribution of all n + k variables is multivariate Gaussian, the parameter space Fn,k for the k-factor model is the set of (n × n)-covariance matrices of the form D + S, where D is diagonal and positive definite and S is positive semidefinite of rank at most k. An algebraic approach to factor analysis seeks to determine all polynomial relations among the matrix entries in Fn,k; these relations are called model invariants. Clearly, model invariants for Fm,k yield model invariants for Fn,k for any n ≥ m by taking principal (m × m)-submatrices. This leads to the natural question of whether, for fixed k and varying n, the model invariants are finitely generated in this manner, i.e., whether they all arise from those of Fm,k for some fixed m. I will discuss a set-theoretic version of this question, which we recently settled.
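A minimal numerical sketch of the objects above, assuming NumPy. It builds a random matrix in Fn,1 (with D diagonal positive definite and S = λλᵀ of rank 1) and evaluates a classical example of a model invariant for the one-factor model, the "tetrad" σ_ij σ_lm − σ_im σ_lj for distinct indices, which vanishes on all of Fn,1. (The tetrad is not mentioned in the abstract; it is included here only as a concrete illustration of a model invariant.)

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 5, 1

# Random parameters for a one-factor (k = 1) Gaussian model:
lam = rng.normal(size=(n, k))        # factor loadings; lam @ lam.T has rank <= k
d = rng.uniform(0.5, 2.0, size=n)    # positive entries of the diagonal part D

# A covariance matrix in F_{n,1}: Sigma = D + S with S positive semidefinite of rank <= 1
sigma = np.diag(d) + lam @ lam.T

# Classical invariant of the one-factor model (a "tetrad"): for distinct
# indices i, j, l, m the polynomial
#   sigma[i,j]*sigma[l,m] - sigma[i,m]*sigma[l,j]
# vanishes identically on F_{n,1}, since every off-diagonal entry factors
# as sigma[i,j] = lam[i]*lam[j].
tetrad = sigma[0, 1] * sigma[2, 3] - sigma[0, 3] * sigma[2, 1]
print(abs(tetrad) < 1e-10)
```

Note that the tetrad uses only a principal (4 × 4)-submatrix, illustrating how invariants of Fm,k pull back to invariants of Fn,k for every n ≥ m.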
EIDMA Seminar Combinatorial Theory