Abstract
The aim of this paper is to propose approximate numerical models and methods for solving (n × n) systems of linear equations AX = b in which the parameters of the matrix A and the vector b are random and correlated. The solution X is expressed in statistical form by a mean vector E[X] and a variance-covariance matrix K[XX^T]. Additionally, covariances between A and X can be estimated. The modeling approach is based on a first-order Taylor series expansion; taking expectations of the approximated results then yields linear statistical matrix equations for E[X] and K[XX^T]. The complexity of the approach is shown to be O(n^4). In the special case of deterministic A and random b, the complexity reduces to O(n^3). A numerical example is presented and solved with sample Matlab code. The code is then extended to solve a set of randomly generated problems of various sizes, and the computational results are reported and discussed. Finally, the approach is preliminarily extended to handle higher-order correlations in the case of deterministic A and random b.
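To make the special case of deterministic A and random b concrete, the sketch below (in Python rather than the paper's Matlab, with all variable names and data chosen for illustration) computes the statistical solution E[X] = A^{-1} E[b] and K[XX^T] = A^{-1} K_b A^{-T}, and checks it against Monte Carlo sampling. In this case the map b → X is linear, so the first-order result is exact.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative 2x2 problem: deterministic A, random correlated b.
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])
mu_b = np.array([1.0, 2.0])                # E[b]
K_b = np.array([[0.5, 0.1],
                [0.1, 0.2]])               # covariance of b

# Statistical solution: E[X] and Cov[X] via the inverse of A.
A_inv = np.linalg.inv(A)
EX = A_inv @ mu_b                          # mean vector E[X]
KX = A_inv @ K_b @ A_inv.T                 # variance-covariance matrix of X

# Monte Carlo verification: sample b, solve AX = b for each sample.
b_samples = rng.multivariate_normal(mu_b, K_b, size=200_000)
X_samples = np.linalg.solve(A, b_samples.T).T

print("E[X] (analytic):", EX)
print("E[X] (Monte Carlo):", X_samples.mean(axis=0))
```

Solving the system costs O(n^3) once, and the two matrix products forming KX are also O(n^3), matching the reduced complexity stated above for deterministic A.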