A self-correcting matrix iteration for the Moore-Penrose generalized inverse

William H. Pierce
1996 Linear Algebra and its Applications  
Let X_k be the k-th iterate of an algorithm for finding the Moore-Penrose generalized inverse of a singular matrix A. The popular algorithm X_{k+1} = X_k(2I - AX_k) has a first-order error component, but the more complicated algorithm given here has no first-order error component for general A and X_k.
doi:10.1016/0024-3795(94)00306-8 fatcat:7zwsuee63bcxrpzdj327wu75ou
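The abstract references the classical hyperpower (Schulz/Newton) iteration X_{k+1} = X_k(2I - AX_k). Below is a minimal NumPy sketch of that standard iteration only; the paper's improved self-correcting variant is not reproduced here, and the starting guess X_0 = alpha * A^T with a conservative alpha bound is an assumption of this sketch, not something stated in the record.

```python
import numpy as np

def schulz_pinv(A, tol=1e-12, max_iter=200):
    """Schulz/hyperpower iteration X_{k+1} = X_k (2I - A X_k).

    Converges to the Moore-Penrose inverse pinv(A) when started from
    X_0 = alpha * A^T with 0 < alpha < 2 / sigma_max(A)^2.
    (This is the classical iteration the abstract calls "popular",
    not the paper's self-correcting algorithm.)
    """
    A = np.asarray(A, dtype=float)
    m = A.shape[0]
    # Safe scaling: sigma_max(A)^2 <= ||A||_1 * ||A||_inf, so this alpha
    # lies inside the convergence interval for any nonzero A.
    alpha = 1.0 / (np.linalg.norm(A, 1) * np.linalg.norm(A, np.inf))
    X = alpha * A.T                      # X has shape (n, m)
    I = np.eye(m)
    for _ in range(max_iter):
        X_next = X @ (2 * I - A @ X)     # the iteration from the abstract
        if np.linalg.norm(X_next - X) <= tol * max(np.linalg.norm(X), 1.0):
            return X_next
        X = X_next
    return X

if __name__ == "__main__":
    # A square singular matrix, matching the abstract's setting.
    A = np.diag([2.0, 1.0, 0.0])
    X = schulz_pinv(A)
    print(np.allclose(X, np.linalg.pinv(A), atol=1e-8))  # expected: True
```

As a design note, the iteration is second-order convergent once the error is small, but (as the abstract points out) a perturbation of X_k introduces a first-order error term that this basic scheme does not remove; that deficiency is what the paper's algorithm addresses.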