sync with sklearn version
rc committed Jan 28, 2019
1 parent 0773048 commit 9d909e2
Showing 1 changed file with 18 additions and 11 deletions.
29 changes: 18 additions & 11 deletions scipy/sparse/linalg/eigen/lobpcg/lobpcg.py
@@ -1,13 +1,20 @@
"""
Pure SciPy implementation of Locally Optimal Block Preconditioned Conjugate
Gradient Method (LOBPCG), see
https://bitbucket.org/joseroman/blopex
License: BSD
Authors: Robert Cimrman, Andrew Knyazev
Examples in tests directory contributed by Nils Wagner.
Locally Optimal Block Preconditioned Conjugate Gradient Method (LOBPCG).
References
----------
.. [1] A. V. Knyazev (2001),
Toward the Optimal Preconditioned Eigensolver: Locally Optimal
Block Preconditioned Conjugate Gradient Method.
SIAM Journal on Scientific Computing 23, no. 2,
pp. 517-541. http://dx.doi.org/10.1137/S1064827500366124
.. [2] A. V. Knyazev, I. Lashuk, M. E. Argentati, and E. Ovchinnikov (2007),
Block Locally Optimal Preconditioned Eigenvalue Xolvers (BLOPEX)
in hypre and PETSc. https://arxiv.org/abs/0705.2626
.. [3] A. V. Knyazev's C and MATLAB implementations:
https://bitbucket.org/joseroman/blopex
"""

from __future__ import division, print_function, absolute_import
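
As background for the rewritten module docstring: LOBPCG iteratively computes a few extreme eigenvalues and eigenvectors of a large symmetric matrix, optionally with a preconditioner. A minimal usage sketch of the public scipy.sparse.linalg.lobpcg API follows; the diagonal test matrix and block size are illustrative assumptions, not part of this commit.

import numpy as np
from scipy.sparse import spdiags
from scipy.sparse.linalg import lobpcg

# Illustrative problem: a 100x100 diagonal matrix whose eigenvalues are 1..100.
n = 100
A = spdiags(np.arange(1, n + 1), 0, n, n)

# Random initial block of 3 approximate eigenvectors.
rng = np.random.RandomState(0)
X = rng.rand(n, 3)

# Ask for the 3 largest eigenvalues, with an explicit tolerance and iteration cap.
eigs, vecs = lobpcg(A, X, tol=1e-8, maxiter=100, largest=True)
print(np.sort(np.round(eigs)))   # expected to be approximately [ 98.  99. 100.]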
@@ -201,9 +208,9 @@ def lobpcg(A, X,
 Here, ``invA`` could of course have been used directly as a preconditioner.
 Let us then solve the problem:
->>> eigs, vecs = lobpcg(A, X, Y=Y, M=M, tol=1e-4, maxiter=40, largest=False)
+>>> eigs, vecs = lobpcg(A, X, Y=Y, M=M, largest=False)
 >>> eigs
-array([ 4.,  5.,  6.])
+array([4., 5., 6.])
 Note that the vectors passed in Y are the eigenvectors of the 3 smallest
 eigenvalues. The results returned are orthogonal to those.
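
The two changed doctest lines only make sense together with the setup earlier in the docstring, which this hunk does not show. A rough reconstruction of that scenario as a standalone script, assuming (based on the surrounding context lines, not on text in this commit) a diagonal test matrix, a preconditioner M built from its inverse invA, and constraints Y spanning the eigenvectors of the three smallest eigenvalues:

import numpy as np
from scipy.sparse import spdiags
from scipy.sparse.linalg import LinearOperator, lobpcg

# Illustrative operator: diagonal matrix with eigenvalues 1..100.
n = 100
vals = np.arange(1, n + 1)
A = spdiags(vals, 0, n, n)

# Constraints: the eigenvectors of the 3 smallest eigenvalues (the first three
# coordinate vectors of a diagonal matrix); lobpcg keeps its iterates
# orthogonal to these, so they are excluded from the result.
Y = np.eye(n, 3)

# Preconditioner: apply the exact inverse of A through a LinearOperator.
invA = spdiags(1.0 / vals, 0, n, n)
M = LinearOperator(matvec=lambda x: invA @ x, shape=(n, n), dtype=float)

# Random initial block of 3 approximate eigenvectors.
rng = np.random.RandomState(0)
X = rng.rand(n, 3)

# The call on the '+' line above: no explicit tol/maxiter, the defaults suffice here.
eigs, vecs = lobpcg(A, X, Y=Y, M=M, largest=False)
print(np.round(eigs))   # expected to be approximately [4. 5. 6.]

Without the Y constraints the same call would return the three smallest eigenvalues 1, 2 and 3; passing Y is what pushes the result up to 4, 5 and 6, and the returned vectors are orthogonal to the columns of Y, as the context lines state.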
