Use power iteration and Dimensionality reduction for eigenvalues #340
Comments
Is there a use case for this that differs from existing methods? For instance, is it better than Jacobi for large matrices?
The Jacobi method only works on symmetric matrices.
Would implementing this solve issue #346, where power iteration finds an eigenvalue that is not necessarily the dominant one because of the choice of random starting vector? Basically, could you use this process to find all the eigenvalues, then identify the dominant one and return it from the powerIteration method?
That’s a good idea. Instead of finding just the “largest”, find a few and then grab the largest of those. Alternatively, I wonder if, after a solution is found, the solution eigenvector could be perturbed slightly and used as a new starting point to ensure that it converges on the same eigenvalue, and not one that is larger.
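A minimal sketch of the multi-start idea in Python with NumPy (function names are my own, not from this repo): run power iteration from several random starting vectors and keep the result with the largest magnitude.

```python
import numpy as np

def power_iteration(a, num_iters=1000, tol=1e-10, rng=None):
    """Return an (eigenvalue, eigenvector) pair of `a` via power iteration."""
    rng = np.random.default_rng() if rng is None else rng
    v = rng.standard_normal(a.shape[0])
    v /= np.linalg.norm(v)
    for _ in range(num_iters):
        w = a @ v
        norm = np.linalg.norm(w)
        if norm == 0:  # starting vector landed in the null space
            break
        w /= norm
        # Compare up to sign: a negative eigenvalue flips the vector each step.
        if min(np.linalg.norm(w - v), np.linalg.norm(w + v)) < tol:
            v = w
            break
        v = w
    return v @ a @ v, v  # Rayleigh quotient gives the eigenvalue

def dominant_eigenpair(a, restarts=5):
    """Run several random starts; keep the pair with the largest |eigenvalue|."""
    return max((power_iteration(a) for _ in range(restarts)),
               key=lambda pair: abs(pair[0]))
```

With multiple restarts, the chance that every random starting vector is (numerically) orthogonal to the dominant eigenvector shrinks quickly, which is what #346 is about.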
I think I figured out a general technique for numerically finding eigenvalues and eigenvectors at the same time:
use power iteration to find the largest eigenvalue — this gives both the value and the vector — then subtract that component (lambda times the outer product of the eigenvector with itself) from the source matrix and repeat.
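The subtract-and-repeat step described above is known as Hotelling deflation. A hedged sketch (my own function names): note that subtracting `lam * outer(v, v)` is only guaranteed to remove exactly one eigenvalue when the matrix is symmetric, since that relies on the eigenvectors being orthogonal — for general matrices the left eigenvector is needed as well, so "any matrix" is an overstatement.

```python
import numpy as np

def power_iteration(a, num_iters=1000):
    """One eigenpair of a symmetric matrix `a` via power iteration."""
    v = np.random.default_rng().standard_normal(a.shape[0])
    v /= np.linalg.norm(v)
    for _ in range(num_iters):
        w = a @ v
        norm = np.linalg.norm(w)
        if norm == 0:
            break
        v = w / norm
    return v @ a @ v, v

def eigenvalues_by_deflation(a):
    """Hotelling deflation (symmetric case): find the dominant eigenpair,
    subtract lam * outer(v, v) to zero out that eigenvalue, and repeat."""
    a = np.array(a, dtype=float)
    values = []
    for _ in range(a.shape[0]):
        lam, v = power_iteration(a)
        values.append(lam)
        a = a - lam * np.outer(v, v)
    return values
```

Each round returns the remaining eigenvalue of largest magnitude, so the values come out ordered by |lambda|. Rounding error accumulates with each deflation, which is one reason production libraries use QR-based methods instead.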