Eigenvectors in Julia vs Numpy
I'm currently working to diagonalize a 5000x5000 Hermitian matrix, and I find that when I use Julia's eigen function from the LinearAlgebra standard library, which produces both the eigenvalues and eigenvectors, I get different results for the eigenvectors than when I solve the same problem with numpy's np.linalg.eigh. I believe both of them call LAPACK under the hood, but I'm not sure what else they may be doing differently.
Has anyone else experienced this, or does anyone know what is going on?
Solution 1:
numpy.linalg.eigh(a, UPLO='L')
uses a different algorithm. It assumes the matrix is symmetric (Hermitian, in the complex case) and by default reads only the lower triangle, which lets it use a faster and more accurate symmetric eigensolver.
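To illustrate the point above, here is a small sketch showing that eigh with UPLO='L' really does ignore the upper triangle entirely (matrix sizes and values are arbitrary, chosen only for the demonstration):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
A = A + A.T  # make it symmetric

# Corrupt the upper triangle of a copy; eigh(UPLO='L') never reads it.
B = A.copy()
B[np.triu_indices(4, k=1)] = 999.0

w1, _ = np.linalg.eigh(A, UPLO='L')
w2, _ = np.linalg.eigh(B, UPLO='L')
print(np.allclose(w1, w2))  # True: only the lower triangle matters
```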
For a generic dense matrix, the equivalent of Julia's LinearAlgebra.eigen() is numpy.linalg.eig. You should get the same result if you wrap your matrix in Julia as Symmetric(A, :L) (or Hermitian(A, :L) for a complex Hermitian matrix) before feeding it into LinearAlgebra.eigen(), so that Julia dispatches to the same symmetric solver.
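One caveat worth knowing when comparing solvers: even when both sides take the symmetric path, eigenvectors are only defined up to a sign (or a complex phase), so columns from the two routines can legitimately differ while still being correct. A minimal NumPy-only sketch of this comparison, using a random Hermitian matrix as a stand-in for the real problem:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((5, 5)) + 1j * rng.standard_normal((5, 5))
H = (X + X.conj().T) / 2  # Hermitian test matrix

w_h, v_h = np.linalg.eigh(H)   # symmetric/Hermitian driver
w_g, v_g = np.linalg.eig(H)    # general driver

# eig returns eigenvalues in arbitrary order; sort to match eigh.
order = np.argsort(w_g.real)
w_g, v_g = w_g[order].real, v_g[:, order]

print(np.allclose(w_h, w_g))   # eigenvalues agree

# Eigenvector columns may differ by a complex phase; the magnitude of
# the columnwise inner products is still 1 for distinct eigenvalues.
phases = np.abs(np.sum(v_h.conj() * v_g, axis=0))
print(np.allclose(phases, 1.0))
```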
Check out numpy's docs on eig and eigh, as well as Julia's standard LinearAlgebra documentation. If you go down to the special matrices section there, it details which specialized methods Julia uses for each special matrix type, thanks to multiple dispatch.
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow