To summarize, the algorithm used is described on page 7 of the paper. I have followed all the steps of this algorithm and it seems to give better results when I make the Fisher synthesis. Here is the interesting part (sorry, I think LaTeX is not available on Stack Overflow):

[image: algorithm 1 from the paper]

Here is my little Matlab script for this method:

% Diagonalize A = FISH_sp and B = FISH_xc
% Applying each step of algorithm 1 on page 7

So at the end, I find the eigenvector matrix phi and the diagonal eigenvalue matrix D1. Now, I would like to make the link between this generalized problem and the eventual common eigenvectors of the A and B matrices (respectively FISH_sp and FISH_xc). Indeed, what I have done up to now is to find a parallel relation between A*Phi and B*Phi, linked by the diagonal matrix Lambda. Maybe we could arrange this relation such that A*Phi' = Phi'*Lambda_A'.

From a numerical point of view, why don't I get the same results between the method in 1) and the method in 3)? I mean for the eigenvector matrix Phi and the diagonal matrix Lambda.

I get wrong results if I claim that phi diagonalizes both A = FISH_sp and B = FISH_xc. Indeed, by doing:

% Marginalizing over uncommon parameters between the two matrices
% V2 corresponds to eigenvectors of FISH_xc
% DEBUG : check identity matrix => OK, identity matrix found !
% Check eigenvalues : OK, columns of eigenvalues found !
% Check if phi diagonalizes FISH_sp : NOT OK, not identical eigenvalues

By the way, I find the eigenvalues D1 and D2 coming from:

[V1, D1] = eig(FISH_sp)

% Check eigenvalues : OK, columns of eigenvalues D1 found !
% Check eigenvalues : OK, columns of eigenvalues D2 found !

So I don't find that the eigenvector matrix Phi diagonalizes A and B, since the expected eigenvalues are not columns of identical values in the ratios

FISH_sp*phi./phi

which do not give the same value down a given column for FISH_sp and FISH_xc. How could I fix this wrong result? In the paper, they say that phi diagonalizes A = FISH_sp and B = FISH_xc, but I can't reproduce it.

A separate question on a related theme: I am trying to determine the (real) left-eigenvectors and eigenvalues of a 100 by 100 matrix. The elements of the matrix, T, are probabilities from 0 to 1. Each row of the matrix, T, sums to 1.0 (100% probability). The columns do not have to sum to anything in particular.

The scipy and numpy libraries return complex numbers, which is apparently due either to the characteristic polynomial not being solvable over the real numbers or to the algorithms being optimised for the complex case. I turned to the sympy library, which also returned complex numbers as the solution. Using sympy as described here makes sense. Is it possible somehow to obtain real eigenvalues using SymPy? My understanding is that this will force the answers to be real numbers:

T = np.loadtxt('rep10_T_ij.dat', delimiter=' ')
display(roots(poly(char_poly, domain=CC)))

However, this does not output an answer in 12 hours with any amount of hardware thrown at it. I can throw more hardware at it on a cluster, but have not been able to try this yet.

The answer given: the advantage of using SymPy rather than NumPy or SciPy in this context is just if you want to perform the calculation exactly or symbolically, and determining that the eigenvalues are precisely real requires exact arithmetic in general. Exact or symbolic calculations are a lot slower than fixed-precision floating point, though, so you should expect things to slow down a lot compared to using NumPy or SciPy. And if your initial matrix is a matrix of floats, then it is already approximate, in which case there is a good chance that there will be no benefit in using SymPy.
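As that answer notes, SymPy only pays off when the input is exact. A minimal sketch of what that looks like, using a small hypothetical 3-by-3 row-stochastic matrix with Rational entries (the 100-by-100 matrix from the question would be far too large to treat this way in reasonable time):

```python
from sympy import Matrix, Rational

# Small row-stochastic matrix with exact rational entries — a
# hypothetical stand-in for the 100x100 matrix in the question.
T = Matrix([
    [Rational(1, 2), Rational(1, 2), 0],
    [Rational(1, 4), Rational(1, 2), Rational(1, 4)],
    [0, Rational(1, 2), Rational(1, 2)],
])

# Because the entries are exact rationals, eigenvals() works in exact
# arithmetic and can certify that the eigenvalues are real.
eigs = T.eigenvals()   # dict mapping eigenvalue -> multiplicity
print(eigs)
```

For this matrix the exact eigenvalues come out as 0, 1/2 and 1, all real. Had the matrix been entered as floats, SymPy would be working from already-approximate data and, as the answer says, would offer little benefit.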
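For the purely numerical route, SciPy can return left eigenvectors directly via `scipy.linalg.eig(..., left=True)`; for a row-stochastic matrix the eigenvalue 1 is guaranteed to exist and be real, and any eigenvalue whose imaginary part is at roundoff level can safely be cast to real. A sketch with a small hypothetical matrix standing in for the T loaded from rep10_T_ij.dat:

```python
import numpy as np
from scipy.linalg import eig

# Hypothetical small row-stochastic matrix (rows sum to 1), standing in
# for the 100x100 matrix T from the question.
T = np.array([[0.50, 0.50, 0.00],
              [0.25, 0.50, 0.25],
              [0.00, 0.50, 0.50]])

# left=True returns the left eigenvectors as columns of vl.
lam, vl = eig(T, left=True, right=False)

# Keep only eigenvalues that are real up to roundoff, then drop the
# (numerically zero) imaginary parts.
real_mask = np.abs(lam.imag) < 1e-12
lam_real = lam[real_mask].real

# The left eigenvector for eigenvalue 1 is the stationary distribution.
k = np.where(real_mask)[0][np.argmin(np.abs(lam_real - 1.0))]
pi = vl[:, k].real
pi = pi / pi.sum()     # normalize to a probability distribution

print(lam_real)
print(pi)              # satisfies pi @ T ≈ pi
```

This sidesteps the characteristic-polynomial route entirely, which is what was timing out after 12 hours: `roots(poly(...))` on a degree-100 polynomial is exactly the kind of exact computation the answer warns is slow.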
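Returning to the Fisher-matrix question: a plausible explanation for the failing FISH_sp*phi./phi check is that generalized-eigenproblem algorithms of this kind diagonalize by congruence (Phi'*B*Phi = I and Phi'*A*Phi = Lambda), not by similarity, so the elementwise ratios need not be constant down a column. A sketch under that assumption using `scipy.linalg.eigh`, with two hypothetical symmetric positive-definite matrices standing in for FISH_sp and FISH_xc:

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(0)
n = 4

# Hypothetical symmetric positive-definite stand-ins for the Fisher
# matrices FISH_sp (A) and FISH_xc (B) from the question.
M1 = rng.normal(size=(n, n)); A = M1 @ M1.T + n * np.eye(n)
M2 = rng.normal(size=(n, n)); B = M2 @ M2.T + n * np.eye(n)

# Generalized symmetric eigenproblem  A @ phi = B @ phi @ diag(lam).
lam, phi = eigh(A, B)

# phi "diagonalizes" both matrices by congruence, not by similarity:
assert np.allclose(phi.T @ B @ phi, np.eye(n))      # B -> identity
assert np.allclose(phi.T @ A @ phi, np.diag(lam))   # A -> diag(lam)

# The MATLAB-style similarity check FISH_sp*phi./phi would only give
# constant columns if phi solved the *standard* problem
# A @ phi = phi @ diag(lam), which is generally not the case here.
```

Under this reading, the D1/D2 checks succeed because `eig(FISH_sp)` solves the standard problem, while the shared phi from the paper's algorithm only satisfies the congruence relations, so comparing it against elementwise eigenvalue ratios is the wrong test.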