Way to find common eigenvectors for a pair of matrices with known eigenvalues

Views: 4 (last 30 days)
petit, 16 January 2021
Commented: petit, 17 January 2021
Hello,
I am looking to find, or rather to build, a matrix X of common eigenvectors of two matrices A and B such that:
AX = aX
with "a" the diagonal matrix of the eigenvalues of A,
BX = bX
with "b" the diagonal matrix of the eigenvalues of B,
where A and B are square and diagonalizable matrices.
I took a look at a similar post but could not reach a conclusion, i.e. I cannot get valid results when I build the final desired endomorphism F defined by:
F = P D P^-1
I have also read the Wikipedia article and an interesting paper, but could not extract from them a method that is reasonably easy to implement.
How can I build these common eigenvectors and also find the associated eigenvalues? I am a little lost among all the potential methods that exist to carry this out.
The screen capture below shows that the kernel of the commutator has to be different from the null vector:
On another maths forum, someone advised me to use the Singular Value Decomposition (SVD) of the commutator [A,B], that is, in MATLAB:
"If ๐‘ฃ is a common eigenvector, then โ€–(๐ด๐ตโˆ’๐ต๐ด)๐‘ฃโ€–=0. The SVD approach gives you a unit-vector ๐‘ฃ that minimizes โ€–(๐ด๐ตโˆ’๐ต๐ด)๐‘ฃโ€– (with the constraint that โ€–๐‘ฃโ€–=1)"
So I have extracted the approximate eigenvectors V from:
[U,S,V] = svd(A*B-B*A)
1) Is there a way to increase the accuracy, i.e. to minimize ‖(AB − BA)v‖ as much as possible?
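For what it is worth, the commutator-SVD idea can be sketched as follows — in NumPy rather than MATLAB, and with two small illustrative upper-triangular matrices instead of the attached ones (both are my assumptions, not the original data). The right singular vector belonging to the smallest singular value is the unit vector minimizing ‖(AB − BA)v‖, and that smallest singular value is itself the residual:

```python
import numpy as np

# Illustrative matrices (NOT the attached ones): both upper triangular,
# so they share the eigenvector e1 = [1, 0] while not commuting.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
B = np.array([[5.0, 1.0],
              [0.0, 7.0]])

C = A @ B - B @ A                  # commutator [A, B]
U, S, Vt = np.linalg.svd(C)        # NumPy returns V transposed;
                                   # MATLAB's V equals Vt.T

v = Vt[-1]                         # right singular vector of the SMALLEST
                                   # singular value (last row of Vt)
residual = np.linalg.norm(C @ v)   # equals S[-1]: the minimized ||C v||
```

Note that `residual` coincides with the smallest singular value, so in MATLAB `S(end,end)` already tells you how small ‖(AB − BA)v‖ can possibly be made over unit vectors: the SVD minimizer is optimal, and no post-processing can go below that value.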
IMPORTANT REMARK: I saw there is another function called rref which can accept a tolerance parameter, but:
1.1 What is the difference with the singular value decomposition (SVD) algorithm?
1.2 If this routine is efficient, which criterion could I apply to make a pertinent choice of this tolerance value?
2) Are there alternative algorithms that could give better results than SVD and rref?
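On the tolerance question: rref decides rank by Gaussian elimination with a pivot threshold, which is numerically fragile, whereas the SVD lets you threshold singular values directly; a standard criterion (the one rank-style routines use) is tol = max(size(C)) * eps * largest singular value. A hedged NumPy sketch of an SVD-based approximate nullspace with exactly that default — the function name `approx_nullspace` is my own, not a library routine:

```python
import numpy as np

def approx_nullspace(C, tol=None):
    """Basis (as columns) of the numerical null space of a square matrix C.

    Right singular vectors whose singular value is <= tol are treated as
    null directions. The default tol mimics the usual rank() criterion:
    max(shape) * machine epsilon * largest singular value."""
    U, S, Vt = np.linalg.svd(C)
    if tol is None:
        tol = max(C.shape) * np.finfo(C.dtype).eps * S[0]
    return Vt[S <= tol].T

# Illustrative matrices sharing one eigenvector, e1 = [1, 0]:
A = np.array([[2.0, 1.0], [0.0, 3.0]])
B = np.array([[5.0, 1.0], [0.0, 7.0]])
N = approx_nullspace(A @ B - B @ A)   # candidate common eigenvectors
```

Loosening `tol` admits more (and less accurate) candidate vectors; tightening it keeps only directions that annihilate the commutator almost exactly.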
I know that in my case there is no analytical way to find a common eigenvector basis, but with a relatively small tolerance we may find an approximate basis. By the way, I did not find any documentation about this.
The 2 matrices for which an approximate matrix of common eigenvectors is sought are available in the attachment.
3) If it is possible, could anyone try to apply an appropriate MATLAB function to find a basis of common eigenvectors, or write a small MATLAB script for this?
Even a simple approximation would be enough; everything depends on the tolerance that we are ready to accept, but currently I do not know how to introduce this tolerance parameter into the SVD approach (if there are different variants of the SVD algorithm) or into alternative algorithms.
Any suggestion/track/clue/help is welcome.
Best Regards
tags: matrix, matrix manipulation, minimization problem, eigenvectors, eigenvalues, SVD algorithm, nullspace, basis of vectors
3 Comments
David Goodmanson, 17 January 2021 (edited: 17 January 2021)
Hi petit,
Could you explain the relationship of
F = P D P^-1
to A and B?
And for clarity, are you just looking for cases where (not counting multiplication of either of them by an overall constant), an eigenvector a1 of A exactly equals an eigenvector b1 of B? And then possibly another eigenvector a2 of A exactly equals another eigenvector b2 of B, and so forth? Or do you want the solution to the related but more difficult problem where
neither of (a1,a2) equals either of (b1,b2), but b1 is a linear combination of (a1,a2) and b2 is a different linear combination of (a1,a2),
so that (a1,a2) and (b1,b2) span the same 2d subspace? And the same idea with (a1,a2,a3) and (b1,b2,b3) etc?
petit, 17 January 2021
Hi David,
To give further information: by "common eigenvector basis" I mean the change-of-basis matrix P with which I could write:
A = P D_a P^-1
and at the same time:
B = P D_b P^-1
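As a sanity check on this formulation: when A and B commute exactly and A has distinct eigenvalues, the eigenvector matrix P of A alone already diagonalizes B, so one can diagonalize A, transform B into that basis, and use the off-diagonal residual as a quality metric for the common basis. A NumPy sketch with made-up commuting matrices (not the attached data):

```python
import numpy as np

# Build two matrices that commute exactly by giving them the same
# eigenvector matrix P_true (illustrative data, not the attachment).
P_true = np.array([[1.0, 1.0],
                   [0.0, 1.0]])
A = P_true @ np.diag([2.0, 3.0]) @ np.linalg.inv(P_true)
B = P_true @ np.diag([5.0, 7.0]) @ np.linalg.inv(P_true)

# Diagonalize A only; its eigenvectors are a candidate common basis.
eigvals_a, P = np.linalg.eig(A)

# In that basis, B should be (nearly) diagonal; the off-diagonal part
# measures how good the common eigenvector basis is.
D_b = np.linalg.inv(P) @ B @ P
off_diag = np.linalg.norm(D_b - np.diag(np.diag(D_b)))
```

If `off_diag` falls below the accepted tolerance, the columns of P are approximate common eigenvectors and diag(D_b) holds the eigenvalues of B. With (near-)repeated eigenvalues of A this simple version breaks down, and one sometimes diagonalizes a combination such as A + t*B instead.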
At the beginning of this study, I thought that a simple linear combination of the eigenvectors contained in the matrices P1 (coming from the diagonalization of A) and P2 (coming from the diagonalization of B), written alpha*P1 + beta*P2 with scalar alpha and beta, would be enough, but I could not reach a conclusion.
That is why, in a second step, I tried not a linear combination but a matrix combination, writing again alpha*P1 + beta*P2, but with alpha and beta being matrices.
I tried this in the posts "common eigen vectors for a pair of matrices" and "Global minimum finding", but there too it is difficult to conclude.
" Or do you want the solution to the related but more difficult problem where
neither of (a1,a2), equals either of (b1,b2), but b1 is a linear combination of (a1,a2) and b2 is a different linear combination of (a1,a2) "
What interests me is to build, as I said above, a matrix combination of the already existing eigenvectors taken individually. But I may not be on the right track with this; surely other methods potentially exist, but I do not have enough background to choose a pertinent one.
So I hope to have given more details about my issue; feel free to add comments if things are still unclear.
Best Regards

๋Œ“๊ธ€์„ ๋‹ฌ๋ ค๋ฉด ๋กœ๊ทธ์ธํ•˜์‹ญ์‹œ์˜ค.

Answers (0)
