Prove that the eigenvalues of a block matrix are the combined eigenvalues of its blocks
Let $A$ be a block upper triangular matrix:
$$A = \begin{pmatrix} A_{1,1} & A_{1,2} \\ 0 & A_{2,2} \end{pmatrix}$$
where $A_{1,1} \in \mathbb{C}^{p \times p}$, $A_{2,2} \in \mathbb{C}^{(n-p) \times (n-p)}$. Show that the eigenvalues of $A$ are the combined eigenvalues of $A_{1,1}$ and $A_{2,2}$.
I've been pretty much stuck looking at this for a good hour and a half, so any help would be much appreciated. Thanks.
matrices eigenvalues-eigenvectors block-matrices
Thank you so much for posting this question! You may have just single-handedly saved my research!!!
– Paul, Jan 31 '13 at 20:43
What if $A_{2,1}$ is not $0$?
– Ashutosh Gupta, May 12 '16 at 4:40
@tsiki What if $A_{2,1}$ is not zero?
– Babai, Jul 19 '16 at 7:44
edited 2 days ago by Rodrigo de Azevedo
asked Feb 10 '11 at 23:40 by tsiki
3 Answers
Let $A$ be the original matrix of size $n \times n$. One way out is to use the Schur complement identity (https://en.wikipedia.org/wiki/Schur_complement):
$$\det \begin{pmatrix} B_{1,1} & B_{1,2} \\ B_{2,1} & B_{2,2} \end{pmatrix} = \det(B_{1,1}) \times \det(B_{2,2} - B_{2,1}B_{1,1}^{-1}B_{1,2}).$$
Recall that $\lambda$ is an eigenvalue of $A$ when $Ax = \lambda x$ for some $x \ne 0$, which is equivalent to $\det(A-\lambda I_n) = 0$.
In your case the block $A_{2,1}$ is a zero matrix, so
$$\det(A-\lambda I_n) = \det \left( \begin{pmatrix} A_{1,1} & A_{1,2} \\ 0 & A_{2,2} \end{pmatrix} - \lambda I_n \right) = \det \begin{pmatrix} A_{1,1} - \lambda I_{k} & A_{1,2} \\ 0 & A_{2,2} - \lambda I_{n-k} \end{pmatrix}.$$
Hence $\det(A-\lambda I_n) = \det(A_{1,1} - \lambda I_{k}) \times \det(A_{2,2} - \lambda I_{n-k})$.
So if $\lambda$ is an eigenvalue of $A_{1,1}$ or $A_{2,2}$, then $\det(A_{1,1}-\lambda I_k) = 0$ or $\det(A_{2,2}-\lambda I_{n-k}) = 0$, hence $\det(A-\lambda I_n) = 0$, and $\lambda$ is an eigenvalue of $A$.
Conversely, if $\lambda$ is an eigenvalue of $A$, then $\det(A-\lambda I_n) = 0$, so either $\det(A_{1,1}-\lambda I_k) = 0$ or $\det(A_{2,2}-\lambda I_{n-k}) = 0$, and hence $\lambda$ is an eigenvalue of $A_{1,1}$ or $A_{2,2}$.
Edit
There is actually a small gap in the above argument: if $\lambda$ is an eigenvalue of $A_{1,1}$, then $A_{1,1} - \lambda I_k$ is not invertible, so the Schur complement identity cannot be applied, since it requires $B_{1,1}$ to be invertible. However, there is another identity,
$$\det \begin{pmatrix} B_{1,1} & B_{1,2} \\ 0 & B_{2,2} \end{pmatrix} = \det(B_{1,1}) \times \det(B_{2,2}),$$
which is always true (prove both identities as an exercise). We can use this identity to get
$$\det(A-\lambda I_n) = \det(A_{1,1} - \lambda I_{k}) \times \det(A_{2,2} - \lambda I_{n-k}).$$
answered Feb 10 '11 at 23:50 by user17762, edited Apr 8 '17 at 14:50 by jatin gupta
@user17762 Is there a similar statement when $A_{2,1}$ is not the zero matrix?
– Babai, Jul 19 '16 at 7:45
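The determinant factorization in this answer is easy to sanity-check numerically. Below is a minimal NumPy sketch; the block sizes and the random seed are arbitrary illustrative choices, not part of the original argument.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: a 3x3 block A11 and a 2x2 block A22.
p, q = 3, 2
A11 = rng.standard_normal((p, p))
A12 = rng.standard_normal((p, q))
A22 = rng.standard_normal((q, q))

# Assemble the block upper triangular matrix with a zero lower-left block.
A = np.block([[A11, A12], [np.zeros((q, p)), A22]])

# The eigenvalues of A should be those of A11 together with those of A22.
eig_A = np.sort_complex(np.linalg.eigvals(A))
eig_blocks = np.sort_complex(np.concatenate([np.linalg.eigvals(A11),
                                             np.linalg.eigvals(A22)]))
assert np.allclose(eig_A, eig_blocks)
```

Sorting both spectra before comparing accounts for the fact that `eigvals` returns eigenvalues in no guaranteed order.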
A simpler way is from the definition. It is easy to show that if $\lambda_1$ is an eigenvalue of the upper diagonal block $A_{1,1}$ (of size $n_1$), with eigenvector $p_1$, then it is also an eigenvalue of the full matrix, with the same eigenvector augmented with zeros:
$A_{1,1} \, p_1 = \lambda_1 p_1$ with $p_1 \ne 0$, so
$$ \begin{pmatrix} A_{1,1} & A_{1,2} \\ 0 & A_{2,2} \end{pmatrix}
\begin{pmatrix} p_1 \\ 0 \end{pmatrix} =
\begin{pmatrix} A_{1,1} \, p_1 \\ 0 \end{pmatrix} =
\begin{pmatrix} \lambda_1 p_1 \\ 0 \end{pmatrix} =
\lambda_1 \begin{pmatrix} p_1 \\ 0 \end{pmatrix}. $$
Hence if $\lambda$ is an eigenvalue of $A_{1,1}$ then it is also an eigenvalue of $A$. There are $n_1$ (counting multiplicity) such eigenvalues. The same applies to the lower diagonal block $A_{2,2}$, so we have found the $n_1 + n_2 = n$ eigenvalues of the full matrix. (Wrong! This only applies to a block diagonal matrix; fixed below.)
Suppose now that $\lambda_2$ is an eigenvalue of $A_{2,2}$ with eigenvector $p_2$.
If $\lambda_2$ is also an eigenvalue of $A_{1,1}$, we have proved above that it is also an eigenvalue of $A$. So assume it is not an eigenvalue of $A_{1,1}$, hence $|A_{1,1} - \lambda_2 I| \ne 0$. Now
$$\begin{pmatrix} A_{1,1} & A_{1,2} \\ 0 & A_{2,2} \end{pmatrix}
\begin{pmatrix} x \\ p_2 \end{pmatrix} =
\begin{pmatrix} A_{1,1} x + A_{1,2} p_2 \\ \lambda_2 p_2 \end{pmatrix}.$$
We can make $A_{1,1} x + A_{1,2} p_2 = \lambda_2 x$ by choosing $x = - (A_{1,1} - \lambda_2 I)^{-1} A_{1,2} \, p_2$, and so we have found an eigenvector of $A$ with $\lambda_2$ as eigenvalue.
In this way we have shown that if $\lambda$ is an eigenvalue of $A_{1,1}$ or $A_{2,2}$, then it is an eigenvalue of $A$.
To complete the proof, one should show the other way round: that if $\lambda$ is an eigenvalue of $A$, then it is an eigenvalue of $A_{1,1}$ or $A_{2,2}$. But that's easy:
$$\begin{pmatrix} A_{1,1} & A_{1,2} \\ 0 & A_{2,2} \end{pmatrix}
\begin{pmatrix} x_1 \\ x_2 \end{pmatrix} =
\begin{pmatrix} A_{1,1} \, x_1 + A_{1,2} \, x_2 \\ A_{2,2} \, x_2 \end{pmatrix}
= \begin{pmatrix} \lambda \, x_1 \\ \lambda \, x_2 \end{pmatrix}.$$
Now, either $x_2 = 0$ or not. If not, then $\lambda$ is an eigenvalue of $A_{2,2}$. If so, then $A_{1,1} x_1 = \lambda x_1$ with $x_1 \ne 0$, so $\lambda$ is an eigenvalue of $A_{1,1}$.
Well, you can't use exactly the same argument for an eigenvector $p_2$ of $A_{2,2}$, since $A (0, p_2)^T = (A_{1,2}p_2, A_{2,2}p_2)^T$.
– Calle, Feb 11 '11 at 1:40
You are right! Fortunately, I could fix it :-)
– leonbloy, Feb 11 '11 at 2:16
You could probably save some writing by noting that $A$ and $A^T$ have the same eigenvalues. So your initial argument could be carried over by transposing $A$ and doing the same argument for $A_{1,1}^T$.
– user17762, Feb 11 '11 at 2:21
Nice argument though.
– user17762, Feb 11 '11 at 5:37
@Babai If $A_{2,1} \ne 0$ but $A_{1,2} = 0$, it's the same thing. If both off-diagonal submatrices are nonzero, then the thesis is false (the eigenvalues of the full matrix are not given by the diagonal submatrices alone).
– leonbloy, Jul 19 '16 at 12:04
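The eigenvector construction $x = -(A_{1,1} - \lambda_2 I)^{-1} A_{1,2}\, p_2$ from this answer can also be verified numerically. This is a minimal sketch with made-up random blocks; it assumes $\lambda_2$ is not also an eigenvalue of $A_{1,1}$, which holds generically for random matrices.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative blocks: A11 is 3x3, A22 is 2x2.
n1, n2 = 3, 2
A11 = rng.standard_normal((n1, n1))
A12 = rng.standard_normal((n1, n2))
A22 = rng.standard_normal((n2, n2))
A = np.block([[A11, A12], [np.zeros((n2, n1)), A22]])

# Take an eigenpair (lam2, p2) of the lower diagonal block A22.
w, V = np.linalg.eig(A22)
lam2, p2 = w[0], V[:, 0]

# Solve (A11 - lam2 I) x = -A12 p2, i.e. x = -(A11 - lam2 I)^{-1} A12 p2.
x = -np.linalg.solve(A11 - lam2 * np.eye(n1), A12 @ p2)

# Then (x, p2) stacked is an eigenvector of the full matrix A for lam2.
v = np.concatenate([x, p2])
assert np.allclose(A @ v, lam2 * v)
```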
For another approach, you can use the Gershgorin disc theorem (the name is also transliterated Gerschgorin or Geršgorin) to relate the discs of the individual blocks to the discs of the large matrix. The lower-left block contributes nothing to the disc radii, since $|0| = 0$ and $0+0=0$; in particular, the discs for the bottom rows of the large matrix coincide with the discs of $A_{2,2}$.
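The point about the zero block can be illustrated by computing the discs directly. This is a small sketch with made-up numbers; `gershgorin_discs` is a hypothetical helper, not a library function.

```python
import numpy as np

def gershgorin_discs(M):
    """Return (center, radius) pairs; every eigenvalue of M lies in some disc."""
    centers = np.diag(M)
    radii = np.sum(np.abs(M), axis=1) - np.abs(centers)
    return list(zip(centers, radii))

# Illustrative block upper triangular matrix.
A11 = np.array([[4.0, 1.0], [0.5, 3.0]])
A12 = np.array([[0.2, 0.0], [0.0, 0.1]])
A22 = np.array([[-2.0, 0.3], [0.4, -1.0]])
A = np.block([[A11, A12], [np.zeros((2, 2)), A22]])

# The bottom rows of A have exactly the discs of A22: the zero lower-left
# block adds |0| = 0 to each radius. (The top rows' discs are enlarged by A12.)
discs_A = gershgorin_discs(A)
discs_A22 = gershgorin_discs(A22)
for (c1, r1), (c2, r2) in zip(discs_A[2:], discs_A22):
    assert c1 == c2 and np.isclose(r1, r2)
```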
3 Answers
3
active
oldest
votes
3 Answers
3
active
oldest
votes
active
oldest
votes
active
oldest
votes
$begingroup$
Let $A$ be the original matrix of size $n times n$. One way out is to use the identity. (Result from Schur Complement) https://en.wikipedia.org/wiki/Schur_complement
$det left( begin{matrix} B_{1,1}&B_{1,2}\ B_{2,1 }&B_{2,2} end{matrix} right) = det(B_{1,1}) times det(B_{22} - B_{21}B_{11}^{-1}B_{12})$.
We know that $lambda$ is a number such that $Ax = lambda x$. From which we get $det(A-lambda I_n) = 0$.
In your case, the matrix $A_{21}$ is a zero matrix and hence, we get $det(A-lambda I_n) = det left( left( begin{matrix} A_{1,1}&A_{1,2}\ 0&A_{2,2} end{matrix}right) - lambda I_n right) = det left( begin{matrix} A_{1,1} - lambda I_{k}&A_{1,2}\ 0&A_{2,2} - lambda I_{n-k} end{matrix}right)$
Hence $det(A-lambda I_n) = det(A_{1,1} - lambda I_{k}) times det(A_{22} - lambda I_{n-k})$.
So we get that if $lambda$ is an eigen value of $A_{11}$ or $A_{22}$, then either $det(A_{11}-lambda I_k) = 0$ or $det(A_{22}-lambda I_{n-k}) = 0$ and hence $det(A-lambda I_n) = 0$ and hence $lambda$ is an eigenvalue of $A$.
Similarly, if $lambda$ is an eigenvalue of $A$, then $det(A-lambda I_n) = 0$, then either $det(A_{11}-lambda I_k) = 0$ or $det(A_{22}-lambda I_{n-k}) = 0$ and hence $lambda$ is an eigen value of $A_{11}$ or $A_{22}$.
Edit
There is actually a small error in the above argument.
You might wonder that if $lambda$ is an eigenvalue of $A_{11}$, then $A_{11} - lambda I_k$ is not invertible and hence the identity $det left( begin{matrix} B_{1,1}&B_{1,2}\ B_{2,1 }&B_{2,2} end{matrix} right) = det(B_{1,1}) times det(B_{22} - B_{21}B_{11}^{-1}B_{12})$ is false since $B_{11}$ is not invertible.
However, there is an another identity $det left( begin{matrix} B_{1,1}&B_{1,2}\ 0&B_{2,2} end{matrix} right) = det(B_{1,1}) times det(B_{22})$ which is always true. (Prove both the identites as an exercise).
We can make use of this identity to get
$det(A-lambda I_n) = det(A_{1,1} - lambda I_{k}) times det(A_{22} - lambda I_{n-k})$.
$endgroup$
$begingroup$
@user17762 Is there a similar statement when $A_{21}$ is not zero matrix?
$endgroup$
– Babai
Jul 19 '16 at 7:45
add a comment |
$begingroup$
Let $A$ be the original matrix of size $n times n$. One way out is to use the identity. (Result from Schur Complement) https://en.wikipedia.org/wiki/Schur_complement
$det left( begin{matrix} B_{1,1}&B_{1,2}\ B_{2,1 }&B_{2,2} end{matrix} right) = det(B_{1,1}) times det(B_{22} - B_{21}B_{11}^{-1}B_{12})$.
We know that $lambda$ is a number such that $Ax = lambda x$. From which we get $det(A-lambda I_n) = 0$.
In your case, the matrix $A_{21}$ is a zero matrix and hence, we get $det(A-lambda I_n) = det left( left( begin{matrix} A_{1,1}&A_{1,2}\ 0&A_{2,2} end{matrix}right) - lambda I_n right) = det left( begin{matrix} A_{1,1} - lambda I_{k}&A_{1,2}\ 0&A_{2,2} - lambda I_{n-k} end{matrix}right)$
Hence $det(A-lambda I_n) = det(A_{1,1} - lambda I_{k}) times det(A_{22} - lambda I_{n-k})$.
So we get that if $lambda$ is an eigen value of $A_{11}$ or $A_{22}$, then either $det(A_{11}-lambda I_k) = 0$ or $det(A_{22}-lambda I_{n-k}) = 0$ and hence $det(A-lambda I_n) = 0$ and hence $lambda$ is an eigenvalue of $A$.
Similarly, if $lambda$ is an eigenvalue of $A$, then $det(A-lambda I_n) = 0$, then either $det(A_{11}-lambda I_k) = 0$ or $det(A_{22}-lambda I_{n-k}) = 0$ and hence $lambda$ is an eigen value of $A_{11}$ or $A_{22}$.
Edit
There is actually a small error in the above argument.
You might wonder that if $lambda$ is an eigenvalue of $A_{11}$, then $A_{11} - lambda I_k$ is not invertible and hence the identity $det left( begin{matrix} B_{1,1}&B_{1,2}\ B_{2,1 }&B_{2,2} end{matrix} right) = det(B_{1,1}) times det(B_{22} - B_{21}B_{11}^{-1}B_{12})$ is false since $B_{11}$ is not invertible.
However, there is an another identity $det left( begin{matrix} B_{1,1}&B_{1,2}\ 0&B_{2,2} end{matrix} right) = det(B_{1,1}) times det(B_{22})$ which is always true. (Prove both the identites as an exercise).
We can make use of this identity to get
$det(A-lambda I_n) = det(A_{1,1} - lambda I_{k}) times det(A_{22} - lambda I_{n-k})$.
$endgroup$
$begingroup$
@user17762 Is there a similar statement when $A_{21}$ is not zero matrix?
$endgroup$
– Babai
Jul 19 '16 at 7:45
add a comment |
$begingroup$
Let $A$ be the original matrix of size $n times n$. One way out is to use the identity. (Result from Schur Complement) https://en.wikipedia.org/wiki/Schur_complement
$det left( begin{matrix} B_{1,1}&B_{1,2}\ B_{2,1 }&B_{2,2} end{matrix} right) = det(B_{1,1}) times det(B_{22} - B_{21}B_{11}^{-1}B_{12})$.
We know that $lambda$ is a number such that $Ax = lambda x$. From which we get $det(A-lambda I_n) = 0$.
In your case, the matrix $A_{21}$ is a zero matrix and hence, we get $det(A-lambda I_n) = det left( left( begin{matrix} A_{1,1}&A_{1,2}\ 0&A_{2,2} end{matrix}right) - lambda I_n right) = det left( begin{matrix} A_{1,1} - lambda I_{k}&A_{1,2}\ 0&A_{2,2} - lambda I_{n-k} end{matrix}right)$
Hence $det(A-lambda I_n) = det(A_{1,1} - lambda I_{k}) times det(A_{22} - lambda I_{n-k})$.
So we get that if $lambda$ is an eigen value of $A_{11}$ or $A_{22}$, then either $det(A_{11}-lambda I_k) = 0$ or $det(A_{22}-lambda I_{n-k}) = 0$ and hence $det(A-lambda I_n) = 0$ and hence $lambda$ is an eigenvalue of $A$.
Similarly, if $lambda$ is an eigenvalue of $A$, then $det(A-lambda I_n) = 0$, then either $det(A_{11}-lambda I_k) = 0$ or $det(A_{22}-lambda I_{n-k}) = 0$ and hence $lambda$ is an eigen value of $A_{11}$ or $A_{22}$.
Edit
There is actually a small error in the above argument.
You might wonder that if $lambda$ is an eigenvalue of $A_{11}$, then $A_{11} - lambda I_k$ is not invertible and hence the identity $det left( begin{matrix} B_{1,1}&B_{1,2}\ B_{2,1 }&B_{2,2} end{matrix} right) = det(B_{1,1}) times det(B_{22} - B_{21}B_{11}^{-1}B_{12})$ is false since $B_{11}$ is not invertible.
However, there is an another identity $det left( begin{matrix} B_{1,1}&B_{1,2}\ 0&B_{2,2} end{matrix} right) = det(B_{1,1}) times det(B_{22})$ which is always true. (Prove both the identites as an exercise).
We can make use of this identity to get
$det(A-lambda I_n) = det(A_{1,1} - lambda I_{k}) times det(A_{22} - lambda I_{n-k})$.
$endgroup$
Let $A$ be the original matrix of size $n times n$. One way out is to use the identity. (Result from Schur Complement) https://en.wikipedia.org/wiki/Schur_complement
$det left( begin{matrix} B_{1,1}&B_{1,2}\ B_{2,1 }&B_{2,2} end{matrix} right) = det(B_{1,1}) times det(B_{22} - B_{21}B_{11}^{-1}B_{12})$.
We know that $lambda$ is a number such that $Ax = lambda x$. From which we get $det(A-lambda I_n) = 0$.
In your case, the matrix $A_{21}$ is a zero matrix and hence, we get $det(A-lambda I_n) = det left( left( begin{matrix} A_{1,1}&A_{1,2}\ 0&A_{2,2} end{matrix}right) - lambda I_n right) = det left( begin{matrix} A_{1,1} - lambda I_{k}&A_{1,2}\ 0&A_{2,2} - lambda I_{n-k} end{matrix}right)$
Hence $det(A-lambda I_n) = det(A_{1,1} - lambda I_{k}) times det(A_{22} - lambda I_{n-k})$.
So we get that if $lambda$ is an eigen value of $A_{11}$ or $A_{22}$, then either $det(A_{11}-lambda I_k) = 0$ or $det(A_{22}-lambda I_{n-k}) = 0$ and hence $det(A-lambda I_n) = 0$ and hence $lambda$ is an eigenvalue of $A$.
Similarly, if $lambda$ is an eigenvalue of $A$, then $det(A-lambda I_n) = 0$, then either $det(A_{11}-lambda I_k) = 0$ or $det(A_{22}-lambda I_{n-k}) = 0$ and hence $lambda$ is an eigen value of $A_{11}$ or $A_{22}$.
Edit
There is actually a small error in the above argument.
You might wonder that if $lambda$ is an eigenvalue of $A_{11}$, then $A_{11} - lambda I_k$ is not invertible and hence the identity $det left( begin{matrix} B_{1,1}&B_{1,2}\ B_{2,1 }&B_{2,2} end{matrix} right) = det(B_{1,1}) times det(B_{22} - B_{21}B_{11}^{-1}B_{12})$ is false since $B_{11}$ is not invertible.
However, there is an another identity $det left( begin{matrix} B_{1,1}&B_{1,2}\ 0&B_{2,2} end{matrix} right) = det(B_{1,1}) times det(B_{22})$ which is always true. (Prove both the identites as an exercise).
We can make use of this identity to get
$det(A-lambda I_n) = det(A_{1,1} - lambda I_{k}) times det(A_{22} - lambda I_{n-k})$.
edited Apr 8 '17 at 14:50
jatin gupta
32
32
answered Feb 10 '11 at 23:50
user17762
$begingroup$
@user17762 Is there a similar statement when $A_{21}$ is not zero matrix?
$endgroup$
– Babai
Jul 19 '16 at 7:45
add a comment |
$begingroup$
@user17762 Is there a similar statement when $A_{21}$ is not zero matrix?
$endgroup$
– Babai
Jul 19 '16 at 7:45
$begingroup$
@user17762 Is there a similar statement when $A_{21}$ is not zero matrix?
$endgroup$
– Babai
Jul 19 '16 at 7:45
$begingroup$
@user17762 Is there a similar statement when $A_{21}$ is not zero matrix?
$endgroup$
– Babai
Jul 19 '16 at 7:45
add a comment |
$begingroup$
A simpler way is from the definition. Is is easy to show that if $lambda_1$ is an eigenvalue of the upper diagonal block $A_{1,1}$, with eigenvector $p_1$, (size $n_1$) then it's also an eigenvalue of the full matrix, with the same eigenvector augmented with zeros.
$A_{1,1} ; p_1 = lambda_1 p_1$ with $p_1 ne 0 $
So
$$ left( begin{matrix} A_{1,1}&A_{1,2} \ 0 &A_{2,2} end{matrix} right)
left( begin{matrix} p_1 \ 0 end{matrix} right) =
left( begin{matrix} A_{1,1} ; p_1 \ 0 end{matrix} right) =
left( begin{matrix} lambda_1 p_1 \ 0 end{matrix} right) =
lambda_1 left( begin{matrix} p_1 \ 0 end{matrix} right) $$
Hence if $lambda$ is eigenvalue of $A_{1,1}$ then it's also eigenvalue of $A$. There are $n_1$ (counting multiplicity) such eigenvalues. The same applies to the lower diagonal block $A_{2,2}$. So we have found the $n_1$ + $n_2 = n$ eigenvalues of the full matrix. (Wrong! This only applied to block diagonal matrix - Fixed below)
Suposse now that $lambda_2$ is eigenvalue of $A_{2,2}$ with eigenvector $p_2$.
If $lambda_2$ is also eigenvalue of $A_{1,1}$, we have proved above that it's also eigenvalue of $A$. So, let's assume it's not eigenvalue of $A_{1,1}$ - hence $|A_{1,1} - lambda_2 I|ne 0$. Now
$$left( begin{matrix} A_{1,1}&A_{1,2} \ 0 &A_{2,2} end{matrix} right)
left( begin{matrix} x \ p_2 end{matrix} right) =
left( begin{matrix} A_{1,1} x + A_{1,2} p_2 \ lambda_2 p_2 end{matrix} right)
$$
We can make $ A_{1,1} x + A_{1,2} p_2 = lambda_2 x$ by choosing $x = - (A_{1,1} - lambda_2 I)^{-1} A_{1,2} ; p_2$; and so we found an eigenvector for $A$ with $lambda_2$ as eigenvalue.
It this way, we showed that if $lambda$ is eigenvalue of $A_{1,1}$ or $A_{2,2}$, then it's an eigenvalue of $A$.
To complete the proof, one should show the other way round: that if $lambda$ is eigenvalue of $A$ then it's eigenvalue of $A_{1,1}$ or $A_{2,2}$. But that's easy:
$$left( begin{matrix} A_{1,1}&A_{1,2} \ 0 &A_{2,2} end{matrix} right)
left( begin{matrix} x_1 \ x_2 end{matrix} right) =
left( begin{matrix} A_{1,1} ; x_1 + A_{1,2} ; x_2 \ A_{2,2} ; x_2 end{matrix} right)
= left( begin{matrix} lambda ; x_1 \ lambda ; x_2 end{matrix} right)
$$
Now, either $x_2 = 0$ or not. If not, then $lambda$ is eigenvalue of $A_{2,2}$. If yes,
it's eigenvalue of $A_{1,1}$.
$endgroup$
$begingroup$
Well, you can't use exactly the same argument for an eigenvector $p_2$ of $A_{2,2}$ since $A (0, p_2)^T = (A_{1,2}p_2, A_{2,2}p_2)^T$.
$endgroup$
– Calle
Feb 11 '11 at 1:40
$begingroup$
You are right! Fortunately, I could fix it :-)
$endgroup$
– leonbloy
Feb 11 '11 at 2:16
2
$begingroup$
You could probably save some writing by noting that $A$ and $A^T$ have the same eigen values. So your initial argument could be carried on to $A_{11}$ by transposing $A$ and doing the same argument for $A_{11}^T$
$endgroup$
– user17762
Feb 11 '11 at 2:21
$begingroup$
Nice argument though.
$endgroup$
– user17762
Feb 11 '11 at 5:37
1
$begingroup$
@Babai If $A_{21}ne 0$ but $A_{12} = 0$ it's the same thing. If both off diagonal submatrices are non zero, then the thesis is false (eigenvalues of the full matrix are not given by the diagonal submatrices alone).
$endgroup$
– leonbloy
Jul 19 '16 at 12:04
|
show 2 more comments
$begingroup$
A simpler way is from the definition. Is is easy to show that if $lambda_1$ is an eigenvalue of the upper diagonal block $A_{1,1}$, with eigenvector $p_1$, (size $n_1$) then it's also an eigenvalue of the full matrix, with the same eigenvector augmented with zeros.
$A_{1,1} ; p_1 = lambda_1 p_1$ with $p_1 ne 0 $
So
$$ left( begin{matrix} A_{1,1}&A_{1,2} \ 0 &A_{2,2} end{matrix} right)
left( begin{matrix} p_1 \ 0 end{matrix} right) =
left( begin{matrix} A_{1,1} ; p_1 \ 0 end{matrix} right) =
left( begin{matrix} lambda_1 p_1 \ 0 end{matrix} right) =
lambda_1 left( begin{matrix} p_1 \ 0 end{matrix} right) $$
Hence if $lambda$ is eigenvalue of $A_{1,1}$ then it's also eigenvalue of $A$. There are $n_1$ (counting multiplicity) such eigenvalues. The same applies to the lower diagonal block $A_{2,2}$. So we have found the $n_1$ + $n_2 = n$ eigenvalues of the full matrix. (Wrong! This only applied to block diagonal matrix - Fixed below)
Suposse now that $lambda_2$ is eigenvalue of $A_{2,2}$ with eigenvector $p_2$.
If $lambda_2$ is also eigenvalue of $A_{1,1}$, we have proved above that it's also eigenvalue of $A$. So, let's assume it's not eigenvalue of $A_{1,1}$ - hence $|A_{1,1} - lambda_2 I|ne 0$. Now
$$left( begin{matrix} A_{1,1}&A_{1,2} \ 0 &A_{2,2} end{matrix} right)
left( begin{matrix} x \ p_2 end{matrix} right) =
left( begin{matrix} A_{1,1} x + A_{1,2} p_2 \ lambda_2 p_2 end{matrix} right)
$$
We can make $ A_{1,1} x + A_{1,2} p_2 = lambda_2 x$ by choosing $x = - (A_{1,1} - lambda_2 I)^{-1} A_{1,2} ; p_2$; and so we found an eigenvector for $A$ with $lambda_2$ as eigenvalue.
It this way, we showed that if $lambda$ is eigenvalue of $A_{1,1}$ or $A_{2,2}$, then it's an eigenvalue of $A$.
To complete the proof, one should show the other way round: that if $lambda$ is eigenvalue of $A$ then it's eigenvalue of $A_{1,1}$ or $A_{2,2}$. But that's easy:
$$left( begin{matrix} A_{1,1}&A_{1,2} \ 0 &A_{2,2} end{matrix} right)
left( begin{matrix} x_1 \ x_2 end{matrix} right) =
left( begin{matrix} A_{1,1} ; x_1 + A_{1,2} ; x_2 \ A_{2,2} ; x_2 end{matrix} right)
= left( begin{matrix} lambda ; x_1 \ lambda ; x_2 end{matrix} right)
$$
Now, either $x_2 = 0$ or not. If not, then $lambda$ is eigenvalue of $A_{2,2}$. If yes,
it's eigenvalue of $A_{1,1}$.
$endgroup$
$begingroup$
Well, you can't use exactly the same argument for an eigenvector $p_2$ of $A_{2,2}$ since $A (0, p_2)^T = (A_{1,2}p_2, A_{2,2}p_2)^T$.
$endgroup$
– Calle
Feb 11 '11 at 1:40
$begingroup$
You are right! Fortunately, I could fix it :-)
$endgroup$
– leonbloy
Feb 11 '11 at 2:16
2
$begingroup$
You could probably save some writing by noting that $A$ and $A^T$ have the same eigen values. So your initial argument could be carried on to $A_{11}$ by transposing $A$ and doing the same argument for $A_{11}^T$
$endgroup$
– user17762
Feb 11 '11 at 2:21
$begingroup$
Nice argument though.
$endgroup$
– user17762
Feb 11 '11 at 5:37
1
$begingroup$
@Babai If $A_{21}ne 0$ but $A_{12} = 0$ it's the same thing. If both off diagonal submatrices are non zero, then the thesis is false (eigenvalues of the full matrix are not given by the diagonal submatrices alone).
$endgroup$
– leonbloy
Jul 19 '16 at 12:04
|
show 2 more comments
$begingroup$
A simpler way is from the definition. Is is easy to show that if $lambda_1$ is an eigenvalue of the upper diagonal block $A_{1,1}$, with eigenvector $p_1$, (size $n_1$) then it's also an eigenvalue of the full matrix, with the same eigenvector augmented with zeros.
$A_{1,1} ; p_1 = lambda_1 p_1$ with $p_1 ne 0 $
So
$$ left( begin{matrix} A_{1,1}&A_{1,2} \ 0 &A_{2,2} end{matrix} right)
left( begin{matrix} p_1 \ 0 end{matrix} right) =
left( begin{matrix} A_{1,1} ; p_1 \ 0 end{matrix} right) =
left( begin{matrix} lambda_1 p_1 \ 0 end{matrix} right) =
lambda_1 left( begin{matrix} p_1 \ 0 end{matrix} right) $$
Hence if $lambda$ is eigenvalue of $A_{1,1}$ then it's also eigenvalue of $A$. There are $n_1$ (counting multiplicity) such eigenvalues. The same applies to the lower diagonal block $A_{2,2}$. So we have found the $n_1$ + $n_2 = n$ eigenvalues of the full matrix. (Wrong! This only applied to block diagonal matrix - Fixed below)
Suposse now that $lambda_2$ is eigenvalue of $A_{2,2}$ with eigenvector $p_2$.
If $lambda_2$ is also eigenvalue of $A_{1,1}$, we have proved above that it's also eigenvalue of $A$. So, let's assume it's not eigenvalue of $A_{1,1}$ - hence $|A_{1,1} - lambda_2 I|ne 0$. Now
$$left( begin{matrix} A_{1,1}&A_{1,2} \ 0 &A_{2,2} end{matrix} right)
left( begin{matrix} x \ p_2 end{matrix} right) =
left( begin{matrix} A_{1,1} x + A_{1,2} p_2 \ lambda_2 p_2 end{matrix} right)
$$
We can make $ A_{1,1} x + A_{1,2} p_2 = lambda_2 x$ by choosing $x = - (A_{1,1} - lambda_2 I)^{-1} A_{1,2} ; p_2$; and so we found an eigenvector for $A$ with $lambda_2$ as eigenvalue.
It this way, we showed that if $lambda$ is eigenvalue of $A_{1,1}$ or $A_{2,2}$, then it's an eigenvalue of $A$.
To complete the proof, one should show the other way round: that if $lambda$ is eigenvalue of $A$ then it's eigenvalue of $A_{1,1}$ or $A_{2,2}$. But that's easy:
$$left( begin{matrix} A_{1,1}&A_{1,2} \ 0 &A_{2,2} end{matrix} right)
left( begin{matrix} x_1 \ x_2 end{matrix} right) =
left( begin{matrix} A_{1,1} ; x_1 + A_{1,2} ; x_2 \ A_{2,2} ; x_2 end{matrix} right)
= left( begin{matrix} lambda ; x_1 \ lambda ; x_2 end{matrix} right)
$$
Now, either $x_2 = 0$ or not. If not, then $lambda$ is eigenvalue of $A_{2,2}$. If yes,
it's eigenvalue of $A_{1,1}$.
$endgroup$
A simpler way is from the definition. It is easy to show that if $\lambda_1$ is an eigenvalue of the upper diagonal block $A_{1,1}$ (of size $n_1$), with eigenvector $p_1$, then it is also an eigenvalue of the full matrix, with the same eigenvector augmented with zeros:
$A_{1,1} \, p_1 = \lambda_1 p_1$ with $p_1 \ne 0$.
So
$$ \left( \begin{matrix} A_{1,1} & A_{1,2} \\ 0 & A_{2,2} \end{matrix} \right)
\left( \begin{matrix} p_1 \\ 0 \end{matrix} \right) =
\left( \begin{matrix} A_{1,1} \, p_1 \\ 0 \end{matrix} \right) =
\left( \begin{matrix} \lambda_1 p_1 \\ 0 \end{matrix} \right) =
\lambda_1 \left( \begin{matrix} p_1 \\ 0 \end{matrix} \right) $$
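As a quick numerical sanity check of this step (a sketch using NumPy; the block sizes and random matrices are made up for illustration), padding an eigenvector of $A_{1,1}$ with zeros does give an eigenvector of the full matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
n1, n2 = 3, 2

# Random blocks; the lower-left block is zero, so A is block upper triangular.
A11 = rng.standard_normal((n1, n1))
A12 = rng.standard_normal((n1, n2))
A22 = rng.standard_normal((n2, n2))
A = np.block([[A11, A12], [np.zeros((n2, n1)), A22]])

# Take any eigenpair (lam1, p1) of A11 and pad p1 with n2 zeros.
lam, vecs = np.linalg.eig(A11)
lam1, p1 = lam[0], vecs[:, 0]
v = np.concatenate([p1, np.zeros(n2, dtype=p1.dtype)])

# A v = lam1 v, so (lam1, v) is an eigenpair of the full matrix.
print(np.allclose(A @ v, lam1 * v))  # True
```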
Hence if $\lambda$ is an eigenvalue of $A_{1,1}$ then it is also an eigenvalue of $A$. There are $n_1$ such eigenvalues (counting multiplicity). The same would seem to apply to the lower diagonal block $A_{2,2}$, giving all $n_1 + n_2 = n$ eigenvalues of the full matrix. (Wrong! That last step only applies to a block diagonal matrix; it is fixed below.)
Suppose now that $\lambda_2$ is an eigenvalue of $A_{2,2}$ with eigenvector $p_2$.
If $\lambda_2$ is also an eigenvalue of $A_{1,1}$, we have proved above that it is also an eigenvalue of $A$. So let us assume it is not an eigenvalue of $A_{1,1}$, hence $|A_{1,1} - \lambda_2 I| \ne 0$. Now
$$ \left( \begin{matrix} A_{1,1} & A_{1,2} \\ 0 & A_{2,2} \end{matrix} \right)
\left( \begin{matrix} x \\ p_2 \end{matrix} \right) =
\left( \begin{matrix} A_{1,1} x + A_{1,2} p_2 \\ \lambda_2 p_2 \end{matrix} \right)
$$
We can make $A_{1,1} x + A_{1,2} p_2 = \lambda_2 x$ by choosing $x = -(A_{1,1} - \lambda_2 I)^{-1} A_{1,2} \, p_2$, and so we have found an eigenvector of $A$ with eigenvalue $\lambda_2$.
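A hedged numerical sketch of this construction (random matrices for illustration; it assumes $\lambda_2$ is not an eigenvalue of $A_{1,1}$, which holds generically for random blocks):

```python
import numpy as np

rng = np.random.default_rng(1)
n1, n2 = 3, 2
A11 = rng.standard_normal((n1, n1))
A12 = rng.standard_normal((n1, n2))
A22 = rng.standard_normal((n2, n2))
A = np.block([[A11, A12], [np.zeros((n2, n1)), A22]])

# An eigenpair (lam2, p2) of the lower block.
lam, vecs = np.linalg.eig(A22)
lam2, p2 = lam[0], vecs[:, 0]

# The choice from the text: x = -(A11 - lam2 I)^{-1} A12 p2.
# np.linalg.solve avoids forming the inverse explicitly.
x = -np.linalg.solve(A11 - lam2 * np.eye(n1), A12 @ p2)

# Stacking (x, p2) gives an eigenvector of A with eigenvalue lam2.
w = np.concatenate([x, p2])
print(np.allclose(A @ w, lam2 * w))  # True
```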
In this way, we have shown that if $\lambda$ is an eigenvalue of $A_{1,1}$ or $A_{2,2}$, then it is an eigenvalue of $A$.
To complete the proof, one should show the converse: if $\lambda$ is an eigenvalue of $A$ then it is an eigenvalue of $A_{1,1}$ or $A_{2,2}$. But that is easy:
$$ \left( \begin{matrix} A_{1,1} & A_{1,2} \\ 0 & A_{2,2} \end{matrix} \right)
\left( \begin{matrix} x_1 \\ x_2 \end{matrix} \right) =
\left( \begin{matrix} A_{1,1} \, x_1 + A_{1,2} \, x_2 \\ A_{2,2} \, x_2 \end{matrix} \right)
= \left( \begin{matrix} \lambda \, x_1 \\ \lambda \, x_2 \end{matrix} \right)
$$
Now, either $x_2 = 0$ or not. If $x_2 \ne 0$, then $A_{2,2} x_2 = \lambda x_2$ shows $\lambda$ is an eigenvalue of $A_{2,2}$. If $x_2 = 0$, then $x_1 \ne 0$ (an eigenvector cannot be zero) and $A_{1,1} x_1 = \lambda x_1$, so $\lambda$ is an eigenvalue of $A_{1,1}$.
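Putting both directions together, the spectrum of the block triangular matrix is the multiset union of the blocks' spectra. A small NumPy check (symmetric diagonal blocks are used only so all eigenvalues are real and easy to sort; all names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
n1, n2 = 4, 3

# Symmetric diagonal blocks -> real eigenvalues (simpler to compare sorted).
M1 = rng.standard_normal((n1, n1)); A11 = (M1 + M1.T) / 2
M2 = rng.standard_normal((n2, n2)); A22 = (M2 + M2.T) / 2
A12 = rng.standard_normal((n1, n2))
A = np.block([[A11, A12], [np.zeros((n2, n1)), A22]])

# Eigenvalues of the full matrix vs. the combined eigenvalues of the blocks.
eig_A = np.sort(np.linalg.eigvals(A).real)
eig_blocks = np.sort(np.concatenate([np.linalg.eigvalsh(A11),
                                     np.linalg.eigvalsh(A22)]))
print(np.allclose(eig_A, eig_blocks))  # True
```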
edited Sep 21 '15 at 0:04
answered Feb 11 '11 at 0:32
leonbloy
$begingroup$
Well, you can't use exactly the same argument for an eigenvector $p_2$ of $A_{2,2}$ since $A (0, p_2)^T = (A_{1,2}p_2, A_{2,2}p_2)^T$.
$endgroup$
– Calle
Feb 11 '11 at 1:40
$begingroup$
You are right! Fortunately, I could fix it :-)
$endgroup$
– leonbloy
Feb 11 '11 at 2:16
$begingroup$
You could probably save some writing by noting that $A$ and $A^T$ have the same eigenvalues. So your initial zero-padding argument could be carried over to the other block by transposing $A$ and applying the same argument to $A_{2,2}^T$.
$endgroup$
– user17762
Feb 11 '11 at 2:21
$begingroup$
Nice argument though.
$endgroup$
– user17762
Feb 11 '11 at 5:37
$begingroup$
@Babai If $A_{21} \ne 0$ but $A_{12} = 0$, it is the same thing. If both off-diagonal submatrices are nonzero, then the thesis is false (the eigenvalues of the full matrix are not given by the diagonal submatrices alone).
$endgroup$
– leonbloy
Jul 19 '16 at 12:04
$begingroup$
For another approach, you can use the Gershgorin disc theorem (the name is sometimes transliterated as Gerschgorin or Geršgorin) to compare the discs of the individual blocks with the discs of the large matrix. For the rows belonging to the lower block, the discs coincide exactly with those of $A_{2,2}$: the zero lower-left block contributes $|0| = 0$ to each of those radii.
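A small sketch of the disc computation (illustrative NumPy code, not from the original answer): because the lower-left block is zero, the rows of the big matrix coming from the lower block have exactly the same Gershgorin discs as the rows of $A_{2,2}$.

```python
import numpy as np

def gershgorin_discs(M):
    """(center, radius) of the Gershgorin disc for each row of M."""
    centers = np.diag(M)
    radii = np.abs(M).sum(axis=1) - np.abs(centers)
    return centers, radii

rng = np.random.default_rng(3)
n1, n2 = 3, 2
A11 = rng.standard_normal((n1, n1))
A12 = rng.standard_normal((n1, n2))
A22 = rng.standard_normal((n2, n2))
A = np.block([[A11, A12], [np.zeros((n2, n1)), A22]])

cA, rA = gershgorin_discs(A)
c22, r22 = gershgorin_discs(A22)

# The last n2 rows of A see only zeros to the left of A22,
# so their discs coincide with those of A22.
print(np.allclose(cA[n1:], c22), np.allclose(rA[n1:], r22))  # True True
```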
$endgroup$
answered Apr 8 '17 at 16:59
mathreadler
$begingroup$
Thank you so much for posting this question! You may have just single handedly saved my research!!!
$endgroup$
– Paul
Jan 31 '13 at 20:43
$begingroup$
What if $A_{2,1}$ is not $0$?
$endgroup$
– Ashutosh Gupta
May 12 '16 at 4:40
$begingroup$
@tsiki What if $A_{21}$ is not zero?
$endgroup$
– Babai
Jul 19 '16 at 7:44