Kernel of matrix polynomials
I am told that $f(x) = g(x)h(x)$ is a polynomial such that $f(A) = 0$, where $A$ is an $n \times n$ matrix.
I am further told that $g(x)$ and $h(x)$ are coprime. I need to show that $\mathbb{R}^n$ is the direct sum of $\ker g(A)$ and $\ker h(A)$.
My attempt at a solution:
Since $g(x)$ and $h(x)$ are coprime, they have no roots or factors in common, so I can deduce that $\ker g(A) \cap \ker h(A) = \{0\}$.
I now want to show that if a vector $v \in \mathbb{R}^n$ is not in $\ker g(A)$, then it is in $\ker h(A)$, and vice versa.
Since $f(A) = 0$, we have $f(A)v = 0 \Rightarrow g(A)h(A)v = 0$.
Because of the commutativity of polynomial multiplication, this also means that $h(A)g(A)v = 0$.
If $v \notin \ker g(A)$ then $g(A)v \ne 0$, so $h(A)v = 0$. Is this true? Am I on the right track?
Any help would be greatly appreciated!
linear-algebra matrices polynomials
Your target is wrong: the direct sum is not the disjoint union. You need to show that any $v$ can be written as $v_1 + v_2$ where $v_1 \in \ker g(A)$ and $v_2 \in \ker h(A)$. For this purpose you must use relative primality.
– астон вілла олоф мэллбэрг
Mar 17 at 7:46
Thanks. Do you have any hints on how to do this? All I can think of is showing that since $f(A)v = 0$ for all $v \in \mathbb{R}^n$, then $f(A)(v_1 + v_2) = 0$, so $v_1 + v_2$ must be in one of the kernels, but can't be in both.
– Andrew
Mar 17 at 7:50
@Andrew Hint: Bézout's identity.
– M. Vinay
Mar 17 at 7:52
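For the hinted approach, the Bézout coefficients of a concrete coprime pair can be computed symbolically. This is an illustrative sketch (not part of the original thread, and it assumes SymPy is available): `sympy.gcdex(g, h, x)` returns $(s, t, d)$ with $s\,g + t\,h = d = \gcd(g, h)$, so for coprime $g, h$ the gcd $d$ is $1$ and $s, t$ play the roles of $p, q$.

```python
from sympy import symbols, gcdex, simplify

x = symbols('x')

# Hypothetical coprime factors, e.g. f(x) = x(x - 1) for a projection matrix.
g = x
h = x - 1

# gcdex returns (s, t, d) with s*g + t*h = d = gcd(g, h).
s, t, d = gcdex(g, h, x)

assert d == 1                        # coprime: the gcd is 1
assert simplify(s*g + t*h - d) == 0  # the Bezout identity holds
# s and t are the polynomials p(x), q(x) used in the decomposition below.
```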
asked Mar 17 at 7:33 – Andrew
1 Answer
Use the fact that if $g, h$ are relatively prime polynomials then there exist polynomials $p(x), q(x)$ such that $p(x)g(x) + q(x)h(x) = 1$ for all $x$ (Bézout's identity), so in particular $p(A)g(A) + q(A)h(A) = I$.
Now, with this in mind, let $v \in \mathbb{R}^n$. Then we may write $v = p(A)g(A)v + q(A)h(A)v$.
Now, it is clear (by commutativity of the polynomial operators in $A$) that:
$$
h(A)[p(A)g(A)v] = p(A)[g(A)h(A)v] = 0, \\
g(A)[q(A)h(A)v] = q(A)[g(A)h(A)v] = 0.
$$
Therefore $v$ is a sum of two elements lying in the kernels of $h(A)$ and $g(A)$ respectively. Adding the trivial intersection of these subspaces gives the required conclusion.
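As a concrete numerical check of this decomposition (an illustration added here, not part of the original answer; it assumes NumPy), take a projection matrix $A$ with $A^2 = A$, so $f(x) = x(x-1)$ with coprime factors $g(x) = x$ and $h(x) = x - 1$. Bézout gives $1 \cdot x + (-1)(x - 1) = 1$, i.e. $p = 1$, $q = -1$, and any $v$ splits as $v = p(A)g(A)v + q(A)h(A)v$:

```python
import numpy as np

# A projection matrix: A @ A == A, so f(A) = A^2 - A = g(A) h(A) = 0
# with g(x) = x and h(x) = x - 1 (coprime: 1*x + (-1)*(x - 1) = 1).
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 0.0, 0.0],
              [0.0, 0.0, 0.0]])
I = np.eye(3)
gA, hA = A, A - I
assert np.allclose(gA @ hA, np.zeros((3, 3)))   # f(A) = 0

v = np.array([2.0, -1.0, 3.0])                  # an arbitrary vector

# Bezout decomposition v = p(A)g(A)v + q(A)h(A)v with p = 1, q = -1:
v_in_ker_h = gA @ v        # p(A)g(A)v, lies in ker h(A)
v_in_ker_g = -(hA @ v)     # q(A)h(A)v, lies in ker g(A)

assert np.allclose(v_in_ker_h + v_in_ker_g, v)  # the two pieces sum to v
assert np.allclose(hA @ v_in_ker_h, 0.0)        # indeed in ker h(A)
assert np.allclose(gA @ v_in_ker_g, 0.0)        # indeed in ker g(A)
```

The same arithmetic works for any $v$ because $p(A)g(A) + q(A)h(A) = I$ as operators, not just on this particular vector.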
answered Mar 17 at 8:12 – астон вілла олоф мэллбэрг