Inequality of vector norms with projections
Let $W$ be a vector subspace of $V$, a space with a dot product, and let $v \in V$. Let $p_W(v)$ be the orthogonal projection of $v$ onto $W$, and let $w \in W$ with $w \neq p_W(v)$.
How can I prove that $\|v - w\| > \|v - p_W(v)\|$?
linear-algebra vector-spaces norm orthogonality
asked Mar 16 at 17:42 by Noé Duarte González (edited Mar 16 at 17:54)
Pray tell, what is $w'$? – Robert Lewis, Mar 16 at 17:49
My bad, I meant $w$. – Noé Duarte González, Mar 16 at 17:55
Thanks for the correction! – Robert Lewis, Mar 16 at 18:06
2 Answers
Let $v = v_\| + v_\bot$, where $v_\| = p_W(v)$ and $v_\bot \in W^\bot$. Writing $x^2$ as shorthand for $x \cdot x$, and using $v_\bot \cdot w = 0$ for $w \in W$:
$\|v - w\|^2 = (v - w)^2 = v^2 + w^2 - 2\,v \cdot w = v_\bot^2 + v_\|^2 + w^2 - 2\,v_\| \cdot w = v_\bot^2 + (v_\| - w)^2.$
$\|v - v_\|\|^2 = v_\bot^2 < v_\bot^2 + (v_\| - w)^2$ since $w \neq v_\|$, QED.
answered Mar 16 at 18:14 by colt_browning
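Not part of the answer above, but here is a quick numerical sanity check of the decomposition argument — a minimal sketch in NumPy, where the spanning vectors for $W$ and the particular choices of $v$ and $w$ are illustrative assumptions rather than anything from the original post:

    import numpy as np

    # Illustrative subspace: W = column span of A in R^3 (assumed for this check).
    A = np.array([[1.0, 0.0],
                  [0.0, 1.0],
                  [1.0, 1.0]])

    # Orthogonal projector onto W: P = A (A^T A)^{-1} A^T.
    P = A @ np.linalg.inv(A.T @ A) @ A.T

    v = np.array([2.0, -1.0, 3.0])   # some v in V = R^3
    v_par = P @ v                    # v_parallel = p_W(v)
    v_perp = v - v_par               # v_perp lies in W^perp

    w = A @ np.array([0.5, -2.0])    # some w in W; here w != p_W(v)

    # Pythagoras: ||v - w||^2 = ||v_perp||^2 + ||v_par - w||^2,
    # so ||v - w|| > ||v - p_W(v)|| whenever w != p_W(v).
    lhs = np.linalg.norm(v - w) ** 2
    rhs = np.linalg.norm(v_perp) ** 2 + np.linalg.norm(v_par - w) ** 2
    print(np.isclose(lhs, rhs))                                 # True
    print(np.linalg.norm(v - w) > np.linalg.norm(v - v_par))    # True

Of course this only spot-checks one configuration; the algebra in the answer is what proves the inequality for every $w \in W$.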
Since $W \subset V$ is a fixed subspace, I am going to drop the subscript "$W$" from $P$; in this answer, $P = P_W$.
Recall that an orthogonal projection $P$ satisfies
$P^2 = P = P^T; \tag 1$
thus for any
$x, y \in V \tag 2$
we have
$\langle x, Py \rangle = \langle P^T x, y \rangle = \langle Px, y \rangle; \tag 3$
now consider the three vectors $v$, $Pv$, and $w$;
$Pv, w \in W \Longrightarrow Pv - w \in W, \tag 4$
$\langle v - Pv, w \rangle = \langle v, w \rangle - \langle Pv, w \rangle = \langle v, w \rangle - \langle v, Pw \rangle = \langle v, w \rangle - \langle v, w \rangle = 0, \tag 5$
where we have used (3) and the fact that $Pw = w$ for $w \in W$; since this holds for every $w \in W$, we have established that
$v - Pv \in W^\bot; \tag 6$
we write
$v - w = (v - Pv) + (Pv - w), \tag 7$
and by virtue of (4) and (6)
$\langle v - Pv, Pv - w \rangle = 0, \tag 8$
whence
$\Vert v - w \Vert^2 = \langle v - w, v - w \rangle = \langle (v - Pv) + (Pv - w), (v - Pv) + (Pv - w) \rangle$
$= \langle v - Pv, v - Pv \rangle + 2\langle v - Pv, Pv - w \rangle + \langle Pv - w, Pv - w \rangle$
$= \Vert v - Pv \Vert^2 + \Vert Pv - w \Vert^2; \tag 9$
since each term on the right is non-negative, we find that
$\Vert v - w \Vert^2 \ge \Vert v - Pv \Vert^2, \tag{10}$
whence
$\Vert v - w \Vert \ge \Vert v - Pv \Vert, \; \forall w \in W, \tag{11}$
with equality holding precisely when
$\Vert w - Pv \Vert = 0 \Longleftrightarrow w = Pv; \tag{12}$
thus
$w \ne Pv \Longrightarrow \Vert v - w \Vert > \Vert v - Pv \Vert. \tag{13}$
$\OE\Delta$.
answered Mar 17 at 19:46 by Robert Lewis
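As a follow-up (not part of the answer itself): a short numerical check of property (1) and identity (9), sketched in NumPy under the assumption that $W$ is the plane $z = 0$ in $\mathbb{R}^3$; the concrete projector and vectors below are chosen purely for illustration:

    import numpy as np

    # Assumed example: orthogonal projector onto the plane z = 0 in R^3.
    P = np.diag([1.0, 1.0, 0.0])

    # Property (1): P^2 = P = P^T.
    print(np.allclose(P @ P, P), np.allclose(P, P.T))   # True True

    v = np.array([1.0, 2.0, 3.0])
    w = np.array([4.0, -1.0, 0.0])   # some w in W = {z = 0}, with w != Pv

    # Identity (9): ||v - w||^2 = ||v - Pv||^2 + ||Pv - w||^2.
    lhs = np.linalg.norm(v - w) ** 2
    rhs = np.linalg.norm(v - P @ v) ** 2 + np.linalg.norm(P @ v - w) ** 2
    print(np.isclose(lhs, rhs))      # True

    # (13): the inequality is strict because w != Pv.
    print(np.linalg.norm(v - w) > np.linalg.norm(v - P @ v))    # True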