Do large integers carry more information than small integers?
Naively, the answer seems to be yes. Our representation of numbers in a base-ten system requires that large integers take more digits. Large numbers also take up more space when we write them in binary, and can be used to encode more information on a computer.
But since every integer represents just one outcome from the set of all possible integers, their informational entropy should be the same, just as each side of a die is equally informative in a dice game.
So, do large integers carry more information than small integers, or does the interaction of integers with a base-ten system produce different amounts of information for different integers?
information-theory
asked 2 days ago by J-- (a new contributor)
How do you define information? – John Douma, 2 days ago
The Kolmogorov complexity is in fact a measure of how much "information" a number contains. Most large numbers cannot be compressed (that is, produced by a program much shorter than "print <number>"), so in this sense larger numbers generally contain more information. – Peter, 2 days ago
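A rough sketch of this compressibility point, using zlib compression as a very crude stand-in for Kolmogorov complexity (which is uncomputable in general); the specific numbers are purely illustrative:

```python
import random
import zlib

# A highly structured 501-digit number versus 501 random decimal digits.
structured = str(10**500)                      # "1" followed by 500 zeros
random.seed(0)
random_digits = "".join(random.choice("0123456789") for _ in range(501))

# zlib squeezes the structured digits down to a handful of bytes, but does far
# worse on the random digits -- a crude analogue of "a short description exists"
# versus "no short description is known".
print(len(zlib.compress(structured.encode())))
print(len(zlib.compress(random_digits.encode())))
```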
Another way to define information is as an appropriate function of the probability that a random integer would be the value of interest. All sufficiently large integers must be very improbable once a distribution is specified, so the information is greater. – J.G., 2 days ago
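A minimal sketch of this idea under one specific choice of distribution, a geometric distribution on the positive integers with parameter $p$ (chosen only for illustration); any fixed distribution eventually makes large integers improbable, so their self-information grows:

```python
from math import log2

def self_information(n: int, p: float = 0.5) -> float:
    """Bits of self-information of the outcome N = n when P(N = n) = (1 - p)**(n - 1) * p."""
    return -log2((1 - p) ** (n - 1) * p)

print(self_information(1))   # 1.0 bit   (probability 1/2)
print(self_information(20))  # 20.0 bits (probability 1/2**20)
```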
Depends. In a language where there are only two words, but each word is 999 letters long, each word still carries only one bit's worth of information. There needs to be more context. – Anadactothe, 2 days ago
It seems useless to talk about a game of dice if nobody can ever know how many sides each die has. But if you tell me you rolled $1$ on a $45249081$-sided die, then you have given me a lot more information than just the number $1$. – David K, 2 days ago
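(For a fair die, that roll carries $\log_2 45249081 \approx 25.4$ bits of self-information, versus $\log_2 6 \approx 2.6$ bits for an ordinary six-sided die.)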
2 Answers
It will help to think of information as a measure of difficulty: roughly, "more information" means "harder to describe" (the connection being that in order to describe it you have to say more) or "harder to compress" (as a string).
In light of this it should be clear that size doesn't control information: the number $$X={10}^{{10}^{10^{10^{10^{10}}}}}$$ is rather large, but quite simple to describe (I've just done it); by contrast, consider the number $$Y=4361748963429187634192343214123412345654678492734536.$$ This is - relatively speaking - tiny. But it's (at a glance, at least) much harder to describe.
Now, per your comment "Large numbers also take up more space when we write them in binary" - a key point here is that we get to choose how to describe the numbers in question. If I tried to write out $X$ in decimal notation, I'd be in trouble, but the point is that there is some way to write $X$ compactly. The issue with $Y$ is that there isn't any obvious way to "repackage" it in a simpler form. Maybe we're amazingly lucky and it's (say) the smallest counterexample to Goldbach's conjecture - which would give a relatively simple way to define it ("the smallest counterexample to Goldbach's conjecture") - but barring such surprises, $Y$ is in fact harder to describe (= contains more information) than $X$.
– Noah Schweber, answered 2 days ago
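A minimal sketch of the same contrast, taking the length of the shortest Python expression we happen to know for each number as a rough stand-in for "description length" (the expression for $X$ is never evaluated, only measured):

```python
# A short description is known for X: a 22-character expression.  Do NOT
# evaluate it -- the value is far too large to compute; we only measure length.
X_expr = "10**10**10**10**10**10"

# For Y, no description shorter than its own digits is apparent.
Y = 4361748963429187634192343214123412345654678492734536
Y_expr = str(Y)

print(len(X_expr))  # 22
print(len(Y_expr))  # 52: one character per decimal digit
```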
Think about the reverse problem: If I want to encode the complete works of Shakespeare into an integer, I don't think I can do it with anything smaller than $100.$ I could take the very large base-16 number formed from the ASCII file of all of Shakespeare's works.
So it's the case that I am able to encode more information in a large integer than in a small one.
– B. Goddard, answered 2 days ago
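A small sketch of the encoding direction, using Python's built-in byte/integer conversion rather than the base-16 construction described above (the idea is the same):

```python
text = "To be, or not to be, that is the question"

# Interpret the ASCII bytes of the text as one big integer ...
n = int.from_bytes(text.encode("ascii"), byteorder="big")

# ... and recover the text from that integer.
decoded = n.to_bytes((n.bit_length() + 7) // 8, byteorder="big").decode("ascii")

print(n.bit_length())   # several hundred bits: a very large integer
print(decoded == text)  # True
```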