Entropy vs predictability in PRNGs

8 votes

If entropy is the measure of surprise, and a given PRNG has a uniform distribution, then the entropy would be high. So the Mersenne Twister (MT) has high entropy.

But the MT is also predictable: you can retrieve its past bits and predict its future bits.

What's the relationship between entropy and predictability?

random-number-generator entropy

asked 12 hours ago, edited 8 hours ago – Bastien

• If it's predictable, how can it be surprising? – Ella Rose, 9 hours ago

• @fgrieu You're right, thanks. Made it more general. – Bastien, 8 hours ago

• @EllaRose I'm trying to draw a clear line between entropy and predictability as they apply to random number generators. – Bastien, 7 hours ago

5 Answers

4 votes – fgrieu, answered 7 hours ago, edited 6 hours ago

"entropy is the measure of surprise"

That's informal and short, but correct within those limits. In the case of a random number generator, we must refine it: entropy is the measure of surprise in the output of the RNG, for a skilled person who knows the RNG's design, including any parameters (for MT: the Mersenne prime used, and a few others), with the seed (if any) unknown and assumed uniformly random, and with arbitrarily large computing power (unless otherwise stated).

Notice that entropy is a property of the generator, not of one particular bitstring that it outputs.

Entropy can further be defined for the total output of a generator, or per output bit. In cryptography we measure entropy in bits, so an ideal uniform true RNG yields 1 bit of entropy per output bit.

For any pseudo RNG, the whole output is predictable from the design, parameters and seed; hence the entropy of the whole output is limited to the entropy of whatever generated its seed, which is finite. The entropy per output bit therefore decreases to zero as the output size grows towards infinity.




"the Mersenne Twister (MT) has high entropy."

No, because it is a PRNG (see above).

"the MT is also predictable."

Yes, given enough output and little computing power.
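
As a concrete illustration of that predictability, here is a minimal Python sketch (CPython's random module is MT19937-based; the untempering below is the standard inversion of MT's output tempering, and the setstate tuple layout is CPython-specific, so treat this as a demonstration rather than a portable attack). It recovers the full internal state from 624 observed 32-bit outputs and then predicts the generator exactly:

    import random

    def undo_xor_rshift(y, shift):
        # Invert y ^= y >> shift by iterating to a fixed point.
        result = y
        for _ in range(32 // shift + 1):
            result = y ^ (result >> shift)
        return result

    def undo_xor_lshift_mask(y, shift, mask):
        # Invert y ^= (y << shift) & mask the same way.
        result = y
        for _ in range(32 // shift + 1):
            result = y ^ ((result << shift) & mask)
        return result & 0xFFFFFFFF

    def untemper(y):
        # Undo MT19937's output tempering, last step first.
        y = undo_xor_rshift(y, 18)
        y = undo_xor_lshift_mask(y, 15, 0xEFC60000)
        y = undo_xor_lshift_mask(y, 7, 0x9D2C5680)
        y = undo_xor_rshift(y, 11)
        return y

    # The attacker observes 624 consecutive 32-bit outputs. A freshly
    # seeded generator starts at a state-refill boundary, which keeps
    # the clone aligned with the victim.
    observed = [random.getrandbits(32) for _ in range(624)]
    state = [untemper(w) for w in observed]

    # Load the recovered state into a fresh generator; index 624 forces
    # a re-twist on the next call, mirroring the victim.
    clone = random.Random()
    clone.setstate((3, tuple(state + [624]), None))

    # Every future output of the victim is now known in advance.
    assert all(clone.getrandbits(32) == random.getrandbits(32)
               for _ in range(1000))
    print("MT19937 cloned; 1000 future outputs predicted correctly")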






"What's the relationship between entropy and predictability?"

If a bitstring generator has the property that its full output is predictable from a finite-length prefix (as is the case for MT), then that generator has finite total entropy (bounded by the length of that prefix in bits, for Shannon entropy) and vanishingly small entropy per output bit.

The converse is false for any practical definition of (un)predictable. In particular, there exist practical cryptographically secure PRNGs (thus of finite total entropy) that are practically unpredictable.






• If as you say H(any bit string) = 0, and you wanted to transmit that string to the Martians without loss in one attempt, what determines the minimum transmission length? – Paul Uszak, 3 hours ago

• Ok, I'll repeat this back to you to see if I understand it. PRNGs have relatively low entropy because, while they may start with some high-entropy bits from some source as the seed, the way the PRNG then generates subsequent bits is linear and predictable, such that all the effective entropy is in the seed only. A CSPRNG in contrast generates bits in a non-linear fashion such that, even though it's still deterministic, the effective (practical) entropy remains high because it's infeasible to predict the next bit. Is that right? – Bastien, 2 hours ago

1 vote – Ievgeni (new contributor), answered 12 hours ago

The entropy of a random string is the number of bits you need to describe the full string; there is no computational aspect to it. But if MT is predictable, that means you can recover the full sequence from "few" bits, so it cannot have high entropy.

In fact an efficient PRNG cannot have very "high" entropy (because the seed/secret key should not be too long), but that does not mean there is an efficient attack that recovers the full string.






• "The entropy of a random string is the number of bits you need to describe the full string" appears to be the definition of Kolmogorov complexity, rather than entropy. – Ella Rose, 9 hours ago

1 vote – supercat, answered 8 hours ago

For the entropy of a bit string to be meaningful, the string must have been chosen by some particular random or partially random process which had a certain probability of producing that exact string, and a certain probability of producing something else. The entropy of the string is then, roughly(*), the negative log of the probability that the process which produced it would have done so. Thus, if some process generates a string that has 8 bits of entropy, the process had a 1 in 256 chance of generating that particular string.

(*) There are a variety of ways of measuring entropy, but for most purposes they are close enough to each other that a simple approximation is reasonably close to all of them.

If a process does not generate all strings with equal probability, it is often appropriate to regard the entropy produced by the process as that of the highest-probability string. So if a process has a 50% chance of generating the 16-bit all-zero string, and a 1 in 131,070 chance of generating each other 16-bit string, it would generally be appropriate to regard the process as yielding one bit of entropy, even though one could filter the output to yield more (e.g. generating bit strings until one gets a string that isn't all zeroes would yield 15.999978 bits of entropy, while requiring, on average, only twice as long as generating one bit).
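
The figures in that example can be checked directly; a small Python sketch (just redoing the arithmetic from the paragraph above):

    import math

    # The process: 50% chance of the all-zero 16-bit string, and each
    # of the 65535 other strings with probability 0.5/65535 = 1/131070.
    p_max = 0.5
    min_entropy = -math.log2(p_max)   # 1.0 bit: the highest-probability-string measure

    # Filtered output: uniform over the 65535 non-zero strings.
    filtered = math.log2(65535)       # 15.99997798... bits, i.e. ~15.999978

    print(min_entropy, filtered)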






• OK, I've seen the negative log calculation before. So if a random process has a uniform distribution, we can say its entropy is high, correct? So what I'm trying to understand is the relationship between entropy and predictability. Because from what I'm reading, having high entropy doesn't automatically make something unpredictable. So you can have a PRNG with high entropy that still wouldn't be suited for cryptographic purposes. – Bastien, 7 hours ago

• @EllaRose Yeah, "if a random process has a uniform distribution, we can say its entropy is high" is vacuously true in that case. You would have to show a random process with low entropy. The lowest would be a random bit, but in general if you randomly sample from a uniform distribution, the entropy will be maximized for that domain. Hashing an incrementing counter would not count as a random sample, since it gives the same results each run. – PyRulez, 7 hours ago

• Ah, I see where I went wrong: using a deterministic process to try and prove a point about a random one. (Apologies to supercat for blowing up their notifications.) – Ella Rose, 7 hours ago

• @fgrieu: The entropy of a bit string in a particular context is measured relative to the probability that the process that produced it could have produced alternative bitstrings instead, and is meaningful only in contexts where that can be measured or estimated. – supercat, 6 hours ago

• @supercat: you are right about that possible definition of the entropy of a particular bitstring; I should have read more carefully. However, I think the question uses entropy for that of the generator, not of a bitstring. – fgrieu, 5 hours ago

1 vote – PyRulez, answered 7 hours ago

A PRNG has a pseudo-uniform distribution, so to speak: there is actually a correlation between its outputs, so its entropy is limited to that of the seed.

Having (really) low entropy makes something predictable, since you can just brute-force the seed. The converse is not true, however. A poorly designed PRNG will leak entropy in a way an adversary can take advantage of (unless the PRNG is not meant to resist adversaries, like one in a video game). When a good PRNG leaks entropy, on the other hand, it is computationally infeasible to take advantage of, so for practical purposes its entropy effectively never decreases.
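
To make "just brute-force the seed" concrete, here is a minimal sketch; the 16-bit seed space and the use of Python's MT-based random.Random are illustrative assumptions, not details from the answer:

    import random

    SEED_BITS = 16  # hypothetical low-entropy seed: only 2**16 possibilities

    def prng_output(seed, n_words=4):
        rng = random.Random(seed)
        return [rng.getrandbits(32) for _ in range(n_words)]

    # The victim seeds from a low-entropy source; the attacker sees a few outputs.
    secret_seed = random.randrange(2 ** SEED_BITS)
    observed = prng_output(secret_seed)

    # Exhaustive search over the whole seed space: 65536 trials.
    recovered = next(s for s in range(2 ** SEED_BITS)
                     if prng_output(s) == observed)

    assert recovered == secret_seed
    print("recovered seed:", recovered)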






0 votes

"...a given PRNG has a uniform distribution, then the entropy would be high."

In cryptography, it is common to consider the entropy of a generator to be the unknown input to that generator; it is equivalent to the seed or key of an RNG. So for a common AES-128 counter-based CSPRNG, the entropy would be 128 bits. The output distribution is irrelevant where cryptographic entropy is concerned, although most RNGs in their original forms do produce uniform output; they would be difficult to use for encryption otherwise.
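
A minimal sketch of such a counter-based construction, assuming the third-party pyca/cryptography package (recent versions, where the backend argument is optional); the zero counter start and the output length are arbitrary illustration choices:

    import os
    from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

    # All of the generator's entropy enters here: a 128-bit key.
    key = os.urandom(16)

    # AES in CTR mode over an incrementing counter block; encrypting
    # zero bytes exposes the raw keystream, i.e. the generator's output.
    gen = Cipher(algorithms.AES(key), modes.CTR(b"\x00" * 16)).encryptor()

    # However many bytes are drawn, total entropy stays capped at 128
    # bits; predicting further output without the key means breaking AES.
    output = gen.update(b"\x00" * 64)
    print(output.hex())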




"So the Mersenne Twister (MT) has high entropy."

Ish. It starts out high, with an entropy of 19,937 bits for the common 32-bit implementation. However, MT is invertible, and the state can be discovered by looking at sufficient output: observing 624 output words allows the internal state to be recovered. Consequently the entropy falls by 32 bits per output word, reaching zero after 624 outputs; any subsequent output is entirely predictable. The leftover hash lemma applies to MT outputs of fewer than 624 words, where MT would be acting as a very inefficient randomness extractor.

"What's the relationship between entropy and predictability?"

In cryptography generally, entropy = unpredictability = surprisal. That is what is needed for seeding RNGs and creating keys: a cryptographer strives for unpredictable RNG sequences and unknown keys. A CSPRNG cannot be inverted, so the seed/key cannot be recovered, and thus the entropy, unpredictability and surprisal of the output are preserved no matter the length of that output.






    share|improve this answer









    $endgroup$













      Your Answer





      StackExchange.ifUsing("editor", function () {
      return StackExchange.using("mathjaxEditing", function () {
      StackExchange.MarkdownEditor.creationCallbacks.add(function (editor, postfix) {
      StackExchange.mathjaxEditing.prepareWmdForMathJax(editor, postfix, [["$", "$"], ["\\(","\\)"]]);
      });
      });
      }, "mathjax-editing");

      StackExchange.ready(function() {
      var channelOptions = {
      tags: "".split(" "),
      id: "281"
      };
      initTagRenderer("".split(" "), "".split(" "), channelOptions);

      StackExchange.using("externalEditor", function() {
      // Have to fire editor after snippets, if snippets enabled
      if (StackExchange.settings.snippets.snippetsEnabled) {
      StackExchange.using("snippets", function() {
      createEditor();
      });
      }
      else {
      createEditor();
      }
      });

      function createEditor() {
      StackExchange.prepareEditor({
      heartbeatType: 'answer',
      autoActivateHeartbeat: false,
      convertImagesToLinks: false,
      noModals: true,
      showLowRepImageUploadWarning: true,
      reputationToPostImages: null,
      bindNavPrevention: true,
      postfix: "",
      imageUploader: {
      brandingHtml: "Powered by u003ca class="icon-imgur-white" href="https://imgur.com/"u003eu003c/au003e",
      contentPolicyHtml: "User contributions licensed under u003ca href="https://creativecommons.org/licenses/by-sa/3.0/"u003ecc by-sa 3.0 with attribution requiredu003c/au003e u003ca href="https://stackoverflow.com/legal/content-policy"u003e(content policy)u003c/au003e",
      allowUrls: true
      },
      noCode: true, onDemand: true,
      discardSelector: ".discard-answer"
      ,immediatelyShowMarkdownHelp:true
      });


      }
      });














      draft saved

      draft discarded


















      StackExchange.ready(
      function () {
      StackExchange.openid.initPostLogin('.new-post-login', 'https%3a%2f%2fcrypto.stackexchange.com%2fquestions%2f66525%2fentropy-vs-predictability-in-prngs%23new-answer', 'question_page');
      }
      );

      Post as a guest















      Required, but never shown

























      5 Answers
      5






      active

      oldest

      votes








      5 Answers
      5






      active

      oldest

      votes









      active

      oldest

      votes






      active

      oldest

      votes









      4












      $begingroup$


      entropy is the measure of surprise




      That's informal and short, but correct within that. In the case of a Random Number Generator, we must make that: entropy is the measure of surprise in the output of the RNG, for one skilled person (with arbitrarily large computing power) knowing the RNG design including any parameter (for MT: the Mersenne prime used, and a few others). That's for unknown seed (if any) assumed uniformly random but arbitrarily large computing power of the skilled person (unless otherwise stated).



      Notice that the entropy is a property of the generator, not that of one particular bitstring that it outputs.



      Entropy further can be defined for the total output of a generator, or per output bit. In cryptography we measure the entropy in bit, so that it is 1 bit per output bit for an ideal uniform True RNG.



      For any Pseudo RNG, the whole output is predictable from design, parameters and seed, hence the entropy in the whole output is limited to the entropy in what generates its seed, which is finite. And the entropy per output bit decreases to zero as the output size increase towards infinity.




      the Mersenne Twister(MT) has high entropy.




      No, because it is a PRNG (see above).




      the MT is also predictable.




      Yes, with enough output, and little computing power.






      What's the relationship between entropy and predictability?




      If a bitstring generator has the property that it's full output is predictable from a finite length prefix (as is the case for MT), then this generator has finite total entropy (bounded by said finite length in bits for Shannon entropy in bits), and vanishingly small entropy per output bit.



      The converse is false for practical definition of (un)predictable. In particular, there exist practical Cryptographically Secure PRNGs (thus of finite total entropy) that are practically unpredictable.






      share|improve this answer











      $endgroup$













      • $begingroup$
        If as you say H(any bit string) = 0, and you wanted to transmit that string to the Martians without loss in one attempt, what determines the minimum transmission length?
        $endgroup$
        – Paul Uszak
        3 hours ago










      • $begingroup$
        Ok, I'll repeat this back to you to see if I understand it. PRNGs have relatively low entropy because, while they may start with some high entropy bits from some source as the seed, the way the PRNG then generates subsequent bits is linear and predictable such that all the effective entropy is in the seed only. A CSPRNG in contrast generates bits in a non-linear fashion such that, even though it's still deterministic, the effective (practical) entropy remains high because it's infeasible to predict the next bit. Is that right?
        $endgroup$
        – Bastien
        2 hours ago
















      4












      $begingroup$


      entropy is the measure of surprise




      That's informal and short, but correct within that. In the case of a Random Number Generator, we must make that: entropy is the measure of surprise in the output of the RNG, for one skilled person (with arbitrarily large computing power) knowing the RNG design including any parameter (for MT: the Mersenne prime used, and a few others). That's for unknown seed (if any) assumed uniformly random but arbitrarily large computing power of the skilled person (unless otherwise stated).



      Notice that the entropy is a property of the generator, not that of one particular bitstring that it outputs.



      Entropy further can be defined for the total output of a generator, or per output bit. In cryptography we measure the entropy in bit, so that it is 1 bit per output bit for an ideal uniform True RNG.



      For any Pseudo RNG, the whole output is predictable from design, parameters and seed, hence the entropy in the whole output is limited to the entropy in what generates its seed, which is finite. And the entropy per output bit decreases to zero as the output size increase towards infinity.




      the Mersenne Twister(MT) has high entropy.




      No, because it is a PRNG (see above).




      the MT is also predictable.




      Yes, with enough output, and little computing power.






      What's the relationship between entropy and predictability?




      If a bitstring generator has the property that it's full output is predictable from a finite length prefix (as is the case for MT), then this generator has finite total entropy (bounded by said finite length in bits for Shannon entropy in bits), and vanishingly small entropy per output bit.



      The converse is false for practical definition of (un)predictable. In particular, there exist practical Cryptographically Secure PRNGs (thus of finite total entropy) that are practically unpredictable.






      share|improve this answer











      $endgroup$













      • $begingroup$
        If as you say H(any bit string) = 0, and you wanted to transmit that string to the Martians without loss in one attempt, what determines the minimum transmission length?
        $endgroup$
        – Paul Uszak
        3 hours ago










      • $begingroup$
        Ok, I'll repeat this back to you to see if I understand it. PRNGs have relatively low entropy because, while they may start with some high entropy bits from some source as the seed, the way the PRNG then generates subsequent bits is linear and predictable such that all the effective entropy is in the seed only. A CSPRNG in contrast generates bits in a non-linear fashion such that, even though it's still deterministic, the effective (practical) entropy remains high because it's infeasible to predict the next bit. Is that right?
        $endgroup$
        – Bastien
        2 hours ago














      4












      4








      4





      $begingroup$


      entropy is the measure of surprise




      That's informal and short, but correct within that. In the case of a Random Number Generator, we must make that: entropy is the measure of surprise in the output of the RNG, for one skilled person (with arbitrarily large computing power) knowing the RNG design including any parameter (for MT: the Mersenne prime used, and a few others). That's for unknown seed (if any) assumed uniformly random but arbitrarily large computing power of the skilled person (unless otherwise stated).



      Notice that the entropy is a property of the generator, not that of one particular bitstring that it outputs.



      Entropy further can be defined for the total output of a generator, or per output bit. In cryptography we measure the entropy in bit, so that it is 1 bit per output bit for an ideal uniform True RNG.



      For any Pseudo RNG, the whole output is predictable from design, parameters and seed, hence the entropy in the whole output is limited to the entropy in what generates its seed, which is finite. And the entropy per output bit decreases to zero as the output size increase towards infinity.




      the Mersenne Twister(MT) has high entropy.




      No, because it is a PRNG (see above).




      the MT is also predictable.




      Yes, with enough output, and little computing power.






      What's the relationship between entropy and predictability?




      If a bitstring generator has the property that it's full output is predictable from a finite length prefix (as is the case for MT), then this generator has finite total entropy (bounded by said finite length in bits for Shannon entropy in bits), and vanishingly small entropy per output bit.



      The converse is false for practical definition of (un)predictable. In particular, there exist practical Cryptographically Secure PRNGs (thus of finite total entropy) that are practically unpredictable.






      share|improve this answer











      $endgroup$




      entropy is the measure of surprise




      That's informal and short, but correct within that. In the case of a Random Number Generator, we must make that: entropy is the measure of surprise in the output of the RNG, for one skilled person (with arbitrarily large computing power) knowing the RNG design including any parameter (for MT: the Mersenne prime used, and a few others). That's for unknown seed (if any) assumed uniformly random but arbitrarily large computing power of the skilled person (unless otherwise stated).



      Notice that the entropy is a property of the generator, not that of one particular bitstring that it outputs.



      Entropy further can be defined for the total output of a generator, or per output bit. In cryptography we measure the entropy in bit, so that it is 1 bit per output bit for an ideal uniform True RNG.



      For any Pseudo RNG, the whole output is predictable from design, parameters and seed, hence the entropy in the whole output is limited to the entropy in what generates its seed, which is finite. And the entropy per output bit decreases to zero as the output size increase towards infinity.




      the Mersenne Twister(MT) has high entropy.




      No, because it is a PRNG (see above).




      the MT is also predictable.




      Yes, with enough output, and little computing power.






      What's the relationship between entropy and predictability?




      If a bitstring generator has the property that it's full output is predictable from a finite length prefix (as is the case for MT), then this generator has finite total entropy (bounded by said finite length in bits for Shannon entropy in bits), and vanishingly small entropy per output bit.



      The converse is false for practical definition of (un)predictable. In particular, there exist practical Cryptographically Secure PRNGs (thus of finite total entropy) that are practically unpredictable.







      share|improve this answer














      share|improve this answer



      share|improve this answer








      edited 6 hours ago

























      answered 7 hours ago









      fgrieufgrieu

      78.1k7166331




      78.1k7166331












      • $begingroup$
        If as you say H(any bit string) = 0, and you wanted to transmit that string to the Martians without loss in one attempt, what determines the minimum transmission length?
        $endgroup$
        – Paul Uszak
        3 hours ago










      • $begingroup$
        Ok, I'll repeat this back to you to see if I understand it. PRNGs have relatively low entropy because, while they may start with some high entropy bits from some source as the seed, the way the PRNG then generates subsequent bits is linear and predictable such that all the effective entropy is in the seed only. A CSPRNG in contrast generates bits in a non-linear fashion such that, even though it's still deterministic, the effective (practical) entropy remains high because it's infeasible to predict the next bit. Is that right?
        $endgroup$
        – Bastien
        2 hours ago


















      • $begingroup$
        If as you say H(any bit string) = 0, and you wanted to transmit that string to the Martians without loss in one attempt, what determines the minimum transmission length?
        $endgroup$
        – Paul Uszak
        3 hours ago










      • $begingroup$
        Ok, I'll repeat this back to you to see if I understand it. PRNGs have relatively low entropy because, while they may start with some high entropy bits from some source as the seed, the way the PRNG then generates subsequent bits is linear and predictable such that all the effective entropy is in the seed only. A CSPRNG in contrast generates bits in a non-linear fashion such that, even though it's still deterministic, the effective (practical) entropy remains high because it's infeasible to predict the next bit. Is that right?
        $endgroup$
        – Bastien
        2 hours ago
















      $begingroup$
      If as you say H(any bit string) = 0, and you wanted to transmit that string to the Martians without loss in one attempt, what determines the minimum transmission length?
      $endgroup$
      – Paul Uszak
      3 hours ago




      $begingroup$
      If as you say H(any bit string) = 0, and you wanted to transmit that string to the Martians without loss in one attempt, what determines the minimum transmission length?
      $endgroup$
      – Paul Uszak
      3 hours ago












      $begingroup$
      Ok, I'll repeat this back to you to see if I understand it. PRNGs have relatively low entropy because, while they may start with some high entropy bits from some source as the seed, the way the PRNG then generates subsequent bits is linear and predictable such that all the effective entropy is in the seed only. A CSPRNG in contrast generates bits in a non-linear fashion such that, even though it's still deterministic, the effective (practical) entropy remains high because it's infeasible to predict the next bit. Is that right?
      $endgroup$
      – Bastien
      2 hours ago




      $begingroup$
      Ok, I'll repeat this back to you to see if I understand it. PRNGs have relatively low entropy because, while they may start with some high entropy bits from some source as the seed, the way the PRNG then generates subsequent bits is linear and predictable such that all the effective entropy is in the seed only. A CSPRNG in contrast generates bits in a non-linear fashion such that, even though it's still deterministic, the effective (practical) entropy remains high because it's infeasible to predict the next bit. Is that right?
      $endgroup$
      – Bastien
      2 hours ago











      1












      $begingroup$

      The entropy of a random string is the number of bits you need to describe the full string, there is no computational aspect there.
      But if MT is predictable, it means with "few" bits, you can retrieve the full chain, then it could not have high entropy.



      In fact an efficient PRNG could not have too much "high" entropy (because the seed/secret key should not be too long), but it doesn't mean there is an efficient attack to retrieve the full string.






      share|improve this answer








      New contributor




      Ievgeni is a new contributor to this site. Take care in asking for clarification, commenting, and answering.
      Check out our Code of Conduct.






      $endgroup$









      • 2




        $begingroup$
        The entropy of a random string is the number of bits you need to describe the full string this appears to be the definition of kolmogorov complexity, rather than entropy.
        $endgroup$
        – Ella Rose
        9 hours ago
















      1












      $begingroup$

      The entropy of a random string is the number of bits you need to describe the full string, there is no computational aspect there.
      But if MT is predictable, it means with "few" bits, you can retrieve the full chain, then it could not have high entropy.



      In fact an efficient PRNG could not have too much "high" entropy (because the seed/secret key should not be too long), but it doesn't mean there is an efficient attack to retrieve the full string.






      share|improve this answer








      New contributor




      Ievgeni is a new contributor to this site. Take care in asking for clarification, commenting, and answering.
      Check out our Code of Conduct.






      $endgroup$









      • 2




        $begingroup$
        The entropy of a random string is the number of bits you need to describe the full string this appears to be the definition of kolmogorov complexity, rather than entropy.
        $endgroup$
        – Ella Rose
        9 hours ago














      1












      1








      1





      $begingroup$

      The entropy of a random string is the number of bits you need to describe the full string, there is no computational aspect there.
      But if MT is predictable, it means with "few" bits, you can retrieve the full chain, then it could not have high entropy.



      In fact an efficient PRNG could not have too much "high" entropy (because the seed/secret key should not be too long), but it doesn't mean there is an efficient attack to retrieve the full string.






      share|improve this answer








      New contributor




      Ievgeni is a new contributor to this site. Take care in asking for clarification, commenting, and answering.
      Check out our Code of Conduct.






      $endgroup$



      The entropy of a random string is the number of bits you need to describe the full string, there is no computational aspect there.
      But if MT is predictable, it means with "few" bits, you can retrieve the full chain, then it could not have high entropy.



      In fact an efficient PRNG could not have too much "high" entropy (because the seed/secret key should not be too long), but it doesn't mean there is an efficient attack to retrieve the full string.







      share|improve this answer








      New contributor




      Ievgeni is a new contributor to this site. Take care in asking for clarification, commenting, and answering.
      Check out our Code of Conduct.









      share|improve this answer



      share|improve this answer






      New contributor




      Ievgeni is a new contributor to this site. Take care in asking for clarification, commenting, and answering.
      Check out our Code of Conduct.









      answered 12 hours ago









      IevgeniIevgeni

      292




      292




      New contributor




      Ievgeni is a new contributor to this site. Take care in asking for clarification, commenting, and answering.
      Check out our Code of Conduct.





      New contributor





      Ievgeni is a new contributor to this site. Take care in asking for clarification, commenting, and answering.
      Check out our Code of Conduct.






      Ievgeni is a new contributor to this site. Take care in asking for clarification, commenting, and answering.
      Check out our Code of Conduct.








      • 2




        $begingroup$
        The entropy of a random string is the number of bits you need to describe the full string this appears to be the definition of kolmogorov complexity, rather than entropy.
        $endgroup$
        – Ella Rose
        9 hours ago














      • 2




        $begingroup$
        The entropy of a random string is the number of bits you need to describe the full string this appears to be the definition of kolmogorov complexity, rather than entropy.
        $endgroup$
        – Ella Rose
        9 hours ago








      2




      2




      $begingroup$
      The entropy of a random string is the number of bits you need to describe the full string this appears to be the definition of kolmogorov complexity, rather than entropy.
      $endgroup$
      – Ella Rose
      9 hours ago




      $begingroup$
      The entropy of a random string is the number of bits you need to describe the full string this appears to be the definition of kolmogorov complexity, rather than entropy.
      $endgroup$
      – Ella Rose
      9 hours ago











      1












      $begingroup$

      For the entropy of a bit string to be meaningful, it must have been chosen in some particular random or partially-random process which had a certain probability of producing that exact string, and a certain probability of producing something else. The entropy of a string is then, roughly(*), the negative log of the probability that the process that produced the string, would have done so. Thus, if some process generates a string which has 8 bits of entropy, that means that the process would have had a 1 in 256 chance of generating that particular string.



      (*) There are a variety of ways of measuring entropy, but for most purposes they're close to each other that a simple approximation can be reasonably close to all of them.



      If a process does not generate strings with equal probability, it's often appropriate to regard the entropy produced by the process as being that of the highest-probability string. So if a process has a 50% chance of generating the 16-bit string of zeroes, and a one in 131,070 chance of generating any other 16-bit string, it would generally be appropriate to regard the process as yielding one bit of entropy even one could filter the output to yield more (e.g. generating bit strings until one gets one that isn't all zeroes would yield 15.999978 bits of entropy while requiring, on average, only twice as long as generating one bit).






      share|improve this answer









      $endgroup$













      • $begingroup$
        OK, I've seen the negative log calculation before. So if a random process has a uniform distribution, we can say its entropy is high, correct? So what I'm trying to understand is the relationship between entropy and predictability. Because from what I'm reading, something with high entropy doesn't automatically make it unpredictable. So you can have a PRNG with high entropy that still wouldn't be suited for cryptography purposes.
        $endgroup$
        – Bastien
        7 hours ago












      • $begingroup$
        @EllaRose Yeah, so if a random process has a uniform distribution, we can say its entropy is high is a vacuously true in that case. You would have to show a random process with low entropy. The lowest would be a random bit, but in general if you randomly sample from a uniform distribution, the entropy will be maximized for that domain. Hashing an incrementing counter would not count as a random sample since it gives the same results each run.
        $endgroup$
        – PyRulez
        7 hours ago










      • $begingroup$
        Ah, I see where I went wrong, using a deterministic process to try and prove a point about a random one. (Apologies to supercat for blowing up their notifications)
        $endgroup$
        – Ella Rose
        7 hours ago












      • $begingroup$
        @fgrieu: The entropy of a bit string in a particular context is measured relative to the probability that the process that produced it could have alternative bitstrings instead, and is meaningful only in contexts where that can be measured or estimated.
        $endgroup$
        – supercat
        6 hours ago










      • $begingroup$
        @supercat: you are right about the possible definition of entropy of a particular bitstring. I should have read more carefully. However I think that the question uses entropy for that of the generator, not of a bitstring.
        $endgroup$
        – fgrieu
        5 hours ago


















      1












      $begingroup$

      For the entropy of a bit string to be meaningful, it must have been chosen in some particular random or partially-random process which had a certain probability of producing that exact string, and a certain probability of producing something else. The entropy of a string is then, roughly(*), the negative log of the probability that the process that produced the string, would have done so. Thus, if some process generates a string which has 8 bits of entropy, that means that the process would have had a 1 in 256 chance of generating that particular string.



      (*) There are a variety of ways of measuring entropy, but for most purposes they're close to each other that a simple approximation can be reasonably close to all of them.



      If a process does not generate strings with equal probability, it's often appropriate to regard the entropy produced by the process as being that of the highest-probability string. So if a process has a 50% chance of generating the 16-bit string of zeroes, and a one in 131,070 chance of generating any other 16-bit string, it would generally be appropriate to regard the process as yielding one bit of entropy even one could filter the output to yield more (e.g. generating bit strings until one gets one that isn't all zeroes would yield 15.999978 bits of entropy while requiring, on average, only twice as long as generating one bit).






      share|improve this answer









      $endgroup$













      • $begingroup$
        OK, I've seen the negative log calculation before. So if a random process has a uniform distribution, we can say its entropy is high, correct? So what I'm trying to understand is the relationship between entropy and predictability. Because from what I'm reading, something with high entropy doesn't automatically make it unpredictable. So you can have a PRNG with high entropy that still wouldn't be suited for cryptography purposes.
        $endgroup$
        – Bastien
        7 hours ago












      • $begingroup$
        @EllaRose Yeah, so if a random process has a uniform distribution, we can say its entropy is high is a vacuously true in that case. You would have to show a random process with low entropy. The lowest would be a random bit, but in general if you randomly sample from a uniform distribution, the entropy will be maximized for that domain. Hashing an incrementing counter would not count as a random sample since it gives the same results each run.
        $endgroup$
        – PyRulez
        7 hours ago










      • $begingroup$
        Ah, I see where I went wrong, using a deterministic process to try and prove a point about a random one. (Apologies to supercat for blowing up their notifications)
        $endgroup$
        – Ella Rose
        7 hours ago












      • $begingroup$
        @fgrieu: The entropy of a bit string in a particular context is measured relative to the probability that the process that produced it could have alternative bitstrings instead, and is meaningful only in contexts where that can be measured or estimated.
        $endgroup$
        – supercat
        6 hours ago










      • $begingroup$
        @supercat: you are right about the possible definition of entropy of a particular bitstring. I should have read more carefully. However I think that the question uses entropy for that of the generator, not of a bitstring.
        $endgroup$
        – fgrieu
        5 hours ago
















      1












      1








      1





      $begingroup$

      For the entropy of a bit string to be meaningful, it must have been chosen in some particular random or partially-random process which had a certain probability of producing that exact string, and a certain probability of producing something else. The entropy of a string is then, roughly(*), the negative log of the probability that the process that produced the string, would have done so. Thus, if some process generates a string which has 8 bits of entropy, that means that the process would have had a 1 in 256 chance of generating that particular string.



      (*) There are a variety of ways of measuring entropy, but for most purposes they're close to each other that a simple approximation can be reasonably close to all of them.



      If a process does not generate strings with equal probability, it's often appropriate to regard the entropy produced by the process as being that of the highest-probability string. So if a process has a 50% chance of generating the 16-bit string of zeroes, and a one in 131,070 chance of generating any other 16-bit string, it would generally be appropriate to regard the process as yielding one bit of entropy even one could filter the output to yield more (e.g. generating bit strings until one gets one that isn't all zeroes would yield 15.999978 bits of entropy while requiring, on average, only twice as long as generating one bit).






      share|improve this answer









      $endgroup$



      For the entropy of a bit string to be meaningful, it must have been chosen in some particular random or partially-random process which had a certain probability of producing that exact string, and a certain probability of producing something else. The entropy of a string is then, roughly(*), the negative log of the probability that the process that produced the string, would have done so. Thus, if some process generates a string which has 8 bits of entropy, that means that the process would have had a 1 in 256 chance of generating that particular string.



      (*) There are a variety of ways of measuring entropy, but for most purposes they're close to each other that a simple approximation can be reasonably close to all of them.



      If a process does not generate strings with equal probability, it's often appropriate to regard the entropy produced by the process as being that of the highest-probability string. So if a process has a 50% chance of generating the 16-bit string of zeroes, and a one in 131,070 chance of generating any other 16-bit string, it would generally be appropriate to regard the process as yielding one bit of entropy even one could filter the output to yield more (e.g. generating bit strings until one gets one that isn't all zeroes would yield 15.999978 bits of entropy while requiring, on average, only twice as long as generating one bit).







      share|improve this answer












      share|improve this answer



      share|improve this answer










      answered 8 hours ago









      supercatsupercat

      22114




      22114












      • $begingroup$
        OK, I've seen the negative log calculation before. So if a random process has a uniform distribution, we can say its entropy is high, correct? So what I'm trying to understand is the relationship between entropy and predictability. Because from what I'm reading, something with high entropy doesn't automatically make it unpredictable. So you can have a PRNG with high entropy that still wouldn't be suited for cryptography purposes.
        $endgroup$
        – Bastien
        7 hours ago












      • $begingroup$
        @EllaRose Yeah, so if a random process has a uniform distribution, we can say its entropy is high is a vacuously true in that case. You would have to show a random process with low entropy. The lowest would be a random bit, but in general if you randomly sample from a uniform distribution, the entropy will be maximized for that domain. Hashing an incrementing counter would not count as a random sample since it gives the same results each run.
        $endgroup$
        – PyRulez
        7 hours ago










      • $begingroup$
        Ah, I see where I went wrong, using a deterministic process to try and prove a point about a random one. (Apologies to supercat for blowing up their notifications)
        $endgroup$
        – Ella Rose
        7 hours ago












      • $begingroup$
        @fgrieu: The entropy of a bit string in a particular context is measured relative to the probability that the process that produced it could have alternative bitstrings instead, and is meaningful only in contexts where that can be measured or estimated.
        $endgroup$
        – supercat
        6 hours ago










      • $begingroup$
        @supercat: you are right about the possible definition of entropy of a particular bitstring. I should have read more carefully. However I think that the question uses entropy for that of the generator, not of a bitstring.
        $endgroup$
        – fgrieu
        5 hours ago




















      • $begingroup$
        OK, I've seen the negative log calculation before. So if a random process has a uniform distribution, we can say its entropy is high, correct? So what I'm trying to understand is the relationship between entropy and predictability. Because from what I'm reading, something with high entropy doesn't automatically make it unpredictable. So you can have a PRNG with high entropy that still wouldn't be suited for cryptography purposes.
        $endgroup$
        – Bastien
        7 hours ago












      • $begingroup$
        @EllaRose Yeah, so if a random process has a uniform distribution, we can say its entropy is high is a vacuously true in that case. You would have to show a random process with low entropy. The lowest would be a random bit, but in general if you randomly sample from a uniform distribution, the entropy will be maximized for that domain. Hashing an incrementing counter would not count as a random sample since it gives the same results each run.
        $endgroup$
        – PyRulez
        7 hours ago










      • $begingroup$
        Ah, I see where I went wrong, using a deterministic process to try and prove a point about a random one. (Apologies to supercat for blowing up their notifications)
        $endgroup$
        – Ella Rose
        7 hours ago












      • $begingroup$
        @fgrieu: The entropy of a bit string in a particular context is measured relative to the probability that the process that produced it could have alternative bitstrings instead, and is meaningful only in contexts where that can be measured or estimated.
        $endgroup$
        – supercat
        6 hours ago










      • $begingroup$
        @supercat: you are right about the possible definition of entropy of a particular bitstring. I should have read more carefully. However I think that the question uses entropy for that of the generator, not of a bitstring.
        $endgroup$
        – fgrieu
        5 hours ago


















      $begingroup$
      OK, I've seen the negative log calculation before. So if a random process has a uniform distribution, we can say its entropy is high, correct? So what I'm trying to understand is the relationship between entropy and predictability. Because from what I'm reading, something with high entropy doesn't automatically make it unpredictable. So you can have a PRNG with high entropy that still wouldn't be suited for cryptography purposes.
      $endgroup$
      – Bastien
      7 hours ago






      $begingroup$
      OK, I've seen the negative log calculation before. So if a random process has a uniform distribution, we can say its entropy is high, correct? So what I'm trying to understand is the relationship between entropy and predictability. Because from what I'm reading, something with high entropy doesn't automatically make it unpredictable. So you can have a PRNG with high entropy that still wouldn't be suited for cryptography purposes.
      $endgroup$
      – Bastien
      7 hours ago














      $begingroup$
      @EllaRose Yeah, so if a random process has a uniform distribution, we can say its entropy is high is a vacuously true in that case. You would have to show a random process with low entropy. The lowest would be a random bit, but in general if you randomly sample from a uniform distribution, the entropy will be maximized for that domain. Hashing an incrementing counter would not count as a random sample since it gives the same results each run.
      $endgroup$
      – PyRulez
      7 hours ago




      $begingroup$
      @EllaRose Yeah, so if a random process has a uniform distribution, we can say its entropy is high is a vacuously true in that case. You would have to show a random process with low entropy. The lowest would be a random bit, but in general if you randomly sample from a uniform distribution, the entropy will be maximized for that domain. Hashing an incrementing counter would not count as a random sample since it gives the same results each run.
      $endgroup$
      – PyRulez
      7 hours ago












      $begingroup$
      Ah, I see where I went wrong, using a deterministic process to try and prove a point about a random one. (Apologies to supercat for blowing up their notifications)
      $endgroup$
      – Ella Rose
      7 hours ago






      $begingroup$
      Ah, I see where I went wrong, using a deterministic process to try and prove a point about a random one. (Apologies to supercat for blowing up their notifications)
      $endgroup$
      – Ella Rose
      7 hours ago














      $begingroup$
      @fgrieu: The entropy of a bit string in a particular context is measured relative to the probability that the process that produced it could have alternative bitstrings instead, and is meaningful only in contexts where that can be measured or estimated.
      $endgroup$
      – supercat
      6 hours ago




      $begingroup$
      @fgrieu: The entropy of a bit string in a particular context is measured relative to the probability that the process that produced it could have alternative bitstrings instead, and is meaningful only in contexts where that can be measured or estimated.
      $endgroup$
      – supercat
      6 hours ago












      $begingroup$
      @supercat: you are right about the possible definition of entropy of a particular bitstring. I should have read more carefully. However I think that the question uses entropy for that of the generator, not of a bitstring.
      $endgroup$
      – fgrieu
      5 hours ago






      $begingroup$
      @supercat: you are right about the possible definition of entropy of a particular bitstring. I should have read more carefully. However I think that the question uses entropy for that of the generator, not of a bitstring.
      $endgroup$
      – fgrieu
      5 hours ago













      1












      $begingroup$

      PRNG would have a pseudo-uniform distribution, so to speak. There is actually a correlation between its outputs. So its entropy is limited to that of the seed.



      Having (really) low entropy makes something predictable, since you can just bruteforce the seed. The converse is not true, however. A poorly designed PRNG will leak entropy in a way an adversary can take advantage of (unless the PRNG is not meant to be resistant to adversaries, like a video game). When a good PRNG leaks entropy, on the other hand, its computationally infeasible to take advantage of, so it effectively never decreases in entropy for practical purposes.






      share|improve this answer









      $endgroup$




























          answered 7 hours ago









PyRulez

409217























              0












              $begingroup$


              ...a given PRNG has a uniform distribution, then the entropy would be high.




In cryptography, it is common to consider the entropy of a generator to be that of the unknown input to that generator, i.e. the seed or key of the RNG. So for a common AES-128 counter-based CSPRNG, the entropy would be 128 bits. The output distribution is irrelevant where cryptographic entropy is concerned, although most RNGs in their original forms do produce uniform output; they would be difficult to use for encryption otherwise.
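For concreteness, here is a minimal sketch of such a counter-mode generator in Python, assuming the third-party cryptography package is available; the zero starting counter and the names are illustrative, not a hardened design:

    import os
    from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

    key = os.urandom(16)            # the 128-bit seed: all the entropy there is
    # CTR mode over a zero initial counter; encrypting zeros yields the keystream.
    enc = Cipher(algorithms.AES(key), modes.CTR(b"\x00" * 16)).encryptor()

    def random_bytes(n: int) -> bytes:
        return enc.update(b"\x00" * n)

    print(random_bytes(32).hex())   # 256 pseudorandom bits, 128 bits of entropy

However much output is drawn, an attacker's uncertainty is bounded by the 128-bit key.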




              So the Mersenne Twister(MT) has high entropy.




Ish. It starts out high, with an entropy of 19,937 bits for the common 32-bit implementation. However, MT is invertible, and the state can be discovered by looking at sufficient output: observing 624 consecutive output words (624 × 32 = 19,968 bits, just covering the 19,937-bit state) allows the internal state to be recovered. Consequently the entropy decreases by 32 bits per output word, reaching zero after 624 outputs, after which every subsequent output is entirely predictable. The leftover hash lemma applies to MT outputs of fewer than 624 words, where it would be acting as a very inefficient randomness extractor.
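This state recovery is easy to demonstrate against Python's random module, which uses MT19937. The sketch below is the standard untempering trick (names are illustrative): it inverts MT's output tempering on 624 observed words, rebuilds the internal state, and then predicts the generator exactly:

    import random

    M32 = 0xFFFFFFFF

    def undo_right(y, shift):
        # Invert y ^= y >> shift by re-deriving the bits in shift-sized chunks.
        r = y
        for _ in range(32 // shift + 1):
            r = y ^ (r >> shift)
        return r & M32

    def undo_left(y, shift, mask):
        # Invert y ^= (y << shift) & mask the same way, from the low bits up.
        r = y
        for _ in range(32 // shift + 1):
            r = y ^ ((r << shift) & mask)
        return r & M32

    def untemper(y):
        # Reverse MT19937's four tempering steps in the opposite order.
        y = undo_right(y, 18)
        y = undo_left(y, 15, 0xEFC60000)
        y = undo_left(y, 7, 0x9D2C5680)
        y = undo_right(y, 11)
        return y

    victim = random.Random(1234)          # the seed is unknown to the attacker
    observed = [victim.getrandbits(32) for _ in range(624)]

    clone = random.Random()
    clone.setstate((3, tuple(untemper(y) for y in observed) + (624,), None))

    # The clone now tracks the victim word for word.
    assert all(clone.getrandbits(32) == victim.getrandbits(32)
               for _ in range(1000))

After 624 observed words the clone is perfect, matching the claim that the entropy has dropped to zero.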




              What's the relationship between entropy and predictability?




In cryptography generally, entropy = unpredictability = surprisal, which is exactly what is needed for seeding RNGs and creating keys: a cryptographer strives for unpredictable RNG sequences and unknown keys. A CSPRNG cannot be inverted, so the seed/key cannot be recovered, and thus the entropy, unpredictability and surprisal of the output are preserved no matter how much output is produced.






              share|improve this answer









              $endgroup$




























                  answered 46 mins ago









Paul Uszak

7,22011535





























