[Cryptech Tech] Hardware entropy

Stephan Mueller smueller at chronox.de
Tue May 27 14:32:30 UTC 2014


On Tuesday, 27 May 2014 at 16:21:00, Bernd Paysan wrote:

Hi Bernd,

>On Tuesday, 27 May 2014 at 15:11:58, Stephan Mueller wrote:
>> >Example: To remove the bias of the individual roscs, you can XOR two
>> >sources (that's why I had the p ^ n readout address in my original
>> >entropy block). The result is a pretty flat distribution of all
>> >possible byte values (roughly Gauss-shaped, see attached second-order
>> >histogram - that's a histogram of the histogram), but the entropy is
>> >nearly half of the original entropy (the individual bits are now
>> >better, but the overall number of bits is reduced by a factor of two).
>> 
>> Can you please elaborate on the last part? I do not understand what
>> you mean when you say the entropy is halved.
>> 
>> It is mathematically proven that XOR does not reduce the entropy of
>> two *independent* data sets. So, when you say that XOR reduces
>> entropy, the data sets are by definition dependent. Thus, you do not
>> have as much entropy even in your initial data sets.
>
>No, you take *two* bits (e.g. p[1] ^ n[1]) and produce *one* bit as the
>result. This means you have half the number of bits.  If the two

Ah, I see where we are talking past each other. :-)

Let us say string a contains 5 bits of entropy, string b contains 10 
bits, and the two strings are independent.

Then a XOR b still contains 10 bits of entropy.

Thus, XOR does not destroy entropy in the sense that when you "add" a 
new value to an existing "entropy pool" by XORing, you cannot reduce 
the entropy already present in that pool.

But you are right: with XOR you cannot accumulate the entropy of both 
strings, because the result is only as long as one input and can hold 
at most 1 bit of entropy per bit. To collect the entropy of both 
strings, you have to use concatenation.
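
To make this concrete, here is a minimal Python sketch (mine, not from 
the thread) that computes exact Shannon entropies for two made-up, 
independent byte distributions: the entropy of each source, of their 
XOR, and of their concatenation (written a||b). The XOR retains at 
least the larger of the two entropies but is capped at 8 bits per byte, 
while concatenation adds the entropies:

import itertools
import math
import random

def shannon_entropy(dist):
    # Shannon entropy in bits of a {value: probability} distribution.
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

def skewed_dist(rng, skew):
    # A made-up skewed distribution over 8-bit values (a larger skew
    # concentrates the probability mass, lowering the entropy).
    weights = [rng.random() ** skew for _ in range(256)]
    total = sum(weights)
    return {v: w / total for v, w in enumerate(weights)}

def xor_dist(da, db):
    # Exact distribution of a ^ b for independent a ~ da, b ~ db.
    out = dict.fromkeys(range(256), 0.0)
    for (a, pa), (b, pb) in itertools.product(da.items(), db.items()):
        out[a ^ b] += pa * pb
    return out

rng = random.Random(1)
da = skewed_dist(rng, 4.0)   # lower-entropy source
db = skewed_dist(rng, 1.0)   # higher-entropy source
ha, hb = shannon_entropy(da), shannon_entropy(db)
hx = shannon_entropy(xor_dist(da, db))

print("H(a)    = %.3f bits" % ha)
print("H(b)    = %.3f bits" % hb)
print("H(a^b)  = %.3f bits  (at least max(H(a), H(b)), at most 8)" % hx)
print("H(a||b) = %.3f bits  (independent sources: entropies add)"
      % (ha + hb))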

>inputs have high entropy, the XOR operation will produce one bit with
>at least the same amount of entropy (it can only get better, but there
>is a limit for entropy per bit: you can't have more than 1 shannon per
>bit), but it is one bit where there were originally two bits.
>
>Two bits with good entropy together have more entropy than one bit with
>somewhat better entropy.  This doesn't even depend on the amount of
>entropy: even if the entropy originally was poor, the XOR will provide
>less than twice the entropy per bit, and the number of bits is reduced
>by a factor of two.

I think we are both saying the same thing, using different 
terminology. :-)

>
>AFAIK, what you get with probabilities of surprise p1 and p2 for the
>two inputs (1 if completely random, 0 if completely predictable) is
>
>p_xor = (p1 + p2)/(1 + p1*p2)
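
The thread does not define "probability of surprise" formally. One 
mapping that reproduces Bernd's formula exactly is p = (1 - e)/(1 + e), 
where e = |2*Pr[bit = 1] - 1| is the bit's bias; under that assumed 
mapping (my guess, not something stated in the thread), the formula 
follows from the piling-up lemma: the bias of the XOR of independent 
bits is the product of their biases. A small Monte Carlo sketch of mine 
under that assumption:

# Sanity check of p_xor = (p1 + p2)/(1 + p1*p2), ASSUMING the
# "probability of surprise" p of a bit relates to its bias
# e = |2*Pr[bit = 1] - 1| via p = (1 - e)/(1 + e).
import random

def bias_from_p(p):
    # Invert the assumed mapping p = (1 - e)/(1 + e) to get the bias e.
    return (1 - p) / (1 + p)

def p_from_bias(e):
    # The assumed mapping from bias e back to probability of surprise p.
    return (1 - e) / (1 + e)

def sample_bits(e, n, rng):
    # n independent bits with Pr[bit = 1] = (1 + e)/2, i.e. bias e.
    q = (1 + e) / 2
    return [1 if rng.random() < q else 0 for _ in range(n)]

rng = random.Random(42)
p1, p2, n = 0.6, 0.3, 1000000
b1 = sample_bits(bias_from_p(p1), n, rng)
b2 = sample_bits(bias_from_p(p2), n, rng)
xored = [a ^ b for a, b in zip(b1, b2)]

e_xor = abs(2 * sum(xored) / n - 1)   # measured bias of the XOR stream
print("measured p_xor = %.4f" % p_from_bias(e_xor))
print("formula  p_xor = %.4f" % ((p1 + p2) / (1 + p1 * p2)))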

Ciao
Stephan

