[Cryptech Tech] fyi: papers from COST-IACR School on Randomness in Cryptography

Bernd Paysan bernd at net2o.de
Sat Dec 3 20:47:20 UTC 2016


On Friday, 2 December 2016, 16:05:01 CET, =JeffH wrote:
> of possible interest...
> 
>  >> No, the above #2 is not accurate. It does matter how good your
>  >> entropy source is. The leftover hash lemma gives you an
>  >> expression for the amount of entropy you can extract from entropy
>  >> sources, but it doesn't tell you how, and for real constructions
>  >> the answer is worse. Subsequent papers give bounds for certain
>  >> specific extractors. This can be summarized as the 0.5 limit: if
>  >> your input data has less than 0.5 bits of entropy per bit of data,
>  >> your extractor is swimming upstream slower than the stream is
>  >> moving downstream.
>  > 
>  > Reference please? Because this would be news to me (and, I think, a
>  > lot of other people as well).

If you follow the NIST recommendations for PRNGs, yes, that is a problem. To 
generalize: NIST recommends reading in a block of entropy, compressing it with 
a (hash-like) compression function, and using the result as the input seed for 
a DRNG. The compression in the NIST standard is 2:1, so 0.5 bits of entropy 
per bit of input is the lower limit.
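
Concretely, the pattern looks something like this Python sketch (my own 
illustration, not the NIST text itself; SHA-512 stands in for the 
conditioning function, and read_trng() is a hypothetical raw-entropy source):

import hashlib
import os

def read_trng(n_bytes):
    # Hypothetical stand-in for a raw hardware entropy source; a real
    # TRNG delivers biased, correlated bits, which is the whole problem.
    return os.urandom(n_bytes)

def nist_style_seed():
    # Read a block of raw entropy and compress it 2:1 with a hash:
    # 128 raw bytes in, 64 seed bytes out. If the raw data carries
    # less than 0.5 bits of entropy per bit, the seed cannot reach
    # full entropy, no matter how good the hash is.
    raw = read_trng(2 * 64)
    return hashlib.sha512(raw).digest()  # 64-byte seed for the DRNG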

What can be done about this?

a) use the previous seed of the DRNG as a third input to the compression. 
This means the new entropy is added to the old entropy instead of replacing 
it. No "swimming upstream slower than the stream is moving downstream".

b) use a sponge function like Keccak to absorb entropy and squeeze out 
deterministic randomness. Absorbing always adds entropy to the pool, and you 
can use the same function (the same hardware) for the expansion.
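
A sketch of option b), using SHAKE-256 from Python's hashlib as the 
Keccak-family sponge. A real implementation would run the duplex 
construction directly in hardware; the explicit ratchet step below is my 
simplification:

import hashlib

class SpongePool:
    # Toy entropy pool on top of a Keccak-family XOF (SHAKE-256).

    def __init__(self):
        self._pool = b"\x00" * 64  # 512-bit internal pool state

    def absorb(self, entropy):
        # Absorbing mixes new entropy into the pool; it can only add
        # entropy on top of what is already there, never replace it.
        self._pool = hashlib.shake_256(self._pool + entropy).digest(64)

    def squeeze(self, n_bytes):
        # Expand deterministic randomness from the pool, then ratchet
        # the pool forward so past outputs cannot be reconstructed.
        out = hashlib.shake_256(self._pool + b"out").digest(n_bytes)
        self._pool = hashlib.shake_256(self._pool + b"next").digest(64)
        return out

Usage would be pool.absorb(raw_block) whenever fresh entropy arrives and 
pool.squeeze(32) to draw output; absorb and squeeze share one primitive, 
which is the point about reusing the same hardware.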

-- 
Bernd Paysan
"If you want it done right, you have to do it yourself"
net2o ID: kQusJzA;7*?t=uy at X}1GWr!+0qqp_Cn176t4(dQ*
http://bernd-paysan.de/

