[Cryptech Tech] User auditable hardware entropy source/random number generator
Bernd Paysan
bernd at net2o.de
Sat Jul 12 21:58:56 UTC 2014
On Friday, 11 July 2014 at 20:33:20, Benedikt Stockebrand wrote:
> And yet another one...
>
> Joachim Strömbergson <joachim at secworks.se> writes:
> > It seems that at least quite a few hobbyists (and some professional
> > solutions) use transistors connected as diodes instead of a real diode.
> > Any thoughts on that?
>
> I've tried with some transistors I've had lying around. The noise
> amplitude wasn't as good as with the "magic" Zener diodes, I didn't find
> any definite information about their breakdown voltages, and most
> importantly: Zener/avalanche diodes are built for this kind of use
> (reverse breakdown, that is), while transistors aren't.
>
> So even if this worked, I'd be rather worried about the average life
> expectancy of such a design.
What kills diodes and transistors in reverse breakdown mode is heat, so with a
reasonably large resistor to limit the reverse current, transistors won't get
damaged either.
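As a rough sanity check on the heat argument, here is a back-of-the-envelope power calculation (the supply voltage, breakdown voltage, and resistor value are hypothetical example numbers, not from any specific design):

```python
# Rough power check for a junction in reverse breakdown, with a
# series resistor limiting the current (all values hypothetical:
# 12 V supply, ~8 V breakdown, 100 kOhm resistor).
v_supply = 12.0      # supply voltage, V
v_breakdown = 8.0    # assumed avalanche breakdown voltage, V
r_series = 100e3     # series resistor, Ohm

i = (v_supply - v_breakdown) / r_series   # reverse current through the junction, A
p = v_breakdown * i                       # power dissipated in the junction, W

print(f"I = {i*1e6:.0f} uA, P = {p*1e3:.2f} mW")  # prints "I = 40 uA, P = 0.32 mW"
```

A fraction of a milliwatt is far below what even a small-signal transistor can dissipate, which is why the series resistor keeps the junction safe.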
> > Another issue we have been discussing is how fast one really can sample
> > a PN avalanche noise source. One suggestion has been that anything above
> > a few hundred Hz is not good.
> >
> > Any ideas about this Bernd and Benedikt?
>
> I think I have to explain about how I do the processing:
>
> I don't sample individual readings, or xor a cluster of them or so.
> That (kind of) works, but I found something way more robust: Measuring
> the time between rising edges on that pin, or rather, the least
> significant bit of that.
>
> The disadvantage of this approach is that it can't be easily
> parallelized, and I don't know if this is feasible in an FPGA.
That part is pretty easy to do in an FPGA: have a DFF which toggles a bit
every cycle, and sample that bit whenever the input changes polarity (into a
shift register). It might be interesting to measure the time each high and
low part takes with a fast clock (50MHz is easy to achieve), just to see what
kind of signal that is.
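The equivalence between the toggle-bit trick and Benedikt's interval-LSB measurement can be sketched in a few lines of Python (the edge timings are simulated with made-up jitter; in hardware the DFF toggles on the fast clock and is sampled on each rising edge of the noise signal):

```python
import random

# Simulated clock-cycle indices of rising edges on the noise input
# (hypothetical jittery intervals of 40-60 fast-clock cycles).
edges = []
t = 0
for _ in range(1000):
    t += random.randint(40, 60)
    edges.append(t)

# FPGA scheme: a DFF toggles every clock cycle, so its value at
# cycle t is simply t mod 2; sample it at each rising edge.
samples = [t % 2 for t in edges]

# The LSB of each edge-to-edge interval -- the quantity measured
# in software -- is exactly the XOR of two consecutive samples.
interval_lsbs = [(b - a) % 2 for a, b in zip(edges, edges[1:])]
derived = [x ^ y for x, y in zip(samples, samples[1:])]
assert derived == interval_lsbs
```

So a shift register full of sampled toggle bits carries the same information as timing each interval with a counter and keeping the LSB.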
> But it means that if the noise source goes flat, the output will simply
> stop. If the source's frequency spectrum shifts, then the output speed
> will adjust accordingly. And since I implicitly only measure every
> second wave, the bits read may be biased but they are effectively
> > uncorrelated unless I feed a fixed frequency signal instead of noise.
It shouldn't hurt if you measure high- and low-times.
> It also reduces the effects of outside interference, e.g. from 50/60Hz
> sources, and should even make intentional interference attacks from
> outside significantly more difficult.
Hm, if you feed a square wave into the diode, then you should be able to
control the output pretty well. If you feed in a square wave with randomized
amplitude, even worse: that could defeat the statistical tests. External
sources are easier to tamper with than internal ones.
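To illustrate the attack scenario: if an attacker forces the input to a fixed frequency, every edge-to-edge interval is identical, the interval-LSB stream becomes constant, and a von Neumann extractor behind it then emits nothing at all. A minimal sketch (period value hypothetical):

```python
# Edges at a constant period: the attacker controls the input.
period = 50                                   # constant interval, clock cycles
edges = [period * k for k in range(1, 1001)]

# LSB of each edge-to-edge interval: always the same bit.
lsbs = [(b - a) % 2 for a, b in zip(edges, edges[1:])]
assert len(set(lsbs)) == 1

def von_neumann(bits):
    """Emit the first bit of each unequal non-overlapping pair."""
    return [a for a, b in zip(bits[0::2], bits[1::2]) if a != b]

# Every pair is equal, so the extractor output stalls completely.
assert von_neumann(lsbs) == []
```

This matches Benedikt's observation that the output simply stops when the source goes flat, which at least makes such an attack detectable.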
> Afterwards I use a simple von Neumann extractor to remove any bias and
> at the same time compensate for any timing measurement from the previous
> stage. I've wondered if a changing bias (e.g. influenced by
> temperature) might have any noticeable effect on it, but even if there
> is, I found it impossible to measure. If that ever turns out a problem,
> more than two bits can be used in the von Neumann extractor to
> compensate for that (but no, you won't find *that* in Wikipedia).
If you measure the LSB of a clock between edges, instead of sampling the input
with a constant frequency, bias shouldn't be a problem. If the von Neumann
extractor doesn't improve the quality of the random bit stream, leave it out.
IMHO, the von Neumann extractor is not very helpful for improving an entropy
source; it reduces the bit rate by a factor of 4 when applied to a perfectly
random bitstream (with 50% likelihood two consecutive bits are identical and
get dropped entirely; in the other 50% of cases only the first of the two
bits is output).
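The factor-of-4 rate loss is easy to verify empirically with the classic von Neumann extractor on an unbiased simulated stream:

```python
import random

def von_neumann(bits):
    """Classic von Neumann extractor: take non-overlapping bit
    pairs; emit the first bit of every 01/10 pair, drop 00/11."""
    return [a for a, b in zip(bits[0::2], bits[1::2]) if a != b]

random.seed(1)
bits = [random.getrandbits(1) for _ in range(100_000)]
out = von_neumann(bits)

# For an unbiased input, half the pairs are discarded and each
# surviving pair yields one bit: output rate is about input/4.
print(len(out) / len(bits))   # close to 0.25
```

For a biased source with P(1) = p the output rate is p(1-p) bits per input pair, which is even lower, but the output is exactly unbiased as long as the input bits are independent.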
> With the "magic" Zener diode, I took about a month's worth of output (40
> GByte) and ran it through FIPS140-2 (rngtest from the Linux rngtools)
> and dieharder with options "-Y 1 -s 1 -k 2 -a -m 3"[1] and some
> follow-up testing on those tests that use a single psample and therefore
> can't really cope with the "-m 3"---dieharder is a bit awkward there.
> So I guess that 20 kByte/s are actually possible, and with a faster MCU
> or an FPGA possibly even more.
>
> By the way, those 20 kByte/s on the output side of the setup seem to be
> rather constant; that's why I assume it's where the current MCU and
> firmware hit their performance limit.
20 kB/s doesn't sound too bad. That's 160 kbit/s, and since you use a von
Neumann extractor, roughly 640 kbit/s of raw data go into that extractor.
--
Bernd Paysan
"If you want it done right, you have to do it yourself"
http://bernd-paysan.de/