Playing with Randomness

Unpredictable random numbers are mandatory for many cryptographic operations (ref). There are cryptographically secure pseudorandom number generators (CSPRNGs), but the use of a hardware random number generator (TRNG) is something I have been especially interested in for many years. While there are many proprietary TRNGs (list) at various prices, I had a look at two cheap solutions: the Raspberry Pi’s hardware random number generator as well as an application that uses a DVB-T (RTL-SDR) stick to gather some noise.

I tested both of them with various options and ran the output against the dieharder test suite. In this post I list the CLI commands for getting random data from those sources as well as the results of the tests.

What’s my use case? I am looking for a truly random password generator to replace the KeePass password generator I am currently using. Furthermore, I would like to have an online pre-shared key generator for site-to-site VPNs. Just two ideas.

Sources

These are the random sources I used for my tests. From each source I saved 8 GB (!) of random data, just to have a big basis to test with.

Raspi TRNG

I came across this blog post: “Well, that was unexpected…”: The Raspberry Pi’s Hardware Random Number Generator. Not much to say about that. You can simply read random data from the /dev/hwrng device with root privileges.
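For example, a quick hex dump of a few bytes (an illustrative invocation, not necessarily the exact one I used back then; any reader such as hexdump or xxd will do):

  sudo hexdump -C /dev/hwrng | head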

(I have not installed rng-tools here because I was not interested in feeding /dev/hwrng into /dev/random at this point. I’ll cover that in another blog post.)

To copy the 8 GB into a file I used a dd command. Some information about that approach: How to read in N random characters from /dev/urandom?
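The call was along these lines (the block size and the output file name are my reconstruction; 8192 × 1 MiB gives the 8 GiB):

  sudo dd if=/dev/hwrng of=hwrng.bin bs=1M count=8192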

rtl-entropy

I also came across this post: Hardware RNG Through an rtl-sdr Dongle, which refers to the GitHub project rtl-entropy, which in turn refers to the rtl-sdr project. Since I use these DVB-T sticks for some other projects as well, I already had a few of them. My complete installation procedure was as follows:
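Roughly like this (the package names and the repository URL are my reconstruction from the project’s README, so treat this as a sketch rather than the exact steps):

  # rtl-sdr from the distribution, plus build dependencies for rtl-entropy
  sudo apt-get install rtl-sdr librtlsdr-dev libssl-dev libcap-dev cmake build-essential git
  # build and install rtl-entropy from source
  git clone https://github.com/pwarren/rtl-entropy.git
  cd rtl-entropy
  mkdir build && cd build
  cmake ..
  make
  sudo make install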

A first try without any options did not succeed at all, while setting the sample rate to 2.4M immediately displayed random data. (To be honest, I do not know exactly what the sample rate is for. No one ever replied to my question on GitHub.)
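(My exact invocations are not reproduced here; roughly, the two attempts looked like this, assuming rtl_entropy writes its random bytes to stdout when run in the foreground:)

  rtl_entropy | hexdump -C | head           # default sample rate: no usable output for me
  rtl_entropy -s 2.4M | hexdump -C | head   # with a 2.4M sample rate: random data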

In order to use the FIFO file you must start rtl_entropy in daemon mode:
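(Sketch again; the -b switch for daemonizing and the FIFO location under /var/run are taken from my reading of the project’s usage output:)

  sudo rtl_entropy -b -s 2.4M   # writes to the FIFO /var/run/rtl_entropy.fifo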

For my test files I copied 8 GB again with this command:
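(The FIFO path and the output file name are again my placeholders:)

  sudo dd if=/var/run/rtl_entropy.fifo of=rtl-entropy.bin bs=1M count=8192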

I also tried some other variants with this DVB-T dongle, such as the -e option for “Kaminsky debiasing” (whatever that is exactly), which worked without any failures, as well as the default sample rate of 3.2 MHz again (without any options), which did not work: I got many errors such as [877551.761758] usb 1-1.3: usbfs: usb_submit_urb returned -121.

[Image: “…Generator” by Peter Van den Bossche is licensed under CC BY-SA 2.0]

Furthermore, the author of rtl_entropy states: “If you’re serious about the cryptographic security of your entropy source, you should probably short, or put a 75 Ohm load on the antenna port, and put the whole assembly in a shielded box. Then you’re getting entropy from the thermal noise of the amplifiers which is much harder to interfere with than atmospheric radio”, GitHub. Hence I tried it with such a resistor terminator, but nothing happened at all: no data was coming in, so I dropped that scenario. Finally I wrapped the antenna in aluminium foil and put it into an empty energy drink can to generate some noise. In the end I had three 8 GB files: plain with antenna, -e with antenna, and energy drink. ;)

/dev/urandom

For the sake of completeness I have also used the urandom source. Please also read this very interesting article about urandom: Myths about /dev/urandom.
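Copying 8 GB works the same way with dd (the output file name is arbitrary):

  dd if=/dev/urandom of=urandom.bin bs=1M count=8192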

 

Throughput

As already noted, for each source I copied 8 GB with dd into a file. (Note that the size descriptions vary since 8 × 1024 × 1024 × 1024 bytes = 8589934592 bytes = 8 GiB ≈ 8.6 GB.) These are the random generation “speeds” measured in my tests on a Raspberry Pi 3 B. Unsurprisingly, /dev/urandom was the fastest:

  • /dev/hwrng: 8.6 GB copied, 81585.4 s, 105 kB/s
  • /dev/urandom: 8.6 GB copied, 2039.33 s, 4.2 MB/s
  • rtl-entropy -s 2.4M: 8.6 GB copied, 54842 s, 157 kB/s
  • rtl-entropy -s 2.4M -e: 8.6 GB copied, 54015.5 s, 159 kB/s
  • rtl-entropy -s 2.4M energy drink: 8.6 GB copied, 31491 s, 273 kB/s

dieharder

“The diehard tests are a battery of statistical tests for measuring the quality of a random number generator”, Wikipedia. It is not easy to understand and interpret the results correctly. There are a few paragraphs on the dieharder man page about that.

For each generated random binary file used as the input source (-g 201 -f input-file), I ran dieharder with all tests (-a), saved the output into a file (> filename), and let it run in the background (&):
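Put together, a call looked like this (the file names are my placeholders; the flags are the ones mentioned above):

  dieharder -a -g 201 -f hwrng.bin > dieharder-hwrng.txt &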

Following are the results. The main problem is the interpretation! All random sources have some WEAK assessments (p < 0.05% or p > 99.95%) while the majority of tests PASSED. Given the man page’s note that “Scattered reports of weakness or marginal failure in a preliminary -a(ll) run should therefore not be immediate cause for alarm”, I would say that all of them provide good random data. If you want to look at the raw results yourself:

/dev/hwrng:

/dev/urandom:

rtl_entropy:

rtl_entropy -e:

rtl_entropy energy drink:

 

Conclusion

While the rtl_entropy solution has some unsolved problems (as of 2018) and does not run very well, the hardware RNG on the Raspberry Pi itself is quite easy to use and produces real random data. I won’t build high-grade crypto protocols on it, but I will pick up my idea of generating pre-shared keys (PSKs) or passwords from it. Why not?

Just for Fun: Full HD Wallpaper

Out of the /dev/hwrng random data from the Raspberry Pi I generated this Full HD wallpaper. (Not my idea, look here again.) Yes, it is about 6 MB (!) in size because it can hardly be compressed due to its randomness.

[Image: random20160720114153]
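(This is not necessarily how the linked post does it; one way to turn raw random bytes into such an image is ImageMagick, interpreting 1920 × 1080 × 3 bytes as raw RGB:)

  head -c $((1920*1080*3)) hwrng.bin | convert -size 1920x1080 -depth 8 rgb:- wallpaper.png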

Even More Links

Featured image: “Dice” by Clancey Livingston is licensed under CC BY-NC 2.0.

3 thoughts on “Playing with Randomness”

  1. A few thoughts:

    1. You will get better performance out of /dev/urandom if you upgrade your Linux kernel to something beyond version 4.8, as from that point forward it is based on ChaCha20 as the cryptographic primitive rather than SHA-1, which makes it much more efficient.

    2. In practice, you should never need anything more than /dev/urandom. It’s indistinguishable from true random whitened noise. The only catch, of course, is that it needs to be seeded with true random white noise (i.e. “entropy”) before use. Thankfully, the kernel mixes in interrupts and other sources of hardware-based entropy, and most GNU/Linux installers save a seed to disk for seeding on first boot.

    3. Randomness tests are interesting, but they don’t say anything about the security of the randomness function. They only state that the randomness function is passing statistical tests for randomness. Of course, cryptographically secure random number generators should pass the tests with flying colors, but that doesn’t mean that a random number generator that does pass the tests with flying colors is cryptographically secure. So, be careful there. But it is worth a sanity check to make sure your CSPRNG is actually passing the statistical tests.

    4. True random number generators are interesting, but if you can’t audit the hardware and the firmware microcode, I would take them with a grain of salt. Best practice, if you don’t trust the TRNG, would be to collect 512 bits of data from the TRNG, run the collected result through SHA-256, then use that output to seed your OS CSPRNG (e.g. /dev/urandom), and use the OS CSPRNG from there. Personally, I wouldn’t use TRNG data directly unless I have personally vetted the hardware and firmware. #TinFoilHat
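(A minimal shell sketch of the seeding step described in point 4, assuming the Raspberry Pi’s /dev/hwrng as the TRNG; note that writing to /dev/urandom mixes the bytes into the kernel pool but does not credit any entropy:)

  # 512 bits = 64 bytes from the TRNG -> SHA-256 -> mix the digest into the kernel pool
  sudo sh -c 'dd if=/dev/hwrng bs=64 count=1 2>/dev/null | sha256sum | cut -d" " -f1 | xxd -r -p > /dev/urandom'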
