Version 0 of Cryptographically secure random numbers using /dev/urandom

Updated 2011-12-23 02:56:05 by bodangly
    Cryptographically secure random numbers are difficult to achieve, but important for tasks such as password and session-key generation. You probably want a strong random number generator for something such as casino games involving real money. Some solutions are already available in Tcl, notably the Random package, which provides a C implementation of the ISAAC algorithm; ISAAC is considered cryptographically secure. Tcllib provides the rc4 package, a stream cipher whose keystream is essentially a pseudo-random number generator. 
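    For comparison, here is a sketch of using tcllib's rc4 package as a keystream generator: encrypting a run of zero bytes returns the raw keystream, seeded by the key. The seed string below is a placeholder, not a recommendation.

proc rc4Stream {} {
    package require rc4

    # Encrypting zero bytes with RC4 XORs the keystream against zeros,
    # so the output IS the keystream (placeholder seed for illustration).
    set seed "replace-with-a-secret-seed"
    set keystream [rc4::rc4 -hex -key $seed [string repeat \x00 8]]
    puts $keystream   ;# 16 hex characters of keystream
    return $keystream
}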
    However, if you are in a recent Linux environment, chances are your kernel supports /dev/random and its counterpart /dev/urandom. The kernel collects entropy from device drivers and other system events and uses it to generate random numbers. The practical difference between the two devices is blocking behavior: /dev/random blocks when the entropy pool is depleted, while /dev/urandom never blocks and keeps producing output from its pseudo-random generator even when fresh entropy is scarce. For something such as a long-term cryptographic key it is usually recommended to use /dev/random; /dev/urandom is nonetheless considered a cryptographically secure pseudo-random number generator, so when you need random numbers often, /dev/urandom is the more convenient choice. If you want a "truly" random number, use /dev/random. The only other source of "true" random numbers available to most Tcl applications would be an online service, and relying on a third party that transmits your numbers in the clear rather defeats the purpose of a secure RNG. So, let's take advantage of these powerful, mature tools that Linux offers us. Below is an example of a function that generates a random unsigned integer of up to 64 bits using /dev/urandom.
proc randInt { min max } {
    # Open in binary mode so the bytes are not mangled by translation
    set rand [open /dev/urandom rb]
    set random [read $rand 8]
    close $rand
    # Interpret the 8 bytes as an unsigned 64-bit integer
    binary scan $random Wu random
    # Map the value into the inclusive range [min, max]
    return [expr {$min + ($random % ($max - $min + 1))}]
}

I tested this function by generating a random number from 1 to 8 one hundred thousand times per run, over several runs. The distribution looks fairly even. Generating that many numbers means the entropy pool is reused heavily, as the consistency across runs suggests, unless someone spots something wrong with my algorithm. That said, tallying the results only measures uniformity; it says nothing about the predictability of the order in which the numbers appear.
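A minimal sketch of the tally test described above (the randInt proc is repeated here so the snippet runs on its own; it assumes a system that provides /dev/urandom):

proc randInt { min max } {
    set rand [open /dev/urandom rb]
    set bytes [read $rand 8]
    close $rand
    binary scan $bytes Wu n          ;# unsigned 64-bit integer
    return [expr {$min + ($n % ($max - $min + 1))}]
}

# Draw 100000 numbers in [1, 8] and tally how often each appears
for {set i 1} {$i <= 8} {incr i} { set tally($i) 0 }
for {set i 0} {$i < 100000} {incr i} {
    incr tally([randInt 1 8])
}
foreach n [lsort -integer [array names tally]] {
    puts "$n: $tally($n)"
}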

Run 1:
1: 12535
2: 12536
3: 12822
4: 12567
5: 12349
6: 12156
7: 12430
8: 12605

Run 2:
1: 12618
2: 12561
3: 12832
4: 12556
5: 12618
6: 12072
7: 12435
8: 12308

Run 3:
1: 12372
2: 12475
3: 12850
4: 12565
5: 12708
6: 11930
7: 12662
8: 12438

Run 4:
1: 12499
2: 12516
3: 13027
4: 12438
5: 12415
6: 12105
7: 12419
8: 12581