Version 0 of Gaussian Distribution

Updated 2004-03-31 18:09:42

started by Theo Verelst

Also called 'Normal Distribution'. For most independent physical measurements, the statistical law of large numbers (more precisely, the central limit theorem) makes the distribution of measurement noise tend toward a normal, or Gaussian, probability density.

So if we flip a coin many times, keeping score per 10 throws for instance, we expect an average of 5 heads per ten throws. But of course it will also happen that we throw 6 heads, or even 7 tails.
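The exact chances of these outcomes follow from standard binomial arithmetic (a worked check, not part of the original page):

```latex
P(k\ \text{heads in 10}) = \binom{10}{k}\left(\tfrac{1}{2}\right)^{10},
\qquad
P(5) = \tfrac{252}{1024} \approx 0.246,\quad
P(6) = \tfrac{210}{1024} \approx 0.205,\quad
P(7\ \text{tails}) = P(3\ \text{heads}) = \tfrac{120}{1024} \approx 0.117.
```

So even the most likely score, 5 heads, happens only about a quarter of the time.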

This little script computes, 100,000 times, the average of 5 random numbers (from the Tcl rand() function), and stores the normalized results in a list called to:

 unset -nocomplain to
 for {set j 0} {$j < 100000} {incr j} {
    set o 0; set u 5
    for {set i 0} {$i < $u} {incr i} {
       set o [expr {$o + rand()}]
    }
    lappend to [expr {$o / $u}]
 }
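As a rough cross-check of what the script above should produce (a sketch in Python rather than Tcl, only because it is easy to verify independently; the figures 1/2 and 1/60 are the standard mean and variance of an average of 5 uniform [0,1) variables):

```python
import random

# The average of u independent uniform(0,1) samples has
# mean 1/2 and variance (1/12)/u; for u = 5 that is 1/60.
random.seed(1)          # fixed seed so the check is repeatable
u, n = 5, 100000
to = [sum(random.random() for _ in range(u)) / u for _ in range(n)]

mean = sum(to) / n
var = sum((x - mean) ** 2 for x in to) / n
print(round(mean, 3), round(var, 4))   # expect about 0.5 and 1/60 = 0.0167
```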

Now we'll make 300 buckets in which we count how many of the results fall between two adjacent bucket limits:

 unset -nocomplain gd
 for {set i 0} {$i < 300} {incr i} {set gd($i) 0}
 foreach i $to {incr gd([expr {int($i * 300)}])}
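The bucket index is simply int($i*300), so an average of exactly 0.5 lands in the middle bucket, 150. A quick sketch of the same mapping (in Python, for illustration only):

```python
# Map a value in [0, 1) to one of 300 buckets, as the Tcl
# foreach above does with int($i * 300).
def bucket(x, nbuckets=300):
    return int(x * nbuckets)

print(bucket(0.5))    # middle bucket: 150
print(bucket(0.0))    # first bucket: 0
print(bucket(0.999))  # last bucket: 299
```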

And display the result as a reasonably accurate bar graph on the bwise canvas (or any visible Tk canvas whose path is stored in the variable mc):

 $mc delete gr3
 foreach n [array names gd] {
    $mc create line [expr {100 + $n}] 301 [expr {100 + $n}] [expr {300 - 0.2 * $gd($n)}] -tag gr3
 }

http://82.168.209.239/wiki/gaussian1.jpg

The Gaussian curve can clearly be recognized, and the statistical measure 'variance' can be derived from the inflection points (where the second derivative changes sign, on both sides of the maximum, which lies at the expectation value). Also clearly, a hundred thousand samples still leave us with considerable deviation from the ideal 'big numbers' curve.
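The relation between the inflection points and the variance follows from the Gaussian density itself (standard calculus, not from this page):

```latex
f(x) = \frac{1}{\sigma\sqrt{2\pi}}\, e^{-(x-\mu)^2 / (2\sigma^2)},
\qquad
f''(x) = 0 \iff x = \mu \pm \sigma,
```

so the distance from the maximum at the expectation value to either inflection point equals the standard deviation, the square root of the variance.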

Normally, for natural random processes, the law of large numbers works pretty well.