An Arduino-based true random number generator.
It takes its peculiar name from the potassium-40 present in bananas, used as the radiation source for the random number generation.
Disclaimer: the radioactivity from the bananas doesn't really make a difference; the background radiation alone is more than enough to make the generator work, and as far as I know the bananas' contribution isn't even detectable. But it's funny.
The whole project is built around the STS-5 Geiger tube. The 400 V needed by the tube is provided by a 555-based step-up converter, a design very common on the internet.
The Geiger pulse is then inverted and fed to a monostable 555 to stretch it. The 555 drives a buzzer and an LED through a transistor, and the stretched signal is sent to the microcontroller.
The microcontroller is an ATmega328P. The Geiger signal goes to the ICP1 pin (input capture for Timer1), which latches the current 16-bit Timer1 value into a register and raises an interrupt. When the interrupt fires, the microcontroller takes the value from the capture register and sends it as two raw bytes over serial to the computer.
An ent test, run on a two-day sample, showed no significant problems in the randomness.
My banana (given the right tools) can compute a good estimate of PI using the Monte Carlo method.
I've written a Processing sketch that lets you watch the points being added to the plot on every radioactive event. Consult the internet if you don't know this method yet: it's probably one of the most inefficient ways to get a numerical estimate of pi, but it works and it's very intuitive.
The right shift by two is done because micros()'s two lower bits are always zero. The AND with 0xFFFF is done to break the sequentiality of the numbers: the micros() value overflows every 70 minutes, and we get roughly 30 new values per minute, so without the AND we would have monotonically increasing values for the whole 70-minute span. With it, the remaining 16-bit value overflows about every 260 ms (2^16 x 4 us), which is acceptable.
Data is printed as a 16-bit number on the serial port and collected by the computer. It is then saved to a text file, and the numbers are converted into two raw bytes each and analyzed by ent.
The first two-day sampling gave an alarming result:
$ ent randomdat.bin -c
Value Char Occurrences Fraction
  0             739   0.004077
  1             402   0.002218
  2            1033   0.005699
  3             674   0.003718
...             ...        ...
254             722   0.003983
255             691   0.003812
Total:       181256   1.000000
Entropy = 7.997995 bits per byte.
Optimum compression would reduce the size
of this 181256 byte file by 0 percent.
Chi square distribution for 181256 samples is 498.15, and randomly
would exceed this value less than 0.01 percent of the times.
Arithmetic mean value of data bytes is 127.4942 (127.5 = random).
Monte Carlo value for Pi is 3.138799695 (error 0.09 percent).
Serial correlation coefficient is 0.005408 (totally uncorrelated = 0.0).
As you can see, I was getting about half as many 0x01 bytes as the other values, and about 1.5 times as many 0x02 bytes. There was clearly something wrong. (Byte values from 0x04 to 0xFD are omitted.)
At some point I decided to split the high-order and low-order bytes of the 16-bit numbers into two separate files and analyze them separately. These were the results:
But I was obviously having a problem with the LSBs. All my 0x01 values were counted as 0x02.
At this point I was totally clueless about what was going on, but some considerations about the periodicity of the 0x##01 value (where # is a don't-care) saved my day.
This value is periodic with a period of 2^10 microseconds, i.e. about a millisecond. This led me to think about how the millis() value is updated via the TIMER0_OVF interrupt in Arduino. From Arduino's wiring.c for AVRs:
// copy these to local variables so they can be stored in registers
// (volatile variables must be read from memory on every access)
unsigned long m = timer0_millis;
unsigned char f = timer0_fract;
Basically the prescaler of Timer0 is set so that it overflows every 1024 microseconds: the clock frequency is 16 MHz, the prescaler is 64, and the timer overflows every 256 counts, giving an overflow every 1024 us. The function then accounts for the extra 24 us per overflow when computing the milliseconds, and that's how millis() works.
Just as another side note: 16 MHz with a prescaler of 64 gives a minimum resolution of 4 us, and that's why the micros() function has only 4 us resolution.
What I think was happening is this: interrupts have priorities. INT0 (the external interrupt)...