I'm building my dream machine! A purpose-built supercluster in a box, matched to the BigSack software. It's not often as a developer that you get to build a machine to match your software. I'm using 8 ODroid C1s: 1.5 GHz quad-core Cortex-A5 SBCs with 1 GB of memory each and gigabit Ethernet. Also got 8 16 GB eMMC cards, about 3x the performance of SD flash. Each of these 8 workers will service one tablespace of every data store; each database is composed of 8 memory-mapped tablespaces. The master will be an ODroid XU3, an octa-core Cortex-A15 class board with 2 GB, running at 2 GHz. The master handles the redo log files, multiplexes requests out to the worker nodes, and gets blocks back. The transport is UDP with a fixed block size of 4096 bytes, and each worker node is on its own UDP port.

A cluster lives and dies by its switching fabric, so I got two 6-port 24V 1000TX industrial switches. Those were the most expensive part: I spent about a grand on the computer boards, but the switches were $1300 apiece. Had to get two DC/DC converters from 24V to 5V @ 8A to power the worker nodes, at $300 a pop. Still, I doubt there is any mainframe that can match it pound for kilo when it's running. When it is, I can transplant it right into ROSCOE himself. I have accepted the fact that I need a separate 24V bus just to power computing.
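The transport is simple enough to sketch. Here's a minimal Python illustration of a fixed-block UDP scheme like the one described: the base port, header layout, and block-to-worker mapping are all my own assumptions for illustration, not BigSack's actual wire format.

```python
import socket
import struct
import zlib

BLOCK_SIZE = 4096    # fixed UDP payload size from the design above
NUM_WORKERS = 8      # one worker per tablespace
BASE_PORT = 10000    # hypothetical; each worker listens on its own UDP port

def worker_for_block(vblock):
    """Route a virtual block to a tablespace/worker by modulo 8.
    Hypothetical scheme -- any stable block -> 0..7 mapping works."""
    return vblock % NUM_WORKERS

def make_request(vblock, payload=b""):
    """Pack a fixed 4096-byte datagram: 8-byte block id, 4-byte CRC of the
    payload, then the payload itself, zero-padded to the full block size."""
    header = struct.pack(">QI", vblock, zlib.crc32(payload))
    body = (header + payload).ljust(BLOCK_SIZE, b"\x00")
    assert len(body) == BLOCK_SIZE
    return body

def send_block(sock, vblock, payload=b""):
    """Multiplex a request out to the worker that owns this block."""
    port = BASE_PORT + worker_for_block(vblock)
    sock.sendto(make_request(vblock, payload), ("127.0.0.1", port))
```

With a fixed datagram size there's no framing to negotiate: every read on the worker side is exactly one 4096-byte block, which matches the memory-mapped tablespace page size.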
The eMMC is supposed to be 300 MB/sec each, so that's 2.4 GB/sec total aggregate IO to permanent storage across the 8 workers.
8 worker procs at 1.5 GHz is 12 GHz, plus the master's 2 is 14 GHz aggregate clock counted per board; counted per core it's 32 × 1.5 + 8 × 2 = 64 GHz.
8 cores master + 4 cores each node times 8 nodes is 40 total cluster processor cores
10 gigabytes total cluster RAM (8 × 1 GB workers plus the 2 GB master). 8 × 16 GB worker eMMC plus the 64 GB master is 192 GB total cluster permanent storage (without extra flash; eMMC leaves the Micro SD slot open)
8 gigabit Ethernet channels running UDP is 8 Gb/s, or 1 gigabyte per second total burst cluster throughput with all worker nodes fully active
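The back-of-the-envelope totals above are easy to check in a few lines:

```python
# Cluster totals from the build list above.
WORKERS = 8
WORKER_CORES, WORKER_RAM_GB, WORKER_EMMC_GB = 4, 1, 16
MASTER_CORES, MASTER_RAM_GB, MASTER_EMMC_GB = 8, 2, 64
EMMC_MB_S = 300     # per-card sequential throughput
GBIT_LINKS = 8      # one gigabit channel per worker

cores = WORKERS * WORKER_CORES + MASTER_CORES           # 40 cores
ram_gb = WORKERS * WORKER_RAM_GB + MASTER_RAM_GB        # 10 GB RAM
emmc_gb = WORKERS * WORKER_EMMC_GB + MASTER_EMMC_GB     # 192 GB eMMC
io_gb_s = WORKERS * EMMC_MB_S / 1000                    # 2.4 GB/s storage IO
net_gb_s = GBIT_LINKS * 1 / 8                           # 1 GB/s network burst
```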
After assembly, I will run my new Canny/Hough/Inverse DCT transform and build the AMI (Algebraic Machine Intelligence) model from the exemplars of the Caltech 101. I can then test the full Caltech 101 dataset against the new model. After refinement, it's on to Caltech 256, PASCAL VOC 2007, and whatever else. If I'm cracking Caltech 256 at 90% with my shoebox robot brain, attention will follow.
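For flavor, here is what one stage of such a pipeline looks like: a pure-NumPy line Hough accumulator, which votes edge pixels (e.g. Canny output) into a (rho, theta) parameter space. This is a textbook illustration only, not the actual AMI transform.

```python
import numpy as np

def hough_lines(edges, n_theta=180):
    """Vote edge pixels of a binary image into a (rho, theta) accumulator.
    A peak at (rho, theta) marks a line rho = x*cos(theta) + y*sin(theta).
    Textbook sketch -- not the AMI model's actual transform."""
    h, w = edges.shape
    diag = int(np.ceil(np.hypot(h, w)))          # max possible |rho|
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    acc = np.zeros((2 * diag, n_theta), dtype=np.int32)
    ys, xs = np.nonzero(edges)                   # edge pixel coordinates
    for theta_idx, t in enumerate(thetas):
        # Each edge pixel votes for one rho bin at this angle.
        rhos = np.round(xs * np.cos(t) + ys * np.sin(t)).astype(int) + diag
        np.add.at(acc, (rhos, theta_idx), 1)
    return acc, thetas, diag

# A horizontal row of edge pixels should peak near theta = 90 degrees.
img = np.zeros((20, 20), dtype=bool)
img[10, :] = True
acc, thetas, diag = hough_lines(img)
rho_idx, theta_idx = np.unravel_index(np.argmax(acc), acc.shape)
```

The accumulator is embarrassingly parallel over angle bins, which is exactly the kind of work that splits cleanly across eight worker nodes.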