I never knew a full HD screen had soooo many pixels. I knew the actual numbers, but filling it pixel by pixel is a hard job! 1920 x 1080 is about 2 million pixels per frame.
Remember this next time you play your favourite game at 60 fps: that is over 120 million pixel updates every second. Your video card is a true hero, a miracle of recent technology.
My project sets up a snowfall simulation using the frame buffer in Linux console mode. The snowflakes collide with each other and with other objects, and pile up over time. There is a YouTube video linked on this page where you can see it in action for ~30 minutes.
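For anyone curious how drawing to the console framebuffer works, here is a minimal sketch, not the project's actual code: it assumes /dev/fb0 is 1920x1080 at 32 bpp with no padding at the end of each line, maps it into memory and plots a single white pixel. A real program should query the geometry with the FBIOGET_VSCREENINFO / FBIOGET_FSCREENINFO ioctls instead of hardcoding it, and check for errors.

        ; Minimal framebuffer sketch (hypothetical file name fbdot.asm).
        ; Assumes /dev/fb0 is 1920x1080, 32 bpp, no per-line padding.
        ; Error handling omitted to keep it short.
        ; Build: nasm -f elf64 fbdot.asm && ld -o fbdot fbdot.o
        ; Run it from a real text console, not from an X/Wayland terminal.

                global  _start
                section .rodata
        fbdev:  db "/dev/fb0", 0

                section .text
        _start:
                ; open("/dev/fb0", O_RDWR)
                mov     rax, 2                  ; sys_open
                lea     rdi, [rel fbdev]
                mov     rsi, 2                  ; O_RDWR
                xor     rdx, rdx
                syscall
                mov     r12, rax                ; keep the file descriptor

                ; mmap(NULL, 1920*1080*4, PROT_READ|PROT_WRITE, MAP_SHARED, fd, 0)
                mov     rax, 9                  ; sys_mmap
                xor     rdi, rdi
                mov     rsi, 1920*1080*4
                mov     rdx, 3                  ; PROT_READ | PROT_WRITE
                mov     r10, 1                  ; MAP_SHARED
                mov     r8, r12
                xor     r9, r9
                syscall                         ; rax = pointer to the pixels

                ; white pixel at (960, 540): offset = (y*1920 + x) * 4 bytes
                mov     dword [rax + (540*1920 + 960)*4], 0x00FFFFFF

                ; exit(0)
                mov     rax, 60                 ; sys_exit
                xor     rdi, rdi
                syscall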
I used AMD64 assembly and the "rdrand" instruction, so you need a fairly recent processor to replicate it. I know, I know, NSA and other agencies have supposedly back-doored this instruction, but I am not using it for cryptography :).
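For reference, rdrand sets the carry flag only when it actually delivered a value, so the usual pattern (a generic sketch, not lifted from this project) is to retry until CF is set:

        ; returns a 32-bit hardware (pseudo-)random number in eax
        get_random32:
                rdrand  eax
                jnc     get_random32            ; CF clear -> no value yet, try again
                ret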
The project is not yet optimised at the algorithm level, but it is already 1000 bytes in binary form (ELF headers don't count according to the HAD 1KB contest rules, since they don't add functionality).
This brings back the nostalgia of the old xsnow program...
Next version: add pine trees and let Santa slide by on his sleigh :-)
Furthermore, there is no need for a special instruction to generate (pseudo-)random numbers. I think there is an "entropy pool" that limits the rate of generated numbers to preserve "randomness", but you don't need that here. A simple LFSR works. And if you are after code size, check the documentation carefully for how fast this opcode can actually be executed.
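As an illustration of the LFSR-style alternative, here is one common choice, Marsaglia's xorshift32, in a few instructions (my own sketch with hypothetical names, not code from this project). The seed just has to be non-zero, and the period is 2^32 - 1:

                section .data
        rngstate: dd 0x12345678                 ; any non-zero seed

                section .text
        ; returns the next pseudo-random 32-bit value in eax
        xorshift32:
                mov     eax, [rel rngstate]
                mov     ecx, eax
                shl     ecx, 13
                xor     eax, ecx                ; x ^= x << 13
                mov     ecx, eax
                shr     ecx, 17
                xor     eax, ecx                ; x ^= x >> 17
                mov     ecx, eax
                shl     ecx, 5
                xor     eax, ecx                ; x ^= x << 5
                mov     [rel rngstate], eax
                ret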
:) I don't think there is enough space for that, but I still have to optimize the implementation. Maybe I can squeeze a few hundred bytes out and put in some sprites. I chose rdrand because it spits out a (pseudo-)random number in 4 bytes and it simplifies things a lot.
By next version, I mean a larger one. Though the xsnow sprites are low-res and easily compressible, so they could add some bonus points in the contest :-)
Hah, cool! Got some ideas brewing from this, thanks :)