1728 LEDs is a lot more data than you think

A project log for FLED

An LED display showing visualizations and rendering data from a variety of TCP sources over the Open Pixel Control protocol

Ben Delarre 01/30/2015 at 00:30

In the original FLED implementation we had a Raspberry Pi inside the case. This Pi ran a NodeJS application that executed animation code written in JavaScript and provided a web interface where users could create animations. This all worked fairly well and gave us a smooth 30fps even with quite complex animations. However, we only had 96 LEDs at the time, and even then it would occasionally run slowly with a badly written animation. Scaling this up to 1728 LEDs just isn't going to work on the Pi.

So I decided a good option might be to put a Teensy 3.1 inside the case, along with an ESP8266 wifi module. I would generate the LED data on a machine elsewhere in the office, stream the raw LED data over the wifi-to-serial bridge, and have the Teensy send it out to the LEDs. This seemed like a solid idea until I started running the numbers...

(1728 * 3) * 30 = 155520 bytes per second, or about 155 kB/s.

That's pretty hefty... the maximum baud rate we can get out of an ESP8266 is 921600 bits per second, giving us at most 115200 bytes per second (and less in practice, once start and stop bits eat into the raw serial throughput). Not enough to run our animations smoothly!

Now, we could faff around trying to get two ESP8266 modules working on the Teensy's two high-speed UARTs. Our sending application would then split each frame between two packets on two chips, giving us enough bandwidth. Of course, then we would have to faff around figuring out how to synchronize the two wifi modules so we had complete frames before sending them out to the LEDs. This was starting to feel like a hack.

In the end I decided to switch to a Beaglebone Black that we had sitting around the office. With this we could use a proper USB wifi module, which provides a much higher data rate, and we also get a lot more processing power, so we can start doing some fun things. With all this extra power we can set up things like receiving multiple streams at once from a variety of sources, allowing us to have alerts blended over the main animations. The animation data would still be generated externally, but the Beaglebone Black would act as a blender and manager of sources, and provide a UI to control what gets shown when.

I still suspected NodeJS would be a performance hog for this workload, and since we no longer had to run user-written JavaScript on the server, I took the opportunity to rewrite it in Go for a nice performance bump. More on that later...