Control My Lights

Control My Lights using a website, Twitch chat, or YouTube chat.

This is an interactive light installation that can be controlled via a website, Twitch chat, or YouTube chat. Control My Lights is also an independent fundraiser for Feeding America. Millions are out of work right now and relying heavily on food pantries, which are seeing their need skyrocket while donations dwindle. Feeding America is an organization that helps provide resources to food banks, food pantries, and meal programs across the United States. With the need for food growing every day, I wanted to build something that might drive donations toward a fantastic cause. I am raising money for Feeding America through Tiltify, a service charities use to safely and securely accept donations. I never touch the money; it goes straight to Feeding America.

I am using this installation to raise money for Feeding America (I put in the first $25; the goal is $1,000). I would be honored if you check it out. Here's the server-side tech stack (on Heroku): NodeJS (ExpressJS/SocketIO) and ReactJS. I also made the mockups for it in Figma. Here is the desktop tech stack: NodeJS, Python (Selenium with the Chrome WebDriver), Redis, MongoDB, OpenFrameworks (a C++ creative coding framework), Arduino, and WebSockets.

  • 2 × 8' FlexFireLeds ColorBright™ RGB 300 Color Changing LED Strip Light
  • 1 × SparkFun Snappable Protoboard
  • 3 × FQP30N06L N-Channel MOSFET
  • 1 × Stackable Header - 3 Pin (Female, 0.1")
  • 1 × MEAN WELL LRS-350-24 350.4W 24V 14.6 Amp Single Output Switchable Power Supply

View all 14 components

  • The Final Architecture:

    Edward C. Deaver, IV, 10/16/2020 at 21:10

  • Performance v2:

    Edward C. Deaver, IV, 10/16/2020 at 21:08

    Redis messaging:

    To reduce the number of times the NodeJS listeners had to manipulate the messages they were receiving, and to establish a single standard for data sent around the system, I moved the text-manipulation code from the MongoDB listener to the Redis queue stage. This manipulation includes invoking the createFinalJSON command, which converts messages from the YouTube listener (Python) to JavaScript values, fixes the dates, and lowercases the color hex values. This also mitigates an issue I had in OpenFrameworks: when messages from YouTube came in, the program crashed.
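A minimal sketch of what this normalization step might look like. The function name createFinalJSON comes from the project, but the field names and exact transforms here are assumptions for illustration:

```javascript
// Hypothetical sketch of the normalization now done at the Redis queue stage.
// Field names and the exact transforms are assumptions, not the project's code.
function createFinalJSON(raw) {
  const msg = JSON.parse(raw);
  return {
    // Python listeners may serialize booleans as "True"/"False" strings.
    fromYoutube: String(msg.fromYoutube).toLowerCase() === "true",
    // Lowercase hex colors so downstream consumers can compare them directly.
    color: String(msg.color).toLowerCase(),
    // Normalize timestamps to an ISO 8601 string that JavaScript can parse.
    timestamp: new Date(msg.timestamp).toISOString(),
    user: msg.user,
  };
}

const normalized = createFinalJSON(
  '{"fromYoutube": "True", "color": "#FF00AA", "timestamp": "2020-10-16T21:10:00Z", "user": "viewer1"}'
);
```

With the manipulation done once at the queue, every downstream listener can assume the same message shape.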

    Arduino v2:

    Initially, the Arduino code listened for serial commands inside the loop function, checking for colons and running nested loops to read the serial data. This was incredibly slow and relied on blocking functions, which gave the Arduino a lag of ~700ms from input to LED coloration. It would also disregard new messages while it was still processing the last one.

    To fix this, I implemented Serial Event from here: Using a similar technique to the link, I built a string from the characters received over serial inside the while-serial-available loop, breaking out when a newline character ('\n') arrived. Once out of the loop, I used substring to split the numbers into RGB channels at fixed positions (red at indices 0-3, green 4-7, blue 8-11). Because of this, I had to make sure each RGB value was actually 3 digits (example: red is 255:0:0). To fix this in my Arduino Listener function, I converted the RGB values to strings, padding single-digit numbers with two leading 0s and two-digit numbers with one leading 0.

    My Arduino code now also produces a bug after the first message it receives: it cuts off the first character of every subsequent message. This means a message of "250:100:100" results in red/green/blue values of 250/100/100 if sent as the first message, but if sent as a second or subsequent message the values become "50:"/"00:"/"00". To work around this, I prepended a 0 in the NodeJS script. The lag time with this new approach is under 50ms.
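The NodeJS-side formatting can be sketched like this. This is a model of the idea described above, not the project's actual code; the function names are invented:

```javascript
// Sketch of the serial message formatting described above (a model of the
// idea, not the project's exact code).
function pad3(value) {
  // Pad each channel to exactly 3 digits so the Arduino can split the
  // string at fixed positions instead of searching for separators.
  return String(value).padStart(3, "0");
}

function formatColorMessage(r, g, b) {
  // The extra leading "0" compensates for the Arduino bug that drops the
  // first character of every message after the first one.
  return "0" + [r, g, b].map(pad3).join(":") + "\n";
}
```

So pure red is sent as "0255:000:000\n": fixed-width channels plus the sacrificial leading character.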


  • Analytics:

    Edward C. Deaver, IV, 10/16/2020 at 21:08

    I wanted to give the viewer an easy way to find the donation link, no matter the platform. To do this, I used a shortened link. This also gave me the benefit of built-in analytics on that link, so I could track how many people clicked vs. donated. This experience would push me to make all the links on the website shortened links, so I could easily track outbound clicks.

  • Building the cool streaming overlays in OBS:

    Edward C. Deaver, IV, 10/16/2020 at 21:07

    Adding a donation overlay: 

    To get updates from Tiltify (the charity platform for this project) when someone donated, I used part of Streamlabs. Using their Alert Box, I was able to connect the Tiltify campaign and show a message/animation when someone donated. In OBS, I set a browser source to point to my Streamlabs Alert Box URL.

    Adding analytics overlay:

    I wanted to give a sense of gamification and community across the different platforms, so I decided to display the viewership numbers from all of them. For the website, I screen-grabbed the Google Analytics real-time visitor count. Twitch and YouTube are both just screen grabs of their streams.

    Text design:

    I wanted the overlay text to give a futuristic, clean vibe while being easy to read. This brought me to Adam GG Pro, for which I purchased a commercial desktop license.

    Color Command slideshow:

    To get the color command slideshow, I took screenshots of the color commands on the website. Then I brought the screengrab into Affinity Designer, removed the purple background, and created a new image per row of commands. Those are fed to the OBS Studio slideshow source.


  • Creating visual dashboard:

    Edward C. Deaver, IV, 10/16/2020 at 21:05

    All the cool Twitch streamers have fun animations happening on their streams, so I wanted some for this. To create them, I made a mini dashboard in OpenFrameworks (a C++ creative coding framework). Without getting too deep into the inner workings of OF, I had to learn a lot about the OpenGL matrix and what it means to push styles and matrices onto the OF stack. I ended up using three rectangles rotated 45 degrees to complement the empty space of the X-shaped lights. I also wanted to tell the audience where the current color command came from, so I created a text string that resizes based on the size and position of the rectangles, which is set via a GUI slider.

    Once I had everything working via the mouse-pressed event function, I moved on to getting the program data. I'm relatively new to C++, so I didn't understand how to use external libraries in the program. Therefore, using a Redis library was out, but I was easily able to set up a WebSocket server add-on for OpenFrameworks. After moving my code into the "onmessage" event function, I parsed the string data to JSON and set my global variables. Then, on the NodeJS side, I created a Redis listener that pushed the Redis data out onto a WebSocket server once received.

    The biggest issue I had with OpenFrameworks was wrapping my head around the draw loop and how my socket, when it received a message, would interact with that loop. I solved this with a few global variables and a boolean that was flipped when a new message was detected. On a new message, the string written to the screen in the draw loop would simply change its text, and the rectangle colors would change. Another challenge was creating a lookup map from the hex values, so when a new message came in I could display it as a color name or the hex value. I had to learn how to create a map in C++, and I iterated over my colors JSON data, flipping the key/value relationship and using the resulting data structure as a color lookup.

    I also noticed that ofVideoGrabber, which you use in OF to get webcam data, uses a lot of CPU compared to OBS Studio. OBS can pull data from multiple sources without breaking 5% usage, but my app used 5% just getting data from one camera. To combat this, I implemented an idle state: only when the program received data would it grab new pixel data from the webcam; otherwise, the draw loop just redrew whatever it already had. This dropped my idle CPU usage to ~2%, and making sure it only requested a 720p image lowered the peak CPU usage from 15% to 5%.
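The lookup-map flip described above was done in C++ in the project; it can be illustrated in JavaScript like this (the color data here is invented for the example):

```javascript
// Illustration of flipping a name->hex colors object into a hex->name
// lookup table, as described above. The project did this in C++ with a
// std::map; the color data here is invented for the example.
const colors = { red: "#ff0000", green: "#00ff00", blue: "#0000ff" };

const hexToName = {};
for (const [name, hex] of Object.entries(colors)) {
  hexToName[hex.toLowerCase()] = name; // flip the key/value relationship
}

function displayColor(hex) {
  // Show the friendly name when we know it; otherwise fall back to the hex.
  return hexToName[hex.toLowerCase()] || hex;
}
```

This way an incoming "#FF0000" can be shown on screen as "red", while unknown hex values are displayed as-is.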

  • Streaming setup:

    Edward C. Deaver, IV, 10/16/2020 at 21:04

    The streaming setup is like most Twitch setups. I am using a Canon t6i outputting to an Elgato Cam Link capture card. Because no currently operating streaming service uses WebRTC, they all add delay to the video (they all use RTMP; Mixer was the one platform that used WebRTC with its FTL protocol). Initially I was seeing upwards of 8 seconds of delay when streaming; I was able to reduce that to under ~5 seconds using this guide and by reducing my streaming output from 1080p to 720p. To stream on YouTube and Twitch at the same time, I'm using Restream. It should be noted that Twitch, unlike YouTube, will cut your stream at the 48-hour mark so their system can make a Video On Demand (VOD) of the stream. OBS and Restream will successfully reconnect to Twitch at that point. After completing testing for OBS, I moved on to making the stream a little more interesting.

  • Mounting the lights:

    Edward C. Deaver, IV, 10/16/2020 at 21:01

    Initially I thought I should use mounting brackets, but without a 3D printer or the time to create fancy woodworking mounts, I decided to go with fishing line and hang the tubes from the ceiling. To find the points where the hooks should go, I used the Pythagorean theorem. I wanted a 45° X shape, which results in 4 right triangles, each with two 45° angles and one 90° angle. The opposite and adjacent sides were 4' long, which makes the hypotenuse 4√2 ≈ 5.66', or about 5' 8"; call that X. To mount the lights, I also needed a Y value for spacing from the ceiling and lining up with the camera. Once I knew Y, I cut one length of line to X+Y for the end of the tube closer to the ground, and one to Y for the end closer to the ceiling. To attach the line to the tubes, I tied knots in the ends of the line, stuffed them into the tube, and put the cap over top of them so friction holds them in place. Using fishing line made it look like the tubes were floating on camera.
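The triangle math above works out as follows (a quick check of the hypotenuse length):

```javascript
// Worked version of the mounting math: a 45° X shape means each quadrant
// is a right isosceles triangle whose legs are the 4' half-spans.
const leg = 4; // feet
const hypotenuse = Math.hypot(leg, leg); // 4 * sqrt(2) ≈ 5.657 feet
const inches = (hypotenuse % 1) * 12;    // fractional feet to inches ≈ 7.9"
// So the hypotenuse X is about 5' 8", and the lower line is cut to X + Y.
```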

    Mounting dimensions for LED Tubes:

    Mounting the LED Tubes with fishing line:

  • Building the LED tubes:

    Edward C. Deaver, IV, 10/16/2020 at 20:40

    To create the LED tubes, I got 8-foot plastic tubes, normally used for premade lights, from Home Depot. I created an 8-foot strand of lights from two 4' strips for one tube, and ordered an 8-foot strand for the other. You may notice that once the strip is inside the tube, you can't pull on it to expose the adhesive. I found a YouTube video that suggested hooking a fishing hook into the adhesive. Instead of a fishing hook, I found a rubber-coated paper clip works well. When you do this, make sure the paper clip is closed up and not fashioned like a hook; if it's hook-shaped it will catch on your LEDs and you'll have to make an incision in the tube to free it. To diffuse the light, I used parchment paper.

    Here is a fast-forward video of building one of the tubes: 

  • Performance:

    Edward C. Deaver, IV, 10/16/2020 at 20:38


    By this point I had everything working well enough, but there were two glaring issues. First, my Arduino crashed if it was sent too many messages in a short period of time. Second, I was experiencing data loss on my SocketIO website listener when data was sent too quickly.

    Debugging NodeJS and Arduino for increased speed:

    Initially, my Arduino code received a string via serial ("#AABBCC"), split it into 3 parts, and converted each from hex to an int (example: FF becomes 255). This way of processing data posed a big problem: the rate at which I could send messages to the Arduino (10 messages in 5ms) was faster than the 9600 baud rate could handle. When I tested spamming data from the website, the Arduino crashed. While I never got an error message, I believe it ran out of dynamic memory, an issue that can happen when dealing with Strings.

    To fix this, I first implemented a queuing system on the Express routing server using Bull, which provides a Redis-backed queue. I pushed data onto the queue when a POST request was received; when I received a new job from the queue, I emitted the job data on the colormessages SocketIO channel and paused processing for two seconds. Eventually I would bring that delay down to one second, then to 500ms. This queue system is something I had hoped I wouldn't have to implement when beginning this project, but it was very simple and improves the application's ability to scale when there are 10,000 users trying to communicate with the app.

    Next, I moved on to the Arduino. First I pushed the baud rate from 9,600 to 115,200. Then I followed this guide on manipulating data using char arrays with a fixed maximum length, which makes it easy to guarantee I won't ever hit the Arduino's memory limit. I implemented a new schema for this char-array approach: R:G:B, with a colon as the separator character. Unfortunately, I forgot that my serial-write abstraction class appended a \r\n line ending. When I failed to account for that extra length in long strings like 255:255:255, the program would read Red/Green/Blue = 255, but then re-read the string as Red = 0, Green/Blue = 255. After increasing the maximum array length, this worked perfectly.

    Note: the code I used for this was very slow, and would be rewritten in Performance v2.
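The fixed-length scheme can be modeled in JavaScript to show why the line ending mattered. The real parser runs on the Arduino in C with a fixed char array; this is only an illustration of the idea:

```javascript
// JavaScript model of the fixed-length "R:G:B" scheme described above.
// The real parser runs on the Arduino with a fixed char array; this model
// just illustrates why the "\r\n" line ending mattered for buffer sizing.
const MAX_LEN = "255:255:255\r\n".length; // 13 chars: worst case incl. line ending

function parseColorMessage(msg) {
  // A fixed buffer means anything longer than the worst case is an error,
  // mirroring the overflow the author hit before enlarging the array.
  if (msg.length > MAX_LEN) throw new Error("would overflow the char buffer");
  const [r, g, b] = msg.trim().split(":").map(Number);
  return { r, g, b };
}
```

With the buffer sized for "255:255:255" plus "\r\n", the worst-case message fits and the re-read bug disappears.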

    Moving to Redis Pub/Sub and Queue:

    I decided to increase the speed of my app and make it easier to maintain by removing the ExpressJS/SocketIO routing server, replacing it with a Redis publisher/subscriber model, and bypassing Bull to create a Redis queue directly. Now the external listener components (Website, Twitch, and YouTube) push their messages to a Redis queue named "ExternalMessages". The queue has its first element removed and published to the "InternalMessages" channel, to which the Arduino and MongoDB internal listeners subscribe.
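The shape of that pipeline can be sketched with a toy in-memory stand-in for Redis (real code would use a Redis client; the function names here mimic Redis commands for illustration only):

```javascript
// Toy in-memory stand-in for the Redis pipeline described above: external
// listeners push onto an "ExternalMessages" queue, a pump pops the oldest
// element and publishes it on "InternalMessages", where the internal
// listeners (Arduino, MongoDB) are subscribed.
const queues = { ExternalMessages: [] };
const subscribers = { InternalMessages: [] };

function lpush(queue, msg) { queues[queue].unshift(msg); }       // like Redis LPUSH
function subscribe(channel, handler) { subscribers[channel].push(handler); }

function pump() {
  // Like RPOP + PUBLISH: remove the oldest element and fan it out.
  const msg = queues.ExternalMessages.pop();
  if (msg !== undefined) {
    subscribers.InternalMessages.forEach((handler) => handler(msg));
  }
}

// Usage: an internal listener subscribes, a website listener pushes.
const received = [];
subscribe("InternalMessages", (msg) => received.push(msg));
lpush("ExternalMessages", '{"color":"#ff0000","source":"website"}');
pump();
```

The queue decouples bursty external input from the internal consumers, while pub/sub lets any number of internal listeners react to the same message.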

    Async + Promises:

    NodeJS works by placing all of your tasks into the "event loop", and if some code slows down that loop, it can block JavaScript execution, like blocking my SocketIO listener from reading in new data. How do you fix this? By placing functions onto the microtask queue with async and Promises. I broke all of my CPU-heavy tasks into async functions. From there, I tried to reduce the number of operations they needed to complete. Then I used Promise.all to execute a few related functions at the same time by pushing them onto the microtask queue. After doing this to the web server and the SocketIO listener, I tested it by clicking two different buttons at the same time, and it successfully handled both messages...
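The pattern described above can be sketched like this. The handler names and message shape are invented for the example, not taken from the project:

```javascript
// Sketch of the pattern described above: CPU-heavy work broken into async
// functions and dispatched together with Promise.all. The handler names
// and message shape are invented for the example.
async function saveToDatabase(msg) {
  // Stand-in for the MongoDB write.
  return { saved: true, color: msg.color };
}

async function broadcastToSockets(msg) {
  // Stand-in for emitting on the SocketIO channel.
  return { broadcast: true, color: msg.color };
}

async function handleMessage(msg) {
  // Both handlers are scheduled together instead of blocking one another,
  // so neither can starve the event loop while the other runs.
  const [dbResult, socketResult] = await Promise.all([
    saveToDatabase(msg),
    broadcastToSockets(msg),
  ]);
  return { dbResult, socketResult };
}
```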


  • Rate limiting:

    Edward C. Deaver, IV, 10/16/2020 at 20:36

    To reduce spam and potential bot manipulation, I implemented a rate limit on how quickly a user can submit a new color. This is done with a simple map (NodeJS) or dictionary (Python) associating each user with their last message time. If that time is within the set timespan, the user is not allowed to set a color and their time value is not updated. If it is outside the allotted timespan, the user's command is allowed and their time value is updated.
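The NodeJS side of this can be sketched as follows. The 5-second window is an example value, not the project's actual setting:

```javascript
// Sketch of the per-user rate limiter described above, using a Map keyed
// by username. The 5-second window is an example value, not the project's
// actual setting.
const RATE_LIMIT_MS = 5000;
const lastMessageTime = new Map();

function isAllowed(user, now = Date.now()) {
  const last = lastMessageTime.get(user);
  if (last !== undefined && now - last < RATE_LIMIT_MS) {
    return false; // too soon: reject, and do NOT update the stored time
  }
  lastMessageTime.set(user, now); // allowed: record this message's time
  return true;
}
```

Not updating the timestamp on rejection matters: otherwise a spamming user could keep pushing their own window forward and lock themselves (or the check) into odd behavior.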

View all 15 project logs
