I've been in love with the idea of playing a movie back very slowly since I saw the project Bryan Boyer built back in 2018. I was reminded of it in August when Tom Whitwell built a version of it, and I've been on the lookout for a ~7.5" epaper display ever since.

The problem is I'm really cheap and don't want to spend $60+ for a panel. But, I do have two unused epaper displays on hand that I bought back in 2019 and promptly forgot about after the initial hello world experience.


I grabbed an ESP32 out of my parts bin and got the display hooked up and running quickly using the ExPD Arduino Library that supports both pieces of hardware. From there it's just a matter of changing out the image data, which is 5808 bytes for a black and white image (264 × 176 pixels at 1 bit per pixel, so 8 pixels per byte = 5808 bytes).

Looking at a couple of test images it was clear this screen was a viable solution. The high resolution of the camera makes these photos look grainy, but the screen is small and the pixels are dense, so in person it actually comes out rather nice! The dithering method has a big effect on this, as you can see from these tests.

Server<>Client (or How I'm Abusing MQTT)

Now obviously I'm not going to store an entire movie on the ESP32. I could have grabbed an SD card (and actually did pull out a breakout when grabbing parts for the prototype) but this didn't make much sense. I wasn't going to be pulling frames out of a video file using the ESP32. This board has WiFi and I have a couple of Linux servers on this LAN, so I decided to make a server-side image processing script that would pass image data to the board.

But how do you get the image to the board in the first place? Well, I was already loading 5,808 bytes into an array for the test image, so why not just pass those bytes to the board via MQTT? I looked up the message length limit in the standard and figured I would have no problem with it. After messing around for a while I discovered a buffer limit in the Arduino MQTT library, which I upped to 20 kB, and was then able to send messages without a problem.
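As a sketch of the server side of that exchange (the paho-mqtt client, the topic name, and the broker address here are my assumptions, not details from the actual script):

```python
def frame_message(frame: bytes) -> bytes:
    """Guard that one frame fits the 20 kB buffer configured on the board."""
    if len(frame) > 20 * 1024:
        raise ValueError("frame exceeds the Arduino MQTT library buffer")
    return frame

def publish_frame(frame: bytes, broker: str = "192.168.1.50") -> None:
    # paho-mqtt is an assumption; any MQTT client library would do.
    import paho.mqtt.publish as publish
    publish.single("slowmovie/frame", payload=frame_message(frame),
                   hostname=broker, qos=1)
```

A 5,808-byte frame is nowhere near the protocol ceiling (the MQTT spec allows payloads up to 256 MB); the only real constraint is that client-side library buffer.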

Server Side Script (or How Amazing ImageMagick Really Is)

There's another dirty hack in here I'll get to in a minute. But first, check out this ImageMagick command:

convert inputframe.png -rotate -90 -resize "176x264^" -gravity center -crop 176x264+0+0 -dither FloydSteinberg outputfile.XBM 

This single command takes an image file, rotates it 90 degrees, resizes it to completely fill a 176×264 box regardless of the source aspect ratio (that's the ^ flag), center-crops the overflowing dimension down to the exact size, dithers the image with Floyd–Steinberg, and spits out an XBM file with the bytes I need to represent a 1-bit color depth image!
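Since the server script drives this per frame, the same command can be wrapped from Python. This is a hypothetical wrapper, not the actual server code (function names are illustrative):

```python
import subprocess

def magick_cmd(src: str, dst: str) -> list[str]:
    """Build the ImageMagick invocation from the post for one frame."""
    return ["convert", src,
            "-rotate", "-90",
            "-resize", "176x264^",    # ^ = fill the box, overflow one axis
            "-gravity", "center",
            "-crop", "176x264+0+0",   # trim the overflowing axis, centered
            "-dither", "FloydSteinberg",
            dst]

def convert_frame(src: str, dst: str) -> None:
    # Requires ImageMagick on PATH; raises if convert exits non-zero.
    subprocess.run(magick_cmd(src, dst), check=True)
```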

Except that the image will not display properly on the screen. I still needed to invert the colors and swap the bit order of the bytes. It's possible both of those options are available in the ExPD library but I didn't immediately find them. Instead I use Python to make these changes, format the result into an 11,616-character-long text message, and post it as a message on my MQTT broker. The display is subscribed to the topic and updates whenever a new message comes in.
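A minimal sketch of that fixup step. XBM packs pixels least-significant-bit first, which is likely why the bit order needs reversing; the hex encoding shown here (two characters per byte, so 5,808 × 2 = 11,616 characters) is my guess at how the text message is built, and the function names are illustrative:

```python
def fix_frame(xbm_bytes: bytes) -> bytes:
    """Invert colors and reverse the bit order within every byte."""
    out = bytearray()
    for b in xbm_bytes:
        b ^= 0xFF                      # invert black/white
        b = int(f"{b:08b}"[::-1], 2)   # reverse the 8 bits
        out.append(b)
    return bytes(out)

def to_message(frame: bytes) -> str:
    # Assumed encoding: two hex characters per byte gives the
    # 11,616-character payload for a 5,808-byte frame.
    return frame.hex()
```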

That's the hack... certainly there are better ways to transfer the image. The ESP32 should be able to pull down an image and process it. Or there should be a better way to send MQTT data. But this is what I knew how to do quickly, it'll get fixed in a future version if there is one ;-)

1 Frame Per Minute (1440x Slower Than Normal)

Five days into the project, I'm having fun watching David Lynch's Dune play out. Irulan's opening monologue was on for 2 days after a day of the Universal logo. The opening credits that roll after the monologue have now been going for 2 days and I suspect there will be a couple more to go.

So far the effect is unexpected. It's like a graphic novel. I know the movie well, so I'm excited to see what is shown each time I walk by the display. It will take 4.5 months or so to play through; we'll see if I get...
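The arithmetic behind the 1440x and ~4.5-month figures, assuming a 24 fps source and Dune's roughly 137-minute theatrical runtime (the runtime is my assumption, not stated in the post):

```python
FPS = 24
slowdown = FPS * 60            # 1 frame/min vs 1440 frames/min at 24 fps
total_frames = 137 * 60 * FPS  # every frame of a ~137-minute film
days = total_frames / (24 * 60)  # 1440 frames shown per day
print(slowdown, round(days))   # 1440x slower, ~137 days (~4.5 months)
```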
