“Babel in Reverse” is an art installation under the Manhattan Bridge in the Dumbo neighborhood of Brooklyn. It features 178 custom lamp-like hanging devices that illuminate the archway space and play audio of interviews, songs, lessons, and other recordings of people speaking some of the more than 700 languages spoken around New York City. Each unit in the array is assigned a different language. The installation cycles through several dynamic behaviors that highlight different sets of units over time: for example, playing only the units with recordings of music, or playing a section of units that moves like a wave from one end of the arch to the other.
Ideas
The starting point was a list of locations, provided by The Downtown Brooklyn + Dumbo Art Fund, where the funded artworks could be installed. The anchorage space under the Manhattan Bridge immediately jumped out at us as an especially interesting place for a work.
In the past the archway was closed to public access and used to store materials for repair work on the bridge, but it was restored and opened to the public in 2008. Nowadays it hosts occasional events but is mostly a quick way for people to get under the bridge, a landmark to meet at, or a place to sit for a bit. It’s a relatively dark space with a lot of noise from passing trains and nearby construction.
We thought it might suit a sound installation. The varying noise in the space would let our audio “hide” and then reemerge in the moments when the space quiets down, mostly at night once the trains start spacing out and the construction work has stopped for the day. We wanted to introduce a light component as well, both to direct the attention of passers-by to where the audio was playing and to make the space generally more welcoming.
Joe suggested the idea of reaching out to the Endangered Language Alliance (ELA) to see if we could use their interview audio for the installation. The ELA is a non-profit that was founded in 2010 to document Indigenous, minority, and endangered languages. Over the years they have conducted hundreds of interviews and collected various forms of data, which they share via maps and other projects. Thinking about how we could share that content in the form of an installation led us to the idea for our project.
We would highlight the varied backgrounds and lives of New Yorkers by playing audio of them speaking and singing in the many languages spoken around the city. Ideally the installation would remind city dwellers passing by that they live immersed in a vast wealth of human stories and culture. Unfortunately, that purpose only grew more urgent as the pandemic progressed while we built the project.
Initial Proposal
With a workable concept for the art in our heads, our next thoughts were about implementation. How would we approach building an installation that could play back hundreds of voices and produce lighting effects across the fairly large space under the anchorage (roughly 140' x 45')? How could we do that affordably? We would need hardware distributed across the space and a structure to support it, and some careful thinking to do it all within a reasonable budget and in a way we could actually execute.
Structure
Initially we wanted to minimize the structural engineering work, both because we expected it to drive up costs and because neither of us had much experience with it. We went looking for off-the-shelf structures that would meet our needs affordably. The “safe” option was a truss structure of the kind used to support lights and other equipment on stage at music shows. Our first rough 3D models of what the installation could look like used those.
(Early 3D model of the installation, by Joe in Rhinoceros)
We didn’t entirely like the look of the truss frame, so we kept thinking. It occurred to us that a structure you see all over NYC could be repurposed to support our installation: the scaffold walkway put up to protect pedestrians from falling debris when work is being done on a building’s facade. Because of legal and logistical factors...
Back in 2019 I worked on the software for another project with Joe, one of his installations called Space within Spaces. In that piece Joe had 180 lights hanging in a grid in the Juliana Curran Terian Design Center Atrium at Pratt Institute, animated based on particle detections from a muon detector. Each light was controlled by an ESP8266 over WiFi. The intent was to run the animations on a central computer connected to the detector and then stream them out to the array. With that many devices and the crowded airwaves on campus, getting enough performance to achieve the intended effect was a real technical challenge. In the end I did find a solution, and it inspired the general architecture that I ended up implementing for Babel in Reverse.
Fast WiFi Streaming to 180 Lights
I joined the project near the end, after the hardware had been installed in the Atrium and while the behavior was being programmed. There was already a first pass at the control software, but unfortunately it updated the array of lights extremely slowly: about one update every 10-20 seconds. That version had each of the ESP8266s join a WiFi network and run an HTTP server exposing a REST API to control its light. A Processing sketch generated the effects based on the incoming muon detections and then called the API of each light in turn to write an updated brightness value. Because several back-and-forths over the network were happening for every light in the array, there was a lot of radio traffic and consequently heavy congestion that slowed communication.
We needed to reduce the number and size of messages being sent per frame. Because we were using HTTP over TCP, each bulb update took at least five packets: three for the handshake to open the connection, one for the HTTP request, and one for the reply. Multiplied across 180 bulbs that meant at least 900 packets per frame, or 27,000 packets per second at 30 frames per second. Even setting aside the WiFi congestion issues, that wasn’t workable, so I went looking for a way to reduce the number of packets being sent.
Each ESP8266 could control the brightness of its bulb in 1024 increments between off and full brightness. Since 1024 is 2^10, a bulb’s brightness can be expressed in exactly 10 bits, which takes 2 bytes to transmit. Multiplied by the 180 bulbs in the array, that’s 360 bytes of brightness data, well under the network’s 1500 byte MTU. So a single packet could update the brightness of every bulb in the array at the same time. But how do we get that packet to every bulb without repeating it? And how does a bulb know which part of the packet to use to update its brightness?
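As a sanity check on those numbers, here is a minimal C++ sketch of that packing scheme. The function name and the little-endian byte order are illustrative choices on my part, not necessarily what the real sender used:

```cpp
#include <cstdint>
#include <cstdio>
#include <vector>

// Pack one 10-bit brightness value (0-1023) per bulb into 2 bytes each.
std::vector<uint8_t> packFrame(const std::vector<uint16_t>& brightness) {
    std::vector<uint8_t> payload;
    payload.reserve(brightness.size() * 2);
    for (uint16_t b : brightness) {
        payload.push_back(b & 0xFF);         // low 8 bits
        payload.push_back((b >> 8) & 0x03);  // top 2 of the 10 bits
    }
    return payload;
}

int main() {
    std::vector<uint16_t> frame(180, 512);            // one mid-level value per bulb
    std::vector<uint8_t> payload = packFrame(frame);
    std::printf("%zu bytes per frame (MTU is 1500)\n", payload.size());  // prints 360
}
```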
The first question can be answered with a network facility called a “broadcast address.” These addresses are treated specially by the network: if the network is configured to allow it, a packet with a broadcast address as its destination is delivered to every other client (read more in RFC 919). So we take our update, put it in a UDP packet, and send it off to the broadcast address. The router sees our packet and retransmits it to every other client on the network, including all of our ESP8266s. Each bulb looks at the packet, pulls out the brightness it should have, and uses that value to update the duty cycle of the PWM signal controlling the LED.
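The sending side then boils down to a single UDP socket with broadcasting enabled. Our actual effects generator was a Processing sketch; the POSIX-sockets C++ below is only a hedged illustration of the mechanism, with a made-up port and broadcast address:

```cpp
#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>
#include <unistd.h>
#include <cstdint>

int main() {
    uint8_t payload[360] = {0};  // 180 bulbs x 2 bytes each, packed as above

    int sock = socket(AF_INET, SOCK_DGRAM, 0);
    int enable = 1;
    // Broadcast sends are refused unless the socket explicitly opts in.
    setsockopt(sock, SOL_SOCKET, SO_BROADCAST, &enable, sizeof(enable));

    sockaddr_in dest{};
    dest.sin_family = AF_INET;
    dest.sin_port = htons(7777);                          // hypothetical port
    inet_pton(AF_INET, "192.168.1.255", &dest.sin_addr);  // hypothetical broadcast address

    // One packet updates every bulb; the access point rebroadcasts it to all clients.
    sendto(sock, payload, sizeof(payload), 0,
           reinterpret_cast<sockaddr*>(&dest), sizeof(dest));
    close(sock);
    return 0;
}
```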
How does the bulb know which value in the packet to use? If each bulb had a unique identity and we had a map of where each of them was in the array, we could pick an arbitrary order for the values in the packet and flash the map into the firmware so each bulb could look up its value in the right place. For the identity we can use the 32-bit chip ID of the ESP8266. In general it’s not guaranteed to be unique, but it was for the ESP8266s we were using. I figured out where...
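On the bulb side, that logic amounts to something like the Arduino-style ESP8266 sketch below. The chip-ID table, WiFi credentials, UDP port, and output pin are all placeholders of my own; the real firmware would bake in the full map covering all 180 bulbs:

```cpp
#include <ESP8266WiFi.h>
#include <WiFiUdp.h>

const char*    kSsid   = "network-ssid";  // placeholder credentials
const char*    kPass   = "network-pass";
const uint16_t kPort   = 7777;            // placeholder; must match the sender
const uint8_t  kLedPin = 5;               // placeholder GPIO driving the LED

// Map from ESP8266 chip ID to position in the 360-byte frame (placeholder entries).
struct Entry { uint32_t chipId; uint16_t index; };
const Entry kMap[] = { {0x00A1B2C3, 0}, {0x00D4E5F6, 1} /* ...the other bulbs... */ };

WiFiUDP udp;
int myIndex = -1;

void setup() {
  pinMode(kLedPin, OUTPUT);
  analogWriteRange(1023);                 // use the full 10-bit brightness scale

  WiFi.begin(kSsid, kPass);
  while (WiFi.status() != WL_CONNECTED) delay(100);

  // Look up where this bulb's 2 bytes live in each broadcast frame.
  for (const Entry& e : kMap) {
    if (e.chipId == ESP.getChipId()) myIndex = e.index;
  }
  udp.begin(kPort);
}

void loop() {
  if (udp.parsePacket() <= 0 || myIndex < 0) return;

  uint8_t buf[360] = {0};
  udp.read(buf, sizeof(buf));

  // Reassemble this bulb's 10-bit brightness and update the PWM duty cycle.
  uint16_t brightness = buf[2 * myIndex] | (uint16_t(buf[2 * myIndex + 1]) << 8);
  analogWrite(kLedPin, brightness);
}
```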
My collaborator, Joseph Morris, and I partnered with the Endangered Language Alliance (ELA) for the project. They have been working since 2010 to document Indigenous, minority, and endangered languages around New York City and beyond. Without audio from their extensive collection of interviews (most are on their YouTube channel, but also take a look at their language map) it would not have been possible to create our installation.
It took us more than two years to realize the project. Pandemic supply chain issues and work disruptions hit us hard, complicating fabrication of our custom hardware and upending our timeline.
In this blog my aim is to retroactively document the work that went into making this art installation a reality and how we overcame the challenges along the way, and hopefully to inspire others to take on creating their own public art. I’ll cover the technical bits and the logistics, and expand on some of the side quests I went on while trying to get this built within our budget and on time. There were many mistakes made along the way; I’ll dig into some of those and distill the lessons I’m taking away for my next projects.
The installation is up right now and will be until about May 2023. If you happen to live nearby, feel free to stop by for a visit! The address is 1 Anchorage Place, Brooklyn, NY.