DESIGN & CONSTRUCTION
Building the Sonic Runway took countless hours of research to figure out all the details of construction and deployment. We have gotten many inquiries about how we built it, so I'm putting together this page to share what we learned, and give back to the community.
- 32 gates, 1000 ft.
- Rolled steel tubing, welded pairs for strength
- Trenched and buried A/C and ethernet cables
- APA-102c LED strips
- Lumigeek controller, ArtNet signals
- Microphone on the first gate
- Mac mini 'base station'
- Custom software: C++ / OpenFrameworks
- Many zip ties!
There were 32 gates, 32 feet apart. The gates were made of 1-3/8" OD steel tubing, which we rolled to a gentle arc. The tubing came in 10' lengths, and we combined three of these end-to-end to span 270 degrees of a circle with a roughly 12' diameter. For strength, we welded pairs of gates together with small spacers in between. So each gate consisted of three welded pairs of 10' tubes, which we slipped (and hammered) together on site and then secured with self-tapping screws to prevent twisting.
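As a sanity check on the geometry, a 270-degree arc of a roughly 12' diameter circle comes out just under the 30 ft of tubing per side, with the remainder available for trim. A quick sketch of the arithmetic (the diameter is approximate):

```cpp
const double kPi = 3.14159265358979;

// Arc length of a partial circle: (degrees / 360) * pi * diameter.
double arcLengthFt(double diameterFt, double degrees) {
    return kPi * diameterFt * (degrees / 360.0);
}
// arcLengthFt(12.0, 270.0) is about 28.3 ft -- close to the 30 ft of
// rolled tubing, with a couple of feet to spare for trimming.
```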
Each side had a perpendicular foot, mounted on a wood base, which was then staked to the ground with 14" lag screws. Set screws prevented the gate from rotating out of the plane, but we deliberately allowed the structure to flex from side to side. Our goal was to build something that would be strong enough to withstand people crashing bikes into it (which happened frequently!), but also flexible enough to make it obvious that any attempts to climb it would be futile.
Each gate had a single strip of 5V APA-102c LEDs, 30 LEDs/m, for a total of 277 LEDs per gate. We ordered these to custom length from Ray Wu's Ali Express shop. Ordering from China is stressful, but Ray Wu was responsive and delivered as promised, albeit somewhat slower than we hoped. We chose APA-102c at the recommendation of Christopher Shardt (creator of Firmament and LED Lab, and many other cool projects), because they have faster refresh rates and tend to photograph better. We used IP65 (glue sealing) because it came with the adhesive backing, which made it more convenient while we were assembling the diffusers (see below). I would have preferred to use the IP68 for durability reasons, but this apparently isn't possible with such long strips. I ordered the 'waterproof' barrel connectors, though these turned out to be problematic.
During construction, and for safety in case the generator failed, we set up solar lanterns on the side of each gate. These work well if you gently pound the metal tube into the playa (instead of using the plastic stake).
We spent a lot of time trying to find a good solution for diffusers. After many false starts, here's where we ended up. It worked and looked nice, but was among the most labor intensive parts of the project. If anyone out there knows of good off-the-shelf solutions for flexible LED diffusers, let me know!
We used this polyethylene tubing, split down the side. We got some H-channel from Home Depot, which comes in 10' white plastic lengths. We trimmed the top side of the H-channel with a table saw so it was the same width as the LED strip, and adhered the strip to the top. We drilled holes through the narrow part of the H-channel and fed a few thousand twist-ties through to hold the strip onto the top (we didn't trust the tape). Then, we slipped the split tubing over, such that it fit in the gap of the H-channel. The nice thing about this design is that it remains flexible, and it wants to stay together by default. The down side is that it was a fair amount of labor to put it all together!
Any time you're running power over a long distance, you need to think carefully about the power requirements, voltage drop and wire gauge. After many long discussions, here's what we settled on.
There was a 3000 Watt generator located off to the side of the installation, at the halfway point. This may have been overkill -- when we were finally able to measure the power consumption, it seemed to be closer to 1500 W.
We ran solid 14 AWG UF-B wire from the generator to each gate, with a breaker for each half. We pre-wired a handy-box and outlet at the correct intervals, so we could just un-spool the cable on site and end up with outlets at the right places.
Each gate had its own 100W 5V Meanwell power supply, to convert the AC power to what we needed to drive the controller and the LED strip. This plugged into the outlet using a standard power cord, trimmed and attached to the power supply terminals.
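For anyone sizing a similar run, here's a back-of-the-envelope voltage-drop estimate for one 16-gate half of the runway. It assumes roughly 0.4 A per gate (about 50 W at 120 V, consistent with the ~1500 W total we measured) and the standard ~2.5 Ω per 1000 ft resistance of 14 AWG copper; this is an illustrative sketch, not our actual design calculation:

```cpp
// Estimate the voltage at the far end of one half of the run.
// Gates tap off at regular intervals, so the segment nearest the
// generator carries the summed current of all downstream gates.
double endOfLineVoltage(double sourceVolts, int gates, double spacingFt,
                        double ampsPerGate, double ohmsPerFt) {
    double v = sourceVolts;
    for (int i = 1; i <= gates; ++i) {
        double downstreamAmps = ampsPerGate * (gates - i + 1);
        v -= 2.0 * ohmsPerFt * spacingFt * downstreamAmps;  // x2: hot + neutral
    }
    return v;
}
// With 16 gates, 32 ft spacing, ~0.4 A each, and 14 AWG (~0.0025 ohm/ft),
// the last outlet sees roughly 111 V -- about a 7% drop.
```

That's within what a switching power supply tolerates comfortably, which matches our experience on the playa.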
All the signals were sent over standard Cat-5e unshielded ethernet cables, using the ArtNet protocol. I spent a lot of time worrying about signal issues, particularly because these were difficult to test beforehand. Originally, I was planning on daisy-chaining all 32 gates together, with an ethernet switch at each one. Some people seemed to think this would work just fine; others thought it would definitely fail because too many 'hops' would introduce signal delays. We ended up making every 6th station a 'hub' with a switch, with separate cables out to its neighbors. The hubs were daisy-chained together, but this way the number of hops to any given gate was minimized. The downside was that the wiring was a lot more complicated. In a future iteration, I'd definitely want to experiment with other network topologies that might be easier to install.
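To make the hop-count tradeoff concrete, here's a rough comparison. It assumes 0-indexed gates, a switch at every gate in the daisy-chain layout, and (in the hub layout) a switch only at every 6th gate with spokes plugged straight into their hub; the exact cabling details are illustrative, not an exact record of our wiring:

```cpp
// Switches traversed from the base station to gate g (0-indexed).
int daisyChainHops(int g) { return g + 1; }      // one switch per gate
int hubSpokeHops(int g)   { return g / 6 + 1; }  // hubs at gates 0, 6, 12, ...
// Worst case over 32 gates: 32 hops daisy-chained vs. 6 with hubs.
```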
The other issue we worried about was interference with the AC line. Electrical codes require a separation between AC and ethernet (signal) cables mostly for safety, but in part because it is possible for there to be interference between them. We made a minimal effort to separate the cables by a few inches in the trench, and then crossed our fingers and hoped for the best! As it turns out, we didn't notice any signal issues, despite having ethernet cables running alongside the AC lines for 1000 feet.
The LEDs were driven with a Lumigeek controller. Lumigeek is a small company formed by JoeJoe Martin and John Parts Taylor, who have worked on many Burning Man projects over the years (including the Koi Pond and Plug'N'Play in 2016). We were very fortunate to be connected with JoeJoe and Parts early on, since they basically told us everything we needed to know about how to get the electronics for the project off the ground!
They have a proprietary controller that accepts ArtNet and can drive 2 strips (WS2811 or APA-102c), and they were generous enough to do a production run for our project (Thanks, guys!). The Lumigeek controller is similar to a PixelPusher or AlphaPix -- it accepts ArtNet signals and can be programmed to drive either APA-102 or WS2811 strips, but it's smaller and was more cost effective for our particular needs.
We wanted the sensitive electronics to be sealed from the elements, so we enclosed them all in sealed boxes. The BUD Industries NBF series had all the right features and was more reasonably priced than many alternatives. Each type of cable exiting and entering the enclosure needed a different size cable gland. The trickiest was the ethernet cable. We used these glands, which are designed to allow the entire head of the cable to pass through the gland so you don't have to re-crimp the end. We used short ethernet patch cables to exit the box, and then connected them on site to the buried cables using cat-5e couplers (see 'issues' below!).
We mounted the electronics enclosures on plywood, and attached them to the playa with 8" lag screws. To hide all the messy wiring, we covered the enclosures and wiring with these cheap baskets.
A 30 ft length of 5V LED strip shows significant discoloration by the far end due to voltage drop, especially if you try to blast all white. The strips we ordered had separate 2-wire barrel connectors on each end for 'power injection'. We ran 14 AWG speaker wire along the back side of the diffusers, connecting the leads on the front and the back sides of the strip. There was still some discoloration near the top (the point farthest from the power source), but it wasn't noticeable in practice, and at least it was symmetric.
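The payoff of injecting power at both ends can be shown with a toy model: treat the strip as a chain of LEDs with uniform per-segment resistance and uniform current draw. Feeding both ends effectively splits the strip at its midpoint, cutting the worst-case drop by roughly 4x. The resistance and current values below are placeholders, not measurements from our strips:

```cpp
// Worst-case voltage drop along a strip with uniform load.
// With both ends fed, symmetry splits the strip in half at the midpoint.
double worstDrop(int leds, double ampsPerLed, double ohmsPerSegment, bool bothEnds) {
    int n = bothEnds ? (leds + 1) / 2 : leds;
    double drop = 0.0;
    for (int seg = 1; seg <= n; ++seg)
        drop += ohmsPerSegment * ampsPerLed * (n - seg + 1);  // segment carries all downstream current
    return drop;
}
// For a 277-LED strip, worstDrop(..., false) is about 4x worstDrop(..., true).
```

With injection at both ends, the worst spot moves to the middle of the strip, which is why the residual discoloration we saw was at the top of the arch and at least symmetric.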
All the gates were centrally controlled with a Mac mini located off to the side of the 3rd gate, housed in a plastic bin and enclosed in a deck box. We also had a second deck box to store some tools, gear, water (and beer!) since our installation was in the deep playa and VERY far from our camp.
We specifically ordered a Mac with only a solid-state drive – fewer moving parts means less chance of getting destroyed by the dust. The Mac was powered off the same generator (from gate 16), protected by a UPS power strip so it would shut down gracefully if the generator failed. An ethernet cable connected it to the nearest 'hub' station. We also had a keyboard, mouse, and monitor set up so we could hack the software on-playa, which we continued to do through Tuesday!
Our original plan was to use a radio transmitter to get a good clean signal out of whatever art car was 'playing' the runway. We also brought a microphone so we could pick up sound when there was no art car present. It turned out that the audio from the microphone was sufficient to pick up the music from the art cars, so we ended up not bothering with the radio transmitter. That was good, because it meant that each car could just roll up and start playing, without having to fuss with wiring and debugging. We mounted the microphone to the top of the first gate with some zip ties, and ran the XLR cable back to the base station. We piped it through an old DJ mixer to convert the balanced XLR to RCA output and fed that into the microphone-in jack on the computer. Our software monitored the audio levels and gradually adjusted a maximum 'threshold'. That way we could display reasonable visualizations across a fairly broad range of input signal.
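The adaptive threshold can be sketched as a peak follower: jump up to any new peak, decay slowly otherwise, and normalize incoming levels against it. This is a minimal reconstruction of the idea, not the actual Sonic Runway code, and the decay constant is a made-up placeholder:

```cpp
#include <algorithm>

class AdaptiveLevel {
  public:
    // Map a raw audio level to roughly 0..1, tracking a slowly decaying peak.
    double normalize(double level) {
        if (level > threshold_) threshold_ = level;  // jump to new peaks
        else threshold_ *= 0.999;                    // slow decay per audio frame
        if (threshold_ < 1e-6) return 0.0;           // silence: avoid divide-by-zero
        return std::min(1.0, level / threshold_);
    }
  private:
    double threshold_ = 0.0;
};
```

The effect is that a quiet art car and a loud one both drive the full visual range after a few seconds of adaptation.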
We also had a wireless router in the base station. This set up a wifi network so we could control the patterns of the Runway with an iPad running TouchOSC.
There are a lot of different possible ways to write control software for this kind of installation. The software had to be able to:
- perform audio analysis (beat detection, FFT, etc)
- broadcast ArtNet signals
- render a pre-vis of the installation, since we wouldn't be able to set the thing up until we got out there
- maintain consistent timing, to achieve the 'speed of sound' effect
- allow multiple collaborators to contribute patterns
We ended up with C++, built on top of OpenFrameworks. I went this route because I'm familiar with C++ from work, and was able to get the ArtNet output going using some open source extensions without too much fuss. There are also tons of audio processing libraries available in C/C++, so we were well covered there, and it wraps up OpenGL for the pre-vis. In retrospect, the main downside of OpenFrameworks is the lack of great UI support – you basically have a big OpenGL canvas, but no real support for dialogs and other things that would be standard in any other toolkit. C++ is also not the most friendly language for folks who are not familiar with it. That said, because you have very low level access, you're unlikely to run into something that isn't possible, given enough googling and grunt work.
In general, the software analyzes the incoming audio stream (using ofxAubio and essentia) and provides the analysis to a variety of patterns. These patterns are responsible for 'rendering' to a buffer that is basically a 32 x 277 image. That image is updated 36 times per second, and broadcast to the gates (each column in the image corresponds to a strip in a gate). So, the Runway is basically a big low-resolution screen that happens to be stretched to 1000 feet in one direction and curved into a partial cylinder in the other.
We calibrated the frame (refresh) rate so that a pattern that traveled down the Runway at a rate of 1 gate per refresh would move at the speed of sound. That turned out to be 36 frames per second, given the physics of sound and the scale of the installation.
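The arithmetic behind that number is just the speed of sound divided by the 32 ft gate spacing. A quick check using the standard temperature formula (the playa temperature here is a guess):

```cpp
#include <cmath>

// Speed of sound in air at tempC degrees Celsius, in ft/s.
double speedOfSoundFtPerSec(double tempC) {
    return 331.3 * std::sqrt(1.0 + tempC / 273.15) * 3.28084;  // m/s -> ft/s
}

// Refresh rate that moves a pattern one gate per frame at the speed of sound.
double runwayFps(double tempC, double gateSpacingFt) {
    return speedOfSoundFtPerSec(tempC) / gateSpacingFt;
}
// In warm desert air (around 35 C), this comes out to roughly 36 fps.
```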
Many people asked us if we did anything special to survey the site and make sure everything was all lined up. We considered trying to use some kind of laser (mostly because that sounds really cool!), but in the end it worked to just use a measuring tape and line things up by sight. Stakes were first installed at 32 ft intervals and lined up by eye. Next, string was pulled progressively between sets of six stakes at a time, and the stake locations were fine-tuned by sighting down the string. The result was very accurate. The string was left in place so that the feet could be installed parallel.
Generally, the build and install went pretty smoothly. Here are some issues we dealt with along the way.
The Achilles heel turned out to be the ethernet couplers. These filled with dust and did not provide a robust connection. We spent a few evenings walking the length of the Runway, blowing out couplers and replacing them until we got all the gates running. Once they were working, they tended to stay functional, despite being buried in dust! If we build it again, we'll probably use these connectors instead, even though it means re-crimping the ethernet cables.
The barrel connectors on the LED strips we ordered were not robust. A few of them suffered broken pins and had to be replaced. It was also difficult to see the orientation of the plug, and if you plugged it in the wrong way you risked frying the strip. We destroyed many strips this way, and had to order a whole bunch of backups. In a future iteration, we'd probably want to replace these with something more robust.
The gates sustained many impacts from bicyclists throughout the week. We witnessed a few of these crashes, but also saw evidence of many more. Thankfully, the gates held up fairly well (in part due to the pivoting/sliding design of the base tee). Hopefully the people were ok too! One of the gates near the front got fairly mangled -- we're not sure if people were hanging on it, or if a car crashed into it.
Our deck boxes for the base station were extremely useful, but not terribly durable. After many hours of people sitting on them, we ended up having to reinforce them with some additional screws and plywood.
Thankfully, we didn't suffer an intense rain storm this year! It seems likely that the main electronics would have survived, but the exposed couplers and outlets may not have. If we do it again, we'll probably aim to find a solution that encloses all of these in waterproof enclosures. However, it wasn't practical to do that on the first iteration.
IN THE END
This project was only possible because of the enormous contributions of time and expertise from many people. We are extremely grateful for all their support. If you have any questions about the project, please feel free to ping us at firstname.lastname@example.org.
Written by Rob Jensen