Here’s a quick edit of @wizard and me using wireless sync with 28 Pixelblazes for the Denver Broncos Christmas Eve halftime show tonight.
Many thanks to the Denver Cheerleaders and Glowy Zoey for inviting us to do this collaboration.
Amazing! Beautiful show! My mind is blown!
When time permits, I’d love to hear more detail about how you made this show happen: how you set it up on the Pixelblazes, how you controlled the show, what the challenges of working in that environment were, and so on.
My gosh. That is truly impressive. Well done guys.
Awesome work! Bravo!
Strong +1 to this. The video is amazing but I’d love to hear more behind the scenes!
Amazing! I’ve had trouble getting more than 10 or 11 to sync before overloading the networking on the master node. How did you manage so many?
I did discover some issues with 28 at first, so these are running a build with some adjustments that, with more testing, will make it into a release.
As he mentioned, Wizard made a custom firmware build with the leader/follower subscription packets and time sync packets optimized. I predict he’ll get that into the next firmware release.
He also made a pattern that was heavily optimized for a small compiled size. We felt this would maximize the chances of successful sync between 28 followers and a leader. A trick Wizard taught me is that in the web inspector console, you can check your compiled program size in bytes (the ×4 converts 32-bit words to bytes) with:
> program.compiled.length*4
We used plenty of magic numbers for binary vectors, multiple embedded maps, and other optimizations.
The code uses a keyframe timeline design pattern that specifies the durations between subsequent keyframes. I’m not sure we can share the full source, as it was developed on spec for the Broncos, but the general form is like this, with a simplified sketch of the engine after the example:
keyframe(interval, follower or follower group, subpattern, colors)
// Start at 0 seconds with all off
keyframe(0, ALL, solidColor, black)
// Fade in all to Christmas colors driven off nodeId over 8 seconds
keyframe(8, ALL, node5Colors, christmas)
// After the 8 second fade in, fade all to black in half a second,
// then quickly fade in just dancer 1 to pink twinkles for 8 seconds
keyframe(0.5, ALL, solidColor, black)
keyframe(0.5, DANCER1, twinkles, pink)
keyframe(8, DANCER1, twinkles, pink)
// Establish a keyframe for all the other nodes so they don't crossfade into
// the next pattern during the last 8.5 seconds
keyframe(8, ALL, solidColor, black)
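// Finish with a hard cut (interval 0) to a beat-synced white flash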
keyframe(0, ALL, suddenFlashOnBeat, white)
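To make the design pattern concrete, here’s a minimal sketch of how such a keyframe player could be structured in a Pixelblaze pattern. This is an illustration, not the show’s actual engine; the group constants, array sizes, and subpattern signature are all made up for the example:

// Minimal keyframe-player sketch (illustrative only; not the show code)
var ALL = 0
var DANCER1 = 1
var myGroup = DANCER1  // in practice, derived from each node's identity

var MAX = 32
var durs = array(MAX)  // seconds to crossfade into each keyframe
var grps = array(MAX)  // which follower group each keyframe addresses
var pats = array(MAX)  // subpattern function for each keyframe
var cols = array(MAX)  // color parameter handed to the subpattern
var count = 0

function keyframe(dur, group, subpattern, color) {
  durs[count] = dur
  grps[count] = group
  pats[count] = subpattern
  cols[count] = color
  count++
}

// Example subpatterns; brightness tracks the keyframe's fade progress
function blackout(index, fade, hue) { hsv(0, 0, 0) }
function solidColor(index, fade, hue) { hsv(hue, 1, fade) }

keyframe(0, ALL, blackout, 0)      // hard cut to black
keyframe(8, ALL, solidColor, .33)  // 8-second fade up (green stand-in)

var elapsed = 0
var cur = 0
var fade = 1
export function beforeRender(delta) {
  elapsed += delta / 1000  // delta arrives in milliseconds
  // Once the current crossfade completes, move on to the next keyframe
  while (cur < count - 1 && elapsed >= durs[cur]) {
    elapsed -= durs[cur]
    cur++
  }
  // 0..1 progress through the crossfade into the current keyframe
  fade = durs[cur] > 0 ? clamp(elapsed / durs[cur], 0, 1) : 1
}

export function render(index) {
  // Simplification: each keyframe fades itself in via the fade argument;
  // the real design crossfades between adjacent keyframe states.
  if (grps[cur] != ALL && grps[cur] != myGroup) {
    hsv(0, 0, 0)  // nodes outside the addressed group stay dark here
    return
  }
  pats[cur](index, fade, cols[cur])
}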
We tried to code defensively for what we assumed would be a very hostile WiFi environment, which is why we made the main show timeline a single pattern. That way, if followers lost reliable contact with the leader, we knew the rest of the 7-minute show would still execute correctly, instead of relying on the playlist to distribute new pattern code on a precise timeline.
The wireless sync and playlist features were still essential to pulling this off, for two reasons. First, we used a pattern to visually confirm that all 28 followers’ time() timebases were perfectly synced. It blinked a different color each second for 4 seconds, followed by a 4-second slow fade-out. You can see this in the leader video below.
// Clock Sync Pulse
export function beforeRender(delta) {
  // time(interval) loops every 65.536 * interval seconds
  t1sec = time(1 / 65.536)  // 1-second sawtooth
  t4sec = time(4 / 65.536)  // 4-second sawtooth
  tMode = time(8 / 65.536)  // alternates blink/fade phases every 4 seconds
}

export function render(index) {
  if (tMode > .5) {
    // Blink a different color 4x, once a second, on every 10th pixel
    hsv(t4sec, 1, (index % 10 == 0) * square(t1sec, 1/10))
  } else {
    // Red fade-out over 4 seconds
    var v = (index % 10 == 0) * (1 - t4sec)
    hsv(0, 1, v * v)
  }
}
This Clock Sync Pulse pattern was set to a very long duration (something like 5 days), so you’d have to manually advance past it using the leader’s button. The next pattern in the playlist was a solid green pattern with a 20-second duration. After triggering the button once to advance the playlist to this green pattern, I would wait to see all costumes turn full green. That way I knew they would all co-launch the next pattern (the larger main timeline pattern) at exactly the same time.

That’s the second essential sync feature: the playlist in sync mode. It uses the current pattern’s running time to distribute the code for the next pattern and pre-orchestrate an exact moment in time for all followers to start it. This method ensures a very tight synchronized launch that’s more accurate than the multi-controller trigger we used to send from Firestorm.
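For reference, a solid green ready-check pattern like the one in that playlist slot is about as small as a pattern gets; something like:

// Solid green "ready check" - all good when every costume shows this
export function render(index) {
  hsv(.33, 1, 1)
}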
In addition, Wizard made a few group-management command-line scripts that were useful for contingencies. I’ll spare you the details since we didn’t end up needing them, but imagine things like pushing configuration or new firmware to 29 controllers at once.
Wizard has a special board he can produce for custom work that includes an external antenna, but we actually didn’t need that for this job.
The leader Pixelblaze was enclosed in a 3D-printed enclosure that also held a USB power bank, a single output LED, and two buttons wired in series. One switch with a spring-loaded safety cover “armed” it, and the other, a momentary pushbutton, completed the circuit between the GND and BTN pads on the underside, which advances the playlist to the next pattern.
The followers were embedded in the back pockets of the Glowy Zoey suits next to the combined 12V/5V USB power pack. The noise on the 5V output made us nervous, so the follower enclosures accommodated a large capacitor across the power rails. To minimize space, we designed around a right-angle micro USB plug that could be zip-tied with the LED outputs to the strain-relief shelf. We used plain PLA for maximum RF transparency.
Lots of prototyping, then lots of production.
We put together a WiFi backpack with dedicated networking gear. I’ve used Cisco Aironet before, but I appreciate the simplicity of Ubiquiti.
Outside the technical implementation, a lot of what we did well was in communication and client service. One thing in the final performance, though, left us eager to improve.
Synchronizing the timeline pattern launch to the stadium’s audio track was accomplished via a human countdown and cue call on comms between a coordinator at field level and the DJ in the audio control booth. This method is subject to human timing error and excitement. Given the stadium’s sound propagation delay, most audience members couldn’t tell there was an unwanted sub-second offset between the lights and audio, but I could. As an area for improvement, we’d like to look into automating this link, perhaps with an external wireless OSC message driving a GPIO trigger on the Pixelblaze.
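A sketch of what the Pixelblaze side of that could look like, assuming a hypothetical external box that converts the wireless OSC cue into an active-low pulse on a free GPIO pin:

// Hypothetical cue watcher: an external OSC-to-GPIO bridge pulls the
// pin low on the "go" call, and we latch it to start the timeline.
var CUE_PIN = 26  // any free GPIO; 26 is just an example
pinMode(CUE_PIN, INPUT_PULLUP)

var triggered = 0
export function beforeRender(delta) {
  if (!triggered && digitalRead(CUE_PIN) == 0) triggered = 1
}

export function render(index) {
  // Placeholder: dim red standby, solid white once cued
  if (triggered) hsv(0, 0, 1)
  else hsv(0, 1, .1)
}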
I love all these details. The gap between “it works in a safe environment where it’s okay if something goes wrong” and “this is a hostile environment that allows for limited testing, where everything MUST go right, perfectly, the first time” is huge, and I appreciate the insights into how you handled those constraints.
Wow!!! That’s amazing!
I’m really looking forward to those firmware updates now.
What an awesome display! I can only imagine the amount of work and planning that went into this performance. This was a top level demonstration of Pixelblaze capabilities.
I was just getting ready to post a question about the current 9-follower limit and was guided into re-reading this post. I was thinking about hardwiring matching displays to one controller because I was already using nine. Super psyched! Just need more controllers.
No pressure (just happy to know that the current limits are temporary), but is there an approximate time frame for the new firmware?
I thought the 9-device limit was just for WiFi. Is it a follower limit? They provided their own WiFi at the halftime event, so that limit was avoided.
There is a hard 10-client limit for Pixelblaze AP-mode WiFi. You wouldn’t want more than 9 clients/followers in that setup, because your phone/laptop wouldn’t have a spot.
There is a recommended limit of roughly 10 followers; 10 followers work well with all supported features.
You can have more than 10 followers when you use a dedicated WiFi router.
Some issues may crop up if you connect a lot more than 10. Problems usually arise when you power everything up at the same time, or when sharing sensor board data with lots of followers. I found issues with 30 followers and no sensor board data, where sometimes it would take a long time for things to sync up, but it wasn’t completely broken.
I’m looking to do 19 followers and 1 master with a sound controller… I would be using a fast router with nothing else on it, about 10-15 feet away… Each Pixelblaze would only be controlling 75 pixels. Do you think that would work relatively reliably? And would the number of pixels being controlled make a difference here?
I haven’t tested with that number, but I suspect it would be pushing limits with the sensor board and bandwidth. It might work, or it might be marginal and skip some of the sensor board update frames.