Question: Choreographed wearables for dance show?

Hi - I'm just learning about PixelBlaze (I have some limited pixel experience). I'm trying to plan some wearable LEDs for a dance show and program a sequence to a song, and I'm trying to decide if this controller is the way to go. It seems to check a few important boxes:

  • small and lightweight
  • runs on a 5V battery, the same as the pixels
  • wifi connectivity
  • stores the sequence onboard, not relying on wifi to transmit all the data in real time.

I’m just not certain how well it lends itself to preprogrammed sequencing of multiple costumes to a song. From what I’ve read, am I right in assuming that a single unit running as an AP can coordinate the timing of patterns on multiple client devices, triggered by something like a button push or a signal from a connected computer?

I’ve been playing with xLights, and I like the intuitive ability to storyboard complex choreography across either isolated body parts or entire costumes/dancers. It looks like PB is not directly compatible with xLights. Is there a way to make them play together, or alternatively another program where I can do something similar with PB?

I’m also open to a different approach, as I see you can program timed pattern sequences with PB. Is there something analogous to the graphical interface of xLights that would make it easy to time a pattern sequence to a song and quickly visualize it during development? Can the PB software isolate specific portions of the LEDs attached to it (if I want to light up just the right leg, for example) and apply a pattern only to that specific segment? I’m comfortable copying someone’s code and making minor tweaks to it, but I don’t want to take on any heavy coding for this.

I’ve been playing with a basic NodeMCU ESP board running WLED, which plays well with xLights, but I’m finding it hard to find a controller that ticks all of the PixelBlaze boxes listed above.

Thanks for any suggestions and experiences you can share.

Hi @DocNaes!

I think you understand most tradeoffs correctly. A few clarifications:

The easiest way to do much of this is to have a computer (or RPi, if you can find one right now) run a Firestorm server. This lets you clone the pattern code from one Pixelblaze to all the others, and trigger new patterns so they activate at the exact same time on all your costumes. It’s easiest with a human following a cue sheet, activating different patterns by clicking/tapping them once in Firestorm; you can automate the “next pattern” action, though, if you’re familiar with JS web development and comfortable using APIs.
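To give a feel for what automating that could look like, here’s a rough sketch of a “next pattern” helper. The `/command` endpoint and payload shape are assumptions based on my reading of the Firestorm README (verify against your copy), and the cue names are made up:

```javascript
// Hypothetical cue sheet: pattern names as saved on the Pixelblazes.
const cueSheet = ["Intro Sparkle", "Verse Rainbow", "Chorus Strobe"];
let cueIndex = 0;

// Build the payload Firestorm would broadcast to the listed controller ids.
// (Endpoint and shape are assumptions; check Firestorm's own docs.)
function buildCommand(programName, ids) {
  return { command: { programName }, ids };
}

// Advance to the next cue and POST it to a Firestorm server.
async function nextCue(firestormHost, ids) {
  const programName = cueSheet[cueIndex++ % cueSheet.length];
  await fetch(`http://${firestormHost}/command`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildCommand(programName, ids)),
  });
  return programName;
}
```

You could wire `nextCue` to a single on-screen button, so the human operator only ever taps one thing during the show.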

If you need more fine-grained control - for example, sending all Pixelblazes an event signal like a button tap, or selecting a color - this can be done with the websockets API.

@wizard - Without Firestorm, if one is in AP mode and the rest are clients, do the timebases for time() sync?

I don’t think so, unfortunately - the paradigms and assumptions are fairly different.

I wrote a pattern that tries to do this, but many people find it a little cumbersome to understand, tune, and run. There’s no UI to choreograph events on a timeline.

Yes - see the Multisegment Demo pattern, currently near the bottom of page 4 of the pattern library.
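For a feel of what segment isolation involves, here’s a minimal sketch in the spirit of that pattern (not its actual code). The segment boundaries are made up; on a real Pixelblaze, `render` would be an `export function` and the `hsv()` builtin would drive the LEDs, but here it just returns an HSV triple so the logic can run anywhere:

```javascript
var pixelCount = 200; // assumed strip length
// Hypothetical per-costume segment map: [start, end) pixel indices.
var segments = { rightLeg: [0, 60], leftLeg: [60, 120], torso: [120, 200] };
var active = "rightLeg"; // only this segment gets the pattern

function inActiveSegment(index) {
  var s = segments[active];
  return index >= s[0] && index < s[1];
}

// Returns [h, s, v] for a pixel; black outside the active segment.
function render(index) {
  if (inActiveSegment(index)) {
    return [index / pixelCount, 1, 1]; // e.g. hsv(index / pixelCount, 1, 1)
  }
  return [0, 0, 0]; // off
}
```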

I think you’re right on the edge with some of the requirements. If you can have a human tap a pattern on Firestorm as part of the performance, the code will be much lighter.

Best of luck to you!

Yes, a PB in AP mode will act as a time synchronization coordinator, but doesn’t change the pattern on the other PBs (yet).

Syncing animation timebases isn’t enough, since you need to be able to control when everything starts. The low-level tools are there to build something, though!

There are 2 challenges here:

  1. Synchronizing the start of the song/scene across all controllers. For this, I would use the ability to read/write variables over the network using getVars and setVars to create a command system. It could be fairly simple, like a shouldRun variable that defaults to false and can be changed to true. Once every controller receives the relatively small/low-bandwidth command, they can run in sync without further network traffic. This could be hacked onto Firestorm for coordination via a button on screen or some other event.
  2. Scripting the choreography. This could be done by hand-coding through some helper framework, possibly as a big array literal, e.g. [rainbow, 1, 12.5] to play the “rainbow” pattern on segment 1 at 12.5 seconds in, with a bunch of lines like that to set things up.
    Another option would be to convert some other timecode format that has an editor into something like the above. Perhaps the xLights file format could be parsed and used, even if the patterns and other concepts are a bit different. For example, here’s a snippet of the xLights XML file:
<Element type="model" name="Candy Canes">
  <EffectLayer/>
  <Strand index="1">
    <Effect ref="0" name="Fireworks" startTime="0" endTime="1000" palette="0"/>
    <Effect ref="1" name="Ripple" id="1" startTime="1000" endTime="2000" palette="0"/>
    <Effect ref="2" name="Fill" id="2" startTime="2000" endTime="4900" palette="0"/>
  </Strand>
</Element>
<Element type="model" name="Circle">
  <EffectLayer>
    <Effect ref="3" name="Kaleidoscope" startTime="0" endTime="4900" palette="0"/>
  </EffectLayer>
</Element>
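To sketch how that conversion might go - producing cues shaped like the [name, segment, seconds] array literal from point 2, with a made-up model-to-segment mapping; a real converter would use a proper XML parser rather than regexes:

```javascript
// Simplified copy of the xLights snippet above.
const sample = `
<Element type="model" name="Candy Canes">
  <Strand index="1">
    <Effect ref="0" name="Fireworks" startTime="0" endTime="1000" palette="0"/>
    <Effect ref="1" name="Ripple" id="1" startTime="1000" endTime="2000" palette="0"/>
  </Strand>
</Element>`;

// Hypothetical mapping from xLights model name to a costume segment number.
const segmentForModel = { "Candy Canes": 1, "Circle": 2 };

function toCues(xml) {
  const cues = [];
  const modelRe = /<Element type="model" name="([^"]+)">([\s\S]*?)<\/Element>/g;
  for (const [, model, body] of xml.matchAll(modelRe)) {
    const effectRe = /<Effect[^>]*name="([^"]+)"[^>]*startTime="(\d+)"/g;
    for (const [, name, start] of body.matchAll(effectRe)) {
      // xLights times are in milliseconds; the cue table uses seconds.
      cues.push([name, segmentForModel[model], Number(start) / 1000]);
    }
  }
  return cues;
}

// toCues(sample) -> [["Fireworks", 1, 0], ["Ripple", 1, 1]]
```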

Thanks for the feedback. I have a couple of PBs on the way that I can play with to see what I can do. I imagine I’ll have more questions!