Dancing LED Stickman

Hello:

I'm developing a Swift app with a dancing AR stickman made of LEDs. How can I capture the websocket frames for the live pattern preview so that I can apply them to the LEDs on the stickman?

Thanks for any thoughts or suggestions,

If you can program in Python, you can use NickWaterton/Pixelblaze-Async on GitHub, an MQTT interface and async client for Pixelblaze devices.

Using this library, you can enable preview frames for your pattern.

If you then override _handle_binary_data(), you can access the bytes of the preview frames.
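
As a rough illustration, the override might look like the sketch below. This is not tested code: the import path, the class name, and whether the handler is a coroutine are assumptions on my part, so check the library's source and examples for the real details.

```python
# A minimal sketch, assuming the client class and import path shown
# here -- verify both against the Pixelblaze-Async README, and check
# the library source for whether _handle_binary_data() is a coroutine.
from pixelblaze_async.PixelblazeClient import PixelblazeClient  # assumed path


class PreviewCapture(PixelblazeClient):
    """Client subclass that intercepts binary websocket frames."""

    async def _handle_binary_data(self, data):
        # Hand every binary frame to our own hook, then let the
        # library carry on with its normal processing.
        self.on_binary_frame(bytes(data))
        await super()._handle_binary_data(data)

    def on_binary_frame(self, data: bytes):
        # Placeholder hook: filter for preview frames here and push
        # the colours to your LEDs (see the frame layout below).
        print(f"got {len(data)} bytes")
```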

Each preview frame has (5,4) as its first two bytes, followed by the r,g,b values as bytes, three bytes per LED.
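
Given that layout, decoding a frame is just a matter of skipping the two header bytes and reading three bytes per LED. Something like this (the helper name is mine):

```python
def parse_preview_frame(data: bytes):
    """Split a preview frame into (r, g, b) tuples, one per LED.

    Byte 0 is 5 and byte 1 is 4; the rest of the frame is raw
    r,g,b byte triples.
    """
    if len(data) < 2 or data[0] != 5 or data[1] != 4:
        raise ValueError("not a preview frame")
    payload = data[2:]
    # Ignore any trailing bytes that don't form a complete triple.
    usable = len(payload) - len(payload) % 3
    return [(payload[i], payload[i + 1], payload[i + 2])
            for i in range(0, usable, 3)]
```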

Read the documentation for how to use it (you don't need to use MQTT), and look at the examples for working code.

If you want to use the websocket in Swift, the Python code should give you a good starting point for how the protocol works.

Let me know if you need help.

Nick:

I really appreciate the information. I don't code in Python, but I'll start looking at the documentation to get a better understanding of how to use the library.