I am working on a Swift app with a dancing AR stickman made of LEDs. How can I capture the WebSocket frames from the live pattern preview so that I can apply them to the LEDs on the stickman?
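A minimal sketch of one way to do this on iOS with `URLSessionWebSocketTask`, assuming the preview server pushes binary frames of packed RGB bytes over a WebSocket. The URL (`wss://example.com/pattern-preview`) and the flat `[R, G, B, R, G, B, …]` frame layout are assumptions here — check the actual endpoint and wire format your pattern-preview tool uses:

```swift
import Foundation

/// Split a raw frame into per-LED [R, G, B] triples.
/// Assumes a flat packed-RGB layout; adjust for your actual protocol.
func parseRGBFrame(_ data: Data) -> [[UInt8]] {
    let bytes = [UInt8](data)
    return stride(from: 0, to: bytes.count - 2, by: 3).map {
        Array(bytes[$0..<$0 + 3])
    }
}

// Hypothetical preview endpoint — replace with the real URL.
let url = URL(string: "wss://example.com/pattern-preview")!
let task = URLSession.shared.webSocketTask(with: url)

// Receive one message, then re-arm so the stream keeps flowing.
func receiveFrame(on task: URLSessionWebSocketTask) {
    task.receive { result in
        switch result {
        case .success(let message):
            switch message {
            case .data(let data):
                // Binary frame: decode and hand the colors to the LED renderer.
                let colors = parseRGBFrame(data)
                print("got \(colors.count) LED colors")
            case .string(let text):
                // Text frame: some preview protocols send JSON instead.
                print("text frame: \(text)")
            @unknown default:
                break
            }
            receiveFrame(on: task) // keep listening
        case .failure(let error):
            print("WebSocket error: \(error)")
        }
    }
}

task.resume()
receiveFrame(on: task)
```

Once `parseRGBFrame` matches the real frame layout, the decoded triples can be mapped onto the stickman's LED nodes inside your AR render loop.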
I really appreciate the information. I don’t code in Python but will start looking at the documentation to get a better understanding of how to use the library.