I just posted pixelblaze-client to my GitHub repository. It’s a Python library that presents an easy-to-program, synchronous interface for communicating with and controlling one or more Pixelblazes. It requires Python 3 (between 3.4 and 3.8 – I used 3.7.7) and the websocket-client module, which can be installed with pip. Full instructions, API docs, and example/test code are in the repository.
This is a very early release of a very new thing, so I’d love to hear bug reports and suggestions.
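A minimal usage sketch (the IP address, pattern name, and variable name below are placeholders, and I'm assuming the module imports as pixelblaze; see the repo docs for the full API):

from pixelblaze import Pixelblaze

pb = Pixelblaze("192.168.1.xxx")      # connect to the Pixelblaze at this address
pb.setActivePattern("blink fade")     # activate a pattern by name
pb.setVariable("speed", 0.5)          # set an exported variable in the running pattern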
So, some interesting behavior. I created a pattern that lights up only a single pixel, with its position controlled by an exported variable, and wrote a script to rapidly iterate through all the LED positions with minimal dwell times. The Python script completed in a few seconds, but the lit LED kept advancing one pixel at a time long afterward. It looks like the commands are stacked up somewhere and executed at some fixed rate. Do you have an estimate of how slowly updates should be sent to stay in sync? Is it one update per frame?
The pattern:
export var point                  // pixel index to light, set from Python via setVariable()

export function beforeRender(delta) {
  t1 = time(.1)
}

export function render(index) {
  h = t1 + index / pixelCount
  s = 1
  v = 0
  if (index == point) { v = 1 }   // only the selected pixel is lit
  hsv(h, s, v)
}
The Python loop:
import time

# pb is a connected Pixelblaze object from pixelblaze-client
for i in range(1440):
    pb.setVariable("point", i)
    time.sleep(0.01)
Edit:
I note that your repo says Python 3.8 is not supported, but I ran it on Python 3.8.5 (on Windows 10) with no issues.
On Python 3.8 and above – I’m being cautious, since the websocket-client module hasn’t yet been updated past 3.7. There’s nothing I know of in there that will explode even on 3.9, but Python 3 changes quite quickly. I generally stay a few versions back from the leading edge so the libraries can keep up.
Variable update speed is dependent on the Pixelblaze. I’m not throttling sends at the Python end. I’m guessing that it buffers incoming packets until it has spare cycles to process them. You can definitely overwhelm it and make patterns stutter by sending a lot of websocket traffic.
I don’t know yet how much is too much, so for now we probably need to hand-balance the send rate for individual patterns. I was going to try sending 16x16 video at a super low frame rate (like 10 fps) for starters.
(One last thought: I’ll also test with TCP_NODELAY on the socket. Normally, the network stack tries to reduce packet traffic by delaying and merging small packets. TCP_NODELAY tells it to send immediately. This increases traffic but can reduce latency, as long as your network isn’t killed by the large number of tiny packets. Gamers often set things up this way; network admins hate it.)
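For reference, here's roughly what that looks like with websocket-client: its sockopt argument passes raw socket options through to the underlying socket (the address is a placeholder; the Pixelblaze websocket listens on port 81):

import socket
import websocket   # the websocket-client module

# ask the stack to send small frames immediately instead of coalescing them
ws = websocket.create_connection(
    "ws://192.168.1.xxx:81",
    sockopt=((socket.IPPROTO_TCP, socket.TCP_NODELAY, 1),)
)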
Sending multiple websocket commands in quick succession may get them queued. If you need to know when the Pixelblaze catches up, you can send a ping (not a websocket ping, but a ping object frame) and wait for the ack.
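At the raw websocket level, that exchange might look something like this (a sketch only: it assumes the ping frame is a JSON object like {"ping": true} and that the reply is a text frame containing an "ack" field; binary preview frames are ignored):

import json
import websocket

ws = websocket.create_connection("ws://192.168.1.xxx:81")   # placeholder address
ws.send(json.dumps({"ping": True}))
while True:
    frame = ws.recv()
    # text frames from the Pixelblaze are JSON; an "ack" means it has processed everything queued ahead of the ping
    if isinstance(frame, str) and "ack" in json.loads(frame):
        break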
BTW, right now 2 other commands implicitly require/send an ack: pause and setControls. On the UI side, I keep a counter to know how many unacknowledged commands are still in flight. Incrementing on send, decrementing as acks pour in.
Pause isn’t documented (aside from this text). The editor uses it while code is being changed, to keep the animation from running until setControls, a separate command, can give it the right settings before resuming.
@wizard I know it’s come up before… But this is a perfect case where being able to pass pixel data via websocket would be handy. Ideally it would be a standard like e1.31, or even UDP, but worst case someone could write an e1.31 -> PB server if there were a supported API way to push pixels en masse to the PB. With the growth of apps like LedFx, a way to use PB with other apps would help long term, even though it shines as a self-contained device. (I’ve discussed PB with WLED users, and that’s usually where PB wins on the self-contained pattern front but loses on the “I want to push to a few controllers at once, DMX style” front.)
@Scruffynerf,
Fair enough, and I get asked about that often.
You can implement a poor version now, but it will be limited in capabilities.
The setVars websocket API supports arrays. You can push either an unpacked [r,g,b,...] array or a packed array using 16.8-bit fixed point: [ (r<<16 | g<<8 | b) / 256, ...], and a simple pattern can render the array to pixels.
// pixels is an exported array (e.g. export var pixels = array(pixelCount)) filled via setVars
// inside render(index): unpack the 16.8 fixed-point value back into 0-255 channels
p = pixels[index]
var r = p >> 8, g = p & 0xff, b = (p * 256 + .5) & 0xff
rgb(r / 255, g / 255, b / 255)   // rgb() expects values in 0..1
There will be scale and FPS limitations though; I think it would work for a few hundred pixels, perhaps.
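On the Python side, packing a frame of (r, g, b) bytes into that 16.8 format and pushing it could look roughly like this (the setVariable call is illustrative and assumes the library passes a list straight through to setVars):

def pack_rgb(r, g, b):
    # pack three 0-255 bytes into the 16.8 fixed-point form described above
    return (r << 16 | g << 8 | b) / 256

frame = [(255, 0, 0), (0, 255, 0), (0, 0, 255)]            # example 3-pixel frame
packed = [pack_rgb(r, g, b) for (r, g, b) in frame]
pb.setVariable("pixels", packed)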
Now a non-JSON binary protocol for variables/controls, perhaps over UDP, could be much more efficient. This bleeds into some other ideas on the back burner. Obviously a standards-compliant network pixel protocol would be ideal, but sharing variable/control data over the network could open up all kinds of fun stuff, like broadcasting an expansion board’s data, sharing data between PBs, and things like that.
I was planning to bang together a quick e1.31 proxy out of various Python components (using packed pixels on the Pixelblaze) for the holidays, because it’s an easy way to get lightshowpi integration. I’ll post it when it’s done, or report back if it turns out to be impractical.
(fully expecting that I’ll have to keep the frame rate to the PB low, and not use a huge number of pixels per controller. A UDP -> RGB frame buffer mode would be occasionally handy, but this seems like a less-than-optimal use of a Pixelblaze. )
The getXXX methods all return None on timeout and other non-fatal errors; they throw exceptions on disconnects and unrecoverable protocol problems.
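In practice that means calls can be wrapped like this (a sketch of the error model only; getControls is described further down):

try:
    controls = pb.getControls()          # None on timeout or if no data is available
    if controls is None:
        print("request timed out")
except Exception as e:                   # disconnects and protocol errors raise exceptions
    print("connection problem:", e)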
I’ve just updated the library on github to support the following new things:
TCP_NODELAY socket option set to reduce command latency
waitForEmptyQueue(timeout_ms) method added (uses “ping” frame described above)
setActivePatternId(pid) method added (takes pattern id only, not name, but is faster than setActivePattern() if you know the id.)
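With that, the single-pixel test loop from earlier in the thread can pace itself on the Pixelblaze’s acks instead of sleeping a fixed interval:

for i in range(1440):
    pb.setVariable("point", i)
    pb.waitForEmptyQueue(1000)   # block (up to 1000 ms) until the queued commands are acknowledged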
Ok… the e1.31 thing worked! I’ve posted python code for an e1.31/sacn proxy in the repository.
I’ve got lightshowpi on a Pi 3B+ driving 256 pixels at about 20 fps – fps limited by the Pi, which is playing music, running other lights, doing the FFT dance, etc. It looks smooth and good – video later, when I can figure out how to film it without making a blinding blur.
@zranger1 Great work on the library. It works well for my needs, as I’m looking to integrate this into my own Home Assistant and Node-RED setup for some automation, but I had a few suggestions (perhaps I can just fork the project).
Setting UI controls: setControl seems to work fine only for a slider control, since you can pass in a single value, but you cannot set the values for an hsvPicker or rgbPicker, which require an array of 3 values ((h,s,v) or (r,g,b)). I would define another function, setControlColor or setControlPicker, that accepts an array of length 3. Also, since control names follow a naming convention, you can validate the user input: ctl_name must start with slider, hsvPicker, or rgbPicker.
@zranger1 can I suggest pyxelblaze as a new name? Since it’s a python connector to Pixelblaze.
Pronounced like Py-thon, it’ll be py-xelblaze (long i, like in Pi, as opposed to short i like in pixel.)
I think pixelblaze-client is the simplest option. I was avoiding using “Pixelblaze” without permission, but if it’s cool with you, @wizard, that’s what it shall be! I’ll build the new repo this afternoon, but in the meantime, changes and fixes are already made to the old pyblaze repository. Here’s what I did:
New methods for dealing with color picker controls:
controlExists(ctl_name, pattern) - returns True if specified control exists in the specified pattern, False otherwise
getColorControlName() - returns name of rgb or hsv color picker if the pattern has one, None otherwise
setColorControl(name, color) - allows you to set a color picker control to a 3-element array of color values
variableExists(var_name) - returns True if specified variable is exported by the current pattern
Additionally, if you omit the pattern name argument from getControls or controlExists, control data is retrieved for the current pattern if available.
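A quick example of how the new color control methods fit together (the control name shown is illustrative, and the color values are assumed to be in the 0..1 range the web UI pickers use):

name = pb.getColorControlName()                 # e.g. "hsvPickerColor"; None if the pattern has no picker
if name is not None:
    pb.setColorControl(name, [0.5, 1.0, 1.0])   # 3-element hsv or rgb triplet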
@vutang50, thank you very much for finding and reporting this set of issues! I totally appreciate the feedback – it just makes better software!