I’m thinking of using the WS2816B in my next custom JLCPCB design and thought I’d start a conversation. Assume Pixelblaze v3 (PB v3) in these discussions of support and performance, unless specified otherwise.
Anybody know of a datasheet in English?
A few things that caught my eye (note: I don’t read Chinese):
48 bits per pixel vs. the now-common 24 bits.
The norm seems to be the 2.1mmx2.1mm package.
Sounds like a faster refresh rate? 10 kHz, vs. 2 kHz for the 2812 or 2813.
The addition of BO seems like it might make layout easier than the 2813.
Red and green look about 20% less bright, and blue about 50% less, than 5050 performance. The 2816C looks half as bright as the 2816B.
The price is in the middle of the 5–10 cent range of the various alternatives.
Will PB V3 support it?
The chip takes 48 bits per pixel instead of 24:
Will patterns be backwards/forwards compatible?
Will PB handle the full datarate? Will timing be a challenge (assuming good transmission line technique)?
What are practical effects of memory limitations?
Performance expectations of WS2816+PBv3 vs. WS2812+PBv2 vs. other combinations?
Thoughts on the WS2816? Will it take over a substantial portion of the market, or is it a bit of a niche? Out of JLCPCB’s part library, are there any big reasons I should consider a different part? It’s probably only a matter of time before clones appear (some better than others), but can it be assumed that all of these parts currently come from a reputable and competent factory, without digging and hoping? Does anything catch your eye about them? Thoughts?
My next projects are 400–1000 LEDs, no POV, and reliability is a major concern (I can’t repair a dead pixel without ripping into the art piece, and I might sell them and don’t want the headaches of customer returns). With the smaller package, I’d probably fan out the data line to 2 or 3 parallel LED lines, keeping the address space the same, so that a dead LED isn’t that noticeable. I’ll be satisfied if I get similar framerate performance out of a WS2816+PBv3 as I currently do with a WS2812+PBv2.
Given that they’re making mystery WS2812-ish chips we can’t even get part numbers for, do you really think a new chip will gain any ground? There would have to be a serious technical reason for the vast majority to switch (and thus keep the part in production and the business afloat), and I don’t see it above. There’s a reason you can’t find legit APA102s or HD107s, etc.: not enough demand to forestall the cloners from undercutting you, running you into the ground, and then moving on to the next hot item.
Also, if they’re really using a WS2812-like protocol at five times the speed, it could be a nightmare for current controllers.
The code I’ve seen offloads the work of generating timing and sending WS2812 bits to a cleverly configured UART. The ESP8266’s UART can theoretically be clocked to 10 or 20 Mbit/s, but from what I’ve read, around 3 Mbit/s is the practical limit. Pretty much everybody would have to seriously rethink their controller code.
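For anyone unfamiliar with that UART trick, here’s a minimal sketch (the function names and the 4-sub-bit scheme are my own illustration, not any particular library’s code): each WS2812 data bit is stretched into four faster “sub-bits” that a UART clocked at roughly 3.2 Mbaud can shift out, so at ~312 ns per sub-bit a 0-bit becomes a short high pulse and a 1-bit a long one, both inside the WS2812 timing window.

```python
# Illustrative only: expand WS2812 bits into UART-friendly sub-bit
# patterns at 4x the 800 kbps data rate (~3.2 Mbaud).
def ws2812_bit_pattern(bit: int) -> str:
    # 1-bit: long high pulse; 0-bit: short high pulse
    return "1110" if bit else "1000"

def encode_byte(value: int) -> str:
    """Expand one 8-bit color byte (MSB first) into 32 sub-bits."""
    return "".join(ws2812_bit_pattern((value >> i) & 1)
                   for i in range(7, -1, -1))

# 0x80 -> one long pulse followed by seven short ones
print(encode_byte(0x80))
```

Note the arithmetic: 800 kbps × 4 sub-bits is already 3.2 Mbaud, right at that practical ceiling; a chip running five times faster would need ~16 Mbaud, which is why it could be such a nightmare.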
I wish I understood what’s driving feature decisions for the Chinese LED makers. From here, it seems pretty random. I kind of suspect them of getting feature lists from something like this: (cats have gotten much better; Airbnb listings are occasionally hilarious)
The reason one would use it is that it’s 16 bits per channel rather than 8. That’s 65,536 levels of brightness instead of 256!
Theoretically, that blows even the APA102/SK9822 out of the water.
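On the backwards-compatibility question from earlier: if a controller simply widens existing 8-bit channel values to 16 bits, multiplying by 257 is the usual trick (a sketch, not any specific firmware’s code), since it maps 0 to 0 and 255 to 65535 exactly.

```python
def widen8to16(v: int) -> int:
    """Scale an 8-bit channel value to 16 bits.

    v * 257 == (v << 8) | v, so 0 -> 0 and 255 -> 65535 exactly,
    keeping full black and full white intact.
    """
    return v * 257

print(widen8to16(255))  # 65535
print(widen8to16(128))  # 32896
```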
I have a bunch of these and have been experimenting before the v3 launch.
They use the same bit encoding and data rate, just with twice the data per pixel.
They do have an increased range, but with a few shortcomings. First, at low levels, where the difference is most noticeable, I don’t see much improvement, and green is especially bad. Some improvement, but not 256×. Second, it has built-in gamma correction, which means things will look different.
Still, I think they have advantages over the WS2812, but they’re a bit less exciting than when I first saw “16 bits.” It’s a shame, really; if they had just exposed a linear 16 bits per channel, I think it would have been much better.
At the moment, HDR television is only 10-12 bits per color, and the TV industry is extremely careful about using the extra bits for more detail at low brightness levels, where it actually makes a visible difference. Simply increasing color depth is not necessarily an advantage.
Getting green right is kind of important. If I’m not mistaken, the human eye is capable of distinguishing more shades in the green range than in any other color.
And sending twice as much data at the same rate halves the effective pixel transmission speed. You’d really feel this on anything with more than a couple hundred LEDs.
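Some rough back-of-the-envelope numbers, assuming the standard 800 kbps wire rate and a ~280 µs reset gap between frames (both figures are typical for this family, not taken from a WS2816 datasheet):

```python
def max_fps(num_leds: int, bits_per_pixel: int,
            data_rate_bps: int = 800_000, reset_us: float = 280.0) -> float:
    """Theoretical refresh ceiling for one daisy-chained data line."""
    frame_s = num_leds * bits_per_pixel / data_rate_bps + reset_us / 1e6
    return 1.0 / frame_s

print(round(max_fps(1000, 24)))  # 24-bit pixels, 1000 LEDs: ~33 fps
print(round(max_fps(1000, 48)))  # 48-bit pixels, 1000 LEDs: ~17 fps
```

That ceiling ignores controller overhead entirely, so real framerates will be lower still; it just shows how the 48-bit format roughly halves what a single line can do.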
Unless I’m missing some huge hidden advantage, I’d pass on these in favor of anything APA102/SK9822 and wait for somebody to take the good parts of this feature set and make a more usable product.