HSVtoRGB & Framebuffer with attenuation

I have 2m of 144 LED/m SK9822s folded back on itself in an acrylic tube with a PB in a discarded plastic toy sword hilt (and a lot of Sugru!) at one end.

Here’s a pattern for it with “Streaks” driven by lack-of-motion (or … whatever. That’s just what is in the code right now!). Like “Sparkles”, it has a list of items, but they have variable length and color, and they’re added into an RGB framebuffer so they show through each other when they overlap. The color comes from a non-returning HSVtoRGB function, and if the total value in the framebuffer is too high (as in, having all those LEDs on would overload the battery), the framebuffer is attenuated.

Comments and performance improvements much appreciated! Enough talk, here’s the code!

// This pattern is designed for a "sword" with an LED strip folded in half
// and demonstrates:
// - non-returning HSVtoRGB function (results land in the globals r, g, b)
// - RGB framebuffer accumulation of HSVtoRGB results
// - Framebuffer attenuation to avoid pulling too many amps from the battery

export var accelerometer

export var x = 0 
export var y = 0 
export var z = 0 

export var total

// configuration
var max_total = 12 // maximum total value to prevent overload
var streak_count = 20 // maximum number of simultaneous streaks

// tuning
var scale_acceleration = 50.0 // acceleration scale to ~ [-1,1]
var scale_speed = 12.0 // speed scale to ~10x
var scale_brightness = 8.0 // scale rgb by 1/scale_brightness before summing

// streak state storage
var s_age = array(streak_count)
var s_hue = array(streak_count)
var s_spd = array(streak_count)
var s_len = array(streak_count)

var hPc = pixelCount/2 // half pixelCount

// output framebuffer
var fb_Red = array(hPc)
var fb_Green = array(hPc)
var fb_Blue = array(hPc)

// other stuff I was hacking on
// export var angle = 0
// export var max_spd = 0

var r, g, b // filled by HSVtoRGB

// Convert h, s, v (each 0..1) to red/green/blue, leaving the results in the
// globals r, g, b rather than returning them or calling rgb()
function HSVtoRGB(h, s, v) { 
    var i, f, p, q, t, im6
    i = floor(h * 6)
    f = h * 6 - i
    p = v * (1 - s)
    q = v * (1 - f * s)
    t = v * (1 - (1 - f) * s)
    im6 = i % 6
    if (im6 == 0)      { r = v; g = t; b = p }
    else if (im6 == 1) { r = q; g = v; b = p }
    else if (im6 == 2) { r = p; g = v; b = t }
    else if (im6 == 3) { r = p; g = q; b = v }
    else if (im6 == 4) { r = t; g = p; b = v }
    else               { r = v; g = p; b = q }
} 

export function beforeRender(delta) { 
  // current acceleration, with a 1-sample smoothing window
  var nx = clamp((x + (accelerometer[2]*scale_acceleration))/2.0,-1,1)
  var ny = clamp((y + (accelerometer[0]*scale_acceleration))/2.0,-1,1)
  var nz = clamp((z + (accelerometer[1]*scale_acceleration))/2.0,-1,1)

  // angle = acos(((x*nx)+(y*ny)+(z*nz))/(sqrt(x*x+y*y+z*z)*sqrt(nx*nx+ny*ny+nz*nz)))

  // clear framebuffer
  for (n = 0; n < hPc; n++) { 
    fb_Red[n] = 0 
    fb_Green[n] = 0 
    fb_Blue[n] = 0 
  } 

  // create a new streak if change in acceleration is low
  if (abs(sqrt(x*x+y*y+z*z) - sqrt(nx*nx+ny*ny+nz*nz)) < 0.25) { 
    s_new = 1 
  } 

  // render streaks
  for (n = 0; n < streak_count; n++) { 
    // if we are creating a new streak, and this slot is empty, fill it
    if (s_new && s_age[n] == 0) { 
      s_age[n] = 1/1024
      s_hue[n] = random(1) // or angle/PI // or (abs(nx-ny))
      s_spd[n] = 1 + scale_speed * sqrt(abs(nz * nx * ny) * (1 + random(2)))
      s_len[n] = max(12 - s_spd[n] + random(3), 3)
      // max_spd = max(max_spd, s_spd[n])
      s_new = 0 
    } 

    if (s_age[n] > 0) { 
      len = s_len[n]
      pix = s_spd[n] / scale_speed * s_age[n] / 6 
      if ((pix-len) >= hPc) { 
        // the whole streak has run off the end; free this slot
        s_age[n] = 0 
        if (s_new) { n-- } // revisit the freed slot so a pending new streak can claim (and render) it
      } else { 
        s_age[n] += delta
        hue = s_hue[n]
        remaining = len
        for (pn = ceil(pix); remaining > 0 && pn >= 0; remaining--) { 
          if (pn < hPc) { 
            fade = remaining/len
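            // fade runs from 1 at the head of the streak down toward 0 at the tail:
            // saturation is lowest (whitest) at the head and rises to full along the tail,
            // while brightness falls off with framebuffer position (1 - pn/hPc)
            // and along the tail (fade*fade), shaped by pow(..., 1.5)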
            HSVtoRGB(hue, sqrt(clamp(1.5 - fade,0,1)), pow(clamp((1-pn/hPc) * fade * fade,0,1),1.5) ) 
            fb_Red[pn] += r/scale_brightness
            fb_Green[pn] += g/scale_brightness
            fb_Blue[pn] += b/scale_brightness
          } 
          pn--
        } 
      } 
    } 
  } 

  // Calculate the sum of all pixels in the framebuffer ...
  total = 0 
  for (n = 0; n < hPc; n++) { 
    fb_Red[n] = min(1,fb_Red[n])
    fb_Green[n] = min(1,fb_Green[n])
    fb_Blue[n] = min(1,fb_Blue[n])
    total += fb_Red[n] + fb_Green[n] + fb_Blue[n]
  } 
  // ... and scale them all down if necessary!
  if (total > max_total) { 
    adjust = max_total / total // factor < 1 that scales the sum back down to max_total
    for (n = 0; n < hPc; n++) { 
      fb_Red[n] *= adjust
      fb_Green[n] *= adjust
      fb_Blue[n] *= adjust
    } 
  } 

  // Store current values for smoothing on next frame
  x = nx
  y = ny
  z = nz
} 

export function render(index) { 
  // The PB is at one end of the strip, and the strip is folded in half
  // Use "^hPc - [...]" instead of "[...] - 1$" if PB is at the other end
  pindex = ( (index < hPc) ? (hPc - index) : (index - hPc + 1) ) - 1 

  rgb(fb_Red[pindex], fb_Green[pindex], fb_Blue[pindex])
} 
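
For a sense of scale (rough numbers, not measured): total is the sum of every pixel’s clamped R+G+B, so max_total = 12 caps the output at the equivalent of 12 fully-lit channels, i.e. roughly 4 full-white pixels’ worth of light spread over the blade. At a ballpark ~20 mA per fully-on channel for these 5050-package LEDs, that’s on the order of 12 × 20 mA ≈ 240 mA of drive current, plus the strip’s quiescent draw.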

Between this and @zranger1's lightsaber there is definitely a whole category of patterns that could be done with a “stick of pixels”. Magic wands, swords, and more. Especially with a button or two?
Sound effects would be nice too but that’s a whole different subject.

My “lightsaber” has pretty much the same LED layout as @sorceror’s. It’s a single strip of LEDs “folded” in half, mounted on a 10mm wide x 3mm thick piece of hardwood, kept centered in its tube with flexible clear plastic spacers. I mapped it in 2D as an n x 2 matrix so I can build 2D patterns to display the same thing on both sides of the blade.
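
If it helps anyone doing a similar build, a mapper-tab function for that sort of fold can be as simple as this (a simplified sketch of the idea rather than my exact map; it assumes pixel 0 is at the hilt end):

// Mapper tab (runs as JavaScript in the browser, not in the pattern):
// column 0 runs up the blade, column 1 runs back down, so both halves
// share the same y at a given height. Pixelblaze normalizes the result to 0..1.
function (pixelCount) {
  var map = []
  var half = pixelCount / 2
  for (var i = 0; i < pixelCount; i++) {
    if (i < half) {
      map.push([0, i])                    // first half: going up the blade
    } else {
      map.push([1, pixelCount - 1 - i])   // second half: coming back down
    }
  }
  return map
}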

That pattern sounds cool! I’ll try it out this afternoon, after today’s hardware reconfiguration.

I’m using the Pixelblaze Pico and designing for moderate contact use (You can actually whack the PEX tube with a sledgehammer with no real effect). There’s no room for a sensor board in the build, so I’m trying to figure out a cheap, durable way to have “flash on impact” effects – messing with GPIO and various weird switch setups now.

Sound effects? Well for now, we’ll just have to DIY – “zzzzzzzzzzzzzz… shhhhhhhhhhzt… zrrrrrrrrrkt… You cannot win, Vader. If you strike me down, I will just come back with a longer extension cord!”

@Scruffynerf I was imagining having 2 swords and the PBs talking to each other (over OSC or WebSocket); imagine being able to call setVarsOnPartnerPixelBlaze({'foo':'bar'}).

Sample game: Each sword has a 3-way toggle on A0/A1/A2. Then suppose we want to play Rock-Paper-Scissors. The players thumb their toggle to select their move, which is transmitted to the other sword. When they both detect a ‘hit’ around the same time, the move state determines who gets a point, and the LEDs on the swords sparkle appropriately. Maybe each sword starts half brightness or length, and gets brighter/dimmer or longer/shorter with each move.
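
The scoring part is tiny; something like this sketch (illustrative only, since there’s no sword-to-sword messaging yet):

// Moves: 0 = rock, 1 = paper, 2 = scissors
export var myMove = 0     // would come from the A0/A1/A2 toggle
export var theirMove = 0  // would be pushed from the other sword (e.g. via the websocket setVars API)
export var myScore = 0

// returns 1 if a beats b, -1 if b beats a, 0 for a draw
function rpsWinner(a, b) {
  if (a == b) { return 0 }
  return (mod(a - b, 3) == 1) ? 1 : -1
}

// called when both swords detect a 'hit' at about the same time
function scoreHit() {
  myScore += rpsWinner(myMove, theirMove)
}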

My broken toy sword hilt included a piezo speaker but I never got it working so I don’t know what kind of sounds it made. Hasbro’s Star Wars Scream Saber is a lot of fun — it has various SW samples built-in and an acceleration sensor to detect swinging and impacts, and plays the samples at a speed corresponding with your swing! The best part is that it has a mic so can use your own sounds. I just noticed a “Star Wars Lightsaber Academy Interactive Battling System Lightsaber” product which sounds way more elaborate with a smartphone app and such.

@zranger1, an n x 2 matrix is indeed a good idea, but I seem to recall that it normalizes all of the dimensions to 0…1, which confused me when I had a pixelCount-sized array. I suspect the multiplication needed to undo that is about the same amount of per-pixel work as my calculation. :relaxed:
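
Something like this (rough, untested sketch) is the multiplication I mean; the 0…1 world coordinate has to be scaled back up to a framebuffer slot:

export function render2D(index, x, y) {
  // y runs 0..1 along the blade; round it back to a framebuffer index
  // (the orientation may need flipping depending on the map)
  pindex = floor(y * (hPc - 1) + 0.5)
  rgb(fb_Red[pindex], fb_Green[pindex], fb_Blue[pindex])
}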

I got IP65 strips which have clear silicone on the top and double-sided tape on the back and just stuck them together as-is. Maybe a couple of mm of wood would have helped, but the strips are 12.5mm in a 14mm tube so stay reasonably straight.

I don’t believe we have any way for a PB to generate outgoing msgs, only receive.

And yes, the difference between a pixelCount/index pattern and a matrix (0…1) has stymied many. I’m hoping to go through a pattern list and document how to do the same pattern each way (thus making 2D-mapped setups work correctly with 1D patterns, rather than ignoring the 2D map).
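
For example (just a minimal sketch), the same scrolling rainbow written both ways:

var t1
export function beforeRender(delta) {
  t1 = time(.05) // shared sawtooth, 0..1
}

// 1D version: position comes from the pixel index
export function render(index) {
  hsv(index / pixelCount + t1, 1, 1)
}

// 2D version: position comes from the normalized x coordinate of the map
export function render2D(index, x, y) {
  hsv(x + t1, 1, 1)
}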

@Scruffynerf ,
At some point I want to add a bitmap/raster API that would give you a 2D surface to draw into, and then map/pull from going to LEDs (a simple 1:1 if they happen to match, but not necessary).

In the meantime, I’ve been playing around with a fractal that lends itself to drawing pixels rather than math in render, and came up with this framework which isn’t totally dissimilar to what everyone else has been doing for 2D, but packs pixels into a single dimensional array:

/* Rendering into a "canvas" example
 *
 * In this example, an array is created to represent a 2D canvas with
 * x and y values in world coordinates, which are values from 0 to 1 exclusive.
 * The canvas is then scaled and drawn to the LEDs. The LEDs could match 1:1
 * or could be some other size, or even a non-uniform layout.
 * The canvas is set up with 2 arrays, one for values, one for hues, which
 * are then fed to hsv() during render2D.
 * 
 * This example draws a dot traveling around the circumference of a circle,
 * leaving a fading trail of color.
 */

var width = 8
var height = 8
var numPixels = width * height
var canvasValues = array(numPixels) //make a "canvas" of brightness values
var canvasHues = array(numPixels) //likewise for hues
var fade = .95


//find the pixel index within a canvas array
//pixels are packed in rows, then columns of rows
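//e.g. with width = height = 8: getIndex(0.5, 0.25) = floor(4) + floor(2)*8 = 20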
function getIndex(x, y) {
  return floor(x*width) + floor(y*height)*width
}

function isIndexValid(index) {
  return index >= 0 && index < numPixels
}

export function beforeRender(delta) { 
  //fade out any existing pixels
  canvasValues.mutate(p => p*fade) //TODO fade based on delta for consistent fade

  //draw into the canvas here
  //this draws a pixel moving in a circle
  //radius 1/3rd, centered at 0.5, 0.5
  var a = time(.01) * PI2
  var r = 1/3
  var x = sin(a) * r + 0.5 
  var y = cos(a) * r + 0.5
  
  //optionally, you can make the pixels "wrap" around to the other side if out of bounds
  // x = mod(x,.99999)
  // y = mod(y,.99999)
  
  //calc this pixel's index in the canvas based on position of our coordinate
  var index = getIndex(x, y)
  
  //check that the coordinate is within bounds of the canvas before using it
  if (isIndexValid(index)) {
    canvasValues[index] = 1
    canvasHues[index] = time(.015)
  }
}

export function render2D(index, x, y) {
  index = getIndex(x, y) //calc this pixel's index in the canvas based on position
  h = canvasHues[index]
  v = canvasValues[index]
  hsv(h, 1, v*v)
}

2D canvas example.epe (7.9 KB)

@sorceror, On the other topic, I do want to add some way to get data between multiple PB. Sharing sensor board data is definitely top of the list, but I’m thinking about the best way to support other data, and perhaps events as well. I think a pub/sub kind of setup with UI to configure the connections might be best, allowing the code to remain more agnostic of where the data is coming from. Controls are a good example of something that could receive events/data from not only UI sliders and things, but could be “wired” up to an analog or digital input, or a data source on another PB on the network. Perhaps with an OSC compatibility layer (or using OSC if it fits well enough).


…just made another thread for build info, so as to keep this one more about patterns and interesting ways for swords to communicate and interact…


Ben, sounds great. A drawing layer would be interesting. I’d suggest using the syntax from Processing (actually, use the JS equivalent in p5.js rather than Processing itself) as the language, and then only support whichever bits you wish or that make sense. That would maximize the usefulness, since it’s a robust and well-tested set of graphics commands.
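
To make that concrete, the kind of subset I mean could sit right on top of your canvas arrays; a rough sketch (nothing like this exists in PB today, and the names are made up):

// hypothetical p5-ish helpers over canvasValues / canvasHues from the example above
var strokeHue = 0
function stroke(h) { strokeHue = h }  // like p5's stroke(), but takes a hue

function point(x, y) {                // plot one world-coordinate point
  var i = getIndex(x, y)
  if (isIndexValid(i)) {
    canvasValues[i] = 1
    canvasHues[i] = strokeHue
  }
}

function line(x0, y0, x1, y1) {       // naive line: sample along the segment
  var s, steps
  steps = max(width, height)
  for (s = 0; s <= steps; s++) {
    point(x0 + (x1 - x0) * s / steps, y0 + (y1 - y0) * s / steps)
  }
}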

And adding some way for a PB to send data would also be awesome.

Thanks, I like the getIndex() way of packing into a 1D array. Could definitely come in handy.

Reading your example, I am wondering if Array.mutate() and similar will ever make it to the v2 firmware. I’m using my v2 in the sword right now because I soldered the sensor board backwards on my v3 and haven’t had the time to sort that out. :relaxed:

Eventually, but there’s some pressing things to handle on v3 first.

The mutate one liner isn’t too hard to port to a for loop, but source compatibility would be nice!
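
For reference, the v2-compatible equivalent of that line is just:

// v2 replacement for: canvasValues.mutate(p => p*fade)
for (i = 0; i < numPixels; i++) {
  canvasValues[i] *= fade
}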

Sounds like you are talking about an MQTT client; Paho has several packaged MQTT clients ready to go.

One thing that many people don’t realize is that MQTT works perfectly well with binary data (does not have to be just strings). I use it to send jpg images, with no problems - your limitation is really RAM for buffers.

It looks like an MQTT client (including MQTT over WebSocket) is at:
https://docs.espressif.com/projects/esp-idf/en/latest/esp32/api-reference/protocols/mqtt.html
and there also seem to be some embedded MQTT brokers available. I’d like to be able to do without a RaspberryPi (or my LattePanda).

‘A Firestorm of Pixelblaze - spread patterns and control via MIDI/OSC’ over here might be a more relevant place to continue. I like the idea of PBs being able to discover each other and then communicate without a broker, preferably over UDP.