Hi - before I spend time figuring out the programming and whether it works or not, I wanted to get a quick confirmation on whether using exports from the sensor board will work in the mapping code.
Specifically, what I am envisioning is a dynamic map that is aware of the orientation of the board.
Is this possible, doable?
I notice the mapping code does not like anything above the function.
Can the exports be declared inline inside of the function?
Or is this just not going to work at the mapping level?
The way I understand this, the mapping code runs as regular JavaScript, completely in your browser. It evaluates the map, then sends it to the Pixelblaze as a JSON array. So exports from the Pixelblaze itself aren’t going to be visible.
You can do the same thing though, with the new rotate() API, which if called in beforeRender() lets you rotate the map to whatever angle you want. Just be sure to call resetTransform() when you change it, or the rotations will be cumulative (and you’ll only be able to stack 31 of them with the current setup).
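To see why the reset matters, here’s a plain-JavaScript sketch of the cumulative-rotation pitfall. `rotatePoint` is a hypothetical stand-in for what the Pixelblaze transform does to each coordinate; it is not the actual API, just an illustration of the math.

```javascript
// Hypothetical stand-in for the Pixelblaze coordinate transform:
// rotate a 2D point by angle `a` (radians) about the origin.
function rotatePoint(p, a) {
  var x = p[0], y = p[1];
  return [x * Math.cos(a) - y * Math.sin(a),
          x * Math.sin(a) + y * Math.cos(a)];
}

var p = [1, 0];
// Without a reset, three "frames" of rotating by PI/2 stack up to 3*PI/2:
for (var frame = 0; frame < 3; frame++) {
  p = rotatePoint(p, Math.PI / 2);
}
// Calling resetTransform() before each rotate() would instead hold the
// map at a fixed PI/2, no matter how many frames have rendered.
```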
Bingo. Mapping isn’t code run on the PB, which is also why @pixie’s recent post about the URL for the map returns text, not the actual map. (Right @wizard? Is there a way to get the normalized (or not) map data right now?)
I agree, at this point, combining the sensor(s) in real time with transform/rotate is your best approach to changing the map. Heck, now I have an idea for a sound reactive pattern… On the pile of todos.
Right, the pixel map is a snapshot created when you change the mapping editor code and is done 100% in your browser. The coordinates never change after that. The map is saved as both a text file of your source (either a JSON array of coordinates or a JS generator function), and as the calculated map in a binary form after it’s normalized to word units.
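To make the “JS generator function” form concrete, here’s a minimal sketch of the kind of function the mapping editor evaluates in the browser. The 8-pixel-wide grid is just an assumption for illustration; your layout will differ.

```javascript
// A minimal mapping generator of the kind the editor runs in the browser.
// It returns an array of [x, y] coordinates, one per pixel, which the
// editor then normalizes and ships to the Pixelblaze as a snapshot.
function map(pixelCount) {
  var width = 8; // assumed grid width for this example
  var coords = [];
  for (var i = 0; i < pixelCount; i++) {
    coords.push([i % width, Math.floor(i / width)]);
  }
  return coords;
}
```

Since this runs once in the browser and the result is frozen, nothing inside it can see live sensor values — which is why the transform API is the right tool for orientation.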
For a full 3D orientation based on accelerometer data from the sensor board, I would check out Roger’s GlowFlow project and YouTube demo - and adapt the coordinate transformation stuff using the new faster API where Roger was doing all the math in pattern code. His project was very much the inspiration for the coordinate transformation API.
While rotate() is intended for 2D work and will rotate about the Z axis, you probably want to use the rotateX, rotateY, and rotateZ calls, which are intended for 3D work.
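For anyone unfamiliar with per-axis rotations, here’s what they do to a point, sketched in plain JavaScript. `rotX` and `rotZ` are hypothetical helpers showing the math behind calls like rotateX and rotateZ; on the Pixelblaze you’d call the built-in transform API in beforeRender() instead.

```javascript
// Rotate a 3D point about the X axis by angle `a` (radians):
// X is unchanged, Y and Z swing around it.
function rotX(p, a) {
  var x = p[0], y = p[1], z = p[2];
  return [x,
          y * Math.cos(a) - z * Math.sin(a),
          y * Math.sin(a) + z * Math.cos(a)];
}

// Rotate about the Z axis: Z is unchanged, X and Y swing around it
// (this is the plane the 2D rotate() works in).
function rotZ(p, a) {
  var x = p[0], y = p[1], z = p[2];
  return [x * Math.cos(a) - y * Math.sin(a),
          x * Math.sin(a) + y * Math.cos(a),
          z];
}
```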
The bit that calculates polar/azimuth angles from accelerometer data is probably the key bit you’d need.
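A rough sketch of that calculation, assuming the Z axis points up when the board is at rest (GlowFlow’s actual math may differ — this is just the standard spherical-angle derivation):

```javascript
// Derive polar (tilt from vertical) and azimuth (heading around vertical)
// angles from a raw accelerometer reading [ax, ay, az].
// Assumes gravity dominates the reading, i.e. the board is roughly static.
function orientationAngles(ax, ay, az) {
  var mag = Math.sqrt(ax * ax + ay * ay + az * az);
  var polar = Math.acos(az / mag);  // 0 when flat, PI/2 when on its side
  var azimuth = Math.atan2(ay, ax); // which way the tilt is facing
  return [polar, azimuth];
}
```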