Disclaimer: I haven’t had an Apple machine for development since Objective-C was king, and have only a passing familiarity with Swift, and none at all with the playground. Still…
The Web UI uses both the HTTP and websocket protocols, a couple of different data encodings, plus decompression of the pattern binaries if you want access to preview images. It's a non-trivial undertaking. I'd do it in stages, probably starting with using the websocket interface to control the Pixelblaze.
You can get information on the protocols from the pixelblaze-client and Firestorm source code. I'd also recommend looking at @nick_w's async Python library, because the async model is a better fit for Swift, and it also has some code dealing directly with binary pattern files.
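To give a feel for the websocket side, here's a rough sketch in Python of the kind of command frames those libraries send. This is an illustration based on my reading of pixelblaze-client, not a spec: I'm assuming the Pixelblaze listens for a websocket on port 81 and accepts JSON text frames like `{"brightness": ...}` and `{"setVars": {...}}` (the variable name `speed` below is just an example; it has to match a variable the running pattern exports). Check the library source for the authoritative frame shapes.

```python
import json

# Assumption: the Pixelblaze serves its websocket API at ws://<ip>:81
PIXELBLAZE_WS_PORT = 81

def set_brightness_frame(level: float) -> str:
    """Build a JSON text frame setting global brightness, clamped to 0.0-1.0."""
    return json.dumps({"brightness": max(0.0, min(1.0, level))})

def set_vars_frame(variables: dict) -> str:
    """Build a JSON text frame updating variables the current pattern exports."""
    return json.dumps({"setVars": variables})

# Example frames you would send as websocket text messages:
half_bright = set_brightness_frame(0.5)      # '{"brightness": 0.5}'
speed_update = set_vars_frame({"speed": 0.25})
```

Any websocket client (Python's third-party `websockets` package, or `URLSessionWebSocketTask` on the Swift side) can open the connection and send these strings as text frames; the frame-building part is the same either way.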