- Serialization. Saving and distributing models just became much easier. The new ROIBundle class bundles a model and its preprocessing operations into a single binary blob. The bundle automatically runs its preprocessor on any incoming data, so you don’t have to set up preprocessing manually. No more naming your models sgd_canny2.myr just to keep track of how their input data needs to be formatted! (There’s a rough sketch of the idea just after this list.)
- Better OpenCL support. AMD’s Aparapi project seems to be in a bit of a lull, so I’ve switched to the Syncleus fork. Among other things, it drops the requirement to bundle a native library with Myriad. The NASA snapshot bundles libraries for Linux and Windows, but now that everything is done through Maven we can add OS X to the list of supported platforms.
- Bug fixes. Speaking of OpenCL, I fixed a memory leak that should make convolution kernels a little friendlier to your GPU. A few other bugs elsewhere in the toolkit got squashed along the way.
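To make the bundle idea concrete, here’s a rough sketch of what an ROIBundle amounts to, written with plain Java serialization. The names here (BundleSketch, Preprocessor, Model, classify) are illustrative stand-ins rather than the actual Myriad API, so treat this as the concept, not the code:

```java
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;

// Sketch of the bundle concept: one serializable object carrying
// both the model and its preprocessor. Names are illustrative,
// not the real Myriad 2.0 API.
class BundleSketch implements Serializable {

    interface Preprocessor extends Serializable {
        double[] apply(double[] raw);
    }

    interface Model extends Serializable {
        int classify(double[] features);
    }

    private final Model model;
    private final Preprocessor preprocessor;

    BundleSketch(Model model, Preprocessor preprocessor) {
        this.model = model;
        this.preprocessor = preprocessor;
    }

    // Incoming data is always preprocessed before it reaches the
    // model, so callers never have to format it by hand.
    int classify(double[] rawData) {
        return model.classify(preprocessor.apply(rawData));
    }

    // Model and preprocessor travel together as a single binary blob.
    void save(File file) throws IOException {
        try (ObjectOutputStream out =
                 new ObjectOutputStream(new FileOutputStream(file))) {
            out.writeObject(this);
        }
    }

    static BundleSketch load(File file)
            throws IOException, ClassNotFoundException {
        try (ObjectInputStream in =
                 new ObjectInputStream(new FileInputStream(file))) {
            return (BundleSketch) in.readObject();
        }
    }
}
```

The point is that the preprocessor rides along inside the same blob as the model, so a loaded bundle can be handed raw data directly.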
Both Desktop and Trainer have been updated to work with Myriad 2.0 and have themselves been tagged as 2.0 Snapshots, just to keep everything straight. Be sure to use like with like: models trained with Myriad 1.0 will continue to work with Desktop 1.0 and Trainer 1.0, but they won’t work with 2.0, and vice versa. Myriad 2.0 does have legacy read and write methods, so if you’re comfortable with Java you could write your own converter utility along the lines of the sketch below, or if there’s any interest I could whip something up.
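In outline, such a converter wouldn’t need to be more than a few lines. The method names below (readLegacy, save) are placeholders; the real legacy read and write methods may be named differently:

```java
import java.io.File;

// Hypothetical 1.0 -> 2.0 model converter. ROIBundle.readLegacy and
// bundle.save are placeholder names for the legacy read and the new
// write methods; check the actual Myriad 2.0 API for the real ones.
public class ModelConverter {
    public static void main(String[] args) throws Exception {
        File oldModel  = new File(args[0]); // a Myriad 1.0 model, e.g. a .myr file
        File newBundle = new File(args[1]); // where the 2.0 bundle should go

        ROIBundle bundle = ROIBundle.readLegacy(oldModel); // legacy read
        bundle.save(newBundle);                            // 2.0 write
        System.out.println("Converted " + oldModel + " to " + newBundle);
    }
}
```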
The main user-facing changes in both are related to the ROIBundle and its bundled preprocessor. In Desktop, you don’t have to bother configuring the Data Preprocessing stage if you’re using a Myriad-based machine learning model. It’s still available if, say, you’re calling out to Python or MATLAB and want to keep doing your data preparation in Desktop.
In Trainer, loading a model now sets the preprocessor to “bundled,” meaning it uses whatever preprocessor is stored in that particular bundle. When you save a trained model, Trainer creates a new ROIBundle with the model and its preprocessor wrapped up in a single package. It’s still pretty efficient storage-wise: a model and a single preprocessor typically fit into a few kB.