Compilation
The platform compiles the design canvas (stored on disk as JSON) into functional deep learning models, and can generate either TensorFlow or PyTorch code. Each visual component on the canvas is backed by a Python script that emits the appropriate code during this process. By automating the translation from design to implementation, the platform lets users focus on the conceptual and architectural aspects of model building while it handles the technical intricacies of code generation and optimization.
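As a rough sketch of this per-node code generation, each component could expose a method that emits framework-specific code. The class and method names below are hypothetical, invented for illustration; the platform's actual internals are not shown in this document:

```python
# Hypothetical sketch of per-node code generation. DenseNode and
# to_code() are invented names, not the platform's real API.

class DenseNode:
    def __init__(self, layers, neurons):
        self.layers = layers
        self.neurons = neurons

    def to_code(self, framework):
        """Emit one line of framework-specific layer code per layer."""
        if framework == "tensorflow":
            line = f"tf.keras.layers.Dense({self.neurons}, activation='relu')"
        elif framework == "pytorch":
            line = f"nn.Linear(in_features, {self.neurons})"
        else:
            raise ValueError(f"unsupported framework: {framework}")
        return [line] * self.layers

node = DenseNode(layers=3, neurons=30)
for line in node.to_code("tensorflow"):
    print(line)
```

A real implementation would also thread shapes and global parameters between nodes; the sketch only shows the dispatch on the target framework.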
Below is a screenshot of a small sample design canvas with four nodes: a globals node defining global parameters such as learning rate and batch size, an input node selecting which feature to use as input, a dense node with 3 layers of 30 neurons each, and an output node specifying the label to use.
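For illustration, the JSON behind such a canvas might look something like the following. The schema here is an assumption made for this example, not the platform's documented format:

```python
import json

# Assumed (not official) canvas schema for the four-node example:
# globals, input, dense, and output nodes.
canvas = {
    "nodes": [
        {"type": "globals", "learning_rate": 0.001, "batch_size": 32},
        {"type": "input", "feature": "x"},
        {"type": "dense", "layers": 3, "neurons": 30},
        {"type": "output", "label": "y"},
    ]
}

print(json.dumps(canvas, indent=2))
```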
To compile a canvas in the app, simply click the 'Compile' button.
Below, though, is a screenshot of Python code in a Jupyter notebook demonstrating how to do the same thing manually: load a canvas from disk (saved as a JSON file) and transpile it into a TensorFlow model.
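A minimal sketch of the load step, assuming a canvas file shaped like the example schema above (the transpile call itself is the platform's own code and is not reproduced here):

```python
import json
from pathlib import Path

# Write a toy canvas file so this example is self-contained; in
# practice the file is saved to disk by the design app.
path = Path("canvas.json")
path.write_text(json.dumps({
    "nodes": [
        {"type": "globals", "learning_rate": 0.001, "batch_size": 32},
        {"type": "input", "feature": "x"},
        {"type": "dense", "layers": 3, "neurons": 30},
        {"type": "output", "label": "y"},
    ]
}))

# Load the canvas and sanity-check the node types before handing it
# off to the transpiler.
canvas = json.loads(path.read_text())
node_types = [n["type"] for n in canvas["nodes"]]
print(node_types)
```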
And here, we manually recreate the same model in TensorFlow/Keras code.
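The hand-written recreation is roughly the following. The input width, activation function, optimizer, and loss are assumptions for the sake of a runnable example, since the canvas description does not state them:

```python
import tensorflow as tf

def build_model(n_features=10):
    # One input, three hidden Dense layers of 30 neurons each (the
    # dense node on the canvas), and a single output unit.
    model = tf.keras.Sequential([
        tf.keras.Input(shape=(n_features,)),
        tf.keras.layers.Dense(30, activation="relu"),
        tf.keras.layers.Dense(30, activation="relu"),
        tf.keras.layers.Dense(30, activation="relu"),
        tf.keras.layers.Dense(1),
    ])
    # The learning rate corresponds to the canvas globals node.
    model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=0.001),
                  loss="mse")
    return model

model = build_model()
model.summary()
```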
Note that the model created via the canvas is functionally identical to the model created by hand in code.