ML Programs
This page describes the ML program model type, an evolution of the neural network model type that has been available since the first version of Core ML. It explains how to convert models to ML programs, set the ML program precision, save ML programs as model packages, and find the model type in a model package.
Foundation for future improvements
Core ML is investing in the ML program model type as a foundation for future improvements. ML programs are available for the iOS 15, macOS 12, watchOS 8, and tvOS 15 deployment targets. For details, see Availability of ML Programs.
The ML program type is recommended for newer deployment targets. You can also use the neural network type, which is supported in iOS 13 / macOS 10.15 and above. For a comparison, see Comparing ML Programs to Neural Networks.
Convert Models to ML Programs
You can convert a TensorFlow or PyTorch model, or a model created directly in the Model Intermediate Language (MIL), to a Core ML model that is either an ML program or a neural network. The Unified Conversion API can produce either type of model with the convert() method.
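For example, the source_model used throughout this page could be a traced PyTorch model. The following is a minimal sketch of that workflow; the torchvision model, input shape, and variable names are illustrative assumptions rather than part of the examples below.
import torch
import torchvision
import coremltools as ct
# trace a PyTorch model to TorchScript to use as the conversion source
# (the torchvision model and input shape are illustrative choices)
torch_model = torchvision.models.mobilenet_v2(pretrained=True).eval()
example_input = torch.rand(1, 3, 224, 224)
source_model = torch.jit.trace(torch_model, example_input)
# the Unified Conversion API accepts the traced model; for PyTorch sources
# the input shape must be provided explicitly
model = ct.convert(
    source_model,
    inputs=[ct.TensorType(shape=example_input.shape)],
)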
Convert to the Default Neural Network
As with previous versions of coremltools, if you don't specify the model type, or your minimum_deployment_target is a version older than iOS 15, macOS 12, watchOS 8, or tvOS 15, the TensorFlow or PyTorch model is converted to a neural network, as shown in the following example:
import coremltools as ct
# conversion to the neural network format
# by default a neural network is produced; this is the same as in coremltools 4
model = ct.convert(source_model)
Convert to an ML Program
To convert a TensorFlow or PyTorch model to an ML program, do one of the following:
Specify the Model Type Directly
To convert a TensorFlow or PyTorch model to an ML program, you can specify the model type with the convert_to parameter, as shown in the following example:
# provide the "convert_to" argument to convert to ML Programs
model = ct.convert(source_model, convert_to="mlprogram")
Specify the Minimum Deployment Target
Since ML programs are available in iOS 15 and macOS 12, you can instead use the minimum_deployment_target parameter, as shown in the following example:
# provide the "minimum_deployment_target" argument to convert to ML Programs
model = ct.convert(source_model,
minimum_deployment_target=ct.target.iOS15)
# or
model = ct.convert(source_model,
minimum_deployment_target=ct.target.macOS12)
(Optional) Set the ML Program Precision
You can optionally set the precision type (float 16 or float 32) of the weights and the intermediate tensors in the ML program during conversion. The ML program type offers an additional compute_precision parameter, as shown in the following example:
# produce a Float 16 typed model
# this is also the default if compute_precision argument is skipped
model = ct.convert(source_model,
convert_to="mlprogram",
compute_precision=ct.precision.FLOAT16)
# produce a Float 32 typed model,
# useful if the model needs higher precision, and float 16 is not sufficient
model = ct.convert(source_model,
convert_to="mlprogram",
compute_precision=ct.precision.FLOAT32)
For details on ML program precision, see Typed Execution.
Float 16 Default
For ML programs, coremltools 5.0b3 and newer produces a model with float 16 precision by default (previous beta versions produced float 32 by default). You can override the default precision by using the compute_precision parameter of coremltools.convert().
Save ML Programs as Model Packages
The ML program type uses the Core ML model package container format, which separates the model into components and offers more flexible metadata editing. Since an ML program decouples the weights from the program architecture, it cannot be saved as an .mlmodel file.
Use the save() method to save a file with the .mlpackage extension, as shown in the following example:
model.save("my_model.mlpackage")
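To sanity-check the saved package, you can load it back and run a prediction. The following is a minimal sketch, assuming a single tensor input; the input name "input_1" and the shape are placeholders that must match your model's actual input description, and prediction requires macOS.
import numpy as np
# load the saved model package back into an MLModel object
loaded_model = ct.models.MLModel("my_model.mlpackage")
# run a prediction (supported on macOS only); the input name and shape
# below are placeholders for your model's actual input
x = np.random.rand(1, 3, 224, 224).astype(np.float32)
output = loaded_model.predict({"input_1": x})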
Requires Xcode 13 and Newer
The model package format is supported in Xcode 13 and newer.
Find the Model Type in a Model Package
If you need to determine whether an .mlpackage file contains a neural network or an ML program, you can open it in Xcode 13 and look at the model type. On a Linux system, you can use Core ML Tools 5 or newer to inspect this property:
# load MLModel object
model = ct.models.MLModel("model.mlpackage")
# get the spec object
spec = model.get_spec()
print("model type: {}".format(spec.WhichOneof('Type')))
Availability of ML Programs
The ML program model type is available as summarized in the following table:
| | Neural Network | ML Program |
|---|---|---|
| Minimum deployment target | macOS 10.13, iOS 11, watchOS 4, tvOS 11 | macOS 12, iOS 15, watchOS 8, tvOS 15 |
| Supported file formats | .mlmodel or .mlpackage | .mlpackage |