ML Programs

This page describes the ML program model type. It is an evolution of the neural network model type that has been available since the first version of Core ML. This page describes how to convert models to ML programs, set the ML program precision, save ML programs as model packages, and find the model type in a model package.

📘

Foundation for future improvements

Core ML is investing in the ML program model type as a foundation for future improvements. ML programs are available for the iOS 15, macOS 12, watchOS 8, and tvOS 15 deployment targets. For details, see Availability of ML programs.

The ML program type is recommended for newer deployment targets. You can also use the neural network type, which is supported in iOS 13 / macOS 10.15 and above. For a comparison, see Comparing ML programs to neural networks.

Convert Models to ML Programs

You can convert a TensorFlow or PyTorch model, or a model created directly in the Model Intermediate Language (MIL), to a Core ML model that is either an ML program or a neural network. The Unified Conversion API can produce either type of model with the convert() method.
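For example, a traced PyTorch model can be passed directly to convert(). The following is a minimal sketch; the toy model and input shape are placeholders for your own model:

import torch
import coremltools as ct

# a toy stand-in model; substitute your own torch.nn.Module
torch_model = torch.nn.Sequential(
    torch.nn.Linear(10, 20),
    torch.nn.ReLU(),
).eval()

# trace the model with a representative input
example_input = torch.rand(1, 10)
traced_model = torch.jit.trace(torch_model, example_input)

# convert the traced model; this produces an ML program by default
# with Core ML Tools 7.0 and newer
model = ct.convert(traced_model,
                   inputs=[ct.TensorType(shape=(1, 10))])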

Convert to an ML Program

In Core ML Tools 7.0 and newer versions, the convert() method produces an mlprogram by default:

# Convert to an ML Program
import coremltools as ct  # Core ML Tools version 7.0
model = ct.convert(source_model)

The above example produces an mlprogram with an iOS 15 / macOS 12 deployment target (or newer). You can override this behavior by providing a minimum_deployment_target value, such as minimum_deployment_target=ct.target.iOS14 or older.
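You can also pin a newer deployment target explicitly. The following is a minimal sketch; iOS 16 is only an illustration, and source_model is the same placeholder used above:

import coremltools as ct

# any target of iOS 15 / macOS 12 or newer still yields an ML program
model = ct.convert(source_model,
                   minimum_deployment_target=ct.target.iOS16)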

Convert to a Neural Network

With versions of Core ML Tools older than 7.0, if you didn't specify the model type, or your minimum_deployment_target was a version older than iOS 15, macOS 12, watchOS 8, or tvOS 15, the model was converted by default to a neural network.

To convert to a neural network using Core ML Tools version 7.0 or newer, specify the model type with the convert_to parameter, as shown in the following example:

import coremltools as ct  # Core ML Tools version 7.0
# provide the "convert_to" argument to convert to a neural network
model = ct.convert(source_model, convert_to="neuralnetwork")

Alternatively, you can use the minimum_deployment_target parameter to specify a target such as minimum_deployment_target=ct.target.iOS14 or older, as shown in the following example:

import coremltools as ct  # Core ML Tools version 7.0
# provide the "minimum_deployment_target" argument to convert to a neural network
model = ct.convert(source_model, 
                   minimum_deployment_target=ct.target.iOS14)

(Optional) Set the ML Program Precision

You can optionally set the precision type (float 16 or float 32) of the weights and the intermediate tensors in the ML program during conversion. The ML program type offers an additional compute_precision parameter as shown in the following example:

# produce a Float 16 typed model
# this is also the default if compute_precision argument is skipped
model = ct.convert(source_model, 
                   convert_to="mlprogram", 
                   compute_precision=ct.precision.FLOAT16)
                    
# produce a Float 32 typed model,
# useful if the model needs higher precision, and float 16 is not sufficient 
model = ct.convert(source_model, 
                   convert_to="mlprogram", 
                   compute_precision=ct.precision.FLOAT32)

For details on ML program precision, see Typed Execution.

📘

Float 16 Default

For ML programs, Core ML Tools versions 5.0b3 and newer produce a model with float 16 precision by default (previous beta versions produced float 32 by default). You can override the default precision by using the compute_precision parameter of coremltools.convert().

Save ML Programs as Model Packages

The ML program type uses the Core ML model package container format that separates the model into components and offers more flexible metadata editing. Since an ML program decouples the weights from the program architecture, it cannot be saved as an .mlmodel file.

Use the save() method to save a file with the .mlpackage extension, as shown in the following example:

model.save("my_model.mlpackage")
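Because the model package keeps metadata in an editable component, you can also set descriptive fields on the converted model before saving. The following is a minimal sketch with placeholder values:

# set metadata on the converted model (placeholder values)
model.author = "Jane Appleseed"
model.short_description = "Example model converted to an ML program"
model.version = "1.0"

# save the model package with the updated metadata
model.save("my_model.mlpackage")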

🚧

Requires Xcode 13 and Newer

The model package format is supported in Xcode 13 and newer.

Find the Model Type in a Model Package

If you need to determine whether an mlpackage file contains a neural network or an ML program, you can open it in Xcode 13 or newer and look at the model type.

On a Linux system, you can use Core ML Tools 5 or newer to inspect this property programmatically:

import coremltools as ct

# load the MLModel object from the model package
model = ct.models.MLModel("model.mlpackage")

# get the spec object and print the model type
spec = model.get_spec()
print("model type: {}".format(spec.WhichOneof('Type')))
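For an ML program, this typically prints mlProgram; for a neural network model, it prints neuralNetwork (or one of its classifier or regressor variants).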

Availability of ML Programs

The ML program model type is available as summarized in the following table:

                           Neural Network            ML Program
Minimum deployment target  macOS 10.13, iOS 11,      macOS 12, iOS 15,
                           watchOS 4, tvOS 11        watchOS 8, tvOS 15
Supported file formats     .mlmodel or .mlpackage    .mlpackage