Deployment and MLOps#

This reference lists aeon integrations with deployment and MLOps solutions.

Currently, aeon natively supports integration with MLflow.

See examples/mlflow for a notebook tutorial.

MLflow#

aeon.utils.mlflow_aeon

The mlflow_aeon module provides an MLflow API for aeon forecasters.

This module exports aeon models in the following formats:

aeon (native) format

This is the main flavor that can be loaded back into aeon, which relies on pickle internally to serialize a model.

mlflow.pyfunc

Produced for use by generic pyfunc-based deployment tools and batch inference.

The pyfunc flavor of the model supports the aeon predict methods predict, predict_interval, predict_proba, predict_quantiles, and predict_var.

The interface for using an aeon model loaded as a pyfunc to generate forecasts requires passing an exogenous regressor as a Pandas DataFrame to the pyfunc.predict() method (an empty DataFrame must be passed if no exogenous regressor is used). The predict methods to call, and the parameter values passed to them, are defined by a dictionary saved as an attribute of the fitted aeon model instance. If no prediction configuration is defined, pyfunc.predict() returns the output of the aeon predict method. Note that for the pyfunc flavor the forecasting horizon fh must be passed to the fit method.

Predict methods and parameter values for the pyfunc flavor can be defined in two ways: as a Dict[str, dict] when parameter values are passed to pyfunc.predict(), for example {"predict_method": {"predict": {}, "predict_interval": {"coverage": [0.1, 0.9]}}}; or as a Dict[str, list], where the listed predict methods use their default parameters, for example {"predict_method": ["predict", "predict_interval"]}. Note that when including the predict_proba method, the former approach must be used, since the quantiles parameter has to be provided by the user.
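The two configuration forms above can be sketched in plain Python as follows. The attribute name used to attach the configuration to the fitted model is not stated here, so the commented assignment at the end is an assumption for illustration only:

```python
# Form 1: Dict[str, dict] -- explicit parameter values per predict method.
conf_with_params = {
    "predict_method": {
        "predict": {},
        "predict_interval": {"coverage": [0.1, 0.9]},
    }
}

# Form 2: Dict[str, list] -- the listed methods run with default parameters.
conf_defaults = {"predict_method": ["predict", "predict_interval"]}

# The dictionary is then saved as an attribute of the fitted model instance
# before saving/logging; the attribute name below is a hypothetical example.
# fitted_forecaster.pyfunc_predict_conf = conf_with_params
```

Either dictionary is attached to the fitted model before save_model or log_model is called, and pyfunc.predict() then returns the outputs of the configured methods.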

get_default_pip_requirements([...])

Create a list of default pip requirements for MLflow Models.

get_default_conda_env([include_cloudpickle])

Return the default Conda environment for MLflow Models.

save_model(estimator, path[, conda_env, ...])

Save an aeon model to a path on the local file system.

log_model(estimator, artifact_path[, ...])

Log an aeon model as an MLflow artifact for the current run.

load_model(model_uri[, dst_path])

Load an aeon model from a local file or a run.
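The functions above compose into a simple save-and-reload round trip. The sketch below assumes aeon (with NaiveForecaster, used purely as an illustrative estimator) and mlflow are installed, and quietly skips if they are not:

```python
status = "skipped"  # stays "skipped" if aeon/mlflow are unavailable
try:
    import numpy as np
    from aeon.forecasting.naive import NaiveForecaster
    from aeon.utils import mlflow_aeon

    y = np.arange(30, dtype=float)
    forecaster = NaiveForecaster()
    # fh must be passed to fit if the model will be used via the pyfunc flavor
    forecaster.fit(y, fh=[1, 2, 3])

    # Save the fitted model in the native aeon flavor, then load it back.
    mlflow_aeon.save_model(estimator=forecaster, path="naive_model")
    loaded = mlflow_aeon.load_model(model_uri="naive_model")
    print(loaded.predict())
    status = "ok"
except Exception:
    pass
```

Inside an active MLflow run, log_model(estimator=forecaster, artifact_path="model") would record the model as a run artifact instead of writing it to a local path.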