log_model(estimator, artifact_path, conda_env=None, code_paths=None, registered_model_name=None, signature=None, input_example=None, await_registration_for=None, pip_requirements=None, extra_pip_requirements=None, serialization_format='pickle', **kwargs)

Log an aeon model as an MLflow artifact for the current run.

estimator : fitted aeon model

Fitted aeon model object.


artifact_path : str

Run-relative artifact path to save the model to.

conda_env : Union[dict, str], optional (default=None)

Either a dictionary representation of a Conda environment or the path to a Conda environment YAML file.
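As an illustrative sketch, a conda_env dictionary might look as follows; the environment name, channels, and version pins below are assumptions, not requirements of aeon or MLflow:

```python
# Hedged sketch of a dict suitable for log_model's conda_env argument.
# Package names and version pins are illustrative assumptions.
conda_env = {
    "name": "aeon-mlflow-env",
    "channels": ["conda-forge"],
    "dependencies": [
        "python=3.10",
        "pip",
        {"pip": ["aeon", "mlflow"]},
    ],
}
```

Alternatively, conda_env can be the path to an equivalent YAML file on the local filesystem.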

code_paths : array-like, optional (default=None)

A list of local filesystem paths to Python file dependencies (or directories containing file dependencies). These files are prepended to the system path when the model is loaded.

registered_model_name : str, optional (default=None)

If given, create a model version under registered_model_name, also creating a registered model if one with the given name does not exist.

signature : mlflow.models.signature.ModelSignature, optional (default=None)

A model signature (mlflow.models.ModelSignature) describes the model's input and output schema. The signature can be inferred from datasets with valid model input (e.g. the training dataset with the target column omitted) and valid model output (e.g. model predictions generated on the training dataset), for example:

from mlflow.models.signature import infer_signature
train = df.drop(columns=["target_label"])
predictions = ...  # compute model predictions
signature = infer_signature(train, predictions)


If performing probabilistic forecasts (predict_interval, predict_quantiles) with an aeon model, the signature on the returned prediction object will not be inferred correctly, due to the pandas MultiIndex column type these methods produce. infer_schema will function correctly if using the pyfunc flavor of the model, though. The pyfunc flavor of the model supports the aeon predict methods predict, predict_interval, predict_quantiles and predict_var, while predict_proba and predict_residuals are currently not supported.

input_example : Union[pandas.core.frame.DataFrame, numpy.ndarray, dict, list, csr_matrix, csc_matrix], optional (default=None)

An input example provides one or several instances of valid model input. The example can be used as a hint of what data to feed the model. The given example will be converted to a pandas DataFrame and then serialized to JSON using the pandas split-oriented format. Bytes are base64-encoded.
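For instance, a small input example could be supplied as a plain dict of columns, which MLflow converts to a DataFrame before serialization; the column name and values below are illustrative assumptions:

```python
# Hedged sketch of a dict-of-columns value for log_model's
# input_example argument. The column name "y" and the values
# are illustrative assumptions, not an aeon convention.
input_example = {"y": [112.0, 118.0, 132.0, 129.0]}
```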

await_registration_for : int, optional (default=None)

Number of seconds to wait for the model version to finish being created and reach READY status. By default, the function waits for five minutes. Specify 0 or None to skip waiting.

pip_requirements : Union[Iterable, str], optional (default=None)

Either an iterable of pip requirement strings (e.g. ["aeon", "-r requirements.txt", "-c constraints.txt"]) or the string path to a pip requirements file on the local filesystem (e.g. "requirements.txt").

extra_pip_requirements : Union[Iterable, str], optional (default=None)

Either an iterable of pip requirement strings (e.g. ["pandas", "-r requirements.txt", "-c constraints.txt"]) or the string path to a pip requirements file on the local filesystem (e.g. "requirements.txt").
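The two accepted forms can be sketched as follows; the file names are illustrative placeholders:

```python
# Hedged sketch of the two accepted forms for pip_requirements /
# extra_pip_requirements. File names are illustrative placeholders.

# 1) An iterable of pip requirement strings:
pip_requirements = ["aeon", "-r requirements.txt", "-c constraints.txt"]

# 2) A string path to a pip requirements file on the local filesystem:
pip_requirements_path = "requirements.txt"
```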

serialization_format : str, optional (default="pickle")

The format in which to serialize the model. This should be one of the formats "pickle" or "cloudpickle".


**kwargs

Additional arguments for mlflow.models.model.Model.

A ModelInfo instance that contains the metadata of the logged model.


>>> import mlflow  
>>> from mlflow.utils.environment import _mlflow_conda_env  
>>> from aeon.datasets import load_airline  
>>> from aeon.forecasting.arima import ARIMA  
>>> from aeon.utils import mlflow_aeon  
>>> y = load_airline()  
>>> forecaster = ARIMA(  
...     order=(1, 1, 0),
...     seasonal_order=(0, 1, 0, 12),
...     suppress_warnings=True)
>>> forecaster.fit(y)  
>>> mlflow.start_run()  
>>> artifact_path = "model"  
>>> model_info = mlflow_aeon.log_model(
...     estimator=forecaster,
...     artifact_path=artifact_path)
>>> mlflow.end_run()