InformationGainSegmenter¶
- class InformationGainSegmenter(k_max: int = 10, step: int = 5)[source]¶
Information Gain based Temporal Segmentation (GTS) Estimator.
GTS is an unsupervised method for segmenting multivariate time series into non-overlapping segments by locating change points for which the information gain is maximized.
Information gain (IG) is defined as the amount of entropy lost by the segmentation. The aim is to find the segmentation that has the maximum information gain for a specified number of segments.
GTS uses a top-down search method to greedily find the next change point location that creates the maximum information gain. Once this is found, it repeats the process until it finds k_max splits of the time series.
Note
GTS does not work very well for univariate series, but it can still be used if the original univariate series is augmented by an extra feature dimension. A technique proposed in the paper [1] is to subtract the series from its largest element and append the result to the series, as sketched below.
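For instance, a minimal sketch of this augmentation, assuming a 1D numpy array x (the variable names are illustrative only):
>>> import numpy as np
>>> x = np.array([1.0, 3.0, 2.0, 5.0, 4.0])
>>> # append (max(x) - x) as a second channel, per the note above
>>> X_aug = np.column_stack([x, x.max() - x])
>>> X_aug.shape
(5, 2)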
- Parameters:
- k_max: int, default=10
Maximum number of change points to find. The number of segments is thus k+1.
- step: int, default=5
Step size, or stride, for selecting candidate locations of change points. For example, step=5 would produce candidates [0, 5, 10, …]. Has the same meaning as step in the range function.
- Attributes:
- change_points_: list of int
Locations of change points as integer indexes. By convention, change points include the identity segmentation, i.e. the first index and the last index + 1.
- intermediate_results_: list of `ChangePointResult`
Intermediate segmentation results for each k value, where k=1, 2, …, k_max
Notes
Based on the work from [1].
- Alternative Python implementation: https://github.com/cruiseresearchgroup/IGTS-python
- MATLAB version: https://github.com/cruiseresearchgroup/IGTS-matlab
- Paper: see reference [1] below.
References
[1] Sadri, Amin, Yongli Ren, and Flora D. Salim. “Information gain-based metric for recognizing transitions in human activities.”, Pervasive and Mobile Computing, 38, 92-109, (2017). https://www.sciencedirect.com/science/article/abs/pii/S1574119217300081
Examples
>>> from aeon.testing.data_generation import make_example_dataframe_series
>>> from sklearn.preprocessing import MinMaxScaler
>>> from aeon.segmentation import InformationGainSegmenter
>>> X = make_example_dataframe_series(n_channels=2, random_state=10)
>>> X_scaled = MinMaxScaler(feature_range=(0, 1)).fit_transform(X)
>>> igts = InformationGainSegmenter(k_max=3, step=2)
>>> y = igts.fit_predict(X_scaled, axis=0)
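As a follow-up sketch (assuming igts has been fitted as above), the fitted attributes documented earlier can then be inspected:
>>> cps = igts.change_points_              # integer indexes of the change points
>>> results = igts.intermediate_results_   # one entry per k = 1, ..., k_max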
Methods
clone([random_state])
Obtain a clone of the object with the same hyperparameters.
fit(X[, y, axis])
Fit time series segmenter to X.
fit_predict(X[, y, axis])
Fit segmentation to data and return it.
get_class_tag(tag_name[, raise_error, ...])
Get tag value from estimator class (only class tags).
get_class_tags()
Get class tags from estimator class and all its parent classes.
get_fitted_params([deep])
Get fitted parameters.
get_metadata_routing()
Sklearn metadata routing.
get_params([deep])
Get parameters for this estimator.
get_tag(tag_name[, raise_error, ...])
Get tag value from estimator class.
get_tags()
Get tags from estimator.
predict(X[, axis])
Create and return segmentation of X.
reset([keep])
Reset the object to a clean post-init state.
set_params(**params)
Set the parameters of this estimator.
set_tags(**tag_dict)
Set dynamic tags to given values.
to_classification(change_points, length)
Convert change point locations to a classification vector.
to_clusters(change_points, length)
Convert change point locations to a clustering vector.
- clone(random_state=None)[source]¶
Obtain a clone of the object with the same hyperparameters.
A clone is a different object without shared references, in post-init state. This function is equivalent to returning sklearn.clone of self. Equal in value to type(self)(**self.get_params(deep=False)).
- Parameters:
- random_state: int, RandomState instance, or None, default=None
Sets the random state of the clone. If None, the random state is not set. If int, random_state is the seed used by the random number generator. If RandomState instance, random_state is the random number generator.
- Returns:
- estimator: object
Instance of type(self), clone of self (see above).
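For example, a minimal sketch (clone copies only the hyperparameters, not any fitted state):
>>> from aeon.segmentation import InformationGainSegmenter
>>> igts = InformationGainSegmenter(k_max=3, step=2)
>>> igts_clone = igts.clone()
>>> igts_clone.get_params(deep=False) == igts.get_params(deep=False)
True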
- fit(X, y=None, axis=1)[source]¶
Fit time series segmenter to X.
If the tag fit_is_empty is true, this just sets the is_fitted tag to true. Otherwise, it checks self can handle X, formats X into the structure required by self, then passes X (and possibly y) to _fit.
- Parameters:
- X: One of VALID_SERIES_INPUT_TYPES
Input time series to fit a segmenter.
- y: One of VALID_SERIES_INPUT_TYPES or None, default=None
Training time series, a labeled 1D series the same length as X, for supervised segmentation.
- axis: int, default=None
Axis along which to segment if passed a multivariate X series (2D input). If axis is 0, it is assumed each column is a time series and each row is a time point, i.e. the shape of the data is (n_timepoints, n_channels). axis == 1 indicates the time series are in rows, i.e. the shape of the data is (n_channels, n_timepoints). axis is None indicates that the axis of X is the same as self.axis.
- Returns:
- self
Fitted estimator
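For example, a minimal sketch reusing the data helpers from the Examples section above:
>>> from aeon.testing.data_generation import make_example_dataframe_series
>>> from sklearn.preprocessing import MinMaxScaler
>>> from aeon.segmentation import InformationGainSegmenter
>>> X = make_example_dataframe_series(n_channels=2, random_state=10)
>>> X_scaled = MinMaxScaler(feature_range=(0, 1)).fit_transform(X)
>>> igts = InformationGainSegmenter(k_max=3, step=2)
>>> igts = igts.fit(X_scaled, axis=0)   # returns the fitted estimator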
- classmethod get_class_tag(tag_name, raise_error=True, tag_value_default=None)[source]¶
Get tag value from estimator class (only class tags).
- Parameters:
- tag_name: str
Name of tag value.
- raise_error: bool, default=True
Whether a ValueError is raised when the tag is not found.
- tag_value_default: any type, default=None
Default/fallback value if tag is not found and error is not raised.
- Returns:
- tag_value
Value of the tag_name tag in cls. If not found, an error is raised if raise_error is True, otherwise tag_value_default is returned.
- Raises:
- ValueError
if raise_error is True and tag_name is not in self.get_tags().keys()
Examples
>>> from aeon.classification import DummyClassifier
>>> DummyClassifier.get_class_tag("capability:multivariate")
True
- classmethod get_class_tags()[source]¶
Get class tags from estimator class and all its parent classes.
- Returns:
- collected_tags: dict
Dictionary of tag name and tag value pairs. Collected from the _tags class attribute via nested inheritance. These are not overridden by dynamic tags set by set_tags or class __init__ calls.
- get_fitted_params(deep=True)[source]¶
Get fitted parameters.
- State required:
Requires state to be “fitted”.
- Parameters:
- deep: bool, default=True
If True, will return the fitted parameters for this estimator and contained subobjects that are estimators.
- Returns:
- fitted_params: dict
Fitted parameter names mapped to their values.
- get_params(deep=True)[source]¶
Get parameters for this estimator.
- Parameters:
- deep: bool, default=True
If True, will return the parameters for this estimator and contained subobjects that are estimators.
- Returns:
- params: dict
Parameter names mapped to their values.
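For example, a minimal sketch (the returned dictionary holds the constructor arguments documented above):
>>> from aeon.segmentation import InformationGainSegmenter
>>> igts = InformationGainSegmenter(k_max=3, step=2)
>>> params = igts.get_params()
>>> params["k_max"], params["step"]
(3, 2)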
- get_tag(tag_name, raise_error=True, tag_value_default=None)[source]¶
Get tag value from estimator class.
Includes dynamic and overridden tags.
- Parameters:
- tag_name: str
Name of tag to be retrieved.
- raise_error: bool, default=True
Whether a ValueError is raised when the tag is not found.
- tag_value_default: any type, default=None
Default/fallback value if tag is not found and error is not raised.
- Returns:
- tag_value
Value of the tag_name tag in self. If not found, an error is raised if raise_error is True, otherwise tag_value_default is returned.
- Raises:
- ValueError
if raise_error is True and tag_name is not in self.get_tags().keys()
Examples
>>> from aeon.classification import DummyClassifier
>>> d = DummyClassifier()
>>> d.get_tag("capability:multivariate")
True
- get_tags()[source]¶
Get tags from estimator.
Includes dynamic and overridden tags.
- Returns:
- collected_tags: dict
Dictionary of tag name and tag value pairs. Collected from the _tags class attribute via nested inheritance and then any overridden and new tags from __init__ or set_tags.
- predict(X, axis=1)[source]¶
Create and return segmentation of X.
- Parameters:
- X: One of VALID_SERIES_INPUT_TYPES
Input time series
- axis: int, default=None
Axis along which to segment if passed a multivariate series (2D input) with n_channels time series. If axis is 0, it is assumed each column is a time series and each row is a time point, i.e. the shape of the data is (n_timepoints, n_channels). axis == 1 indicates the time series are in rows, i.e. the shape of the data is (n_channels, n_timepoints). axis is None indicates that the axis of X is the same as self.axis.
- Returns:
- List
Either a list of indexes of X indicating where each segment begins, or a list of integers of len(X) indicating which segment each time point belongs to.
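For example, a minimal sketch that fits on a scaled series and then segments it (predict may equally be applied to another series of the same shape):
>>> from aeon.testing.data_generation import make_example_dataframe_series
>>> from sklearn.preprocessing import MinMaxScaler
>>> from aeon.segmentation import InformationGainSegmenter
>>> X = make_example_dataframe_series(n_channels=2, random_state=10)
>>> X_scaled = MinMaxScaler(feature_range=(0, 1)).fit_transform(X)
>>> igts = InformationGainSegmenter(k_max=3, step=2).fit(X_scaled, axis=0)
>>> y = igts.predict(X_scaled, axis=0)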
- reset(keep=None)[source]¶
Reset the object to a clean post-init state.
After a self.reset() call, self is equal or similar in value to type(self)(**self.get_params(deep=False)), assuming no other attributes were kept using keep.
- Detailed behaviour:
- removes any object attributes, except:
hyper-parameters (arguments of __init__);
object attributes containing double-underscores, i.e., the string “__”
- runs __init__ with current values of hyperparameters (result of get_params)
- Not affected by the reset are:
object attributes containing double-underscores;
class and object methods, class attributes;
any attributes specified in the keep argument
- Parameters:
- keep: None, str, or list of str, default=None
If None, all attributes are removed except hyperparameters. If str, only the attribute with this name is kept. If list of str, only the attributes with these names are kept.
- Returns:
- self: object
Reference to self.
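For example, a minimal sketch (after reset the hyperparameters are kept, while any fitted state is removed):
>>> from aeon.segmentation import InformationGainSegmenter
>>> igts = InformationGainSegmenter(k_max=3, step=2)
>>> igts = igts.reset()
>>> igts.get_params()["k_max"]
3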
- set_params(**params)[source]¶
Set the parameters of this estimator.
The method works on simple estimators as well as on nested objects (such as Pipeline). The latter have parameters of the form <component>__<parameter> so that it’s possible to update each component of a nested object.
- Parameters:
- **params: dict
Estimator parameters.
- Returns:
- self: estimator instance
Estimator instance.
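For example, a minimal sketch using the hyperparameters documented above:
>>> from aeon.segmentation import InformationGainSegmenter
>>> igts = InformationGainSegmenter()
>>> igts = igts.set_params(k_max=5, step=10)
>>> igts.get_params()["k_max"]
5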
- set_tags(**tag_dict)[source]¶
Set dynamic tags to given values.
- Parameters:
- **tag_dict: dict
Dictionary of tag name and tag value pairs.
- Returns:
- self: object
Reference to self.
- classmethod to_classification(change_points: list[int], length: int)[source]¶
Convert change point locations to a classification vector.
Change point detection results can be treated as classification, with true change point locations marked with 1’s at the positions of the change points and the remaining non-change point locations being 0’s.
For example change points [2, 8] for a time series of length 10 would result in: [0, 0, 1, 0, 0, 0, 0, 0, 1, 0].
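For example, a minimal sketch reproducing the case described above (the result is shown as a comment, since the exact return type is not specified here):
>>> from aeon.segmentation import InformationGainSegmenter
>>> y_cls = InformationGainSegmenter.to_classification([2, 8], length=10)
>>> # expected labelling per the description above: [0, 0, 1, 0, 0, 0, 0, 0, 1, 0]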
- classmethod to_clusters(change_points: list[int], length: int)[source]¶
Convert change point locations to a clustering vector.
Change point detection results can be treated as clustering with each segment separated by change points assigned a distinct dummy label.
For example change points [2, 8] for a time series of length 10 would result in: [0, 0, 1, 1, 1, 1, 1, 1, 2, 2].
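For example, a minimal sketch reproducing the case described above (the result is shown as a comment, since the exact return type is not specified here):
>>> from aeon.segmentation import InformationGainSegmenter
>>> y_clust = InformationGainSegmenter.to_clusters([2, 8], length=10)
>>> # expected labelling per the description above: [0, 0, 1, 1, 1, 1, 1, 1, 2, 2]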