ZScoreNormalize
- class torch_ecg.preprocessors.ZScoreNormalize(mean: Real | Iterable[Real] = 0.0, std: Real | Iterable[Real] = 1.0, per_channel: bool = False, inplace: bool = True, **kwargs: Any)[source]

Bases: Normalize
Z-score normalization.
Z-score normalization is defined as

\[\left(\frac{sig - \operatorname{mean}(sig)}{\operatorname{std}(sig)}\right) \cdot s + m\]

where \(m\) is the target mean and \(s\) is the target standard deviation of the normalized signal.

- Parameters:
mean (numbers.Real or array_like, default 0.0) – Target mean value of the normalized signal, or target mean values for each lead of the normalized signal.
std (numbers.Real or array_like, default 1.0) – Target standard deviation of the normalized signal, or target standard deviations for each lead of the normalized signal.
per_channel (bool, default False) – Whether to perform the normalization per channel.
inplace (bool, default True) – Whether to perform the normalization in-place.
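The formula above can be sketched in plain NumPy. This is a minimal illustration of the math, not the class's actual implementation (which operates on `torch.Tensor` inputs); the function name and the `(leads, siglen)` signal layout are assumptions.

```python
import numpy as np

def z_score_normalize(sig, mean=0.0, std=1.0, per_channel=False):
    """Rescale ``sig`` so it has the target mean and standard deviation.

    A hypothetical NumPy sketch of the formula above, assuming ``sig``
    has shape ``(leads, siglen)``.
    """
    # Compute statistics per lead (along the time axis) or over the whole signal.
    axis = -1 if per_channel else None
    sig_mean = np.mean(sig, axis=axis, keepdims=True)
    sig_std = np.std(sig, axis=axis, keepdims=True)
    # Target mean ``m`` and std ``s``; reshape so per-lead values broadcast
    # against the ``(leads, siglen)`` signal.
    m = np.asarray(mean, dtype=float).reshape(-1, 1)
    s = np.asarray(std, dtype=float).reshape(-1, 1)
    return (sig - sig_mean) / sig_std * s + m

# Example: normalize a 2-lead signal per channel to zero mean, unit std.
sig = np.array([[1.0, 2.0, 3.0, 4.0],
                [10.0, 20.0, 30.0, 40.0]])
out = z_score_normalize(sig, per_channel=True)
```

With `per_channel=False`, a single mean/std pair is computed over all leads jointly, so relative amplitude differences between leads are preserved.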