Cross Entropy Transform
The cross-entropy transform computes the cross entropy between two discrete probability distributions. The cross entropy is defined as:

\[
H(p, q) = -\sum_{i} p_i \log q_i
\]

where \(p\) and \(q\) are the two probability distributions. The cross entropy measures how different \(q\) is from \(p\). It is always non-negative, i.e. \(H(p, q) \geq 0\), and it is minimized (equal to the entropy of \(p\)) exactly when the two distributions are identical.
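As a minimal illustration of the formula above (a plain NumPy sketch, not the library's internal implementation), the cross entropy of two small distributions can be computed directly:

```python
import numpy as np

# Two small discrete probability distributions (each sums to 1)
p = np.array([0.2, 0.5, 0.3])
q = np.array([0.1, 0.6, 0.3])

# H(p, q) = -sum_i p_i * log(q_i); in nats, since the natural log is used here
h_pq = -np.sum(p * np.log(q))
print(h_pq)  # non-negative, and smallest (equal to the entropy of p) when q == p
```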
Bases: `Transform`

Compute the cross entropy of the values in `pk` with respect to `qk`.
`__call__(pk, qk, base=None, where=lambda x: not np.isnan(x))`

Compute the cross entropy of the values in `pk` with respect to `qk` where `where` is `True`.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `pk` | `ndarray` | A discrete probability distribution. | required |
| `qk` | `ndarray` | A second discrete probability distribution. | required |
| `base` | `Optional[Union[int, int_]]` | The base of the logarithm used to compute the cross entropy. Default is `None`. | `None` |
| `where` | `Callable[[Union[int, float, int_, float_]], Union[bool, bool_]]` | A function that takes a value and returns `True` if the value should be included in the computation. | `lambda x: not numpy.isnan(x)` |
Returns:

| Type | Description |
|---|---|
| `Union[float, float_]` | The cross entropy of the values in `pk` with respect to `qk`. |
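Following the `__call__` signature documented above, the transform can also be invoked directly on two arrays. The values below are hypothetical, and passing `base=2` is assumed to report the result in bits rather than nats:

```python
import numpy as np
import autonfeat as aft

# Hypothetical probability-like inputs
pk = np.array([0.2, 0.5, 0.3])
qk = np.array([0.1, 0.6, 0.3])

tf = aft.CrossEntropyTransform()

# Direct call using the documented signature; base=2 for a result in bits
value = tf(pk, qk, base=2)
print(value)
```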
Examples
import numpy as np
import autonfeat as aft
# Random data
n_samples = 100
x1 = np.random.rand(n_samples)
x2 = np.random.rand(n_samples)
# Sliding window
ws = 10
ss = 10
window = aft.SlidingWindow(window_size=ws, step_size=ss)
# Create transform
tf = aft.CrossEntropyTransform()
# Get featurizer
featurizer = window.use(tf)
# Get features
features = featurizer(x1, x2)
# Print features
print(features)
If you enjoy using `AutonFeat`, please consider starring the repository ⭐️.