Cross Entropy Function

The cross-entropy function computes the cross entropy between two discrete probability distributions. The cross entropy is defined as:

\[ H(p, q) = -\sum_{x} p(x) \log q(x) \]

where \(p\) and \(q\) are the two probability distributions. Cross entropy measures how well the distribution \(q\) approximates the distribution \(p\): it is minimized when the two distributions are identical, in which case it equals the entropy of \(p\). For discrete distributions, the cross entropy is always non-negative, i.e. \(H(p, q) \geq 0\).
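To make the definition concrete, the sum can be evaluated directly with NumPy for two small hand-picked distributions. This is an illustrative sketch of the formula only, independent of the AutonFeat implementation:

import numpy as np

# Two small discrete distributions over the same three outcomes (illustrative values).
p = np.array([0.5, 0.25, 0.25])
q = np.array([0.4, 0.4, 0.2])

# Cross entropy from the definition, using the natural logarithm (nats).
h_pq = -np.sum(p * np.log(q))   # roughly 1.09 nats

# The minimum over q is attained at q = p, where H(p, q) equals the entropy of p.
h_pp = -np.sum(p * np.log(p))   # roughly 1.04 nats

print(h_pq, h_pp)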

Compute the cross entropy of the values in pk with respect to qk, using only the elements for which the where predicate returns True.

Parameters:

| Name | Type | Description | Default |
| ---- | ---- | ----------- | ------- |
| pk | ndarray | A discrete probability distribution. | required |
| qk | ndarray | A second discrete probability distribution. | required |
| base | Optional[Union[int, int_]] | The base of the logarithm used to compute the entropy. None means the natural logarithm is used. | None |
| where | Callable[[Union[int, float, int_, float_]], Union[bool, bool_]] | A function that takes a value and returns True or False. By default, a measurement is valid if it is not a NaN value. | lambda x: not np.isnan(x) |

Returns:

| Type | Description |
| ---- | ----------- |
| Union[float, float_] | The cross entropy of the values in pk with respect to qk, computed over the elements for which where is True. |

Examples

import numpy as np
import autonfeat as aft
import autonfeat.functional as F

# Random data
n_samples = 100
x1 = np.random.rand(n_samples)
x2 = np.random.rand(n_samples)

# Sliding window
ws = 10
ss = 10
window = aft.SlidingWindow(window_size=ws, step_size=ss)

# Get featurizer
featurizer = window.use(F.cross_entropy_tf)

# Get features
features = featurizer(x1, x2)

# Print features
print(features)
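With 100 samples and non-overlapping windows of size 10, the featurizer above should yield one cross-entropy value per window. The function can also be sketched as a direct call on a single pair of distributions, passing the base and where parameters documented above; the keyword names below are taken from the parameter table and should be treated as an assumption about the functional API rather than a verified call signature:

import numpy as np
import autonfeat.functional as F

# Two discrete distributions; one measurement in qk is missing (NaN).
pk = np.array([0.5, 0.25, 0.25])
qk = np.array([0.4, 0.4, np.nan])

# Cross entropy in bits (base=2), keeping only measurements for which `where`
# returns True. Keyword names follow the parameter table above (assumed).
h = F.cross_entropy_tf(pk, qk, base=2, where=lambda x: not np.isnan(x))
print(h)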

If you enjoy using AutonFeat, please consider starring the repository ⭐️.