scipy.special.rel_entr
- scipy.special.rel_entr(x, y, out=None) = <ufunc 'rel_entr'>
- Elementwise function for computing relative entropy.

  \[\mathrm{rel\_entr}(x, y) =
    \begin{cases}
      x \log(x / y) & x > 0, y > 0 \\
      0 & x = 0, y \ge 0 \\
      \infty & \text{otherwise}
    \end{cases}\]

- Parameters:
- x, y : array_like
- Input arrays.
- out : ndarray, optional
- Optional output array for the function results.
 
- Returns:
- scalar or ndarray
- Relative entropy of the inputs.
 
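- A minimal sketch of the piecewise definition above, using small NumPy arrays; the printed values shown in the comment are approximate:

```python
import numpy as np
from scipy.special import rel_entr

x = np.array([1.0, 0.0, 0.5, -1.0])
y = np.array([0.5, 2.0, 0.0, 1.0])

# Case by case: x > 0, y > 0 gives x*log(x/y); x = 0, y >= 0 gives 0;
# anything else (y = 0 with x > 0, or a negative input) gives inf.
print(rel_entr(x, y))  # [0.69314718  0.  inf  inf]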
- See also
  - scipy.stats.entropy
- Notes
  - Added in version 0.15.0.
  - This function is jointly convex in x and y.
  - The origin of this function is in convex programming; see [1]. Given two discrete probability distributions \(p_1, \ldots, p_n\) and \(q_1, \ldots, q_n\), the definition of relative entropy in the context of information theory is \[\sum_{i = 1}^n \mathrm{rel\_entr}(p_i, q_i).\] To compute the latter quantity, use scipy.stats.entropy. See [2] for details.
  - rel_entr has experimental support for Python Array API Standard compatible backends in addition to NumPy. Please consider testing these features by setting the environment variable SCIPY_ARRAY_API=1 and providing CuPy, PyTorch, JAX, or Dask arrays as array arguments. The following combinations of backend and device (or other capability) are supported.

    | Library | CPU | GPU |
    |---------|-----|-----|
    | NumPy   | ✅  | n/a |
    | CuPy    | n/a | ✅  |
    | PyTorch | ✅  | ✅  |
    | JAX     | ✅  | ✅  |
    | Dask    | ✅  | n/a |

    See Support for the array API standard for more information.
- References
  - [1] Boyd, Stephen and Lieven Vandenberghe. Convex Optimization. Cambridge University Press, 2004. DOI: https://doi.org/10.1017/CBO9780511804441
  - [2] Kullback-Leibler divergence, https://en.wikipedia.org/wiki/Kullback%E2%80%93Leibler_divergence
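- Examples

  A short check of the identity from the Notes: summing rel_entr over two discrete probability distributions reproduces the Kullback-Leibler divergence computed by scipy.stats.entropy. The particular p and q values here are illustrative only.

```python
import numpy as np
from scipy.special import rel_entr
from scipy.stats import entropy

# Two discrete probability distributions (already normalized).
p = np.array([0.5, 0.3, 0.2])
q = np.array([0.4, 0.4, 0.2])

# KL divergence as the sum of elementwise rel_entr terms ...
kl_sum = np.sum(rel_entr(p, q))

# ... matches scipy.stats.entropy(p, q), which computes the same quantity.
print(np.isclose(kl_sum, entropy(p, q)))  # True (both are about 0.0253)
```

  For the experimental array API support described in the Notes, a sketch along the following lines should work, assuming PyTorch is installed; note that SCIPY_ARRAY_API=1 must be set before SciPy is first imported.

```python
import os
os.environ["SCIPY_ARRAY_API"] = "1"  # must be set before importing scipy

import torch
from scipy.special import rel_entr

xt = torch.tensor([1.0, 0.0, 0.5])
yt = torch.tensor([0.5, 2.0, 0.25])
print(rel_entr(xt, yt))  # a torch.Tensor, computed on the tensors' device
```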