Compute entropy
Description
Uses the formula -sum(pk * log(pk)) where pk are discrete probabilities.
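A minimal plain-R sketch of that formula (assuming normalize = TRUE rescales pk so it sums to 1, and using base 2 to match the first example below):

pk <- c(1, 2, 3)
pk <- pk / sum(pk)            # normalization step applied when normalize = TRUE
-sum(pk * log(pk, base = 2))  # 1.459148, the value shown in the first example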
Usage
<Expr>$entropy(base = exp(1), ..., normalize = TRUE)
Arguments
base
Numeric value used as base, defaults to exp(1).

…
These dots are for future extensions and must be empty.

normalize
Normalize pk if it doesn’t sum to 1.
Value
A polars expression
Examples
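A minimal sketch that should reproduce the output below, assuming the pl$DataFrame() and pl$col() entry points of the polars R package:

library(polars)

# Base-2 entropy of c(1, 2, 3); pk is normalized to sum to 1 by default
pl$DataFrame(a = c(1, 2, 3))$select(pl$col("a")$entropy(base = 2))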
#> shape: (1, 1)
#> ┌──────────┐
#> │ a │
#> │ --- │
#> │ f64 │
#> ╞══════════╡
#> │ 1.459148 │
#> └──────────┘
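With normalize = FALSE, the raw values are used as pk, which gives a negative result here because they do not sum to 1 (same assumptions as above):

pl$DataFrame(a = c(1, 2, 3))$select(
  pl$col("a")$entropy(base = 2, normalize = FALSE)
)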
#> shape: (1, 1)
#> ┌───────────┐
#> │ a │
#> │ --- │
#> │ f64 │
#> ╞═══════════╡
#> │ -6.754888 │
#> └───────────┘