# elwood_spatial.entropy
Information-theoretic metrics for analyzing the distribution of binned sensor readings across a network. All calculations use base-2 logarithms, so values are in bits.
### bin_probabilities(bin_indices)
Compute the probability distribution over bin indices.
| Parameter | Type | Description |
|---|---|---|
| bin_indices | Sequence[int] | Bin index per device |
Returns dict[int, float], mapping bin index to probability.
```python
probs = es.bin_probabilities([0, 0, 1, 2])
# => {0: 0.5, 1: 0.25, 2: 0.25}
```

### shannon_entropy(bin_indices)
Shannon entropy H = −∑ p · log2(p). Measures the disorder or spread of readings across bins.
| Parameter | Type | Description |
|---|---|---|
| bin_indices | Sequence[int] | Bin index per device |
Returns float. Entropy in bits, or 0.0 for empty input.
```python
H = es.shannon_entropy([0, 0, 0, 1])  # ~0.81 bits
```

### information_content(bin_index, bin_indices)
Self-information Ik = −log2(pk), where pk is the empirical probability of the target bin within bin_indices. Higher values indicate rarer bins.
| Parameter | Type | Description |
|---|---|---|
| bin_index | int | Target device's bin |
| bin_indices | Sequence[int] | All bin indices in the neighborhood |
Returns float. Value in bits.
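As a cross-check, the formula can be reproduced with the standard library alone. This is an illustrative sketch of the math, not the library's implementation; `self_information` is a hypothetical helper name.

```python
import math
from collections import Counter

def self_information(bin_index, bin_indices):
    """I_k = -log2(p_k), where p_k is the empirical probability of bin_index."""
    counts = Counter(bin_indices)
    p = counts[bin_index] / len(bin_indices)
    return -math.log2(p)

# Bin 2 appears once among four readings: p = 0.25, so I = 2 bits
print(self_information(2, [0, 0, 1, 2]))  # 2.0
```

Note that a bin index absent from `bin_indices` has probability zero, where self-information is undefined.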
### bin_deviation(bin_index, other_bin_indices)
Dk = |Bk − mean(Bj)|. Absolute distance of a device's bin from the neighborhood mean.
| Parameter | Type | Description |
|---|---|---|
| bin_index | int | Target device's bin |
| other_bin_indices | Sequence[int] | Neighbor bin indices (excluding target) |
Returns float. Distance in bin-index units.
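The definition above is a one-liner with the standard library; this sketch (with a hypothetical `bin_deviation` standing in for the library function) makes the arithmetic concrete:

```python
def bin_deviation(bin_index, other_bin_indices):
    """D_k = |B_k - mean(B_j)| over the neighbor bins."""
    neighborhood_mean = sum(other_bin_indices) / len(other_bin_indices)
    return abs(bin_index - neighborhood_mean)

# Neighbors sit in bins 0, 1, 2 (mean 1.0); the target is in bin 3
print(bin_deviation(3, [0, 1, 2]))  # 2.0
```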
### compute_network_metrics(bin_indices)
Compute per-device entropy, information content, and bin deviation for an entire network.
| Parameter | Type | Description |
|---|---|---|
| bin_indices | dict[str, int] | Mapping of device ID to bin index |
Returns dict[str, dict[str, float]], mapping each device ID to a dict with keys entropy, information, and bin_deviation.
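The combined result can be approximated by applying the three formulas above per device. This sketch assumes each device's entropy is that of the full network distribution and its deviation is measured against all other devices; the library may define the neighborhood differently, and `network_metrics` is a hypothetical stand-in.

```python
import math
from collections import Counter

def network_metrics(bin_indices):
    """bin_indices: dict of device ID -> bin index."""
    all_bins = list(bin_indices.values())
    counts = Counter(all_bins)
    n = len(all_bins)
    # Network-wide Shannon entropy over the empirical bin distribution
    H = -sum((c / n) * math.log2(c / n) for c in counts.values())
    out = {}
    for device, b in bin_indices.items():
        others = [v for d, v in bin_indices.items() if d != device]
        out[device] = {
            "entropy": H,
            "information": -math.log2(counts[b] / n),
            "bin_deviation": abs(b - sum(others) / len(others)),
        }
    return out

metrics = network_metrics({"s1": 0, "s2": 0, "s3": 2})
```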
### entropy_at_agreement(n_devices, agreement_fraction=0.75, n_bins=12)
Theoretical entropy when a fraction of devices agree on one bin and the rest are spread evenly.
| Parameter | Type | Default | Description |
|---|---|---|---|
| n_devices | int | — | Total device count |
| agreement_fraction | float | 0.75 | Fraction of devices that agree |
| n_bins | int | 12 | Total number of bins |
Returns float. Entropy in bits.
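One plausible construction, assuming the agreeing devices occupy a single bin and the remainder is spread as evenly as possible over the other n_bins − 1 bins. This is an illustrative sketch, not necessarily the library's exact formula:

```python
import math

def entropy_at_agreement(n_devices, agreement_fraction=0.75, n_bins=12):
    agree = round(n_devices * agreement_fraction)
    rest = n_devices - agree
    # Spread the disagreeing devices as evenly as possible over the other bins
    spread_bins = min(rest, n_bins - 1)
    counts = [agree] if spread_bins == 0 else [agree] + [
        rest // spread_bins + (1 if i < rest % spread_bins else 0)
        for i in range(spread_bins)
    ]
    n = n_devices
    return -sum((c / n) * math.log2(c / n) for c in counts if c)

print(round(entropy_at_agreement(5), 3))  # 0.722 (4 of 5 devices agree)
```

Under this construction, 5 devices at the default 0.75 agreement reproduce the ~0.722-bit entropy of the worked example below, and full agreement yields 0 bits.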
## Worked Example
A network of 5 sensors where one is anomalous:
```python
import elwood_spatial as es

bins = es.BinSpec.from_tuples([(0, 50), (51, 100), (101, 150), (151, 200)])
values = {"s1": 45, "s2": 48, "s3": 120, "s4": 42, "s5": 50}
indices = {k: bins.bin_index(v) for k, v in values.items()}
# => {"s1": 0, "s2": 0, "s3": 2, "s4": 0, "s5": 0}

# Network entropy: low because 4 of 5 devices agree
all_idx = list(indices.values())
print(es.shannon_entropy(all_idx))  # ~0.722 bits

# Probability distribution
print(es.bin_probabilities(all_idx))  # {0: 0.8, 2: 0.2}

# Information content: s3 (rare bin) vs s1 (common bin)
print(es.information_content(2, all_idx))  # 2.322 bits (rare)
print(es.information_content(0, all_idx))  # 0.322 bits (common)

# Bin deviation for s3
others = [v for k, v in indices.items() if k != "s3"]
print(es.bin_deviation(2, others))  # 2.0, far from the neighborhood mean

# All metrics at once
metrics = es.compute_network_metrics(indices)
for sid, m in metrics.items():
    print(f"{sid}: H={m['entropy']:.3f} I={m['information']:.3f} D={m['bin_deviation']:.3f}")
```