Shannon entropy H(X), conditional entropy H(X|Y), and mutual information I(X;Y): core definitions and theorems.
graph TD
D1["Def: Shannon entropy\nH(X)=−Σ p log p"]
D2["Def: Conditional H\nH(X|Y)=E_Y H(X|Y=y)"]
D3["Def: Mutual info\nI(X;Y)=H(X)−H(X|Y)"]
D4["Def: Joint entropy\nH(X,Y)=−Σ p(x,y) log p(x,y)"]
T1["Thm: H(X)≥0\nequality iff deterministic"]
T2["Thm: Chain rule\nH(X,Y)=H(X)+H(Y|X)"]
T3["Thm: I(X;Y)≥0\nequality iff X,Y independent"]
T4["Thm: Data processing\nX→Y→Z ⇒ I(X;Z)≤I(X;Y)"]
T5["Thm: Fano\nH(X|Y)≤H(P_e)+P_e log(|X|−1)"]
D1 --> D2
D2 --> D3
D3 --> D4
D1 --> T1
T2 --> T3
T3 --> T4
classDef definition fill:#b197fc,color:#fff
classDef theorem fill:#51cf66,color:#fff
class D1,D2,D3,D4 definition
class T1,T2,T3,T4,T5 theorem
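The definitions (D1–D4) and theorems T2–T3 above can be checked numerically on a small joint distribution. A minimal sketch in Python; the joint distribution p below is an illustrative assumption, not part of the source:

```python
import math

# Illustrative joint distribution p(x, y) over X ∈ {0,1}, Y ∈ {0,1}
p = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

def H(dist):
    """Shannon entropy H = −Σ q log2 q, with the convention 0·log 0 = 0."""
    return -sum(q * math.log2(q) for q in dist.values() if q > 0)

# Marginals p(x) and p(y)
px = {x: sum(q for (a, _), q in p.items() if a == x) for x in (0, 1)}
py = {y: sum(q for (_, b), q in p.items() if b == y) for y in (0, 1)}

H_XY = H(p)                  # joint entropy H(X,Y)
H_X, H_Y = H(px), H(py)
H_X_given_Y = H_XY - H_Y     # chain rule (T2): H(X|Y) = H(X,Y) − H(Y)
I_XY = H_X - H_X_given_Y     # mutual information (D3): I(X;Y) = H(X) − H(X|Y)

print(f"H(X)={H_X:.4f}  H(X|Y)={H_X_given_Y:.4f}  I(X;Y)={I_XY:.4f}")
assert I_XY >= 0             # T3: I(X;Y) ≥ 0
```

Conditioning on Y lowers the entropy of X here (H(X|Y) < H(X)), which is exactly the nonnegativity of I(X;Y) in T3.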
Process Statistics
- Nodes: 9
- Edges: 6
Frontier: math.IT
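Fano's inequality (T5) can likewise be checked numerically. A sketch under the same kind of assumed joint distribution, with the estimate of X taken as the MAP guess from Y; for binary X the P_e·log(|X|−1) term vanishes:

```python
import math

# Illustrative joint distribution p(x, y) over X ∈ {0,1}, Y ∈ {0,1}
p = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

def h2(q):
    """Binary entropy H(q) in bits."""
    return 0.0 if q in (0.0, 1.0) else -q * math.log2(q) - (1 - q) * math.log2(1 - q)

def x_hat(y):
    """MAP estimator: given y, guess the x maximizing p(x, y)."""
    return max((0, 1), key=lambda x: p[(x, y)])

# Error probability P_e = P(x_hat(Y) ≠ X)
P_e = sum(q for (x, y), q in p.items() if x_hat(y) != x)

# H(X|Y) via the chain rule H(X|Y) = H(X,Y) − H(Y)
py = {y: sum(q for (_, b), q in p.items() if b == y) for y in (0, 1)}
ent = lambda d: -sum(q * math.log2(q) for q in d.values() if q > 0)
H_X_given_Y = ent(p) - ent(py)

# Fano: H(X|Y) ≤ H(P_e) + P_e·log2(|X|−1); log2(1) = 0 for binary X
bound = h2(P_e) + P_e * math.log2(2 - 1)
print(f"H(X|Y)={H_X_given_Y:.4f} ≤ {bound:.4f}")
assert H_X_given_Y <= bound + 1e-12
```

For this symmetric distribution the bound holds with equality, since each conditional p(x|y) is a (0.8, 0.2) split and the MAP rule errs exactly with probability 0.2.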