Joint entropy formula for simple mutual information
Formulation 0
Let $X : \Omega \to \mathcal{X}$ and $Y : \Omega \to \mathcal{Y}$ each be a D5723: Simple random variable.
Let $a \in (0, \infty) \setminus \{ 1 \}$ be a D5407: Positive real number.
Then \begin{equation} I_a(X ; Y) = H_a(X) + H_a(Y) - H_a(X, Y) \end{equation}
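As a quick sanity check (a hypothetical instance, reading $H_a$ as entropy taken with logarithm base $a$, consistent with the constraint $a \in (0, \infty) \setminus \{ 1 \}$): if $X$ and $Y$ are independent fair coin flips and $a = 2$, then $H_2(X) = H_2(Y) = 1$ and $H_2(X, Y) = 2$, whence \begin{equation} I_2(X ; Y) = 1 + 1 - 2 = 0, \end{equation} as expected for independent random variables.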
Proofs
Proof 0
Let $X : \Omega \to \mathcal{X}$ and $Y : \Omega \to \mathcal{Y}$ each be a D5723: Simple random variable.
Let $a \in (0, \infty) \setminus \{ 1 \}$ be a D5407: Positive real number.
Using results
(i) R4843: Conditional entropy formula for simple mutual information
(ii) R4837: Chain rule for simple entropy in the binary case

we have \begin{equation} I_a(X ; Y) = H_a(X) - H_a(X \mid Y) = H_a(X) - \left[ H_a(X, Y) - H_a(Y) \right] = H_a(X) + H_a(Y) - H_a(X, Y) \end{equation} where the first equality is result (i) and the second follows from the chain rule (ii) in the rearranged form $H_a(X \mid Y) = H_a(X, Y) - H_a(Y)$. $\square$
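A minimal numeric sketch of the identity (assuming $H_a$ denotes entropy with logarithm base $a$ and $I_a$ the corresponding mutual information; the joint pmf and all names below are illustrative, not part of the source):

```python
import numpy as np

def entropy(p, a):
    """Entropy of pmf p with logarithm base a (convention: 0 * log 0 = 0)."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)) / np.log(a))

# Illustrative joint pmf of (X, Y); rows index x, columns index y.
pxy = np.array([[0.10, 0.25, 0.15],
                [0.20, 0.05, 0.25]])
a = 3.0                                      # any base a > 0 with a != 1

px, py = pxy.sum(axis=1), pxy.sum(axis=0)    # marginals of X and Y

# Mutual information from its defining sum:
#   I_a(X;Y) = sum_{x,y} p(x,y) log_a( p(x,y) / (p(x) p(y)) )
mask = pxy > 0
I = float(np.sum(pxy[mask] * np.log(pxy[mask] / np.outer(px, py)[mask])) / np.log(a))

# Right-hand side of the identity: H_a(X) + H_a(Y) - H_a(X, Y)
rhs = entropy(px, a) + entropy(py, a) - entropy(pxy, a)

assert np.isclose(I, rhs), (I, rhs)
print(f"I_a(X;Y) = {I:.6f} = H_a(X) + H_a(Y) - H_a(X,Y) = {rhs:.6f}")
```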