Definition of Conditional entropy
1. Noun. (information theory) The portion of a random variable's own Shannon entropy that remains once the value of another, given random variable is known. ¹
¹ Source: wiktionary.com
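The definition above can be made concrete with a small numerical sketch (a hypothetical example, not from the source): using the identity H(X|Y) = H(X,Y) − H(Y), the conditional entropy is computed from a toy joint distribution over two binary variables.

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a collection of probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical joint distribution p(x, y): two independent, uniform binary variables.
joint = {
    ("x0", "y0"): 0.25,
    ("x0", "y1"): 0.25,
    ("x1", "y0"): 0.25,
    ("x1", "y1"): 0.25,
}

# Marginal distribution of Y, obtained by summing over x.
p_y = {}
for (x, y), p in joint.items():
    p_y[y] = p_y.get(y, 0.0) + p

h_xy = entropy(joint.values())  # joint entropy H(X, Y) = 2 bits
h_y = entropy(p_y.values())     # marginal entropy H(Y) = 1 bit
h_x_given_y = h_xy - h_y        # conditional entropy H(X | Y) = 1 bit

print(h_x_given_y)
```

Because X and Y are independent here, knowing Y removes none of X's uncertainty, so H(X|Y) equals H(X) = 1 bit; if Y fully determined X, H(X|Y) would instead be 0.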