Definition of Joint entropy
1. Noun. (information theory) The Shannon entropy of a "script" whose "characters" are elements of the Cartesian product of the sets of characters of the component scripts. ¹
¹ Source: wiktionary.com
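Concretely, the definition above corresponds to the standard formula H(X, Y) = −Σ p(x, y) log₂ p(x, y), taken over all pairs (x, y) in the Cartesian product of the two alphabets. A minimal Python sketch of that computation (the function name `joint_entropy` is our own, not from the source):

```python
import math
from collections import Counter

def joint_entropy(pairs):
    """Shannon entropy (in bits) of an observed sequence of symbol
    pairs, i.e. the entropy of the empirical joint distribution over
    the Cartesian product of the two component alphabets."""
    counts = Counter(pairs)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Two fair, independent binary sources: four equally likely pairs,
# so the joint entropy is log2(4) = 2 bits.
pairs = [(x, y) for x in "01" for y in "01"]
print(joint_entropy(pairs))  # 2.0
```

When the two components are independent, the joint entropy is the sum of the individual entropies; otherwise it is strictly smaller than that sum.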