bn:03803694n
Noun Concept
Categories: Entropy and information
joint entropy
Definitions
In information theory, joint entropy is a measure of the uncertainty associated with a set of variables. (Wikipedia)
A measure of information in probability and information theory. (Wikidata)
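For concreteness, a minimal formal statement of the standard definition for two discrete random variables X and Y with joint distribution P(x, y) (this formula is standard in information theory, not taken from this entry):

H(X, Y) = -\sum_{x \in \mathcal{X}} \sum_{y \in \mathcal{Y}} P(x, y) \log_2 P(x, y)

with the convention that 0 \log_2 0 = 0; for a larger set of variables the sum runs over their full joint distribution. For example, two independent fair coin flips have joint entropy H(X, Y) = -4 \cdot (1/4) \log_2 (1/4) = 2 bits.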