bn:26718689n
Noun Concept
Categories: Thermodynamics, F-divergences, Entropy and information, Information geometry
Kullback–Leibler divergence · KL-divergence · Kullback-Leibler divergence · Discrimination information · Kullback–Leibler distance
In mathematical statistics, the Kullback–Leibler divergence is a measure of how one probability distribution differs from a second, reference probability distribution. (Wikipedia)
Definitions
In mathematical statistics, the Kullback–Leibler divergence, denoted $D_{\text{KL}}(P \parallel Q)$, is a type of statistical distance: a measure of how one probability distribution P differs from a second, reference probability distribution Q. A simple interpretation of the KL divergence of P from Q is the expected excess surprise from using Q as a model when the actual distribution is P. While it is a measure of how different two distributions are, and in some sense is thus a "distance", it is not actually a metric, which is the most familiar and formal type of distance: it is not symmetric and does not satisfy the triangle inequality. (Wikipedia)
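The entry alludes to the definition but does not display the formula; for discrete distributions P and Q on the same sample space $\mathcal{X}$, the standard statement is:

$$
D_{\text{KL}}(P \parallel Q) = \sum_{x \in \mathcal{X}} P(x) \log \frac{P(x)}{Q(x)}
$$

The expectation is taken under P, which is the sense in which the divergence measures the excess surprise incurred by modelling P-distributed data with Q; it is non-negative and equals zero exactly when P = Q.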
A measure of information in probability theory. (Wikipedia Disambiguation)
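As a concrete illustration of the formula above and of the asymmetry that keeps the divergence from being a metric, here is a minimal sketch in Python; the function name kl_divergence and the two example distributions are illustrative choices, not part of the entry.

```python
import math

def kl_divergence(p, q):
    """D_KL(P || Q) for two discrete distributions given as
    sequences of probabilities over the same outcomes."""
    total = 0.0
    for pi, qi in zip(p, q):
        if pi == 0:
            continue          # by convention, 0 * log(0/q) contributes 0
        if qi == 0:
            return math.inf   # P puts mass where Q puts none
        total += pi * math.log(pi / qi)
    return total

# Hypothetical example: the divergence is not symmetric.
p = [0.9, 0.1]
q = [0.5, 0.5]
print(kl_divergence(p, q))  # ~0.368 nats
print(kl_divergence(q, p))  # ~0.511 nats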