Basic Mathematics You Should Master


1. Statistical distance

In statistics, probability theory, and information theory, a statistical distance quantifies the distance between two statistical objects, which can be two random variables, two probability distributions or samples, or the distance between an individual sample point and a population or a wider sample of points.

2. Pinsker's inequality

In information theory, Pinsker's inequality, named after its inventor Mark Semenovich Pinsker, is an inequality that bounds the total variation distance (or statistical distance) in terms of the Kullback–Leibler divergence. The inequality is tight up to constant factors.[1]

3. Total variation distance of probability measures

4. σ-algebra

5. The definition of TV:
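For reference, the total variation distance and Pinsker's inequality can be stated as follows. These are the standard textbook formulations (using the convention that TV is a supremum over measurable events, and KL divergence is measured in nats), not formulas taken from the original post:

```latex
% Total variation distance between probability measures P and Q
% on a measurable space (\Omega, \mathcal{F}):
\delta(P, Q) = \sup_{A \in \mathcal{F}} \lvert P(A) - Q(A) \rvert

% Pinsker's inequality bounds it by the Kullback--Leibler divergence:
\delta(P, Q) \le \sqrt{\tfrac{1}{2}\, D_{\mathrm{KL}}(P \,\Vert\, Q)}
```

For discrete distributions this supremum equals half the L1 distance, \( \delta(P,Q) = \tfrac{1}{2} \sum_i \lvert p_i - q_i \rvert \).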
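A minimal numerical sketch of these quantities, assuming finite discrete distributions given as probability vectors, the convention TV = ½ Σ|pᵢ − qᵢ|, and KL divergence in nats (the distributions `p` and `q` below are illustrative examples, not data from the post):

```python
import math

def tv_distance(p, q):
    """Total variation distance between two finite discrete distributions."""
    return 0.5 * sum(abs(pi - qi) for pi, qi in zip(p, q))

def kl_divergence(p, q):
    """Kullback-Leibler divergence D_KL(P || Q) in nats.

    Assumes q[i] > 0 wherever p[i] > 0 (absolute continuity),
    and uses the convention 0 * log(0/q) = 0.
    """
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Two example distributions over three outcomes.
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]

tv = tv_distance(p, q)
kl = kl_divergence(p, q)

# Pinsker's inequality: TV(P, Q) <= sqrt(D_KL(P || Q) / 2)
assert tv <= math.sqrt(kl / 2)
```

Note that KL divergence is asymmetric while TV is a metric, so Pinsker's inequality can be applied with either `kl_divergence(p, q)` or `kl_divergence(q, p)` on the right-hand side.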

2017-08-17  21:22:40 
