| Title: |
Common information, noise stability, and their extensions |
| Document type: |
electronic document
| Authors: |
Lei Yu, Author; Vincent Y. F. Tan, Author
| Publisher: |
Boston: Now Publishers
| Publication year: |
2022 |
| Extent: |
1 PDF file
| Physical details: |
ill.
| ISBN/ISSN/EAN: |
978-1-63828-015-6 |
| General note: |
Access mode: full-text access via:
- authentication after registering on the EBSCOhost platform
or
- the School's IP address
Bibliography: p. 265-283 |
| Keywords: |
Information theory
Communication complexity--computer science |
| Decimal classification: |
519.72 Information theory: mathematical aspects. |
| Abstract: |
Common information measures the amount of matching variables in two or more information sources. It is ubiquitous in information theory and related areas such as theoretical computer science and discrete probability. However, because there are multiple notions of common information, a unified understanding of the deep interconnections between them is lacking. In this monograph the authors fill this gap by leveraging a small set of mathematical techniques that are applicable across seemingly disparate problems. The reader is introduced in Part I to the operational tasks and properties associated with the two main measures of common information, namely Wyner's and Gács–Körner–Witsenhausen's (GKW). In the subsequent two Parts, the authors take a deeper look at each of these. In Part II they discuss extensions to Wyner's common information from the perspective of distributed source simulation, including the Rényi common information. In Part III, GKW common information comes under the spotlight. Having laid the groundwork, the authors seamlessly transition to discussing their connections to various conjectures in information theory and discrete probability. This monograph provides students and researchers in Information Theory with a comprehensive resource for understanding common information and points the way forward to creating a unified set of techniques applicable over a wide range of problems. |
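For context, the two measures named in the abstract have standard definitions in the information-theory literature; the sketch below uses the usual notation (it paraphrases those standard definitions and is not quoted from the monograph), with X and Y jointly distributed random variables:
% Wyner's common information: the smallest rate of an auxiliary variable W
% that renders X and Y conditionally independent (X -> W -> Y Markov chain).
\[
  C_{\mathrm{W}}(X;Y) \;=\; \min_{P_{W \mid XY} \,:\, X \to W \to Y} I(X,Y;W)
\]
% Gács–Körner–Witsenhausen (GKW) common information: the largest entropy of
% a common part, i.e. a variable computable separately from X and from Y.
\[
  C_{\mathrm{GKW}}(X;Y) \;=\; \max_{f,\,g \,:\, f(X)=g(Y)\ \mathrm{a.s.}} H\bigl(f(X)\bigr)
\]
% In general C_GKW(X;Y) <= I(X;Y) <= C_W(X;Y), with equality throughout
% only in special cases.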
| Contents: |
Summary:
I. Classic common information quantities.
2. Wyner’s common information.
3. Gács–Körner–Witsenhausen’s common information.
II. Extensions of Wyner’s common information.
4. Rényi and total variation common information.
5. Exact common information.
...
III. Extensions of Gács–Körner–Witsenhausen’s common information.
8. Non-interactive correlation distillation.
9. q-Stability. |
| Online: |
https://research.ebsco.com/linkprocessor/plink?id=c11cc8bd-4f0a-356b-abc9-0d462c [...] |