Information Bottleneck
Theory and Applications in Deep Learning

Download URL(s)
https://mdpi.com/books/pdfview/book/3864
Contributor(s)
Geiger, Bernhard (editor)
Kubin, Gernot (editor)
Language
English
Abstract
The celebrated information bottleneck (IB) principle of Tishby et al. has recently enjoyed renewed attention due to its application in the area of deep learning. This collection investigates the IB principle in this new context. The individual chapters in this collection:
• provide novel insights into the functional properties of the IB;
• discuss the IB principle (and its derivatives) as an objective for training multi-layer machine learning structures such as neural networks and decision trees; and
• offer a new perspective on neural network learning through the lens of the IB framework.
Our collection thus contributes to a better understanding of the IB principle specifically for deep learning and, more generally, of information-theoretic cost functions in machine learning. This paves the way toward explainable artificial intelligence.
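For reference, the IB principle discussed in the abstract is commonly stated as the following trade-off; this is a minimal sketch of the standard formulation of Tishby et al., and the symbols T (the learned representation) and beta (the trade-off parameter) are notational choices of this sketch rather than part of the publisher's record:

% Information bottleneck Lagrangian: choose a stochastic encoder p(t|x)
% that compresses the input X (small I(X;T)) while preserving information
% about the target Y (large I(T;Y)); \beta > 0 sets the trade-off.
\min_{p(t \mid x)} \; I(X;T) \;-\; \beta \, I(T;Y)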
Keywords
information theory; variational inference; machine learning; learnability; information bottleneck; representation learning; conspicuous subset; stochastic neural networks; mutual information; neural networks; information; bottleneck; compression; classification; optimization; classifier; decision tree; ensemble; deep neural networks; regularization methods; information bottleneck principle; deep networks; semi-supervised classification; latent space representation; hand crafted priors; learnable priors; regularization; deep learning
Webshop link
https://mdpi.com/books/pdfview ...
ISBN
9783036508023, 9783036508030
Publisher website
www.mdpi.com/books
Publication date and place
Basel, Switzerland, 2021
Classification
Information technology industries

