Information-Theoretic Methods in Deep Learning
Theory and Applications

Download Url(s)
https://mdpi.com/books/pdfview/book/10425
Contributor(s)
Yang, Shuangming (editor)
Yu, Shujian (editor)
Sánchez Giraldo, Luis Gonzalo (editor)
Chen, Badong (editor)
Language
English
Abstract
The rapid development of deep learning has led to groundbreaking advancements across various fields, from computer vision to natural language processing and beyond. Information theory, as a mathematical foundation for understanding data representation, learning, and communication, has emerged as a powerful tool for advancing deep learning methods.
This Special Issue, "Information-Theoretic Methods in Deep Learning: Theory and Applications", presents cutting-edge research that bridges the gap between information theory and deep learning. It covers theoretical developments, innovative methodologies, and practical applications, offering new insights into the optimization, generalization, and interpretability of deep learning models. The collection includes contributions on:
- Theoretical frameworks combining information theory with deep learning architectures;
- Entropy-based and information bottleneck methods for model compression and generalization;
- Mutual information estimation for feature selection and representation learning;
- Applications of information-theoretic principles in natural language processing, computer vision, and neural network optimization.
Keywords
sentiment analysis; deep neural networks; convolutional neural network; ResNet; Res2Net; multiple time series; forecasting method; information bottleneck; entropy; KL-divergence; mutual information; deep models; RNN; U-Net; partial convolutions; information theory; deep learning; information plane; kernel methods; deep neural network; generalization ability; regularization method; continual learning; quadruped robot locomotion; reinforcement learning; plasticity; active learning; universal prediction; deep active learning; individual sequences; normalized maximum likelihood; out-of-distribution; few-shot learning; meta learning; graph semi-supervision; label propagation; Gaussian kernel function; D-S evidence theory; self-supervised learning; representation learning; generative adversarial networks; parameterized loss functions; f-divergence; Jensen-f-divergence; time series modelling; forecasting; nonlinear modelling; denoising; anomaly detection; degradation modelling; machine learning
ISBN
9783725829828, 9783725829811
Publisher website
www.mdpi.com/books
Publication date and place
Basel, 2025
Classification
Oncology
Mathematics and Science

