
On the equivalence of maximizing entropy and mutual information

This study is conducted in the context of unsupervised training of two-layer neural networks, using concepts from information theory to perform the training. The two criteria addressed are: i) maximizing the entropy of the outputs (MaxEnt) and ii) maximizing the mutual information between the inputs and outputs (MaxMI). The research question pursued is: "are these two approaches equivalent?" Based on the existing literature, it is possible to conclude that the two approaches are theoretically equivalent provided the system is noiseless: since I(X;Y) = H(Y) − H(Y|X) and, for a noiseless (deterministic) system, H(Y|X) does not depend on the network parameters, maximizing I(X;Y) reduces to maximizing H(Y).
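To make the equivalence concrete, the short Python sketch below (not part of the report; the single-layer sigmoid network, learning rate, and toy Laplacian data are illustrative assumptions) implements the Bell-Sejnowski infomax update, which is obtained by maximizing H(Y) for a noiseless sigmoid mapping and therefore also maximizes I(X;Y) in that setting.

import numpy as np

# Minimal sketch: Bell-Sejnowski infomax rule for a square, noiseless
# network y = sigmoid(W x). Because the mapping is deterministic, H(Y|X)
# does not depend on W, so maximizing I(X;Y) and maximizing H(Y) (MaxMI
# and MaxEnt) prescribe the same weight updates.

rng = np.random.default_rng(0)

def sigmoid(u):
    return 1.0 / (1.0 + np.exp(-u))

def infomax_update(W, x, lr=0.01):
    """One MaxEnt (= MaxMI in the noiseless case) gradient-ascent step on H(Y).

    For y = sigmoid(W x), H(Y) = H(X) + E[log |det dy/dx|]; its gradient
    with respect to W gives the rule  dW ∝ (W^T)^{-1} + (1 - 2y) x^T.
    """
    y = sigmoid(W @ x)
    dW = np.linalg.inv(W.T) + np.outer(1.0 - 2.0 * y, x)
    return W + lr * dW

# Toy run on a 2-D source mixed by a fixed matrix A (hypothetical data).
A = np.array([[1.0, 0.6], [0.4, 1.0]])
W = np.eye(2)
for _ in range(5000):
    s = rng.laplace(size=2)   # sparse source samples
    x = A @ s                 # observed (noiseless) mixture
    W = infomax_update(W, x)

# If the MaxEnt/MaxMI criterion works, W A approaches a scaled permutation.
print(W @ A)

In this sketch the same update serves both criteria, which is the point of the equivalence claim: no separate MaxMI objective needs to be optimized once the system is assumed noiseless.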


