Properties of the inferred nonlinearity for neural networks of increasing size.

2017-09-19T17:24:52Z (GMT) by Jan Humplik, Gašper Tkačik
<p><b>A)</b> Comparison between the inferred nonlinearity, over the range of energies observed in the dataset, and the log of the density of states at the same energies, showing an increasingly close match between the two quantities as the population size, <i>N</i>, grows. Both axes are normalized by the population size so that all curves share a similar scale. The nonlinearity can be shifted by an arbitrary constant without changing the model; to remove this redundancy, we set <i>V</i>(0) = 0 for all nonlinearities. <b>B)</b> Population-size dependence of the average squared distance between the density of states and the inferred nonlinearity. Because the nonlinearity is only defined up to an arbitrary constant, we chose the offset that minimizes the average squared distance. Error bars (1 SD) denote variation over different subnetworks. <b>C)</b> Inferred nonlinearities map to latent variables whose probability distributions can be computed and plotted for one sequence of subnetworks of increasing size (colors). As the network size increases, so does the dynamic range of the latent-variable distribution, as quantified by the entropy of the distributions (inset).</p>
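The offset described in panel B has a closed form: for a vertical shift <i>c</i>, the mean squared distance between <i>V</i>(<i>E</i>) + <i>c</i> and log <i>g</i>(<i>E</i>) is minimized when <i>c</i> equals the mean of their difference. A minimal sketch of this alignment step (the arrays are synthetic illustrations, not the inferred nonlinearities from the paper):

```python
import numpy as np

def optimal_offset(V, log_g):
    """Vertical shift c minimizing mean((V + c - log_g)**2).

    Setting the derivative with respect to c to zero gives
    c = mean(log_g - V).
    """
    return np.mean(log_g - V)

def mean_sq_distance(V, log_g):
    """Average squared distance after applying the optimal offset (panel B)."""
    c = optimal_offset(V, log_g)
    return np.mean((V + c - log_g) ** 2)

# Illustrative synthetic curves: a nonlinearity that differs from the
# log density of states by (approximately) a constant shift.
E = np.linspace(0.0, 1.0, 50)
log_g = -E**2                    # stand-in for log density of states
V = -E**2 + 3.0 + 0.05 * E       # stand-in nonlinearity, shifted by ~3
print(mean_sq_distance(V, log_g))
```

Because the shift is removed before comparing, the residual distance reflects only the shape mismatch between the two curves, which is the quantity plotted against <i>N</i> in panel B.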