
Hao-Yang Yen



Renormalization Group in Machine Learning

  • Abstract
  • In the theory of phase transitions, the correlation length diverges at the critical point, which gives rise to many characteristic physical phenomena. These phenomena can be analyzed with mean-field theory and renormalization group theory. Similar behavior, however, appears not only at phase transitions but in many natural complex systems, and such systems are often modeled statistically. A neural network is an algorithm that decides which category a data point belongs to, a process analogous to using the renormalization group to determine which phase a system belongs to. Therefore, although a precise mathematical correspondence has not yet been established, we believe that a neural network can be viewed as a realization of the renormalization group.
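The analogy above can be made concrete with a block-spin transformation, the textbook real-space renormalization step: spins on a lattice are grouped into blocks and each block is replaced by a single coarse spin, much like a pooling layer shrinks a feature map. The sketch below is an illustrative Python/NumPy version (the experiments on this page were done in R); the majority rule and the tie-breaking convention are my own assumptions for the example.

```python
import numpy as np

def block_spin(spins, b=2):
    """Majority-rule block-spin transformation.

    Groups an n-by-n lattice of +/-1 spins into b-by-b blocks and
    replaces each block with the sign of its sum (ties go to +1).
    This coarse-graining step plays the role of a pooling layer.
    """
    n = spins.shape[0]
    assert n % b == 0, "lattice size must be divisible by the block size"
    # reshape so axes 1 and 3 run over the interior of each block
    blocks = spins.reshape(n // b, b, n // b, b)
    sums = blocks.sum(axis=(1, 3))
    return np.where(sums >= 0, 1, -1)

rng = np.random.default_rng(0)
lattice = rng.choice([-1, 1], size=(8, 8))
coarse = block_spin(lattice)
print(coarse.shape)  # (4, 4): one renormalization step halves each side
```

Iterating `block_spin` corresponds to flowing to larger length scales, just as stacking pooling layers aggregates features over larger receptive fields.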

  • Neural Network
  • Renormalization Group
  • I believe that pruning a neural network can be given a physical interpretation through the renormalization group, and I use numerical experiments to support this claim. I build the neural network model in the R language, since it makes it convenient to inspect the statistical data and compute the relevant statistical quantities.
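One simple form of pruning that fits the RG picture is magnitude pruning: small weights are set to zero, just as couplings that are irrelevant at larger scales are discarded along an RG flow. The sketch below is a minimal Python/NumPy illustration of this idea, not the author's R experiments; the function name and the fraction kept are assumptions for the example.

```python
import numpy as np

def prune_by_magnitude(W, keep_frac=0.5):
    """Zero out the smallest-magnitude weights, keeping keep_frac of them.

    RG analogy: weights with small |w| are treated as irrelevant
    couplings and dropped; the surviving weights define the
    coarse-grained ("renormalized") network.
    """
    flat = np.abs(W).ravel()
    k = max(1, int(round(keep_frac * flat.size)))
    threshold = np.sort(flat)[-k]          # k-th largest magnitude
    mask = np.abs(W) >= threshold          # True where a weight survives
    return W * mask, mask

rng = np.random.default_rng(1)
W = rng.normal(size=(4, 4))
Wp, mask = prune_by_magnitude(W, keep_frac=0.25)
print(int(mask.sum()))  # number of surviving weights
```

In an actual experiment one would prune a trained network layer by layer and re-evaluate its accuracy, tracking which weights remain relevant as the "scale" (the kept fraction) changes.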

  • Further Ideas
  • If the neural network is indeed a kind of renormalization group and a precise correspondence between them can be established, then we can connect statistical mechanics with other research fields.
