The Emergence of Knowledge and the Benefits of Quorum

Date

2022-07-29

Abstract

We propose a model of artificial intelligence (AI) that can, in principle, reproduce an arbitrary distribution of data. In this model, every distinct correlation (or pattern) in the data corresponds to a bound state of a set of auxiliary variables. A useful analogy can be made with theories of non-adiabatic transfer of a charged particle in a condensed medium: distinct patterns correspond to distinct positions of the particle, whose on-site energies are distributed, while the auxiliary variables can be thought of as a polarizable medium. The number of bound states scales exponentially with the system's size. We thus connect the formation of knowledge with the emergence of a complex free energy landscape, in which distinct patterns/memories correspond to distinct free energy minima. Conversely, the Marcus-inverted regime for the environmental degrees of freedom causes such free energy minima to merge, resulting in loss of data. We show that there is, in principle, a perfect underlying energy function whose Boltzmann distribution can reproduce an arbitrary distribution of data. In practice, one must use a large number of such energy functions of comparable performance. We therefore argue that faithful reproduction of data generally requires sampling a quorum of free energy minima, which in turn requires that the machine operate at finite temperature. Motion along the landscape requires activation; still, the activation barriers are substantially lowered when the auxiliary degrees of freedom are used explicitly. The present methodology contrasts with traditional machine-learning platforms, which effectively operate at vanishing temperature and will generally become kinetically trapped in undesirable configurations. Finally, we propose ways to use simplified machine architectures to mitigate the computational challenges made apparent by the present approach.
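
As an illustration of the finite-temperature sampling argument in the abstract, below is a minimal, hypothetical sketch (not the paper's actual model or code): a toy energy function couples visible data bits to auxiliary variables, and single-flip Metropolis moves sample the joint configuration at temperature T. All names and numbers here (W, n_v, n_h, the temperatures) are assumptions chosen only to show that a finite-temperature sampler visits many distinct configurations, whereas a near-zero-temperature one stays kinetically trapped.

```python
# Hypothetical illustration only (not the paper's model): a toy energy function
# over visible bits v and auxiliary bits h, sampled with Metropolis moves at
# temperature T. At T > 0 the chain hops between several minima (a "quorum");
# as T -> 0 it remains trapped near whichever minimum it reaches first.
import numpy as np

rng = np.random.default_rng(0)

n_v, n_h = 6, 4                                 # visible and auxiliary (hidden) units, assumed sizes
W = rng.normal(scale=1.0, size=(n_v, n_h))      # arbitrary couplings, assumed for illustration

def energy(v, h):
    """Toy coupling energy between visible and auxiliary variables."""
    return -v @ W @ h

def metropolis(T, n_steps=20000):
    """Single-flip Metropolis sampling of (v, h) at temperature T."""
    v = rng.choice([-1, 1], size=n_v)
    h = rng.choice([-1, 1], size=n_h)
    visited = set()
    for _ in range(n_steps):
        # propose flipping one randomly chosen unit (visible or auxiliary)
        i = rng.integers(n_v + n_h)
        v_new, h_new = v.copy(), h.copy()
        if i < n_v:
            v_new[i] *= -1
        else:
            h_new[i - n_v] *= -1
        dE = energy(v_new, h_new) - energy(v, h)
        # accept downhill moves always; uphill moves with Boltzmann probability
        if dE <= 0 or (T > 0 and rng.random() < np.exp(-dE / T)):
            v, h = v_new, h_new
        visited.add(tuple(v))                   # record visible configurations reached
    return len(visited)

print("distinct visible states visited at T=1.0: ", metropolis(1.0))
print("distinct visible states visited at T=0.01:", metropolis(0.01))
```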

Keywords

Artificial intelligence, Machine learning, Neural networks, Data encoding, Free energy landscape
