
Using ML and AI to Better Understand Spin Models

April 04, 2020 by Luke James

Scientists at Tokyo Metropolitan University (TMU) have used machine learning to study spin models, mathematical models used in physics to investigate phase transitions.

One of the main attractions of artificial intelligence (AI) is that algorithms can be taught using pre-classified data and then used to categorize a much broader range of data.

Now, researchers in Tokyo working in condensed matter physics have discovered that neural networks, the same type of AI used to classify images and handwriting, can be used to differentiate between various phases of matter in basic physical models.

In a paper published in Nature Physics, Juan Carrasquilla and Roger G. Melko, the researchers who developed the original approach, described how their method applies to more complicated models and how an AI algorithm trained on one model and then applied to another can reveal important similarities between the phases of different systems.

 

Differentiating Between Spin Phases Using AI

The team used machine learning to look at spin models, mathematical models used in physics to study phase transitions and explain magnetism.

Their work follows on from previous research, which revealed that the same type of AI used to classify images and handwriting could distinguish between the spin states in the simplest spin models: up and down.

 

Achieving a Paradigm Shift 

The researchers looked at the most basic model of magnetism in materials: the Ising model. This is a mathematical model of ferromagnetism in statistical mechanics, consisting of discrete variables that represent the magnetic dipole moments of atomic spins, each of which can be in one of two states, +1 or -1 (up or down). The spins sit on an atomic lattice, and the energy of the system depends on the relative arrangement of neighboring spins.
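
For readers who want a concrete picture, the following is a minimal, hypothetical Python sketch (not the researchers' code) that builds a small lattice of +1/-1 spins and evaluates the standard nearest-neighbor Ising energy, which is lowest when neighboring spins align.

```python
# Minimal sketch of the 2D Ising model described above: spins sit on a lattice,
# each takes the value +1 or -1, and the energy -J * sum(s_i * s_j) over
# neighboring pairs is lowest when neighbors align.
import numpy as np

rng = np.random.default_rng(0)
L = 8                                       # lattice is L x L
spins = rng.choice([-1, 1], size=(L, L))    # a random, "high temperature"-like configuration

def ising_energy(s, J=1.0):
    """Nearest-neighbor Ising energy with periodic boundary conditions."""
    right = np.roll(s, -1, axis=1)          # neighbor to the right of each site
    down = np.roll(s, -1, axis=0)           # neighbor below each site
    return -J * np.sum(s * right + s * down)

print("energy of a random configuration:", ising_energy(spins))
print("energy of a fully aligned configuration:", ising_energy(np.ones((L, L), dtype=int)))
```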

Depending on conditions, these spins can organize themselves into a ferromagnetic phase or take up arbitrary directions in a paramagnetic phase. Typically, analyzing this system requires studying an averaged quantity (e.g., the sum of all spins), but using machine learning, the team was able to feed in a whole microscopic configuration to categorize a phase. This, the team says, is a paradigm shift.
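
As a rough illustration of that shift, the toy sketch below (again hypothetical, not the team's actual pipeline) flattens each whole spin configuration into a feature vector and hands it to a small neural network classifier, rather than reducing each snapshot to a single averaged number such as the magnetization. The "low temperature" and "high temperature" samples are crude stand-ins for real Monte Carlo snapshots.

```python
# Toy illustration of classifying phases from whole spin configurations
# rather than from a single averaged quantity such as the magnetization.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(1)
L, n_samples = 8, 200

def ordered_sample():
    s = np.ones((L, L), dtype=int)           # ferromagnetic-like: almost all spins up
    flips = rng.random((L, L)) < 0.05        # flip a few spins as "thermal" noise
    s[flips] *= -1
    return s

def disordered_sample():
    return rng.choice([-1, 1], size=(L, L))  # paramagnetic-like: random up/down

X = np.array([ordered_sample().ravel() for _ in range(n_samples)] +
             [disordered_sample().ravel() for _ in range(n_samples)])
y = np.array([0] * n_samples + [1] * n_samples)   # 0 = ordered phase, 1 = disordered phase

clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=0)
clf.fit(X, y)
print("training accuracy on whole configurations:", clf.score(X, y))
```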

 

A graphic simulating the low-temperature and high-temperature phases of a 2D Ising model.

Simulated low-temperature (left) and high-temperature (right) phases of a 2D Ising model. Here, blue points are spins pointing up and red points are spins pointing down. On the left, we see a ferromagnetic phase; on the right, the ratio of up to down spins is closer to 50:50, which is called a paramagnetic phase. Image credit: Tokyo Metropolitan University.

 

Taking the Approach to the Next Level

Now, a team led by Professors Hiroyuki Mori and Yutaka Okabe at TMU is collaborating with Singapore's Bioinformatics Institute to take this approach to the next level.

In its current form, as developed by Carrasquilla and Melko, the approach cannot be applied to models more complex than the Ising model, because other models, such as the q-state Potts model, can exhibit states other than “up” or “down”. And even though the Potts model has a phase transition, telling its phases apart is not simple: in a five-state Potts model, for example, there are 120 physically equivalent states, corresponding to the 5! ways of permuting the five labels.
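
That counting can be checked directly: in a q-state Potts model the energy only asks whether neighboring spins carry the same label, so any permutation of the labels leaves the energy unchanged. The short sketch below (illustrative only, not the study's code) verifies this for all 120 relabelings of a random five-state configuration.

```python
# Sketch of the label-permutation symmetry of a q-state Potts model: permuting
# the q labels gives a physically equivalent configuration with the same energy.
from itertools import permutations
import numpy as np

rng = np.random.default_rng(2)
q, L = 5, 6
spins = rng.integers(0, q, size=(L, L))      # each site holds one of q = 5 labels

def potts_energy(s, J=1.0):
    """Nearest-neighbor Potts energy: a pair contributes only if the labels match."""
    right = np.roll(s, -1, axis=1)
    down = np.roll(s, -1, axis=0)
    return -J * (np.sum(s == right) + np.sum(s == down))

energies = set()
relabelings = list(permutations(range(q)))   # 5! = 120 ways to rename the labels
for perm in relabelings:
    lookup = np.array(perm)
    energies.add(potts_energy(lookup[spins]))

print(len(relabelings), "relabelings, distinct energies:", energies)
```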

To solve this problem and help AI differentiate between phases in models that exhibit a large number of physically equivalent states, the TMU team is supplementing the approach with more microscopic information, such as how the state of a particular atom relates to the state of another atom some distance away, that is, how the spins correlate over separation.
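
One simple stand-in for this kind of information, shown in the sketch below, is a distance-dependent correlation: for each separation r, the fraction of sites whose spin matches the spin r sites away. The study's exact "correlation configuration" construction may differ; this is only an illustrative proxy.

```python
# Sketch of distance-dependent correlation features for a Potts-like configuration:
# for each separation r, measure how often a spin matches the spin r sites away.
import numpy as np

rng = np.random.default_rng(3)
q, L = 3, 16
spins = rng.integers(0, q, size=(L, L))

def same_state_correlation(s, r):
    """Fraction of sites whose spin matches the spin r columns to the right (periodic)."""
    shifted = np.roll(s, -r, axis=1)
    return np.mean(s == shifted)

features = [same_state_correlation(spins, r) for r in range(1, L // 2 + 1)]
print("correlation features vs. separation:", np.round(features, 3))
```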

 

Training AI Algorithms to Distinguish Spin Phases 

After training the AI with several of these correlation configurations for both three- and five-state Potts models, the team discovered that it could correctly classify the phases and identify the temperature at which the transition occurred.
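
One common way to read off a transition temperature from such a classifier (a generic technique, not necessarily the team's exact procedure) is to track the predicted probability of the ordered phase as temperature rises and locate where it crosses 0.5. The sketch below uses invented placeholder numbers purely to show the idea.

```python
# Toy sketch of estimating a transition temperature from a trained classifier's output.
# The probabilities below are invented placeholder values, not results from the study:
# p_ordered is the classifier's average predicted probability, at each temperature,
# that a configuration belongs to the ordered (low-temperature) phase.
import numpy as np

temperatures = np.array([0.70, 0.80, 0.90, 1.00, 1.10, 1.20, 1.30])
p_ordered = np.array([0.99, 0.97, 0.88, 0.55, 0.18, 0.04, 0.01])   # placeholder values

# Estimate the transition as the temperature where p_ordered crosses 0.5,
# using linear interpolation between the two bracketing points.
i = np.argmax(p_ordered < 0.5)                  # first index below 0.5
t1, t2 = temperatures[i - 1], temperatures[i]
p1, p2 = p_ordered[i - 1], p_ordered[i]
t_c = t1 + (0.5 - p1) * (t2 - t1) / (p2 - p1)
print("estimated transition temperature:", round(float(t_c), 3))
```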

Then, to test the waters further, they applied the same approach to the q-state clock model, in which each spin adopts one of q orientations on a circle. The team was able to train an AI algorithm to tell three phases apart in six- and four-state clock models, discovering a deep connection between one of the three phases, the Berezinskii-Kosterlitz-Thouless (BKT) phase, and the critical phase that arises at the ‘second-order’ phase transition in the four-state clock model.
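
For reference, in a q-state clock model each spin points along one of q evenly spaced directions on a circle, and neighboring spins interact through the cosine of their angle difference. The sketch below (not the study's code) sets up a random six-state clock configuration and evaluates that energy.

```python
# Minimal sketch of a q-state clock model: each spin points along one of q
# evenly spaced directions, and neighbors interact via cos(theta_i - theta_j).
import numpy as np

rng = np.random.default_rng(4)
q, L = 6, 8
states = rng.integers(0, q, size=(L, L))        # integer labels 0 .. q-1
angles = 2 * np.pi * states / q                 # each label maps to an angle on the circle

def clock_energy(theta, J=1.0):
    """Nearest-neighbor clock-model energy with periodic boundary conditions."""
    right = np.roll(theta, -1, axis=1)
    down = np.roll(theta, -1, axis=0)
    return -J * np.sum(np.cos(theta - right) + np.cos(theta - down))

print("energy of a random six-state clock configuration:", round(float(clock_energy(angles)), 3))
```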

 

Why This is Important

In theory, the method is applicable to a wide range of scientific problems involving universality: the observation that large classes of systems share properties that are independent of their dynamical details. Using universality, common traits can be identified in seemingly unrelated systems that nevertheless exhibit unified behavior.

The characteristics of machine learning lend themselves well to identifying these features in the most complex models and systems, enabling scientists and engineers to explore deep connections that underpin technology and the wider natural world.