Understanding the dynamics of synchronized behavior in natural systems, from the collective oscillations of biological populations to information processing in neural networks, has long been a pursuit of scientific inquiry. It has been approached in a variety of ways. For example, Nareddy et al. (2020) study the dynamics of ecological systems using Ising models, which are known in physics for their potential to capture universal dynamical properties. Meanwhile, work by J.T. Lizier and colleagues (e.g. Lizier 2014) has introduced dynamic information theory (DIT) to analyze the emergence of distributed computation across different systems.
For Nareddy et al. (2020), the focus lies on elucidating the synchronization dynamics of ecological systems, particularly those exhibiting two-cycle oscillations, through the lens of the Ising model. Borrowed from physics, this model offers a simplified yet powerful framework for understanding emergent patterns of synchrony arising from local interactions. By extending the Ising model to incorporate memory effects, Nareddy et al. (2020) explore how these models can accurately capture the intricate dynamics of ecological populations linked across space. The following figure, taken from their paper, summarizes the approach, which essentially entails finding an Ising model with the same oscillation dynamics as the metapopulation model.
Figure 1. (from Nareddy et al. 2020). The steps in constructing and assessing a dynamical Ising model that best describes a two-cycle metapopulation.
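To convey the flavor of such a model, here is a minimal sketch of a kinetic (Glauber-style) Ising chain with a memory term that favors flipping, which produces two-cycle behavior. This is a generic illustration, not the parameterization fitted by Nareddy et al. (2020); all parameter values (`J`, `mu`, grid size) are assumptions chosen for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (assumptions, not fitted values from the paper).
L = 35      # sites on a 1D ring
T = 200     # time steps
J = 0.5     # neighbor coupling: favors local alignment (spatial synchrony)
mu = 1.0    # memory term: positive mu favors flipping, producing two-cycles

s = np.where(rng.random(L) < 0.5, -1, 1)  # random initial spins
history = [s.copy()]
for t in range(T - 1):
    # Local field from the two neighbors (periodic boundary) minus a
    # memory term that pushes each spin to reverse its own previous state.
    field = J * (np.roll(s, 1) + np.roll(s, -1)) - mu * s
    p_up = 1.0 / (1.0 + np.exp(-2.0 * field))  # Glauber-style flip probability
    s = np.where(rng.random(L) < p_up, 1, -1)
    history.append(s.copy())

spins = np.array(history)  # time down the rows, space across the columns
```

With `mu` large relative to `J`, each spin tends to anti-align with its own past, so the lattice settles into a noisy checkerboard reminiscent of a synchronized two-cycle.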
On the other hand, Lizier and colleagues introduce dynamic information theory (DIT), a mathematical framework that extends classic information theory to quantify different modes of information processing in dynamic systems. Unlike classical information theory, which averages information transfer over time to understand global properties, dynamic information theory allows for the measurement of information-processing dynamics at specific observations or configurations. This localized perspective unveils nuanced insights into how information is stored, transferred, and modified within distributed systems.
Transfer entropy (TE), for example, is one metric from the DIT toolkit. Its application in a one-dimensional spatial context is illustrated below. The left column describes how TE works. The right column shows an example application to a cellular automaton, another well-studied type of spatial model, and how TE can reveal the movement of information across the grid as it is processed by neighboring sites.
Fig. 1 Local transfer entropy indicated by the blue arrow: information contained in the realization y_n of the source variable Y about the next value x_{n+1} of the destination variable X at time n+1, in the context of the corresponding realization x_n^(k) of the destination’s past state.
Fig. 2 (from Lizier 2014) Local information dynamics in ECA rule 54 for the raw values in (a) (black for “1”, white for “0”). 35 time steps are displayed for 35 cells, and time increases down the page for all CA plots. All units are in bits. (b) Local active information storage; Local apparent transfer entropy: (c) one cell to the right, and (d) one cell to the left per time step.
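Formally, the local transfer entropy illustrated in Fig. 1 can be written (following Lizier 2014) as:

```latex
t_{Y \to X}(n+1) = \log_2 \frac{p\!\left(x_{n+1} \mid x_n^{(k)}, y_n\right)}{p\!\left(x_{n+1} \mid x_n^{(k)}\right)}
```

Averaging this local quantity over all time steps n recovers the familiar (global) transfer entropy; keeping it unaveraged is what lets the plots above resolve information transfer at individual points in space and time.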
There is tremendous potential for insight into complex natural systems when combining the perspectives of both papers. By applying dynamic information theory to the Ising models proposed in Nareddy et al. (2020), we can not only characterize the macroscopic patterns of synchrony but also dissect the intricate dynamics underlying them. For example, consider the results below, which illustrate my ongoing research in this area.
Information dynamics of a metapopulation
Here, I use a metapopulation model in which the Ricker model describes population dynamics at each site on a regular grid, populations interact via dispersal across a 1D spatial grid, and random fluctuations represent variable environmental conditions. Below is an example realization. Time moves down the page, and space corresponds to the x-axis.
All code is available here: https://github.com/jusinowicz/ising_info
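As a rough illustration (not the exact code in the repository), the simulation can be sketched as follows. All parameter values here are assumptions, chosen so that the single-site Ricker map sits in its two-cycle regime.

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative parameter values (assumptions, not the values used in the study).
L = 35        # number of sites on the 1D grid
T = 200       # number of time steps
r = 2.2       # Ricker growth rate: two-cycle regime for the single-site map
d = 0.05      # fraction of each population exchanged with each neighbor
sigma = 0.1   # strength of environmental noise

N = np.empty((T, L))
N[0] = rng.uniform(0.5, 1.5, size=L)  # random initial densities

for t in range(T - 1):
    # Local Ricker dynamics with multiplicative environmental noise.
    eps = rng.normal(0.0, sigma, size=L)
    grown = N[t] * np.exp(r * (1.0 - N[t]) + eps)
    # Nearest-neighbor dispersal on a ring (periodic boundary).
    N[t + 1] = (1 - 2 * d) * grown + d * (np.roll(grown, 1) + np.roll(grown, -1))
```

Plotting `N` with time down the rows and space across the columns gives realizations like the one shown; setting `d = 0` and `sigma = 0` produces the checkerboard test case discussed next.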
To develop some intuition for the DIT metrics, it is useful to look at some test cases. Consider, for example, the situation with no dispersal and no noise, shown on the right. In this case, populations become locked into an oscillating cycle between high and low numbers, which looks like a checkerboard when mapped out.
Information is being stored in the form of regular, unbroken patterns of density. The Active Information Storage (AIS), shown by the yellow block on the right, confirms this. The figure itself is not very interesting, beyond showing that a fixed amount of information is stored uniformly across the grid.
What happens if we add noise to the system? Not surprisingly, information storage is disrupted: we now see values of AIS that vary depending on where you look in time and space.
Adding dispersal creates the potential for information transfer between sites. After introducing a low rate of dispersal between neighboring sites, it becomes interesting to measure the TE, as in the earlier figure from Lizier (2014). Here I have placed the AIS and the TE side by side. If you look closely, you may notice that low values of AIS tend to correspond with high values of TE. This is because a reduction in density at a site triggers a shuffling of dynamics until the pattern recovers. One way to think about this is that the information flow reflects the apparent flow of population from neighboring sites, which helps stabilize the oscillations in the disturbed site. If we removed the noise but retained dispersal, we would see some information transfer in early time steps as populations recovered from their random initial densities and the whole system moved towards its final, stable configuration.
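Local TE can be sketched with the same plug-in approach, conditioning the next state on both the site's own past and a neighbor's current state. Again this is a toy estimator (history length k = 1, probabilities pooled over all sites), not the implementation behind the figures. The deterministic checkerboard provides a useful baseline: it stores information but transfers none, since each site's own past already fully predicts its future.

```python
import numpy as np

def local_te(states, offset=1):
    """Local apparent transfer entropy into each site from its neighbor
    `offset` cells away, with history length k = 1 and a pooled plug-in
    estimator. A toy sketch of the quantity defined in Lizier (2014)."""
    T, L = states.shape
    src = np.roll(states, offset, axis=1)[:-1].ravel()  # y_n (neighbor)
    past = states[:-1].ravel()                          # x_n
    nxt = states[1:].ravel()                            # x_{n+1}
    # Joint distribution of (source, past, next) from pooled counts.
    counts = np.zeros((2, 2, 2))
    for y, x, xn in zip(src, past, nxt):
        counts[y, x, xn] += 1
    p = counts / counts.sum()
    with np.errstate(divide="ignore", invalid="ignore"):
        # p(next | past, source) and p(next | past); unobserved combinations
        # yield NaN here but are never indexed below.
        p_next_given_both = p / p.sum(axis=2, keepdims=True)
        p_past_next = p.sum(axis=0)
        p_next_given_past = p_past_next / p_past_next.sum(axis=1, keepdims=True)
        local = np.log2(p_next_given_both[src, past, nxt]
                        / p_next_given_past[past, nxt])
    return local.reshape(T - 1, L)

# Baseline: the deterministic checkerboard stores but does not transfer info.
checker = np.indices((40, 35)).sum(axis=0) % 2
te = local_te(checker)
print(np.abs(te).max())  # 0.0: the site's own past fully predicts its future
```

Only when noise disturbs a site and dispersal carries the neighbors' influence into its recovery does the conditional in the numerator diverge from the one in the denominator, producing the nonzero TE seen in the figure.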
Are the information dynamics of the metapopulation and its Ising model the same?
At this time it remains unclear. My preliminary results suggest that for some cases (chosen randomly from parameter combinations) the information dynamics of the metapopulation do not match those of its Ising representation, while for others they do. However, this question remains to be approached in a systematic, rigorous way. Consider this case, where noise is high relative to dispersal in the metapopulation model.
The inferred Ising model seems to match visually.
We can see that the information storage is similar in many regions.
This is also true for the transfer entropy.
Looking forward
If it is consistently true that the DIT properties of these models match, then this supports the idea that the Ising model is a good representation of oscillator dynamics in biological systems. On the other hand, finding that the DIT properties do not match would suggest that there are features of the underlying processes that DIT uniquely captures. Much research remains before these early results can fully inform our understanding of oscillating population dynamics. It is clear, however, that there is tremendous potential here to explore exciting new dimensions of universal behavior in complex systems.
This integration of methodologies promises to shed light on fundamental questions about the mechanisms driving synchronization phenomena across diverse biological scales. In my ongoing research I have begun to do just this, merging the simplicity of Ising models with the precision of dynamic information theory, aiming to unravel the mysteries of synchronized behavior in ecological and biological systems. Through this interdisciplinary approach, I seek to advance our understanding of complex phenomena and pave the way for future explorations into the dynamics of synchronization in natural systems.
Code
https://github.com/jusinowicz/ising_info
References
Lizier, J.T., Prokopenko, M. & Zomaya, A.Y. (2014). A framework for the local information dynamics of distributed computation in complex systems. In: Guided self-organization: inception. Springer, pp. 115–158.
Nareddy, V.R., Machta, J., Abbott, K.C., Esmaeili, S. & Hastings, A. (2020). Dynamical Ising model of spatially coupled ecological oscillators. J. R. Soc. Interface., 17, 20200571.