Turing instability on multiplex networks


This talk was given at the workshop From interaction patterns to critical behaviour.

Patterns are widespread in nature: regular forms and geometries, like spirals, trees and stripes, recur in different contexts. In a seminal paper, Alan Turing set forth a theory by which pattern formation may arise from the dynamical interplay between reaction and diffusion in a system. Under specific conditions, diffusion drives an instability by perturbing a homogeneous stable fixed point, via an activator-inhibitor mechanism. As the perturbation grows, nonlinear reactions balance the diffusion terms, yielding the asymptotic, spatially inhomogeneous, steady state. However, the conventional approach to network theory is not general enough to capture the complexity that hides behind real-world applications: self-organization may proceed across multiple, inter-linked networks. For this reason, multiplex networks, collections of layers whose mutual connections run between twin nodes, have been introduced as a necessary leap forward in the modelling effort. Here we aim at developing the theory of pattern formation for a reaction-diffusion system defined on this class of complex networks by means of a perturbative approach. The interlayer diffusion constants act as the small parameter of the expansion, and the unperturbed state coincides with the limiting setting in which the multiplex layers are decoupled. The interaction between adjacent layers can seed the instability of a homogeneous fixed point, yielding self-organized patterns which are instead impeded in the limit of decoupled layers. Conversely, patterns on individual layers can fade away due to the cross-talk between layers. Analytical results are compared to direct simulations.
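The instability test sketched above can be made concrete with a small numerical experiment. The sketch below is an illustrative assumption, not the talk's actual model: it uses Brusselator kinetics (a standard activator-inhibitor choice) on a two-layer multiplex of perturbed ring networks, linearizes around the homogeneous fixed point, and checks the sign of the largest growth rate as a function of the interlayer diffusion constants. All parameter values and network choices are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def ring_laplacian(n, extra_edges=0):
    """Network Laplacian in the diffusion convention L = A - D (non-positive spectrum)."""
    A = np.zeros((n, n))
    for i in range(n):
        A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1.0
    for _ in range(extra_edges):  # random shortcuts to break the ring's symmetry
        i, j = rng.choice(n, size=2, replace=False)
        A[i, j] = A[j, i] = 1.0
    return A - np.diag(A.sum(axis=1))

# Brusselator kinetics: f = a - (b+1)u + u^2 v, g = b u - u^2 v  (illustrative choice)
a, b = 2.0, 4.0
# Reaction Jacobian at the homogeneous fixed point (u0, v0) = (a, b/a)
J = np.array([[b - 1.0,  a**2],
              [-b,      -a**2]])

n = 50
L1 = ring_laplacian(n, extra_edges=40)   # intra-layer Laplacian, layer 1
L2 = ring_laplacian(n, extra_edges=40)   # intra-layer Laplacian, layer 2
Du = (1.0, 1.0)    # activator diffusivities on layers 1 and 2
Dv = (10.0, 10.0)  # inhibitor diffusivities (large ratio favors Turing instability)

def max_growth_rate(Du12, Dv12):
    """Largest real part of the linearized operator's spectrum; > 0 means instability.

    Du12, Dv12 are the interlayer diffusion constants, i.e. the small
    parameter of the perturbative expansion described in the abstract.
    """
    I = np.eye(n)
    Z = np.zeros((n, n))
    C = np.block([[-I, I], [I, -I]])  # coupling between twin nodes of the two layers
    # Per-species supra-Laplacians: block-diagonal intra-layer diffusion + coupling
    Lu = np.block([[Du[0] * L1, Z], [Z, Du[1] * L2]]) + Du12 * C
    Lv = np.block([[Dv[0] * L1, Z], [Z, Dv[1] * L2]]) + Dv12 * C
    I2n = np.eye(2 * n)
    # Linearization of the reaction-diffusion system around the homogeneous state
    M = np.block([[J[0, 0] * I2n + Lu, J[0, 1] * I2n],
                  [J[1, 0] * I2n,      J[1, 1] * I2n + Lv]])
    return np.max(np.linalg.eigvals(M).real)

print("decoupled layers:", max_growth_rate(0.0, 0.0))
print("weak coupling:   ", max_growth_rate(0.1, 0.1))
```

Scanning `Du12, Dv12` over a range then shows how the interlayer coupling shifts the dispersion relation, which is the mechanism by which it can either seed or suppress patterns.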

You can find the manuscript here.