Highlights

- Empirical and theoretical work suggests that the brain operates at the edge of a critical phase transition between order and disorder.
- The wider adoption and investigation of criticality theory as a unifying framework in neuroscience has been hindered in part by the potentially daunting complexity of its analytical and theoretical foundations.
- Among critical phase transitions, avalanche criticality and edge of chaos criticality are particularly relevant to studying brain function and dysfunction.
- The computational features of criticality provide a conceptual link between neuronal dynamics and cognition.
- Mounting evidence suggests that near-criticality, rather than strict criticality, may be the more plausible mode of operation for the brain.
- The distance to criticality is a promising and underexploited biological parameter for characterizing cognitive differences and mental illness.

Abstract

Criticality is the singular state of complex systems poised at the brink of a phase transition between order and randomness. Such systems display remarkable information-processing capabilities, evoking the compelling hypothesis that the brain may itself be critical. This foundational idea is now drawing renewed interest thanks to high-density data and converging cross-disciplinary knowledge. Together, these lines of inquiry have shed light on the intimate link between criticality, computation, and cognition. Here, we review these emerging trends in criticality neuroscience, highlighting new data pertaining to the edge of chaos and near-criticality, and making a case for the distance to criticality as a useful metric for probing cognitive states and mental illness. This unfolding progress in the field contributes to establishing criticality theory as a powerful mechanistic framework for studying emergent function and its efficiency in both biological and artificial neural networks.
Glossary

Avalanche criticality: a large class of continuous phase transitions that separate a phase where activity dissipates from a phase where activity is amplified; they are characterized by scale-free avalanches.

Chaos: the property of a process whose trajectory in phase space is sensitive to small differences in initial conditions.

Control parameter: a variable that, when tuned past a critical value, brings about a phase transition in a system (e.g., temperature in the water–steam transition).

Dynamic range: the range of input rates that are separately encodable within the system dynamics.

Dynamical system: a physical model or mathematical system that evolves in time according to fixed equations, but which often gives rise to complex behavior.

Edge of chaos criticality: a phase transition between a stable phase and a chaotic phase.

Emergent: said of a high-level property that cannot readily be explained in terms of its low-level constituents. This intractability can be variously interpreted. In weak emergence interpretations, the inexplicability is merely a practical one, due to the sheer complexity of the computation required to reach an explanation, and reductionism still holds in principle. In strong emergence interpretations, the emergent property possesses causal autonomy independently of its constituents, thus challenging the principle of reductionism.

Mutual information: an information-theoretic measure of the amount of information shared between two sources.

Neuronal avalanche: a cascade of neural events (e.g., action potentials) that starts from a single seed event and propagates through a population.

Phase transition: a boundary (hyperplane) in phase space at which a macroscopic property of the system (the order parameter) qualitatively changes (e.g., the water–steam transition, the magnetization of iron, or the onset of chaos in artificial neural networks).

Power law: a mathematical relationship f(x) ~ x^β, where one quantity f(x) varies proportionally to another quantity x raised to a certain power β. Also known as a scaling law.

Scale-free: said of a shape or process whose statistics remain the same under a change of scale (i.e., spatial, temporal, or energy scale).

Self-organized criticality: any type of criticality that is autonomously maintained through homeostatic-like feedback loops.

Spin glass: a magnet-like model where the couplings between nodes can be positive or negative (instead of only positive), resulting in so-called frustrated interactions, chaos, and metastable states.

Stability: the property of a system that returns to its initial state within a finite period after a perturbation.

Statistical mechanics: the branch of physics concerned with explaining the large-scale behavior of systems in terms of the collective action of their constituent elements.

Universality: a property of large classes of dynamical systems whereby their macroscopic properties are independent of many of their microscopic parameters.
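Several of the definitions above (control parameter, avalanche, scale-free statistics) can be made concrete with a toy model. The sketch below is a minimal branching-process simulation, not taken from any study reviewed here; the function names (`avalanche_size`, `mean_size`) and the particular offspring rule are illustrative assumptions. The branching ratio sigma plays the role of the control parameter: for sigma < 1 activity dissipates, for sigma > 1 it is amplified, and at sigma = 1 avalanche sizes become scale-free, so the mean avalanche size grows sharply as sigma approaches the critical value.

```python
import random

def avalanche_size(sigma, max_size=10_000, rng=random):
    """Simulate one avalanche of a simple branching process.

    Each active unit independently produces 0, 1, or 2 descendants
    (two Bernoulli(sigma/2) draws), so the mean offspring count is
    sigma -- the control parameter of the model.
    """
    active, size = 1, 1
    while active and size < max_size:
        offspring = 0
        for _ in range(active):
            offspring += (rng.random() < sigma / 2) + (rng.random() < sigma / 2)
        active = offspring
        size += offspring
    return size

def mean_size(sigma, n=2000, seed=0):
    """Average avalanche size over n runs with a fixed seed."""
    rng = random.Random(seed)
    return sum(avalanche_size(sigma, rng=rng) for _ in range(n)) / n

# Subcritical avalanches (sigma = 0.5) die out after a few events;
# near-critical avalanches (sigma = 0.99) span many scales, and the
# rare very large ones inflate the mean dramatically.
print(mean_size(0.5))
print(mean_size(0.99))
```

Sweeping sigma across 1 in this model reproduces, in miniature, the order/disorder transition described in the abstract: the mean avalanche size diverges as the system approaches criticality, which is why the distance to criticality is measurable from avalanche statistics.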
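The definitions of chaos and edge of chaos criticality can likewise be illustrated with the logistic map, a standard textbook dynamical system used here purely as an example (the helper names `logistic_trajectory` and `max_separation` are ours). Below the onset of chaos, two trajectories started a hair apart converge; past it, their separation is amplified until they decorrelate entirely, which is precisely the sensitivity to initial conditions that defines chaos.

```python
def logistic_trajectory(r, x0, steps):
    """Iterate the logistic map x -> r * x * (1 - x) for `steps` steps."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

def max_separation(r, x0=0.4, eps=1e-9, steps=50):
    """Largest distance between two trajectories started eps apart --
    a crude probe of sensitivity to initial conditions."""
    a = logistic_trajectory(r, x0, steps)
    b = logistic_trajectory(r, x0 + eps, steps)
    return max(abs(p - q) for p, q in zip(a, b))

# Stable regime (r = 2.5): the tiny perturbation is forgotten.
# Chaotic regime (r = 4.0): the perturbation grows to order one.
print(max_separation(2.5))
print(max_separation(4.0))
```

The boundary between these two regimes is a phase transition between a stable phase and a chaotic phase, i.e., an edge of chaos in the sense defined above; in recurrent neural networks the analogous transition is tuned by the strength of the recurrent couplings rather than by r.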