The free-energy principle: a rough guide to the brain?

Salience (neuroscience), Helmholtz free energy, Free-energy principle, Entropy (arrow of time), Function (biology), Cognitive science, Psychology, Perception, Free will, Computer science, Artificial intelligence, Statistical physics, Epistemology, Physics, Neuroscience, Machine learning, Quantum mechanics, Evolutionary biology, Biology, Philosophy
Author
Karl J. Friston
Source
Journal: Trends in Cognitive Sciences [Elsevier]
Volume/Issue: 13 (7): 293-301; Cited by: 1364
Identifier
DOI:10.1016/j.tics.2009.04.005
Abstract

This article reviews a free-energy formulation that advances Helmholtz's agenda to find principles of brain function based on conservation laws and neuronal energy. It rests on advances in statistical physics, theoretical biology and machine learning to explain a remarkable range of facts about brain structure and function. We could have just scratched the surface of what this formulation offers; for example, it is becoming clear that the Bayesian brain is just one facet of the free-energy principle and that perception is an inevitable consequence of active exchange with the environment. Furthermore, one can see easily how constructs like memory, attention, value, reinforcement and salience might disclose their simple relationships within this framework.

Glossary

Kullback-Leibler divergence: information divergence, information gain, cross or relative entropy; a non-commutative measure of the difference between two probability distributions (see the sketch following this glossary).
Bayesian surprise: a measure of salience based on the divergence between the recognition and prior densities. It measures the information in the data that can be recognised.
Conditional density (or posterior density): the probability distribution of causes or model parameters, given some data; i.e., a probabilistic mapping from observed data to causes.
Empirical priors: priors that are induced by hierarchical models; they provide constraints on the recognition density in the usual way but depend on the data.
Entropy: the average surprise of outcomes sampled from a probability distribution or density. A density with low entropy means that, on average, the outcome is relatively predictable.
Ergodic: a process is ergodic if its long-term time average converges to its ensemble average. Ergodic processes that evolve for a long time forget their initial states.
Free energy: an information-theory measure that bounds (is greater than) the surprise on sampling some data, given a generative model (see the identity and sketch following this glossary).
Generalised coordinates of motion: cover the value of a variable, its motion, acceleration, jerk and higher orders of motion. A point in generalised coordinates corresponds to a path or trajectory over time.
Generative model (or forward model): a probabilistic mapping from causes to observed consequences (data). It is usually specified in terms of the likelihood of getting some data given their causes (parameters of a model) and priors on the parameters.
Gradient descent: an optimisation scheme that finds a minimum of a function by changing its arguments in proportion to the negative of the gradient of the function at the current value.
Helmholtz machine: a device or scheme that uses a generative model to furnish a recognition density. Helmholtz machines learn hidden structure in data by optimising the parameters of generative models.
Prior: the probability distribution or density on the causes of data that encodes beliefs about those causes prior to observing the data.
Recognition density (or approximating conditional density): an approximate probability distribution of the causes of data. It is the product of inference, or inverting a generative model.
Stochastic: the successive states of stochastic processes are governed by random effects.
Sufficient statistics: quantities which are sufficient to parameterise a probability density (e.g., the mean and covariance of a Gaussian density).
Surprise (or self-information): the negative log-probability of an outcome. An improbable outcome is therefore surprising.
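The free energy, surprise and Kullback-Leibler entries above fit together in one standard variational identity. The following is a generic statement of that bound, written from the glossary definitions rather than quoted from the paper; here q(ϑ) is the recognition density, y the data and m the generative model:

```latex
F \;=\; \mathbb{E}_{q(\vartheta)}\!\left[-\ln p(y,\vartheta \mid m)\right] \;-\; \mathbb{H}\!\left[q(\vartheta)\right]
  \;=\; \underbrace{-\ln p(y \mid m)}_{\text{surprise}}
  \;+\; \underbrace{D_{\mathrm{KL}}\!\left[\,q(\vartheta)\,\big\|\,p(\vartheta \mid y, m)\,\right]}_{\geq\, 0}
```

Because the Kullback-Leibler term is non-negative, the free energy is greater than (bounds) the surprise, and the bound is tight exactly when the recognition density equals the conditional density.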
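The non-commutativity claimed in the Kullback-Leibler entry is easy to check numerically. Below is a minimal sketch for univariate Gaussians, using the standard closed-form expression for their divergence; the function name and example parameters are illustrative, not from the paper:

```python
import numpy as np

def kl_gaussian(mu_p, s_p, mu_q, s_q):
    """KL[ N(mu_p, s_p^2) || N(mu_q, s_q^2) ] in nats (closed form)."""
    return np.log(s_q / s_p) + (s_p**2 + (mu_p - mu_q)**2) / (2 * s_q**2) - 0.5

# Swapping the arguments gives a different number, so the measure is
# non-commutative (it is not a metric distance).
print(kl_gaussian(0.0, 1.0, 1.0, 2.0))  # KL[p || q] ~ 0.443
print(kl_gaussian(1.0, 2.0, 0.0, 1.0))  # KL[q || p] ~ 1.307
```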
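Several of the remaining entries (generative model, recognition density, sufficient statistics, gradient descent, free energy, surprise) can be seen working together in a single toy computation. The sketch below assumes a hypothetical one-dimensional linear-Gaussian generative model, not one from the paper, for which the free energy has a closed form; gradient descent on the sufficient statistics of a Gaussian recognition density drives the free energy down to the surprise, at which point q matches the exact posterior:

```python
import numpy as np

# Hypothetical generative model: hidden cause theta ~ N(mu_p, s_p^2) (prior);
# data y | theta ~ N(theta, s_y^2) (likelihood). The recognition density
# q(theta) = N(mu_q, s_q^2) is parameterised by its sufficient statistics.
mu_p, s_p = 0.0, 1.0   # prior mean and standard deviation
s_y = 0.5              # observation-noise standard deviation
y = 1.3                # a single observed datum

def free_energy(mu_q, s_q):
    """F = E_q[-ln p(y, theta)] - H[q], closed form for Gaussians."""
    e_nll = 0.5 * np.log(2 * np.pi * s_y**2) + ((y - mu_q)**2 + s_q**2) / (2 * s_y**2)
    e_nlp = 0.5 * np.log(2 * np.pi * s_p**2) + ((mu_q - mu_p)**2 + s_q**2) / (2 * s_p**2)
    entropy_q = 0.5 * np.log(2 * np.pi * np.e * s_q**2)
    return e_nll + e_nlp - entropy_q

# Surprise (self-information) of the data under the marginal
# p(y) = N(mu_p, s_p^2 + s_y^2); free energy can never fall below this.
surprise = (0.5 * np.log(2 * np.pi * (s_p**2 + s_y**2))
            + (y - mu_p)**2 / (2 * (s_p**2 + s_y**2)))

# Gradient descent: move each sufficient statistic of q in proportion to the
# negative gradient of F at its current value.
mu_q, s_q, lr = 0.0, 2.0, 0.05
for _ in range(2000):
    d_mu = (mu_q - y) / s_y**2 + (mu_q - mu_p) / s_p**2
    d_s = s_q / s_y**2 + s_q / s_p**2 - 1.0 / s_q
    mu_q -= lr * d_mu
    s_q -= lr * d_s

print(f"free energy : {free_energy(mu_q, s_q):.6f}")  # converges down to...
print(f"surprise    : {surprise:.6f}")                # ...the surprise
# The exact posterior p(theta | y) for comparison: q matches it at the minimum,
# because the gap F - surprise is KL[q || posterior], which shrinks to zero.
post_var = 1.0 / (1.0 / s_p**2 + 1.0 / s_y**2)
post_mean = post_var * (mu_p / s_p**2 + y / s_y**2)
print(f"q: mu {mu_q:.4f} vs {post_mean:.4f}, sd {s_q:.4f} vs {post_var**0.5:.4f}")
```

Because this toy model is conjugate, the Gaussian recognition density can match the posterior exactly and the bound becomes an equality; in richer models the minimised free energy remains strictly above the surprise.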
