Statistical mechanics
Statistical inference
Principle of maximum entropy
Statistical theory
Entropy (arrow of time)
Inference
Information theory
Mathematics
Statistical hypothesis testing
Physical law
Statistical physics
Ergodicity
Computer science
Statistics
Physics
Artificial intelligence
Quantum mechanics
Source
Journal: Physical Review [American Physical Society]
Date: 1957-05-15
Volume/Issue: 106 (4): 620-630
Citations: 12122
Identifiers
DOI: 10.1103/physrev.106.620
Abstract
Information theory provides a constructive criterion for setting up probability distributions on the basis of partial knowledge, and leads to a type of statistical inference which is called the maximum-entropy estimate. It is the least biased estimate possible on the given information; i.e., it is maximally noncommittal with regard to missing information. If one considers statistical mechanics as a form of statistical inference rather than as a physical theory, it is found that the usual computational rules, starting with the determination of the partition function, are an immediate consequence of the maximum-entropy principle. In the resulting "subjective statistical mechanics," the usual rules are thus justified independently of any physical argument, and in particular independently of experimental verification; whether or not the results agree with experiment, they still represent the best estimates that could have been made on the basis of the information available.

It is concluded that statistical mechanics need not be regarded as a physical theory dependent for its validity on the truth of additional assumptions not contained in the laws of mechanics (such as ergodicity, metric transitivity, equal a priori probabilities, etc.). Furthermore, it is possible to maintain a sharp distinction between its physical and statistical aspects. The former consists only of the correct enumeration of the states of a system and their properties; the latter is a straightforward example of statistical inference.
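As a concrete illustration of the abstract's claim that the partition function follows immediately from the maximum-entropy principle, the standard variational calculation can be sketched as follows (this sketch is not part of the record itself; the notation, with states $i$ of energy $E_i$ and Lagrange multipliers $\lambda_0$ and $\beta$, is the conventional one): maximizing the entropy subject to normalization and a prescribed expected energy yields the canonical distribution and the partition function $Z(\beta)$.

```latex
% Maximum-entropy derivation of the canonical distribution (standard sketch).
% Maximize S = -\sum_i p_i \ln p_i subject to
%   \sum_i p_i = 1  and  \sum_i p_i E_i = \langle E \rangle.
\begin{align}
  \mathcal{L} &= -\sum_i p_i \ln p_i
      - \lambda_0 \Big(\sum_i p_i - 1\Big)
      - \beta \Big(\sum_i p_i E_i - \langle E \rangle\Big), \\
  \frac{\partial \mathcal{L}}{\partial p_i}
      &= -\ln p_i - 1 - \lambda_0 - \beta E_i = 0
      \;\Longrightarrow\;
      p_i = \frac{e^{-\beta E_i}}{Z(\beta)},
      \qquad Z(\beta) = \sum_i e^{-\beta E_i}.
\end{align}
```

With $Z(\beta)$ in hand, the usual computational rules the abstract refers to follow by differentiation, e.g. $\langle E \rangle = -\partial \ln Z / \partial \beta$, with $\beta$ fixed by the given expectation value.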