Keywords
Reflection (computer programming)
Information theory
Entropy (arrow of time)
Relation (database)
Set (abstract data type)
Transformation (genetics)
Computer science
Measure (data warehouse)
Mathematics
Information system
Theoretical computer science
Algorithm
Applied mathematics
Data mining
Statistics
Law
Physics
Gene
Programming language
Chemistry
Quantum mechanics
Biochemistry
Political science
Author
Viktor Borisovich Vyatkin
Source
Journal: Information
Publisher: MDPI AG
Date: 2019-04-16
Volume/Issue: 10 (4): 142
Citations: 2
Abstract
A new approach to defining the amount of information is presented, in which information is understood as data about a finite set as a whole, and the average length of an integrative code of elements serves as the measure of information. Within this approach, a formula for the syntropy of reflection was obtained for the first time, that is, for the information that two intersecting finite sets reflect (reproduce) about each other. Features of the reflection of discrete systems through a set of their parts are considered, and it is shown that the reproducible information about a system (the additive syntropy of reflection) and the non-reproducible information (the entropy of reflection) are, respectively, measures of structural order and chaos. On this basis, a general classification of discrete systems is given according to the ratio of order to chaos. Three information laws are established: the law of conservation of the sum of chaos and order; the information law of reflection; and the law of conservation and transformation of information. An assessment of the structural organization and the level of development of discrete systems is presented. It is shown that various measures of information are structural characteristics of the integrative codes of the elements of discrete systems. It is concluded that, from an information-genetic standpoint, the synergetic approach to defining the quantity of information is primary relative to the approaches of Hartley and Shannon.
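As a rough illustration of the quantities named in the abstract (the formulas below are a common reading of Vyatkin's definitions and are not quoted from the paper itself), the sketch assumes that for a finite system of N elements partitioned into parts of sizes N_i, the additive syntropy of reflection is I_sigma = sum_i (N_i/N) * log2(N_i) and the entropy of reflection is S = sum_i (N_i/N) * log2(N/N_i). Under that assumption, their sum is exactly the Hartley measure log2(N), which matches the stated law of conservation of the sum of chaos and order.

# Hedged sketch (Python): order and chaos measures of a partitioned finite system,
# assuming I_sigma = sum (N_i/N)*log2(N_i) and S = sum (N_i/N)*log2(N/N_i);
# the helper name reflection_measures is hypothetical, not from the paper.
from math import log2

def reflection_measures(part_sizes):
    n = sum(part_sizes)                                   # total number of elements N
    i_sigma = sum(k / n * log2(k) for k in part_sizes)    # reproducible information (order)
    s = sum(k / n * log2(n / k) for k in part_sizes)      # non-reproducible information (chaos)
    return i_sigma, s, log2(n)

order, chaos, hartley = reflection_measures([4, 3, 1])
print(order, chaos, hartley)   # order + chaos equals log2(8) = 3 (Hartley measure)

For the example partition [4, 3, 1] the order term is about 1.59 bits and the chaos term about 1.41 bits, summing to 3 bits; a single undivided set gives all order and no chaos, while a partition into singletons gives the reverse.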