Kernel (algebra)
Computer science
Kernel method
Artificial intelligence
String kernel
Tree kernel
Artificial neural network
Machine learning
Set (abstract data type)
Perspective (graphical)
Support vector machine
Kernel embedding of distributions
Theoretical computer science
Algorithm
Mathematics
Discrete mathematics
Programming language
Abstract
With near-term quantum devices available and the race for fault-tolerant quantum computers in full swing, researchers became interested in the question of what happens if we replace a machine learning model with a quantum circuit. While such quantum models are sometimes called quantum neural networks, it has been repeatedly noted that their mathematical structure is actually much more closely related to kernel methods: they analyse data in high-dimensional Hilbert spaces to which we only have access through inner products revealed by measurements. This technical manuscript summarises, formalises and extends the link by systematically rephrasing quantum models as a kernel method. It shows that most near-term and fault-tolerant quantum models can be replaced by a general support vector machine whose kernel computes distances between data-encoding quantum states. In particular, kernel-based training is guaranteed to find better or equally good quantum models than variational circuit training. Overall, the kernel perspective of quantum machine learning tells us that the way that data is encoded into quantum states is the main ingredient that can potentially set quantum models apart from classical machine learning models.
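The central object in this kernel view is the overlap of data-encoding states, k(x, x') = |⟨φ(x)|φ(x')⟩|², which a quantum device estimates via measurements and which can then be handed to any classical kernel machine. The following is a minimal classically simulated sketch of this idea, not the manuscript's own implementation: it assumes a simple single-qubit angle encoding |φ(x)⟩ = RY(x)|0⟩ (a hypothetical choice for illustration) and builds the resulting Gram matrix, which is symmetric, has unit diagonal, and is positive semidefinite, as a valid kernel must be.

```python
import numpy as np

def encode(x):
    # Angle encoding: |phi(x)> = RY(x)|0> = [cos(x/2), sin(x/2)]
    # (an assumed toy feature map; real models use richer encodings)
    return np.array([np.cos(x / 2), np.sin(x / 2)])

def quantum_kernel(x1, x2):
    # Fidelity kernel |<phi(x1)|phi(x2)>|^2 -- the quantity a hardware
    # overlap/swap-test measurement would estimate
    return np.abs(encode(x1) @ encode(x2)) ** 2

# Toy 1-D dataset; the Gram matrix K is what a support vector machine
# with a precomputed kernel would consume
X = np.array([0.1, 0.5, 1.2, 2.0])
K = np.array([[quantum_kernel(a, b) for b in X] for a in X])
print(K)
```

In practice the Gram matrix K would be passed to a classical SVM solver (e.g. one accepting precomputed kernels), so the quantum device is only queried for pairwise overlaps, exactly as the abstract's "general support vector machine" framing suggests.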