Keywords: Convolutional neural network, Deep learning, Computation, Eigenvector, Artificial intelligence, Computer science, Convolution (computer science), Artificial neural network, Representation (politics), Algorithm, Theoretical computer science, Physics, Quantum mechanics, Politics, Political science, Law
Authors
David Finol, Yan Lu, Vijay Mahadevan, Ankit Srivastava
Abstract
We show that deep convolutional neural networks (CNNs) can massively outperform traditional densely connected neural networks (NNs), both deep and shallow, in predicting eigenvalue problems in mechanics. In this sense, we strike out in a new direction in mechanics computations with strongly predictive NNs whose success depends not only on the architectures being deep but also on their being fundamentally different from those widely used to date. We consider a model problem: predicting the eigenvalues of one-dimensional (1D) and two-dimensional (2D) phononic crystals. For the 1D case, the optimal CNN architecture reaches a 98% accuracy level on unseen data when trained with just 20 000 samples, compared to 85% accuracy even with 100 000 samples for the typical network of choice in mechanics research. We show that, with relatively high data efficiency, CNNs have the capability to generalize well and automatically learn deep symmetry operations, easily extending to higher dimensions and our 2D case. Most importantly, we show how CNNs can naturally represent mechanical material tensors, with their convolution kernels serving as local receptive fields, which is a natural representation of mechanical response. The strategies proposed are applicable to other mechanics problems and may, in the future, be used to sidestep cumbersome algorithms with purely data-driven approaches based upon modern deep architectures.
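To make the idea concrete, the sketch below shows a minimal 1D CNN regressor of the general kind the abstract describes: convolution kernels act as local receptive fields over a discretized unit-cell material profile, and a small dense head regresses a fixed number of eigenvalues. This is an illustrative assumption, not the authors' actual architecture; the class name, channel counts, kernel sizes, and the choice of two input channels (e.g., modulus and density) are all hypothetical.

```python
import torch
import torch.nn as nn

class Eigen1DCNN(nn.Module):
    """Hypothetical sketch: map a discretized 1D unit-cell material profile
    to a fixed number of predicted eigenvalues (regression)."""
    def __init__(self, n_points: int = 64, n_eigenvalues: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            # Convolution kernels serve as local receptive fields over the unit cell.
            nn.Conv1d(in_channels=2, out_channels=32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(8),  # pool to a fixed-length feature map
        )
        self.regressor = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 8, 128),
            nn.ReLU(),
            nn.Linear(128, n_eigenvalues),  # regress the lowest eigenvalues
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, 2, n_points), e.g., modulus and density sampled along the cell
        return self.regressor(self.features(x))

# Forward pass on random data, for illustration only (not real training data).
model = Eigen1DCNN()
profiles = torch.randn(4, 2, 64)          # 4 hypothetical unit-cell property profiles
predicted_eigenvalues = model(profiles)   # shape: (4, 10)
```

In a setup like this, the locality and weight sharing of the convolution kernels are what let the network pick up on translations and symmetries of the material layout, which is the property the abstract credits for the CNN's data efficiency relative to densely connected networks.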