Volume 38, Issue 1
A Kolmogorov High Order Deep Neural Network for High Frequency Partial Differential Equations in High Dimensions

Yaqin Zhang, Ke Li, Zhipeng Chang, Xuejiao Liu, Yunqing Huang & Xueshuang Xiang

Commun. Comput. Phys., 38 (2025), pp. 181-222.

Published online: 2025-07

  • Abstract

This paper proposes a Kolmogorov high order deep neural network (K-HOrderDNN) for solving high-dimensional partial differential equations (PDEs), which improves upon high order deep neural networks (HOrderDNNs). HOrderDNNs have been demonstrated to outperform conventional DNNs on high-frequency problems by introducing a nonlinear transformation layer consisting of $(p+1)^d$ basis functions. However, the number of basis functions grows exponentially with the dimension $d$, which results in the curse of dimensionality (CoD). Inspired by the Kolmogorov Superposition Theorem (KST), which expresses a multivariate function as a superposition of univariate functions and addition, K-HOrderDNN utilizes a HOrderDNN to efficiently approximate univariate inner functions instead of directly approximating the multivariate function, reducing the number of introduced basis functions to $d(p+1)$. We theoretically demonstrate that the CoD is mitigated when target functions belong to a dense subset of continuous multivariate functions. Extensive numerical experiments show that, for high-dimensional problems ($d=10, 20, 50$) where HOrderDNNs ($p>1$) are intractable, K-HOrderDNNs ($p>1$) exhibit remarkable performance. Specifically, when $d=10$, K-HOrderDNN ($p=7$) achieves an error of $4.40\times 10^{-3}$, two orders of magnitude lower than that of HOrderDNN ($p=1$) (see Table 10); for high-frequency problems, K-HOrderDNNs ($p>1$) achieve higher accuracy with fewer parameters and faster convergence than HOrderDNNs (see Table 8).
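To make the basis-count comparison concrete, the short sketch below contrasts the $(p+1)^d$ tensor-product basis behind HOrderDNN's nonlinear transformation layer with the $d(p+1)$ univariate basis functions used by K-HOrderDNN. This is only an illustrative sketch, not the authors' implementation: the monomial basis and the function names are assumptions standing in for whatever 1D basis the paper actually employs.

# Minimal sketch (not the authors' code): compares the size of the tensor-product
# basis used by HOrderDNN's nonlinear transformation layer, (p+1)^d, with the
# per-coordinate univariate basis of K-HOrderDNN, d*(p+1). Monomials 1, t, ..., t^p
# are only a stand-in for the paper's actual choice of 1D basis.
import numpy as np

def horder_basis_count(p: int, d: int) -> int:
    # One basis function per multi-index in {0, ..., p}^d.
    return (p + 1) ** d

def k_horder_basis_count(p: int, d: int) -> int:
    # p+1 univariate basis functions applied to each of the d coordinates.
    return d * (p + 1)

def univariate_features(x: np.ndarray, p: int) -> np.ndarray:
    # x has shape (n_points, d); the result has shape (n_points, d*(p+1)),
    # i.e. a K-HOrderDNN-style feature map built from per-coordinate monomials.
    return np.concatenate([x ** k for k in range(p + 1)], axis=1)

if __name__ == "__main__":
    p = 7
    for d in (2, 10, 20, 50):
        print(f"d={d:2d}, p={p}: HOrderDNN basis = {float(horder_basis_count(p, d)):.2e}, "
              f"K-HOrderDNN basis = {k_horder_basis_count(p, d)}")
    x = np.random.rand(4, 10)  # 4 sample points in 10 dimensions
    print("K-HOrderDNN feature map shape:", univariate_features(x, p).shape)  # (4, 80)

For $d=50$ and $p=7$, the tensor-product count is roughly $1.4\times 10^{45}$, while the K-HOrderDNN count is only $400$; this is the curse-of-dimensionality gap the abstract refers to.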

  • AMS Subject Headings

68T99, 35Q68, 65N99

  • Copyright

COPYRIGHT: © Global Science Press

  • BibTeX

@Article{CiCP-38-181,
  author  = {Zhang, Yaqin and Li, Ke and Chang, Zhipeng and Liu, Xuejiao and Huang, Yunqing and Xiang, Xueshuang},
  title   = {A Kolmogorov High Order Deep Neural Network for High Frequency Partial Differential Equations in High Dimensions},
  journal = {Communications in Computational Physics},
  year    = {2025},
  volume  = {38},
  number  = {1},
  pages   = {181--222},
  issn    = {1991-7120},
  doi     = {https://doi.org/10.4208/cicp.OA-2024-0095},
  url     = {http://global-sci.org/intro/article_detail/cicp/24256.html}
}
  • RIS

TY - JOUR
T1 - A Kolmogorov High Order Deep Neural Network for High Frequency Partial Differential Equations in High Dimensions
AU - Zhang, Yaqin
AU - Li, Ke
AU - Chang, Zhipeng
AU - Liu, Xuejiao
AU - Huang, Yunqing
AU - Xiang, Xueshuang
JO - Communications in Computational Physics
VL - 38
IS - 1
SP - 181
EP - 222
PY - 2025
DA - 2025/07
SN - 1991-7120
DO - http://doi.org/10.4208/cicp.OA-2024-0095
UR - https://global-sci.org/intro/article_detail/cicp/24256.html
KW - Deep neural network
KW - Kolmogorov Superposition Theorem
KW - high-dimensional and high-frequency PDEs
ER -

  • TXT

Zhang, Yaqin, Li, Ke, Chang, Zhipeng, Liu, Xuejiao, Huang, Yunqing and Xiang, Xueshuang. (2025). A Kolmogorov High Order Deep Neural Network for High Frequency Partial Differential Equations in High Dimensions. Communications in Computational Physics. 38 (1). 181-222. doi:10.4208/cicp.OA-2024-0095