
Academic Activities
Overcoming the curse of dimensionality: from nonlinear Monte Carlo to the training of neural networks
Speaker:
Arnulf Jentzen, Professor, School of Data Science and Shenzhen Research Institute of Big Data, The Chinese University of Hong Kong
Inviter:
Jialin Hong, Professor
Title:
Overcoming the curse of dimensionality: from nonlinear Monte Carlo to the training of neural networks
Time & Venue:
15:30-16:30 April 25 (Tuesday), N109
Abstract:

Partial differential equations (PDEs) are among the most universal tools used in modelling problems in nature and in man-made complex systems. Nearly all traditional approximation algorithms for PDEs in the literature suffer from the so-called "curse of dimensionality" in the sense that the number of computational operations required to achieve a given approximation accuracy grows exponentially in the dimension of the PDE under consideration. With such algorithms it is impossible to approximately compute solutions of high-dimensional PDEs even when the fastest currently available computers are used. In the case of linear parabolic PDEs and approximations at a fixed space-time point, the curse of dimensionality can be overcome by means of Monte Carlo approximation algorithms and the Feynman-Kac formula. In this talk we demonstrate that deep neural network approximations overcome the curse of dimensionality in the case of a general class of semilinear parabolic PDEs. Moreover, we also specify concrete examples of smooth functions which cannot be approximated by shallow neural networks without the curse of dimensionality but which can be approximated by deep neural networks without the curse of dimensionality.
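The Monte Carlo approach mentioned in the abstract can be illustrated concretely. For the d-dimensional heat equation u_t = (1/2)Δu with initial condition u(0, ·) = g, the Feynman-Kac formula gives u(t, x) = E[g(x + W_t)] for a standard Brownian motion W, so the solution at one space-time point can be estimated by averaging over samples of W_t. The sketch below (function name and test function g are illustrative choices, not from the talk) shows why the cost scales only linearly, not exponentially, in the dimension d:

```python
import numpy as np

def feynman_kac_mc(g, x, t, num_samples=100_000, rng=None):
    """Monte Carlo estimate of u(t, x) for the d-dimensional heat equation
    u_t = (1/2) * Laplacian(u), u(0, .) = g, using the Feynman-Kac
    representation u(t, x) = E[g(x + W_t)], W a standard Brownian motion."""
    rng = np.random.default_rng(rng)
    x = np.asarray(x, dtype=float)
    d = x.shape[0]
    # W_t ~ N(0, t * I_d): one (num_samples, d) array of Gaussian samples.
    # The work grows linearly in d, so there is no curse of dimensionality
    # for approximating the solution at this single space-time point.
    w_t = rng.standard_normal((num_samples, d)) * np.sqrt(t)
    return float(np.mean(g(x + w_t)))

# Example with g(x) = exp(-|x|^2 / 2), for which the exact solution is
# u(t, x) = (1 + t)^(-d/2) * exp(-|x|^2 / (2 * (1 + t))).
g = lambda y: np.exp(-np.sum(y**2, axis=1) / 2)
estimate = feynman_kac_mc(g, x=[0.0, 0.0], t=1.0, num_samples=200_000, rng=0)
# Exact value at x = 0, t = 1, d = 2 is (1 + 1)^(-1) = 0.5.
```

The Monte Carlo error decays like num_samples^(-1/2) independently of d, which is the dimension-robustness the abstract refers to; the nonlinear (semilinear) case requires more sophisticated schemes such as multilevel Picard or deep-learning-based approximations.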