Speaker: Zhiqin Xu, Associate Professor, Shanghai Jiao Tong University
Time: 15:00, December 22, 2020
Venue: Room 301, Building 3
Organizer: School of Mathematics and Physics
Abstract: We demonstrate a universal Frequency Principle (F-Principle), namely that DNNs often fit target functions from low to high frequencies, on high-dimensional benchmark datasets and deep neural networks. We use the F-Principle to understand how DNNs differ from traditional numerical methods. We then propose novel multi-scale DNNs (MscaleDNN) that combine radial scaling in the frequency domain with activation functions of compact support. The radial scaling converts the problem of approximating the high-frequency content of a PDE solution into one of lower frequency, and the compactly supported activation functions facilitate the separation of the scales to be approximated by the corresponding subnetworks. As a result, MscaleDNNs achieve fast, uniform convergence over multiple scales. The proposed MscaleDNNs are shown to be superior to standard fully connected DNNs and can serve as an effective meshless numerical method for elliptic PDEs.
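
For readers unfamiliar with the architecture, below is a minimal sketch of the MscaleDNN idea in PyTorch. The scale factors, layer sizes, and the particular compact-support activation (an sReLU-style function) are illustrative assumptions for this sketch, not necessarily the speaker's exact configuration.

import torch
import torch.nn as nn

def srelu(x):
    # Compactly supported activation: nonzero only on (0, 1).
    # Compact support helps separate the frequency scales
    # handled by the different subnetworks.
    return torch.relu(x) * torch.relu(1.0 - x)

class SubNet(nn.Module):
    # One fully connected subnetwork responsible for a single scale.
    def __init__(self, dim_in, width, depth):
        super().__init__()
        layers = [nn.Linear(dim_in, width)]
        for _ in range(depth - 1):
            layers.append(nn.Linear(width, width))
        self.hidden = nn.ModuleList(layers)
        self.out = nn.Linear(width, 1)

    def forward(self, x):
        for layer in self.hidden:
            x = srelu(layer(x))
        return self.out(x)

class MscaleDNN(nn.Module):
    # Radial scaling: subnetwork i sees the input multiplied by a_i,
    # so high-frequency content of the target appears to that
    # subnetwork as lower frequency, which DNNs fit quickly
    # according to the F-Principle. The scales (1, 2, 4, 8) are an
    # illustrative choice.
    def __init__(self, dim_in, scales=(1, 2, 4, 8), width=64, depth=3):
        super().__init__()
        self.scales = scales
        self.nets = nn.ModuleList(
            [SubNet(dim_in, width, depth) for _ in scales]
        )

    def forward(self, x):
        # Sum the subnetwork outputs over all scales.
        return sum(net(a * x) for a, net in zip(self.scales, self.nets))

# Usage: evaluate the network on sample points, e.g. as the trial
# function of a meshless PDE solver loss (illustrative).
model = MscaleDNN(dim_in=2)
x = torch.rand(128, 2)
u = model(x)  # shape (128, 1)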