Approximation theory of structured neural networks
Description
Mathematical theory for deep learning has long been sought, motivated by the powerful applications of deep neural networks to big data in various practical domains. The main difficulty lies in the structures and architectures imposed on networks designed for specific learning tasks. Neither classical approximation theory nor the recent theory for deep ReLU neural networks can be applied, because of the structures imposed for processing high-dimensional data such as natural images with tens of thousands of dimensions. This project aims at an approximation theory for structured neural networks. We plan to establish mathematical theories for deconvolution with deep convolutional neural networks, operator learning, and spectral graph networks.

Scheme: Discovery Projects. Field: 4903 - Numerical and Computational Mathematics. Lead: Prof Dingxuan Zhou