
Approximation theory of structured neural networks

The University of Sydney — Discovery Projects
Amount
Up to $462,942
Closes
Saturday 31 July 2027
Status
unknown
Type
open opportunity

Description

Approximation theory of structured neural networks. A mathematical theory for deep learning has long been sought, given the powerful applications of deep neural networks to big data across many practical domains. The main difficulty lies in the structures and architectures imposed on networks designed for specific learning tasks. Neither classical approximation theory nor the recent theory for deep ReLU neural networks applies, because of the structures imposed for processing high-dimensional data such as natural images with tens of thousands of dimensions. This project aims to develop an approximation theory for structured neural networks. We plan to establish mathematical theories for deconvolution with deep convolutional neural networks, operator learning, and spectral graph networks. Scheme: Discovery Projects. Field: 4903 - Numerical and Computational Mathematics. Lead: Prof Dingxuan Zhou
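As a minimal illustration of the kind of expressive power that ReLU approximation theory studies (this sketch is not part of the grant description): a one-hidden-layer ReLU network with two units represents the absolute-value function exactly, via |x| = relu(x) + relu(-x). This is a standard textbook example, not a construction specific to this project.

```python
def relu(z: float) -> float:
    """Rectified linear unit: max(z, 0)."""
    return max(z, 0.0)

def relu_net_abs(x: float) -> float:
    """A two-unit, one-hidden-layer ReLU network computing |x| exactly.

    Hidden layer: weights (1, -1), biases 0.
    Output layer: weights (1, 1), bias 0.
    """
    return 1.0 * relu(1.0 * x) + 1.0 * relu(-1.0 * x)

for x in (-2.5, -1.0, 0.0, 0.75, 3.0):
    print(x, relu_net_abs(x))
```

Deeper ReLU networks compose such pieces to build piecewise-linear approximants, which is the starting point of the depth-based approximation theory the description contrasts with the structured (convolutional, operator, graph) setting.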

Categories
education, technology
Target Recipients
researchers, universities

Discovery method: arc-grants
Last verified: Monday 2 March 2026
Added: Saturday 28 February 2026