Riesz basis of ReLU neural networks and its use in function recovery

  • Date:
  • Time: 14:00 - 15:30
  • Address:
    Sokolovská 83, Praha
  • Room: K1
  • Speaker: Jan Vybíral

We present a survey of our recent study of the trigonometric-like system of piecewise linear functions introduced by Daubechies, DeVore, Foucart, Hanin, and Petrova. In previous work we gave an alternative proof that this system forms a Riesz basis of $L_2([0,1])$. More importantly, we generalized the system to higher dimensions $d>1$ via a construction that avoids tensor products. As a consequence, the functions from the new Riesz basis of $L_2([0,1]^d)$ can be easily represented by neural networks. As a byproduct, we also proved that the Riesz constants of this system are independent of $d$, which makes it an attractive building block for the multivariate analysis of neural networks.

In more recent work we used this Riesz basis to investigate how well a multivariate function of limited smoothness can be approximated by deep neural networks of a given depth and width. Such questions have recently been studied intensively from many points of view. Our approach differs from most of these works in that it employs a basis lying at the interface between Fourier analysis and artificial neural networks.

Finally, we report on recent work that studies the behavior of the same Riesz basis in the Lebesgue spaces $L_p([0,1]^d)$ for $1\le p\le\infty$.
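
For readers less familiar with the central notion, the following textbook definition (not specific to this talk) may help: a system $(f_k)_{k\ge 1}$ spanning a dense subspace of a Hilbert space $H$ is a Riesz basis if there exist constants $0<A\le B<\infty$ such that

$$A\sum_k |c_k|^2 \;\le\; \Big\|\sum_k c_k f_k\Big\|_H^2 \;\le\; B\sum_k |c_k|^2$$

for every finitely supported sequence $(c_k)$. The statement above that the Riesz constants are independent of $d$ means that $A$ and $B$ can be chosen uniformly in the dimension for the bases of $L_2([0,1]^d)$.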
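To illustrate why piecewise linear, trigonometric-like functions are natural for ReLU networks, here is a minimal sketch of a construction standard in this literature: the hat function written as a shallow ReLU network, and its iterated compositions, which produce oscillating "sawtooth" functions. This is only an illustration of the building blocks; the exact basis functions from the talk may differ, and the function names below are ours.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def hat(x):
    # Hat function H on [0,1]: H(x) = 2x on [0,1/2] and 2(1-x) on [1/2,1],
    # expressed as a one-hidden-layer ReLU network with three neurons.
    return 2.0 * relu(x) - 4.0 * relu(x - 0.5) + 2.0 * relu(x - 1.0)

def sawtooth(x, n):
    # n-fold composition H∘...∘H: realized by a ReLU network of depth
    # proportional to n, yet its graph is a sawtooth with 2^(n-1) teeth,
    # i.e. the number of linear pieces grows exponentially in the depth.
    y = x
    for _ in range(n):
        y = hat(y)
    return y

x = np.linspace(0.0, 1.0, 9)
print(sawtooth(x, 2))  # two teeth: peaks at x = 1/4 and x = 3/4
```

The exponential growth of oscillation with depth is what makes such piecewise linear systems behave like a trigonometric system while remaining exactly representable by ReLU networks.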