I will present recent results on the expressivity of neural networks and their applications. First, I will recall the connections between linear finite elements and ReLU deep neural networks (DNNs), as well as between spectral methods and ReLU$^k$ DNNs. Next, I will share our latest findings on whether DNNs can exactly represent continuous piecewise polynomials of arbitrary order on any simplicial mesh in any dimension. Furthermore, I will discuss a specific result on the optimal expressivity of ReLU DNNs and its applications, in connection with the Kolmogorov-Arnold representation theorem. Finally, I will conclude with a remark on studying convolutional neural networks from the perspective of expressivity.
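As a minimal illustration of the first connection mentioned above (not part of the talk itself): in one dimension, a linear finite element "hat" basis function can be written exactly as a one-hidden-layer ReLU network with three neurons. The sketch below, with hypothetical function names, verifies this identity numerically.

```python
import numpy as np

def relu(x):
    # The ReLU activation: max(x, 0).
    return np.maximum(x, 0.0)

def hat(x):
    # Linear finite element "hat" basis function on the mesh {0, 1, 2}:
    # rises linearly from 0 at x=0 to 1 at x=1, falls back to 0 at x=2.
    return np.where(x < 1.0, x, 2.0 - x).clip(0.0)

def hat_as_relu_net(x):
    # The same hat function expressed as a one-hidden-layer ReLU network:
    # hat(x) = relu(x) - 2*relu(x - 1) + relu(x - 2).
    return relu(x) - 2.0 * relu(x - 1.0) + relu(x - 2.0)

xs = np.linspace(-1.0, 3.0, 401)
assert np.allclose(hat(xs), hat_as_relu_net(xs))
```

Since every continuous piecewise linear function on a 1D mesh is a linear combination of such hat functions, this already gives the exact representation of 1D linear finite element spaces by shallow ReLU networks; the results discussed in the talk concern the far less obvious higher-dimensional and higher-order cases.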