In this talk, we will discuss the universal approximation properties of deep neural networks, especially ResNets. We will start from the continuous-time ResNet and leverage tools from control theory. This approach allows us to characterize the expressive power of neural networks through their depth. We will then explain how these continuous-time ResNets relate to deep residual networks. Finally, we will discuss the generalization of these results to neural networks with symmetry, e.g., the permutation-invariant case and the CNN case.
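As background for the continuous-time viewpoint mentioned above, the standard correspondence (not specific to this talk, but the usual starting point for such analyses) is that a residual block is a forward-Euler step of an ODE:

```latex
% A residual block with step size h > 0:
%   x_{k+1} = x_k + h\, f(x_k, \theta_k)
% is the forward-Euler discretization of the controlled ODE
%   \dot{x}(t) = f(x(t), \theta(t)), \qquad t \in [0, T],
% where the parameters \theta(t) play the role of a control.
% Depth of the network corresponds to the time horizon T,
% so approximation by deep ResNets becomes a controllability question.
\[
  x_{k+1} = x_k + h\, f(x_k, \theta_k)
  \quad\longleftrightarrow\quad
  \dot{x}(t) = f\bigl(x(t), \theta(t)\bigr),\ t \in [0, T].
\]
```

Under this identification, universal approximation by depth translates into reachability of target maps by the flow of the controlled ODE.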