In the numerical analysis of differential equations, understanding the discretization error is of central importance. Error bounds, which typically depend on parameters such as the time step size, are usually derived through theoretical analysis. However, there is a growing demand for more practical methods to quantify the discretization error, especially in the context of data assimilation and computational uncertainty quantification. We have recently proposed methods for quantifying discretization errors, grounded in the empirical observation that these errors often grow nearly monotonically over time. Our approach models the discretization error at each discrete time point as a random variable and employs isotonic regression to estimate key quantities, such as its variance, under monotonicity constraints. In this presentation, we will outline the methodology, demonstrate its potential, and discuss open challenges in this research direction.
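The monotone-variance idea can be illustrated with a minimal sketch. The code below is not the method proposed in our work; it is an illustrative assumption of how such an estimate might be computed: a simple test ODE is integrated with forward Euler, the pointwise discretization error is treated as a noisy observation, and scikit-learn's IsotonicRegression is fitted to the squared errors to obtain a nondecreasing, variance-like profile over time. The test problem, the use of squared errors of a single trajectory, and the choice of estimator are all assumptions made for illustration.

```python
import numpy as np
from sklearn.isotonic import IsotonicRegression

# Hypothetical test problem (illustration only): x' = x, x(0) = 1, exact solution exp(t).
t_end, n_steps = 2.0, 200
h = t_end / n_steps
t = np.linspace(0.0, t_end, n_steps + 1)

# Forward Euler: a first-order method whose error grows over time on this problem.
x = np.empty(n_steps + 1)
x[0] = 1.0
for k in range(n_steps):
    x[k + 1] = x[k] + h * x[k]

err = x - np.exp(t)  # pointwise discretization error at each discrete time point

# Treat err(t_k) as a realization of a random variable and estimate a
# nondecreasing variance-like profile by isotonic regression on the squared errors.
iso = IsotonicRegression(increasing=True, out_of_bounds="clip")
var_hat = iso.fit_transform(t, err**2)

print(var_hat[::50])  # monotone estimate sampled at a few time points
```

In practice, the squared errors of a single trajectory would be replaced by richer error data; the sketch only shows how a monotonicity constraint can be imposed on the resulting estimate.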