Automatic differentiation is an automated implementation of differential calculus; it plays a key computational role in fields such as machine learning, design optimization, fluid dynamics, physical modeling, mechanics, and finance. It also remains efficient for nonsmooth problems, despite the occurrence of spurious behaviors: one indeed observes calculus artifacts and artificial critical points that have no variational nature. This talk introduces a family of multivalued mappings generalizing gradient-like behavior, usually referred to as conservative fields. We then review their theoretical properties and show how they allow us to study the behavior of gradient-type algorithms through the theory of differential inclusions.
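As a concrete illustration of such an artificial critical point, consider a standard example from this literature: f(x) = relu(x) - relu(-x) equals the identity function, whose derivative is 1 everywhere, yet autodiff frameworks that use the convention relu'(0) = 0 return a derivative of 0 at x = 0. A minimal sketch in JAX (which documents this convention for jax.nn.relu):

```python
import jax

def f(x):
    # f(x) = relu(x) - relu(-x) is exactly the identity function x
    return jax.nn.relu(x) - jax.nn.relu(-x)

# True derivative of the identity is 1 everywhere.
print(jax.grad(f)(1.0))  # 1.0: correct away from the kink
print(jax.grad(f)(0.0))  # 0.0: an artificial critical point at x = 0
```

The value 0.0 at the origin is not a derivative of f in any variational sense; conservative fields provide the framework in which such autodiff outputs can nevertheless be analyzed rigorously.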