Differential privacy is one of the most widely used techniques in privacy-preserving machine learning, owing to its rigorous mathematical definition and strong theoretical guarantees. Transfer learning and federated learning are two major artificial-intelligence paradigms that avoid transmitting raw data while enabling cooperative cross-silo training, thereby compensating for the limited computing power and limited data of any single silo. However, transfer learning and federated learning on their own only address the data-ownership issue, not privacy preservation. Differential privacy offers a promising privacy guarantee for transfer and federated learning, although applying it raises several challenges. This talk discusses those challenges and the benefits of applying differential privacy in transfer and federated learning. The corresponding papers were accepted at IJCAI’22 and WWW’22.
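
As a brief reminder of the formal guarantee behind this talk: a randomized mechanism M is (ε, δ)-differentially private if, for every pair of neighboring datasets D and D′ (differing in one record) and every set of outputs S, Pr[M(D) ∈ S] ≤ e^ε · Pr[M(D′) ∈ S] + δ. The sketch below illustrates one common way this is achieved in a federated setting, by clipping each client's model update and adding Gaussian noise before aggregation. It is a minimal, hypothetical example of the standard Gaussian-mechanism pattern, not the specific method of the IJCAI’22 or WWW’22 papers; the function name and parameters (clip_norm, noise_multiplier) are illustrative assumptions.

    import numpy as np

    def dp_aggregate(client_updates, clip_norm=1.0, noise_multiplier=1.1, seed=0):
        # Clip each client's update to an L2 norm of at most clip_norm, so one
        # client's contribution to the sum has bounded sensitivity.
        rng = np.random.default_rng(seed)
        clipped = []
        for u in client_updates:
            norm = np.linalg.norm(u)
            clipped.append(u * min(1.0, clip_norm / (norm + 1e-12)))
        total = np.sum(clipped, axis=0)
        # Gaussian mechanism: noise standard deviation scales with the
        # per-client sensitivity (clip_norm) and the chosen noise multiplier.
        noise = rng.normal(0.0, noise_multiplier * clip_norm, size=total.shape)
        # Return the noisy average, which the server would send back to clients.
        return (total + noise) / len(client_updates)

    # Example: three clients each contribute a 4-dimensional model update.
    updates = [np.array([0.2, -0.1, 0.4, 0.0]),
               np.array([1.5, 0.3, -0.2, 0.1]),
               np.array([-0.4, 0.2, 0.1, 0.3])]
    print(dp_aggregate(updates))
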