Various robust estimation methods and algorithms have been proposed to hedge against Byzantine failures in distributed learning. However, systematic approaches that provide theoretical guarantees of significance when detecting Byzantine machines are lacking. We develop a general detection procedure, ByMI, via error rate control to address this issue, which is applicable to many robust learning problems. The key idea is to apply a sample-splitting strategy on each worker machine to construct a score statistic integrated with a general robust estimation, and then to utilize the symmetry property of those scores to derive a data-driven threshold. The proposed method is dimension insensitive and p-value free, and with the help of the symmetry property it can achieve false discovery rate control under mild conditions. Numerical experiments on both synthetic and real data validate the theoretical results and demonstrate the effectiveness of our proposed method.
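To make the symmetry-based thresholding idea concrete, the sketch below shows a generic data-driven threshold of the kind used in symmetry-based multiple testing (e.g., knockoff-style procedures): null scores are assumed symmetric about zero, so the count of scores below -t serves as an estimate of the number of nulls above t. This is an illustrative sketch under those assumptions, not the exact ByMI construction; the function name, score generation, and target level q are all hypothetical.

```python
import numpy as np

def symmetry_threshold(scores, q=0.1):
    """Smallest threshold t whose estimated false discovery proportion
    (1 + #{scores <= -t}) / max(1, #{scores >= t}) is at most q.

    Relies on null scores being symmetric about zero, so the left tail
    mirrors the null part of the right tail.  Illustrative only.
    """
    for t in np.sort(np.abs(scores[scores != 0])):
        fdp_hat = (1 + np.sum(scores <= -t)) / max(1, np.sum(scores >= t))
        if fdp_hat <= q:
            return t
    return np.inf  # no threshold achieves the target level

# Toy example: 90 "normal" machines with symmetric null scores and
# 10 "Byzantine" machines whose scores are shifted to the right.
rng = np.random.default_rng(0)
scores = np.concatenate([rng.standard_normal(90), rng.normal(6.0, 1.0, 10)])
t = symmetry_threshold(scores, q=0.1)
flagged = np.where(scores >= t)[0]  # machines declared Byzantine
```

Machines whose scores exceed the threshold are flagged; because the threshold adapts to the empirical score distribution, no p-values or dimension-dependent calibration are needed.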