The Generalized Minimal Residual method (GMRES) for the solution of general square linear systems $Ax=b$ is often combined with a preconditioner to improve the convergence speed and the overall computing performance of the method. Successful mixed precision implementations for the application of the preconditioner inside GMRES have been previously proposed: some strategies apply the preconditioner in low precision to reduce time and memory consumption, while others apply it in higher precision to improve robustness and accuracy. These existing studies tend to focus on one kind of preconditioner combined with one preconditioning technique (left, right, or flexible preconditioning). In this talk, we propose to unify most of the state-of-the-art mixed precision implementations of preconditioned GMRES under the same comprehensive theory, and we give a clear and exhaustive list of the possible strategies for setting the precisions, together with a numerical comparison of these strategies. To achieve this, we will present a backward error analysis framework for GMRES that substantially simplifies the process of deriving error bounds for most existing and future variants of GMRES. We subsequently use this framework to derive new descriptive bounds on the attainable forward errors of the computed solutions. From the study of these bounds, we uncover previously unknown mixed precision implementations that achieve new tradeoffs between computing performance and accuracy. Just as importantly, we explain that there are critical differences in robustness and accuracy between left, right, and flexible preconditioning for a given set of precisions, and that, therefore, the choice between the three preconditioning techniques carries higher stakes in mixed precision. We substantiate our theoretical findings with a comprehensive experimental study on random dense matrices and SuiteSparse matrices with various preconditioners.
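As a brief reminder of the three preconditioning techniques compared in the talk (a standard formulation in our own notation, not taken from the abstract above): given a nonsingular preconditioner $M \approx A$, left and right preconditioning transform the system as
\[
\underbrace{M^{-1}Ax = M^{-1}b}_{\text{left}},
\qquad
\underbrace{AM^{-1}u = b, \quad x = M^{-1}u}_{\text{right}},
\]
while flexible preconditioning is the right-preconditioned variant in which the preconditioner $M_j$ may change at every iteration $j$ (as in FGMRES), the Krylov basis vectors $v_j$ being preconditioned as $z_j = M_j^{-1}v_j$. In a mixed precision setting, this is in particular what allows the application of $M_j^{-1}$ to be carried out in a different, possibly lower, precision than the rest of the iteration.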