In this talk, we extend a recently established subgradient method for the computation of Riemannian metrics that optimizes certain singular value functions associated with dynamical systems. The extension is threefold. First, we introduce a projected subgradient method that confines the parameters of the Riemannian metrics to a compact convex set, which allows us to prove that a minimizer exists. Second, we allow inexact subgradients and study the effect of the errors on the computed metrics. Third, we analyze the subgradient algorithm for three different choices of step size: constant, exogenous, and Polyak.
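The three step-size rules can be illustrated on a toy problem. The sketch below is not the authors' algorithm for Riemannian metrics; it is a generic projected subgradient method on a simple nonsmooth convex problem (all function names and constants are illustrative assumptions), showing the constant, exogenous (diminishing), and Polyak rules, with projection onto a compact convex set so that iterates stay feasible.

```python
# Hedged sketch: projected subgradient method with three step-size rules.
# Toy problem (NOT the talk's singular value objective):
#   minimize f(x) = |x1| + |x2|  subject to  x in C = [0.5, 2]^2,
# whose minimizer is x* = (0.5, 0.5) with optimal value f* = 1.

def f(x):
    return abs(x[0]) + abs(x[1])

def subgrad(x):
    # One valid subgradient of f at x (the sign pattern).
    return [1.0 if xi >= 0 else -1.0 for xi in x]

def project(x, lo=0.5, hi=2.0):
    # Euclidean projection onto the compact convex box C = [lo, hi]^2.
    return [min(max(xi, lo), hi) for xi in x]

def step_size(rule, k, fx, g, f_star=1.0, c=0.1):
    norm_sq = sum(gi * gi for gi in g)
    if rule == "constant":
        return c                              # fixed step t_k = c
    if rule == "exogenous":
        return c / (k ** 0.5)                 # diminishing, non-summable
    if rule == "polyak":
        # Polyak step uses the optimal value: (f(x_k) - f*) / ||g_k||^2
        return (fx - f_star) / max(norm_sq, 1e-12)
    raise ValueError(rule)

def projected_subgradient(rule, x0=(2.0, 2.0), iters=200):
    x, best = list(x0), float("inf")
    for k in range(1, iters + 1):
        g = subgrad(x)
        t = step_size(rule, k, f(x), g)
        x = project([xi - t * gi for xi, gi in zip(x, g)])
        best = min(best, f(x))   # subgradient methods track the best iterate
    return best

if __name__ == "__main__":
    for rule in ("constant", "exogenous", "polyak"):
        print(rule, projected_subgradient(rule))
```

On this toy problem all three rules reach the optimal value; in general, a constant step only guarantees convergence to a neighborhood of the optimum, the exogenous (diminishing) rule converges without knowing f*, and the Polyak rule is fastest but requires the optimal value (or an estimate of it).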