The source resolvability problem (or resolvability problem for short) is one of the random number generation problems in information theory. In the literature, the optimum achievable rates in the resolvability problem have been characterized in two different ways: one is based on information spectrum quantities, and the other on the smooth Rényi entropy. Recently, Nomura revealed the optimum achievable rate with respect to the f-divergence, a class that includes the variational distance, the Kullback-Leibler (KL) divergence, and so on. On the other hand, the optimum achievable rate with respect to the variational distance has been characterized using the smooth Rényi entropy. In this paper, we extend this result to other distance measures. To do so, we consider the resolvability problem with respect to a subclass of f-divergences and determine the optimum achievable rate in terms of the smooth Rényi entropy. The subclass of f-divergences considered in this paper includes typical distance measures such as the total variational distance, the KL divergence, and the Hellinger distance.
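For concreteness, the f-divergence family referred to above is standardly defined as follows (the notation here is generic, not taken from this paper): for distributions P and Q and a convex function f with f(1) = 0,

```latex
% f-divergence of P with respect to Q
D_f(P \,\|\, Q) = \sum_{x} Q(x)\, f\!\left(\frac{P(x)}{Q(x)}\right)
% Typical instances (up to normalization conventions):
%   f(t) = \tfrac{1}{2}\,|t - 1|      % total variational distance
%   f(t) = t \log t                   % KL divergence
%   f(t) = \bigl(\sqrt{t} - 1\bigr)^2 % squared Hellinger distance
```

Choosing f as indicated recovers the total variational distance, the KL divergence, and the (squared) Hellinger distance as special cases.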