Let $X_1, \ldots, X_n$ be independent and identically distributed random variables with probability density function $p_X$. I want to compute the probability density function of the random variable $Z = \min\{f(X_1), \ldots, f(X_n)\}$. Let $Y_i = f(X_i)$; then, assuming $f$ is monotone, I can compute $p_Y$ as:
\begin{equation} p_Y(y) = p_X(f^{-1}(y))\left|\frac{d}{dy}(f^{-1}(y))\right|. \end{equation}
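As a sanity check of the change-of-variables formula, here is a small Monte Carlo sketch. The concrete choice $X \sim \mathrm{Exp}(1)$ and $f(x) = x^2$ (monotone on the support $[0,\infty)$, so $f^{-1}(y)=\sqrt{y}$ and $|\frac{d}{dy}f^{-1}(y)| = \frac{1}{2\sqrt{y}}$) is mine, purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative choice (not from the question):
# X ~ Exp(1), f(x) = x^2, so f^{-1}(y) = sqrt(y), |d/dy f^{-1}(y)| = 1/(2 sqrt(y)).
x = rng.exponential(1.0, size=1_000_000)
y = x**2

def p_Y(y):
    # p_X(f^{-1}(y)) * |d/dy f^{-1}(y)|, with p_X(x) = exp(-x)
    return np.exp(-np.sqrt(y)) / (2.0 * np.sqrt(y))

# Mass the analytic density assigns to [a, b] vs. the fraction of samples there.
a, b = 0.5, 2.0
grid = np.linspace(a, b, 2001)
vals = p_Y(grid)
mass_analytic = 0.5 * np.sum((vals[1:] + vals[:-1]) * np.diff(grid))  # trapezoid rule
mass_empirical = np.mean((y >= a) & (y <= b))
print(mass_analytic, mass_empirical)  # both close to 0.25
```

For this choice the interval mass has the closed form $e^{-\sqrt{a}} - e^{-\sqrt{b}}$, which the trapezoid integral and the sample fraction should both reproduce.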
I can then compute the cumulative distribution function $F_Z$ as:
\begin{equation} \begin{split} F_Z(z) = P(Z\leq z) &= P(\min\{Y_1,\ldots,Y_n\}\leq z) = 1-P((Y_1>z)\wedge\ldots\wedge (Y_n>z)) \\ &= 1-P(Y_1>z)\cdots P(Y_n>z) = 1-(1-F_Y(z))^n. \end{split} \end{equation}
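The identity $F_Z(z) = 1-(1-F_Y(z))^n$ can also be checked numerically. In the sketch below I again pick $X \sim \mathrm{Exp}(1)$ and $f(x) = x^2$ (an arbitrary choice of mine, not from the question), for which $F_Y(z) = 1 - e^{-\sqrt{z}}$:

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed toy setup: X ~ Exp(1), f(x) = x^2, hence F_Y(z) = 1 - exp(-sqrt(z)).
n, m = 5, 200_000
x = rng.exponential(1.0, size=(m, n))
z = np.min(x**2, axis=1)  # m samples of Z = min(f(X_1), ..., f(X_n))

def F_Z(t):
    F_Y = 1.0 - np.exp(-np.sqrt(t))
    return 1.0 - (1.0 - F_Y)**n

# Analytic CDF vs. empirical CDF at a few points.
for t in (0.01, 0.05, 0.2):
    print(t, float(F_Z(t)), np.mean(z <= t))
```

For this setup $F_Z$ simplifies to $1 - e^{-n\sqrt{z}}$, and the empirical fractions should agree with it to Monte Carlo accuracy.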
Finally for the probability density function $p_Z$ I get:
\begin{equation} p_Z(z) = \frac{d}{dz}F_Z(z) = n(1-F_Y(z))^{n-1}p_Y(z) = n(1-F_Y(z))^{n-1}p_X(f^{-1}(z))\left|\frac{d}{dz}(f^{-1}(z))\right|. \end{equation}
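A numeric check of $p_Z(z) = \frac{d}{dz}\bigl[1-(1-F_Y(z))^n\bigr] = n(1-F_Y(z))^{n-1}p_Y(z)$ (note that the chain rule brings down a factor of $n$, not $n-1$). The toy setup $X \sim \mathrm{Exp}(1)$, $f(x) = x^2$ is my own illustrative assumption:

```python
import numpy as np

rng = np.random.default_rng(2)

# Assumed setup (mine, for illustration): X ~ Exp(1), f(x) = x^2, giving
# p_Y(z) = exp(-sqrt(z)) / (2 sqrt(z)) and F_Y(z) = 1 - exp(-sqrt(z)).
n, m = 5, 500_000
z = np.min(rng.exponential(1.0, size=(m, n))**2, axis=1)

def p_Z(t):
    F_Y = 1.0 - np.exp(-np.sqrt(t))
    p_Y = np.exp(-np.sqrt(t)) / (2.0 * np.sqrt(t))
    return n * (1.0 - F_Y)**(n - 1) * p_Y  # factor n from d/dz (1-F_Y)^n

# Mass the density assigns to [a, b] vs. the fraction of min-samples there.
a, b = 0.02, 0.2
grid = np.linspace(a, b, 2001)
vals = p_Z(grid)
mass_analytic = 0.5 * np.sum((vals[1:] + vals[:-1]) * np.diff(grid))  # trapezoid rule
mass_empirical = np.mean((z >= a) & (z <= b))
print(mass_analytic, mass_empirical)
```

Here the interval mass has the closed form $e^{-n\sqrt{a}} - e^{-n\sqrt{b}}$, so the integrated density and the sample fraction can both be compared against it.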
First of all, is the logic above correct? Can I extend the above to a probability mass function? In that case it looks to me like the identity would be as simple as $p_Y(y) = p_X(f^{-1}(y))$. Would the rest of the derivation remain the same? And what if I allow $f$ to be non-monotone and potentially non-invertible?