Min-max normalization is usually applied when the features have varying scales and the training model makes no assumptions about the distribution of the data, such as artificial neural networks or k-nearest neighbours.
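As a minimal sketch of the idea (the feature values below are made up for illustration), min-max normalization rescales a column to the [0, 1] range using its minimum and maximum:

```python
import numpy as np

# Hypothetical feature column with widely varying magnitudes.
x = np.array([2.0, 10.0, 55.0, 120.0, 300.0])

# Min-max normalization: shift by the minimum, divide by the range.
x_minmax = (x - x.min()) / (x.max() - x.min())
print(x_minmax)  # every value now lies between 0 and 1
```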
Standardization, on the other hand, assumes that the data follows a Gaussian distribution, and is therefore generally employed when the features have varying scales and the training algorithm assumes Gaussian-distributed data, such as linear regression, logistic regression, or linear discriminant analysis.
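A comparable sketch for standardization (again with made-up values): each value is centred on the column mean and scaled by its standard deviation, giving a z-score.

```python
import numpy as np

x = np.array([2.0, 10.0, 55.0, 120.0, 300.0])

# Standardization (z-score): subtract the mean, divide by the standard deviation.
x_standardized = (x - x.mean()) / x.std()
print(x_standardized)  # mean is approximately 0, standard deviation approximately 1
```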
If your data does follow a Gaussian distribution, performing standardization on top of min-max normalization should still give the desired results, even though it adds an extra step.
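One possible way to chain the two steps is with scikit-learn's `Pipeline`, `MinMaxScaler`, and `StandardScaler` (these tools are an assumption here, not something prescribed above): min-max normalization is applied first, and standardization is applied to its output.

```python
import numpy as np
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import MinMaxScaler, StandardScaler

# Hypothetical single-feature matrix.
X = np.array([[2.0], [10.0], [55.0], [120.0], [300.0]])

# Min-max normalization followed by standardization, as described above.
scaling = Pipeline([
    ("minmax", MinMaxScaler()),
    ("standardize", StandardScaler()),
])
X_scaled = scaling.fit_transform(X)
print(X_scaled.mean(), X_scaled.std())  # roughly 0 and 1 after the final step
```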