A private repetition algorithm takes as input a differentially private algorithm with constant success probability and boosts it to one that succeeds with high probability. Such algorithms are closely related to private metaselection algorithms, which compete with the best of many private algorithms, and to private hyperparameter tuning algorithms, which compete with the best hyperparameter settings for a private learning algorithm. Existing algorithms for these tasks pay either a large overhead in privacy cost or a large overhead in computational cost. In this work, we prove strong lower bounds for problems of this type, showing in particular that for any algorithm that preserves the privacy cost up to a constant factor, the failure probability can fall only polynomially in the computational overhead. This stands in stark contrast to the non-private setting, where the failure probability falls exponentially in the computational overhead. By carefully combining existing algorithms for metaselection, we show that the achievable trade-offs between computation and privacy nearly match our lower bounds.
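To make the contrast concrete, here is a schematic comparison (not the precise bounds from our results): amplify a base algorithm with constant failure probability $\gamma < 1$ by paying a $k$-fold computational overhead. Non-privately, $k$ independent repetitions drive the failure probability down exponentially, whereas if the privacy cost may grow by at most a constant factor, the failure probability can decay only polynomially in $k$:
\[
\underbrace{\Pr[\text{failure}] \le \gamma^{k} = e^{-\Omega(k)}}_{\text{non-private: independent repetitions}}
\qquad \text{vs.} \qquad
\underbrace{\Pr[\text{failure}] \ge k^{-O(1)}}_{\text{private, privacy cost within a constant factor}}
\]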