We study differentially private stochastic convex optimization (DP-SCO) under user-level privacy, where each user may hold multiple data items. Existing algorithms for user-level DP-SCO either require superpolynomial running time or require a number of users that grows polynomially with the dimension of the problem. We develop new algorithms for user-level DP-SCO that attain optimal rates, run in polynomial time, and require a number of users that grows only logarithmically in the dimension. Moreover, our algorithms are the first to attain optimal rates for non-smooth functions in polynomial time. They are based on multi-pass DP-SGD, combined with a novel private mean estimation procedure for concentrated data, which applies an outlier removal step before estimating the mean of the gradients.
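To make the "outlier removal before mean estimation" idea concrete, the following is a minimal illustrative sketch in NumPy: points far from a robust coarse center are discarded, and the mean of the survivors is released with Gaussian noise. All names, thresholds, and the noise calibration here are illustrative assumptions for exposition, not the paper's actual procedure or its privacy analysis.

```python
import numpy as np

def noisy_mean_with_outlier_removal(points, radius, noise_mult, seed=None):
    """Illustrative sketch (not the paper's algorithm): drop points far from
    a coordinate-wise median center, then release a Gaussian-noised mean.

    points     : (n, d) array, e.g. per-user average gradients
    radius     : distance threshold for outlier removal (assumed given)
    noise_mult : noise multiplier standing in for the privacy calibration
    """
    rng = np.random.default_rng(seed)
    points = np.asarray(points, dtype=float)
    n = len(points)
    # Coarse robust center: the coordinate-wise median tolerates a few outliers.
    center = np.median(points, axis=0)
    dists = np.linalg.norm(points - center, axis=1)
    kept = points[dists <= radius]
    if len(kept) == 0:
        kept = points  # fallback for the sketch; a real algorithm handles this privately
    mean = kept.mean(axis=0)
    # After removal, surviving points lie in a ball of radius `radius`,
    # so each point's influence on the mean is O(radius / n); add noise at that scale.
    sigma = noise_mult * 2.0 * radius / n
    return mean + rng.normal(0.0, sigma, size=mean.shape)
```

When the data are concentrated, as gradients averaged within a user tend to be, the removal step affects few points, so the released mean stays accurate while the required noise scales with the small radius rather than the worst-case norm.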