Targeted dropout is a regularization technique that applies dropout primarily to the network units and weights judged least useful, as ranked by their magnitudes. Training this way makes networks robust to post-hoc pruning and allows high sparsity with little accuracy loss. In experiments on ResNet, Wide ResNet, and Transformer models across image and text tasks, it reached up to 99% sparsity with less than a 4% drop in accuracy. Scheduling the targeting proportion and dropout rate over the course of training was found to further improve results, outperforming random pruning applied before training. Overall, targeted dropout is an effective regularization method for training networks that can be heavily pruned after training.
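To make the idea concrete, here is a minimal NumPy sketch of magnitude-based targeted weight dropout. The function name, parameter names (`targ_prop` for the targeted fraction, `drop_rate` for the dropout probability), and the exact thresholding details are illustrative assumptions, not the paper's reference implementation: the lowest-magnitude fraction of weights is marked as dropout candidates, and each candidate is independently zeroed with the given probability during a training step.

```python
import numpy as np

def targeted_weight_dropout(w, targ_prop=0.5, drop_rate=0.5, rng=None):
    """Apply dropout only to the lowest-magnitude fraction of weights.

    w         : weight array (any shape).
    targ_prop : fraction of weights targeted as dropout candidates
                (assumed name; the smallest-|w| entries are targeted).
    drop_rate : probability that each targeted weight is dropped
                this training step.
    """
    rng = np.random.default_rng() if rng is None else rng
    flat = np.abs(w).ravel()
    k = int(round(targ_prop * flat.size))
    if k == 0:
        return w.copy()
    # Magnitude threshold below which weights become dropout candidates.
    thresh = np.partition(flat, k - 1)[k - 1]
    candidates = np.abs(w) <= thresh
    # Independently drop each candidate with probability drop_rate;
    # high-magnitude weights are never touched.
    drop = candidates & (rng.random(w.shape) < drop_rate)
    return np.where(drop, 0.0, w)
```

With `drop_rate=1.0` this reduces to deterministic magnitude pruning of the targeted fraction, which is exactly why a network trained with targeted dropout tolerates that pruning well: the surviving high-magnitude weights have learned not to depend on the targeted ones.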