Dropout is a regularization technique in deep learning. During training, randomly selected neurons are temporarily ignored ("dropped out"), typically each with a probability of 0.5, so on any given forward pass roughly half of the neurons in a layer are inactive.
In other words, the network is trained with 50% 'brain damage' to make it learn better and become more robust.
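Here is a minimal sketch of how this can be implemented, using the common "inverted dropout" variant with NumPy; the function name and signature are illustrative, not from any particular library.

```python
import numpy as np

def dropout(activations, p_drop=0.5, training=True, rng=None):
    """Inverted dropout: zero each unit with probability p_drop during
    training, and scale the survivors by 1 / (1 - p_drop) so the expected
    activation matches what the full network produces at test time."""
    if not training or p_drop == 0.0:
        return activations  # at inference, all neurons are active
    rng = rng or np.random.default_rng()
    mask = rng.random(activations.shape) >= p_drop  # True = neuron kept
    return activations * mask / (1.0 - p_drop)

# Example: apply dropout to a batch of hidden-layer activations.
hidden = np.random.default_rng(0).standard_normal((4, 8))
print(dropout(hidden, p_drop=0.5, training=True))
```

The scaling by 1 / (1 - p_drop) is what lets the network be used unchanged at inference time: without it, test-time activations would be roughly twice as large as those seen during training.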