
Dropout is a regularization technique in deep learning that works a little like that. Randomly selected neurons are ignored during training, typically with a probability of 0.5, so on any given pass roughly half of the neurons are switched off.

In other words, the network is trained with 50% 'brain damage' to make it learn better and become more robust.
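A minimal sketch of the idea, assuming NumPy (the function name and shapes here are just for illustration): each training pass draws a random mask that zeroes activations with probability p, and the kept activations are rescaled so the expected output stays the same ("inverted dropout").

    import numpy as np

    def dropout(activations, p=0.5, training=True):
        """Randomly zero activations with probability p during training."""
        if not training:
            return activations  # at inference, every neuron participates
        # Bernoulli mask: each neuron is kept with probability 1 - p
        mask = (np.random.rand(*activations.shape) >= p).astype(activations.dtype)
        # Inverted dropout: rescale so the expected activation is unchanged
        return activations * mask / (1.0 - p)

    h = np.ones(10)
    print(dropout(h, p=0.5))  # roughly half the entries are zeroed, the rest doubled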



Dropout is not deletion. The neurons turned off in one pass will be active again in later passes, and all of them make it into the final model (brain).
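You can see this with PyTorch's built-in dropout layer (a small illustration, not anyone's specific model): in training mode a different random subset is zeroed on each call, while in eval mode no neurons are dropped at all.

    import torch
    import torch.nn as nn

    layer = nn.Dropout(p=0.5)
    x = torch.ones(8)

    layer.train()
    print(layer(x))  # e.g. some entries are 0, kept ones are scaled to 2.0
    print(layer(x))  # a different random mask on the next run

    layer.eval()
    print(layer(x))  # all ones: every neuron participates at inference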




