Dropout
Category
•
Deep Learning
Definition
A regularization technique that randomly sets a fraction of neuron activations to zero during training to prevent overfitting. This forces the network not to rely too heavily on any individual neurons, which improves generalization; at inference time all neurons are kept active.
tl;dr
A regularization technique that randomly sets neurons to zero during training to prevent overfitting.
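Example
As a rough illustration of the idea (not taken from this entry), the sketch below implements "inverted" dropout in NumPy: during training each activation is zeroed with probability p and the survivors are scaled by 1/(1-p) so their expected value is unchanged, while at inference the function is a no-op. The function name, the drop probability, and the sample values are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

def dropout(x, p=0.5, training=True):
    # Inverted dropout (a common formulation, assumed here): during training,
    # zero each activation with probability p and rescale the rest by 1/(1-p);
    # at inference, return the input unchanged.
    if not training or p == 0.0:
        return x
    mask = rng.random(x.shape) >= p   # keep each unit with probability 1 - p
    return x * mask / (1.0 - p)

activations = np.array([0.2, 1.5, -0.7, 3.1])
print(dropout(activations, p=0.5))           # some entries zeroed, rest scaled by 2
print(dropout(activations, training=False))  # unchanged at inference

The rescaling during training is what lets the same network be used at test time without any extra correction, which is how most deep learning libraries implement dropout.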