Geoff Hinton invented dropout, which randomly zeroes out a subset of a network's units on each forward pass during training, motivated by a similar analogy to sexual recombination. It worked reasonably well in its time but has since fallen out of favor. I'm not sure if anyone has looked into the relative interpretability of nets trained with dropout vs. those without.
Fwiw, dropout hasn’t fallen out of favor very much.
I think dropout makes nets less interpretable (with respect to naive interpretability strategies). This is based on my recollection; I forget exactly which experiments we have and haven't run.
OK, good to know. I had a cached belief that it had declined in popularity, and I probably exaggerated the extent of that decline.
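For concreteness, the mechanism described above, zeroing a random subset of units on every forward pass and rescaling the survivors, can be sketched in a few lines of NumPy. This is a minimal illustration of "inverted" dropout under my own naming, not anyone's actual implementation:

```python
import numpy as np

def dropout(x, p=0.5, training=True, rng=None):
    """Inverted dropout: during training, zero each unit with probability p
    and scale the survivors by 1/(1-p) so the expected activation is
    unchanged. At inference time, pass the input through untouched."""
    if not training or p == 0.0:
        return x
    rng = rng if rng is not None else np.random.default_rng()
    mask = rng.random(x.shape) >= p  # keep each unit with probability 1-p
    return x * mask / (1.0 - p)

# Each forward pass samples a fresh mask, so training implicitly averages
# over an ensemble of "thinned" sub-networks.
h = np.ones((2, 4))
train_out = dropout(h, p=0.5, rng=np.random.default_rng(0))  # some units zeroed, rest scaled to 2.0
infer_out = dropout(h, training=False)                       # identity at inference
```

Note that dropout masks activations rather than pruning weights permanently; the full network is always used at inference time.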