Turn off batch-norm but leave dropout on

#1
by matiuste - opened

To whom it may concern,
I would like to run inference with dropout kept on, while turning off batch-norm (and any other process that might update the network's parameters or statistics).
I tried running inference in evaluation mode, but that also turned off dropout.

Any clue would help.
Thanks in advance,
Matias P.

SpeechBrain org

Hi,

If you wish to do that, you will have to modify interfaces.py directly, or override the interface class and do whatever you need. In your case, you can just copy and paste the code of this interface and add a few .train() calls where appropriate (on the dropout layers).
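
As a rough illustration of that idea, here is a minimal sketch in plain PyTorch (the helper name is hypothetical, and it assumes dropout is implemented with the standard torch.nn.Dropout modules, which may not hold for every SpeechBrain model):

```python
import torch

def keep_dropout_on(model: torch.nn.Module) -> torch.nn.Module:
    """Put the model in eval mode, then re-enable dropout only."""
    # eval() freezes batch-norm: it uses its running statistics and
    # stops updating them, and other training-time behaviour is disabled.
    model.eval()
    # Switch just the dropout layers back to train mode so they keep
    # dropping activations at inference time (e.g. for MC-dropout).
    for module in model.modules():
        if isinstance(module, (torch.nn.Dropout, torch.nn.Dropout2d, torch.nn.Dropout3d)):
            module.train()
    return model
```

Wrapping the forward pass in torch.no_grad() on top of this also ensures no gradients are computed and no parameters are updated.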
