Add torch-optimizer to allow trying different optimizers?
I would like to change `optim_g = torch.optim.AdamW(` and `optim_d = torch.optim.AdamW(` in `train_nsf_sim_cache_sid_load_pretrain.py` to something like DiffGrad for experimenting, but I always get `AttributeError: module 'torch.optim' has no attribute 'DiffGrad'`.
Would it be possible to add everything needed into the RVC-beta.7z? Or can someone tell me how to make it work?
You just need to `pip install torch_optimizer` and add `import torch_optimizer as optim` to the script.
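For reference, a minimal sketch of that swap, assuming the surrounding variable names from the RVC training script (`net_g`, `net_d`, `hps.train.*`) match what the original AdamW calls already use; `DiffGrad` itself comes from the `torch_optimizer` package:

```python
# pip install torch-optimizer   (imported as torch_optimizer)
import torch_optimizer

# Replace the two torch.optim.AdamW(...) constructors with DiffGrad.
# The hyperparameter names below are assumptions based on what the
# script passes to AdamW; keep whatever values it already uses.
optim_g = torch_optimizer.DiffGrad(
    net_g.parameters(),
    lr=hps.train.learning_rate,
    betas=hps.train.betas,
    eps=hps.train.eps,
)
optim_d = torch_optimizer.DiffGrad(
    net_d.parameters(),
    lr=hps.train.learning_rate,
    betas=hps.train.betas,
    eps=hps.train.eps,
)
```

Note that `torch_optimizer` is a separate package, so the error in the first post appears because `DiffGrad` is looked up on `torch.optim` instead of `torch_optimizer`.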
Wasn't working for me. But after the pip install I just copied the diffgrad.py file into torch/optim and added it to that __init__.py.
I tried to add `import torch_optimizer as optim` but it didn't work; it shows `ModuleNotFoundError: No module named 'torch_optimizer'` even though I have installed torch-optimizer.
> Wasn't working for me. But after the pip install I just copied the diffgrad.py file into torch/optim and added it to that __init__.py.
How exactly did you do that?
Thanks, it is working now.