possible reason it might work

by alkeryn - opened

My guess is that since merging takes an average of two models, floating-point rounding errors slightly change the values when merging a model with itself.

That slight change can result in a minor improvement or worsening; it is deterministic, however.
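
For concreteness, a linear merge is just an element-wise average of the checkpoints' tensors, key by key; a quick sketch (the state-dict names and shapes here are made up for illustration):

```python
import torch

# Two hypothetical checkpoints; the names and shapes are illustrative only.
a = {"layer.weight": torch.randn(4, 4), "layer.bias": torch.randn(4)}
b = {"layer.weight": torch.randn(4, 4), "layer.bias": torch.randn(4)}

# 50/50 linear merge: element-wise average of matching tensors.
merged = {k: (a[k] + b[k]) / 2 for k in a}
```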

Nice guess. Thanks for sharing your thoughts.

Though if it's a 50% merge, that should not happen normally: averaging a value with itself, (w + w) / 2, is exact in IEEE-754 floating point, so the merged weights should be bit-identical. Maybe it has something to do with the test not being deterministic, or something else; it is worth investigating imo.
If it isn't a 50% merge, then rounding can be the reason, because a weighted average like a * w + (1 - a) * w rounds each product and can shift the values slightly.
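
A quick sketch of that distinction, assuming the merge is a plain linear average in float32 (the tensor below is just a stand-in for a real weight tensor, and alpha = 0.6 is an arbitrary illustrative ratio):

```python
import torch

torch.manual_seed(0)
w = torch.randn(1_000_000)  # float32 stand-in for one weight tensor

# Merging a model with itself at 50%: (w + w) / 2. Doubling only bumps
# the exponent and halving undoes it, so both steps are exact in
# IEEE-754 and the result is bit-identical to w.
half = (w + w) / 2
print(torch.equal(half, w))  # True

# A weighted merge with alpha != 0.5: 0.6 and 0.4 have no exact binary
# representation, so each product rounds and the sum can land about
# one ulp away from w.
alpha = 0.6
weighted = alpha * w + (1 - alpha) * w
print(torch.equal(weighted, w))           # typically False
print((weighted - w).abs().max().item())  # tiny but nonzero
```

So if the merge tool used anything other than an exact 50/50 ratio (or accumulated in a different dtype), slightly perturbed weights are expected even when merging a model with itself.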
