From PyTorch-BigGraph/torchbiggraph/model.py (line 548 in a11ff0e),
it seems that the same-batch negatives just reuse the positive embeddings directly as negatives. However, my understanding is that they should at least be shuffled, to avoid the case where a positive edge gets exactly the same edge as its own negative, which defeats the purpose of learning.
Am I missing something? Thanks!
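For concreteness, here is a minimal sketch of the degenerate case I mean. This is not PBG's actual code; it assumes a plain dot-product comparator and that the whole batch forms a single chunk:

```python
# Minimal sketch (not PBG's actual code) of same-batch negatives that
# reuse positive embeddings: every rhs in the chunk is scored as a
# negative for every lhs, including its own positive pair.
import torch

num_pos, dim = 4, 8
pos_lhs = torch.randn(num_pos, dim)  # left-hand entity embeddings
pos_rhs = torch.randn(num_pos, dim)  # right-hand entity embeddings

# Score every lhs against every rhs in the chunk (dot-product comparator).
scores = pos_lhs @ pos_rhs.t()  # shape: (num_pos, num_pos)

# The diagonal scores each edge against its own positive rhs -- the
# "self-negative" case described above. Masking it out (instead of
# shuffling) would also avoid treating the true edge as a negative:
neg_scores = scores.masked_fill(
    torch.eye(num_pos, dtype=torch.bool), float("-inf")
)
```

Shuffling and masking the diagonal both address the same issue; my concern is only that the true positive should not be used as its own negative.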
And from the code (PyTorch-BigGraph/torchbiggraph/model.py, line 557 in a11ff0e), it seems that if we set both num_batch_neg and num_uniform_neg to be > 0, it will perform both same-batch and uniform negative sampling and concatenate the two as the final negative embeddings? Is this correct?
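If that reading is right, the combination would amount to something like the following hedged sketch (the shapes are illustrative; num_batch_neg and num_uniform_neg are the config parameters from the question, not variables in PBG's code):

```python
# Hedged sketch of combining the two negative sources when both
# num_batch_neg > 0 and num_uniform_neg > 0.
import torch

dim = 8
num_batch_neg, num_uniform_neg = 3, 5

batch_negs = torch.randn(num_batch_neg, dim)      # drawn from the batch itself
uniform_negs = torch.randn(num_uniform_neg, dim)  # sampled uniformly over entities

# Final negative set = concatenation of the two sources.
all_negs = torch.cat([batch_negs, uniform_negs], dim=0)
assert all_negs.shape == (num_batch_neg + num_uniform_neg, dim)
```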