This repository was archived by the owner on Mar 14, 2024. It is now read-only.

Behavior of same-batch negative sampling? #268

Description

@Kublai-Jing

From

neg_embs = pos_embs

It seems that the same-batch negatives just use the positive embeddings directly as negatives. However, my understanding is that they should at least be shuffled, to avoid the case where a positive edge gets exactly the same edge as its negative, which would defeat the purpose of learning.

Am I missing something? Thanks!
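For context, here is a minimal sketch of my reading of how same-batch negatives can reuse the positive embeddings without any shuffling: each positive in the batch serves as a negative for every *other* example, and each example's score against its own positive is masked out. This is only an illustration of the idea, not the library's actual code; the function name, tensor shapes, and masking step are my assumptions.

```python
import torch

def same_batch_negative_scores(src_embs, pos_embs):
    # src_embs, pos_embs: (batch_size, dim) embeddings of the left- and
    # right-hand entities of each positive edge in the batch.
    # Every positive embedding doubles as a negative for every *other*
    # example in the batch, so no shuffling is needed.
    scores = src_embs @ pos_embs.t()  # (batch, batch) all-pairs scores
    # Mask the diagonal so no example is scored against its own positive:
    # those entries are the true edges, not negatives.
    # (Assumption: this is how the self-match is typically avoided.)
    mask = torch.eye(scores.size(0), dtype=torch.bool, device=scores.device)
    return scores.masked_fill(mask, float("-inf"))

# Example: a batch of 4 edges with 8-dimensional embeddings.
src = torch.randn(4, 8)
pos = torch.randn(4, 8)
print(same_batch_negative_scores(src, pos))
```

Under this reading, `neg_embs = pos_embs` would be harmless as long as the diagonal (self-pair) scores are excluded from the loss somewhere downstream.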

And from the code (

neg_embs = torch.cat(

) it seems that if we set both num_batch_neg and num_uniform_neg to be > 0, it will perform both same-batch and uniform negative sampling and concatenate the two as the final negative embeddings. Is this correct?
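If that reading is right, the combined behavior would look roughly like the sketch below, where the same-batch negatives and the uniformly sampled negatives are concatenated along the negatives dimension, mirroring the `torch.cat` call quoted above. The helper function, sampling code, and tensor names here are hypothetical; only num_batch_neg and num_uniform_neg come from the config.

```python
import torch

def combine_negatives(pos_embs, num_uniform_neg, all_entity_embs):
    # pos_embs: (batch, dim) positive embeddings, reused directly as
    # same-batch negatives (as in `neg_embs = pos_embs` above).
    batch_negs = pos_embs
    # Uniformly sample additional negatives from the full entity table.
    # (Hypothetical sampling step; the real code likely differs.)
    idx = torch.randint(0, all_entity_embs.size(0), (num_uniform_neg,))
    uniform_negs = all_entity_embs[idx]
    # Concatenate along the "negatives" dimension to form the final
    # negative embeddings used for scoring.
    return torch.cat([batch_negs, uniform_negs], dim=0)

entity_table = torch.randn(1000, 8)  # toy entity embedding table
pos = torch.randn(4, 8)              # a batch of 4 positives
negs = combine_negatives(pos, num_uniform_neg=6, all_entity_embs=entity_table)
print(negs.shape)                    # torch.Size([10, 8])
```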
