
Embeddings are getting clustered together in a small region after training #20

Open
@Nrohlable

Description

Hi @tamerthamoqa,

Thanks a lot for such a fantastic repo, which we have been able to use for our work.
I have recently been building a face verification system using a Siamese network. Using the pretrained models trained on the CASIA-WebFace and VGGFace2 datasets, I was able to achieve close to 90% accuracy on my own dataset. I then applied the hard triplet batch sampling and training strategy to further fine-tune the network, but for some reason the embeddings of all images end up clustered together in a small region after training. In other words, the distance between the embeddings of any two persons becomes far too small: for example, where the pretrained models gave a cosine distance of about 0.45 for a pair of embeddings, after fine-tuning with the triplet loss that distance drops to about 0.006, and it barely changes whether the pair is of the same person or of different persons.
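
For reference, the distances above can be checked with something like the following minimal sketch (this assumes PyTorch and L2-normalised embeddings; the helper names are purely illustrative and not from this repo):

```python
import torch
import torch.nn.functional as F


def cosine_distance(emb_a: torch.Tensor, emb_b: torch.Tensor) -> torch.Tensor:
    """Cosine distance = 1 - cosine similarity between two embedding vectors."""
    return 1.0 - F.cosine_similarity(emb_a, emb_b, dim=-1)


@torch.no_grad()
def collapse_check(embeddings: torch.Tensor) -> None:
    """Print pairwise-distance statistics for a batch of embeddings.

    If the mean pairwise distance is near zero and its standard deviation is
    tiny (e.g. the ~0.006 figure mentioned above, regardless of identity),
    the embeddings have collapsed into a single small region.
    """
    embeddings = F.normalize(embeddings, p=2, dim=1)
    sims = embeddings @ embeddings.t()          # cosine similarities, shape (N, N)
    dists = 1.0 - sims
    mask = ~torch.eye(len(embeddings), dtype=torch.bool)  # drop self-pairs
    off_diag = dists[mask]
    print(f"mean pairwise distance: {off_diag.mean().item():.4f}")
    print(f"std of pairwise distance: {off_diag.std().item():.4f}")
```

Running this on a batch of embeddings from different identities gives a mean distance around 0.45 with the pretrained weights, but around 0.006 after fine-tuning with the triplet loss.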

If you could give me any insights on this, that would be helpful.
Thanks
