About all_text_embedding #27

@LeslieZJC

Description

Dear author, while reviewing the code I noticed the following issue: `all_text_embedding = self.get_input_embeddings()(input_ids.clamp(min=0)).detach()`. This `all_text_embedding` includes embeddings for the answer tokens (e.g., the GPT output), which are then fed into `prefusion_layers` for fusion with the other features. Is this design correct?
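For context, the quoted line looks up embeddings for every position in `input_ids`, answer tokens included. Below is a minimal sketch of that pattern and of one possible mitigation, masking out the answer positions before fusion. The toy embedding table, the `answer_mask`, and the masking step are my assumptions for illustration, not the author's code; only the `clamp(min=0)` + `detach()` lookup comes from the issue:

```python
import torch
import torch.nn as nn

# Toy stand-in for the model's input embedding table.
vocab_size, hidden = 10, 4
embed = nn.Embedding(vocab_size, hidden)

# input_ids may contain negative placeholder ids (e.g. -100 for ignored
# label positions), which is presumably why the code clamps to min=0
# before the lookup.
input_ids = torch.tensor([[1, 2, 3, -100, 5]])

# The pattern from the issue: detached embeddings for ALL positions,
# answer tokens included.
all_text_embedding = embed(input_ids.clamp(min=0)).detach()

# Hypothetical mitigation: zero out answer positions (marked True in a
# boolean mask) before fusing with the other features.
answer_mask = torch.tensor([[False, False, False, True, True]])
masked_embedding = all_text_embedding.masked_fill(answer_mask.unsqueeze(-1), 0.0)
```

Whether masking (or slicing the sequence at the answer boundary) is needed depends on whether the answer tokens are meant to be visible to the fusion layers at training time.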
