
function one_hot_embedding may lack .to(device) #18

Open
@LSTM-Kirigaya

Description

Hi guys, nice work! However, it seems you forgot to move the torch.eye tensor to the device in one_hot_embedding.

I modified one_hot_embedding in helpers.py as follows:

def one_hot_embedding(labels, num_classes=10):
    # Convert labels to a one-hot encoding.
    device = get_device()
    # Build the identity matrix on the same device as the labels,
    # so indexing with GPU labels does not raise a device mismatch.
    y = torch.eye(num_classes).to(device)
    return y[labels]
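
For context, here is a minimal sketch of the failure this fixes, assuming labels is a CUDA tensor; the get_device below is a stand-in for the repo's own helper, not the actual implementation:

import torch

# Hypothetical stand-in for the repo's get_device helper.
def get_device():
    return torch.device("cuda" if torch.cuda.is_available() else "cpu")

device = get_device()
labels = torch.tensor([0, 2, 1], device=device)

# Without .to(device), torch.eye(num_classes) lives on the CPU, so
# indexing it with CUDA labels raises a device-mismatch RuntimeError.
y = torch.eye(3).to(device)
print(y[labels])  # tensor of shape (3, 3), one row per label

On a CUDA machine, dropping the .to(device) call makes the y[labels] indexing fail, because the index tensor and the indexed tensor are on different devices.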
