Backend: Use pretrained BERT embedding (preprocessing) #32

@AngelinaZhai

Description

Try out BERT tokenization and pretrained BERT embeddings as a replacement for one-hot encoding in preprocessing. Implement the change and compare results against the baseline model.
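A minimal sketch of the comparison described above: a one-hot baseline next to a contextual BERT embedding via Hugging Face `transformers`. The checkpoint name `bert-base-uncased` and the function names are assumptions for illustration; the issue does not specify a model or API.

```python
# Sketch: one-hot baseline vs. pretrained BERT embeddings (assumed setup).
import numpy as np


def one_hot(tokens, vocab):
    """Baseline: each token becomes a sparse |vocab|-dim indicator vector."""
    idx = {w: i for i, w in enumerate(vocab)}
    out = np.zeros((len(tokens), len(vocab)))
    for row, tok in enumerate(tokens):
        out[row, idx[tok]] = 1.0
    return out


def bert_embed(sentence):
    """Replacement: contextual 768-dim vector per subword token.

    Requires `pip install transformers torch`; downloads the checkpoint
    on first use. "bert-base-uncased" is an assumed choice of model.
    """
    import torch
    from transformers import AutoModel, AutoTokenizer

    tok = AutoTokenizer.from_pretrained("bert-base-uncased")
    model = AutoModel.from_pretrained("bert-base-uncased")
    with torch.no_grad():
        enc = tok(sentence, return_tensors="pt")
        hidden = model(**enc).last_hidden_state  # shape (1, seq_len, 768)
    return hidden.squeeze(0).numpy()


vocab = ["the", "cat", "sat"]
baseline = one_hot(["the", "cat"], vocab)
print(baseline.shape)  # (2, 3): one sparse row per token
```

Unlike the one-hot rows, the BERT vectors are dense and context-dependent, so the same word gets different vectors in different sentences, which is the property being evaluated against the baseline.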
