A PyTorch implementation of a multi-task learning model that performs binary sentence classification and sentiment analysis using a pre-trained transformer as the backbone.
- Introduction
- Model Architecture
- Dependencies
- Installation
- Training the Model
- Inference
- Results
- Future Work
This project demonstrates how to:
- Use transformers for sentence embeddings.
- Implement multi-task learning with task-specific heads:
  - Task A: Binary classification (positive vs. negative).
  - Task B: Sentiment analysis (positive, neutral, negative).
- Fine-tune task-specific heads while freezing the transformer layers.
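The setup described above can be sketched as a small PyTorch module: a frozen shared encoder feeding two task-specific heads. The class and head names follow this README; the `nn.Identity` stand-in encoder and the 384-dimensional embedding size (the output size of all-MiniLM-L6-v2) are illustrative assumptions, since the real project wraps a SentenceTransformer backbone.

```python
import torch
import torch.nn as nn

EMBED_DIM = 384  # all-MiniLM-L6-v2 produces 384-dim sentence embeddings


class MultiTaskModel(nn.Module):
    """Frozen shared encoder with two task-specific heads."""

    def __init__(self, encoder: nn.Module, embed_dim: int = EMBED_DIM):
        super().__init__()
        self.encoder = encoder
        # Freeze the backbone: only the heads receive gradient updates.
        for p in self.encoder.parameters():
            p.requires_grad = False
        # Task A: binary classification (positive vs. negative)
        self.classifier_head = nn.Linear(embed_dim, 2)
        # Task B: sentiment analysis (positive, neutral, negative)
        self.sentiment_head = nn.Linear(embed_dim, 3)

    def forward(self, embeddings: torch.Tensor):
        features = self.encoder(embeddings)
        return self.classifier_head(features), self.sentiment_head(features)


# Stand-in encoder for illustration only; the real backbone maps raw
# sentences to embeddings via the SentenceTransformer model.
model = MultiTaskModel(nn.Identity())
logits_a, logits_b = model(torch.randn(4, EMBED_DIM))
print(logits_a.shape, logits_b.shape)  # torch.Size([4, 2]) torch.Size([4, 3])
```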
- Base Model: `SentenceTransformer("all-MiniLM-L6-v2")`
- Task-Specific Heads:
  - `classifier_head`: Classifies sentences as positive or negative.
  - `sentiment_head`: Classifies sentences into positive, neutral, or negative.
- Loss Functions: `CrossEntropyLoss` for both tasks.
- Optimizer: `Adam` with a learning rate of `0.001`.
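Putting these pieces together, one training step might look like the following sketch. The loss function (`CrossEntropyLoss` for both tasks) and optimizer settings (`Adam`, learning rate `0.001`) come from this README; the dummy batch, label tensors, and the choice to simply sum the two task losses are illustrative assumptions.

```python
import torch
import torch.nn as nn

# Illustrative stand-ins: two heads over 384-dim sentence embeddings.
classifier_head = nn.Linear(384, 2)   # Task A: binary classification
sentiment_head = nn.Linear(384, 3)    # Task B: three-way sentiment

criterion = nn.CrossEntropyLoss()     # same loss function for both tasks
# Only the heads are optimized; the frozen backbone contributes no params.
optimizer = torch.optim.Adam(
    list(classifier_head.parameters()) + list(sentiment_head.parameters()),
    lr=0.001,
)

# Dummy batch: precomputed sentence embeddings plus labels for each task.
embeddings = torch.randn(8, 384)
labels_a = torch.randint(0, 2, (8,))  # binary labels
labels_b = torch.randint(0, 3, (8,))  # sentiment labels

optimizer.zero_grad()
loss_a = criterion(classifier_head(embeddings), labels_a)
loss_b = criterion(sentiment_head(embeddings), labels_b)
loss = loss_a + loss_b                # combine per-task losses
loss.backward()
optimizer.step()
```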
- `torch` (PyTorch)
- `sentence-transformers`
```bash
pip install -r requirements.txt
```
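The contents of `requirements.txt` are not shown here; based on the dependencies listed above, a minimal version would look something like:

```text
torch
sentence-transformers
```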