Gaze-infused BERT

This research investigates the connection between the self-attention of large-scale pre-trained language models such as BERT and human gaze patterns, with the aim of using gaze information to improve the performance of natural language processing (NLP) models.

Analysis of Self-Attention and Gaze

  1. Download the dataset (pwd: 4x0l).

  2. cd Analysis/ and run python bert_crf.py to extract BERT's self-attention weights, which are saved to corpus/.

  3. Compute the Spearman correlation between self-attention and gaze by running python plot.py (see the sketch below).
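
The sketch below illustrates steps 2 and 3 end to end: it pulls per-token self-attention out of BERT with Hugging Face transformers and correlates it with gaze durations via scipy.stats.spearmanr. The example sentence, the gaze values, and the choice of layer and aggregation are illustrative assumptions; the repo's bert_crf.py and plot.py implement the actual pipeline.

```python
# Minimal sketch of the attention-vs-gaze correlation analysis.
# The sentence and gaze values below are hypothetical placeholders.
import torch
from scipy.stats import spearmanr
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased", output_attentions=True)
model.eval()

sentence = "The quick brown fox jumps over the lazy dog"
# Hypothetical total-reading-time values, one per word (ms).
gaze = [210.0, 180.0, 195.0, 240.0, 230.0, 150.0, 160.0, 205.0, 250.0]

inputs = tokenizer(sentence, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# outputs.attentions: tuple of (batch, heads, seq, seq), one per layer.
# Average the last layer over heads, then sum each column to get the
# attention each token receives from the rest of the sentence.
att = outputs.attentions[-1].mean(dim=1)[0]   # (seq, seq)
received = att.sum(dim=0)                     # (seq,)

# Drop [CLS]/[SEP]; this toy sentence tokenizes to one piece per word,
# so positions 1..-1 line up with the word-level gaze list.
received = received[1:-1].tolist()
assert len(received) == len(gaze)

rho, p = spearmanr(received, gaze)
print(f"Spearman rho = {rho:.3f} (p = {p:.3f})")
```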

Gaze-infused BERT Method

  1. For datasets without gaze signals, first cd Gaze_prediction_model_V1/scripts/ and run python run_roberta.py to predict gaze features (a rough sketch of this idea follows the list).

  2. For the GLUE and SNLI datasets, cd source/ and run the Python file corresponding to the task.

  3. For the WSC, WiC, and COPA datasets, cd WWC/, set the corresponding dataset, and run python run_main.py.

  4. For the LCSTS dataset, cd LCSTS/ and run python train.py.
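
For intuition about step 1, here is a minimal sketch of a RoBERTa-based token-level gaze regressor. The GazePredictor class, the two assumed gaze features, and the MSE loss are illustrative assumptions, not the exact architecture in run_roberta.py.

```python
# Illustrative sketch of a token-level gaze predictor in the spirit of
# run_roberta.py: RoBERTa encodes the text and a linear head regresses
# gaze features per token. The head, feature count, and loss are
# assumptions, not the repo's code.
import torch
import torch.nn as nn
from transformers import RobertaModel, RobertaTokenizer

NUM_GAZE_FEATURES = 2  # assumed: e.g., first-fixation duration, total reading time

class GazePredictor(nn.Module):
    def __init__(self, name="roberta-base"):
        super().__init__()
        self.encoder = RobertaModel.from_pretrained(name)
        self.head = nn.Linear(self.encoder.config.hidden_size, NUM_GAZE_FEATURES)

    def forward(self, input_ids, attention_mask):
        hidden = self.encoder(input_ids=input_ids,
                              attention_mask=attention_mask).last_hidden_state
        return self.head(hidden)  # (batch, seq, NUM_GAZE_FEATURES)

tokenizer = RobertaTokenizer.from_pretrained("roberta-base")
model = GazePredictor()

batch = tokenizer(["Human gaze reflects reading difficulty."],
                  return_tensors="pt", padding=True)
preds = model(batch["input_ids"], batch["attention_mask"])
print(preds.shape)  # torch.Size([1, seq_len, 2])

# Training would minimize a regression loss against recorded gaze, e.g.:
# loss = nn.MSELoss()(predicted_gaze, gold_gaze) over non-padding tokens.
```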

About

Code for the paper "Gaze-infused BERT: Do Human Gaze Signals Help Pre-trained Language Models?"
