# Neural complexity

A neural language model that computes various information-theoretic processing complexity measures (e.g., surprisal) for each word given the preceding context. It can also function as an adaptive language model ([van Schijndel and Linzen, 2018](http://aclweb.org/anthology/D18-1499)) that adapts to the test domain.
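
For concreteness, here is a minimal sketch of how per-word surprisal can be computed with a PyTorch language model. The `surprisals` function, the `model` and `vocab` objects, and the assumed output shape are illustrative stand-ins, not this repository's actual interface; the repository's own scripts compute these measures for you.

```python
# Minimal sketch: per-word surprisal from a PyTorch language model.
# Assumptions (not this repository's interface): `vocab` maps words to
# integer ids, and `model(ids)` returns logits of shape
# (batch, seq_len, vocab_size).
import math

import torch
import torch.nn.functional as F

def surprisals(model, vocab, words):
    """Surprisal -log2 P(w_t | w_1..w_{t-1}), in bits, for each word after the first."""
    ids = torch.tensor([[vocab[w] for w in words]])       # shape: (1, seq_len)
    with torch.no_grad():
        log_probs = F.log_softmax(model(ids), dim=-1)     # log P over the vocabulary
    return [
        -log_probs[0, t - 1, ids[0, t]].item() / math.log(2)  # nats -> bits
        for t in range(1, len(words))
    ]
```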
**Note**: Recent updates remove dependencies but break compatibility with pre-2021 models. To use older models, check out version 1.1.0: `git checkout tags/v1.1.0`
### Dependencies
Requires the following python packages (available through pip):
* [pytorch](https://pytorch.org/)

The following python packages are optional:
* progress
* dill (to handle binarized vocabularies)
### Quick Usage
The examples below all use GPUs. To use CPUs instead, omit the `--cuda` flag.