
Commit 41daeea

Merge pull request #969 from PAIR-code/dev
v0.5 Release
2 parents: 7bee5d3 + 8653090


416 files changed (+33846 -10633 lines)

.github/workflows/ci.yml (+1 -1)

@@ -32,7 +32,7 @@ jobs:
     name: Build and test (${{ matrix.python-version }})
     strategy:
       matrix:
-        python-version: ["3.7"]
+        python-version: ["3.9"]
     defaults:
       run:
         shell: bash -l {0}

.gitignore (+5 -1)

@@ -1,6 +1,10 @@
-lit_nlp/yarn-error.log
+**/npm-debug.log*
+**/yarn-debug.log*
+**/yarn-error.log*
 website/www/**
 **/build/**
 **/node_modules/**
 **/__pycache__/**
 **/*.pyc
+
+**/.DS_Store

Dockerfile (+1 -1)

@@ -1,6 +1,6 @@
 # Use the official lightweight Python image.
 # https://hub.docker.com/_/python
-FROM python:3.7-slim
+FROM python:3.9-slim

 # Update Ubuntu packages and install basic utils
 RUN apt-get update

README.md (+91 -54)

@@ -1,11 +1,12 @@
-# 🔥 Language Interpretability Tool (LIT)
+# 🔥 Learning Interpretability Tool (LIT)

-<!--* freshness: { owner: 'lit-dev' reviewed: '2021-07-13' } *-->
+<!--* freshness: { owner: 'lit-dev' reviewed: '2022-11-15' } *-->

-The Language Interpretability Tool (LIT) is a visual, interactive
-model-understanding tool for ML models, focusing on NLP use-cases. It can be run
-as a standalone server, or inside of notebook environments such as Colab,
-Jupyter, and Google Cloud Vertex AI notebooks.
+The Learning Interpretability Tool (🔥LIT, formerly known as the Language
+Interpretability Tool) is a visual, interactive ML model-understanding tool that
+supports text, image, and tabular data. It can be run as a standalone server, or
+inside of notebook environments such as Colab, Jupyter, and Google Cloud Vertex
+AI notebooks.

 LIT is built to answer questions such as:

@@ -50,12 +51,12 @@ For a broader overview, check out [our paper](https://arxiv.org/abs/2008.05122)

 ## Download and Installation

-LIT can be installed via pip, or can be built from source. Building from source
-is necessary if you wish to update any of the front-end or core back-end code.
+LIT can be installed via `pip` or built from source. Building from source is
+necessary if you update any of the front-end or core back-end code.

 ### Install from source

-Download the repo and set up a Python environment:
+Clone the repo and set up a Python environment:

 ```sh
 git clone https://github.com/PAIR-code/lit.git ~/lit
@@ -68,11 +69,11 @@ conda install cudnn cupti # optional, for GPU support
 conda install -c pytorch pytorch # optional, for PyTorch

 # Build the frontend
-pushd lit_nlp; yarn && yarn build; popd
+(cd lit_nlp; yarn && yarn build)
 ```

 Note: if you see [an error](https://github.com/yarnpkg/yarn/issues/2821)
-running yarn on Ubuntu/Debian, be sure you have the
+running `yarn` on Ubuntu/Debian, be sure you have the
 [correct version installed](https://yarnpkg.com/en/docs/install#linux-tab).

 ### pip installation
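
The `pip install lit-nlp` path picked up in the next hunk installs only the core package. As a quick, hedged sanity check of a finished pip install (illustrative only, not part of this commit; assumes the install and its core prerequisites completed cleanly):

```python
# Minimal sanity check after `pip install lit-nlp`: import the core package and
# one of the public API type classes, then print where the package landed.
import lit_nlp
from lit_nlp.api import types as lit_types

print("LIT package location:", lit_nlp.__file__)
print("Example spec type:", lit_types.TextSegment())
```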
@@ -81,62 +82,79 @@ running yarn on Ubuntu/Debian, be sure you have the
 pip install lit-nlp
 ```

-The pip installation will install all necessary prerequisite packages for use
-of the core LIT package. It also installs the code to run our demo examples.
-It does not install the prerequisites for those demos, so you need to install
-those yourself if you wish to run the demos. See
-[environment.yml](./environment.yml) for the list of all packages needed for
-running the demos.
+The `pip` installation will install all necessary prerequisite packages for use
+of the core LIT package.
+
+It **does not** install the prerequisites for the provided demos, so you need to
+install those yourself. See [environment.yml](./environment.yml) for the list of
+packages required to run the demos.

 ## Running LIT

 Explore a collection of hosted demos on the
 [LIT website demos page](https://pair-code.github.io/lit/demos).

-Colab notebooks showing the use of LIT inside of notebooks can be found at [lit_nlp/examples/notebooks](./lit_nlp/examples/notebooks).
-A simple example can be viewed
-[here](https://colab.research.google.com/github/pair-code/lit/blob/main/lit_nlp/examples/notebooks/LIT_sentiment_classifier.ipynb).
-
 ### Quick-start: classification and regression

-To explore classification and regression models tasks from the popular [GLUE benchmark](https://gluebenchmark.com/):
+To explore classification and regression models tasks from the popular
+[GLUE benchmark](https://gluebenchmark.com/):

 ```sh
 python -m lit_nlp.examples.glue_demo --port=5432 --quickstart
 ```

-Navigate to http://localhost:5432 to access the LIT UI.
+Navigate to http://localhost:5432 to access the LIT UI.

-Your default view will be a
+Your default view will be a
 [small BERT-based model](https://arxiv.org/abs/1908.08962) fine-tuned on the
 [Stanford Sentiment Treebank](https://nlp.stanford.edu/sentiment/treebank.html),
-but you can switch to
-[STS-B](http://ixa2.si.ehu.es/stswiki/index.php/STSbenchmark) or [MultiNLI](https://cims.nyu.edu/~sbowman/multinli/) using the toolbar or the gear icon in
-the upper right.
+but you can switch to
+[STS-B](http://ixa2.si.ehu.es/stswiki/index.php/STSbenchmark) or
+[MultiNLI](https://cims.nyu.edu/~sbowman/multinli/) using the toolbar or the
+gear icon in the upper right.

+### Quick-start: language modeling

-### Quick start: language modeling
-
-To explore predictions from a pretrained language model (BERT or GPT-2), run:
+To explore predictions from a pre-trained language model (BERT or GPT-2), run:

 ```sh
-python -m lit_nlp.examples.lm_demo --models=bert-base-uncased \
-  --port=5432
+python -m lit_nlp.examples.lm_demo --models=bert-base-uncased --port=5432
 ```

 And navigate to http://localhost:5432 for the UI.

 ### Notebook usage

-A simple colab demo can be found [here](https://colab.research.google.com/github/PAIR-code/lit/blob/main/lit_nlp/examples/notebooks/LIT_sentiment_classifier.ipynb).
-Just run all the cells to see LIT on an example classification model right in
-the notebook.
+Colab notebooks showing the use of LIT inside of notebooks can be found at
+google3/third_party/py/lit_nlp/examples/notebooks.
+
+We provide a simple
+[Colab demo](https://colab.research.google.com/github/PAIR-code/lit/blob/main/lit_nlp/examples/notebooks/LIT_sentiment_classifier.ipynb).
+Run all the cells to see LIT on an example classification model in the notebook.

 ### Run LIT in a Docker container

-See [docker.md](https://github.com/PAIR-code/lit/wiki/docker.md) for instructions on running LIT as
-a containerized web app. This is the approach we take for our
-[website demos](https://pair-code.github.io/lit/demos/).
+LIT can be run as a containerized app using [Docker](https://www.docker.com/) or
+your preferred engine. Use the following shell commands to build the default
+Docker image for LIT from the provided `Dockerfile`, and then run a container
+from that image. Comments are provided in-line to help explain each step.
+
+```shell
+# Build the docker image using the -t argument to name the image. Remember to
+# include the trailing . so Docker knows where to look for the Dockerfile
+docker build -t lit_app .
+
+# Now you can run LIT as a containerized app using the following command. Note
+# that the last parameter to the run command is the value you passed to the -t
+# argument in the build command above.
+docker run --rm -p 5432:5432 lit-app
+```
+
+The image above defaults to launching the GLUE demo on port 5432, but you can
+override this using environment variables. See our
+[advanced guide](https://github.com/PAIR-code/lit/wiki/docker.md) for detailed instructions on using the default
+LIT Docker image, running LIT as a containerized web app in different scenarios,
+and how to creating your own LIT images.

 ### More Examples

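The notebook-usage text in the hunk above points at the Colab examples. For orientation, a minimal sketch of the in-notebook pattern those examples follow, using the public `LitWidget` API (the model path is a placeholder and nothing here is code from this commit):

```python
# Sketch of in-notebook usage, following the pattern in the linked Colab demo.
# Assumes lit-nlp plus the GLUE example prerequisites are installed, and that a
# fine-tuned SST-2 checkpoint has already been downloaded to MODEL_PATH.
from lit_nlp import notebook
from lit_nlp.examples.datasets import glue
from lit_nlp.examples.models import glue_models

MODEL_PATH = "path/to/sst2_model"  # placeholder path, not part of this commit

datasets = {"sst_dev": glue.SST2Data("validation")}
models = {"sst_tiny": glue_models.SST2Model(MODEL_PATH)}

# Render the LIT UI inline in the notebook cell.
widget = notebook.LitWidget(models, datasets, height=800)
widget.render()
```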
@@ -154,33 +172,52 @@ watch this [video](https://www.youtube.com/watch?v=CuRI_VK83dU).
 ## Adding your own models or data

 You can easily run LIT with your own model by creating a custom `demo.py`
-launcher, similar to those in [lit_nlp/examples](./lit_nlp/examples). The basic
-steps are:
+launcher, similar to those in [lit_nlp/examples](./lit_nlp/examples). The
+basic steps are:

-* Write a data loader which follows the
-[`Dataset` API](https://github.com/PAIR-code/lit/wiki/api.md#datasets)
+* Write a data loader which follows the [`Dataset` API](https://github.com/PAIR-code/lit/wiki/api.md#datasets)
 * Write a model wrapper which follows the [`Model` API](https://github.com/PAIR-code/lit/wiki/api.md#models)
 * Pass models, datasets, and any additional
-[components](https://github.com/PAIR-code/lit/wiki/api.md#interpretation-components) to the LIT server
-class
+[components](https://github.com/PAIR-code/lit/wiki/api.md#interpretation-components) to the LIT server class

 For a full walkthrough, see
 [adding models and data](https://github.com/PAIR-code/lit/wiki/api.md#adding-models-and-data).

 ## Extending LIT with new components

 LIT is easy to extend with new interpretability components, generators, and
-more, both on the frontend or the backend. See our
-[documentation](https://github.com/PAIR-code/lit/wiki) to get started.
+more, both on the frontend or the backend. See our [documentation](https://github.com/PAIR-code/lit/wiki) to get
+started.

 ## Pull Request Process

-To make code changes to LIT, please work off of the `dev` branch and create
-pull requests against that branch. The `main` branch is for stable releases, and it is expected that the `dev` branch will always be ahead of `main` in terms of commits.
+To make code changes to LIT, please work off of the `dev` branch and
+[create pull requests](https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/proposing-changes-to-your-work-with-pull-requests/creating-a-pull-request)
+(PRs) against that branch. The `main` branch is for stable releases, and it is
+expected that the `dev` branch will always be ahead of `main`.
+
+[Draft PRs](https://github.blog/2019-02-14-introducing-draft-pull-requests/) are
+encouraged, especially for first-time contributors or contributors working on
+complex tasks (e.g., Google Summer of Code contributors). Please use these to
+communicate ideas and implementations with the LIT team, in addition to issues.
+
+Prior to sending your PR or marking a Draft PR as "Ready for Review", please run
+the Python and TypeScript linters on your code to ensure compliance with
+Google's [Python](https://google.github.io/styleguide/pyguide.html) and
+[TypeScript](https://google.github.io/styleguide/tsguide.html) Style Guides.
+
+```sh
+# Run Pylint on your code using the following command from the root of this repo
+pushd lit_nlp & pylint & popd
+
+# Run ESLint on your code using the following command from the root of this repo
+pushd lit_nlp & yarn lint & popd
+```

 ## Citing LIT

-If you use LIT as part of your work, please cite [our EMNLP paper](https://arxiv.org/abs/2008.05122):
+If you use LIT as part of your work, please cite
+[our EMNLP paper](https://arxiv.org/abs/2008.05122):

 ```
 @misc{tenney2020language,
@@ -198,8 +235,8 @@ If you use LIT as part of your work, please cite [our EMNLP paper](https://arxiv

 This is not an official Google product.

-LIT is a research project, and under active development by a small team.
-There will be some bugs and rough edges, but we're releasing at an early stage
-because we think it's pretty useful already. We want LIT to be an open platform,
-not a walled garden, and we'd love your suggestions and feedback - drop us a
-line in the [issues](https://github.com/pair-code/lit/issues).
+LIT is a research project and under active development by a small team. There
+will be some bugs and rough edges, but we're releasing at an early stage because
+we think it's pretty useful already. We want LIT to be an open platform, not a
+walled garden, and we would love your suggestions and feedback - drop us a line
+in the [issues](https://github.com/pair-code/lit/issues).
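
The "Adding your own models or data" hunk earlier in this README diff lists three steps: a `Dataset` loader, a `Model` wrapper, and passing both to the LIT server class. A compact sketch of what such a custom `demo.py` can look like follows; the toy dataset, toy model, and field names are illustrative placeholders rather than code from this commit, while `Dataset`, `Model`, and `dev_server.Server` are the public APIs the README references:

```python
# Illustrative custom launcher following the Dataset and Model APIs referenced
# above. The data, field names, and scoring logic are placeholders.
from absl import app
from lit_nlp import dev_server
from lit_nlp import server_flags
from lit_nlp.api import dataset as lit_dataset
from lit_nlp.api import model as lit_model
from lit_nlp.api import types as lit_types


class ToyDataset(lit_dataset.Dataset):
  """A tiny in-memory dataset."""

  def __init__(self):
    self._examples = [
        {"sentence": "a delightful film", "label": "1"},
        {"sentence": "an utter disappointment", "label": "0"},
    ]

  def spec(self):
    return {
        "sentence": lit_types.TextSegment(),
        "label": lit_types.CategoryLabel(vocab=["0", "1"]),
    }


class ToyModel(lit_model.Model):
  """A placeholder model that 'predicts' sentiment from sentence length."""

  def input_spec(self):
    return {"sentence": lit_types.TextSegment()}

  def output_spec(self):
    return {"probas": lit_types.MulticlassPreds(vocab=["0", "1"], parent="label")}

  def predict_minibatch(self, inputs):
    for ex in inputs:
      p = min(len(ex["sentence"]) / 40.0, 1.0)
      yield {"probas": [1.0 - p, p]}


def main(_):
  models = {"toy": ToyModel()}
  datasets = {"toy": ToyDataset()}
  # Pass models and datasets (plus any extra components) to the LIT server.
  lit_demo = dev_server.Server(models, datasets, **server_flags.get_flags())
  lit_demo.serve()


if __name__ == "__main__":
  app.run(main)
```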
