
Commit 137e3c7

chore: update more urls in docs given new repo url (#578)

1 parent: 8d8fc97

File tree: 11 files changed (+465 / -466 lines)
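The +465 / -466 totals are simply counts of added and removed lines across the unified diffs in this commit. A minimal sketch of how such stats can be tallied (an illustrative helper, not GitHub's implementation; the sample patch is abbreviated from the docs/citation.md hunk below):

```python
def diff_stats(patch: str) -> tuple[int, int]:
    """Count (additions, deletions) in a unified diff, ignoring file headers."""
    additions = deletions = 0
    for line in patch.splitlines():
        if line.startswith("+++") or line.startswith("---"):
            continue  # "---"/"+++" are file headers, not content changes
        if line.startswith("+"):
            additions += 1
        elif line.startswith("-"):
            deletions += 1
    return additions, deletions

patch = """--- a/docs/citation.md
+++ b/docs/citation.md
@@ -5,6 +5,6 @@
-howpublished = {\\url{https://github.com/jbloomAus/SAELens}},
+howpublished = {\\url{https://github.com/decoderesearch/SAELens}},
"""
print(diff_stats(patch))  # → (1, 1)
```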

.github/ISSUE_TEMPLATE/bug.md (5 additions, 5 deletions)

````diff
@@ -2,7 +2,6 @@
 name: Bug Report
 about: Submit a bug report
 title: "[Bug Report] Bug title"
-
 ---
 
 If you are submitting a bug report, please fill in the following details and use the tag [bug].
@@ -15,13 +14,14 @@ Please try to provide a minimal example to reproduce the bug. Error messages and
 
 **System Info**
 Describe the characteristic of your environment:
-* Describe how `transformer_lens` was installed (pip, docker, source, ...)
-* What OS are you using? (Linux, MacOS, Windows)
-* Python version (We suppourt 3.10 -3.12 currently)
+
+- Describe how `transformer_lens` was installed (pip, docker, source, ...)
+- What OS are you using? (Linux, MacOS, Windows)
+- Python version (We suppourt 3.10 -3.12 currently)
 
 **Additional context**
 Add any other context about the problem here.
 
 ### Checklist
 
-- [ ] I have checked that there is no similar [issue](https://github.com/jbloomAus/SAELens/issues) in the repo (**required**)
+- [ ] I have checked that there is no similar [issue](https://github.com/decoderesearch/SAELens/issues) in the repo (**required**)
````

.github/ISSUE_TEMPLATE/proposal.md (2 additions, 2 deletions)

````diff
@@ -4,7 +4,7 @@ about: Propose changes that are not bug fixes
 title: "[Proposal] Proposal title"
 ---
 
-### Proposal 
+### Proposal
 
 A clear and concise description of the proposal.
 
@@ -28,4 +28,4 @@ Add any other context or screenshots about the feature request here.
 
 ### Checklist
 
-- [ ] I have checked that there is no similar [issue](https://github.com/jbloomAus/SAELens/issues) in the repo (**required**)
+- [ ] I have checked that there is no similar [issue](https://github.com/decoderesearch/SAELens/issues) in the repo (**required**)
````

.github/workflows/build.yml (1 addition, 1 deletion)

````diff
@@ -108,7 +108,7 @@ jobs:
         uses: codecov/[email protected]
         with:
           token: ${{ secrets.CODECOV_TOKEN }}
-          slug: jbloomAus/SAELens
+          slug: decoderesearch/SAELens
 
   release:
     needs: build
````

docs/citation.md (1 addition, 1 deletion)

````diff
@@ -5,6 +5,6 @@
 title = {SAELens},
 author = {Bloom, Joseph and Tigges, Curt and Duong, Anthony and Chanin, David},
 year = {2024},
-howpublished = {\url{https://github.com/jbloomAus/SAELens}},
+howpublished = {\url{https://github.com/decoderesearch/SAELens}},
 }}
 ```
````

docs/contributing.md (3 additions, 4 deletions)

````diff
@@ -7,9 +7,9 @@ Contributions are welcome! To get setup for development, follow the instructions
 Make sure you have [poetry](https://python-poetry.org/) installed, clone the repository, and install dependencies with:
 
 ```bash
-git clone https://github.com/jbloomAus/SAELens.git # we recommend you make a fork for submitting PR's and clone that!
+git clone https://github.com/decoderesearch/SAELens.git # we recommend you make a fork for submitting PR's and clone that!
 poetry lock # can take a while.
-poetry install 
+poetry install
 make check-ci # validate the install
 ```
 
@@ -44,6 +44,5 @@ This project uses [mkdocs](https://www.mkdocs.org/) for documentation. You can s
 ```bash
 make docs-serve
 ```
-If you make changes to code which requires updating documentation, it would be greatly appreciated if you could update the docs as well.
-
 
+If you make changes to code which requires updating documentation, it would be greatly appreciated if you could update the docs as well.
````

docs/index.md (9 additions, 9 deletions)

````diff
@@ -3,15 +3,15 @@
 SAELens 6.0.0 is live with changes to SAE training and loading. Check out the [migration guide →](migrating)
 <!-- prettier-ignore-end -->
 
-<img width="1308" alt="Screenshot 2024-03-21 at 3 08 28 pm" src="https://github.com/jbloomAus/mats_sae_training/assets/69127271/209012ec-a779-4036-b4be-7b7739ea87f6">
+<img width="1308" height="532" alt="saes_pic" src="https://github.com/user-attachments/assets/2a5d752f-b261-4ee4-ad5d-ebf282321371" />
 
 # SAELens
 
 [![PyPI](https://img.shields.io/pypi/v/sae-lens?color=blue)](https://pypi.org/project/sae-lens/)
 [![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)
-[![build](https://github.com/jbloomAus/SAELens/actions/workflows/build.yml/badge.svg)](https://github.com/jbloomAus/SAELens/actions/workflows/build.yml)
-[![Deploy Docs](https://github.com/jbloomAus/SAELens/actions/workflows/deploy_docs.yml/badge.svg)](https://github.com/jbloomAus/SAELens/actions/workflows/deploy_docs.yml)
-[![codecov](https://codecov.io/gh/jbloomAus/SAELens/graph/badge.svg?token=N83NGH8CGE)](https://codecov.io/gh/jbloomAus/SAELens)
+[![build](https://github.com/decoderesearch/SAELens/actions/workflows/build.yml/badge.svg)](https://github.com/decoderesearch/SAELens/actions/workflows/build.yml)
+[![Deploy Docs](https://github.com/decoderesearch/SAELens/actions/workflows/deploy_docs.yml/badge.svg)](https://github.com/decoderesearch/SAELens/actions/workflows/deploy_docs.yml)
+[![codecov](https://codecov.io/gh/decoderesearch/SAELens/graph/badge.svg?token=N83NGH8CGE)](https://codecov.io/gh/decoderesearch/SAELens)
 
 The SAELens training codebase exists to help researchers:
 
@@ -59,7 +59,7 @@ sae = SAE.load_from_disk("/path/to/your/sae", device="cuda")
 
 ### Importing SAEs from other libraries
 
-You can import an SAE created with another library by writing a custom `PretrainedSaeHuggingfaceLoader` or `PretrainedSaeDiskLoader` for use with `SAE.from_pretrained()` or `SAE.load_from_disk()`, respectively. See the [pretrained_sae_loaders.py](https://github.com/jbloomAus/SAELens/blob/main/sae_lens/loading/pretrained_sae_loaders.py) file for more details, or ask on the [Open Source Mechanistic Interpretability Slack](https://join.slack.com/t/opensourcemechanistic/shared_invite/zt-375zalm04-GFd5tdBU1yLKlu_T_JSqZQ). If you write a good custom loader for another library, please consider contributing it back to SAELens!
+You can import an SAE created with another library by writing a custom `PretrainedSaeHuggingfaceLoader` or `PretrainedSaeDiskLoader` for use with `SAE.from_pretrained()` or `SAE.load_from_disk()`, respectively. See the [pretrained_sae_loaders.py](https://github.com/decoderesearch/SAELens/blob/main/sae_lens/loading/pretrained_sae_loaders.py) file for more details, or ask on the [Open Source Mechanistic Interpretability Slack](https://join.slack.com/t/opensourcemechanistic/shared_invite/zt-375zalm04-GFd5tdBU1yLKlu_T_JSqZQ). If you write a good custom loader for another library, please consider contributing it back to SAELens!
 
 ### Background and further Readings
 
@@ -71,9 +71,9 @@ For recent progress in SAEs, we recommend the LessWrong forum's [Sparse Autoenco
 
 I wrote a tutorial to show users how to do some basic exploration of their SAE:
 
-- Loading and Analysing Pre-Trained Sparse Autoencoders [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://githubtocolab.com/jbloomAus/SAELens/blob/main/tutorials/basic_loading_and_analysing.ipynb)
-- Understanding SAE Features with the Logit Lens [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://githubtocolab.com/jbloomAus/SAELens/blob/main/tutorials/logits_lens_with_features.ipynb)
-- Training a Sparse Autoencoder [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://githubtocolab.com/jbloomAus/SAELens/blob/main/tutorials/training_a_sparse_autoencoder.ipynb)
+- Loading and Analysing Pre-Trained Sparse Autoencoders [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://githubtocolab.com/decoderesearch/SAELens/blob/main/tutorials/basic_loading_and_analysing.ipynb)
+- Understanding SAE Features with the Logit Lens [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://githubtocolab.com/decoderesearch/SAELens/blob/main/tutorials/logits_lens_with_features.ipynb)
+- Training a Sparse Autoencoder [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://githubtocolab.com/decoderesearch/SAELens/blob/main/tutorials/training_a_sparse_autoencoder.ipynb)
 
 ## Example WandB Dashboard
 
@@ -88,6 +88,6 @@ WandB Dashboards provide lots of useful insights while training SAEs. Here's a s
 title = {SAELens},
 author = {Bloom, Joseph and Tigges, Curt and Duong, Anthony and Chanin, David},
 year = {2024},
-howpublished = {\url{https://github.com/jbloomAus/SAELens}},
+howpublished = {\url{https://github.com/decoderesearch/SAELens}},
 }}
 ```
````

docs/training_saes.md (5 additions, 5 deletions)

````diff
@@ -2,8 +2,8 @@
 
 Methods development for training SAEs is rapidly evolving, so these docs may change frequently. For all available training options, see the [LanguageModelSAERunnerConfig][sae_lens.LanguageModelSAERunnerConfig] and the architecture-specific configuration classes it uses (e.g., [StandardTrainingSAEConfig][sae_lens.StandardTrainingSAEConfig], [GatedTrainingSAEConfig][sae_lens.GatedTrainingSAEConfig], [JumpReLUTrainingSAEConfig][sae_lens.JumpReLUTrainingSAEConfig], and [TopKTrainingSAEConfig][sae_lens.TopKTrainingSAEConfig]).
 
-However, we are attempting to maintain this [tutorial](https://github.com/jbloomAus/SAELens/blob/main/tutorials/training_a_sparse_autoencoder.ipynb)
-[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://githubtocolab.com/jbloomAus/SAELens/blob/main/tutorials/training_a_sparse_autoencoder.ipynb).
+However, we are attempting to maintain this [tutorial](https://github.com/decoderesearch/SAELens/blob/main/tutorials/training_a_sparse_autoencoder.ipynb)
+[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://githubtocolab.com/decoderesearch/SAELens/blob/main/tutorials/training_a_sparse_autoencoder.ipynb).
 
 We encourage readers to join the [Open Source Mechanistic Interpretability Slack](https://join.slack.com/t/opensourcemechanistic/shared_invite/zt-375zalm04-GFd5tdBU1yLKlu_T_JSqZQ) for support!
 
@@ -35,7 +35,7 @@ Core options typically configured within the architecture-specific `sae` object
 - For TopK and BatchTopK SAEs: `k` (the number of features to keep active). Sparsity is enforced structurally.
 - `normalize_activations`: Strategy for normalizing activations before they enter the SAE (e.g., `"expected_average_only_in"`).
 
-A sample training run from the [tutorial](https://github.com/jbloomAus/SAELens/blob/main/tutorials/training_a_sparse_autoencoder.ipynb) is shown below. Note how SAE-specific parameters are nested within the `sae` field:
+A sample training run from the [tutorial](https://github.com/decoderesearch/SAELens/blob/main/tutorials/training_a_sparse_autoencoder.ipynb) is shown below. Note how SAE-specific parameters are nested within the `sae` field:
 
 ```python
 import torch
@@ -361,7 +361,7 @@ It's also possible to use pre-tokenized datasets to speed up training, since tok
 
 ## Pretokenizing datasets
 
-We also provider a runner, [PretokenizeRunner][sae_lens.PretokenizeRunner], which can be used to pre-tokenize a dataset and upload it to Huggingface. See [PretokenizeRunnerConfig][sae_lens.PretokenizeRunnerConfig] for all available options. We also provide a [pretokenizing datasets tutorial](https://github.com/jbloomAus/SAELens/blob/main/tutorials/pretokenizing_datasets.ipynb) with more details.
+We also provider a runner, [PretokenizeRunner][sae_lens.PretokenizeRunner], which can be used to pre-tokenize a dataset and upload it to Huggingface. See [PretokenizeRunnerConfig][sae_lens.PretokenizeRunnerConfig] for all available options. We also provide a [pretokenizing datasets tutorial](https://github.com/decoderesearch/SAELens/blob/main/tutorials/pretokenizing_datasets.ipynb) with more details.
 
 A sample run from the tutorial for GPT2 and the NeelNanda/c4-10k dataset is shown below.
 
@@ -429,7 +429,7 @@ To use the cached activations during training, set `use_cached_activations=True`
 
 ## Uploading SAEs to Huggingface
 
-Once you have a set of SAEs that you're happy with, your next step is to share them with the world! SAELens has a `upload_saes_to_huggingface()` function which makes this easy to do. We also provide a [uploading saes to huggingface tutorial](https://github.com/jbloomAus/SAELens/blob/main/tutorials/uploading_saes_to_huggingface.ipynb) with more details.
+Once you have a set of SAEs that you're happy with, your next step is to share them with the world! SAELens has a `upload_saes_to_huggingface()` function which makes this easy to do. We also provide a [uploading saes to huggingface tutorial](https://github.com/decoderesearch/SAELens/blob/main/tutorials/uploading_saes_to_huggingface.ipynb) with more details.
 
 You'll just need to pass a dictionary of SAEs to upload along with the huggingface repo id to upload to. The dictionary keys will become the folders in the repo where each SAE will be located. It's best practice to use the hook point that the SAE was trained on as the key to make it clear to users where in the model to apply the SAE. The values of this dictionary can be either an SAE object, or a path to a saved SAE object on disk from the `sae.save_model()` method.
````
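The uploading section in this diff keys each SAE by the hook point it was trained on. A minimal sketch of such a dictionary (the hook names and checkpoint paths are hypothetical, chosen only for illustration; the upload call is shown commented out since it requires `sae_lens` and a Huggingface repo):

```python
# Keys: hook points, so users know where in the model each SAE applies.
# Values: SAE objects or paths saved via sae.save_model(). Paths here are made up.
saes_dict = {
    "blocks.0.hook_resid_pre": "checkpoints/sae_layer0",
    "blocks.1.hook_resid_pre": "checkpoints/sae_layer1",
}

# As described in the docs (not run here; requires sae_lens installed):
# from sae_lens import upload_saes_to_huggingface
# upload_saes_to_huggingface(saes_dict, hf_repo_id="your-org/your-sae-repo")
print(sorted(saes_dict))
```

Each key becomes a folder in the uploaded repo, which is why hook-point names make the clearest keys.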

scripts/ansible/tasks/configure_ec2_instance.yml (1 addition, 1 deletion)

````diff
@@ -68,7 +68,7 @@
 
 - name: "Git checkout SAELens {{ saelens_version_or_branch }}"
   ansible.builtin.git:
-    repo: 'https://github.com/jbloomAus/SAELens.git'
+    repo: 'https://github.com/decoderesearch/SAELens.git'
     dest: /home/ubuntu/SAELens
     version: "{{ saelens_version_or_branch }}"
 
````
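Every hunk in this commit is the same mechanical substitution of the old GitHub org in repo URLs. A minimal sketch of that rewrite over a single string (illustrative only; the real commit was presumably done with an editor or search-and-replace across files):

```python
OLD = "github.com/jbloomAus/SAELens"
NEW = "github.com/decoderesearch/SAELens"

def rewrite_urls(text: str) -> str:
    # Narrow substitution: only the full repo URL path is replaced,
    # so unrelated mentions of either name are left untouched.
    return text.replace(OLD, NEW)

line = "repo: 'https://github.com/jbloomAus/SAELens.git'"
print(rewrite_urls(line))  # → repo: 'https://github.com/decoderesearch/SAELens.git'
```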

tutorials/Hooked_SAE_Transformer_Demo.ipynb (1 addition, 1 deletion)

````diff
@@ -45,7 +45,7 @@
     "\n",
     "    IN_COLAB = True\n",
     "    print(\"Running as a Colab notebook\")\n",
-    "    %pip install git+https://github.com/jbloomAus/SAELens\n",
+    "    %pip install git+https://github.com/decoderesearch/SAELens\n",
     "\n",
     "except:\n",
     "    IN_COLAB = False\n",
````
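The notebook cell in this hunk uses a try/except import to decide whether it is running in Colab before pip-installing from the new repo URL. A standalone sketch of that detection pattern (outside Colab the import fails and the flag stays False):

```python
# Detect Google Colab: its runtime ships the google.colab module,
# so a failed import means we are running locally.
try:
    import google.colab  # noqa: F401
    IN_COLAB = True
    print("Running as a Colab notebook")
    # In a notebook cell one would then run:
    # %pip install git+https://github.com/decoderesearch/SAELens
except ImportError:
    IN_COLAB = False
    print("Running locally")
```

Catching `ImportError` (rather than a bare `except:` as in the notebook) avoids masking unrelated errors.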
