SAELens
ArthurConmyGDM committed on
Commit 80643e6
1 Parent(s): 1ee5918

Update README.md

Files changed (1)
README.md +19 -2
README.md CHANGED
@@ -15,7 +15,20 @@ See our [landing page](https://huggingface.co/google/gemma-scope) for details on
 - `2b-pt-`: These SAEs were trained on Gemma v2 2B base model.
 - `att`: These SAEs were trained on the attention layer outputs, before the final linear projection.
 
-## 3. Point of Contact
+# 3. How can I use these SAEs straight away?
+
+```python
+from sae_lens import SAE  # pip install sae-lens
+
+sae, cfg_dict, sparsity = SAE.from_pretrained(
+    release = "gemma-scope-2b-pt-att-canonical",
+    sae_id = "layer_0/width_16k/canonical",
+)
+```
+
+See https://github.com/jbloomAus/SAELens for details on this library.
+
+# 4. Point of Contact
 
 Point of contact: Arthur Conmy
 
@@ -26,4 +39,8 @@ Contact by email:
 ```
 
 HuggingFace account:
 https://huggingface.co/ArthurConmyGDM
+
+# 5. Citation
+
+Paper: https://arxiv.org/abs/2408.05147
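
The `SAE.from_pretrained` call added in this commit downloads real weights from the Hub. For readers without network access, the computation an SAE performs can be sketched with toy weights. This is a minimal illustration only: the sizes, the random weights, and the plain ReLU nonlinearity are assumptions for the sketch (Gemma Scope SAEs actually use a JumpReLU activation and trained parameters).

```python
import numpy as np

# Toy sketch of a sparse autoencoder's forward pass.
# All names and sizes here are illustrative, not the released Gemma Scope weights.
rng = np.random.default_rng(0)
d_model, d_sae = 8, 32  # toy dims; real SAEs are much wider (e.g. width_16k)

W_enc = rng.standard_normal((d_model, d_sae))
b_enc = np.zeros(d_sae)
W_dec = rng.standard_normal((d_sae, d_model))
b_dec = np.zeros(d_model)

x = rng.standard_normal(d_model)           # a model activation vector
acts = np.maximum(x @ W_enc + b_enc, 0.0)  # feature activations (ReLU here)
recon = acts @ W_dec + b_dec               # reconstruction of x

# With random weights roughly half the features fire; trained SAEs add a
# sparsity penalty so that only a few features are active per input.
print(acts.shape, recon.shape)
```

The object returned by `SAE.from_pretrained` wraps this same encode/decode structure around the trained weights for the chosen layer and width.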