---
license: apache-2.0
---

# 1. GemmaScope

GemmaScope is a collection of sparse autoencoders (SAEs) trained on the internal activations of Gemma v2 models.

# 2. What Is `gemmascope-9b-pt-res`?

- `gemmascope-`: See Section 1.
- `9b-pt-`: These SAEs were trained on the Gemma v2 9B base (pretrained) model (TODO link).
- `res`: These SAEs were trained on activations from the residual stream.
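For illustration, a minimal loading sketch using the SAELens library is below. The `release` and `sae_id` strings are assumptions, not confirmed identifiers for this repo; check SAELens's directory of pretrained SAEs for the exact names.

```python
# A minimal loading sketch using SAELens. The release and sae_id
# strings are illustrative assumptions, not confirmed identifiers.
from sae_lens import SAE

sae, cfg_dict, sparsity = SAE.from_pretrained(
    release="gemma-scope-9b-pt-res",            # assumed release name
    sae_id="layer_20/width_16k/average_l0_68",  # illustrative SAE id
    device="cpu",
)
```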
# 3. GTM FAQ (TODO(conmy): delete for main rollout)

Q1: Why does this model exist in `gg-hf`?

A1: See https://docs.google.com/document/d/1bKaOw2mJPJDYhgFQGGVOyBB3M4Bm_Q3PMrfQeqeYi0M (Google internal only).

Q2: What does "SAE" mean?

A2: Sparse Autoencoder. See https://docs.google.com/document/d/1roMgCPMPEQgaNbCu15CGo966xRLToulCBQUVKVGvcfM (should be accessible to trusted HuggingFace collaborators, and to Google).
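For intuition, here is a toy sketch of a standard ReLU sparse autoencoder applied to residual-stream activations. The dimensions and the plain-ReLU activation are illustrative assumptions, not the exact GemmaScope architecture:

```python
# Toy sketch of a standard ReLU SAE; dimensions are illustrative
# assumptions (e.g. d_model=3584 for Gemma v2 9B's residual stream).
import torch
import torch.nn as nn

class SparseAutoencoder(nn.Module):
    """Reconstructs activations through a wide, sparsely-active bottleneck."""

    def __init__(self, d_model: int, d_sae: int):
        super().__init__()
        self.W_enc = nn.Parameter(torch.randn(d_model, d_sae) * 0.01)
        self.b_enc = nn.Parameter(torch.zeros(d_sae))
        self.W_dec = nn.Parameter(torch.randn(d_sae, d_model) * 0.01)
        self.b_dec = nn.Parameter(torch.zeros(d_model))

    def forward(self, x: torch.Tensor):
        # Encode: sparse feature activations (most entries are zero).
        f = torch.relu((x - self.b_dec) @ self.W_enc + self.b_enc)
        # Decode: reconstruct the original activation vector.
        x_hat = f @ self.W_dec + self.b_dec
        return x_hat, f

sae = SparseAutoencoder(d_model=3584, d_sae=16384)
x = torch.randn(2, 3584)  # stand-in for residual-stream activations
x_hat, f = sae(x)
```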

TODO(conmy): remove this when making the main repo.

# 4. Point of Contact

Point of contact: Arthur Conmy

Contact by email:

```python
# Reverse the obfuscated string to recover the email address.
print('moc.elgoog@ymnoc'[::-1])
```

HuggingFace account:
https://huggingface.co/ArthurConmyGDM