# CLIP Sparse Autoencoder Checkpoint

This model is a sparse autoencoder (SAE) trained on the internal representations of a CLIP vision transformer.

## Model Details

### Architecture
- **Layer**: 3
- **Layer Type**: hook_resid_post
- **Model**: open-clip:laion/CLIP-ViT-B-32-DataComp.XL-s13B-b90K
- **Dictionary Size**: 49152
- **Input Dimension**: 768
- **Expansion Factor**: 64
- **CLS Token Only**: True

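The dictionary size above follows from the input dimension and expansion factor (768 × 64 = 49152). As a rough illustration, a standard SAE forward pass with these dimensions might look like the NumPy sketch below (the parameter names, initialization, and ReLU encoder are assumptions, not taken from this checkpoint):

```python
import numpy as np

# Sketch only: the checkpoint's real parameter names, initialization, and any
# bias/normalization details are assumptions, not read from this repo.
d_in, expansion = 768, 64
d_sae = d_in * expansion  # 49152, matching the dictionary size above

rng = np.random.default_rng(42)
W_enc = 0.02 * rng.standard_normal((d_in, d_sae), dtype=np.float32)
b_enc = np.zeros(d_sae, dtype=np.float32)
W_dec = 0.02 * rng.standard_normal((d_sae, d_in), dtype=np.float32)
b_dec = np.zeros(d_in, dtype=np.float32)

def sae_forward(x):
    """Encode to sparse feature activations, then reconstruct the input."""
    feats = np.maximum(x @ W_enc + b_enc, 0.0)  # ReLU gives non-negative, sparse codes
    recon = feats @ W_dec + b_dec
    return feats, recon

# One CLS-token activation per image (context size 1, CLS token only)
x = rng.standard_normal((1, d_in), dtype=np.float32)
feats, recon = sae_forward(x)
print(feats.shape, recon.shape)  # (1, 49152) (1, 768)
```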
### Training
- **Training Images**: 183087104
- **Learning Rate**: 0.0002
- **L1 Coefficient**: 0.3000
- **Batch Size**: 4096
- **Context Size**: 1

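SAEs of this kind are typically trained to minimize a reconstruction error plus an L1 penalty on the feature activations, weighted by the L1 coefficient above. A sketch of such an objective (the exact reduction and normalization used for this run are assumptions):

```python
import numpy as np

# Sketch of the usual SAE training objective; the precise loss used for this
# checkpoint (reduction, normalization) is an assumption.
def sae_loss(x, recon, feats, l1_coeff=0.3):
    """MSE reconstruction loss plus an L1 sparsity penalty on the features."""
    mse = np.mean((x - recon) ** 2)
    l1 = np.mean(np.sum(np.abs(feats), axis=-1))  # average L1 norm per sample
    return mse + l1_coeff * l1

# Perfectly sparse features and a slightly-off reconstruction
x = np.ones((4, 768))
recon = 0.9 * np.ones((4, 768))
feats = np.zeros((4, 49152))
print(round(sae_loss(x, recon, feats), 4))  # 0.01
```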
## Performance Metrics

### Sparsity
- **L0 (Active Features)**: 64
- **Dead Features**: 29335
- **Mean Log10 Feature Sparsity**: -9.2914
- **Features Below 1e-5**: 48859
- **Features Below 1e-6**: 41691
- **Mean Passes Since Fired**: 16714.2930

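Statistics like these can be recomputed from feature activations on a held-out batch. A sketch of common definitions for L0, log10 feature sparsity, and dead-feature counts (the precise definitions behind the numbers above are assumptions):

```python
import numpy as np

# Sketch of common sparsity diagnostics; the exact thresholds and definitions
# used for this checkpoint are assumptions.
def sparsity_stats(feats, eps=1e-10):
    """feats: (n_samples, d_sae) array of non-negative SAE activations."""
    l0 = np.mean(np.count_nonzero(feats > 0, axis=1))  # avg active features per sample
    firing_freq = np.mean(feats > 0, axis=0)           # per-feature firing rate
    mean_log_sparsity = np.mean(np.log10(firing_freq + eps))
    dead = int(np.sum(firing_freq == 0))               # features that never fire
    below_1e5 = int(np.sum(firing_freq < 1e-5))
    return l0, mean_log_sparsity, dead, below_1e5

# Toy batch: feature 0 always fires, feature 1 fires half the time, rest are dead
feats = np.zeros((100, 8))
feats[:, 0] = 1.0
feats[:50, 1] = 1.0
l0, mean_log, dead, below = sparsity_stats(feats)
print(l0, dead, below)  # 1.5 6 6
```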
### Reconstruction
- **Explained Variance**: 0.9595
- **Explained Variance Std**: 0.0150
- **MSE Loss**: 0.0001
- **L1 Loss**: 0
- **Overall Loss**: 0.0001

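One common way to compute the explained-variance metric reported above (the exact formulation used during training is an assumption):

```python
import numpy as np

# One common definition of explained variance for SAE reconstructions;
# the exact formulation used for this checkpoint is an assumption.
def explained_variance(x, recon):
    """1 - Var(residual) / Var(input), per sample, then averaged."""
    resid_var = np.var(x - recon, axis=-1)
    total_var = np.var(x, axis=-1)
    return float(np.mean(1.0 - resid_var / total_var))

rng = np.random.default_rng(0)
x = rng.standard_normal((16, 768))
print(explained_variance(x, x))        # 1.0 for a perfect reconstruction
print(explained_variance(x, 0.5 * x))  # 0.75: the residual keeps a quarter of the variance
```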
## Training Details
- **Training Duration**: 17942.3141 seconds (~5 hours)
- **Final Learning Rate**: 0.0002
- **Warm Up Steps**: 200
- **Gradient Clipping**: 1

## Additional Information
- **Weights & Biases Run**: https://wandb.ai/perceptual-alignment/clip/runs/34j4mtdy
- **Original Checkpoint Path**: /network/scratch/s/sonia.joseph/checkpoints/clip-b
- **Random Seed**: 42