fix model path
README.md
@@ -57,8 +57,8 @@ This model can be easily loaded using the `AutoModelForCausalLM` functionality:
 ```python
 from transformers import AutoTokenizer, AutoModelForCausalLM

-tokenizer = AutoTokenizer.from_pretrained("
-model = AutoModelForCausalLM.from_pretrained("gpt-j-6B")
+tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-j-6B")
+model = AutoModelForCausalLM.from_pretrained("EleutherAI/gpt-j-6B")
 ```

 ### Limitations and Biases
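For context beyond the diff, a minimal sketch of how the corrected snippet would be exercised end to end. The prompt, `max_new_tokens` value, and memory note are illustrative assumptions, not part of the README:

```python
# Quick sanity check of the corrected model path.
# Assumption: the ~6B-parameter checkpoint needs roughly 24 GB of memory in
# float32, so adjust device/dtype to your hardware.
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-j-6B")
model = AutoModelForCausalLM.from_pretrained("EleutherAI/gpt-j-6B")

# Hypothetical prompt; max_new_tokens chosen arbitrarily for the example.
inputs = tokenizer("EleutherAI is a", return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```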