---
library_name: transformers
tags: []
---
# Model Card for GPT-2 trained with llm.c
This is a GPT-2 model trained with llm.c for 32K steps (with a batch size of ~1M tokens) on the FineWeb-EDU dataset.
Much more detailed information on the training run is available in the llm.c discussion: https://github.com/karpathy/llm.c/discussions/677
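
## How to Use

Since the checkpoint is in Hugging Face `transformers` format (see `library_name` above), it can be loaded like any other GPT-2 model. The snippet below is a minimal sketch; the repository id is a placeholder and should be replaced with this model's actual id on the Hub.

```python
# Minimal usage sketch with the transformers library.
# NOTE: "your-username/gpt2-llmc-fineweb-edu" is a placeholder repo id, not this model's real id.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-username/gpt2-llmc-fineweb-edu"  # placeholder
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Generate a short continuation with sampling.
inputs = tokenizer("In a shocking finding, scientists discovered", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50, do_sample=True, top_k=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```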
## Bias, Risks, and Limitations
Eagerly generates disinformation about English-speaking unicorns in the Andes mountains.