# CodeParrot

CodeParrot (large) is a 1.5B-parameter GPT-2 model trained on the CodeParrot Python code dataset. The model is trained in Chapter 10: Training Transformers from Scratch of the NLP with Transformers book. You can find the full training code in the accompanying GitHub repository.
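
Below is a minimal sketch of generating Python code with the model via the 🤗 Transformers pipeline API. The model id `codeparrot/codeparrot`, the prompt, and the generation parameters are assumptions for illustration, not taken from this README.

```python
# Minimal sketch: code completion with CodeParrot using the text-generation pipeline.
# The model id "codeparrot/codeparrot" is assumed; check the Hub page for the exact name.
from transformers import pipeline

pipe = pipeline("text-generation", model="codeparrot/codeparrot")

prompt = "def fibonacci(n):"
outputs = pipe(prompt, max_new_tokens=64, num_return_sequences=1)
print(outputs[0]["generated_text"])
```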