Andyrasika/finetuned-gpt2_dolly_lite
Tags: Text2Text Generation · Transformers · PyTorch · English · gpt2 · text-generation · gpt · text-generation-inference · Inference Endpoints · Generated from Trainer
Dataset: tatsu-lab/alpaca
License: apache-2.0
Branch: refs/pr/1
finetuned-gpt2_dolly_lite · 1 contributor · History: 8 commits
Latest commit: SFconvertbot, "Adding `safetensors` variant of this model" (12ec85e, about 1 year ago)
checkpoint-1300/          Upload folder using huggingface_hub           about 1 year ago
checkpoint-2600/          Upload folder using huggingface_hub           about 1 year ago
checkpoint-3900/          Upload folder using huggingface_hub           about 1 year ago
.gitattributes            Safe            1.52 kB       initial commit                                about 1 year ago
README.md                 Safe            2.85 kB       Update README.md                              about 1 year ago
config.json               Safe            1.01 kB       Upload folder using huggingface_hub           about 1 year ago
generation_config.json    Safe            119 Bytes     Upload folder using huggingface_hub           about 1 year ago
merges.txt                Safe            456 kB        Upload tokenizer                              about 1 year ago
model.safetensors         Safe            328 MB (LFS)  Adding `safetensors` variant of this model    about 1 year ago
pytorch_model.bin         Safe (pickle)   328 MB (LFS)  Upload folder using huggingface_hub           about 1 year ago
special_tokens_map.json   Safe            131 Bytes     Upload tokenizer                              about 1 year ago
tokenizer.json            Safe            2.11 MB       Upload tokenizer                              about 1 year ago
tokenizer_config.json     Safe            234 Bytes     Upload tokenizer                              about 1 year ago
training_args.bin         pickle          4.03 kB (LFS) Upload folder using huggingface_hub           about 1 year ago
vocab.json                Safe            798 kB        Upload tokenizer                              about 1 year ago

Detected pickle imports:
- pytorch_model.bin (3): "collections.OrderedDict", "torch._utils._rebuild_tensor_v2", "torch.FloatStorage"
- training_args.bin (8): "transformers.trainer_utils.SchedulerType", "torch.device", "transformers.training_args.OptimizerNames", "transformers.trainer_utils.IntervalStrategy", "accelerate.state.PartialState", "accelerate.utils.dataclasses.DistributedType", "transformers.training_args.TrainingArguments", "transformers.trainer_utils.HubStrategy"
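The "Detected pickle imports" entries above come from the Hub's pickle scanner, which walks a pickle file's opcode stream and records every global reference the file would import when unpickled. A minimal sketch of that idea using only the Python standard library (the function name `pickle_imports` is illustrative, not a Hub API; protocol >= 4 pickles use `STACK_GLOBAL` instead of `GLOBAL` and would need extra stack tracking):

```python
import pickle
import pickletools
from collections import OrderedDict

def pickle_imports(data: bytes) -> set[str]:
    """Collect 'module.name' globals referenced by a pickle byte stream."""
    imports = set()
    for opcode, arg, _pos in pickletools.genops(data):
        if opcode.name == "GLOBAL":
            # For protocol <= 3, arg is the string "module name".
            module, name = arg.split(" ", 1)
            imports.add(f"{module}.{name}")
    return imports

# An OrderedDict pickled at protocol 2 references exactly one global,
# the same first entry the scanner reports for pytorch_model.bin.
data = pickle.dumps(OrderedDict(a=1), protocol=2)
print(pickle_imports(data))  # {'collections.OrderedDict'}
```

This is why the scanner can flag files like training_args.bin without executing them: the referenced globals are visible in the byte stream itself, before any `pickle.load` call runs.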