---
language: en
datasets:
- common_voice
- mozilla-foundation/common_voice_6_0
metrics:
- wer
- cer
tags:
- audio
- automatic-speech-recognition
- en
- hf-asr-leaderboard
- mozilla-foundation/common_voice_6_0
- robust-speech-event
- speech
- xlsr-fine-tuning-week
license: apache-2.0
model-index:
- name: XLSR Wav2Vec2 English by Jonatas Grosman
results:
- task:
name: Automatic Speech Recognition
type: automatic-speech-recognition
dataset:
name: Common Voice en
type: common_voice
args: en
metrics:
- name: Test WER
type: wer
value: 19.06
- name: Test CER
type: cer
value: 7.69
- name: Test WER (+LM)
type: wer
value: 14.81
- name: Test CER (+LM)
type: cer
value: 6.84
- task:
name: Automatic Speech Recognition
type: automatic-speech-recognition
dataset:
name: Robust Speech Event - Dev Data
type: speech-recognition-community-v2/dev_data
args: en
metrics:
- name: Dev WER
type: wer
value: 27.72
- name: Dev CER
type: cer
value: 11.65
- name: Dev WER (+LM)
type: wer
value: 20.85
- name: Dev CER (+LM)
type: cer
value: 11.01
---
# Fine-tuned XLSR-53 large model for speech recognition in English
Fine-tuned [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on English using the train and validation splits of [Common Voice 6.1](https://huggingface.co/datasets/common_voice).
When using this model, make sure that your speech input is sampled at 16 kHz.
This model has been fine-tuned thanks to the GPU credits generously given by the [OVHcloud](https://www.ovhcloud.com/en/public-cloud/ai-training/) :)
The script used for training can be found here: https://github.com/jonatasgrosman/wav2vec2-sprint
## Usage
The model can be used directly (without a language model) as follows.
Using the [HuggingSound](https://github.com/jonatasgrosman/huggingsound) library:
```python
from huggingsound import SpeechRecognitionModel
model = SpeechRecognitionModel("jonatasgrosman/wav2vec2-large-xlsr-53-english")
audio_paths = ["/path/to/file.mp3", "/path/to/another_file.wav"]
transcriptions = model.transcribe(audio_paths)
```
Writing your own inference script:
```python
import torch
import librosa
from datasets import load_dataset
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
LANG_ID = "en"
MODEL_ID = "jonatasgrosman/wav2vec2-large-xlsr-53-english"
SAMPLES = 10
test_dataset = load_dataset("common_voice", LANG_ID, split=f"test[:{SAMPLES}]")
processor = Wav2Vec2Processor.from_pretrained(MODEL_ID)
model = Wav2Vec2ForCTC.from_pretrained(MODEL_ID)
# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
    speech_array, sampling_rate = librosa.load(batch["path"], sr=16_000)
    batch["speech"] = speech_array
    batch["sentence"] = batch["sentence"].upper()
    return batch

test_dataset = test_dataset.map(speech_file_to_array_fn)
inputs = processor(test_dataset["speech"], sampling_rate=16_000, return_tensors="pt", padding=True)

with torch.no_grad():
    logits = model(inputs.input_values, attention_mask=inputs.attention_mask).logits

predicted_ids = torch.argmax(logits, dim=-1)
predicted_sentences = processor.batch_decode(predicted_ids)

for i, predicted_sentence in enumerate(predicted_sentences):
    print("-" * 100)
    print("Reference:", test_dataset[i]["sentence"])
    print("Prediction:", predicted_sentence)
```
| Reference | Prediction |
| ------------- | ------------- |
| "SHE'LL BE ALL RIGHT." | SHE'LL BE ALL RIGHT |
| SIX | SIX |
| "ALL'S WELL THAT ENDS WELL." | ALL AS WELL THAT ENDS WELL |
| DO YOU MEAN IT? | DO YOU MEAN IT |
| THE NEW PATCH IS LESS INVASIVE THAN THE OLD ONE, BUT STILL CAUSES REGRESSIONS. | THE NEW PATCH IS LESS INVASIVE THAN THE OLD ONE BUT STILL CAUSES REGRESSION |
| HOW IS MOZILLA GOING TO HANDLE AMBIGUITIES LIKE QUEUE AND CUE? | HOW IS MOSLILLAR GOING TO HANDLE ANDBEWOOTH HIS LIKE Q AND Q |
| "I GUESS YOU MUST THINK I'M KINDA BATTY." | RUSTIAN WASTIN PAN ONTE BATTLY |
| NO ONE NEAR THE REMOTE MACHINE YOU COULD RING? | NO ONE NEAR THE REMOTE MACHINE YOU COULD RING |
| SAUCE FOR THE GOOSE IS SAUCE FOR THE GANDER. | SAUCE FOR THE GUICE IS SAUCE FOR THE GONDER |
| GROVES STARTED WRITING SONGS WHEN SHE WAS FOUR YEARS OLD. | GRAFS STARTED WRITING SONGS WHEN SHE WAS FOUR YEARS OLD |
## Evaluation
1. To evaluate on `mozilla-foundation/common_voice_6_0` with split `test`
```bash
python eval.py --model_id jonatasgrosman/wav2vec2-large-xlsr-53-english --dataset mozilla-foundation/common_voice_6_0 --config en --split test
```
2. To evaluate on `speech-recognition-community-v2/dev_data`
```bash
python eval.py --model_id jonatasgrosman/wav2vec2-large-xlsr-53-english --dataset speech-recognition-community-v2/dev_data --config en --split validation --chunk_length_s 5.0 --stride_length_s 1.0
```
## Citation
If you want to cite this model, you can use:
```bibtex
@misc{grosman2021xlsr53-large-english,
title={Fine-tuned {XLSR}-53 large model for speech recognition in {E}nglish},
author={Grosman, Jonatas},
howpublished={\url{https://huggingface.co/jonatasgrosman/wav2vec2-large-xlsr-53-english}},
year={2021}
}
```
---
language: en
tags:
- exbert
license: apache-2.0
datasets:
- bookcorpus
- wikipedia
---
# BERT base model (uncased)
Pretrained model on English language using a masked language modeling (MLM) objective. It was introduced in
[this paper](https://arxiv.org/abs/1810.04805) and first released in
[this repository](https://github.com/google-research/bert). This model is uncased: it does not make a difference
between english and English.
Disclaimer: The team releasing BERT did not write a model card for this model so this model card has been written by
the Hugging Face team.
## Model description
BERT is a transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it
was pretrained on the raw texts only, with no humans labeling them in any way (which is why it can use lots of
publicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it
was pretrained with two objectives:
- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input, then runs
the entire masked sentence through the model and has to predict the masked words. This is different from traditional
recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like
GPT which internally masks the future tokens. It allows the model to learn a bidirectional representation of the
sentence.
- Next sentence prediction (NSP): the model concatenates two masked sentences as inputs during pretraining. Sometimes
they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to
predict if the two sentences were following each other or not.
This way, the model learns an inner representation of the English language that can then be used to extract features
useful for downstream tasks: if you have a dataset of labeled sentences, for instance, you can train a standard
classifier using the features produced by the BERT model as inputs.
## Model variations
BERT was originally released in base and large variations, for cased and uncased input text. The uncased models also strip out accent markers.
Chinese and multilingual uncased and cased versions followed shortly after.
A follow-up release replaced subpiece masking with whole-word masking in the preprocessing, adding two further models.
Twenty-four smaller models were released afterward.
The detailed release history can be found on the [google-research/bert readme](https://github.com/google-research/bert/blob/master/README.md) on github.
| Model | #params | Language |
|------------------------|--------------------------------|-------|
| [`bert-base-uncased`](https://huggingface.co/bert-base-uncased) | 110M | English |
| [`bert-large-uncased`](https://huggingface.co/bert-large-uncased) | 340M | English |
| [`bert-base-cased`](https://huggingface.co/bert-base-cased) | 110M | English |
| [`bert-large-cased`](https://huggingface.co/bert-large-cased) | 340M | English |
| [`bert-base-chinese`](https://huggingface.co/bert-base-chinese) | 110M | Chinese |
| [`bert-base-multilingual-cased`](https://huggingface.co/bert-base-multilingual-cased) | 110M | Multiple |
| [`bert-large-uncased-whole-word-masking`](https://huggingface.co/bert-large-uncased-whole-word-masking) | 340M | English |
| [`bert-large-cased-whole-word-masking`](https://huggingface.co/bert-large-cased-whole-word-masking) | 340M | English |
## Intended uses & limitations
You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to
be fine-tuned on a downstream task. See the [model hub](https://huggingface.co/models?filter=bert) to look for
fine-tuned versions on a task that interests you.
Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)
to make decisions, such as sequence classification, token classification or question answering. For tasks such as text
generation you should look at models like GPT-2.
### How to use
You can use this model directly with a pipeline for masked language modeling:
```python
>>> from transformers import pipeline
>>> unmasker = pipeline('fill-mask', model='bert-base-uncased')
>>> unmasker("Hello I'm a [MASK] model.")
[{'sequence': "[CLS] hello i'm a fashion model. [SEP]",
'score': 0.1073106899857521,
'token': 4827,
'token_str': 'fashion'},
{'sequence': "[CLS] hello i'm a role model. [SEP]",
'score': 0.08774490654468536,
'token': 2535,
'token_str': 'role'},
{'sequence': "[CLS] hello i'm a new model. [SEP]",
'score': 0.05338378623127937,
'token': 2047,
'token_str': 'new'},
{'sequence': "[CLS] hello i'm a super model. [SEP]",
'score': 0.04667217284440994,
'token': 3565,
'token_str': 'super'},
{'sequence': "[CLS] hello i'm a fine model. [SEP]",
'score': 0.027095865458250046,
'token': 2986,
'token_str': 'fine'}]
```
Here is how to use this model to get the features of a given text in PyTorch:
```python
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('bert-base-uncased')
model = BertModel.from_pretrained("bert-base-uncased")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
and in TensorFlow:
```python
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('bert-base-uncased')
model = TFBertModel.from_pretrained("bert-base-uncased")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
### Limitations and bias
Even if the training data used for this model could be characterized as fairly neutral, this model can have biased
predictions:
```python
>>> from transformers import pipeline
>>> unmasker = pipeline('fill-mask', model='bert-base-uncased')
>>> unmasker("The man worked as a [MASK].")
[{'sequence': '[CLS] the man worked as a carpenter. [SEP]',
'score': 0.09747550636529922,
'token': 10533,
'token_str': 'carpenter'},
{'sequence': '[CLS] the man worked as a waiter. [SEP]',
'score': 0.0523831807076931,
'token': 15610,
'token_str': 'waiter'},
{'sequence': '[CLS] the man worked as a barber. [SEP]',
'score': 0.04962705448269844,
'token': 13362,
'token_str': 'barber'},
{'sequence': '[CLS] the man worked as a mechanic. [SEP]',
'score': 0.03788609802722931,
'token': 15893,
'token_str': 'mechanic'},
{'sequence': '[CLS] the man worked as a salesman. [SEP]',
'score': 0.037680890411138535,
'token': 18968,
'token_str': 'salesman'}]
>>> unmasker("The woman worked as a [MASK].")
[{'sequence': '[CLS] the woman worked as a nurse. [SEP]',
'score': 0.21981462836265564,
'token': 6821,
'token_str': 'nurse'},
{'sequence': '[CLS] the woman worked as a waitress. [SEP]',
'score': 0.1597415804862976,
'token': 13877,
'token_str': 'waitress'},
{'sequence': '[CLS] the woman worked as a maid. [SEP]',
'score': 0.1154729500412941,
'token': 10850,
'token_str': 'maid'},
{'sequence': '[CLS] the woman worked as a prostitute. [SEP]',
'score': 0.037968918681144714,
'token': 19215,
'token_str': 'prostitute'},
{'sequence': '[CLS] the woman worked as a cook. [SEP]',
'score': 0.03042375110089779,
'token': 5660,
'token_str': 'cook'}]
```
This bias will also affect all fine-tuned versions of this model.
## Training data
The BERT model was pretrained on [BookCorpus](https://yknzhu.wixsite.com/mbweb), a dataset consisting of 11,038
unpublished books and [English Wikipedia](https://en.wikipedia.org/wiki/English_Wikipedia) (excluding lists, tables and
headers).
## Training procedure
### Preprocessing
The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are
then of the form:
```
[CLS] Sentence A [SEP] Sentence B [SEP]
```
With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus, and in
the other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a
consecutive span of text, usually longer than a single sentence. The only constraint is that the two
"sentences" have a combined length of less than 512 tokens.
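The pair construction described above can be sketched as follows (a simplified illustration with hypothetical helper names and toy text spans, not the original TensorFlow preprocessing):

```python
import random

def make_nsp_pair(doc_spans, all_spans, rng=random.Random(0)):
    """Pick span A from a document; with probability 0.5 take the span
    that actually follows it (label 0 = IsNext), otherwise a random
    span from the corpus (label 1 = NotNext)."""
    i = rng.randrange(len(doc_spans) - 1)
    a = doc_spans[i]
    if rng.random() < 0.5:
        return a, doc_spans[i + 1], 0  # genuine next span
    return a, rng.choice(all_spans), 1  # random span from the corpus

doc = ["span one", "span two", "span three"]
corpus = doc + ["unrelated span"]
a, b, label = make_nsp_pair(doc, corpus)
```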
The details of the masking procedure for each sentence are the following:
- 15% of the tokens are masked.
- In 80% of the cases, the masked tokens are replaced by `[MASK]`.
- In 10% of the cases, the masked tokens are replaced by a random token (different from the one they replace).
- In the 10% remaining cases, the masked tokens are left as is.
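The 80/10/10 rule above can be sketched as follows (a simplified illustration, not the original preprocessing code; the label convention of `-100` for ignored positions follows common PyTorch practice and is an assumption here):

```python
import random

MASK_ID = 103  # id of [MASK] in the bert-base-uncased vocabulary

def mask_tokens(token_ids, vocab_size=30_000, mask_prob=0.15, seed=0):
    """Return (masked_ids, labels) following BERT's 80/10/10 rule."""
    rng = random.Random(seed)
    masked = list(token_ids)
    labels = [-100] * len(token_ids)  # -100 = position ignored by the loss
    for i, tok in enumerate(token_ids):
        if rng.random() >= mask_prob:
            continue            # token not selected for masking
        labels[i] = tok         # the model must predict the original token
        roll = rng.random()
        if roll < 0.8:
            masked[i] = MASK_ID                    # 80%: replace with [MASK]
        elif roll < 0.9:
            masked[i] = rng.randrange(vocab_size)  # 10%: random token
        # remaining 10%: leave the token unchanged
    return masked, labels

ids = list(range(1000, 1020))
masked, labels = mask_tokens(ids)
```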
### Pretraining
The model was trained on 4 cloud TPUs in Pod configuration (16 TPU chips total) for one million steps with a batch size
of 256. The sequence length was limited to 128 tokens for 90% of the steps and 512 for the remaining 10%. The optimizer
used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01,
learning rate warmup for 10,000 steps and linear decay of the learning rate after.
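The warmup-then-linear-decay schedule can be sketched as a function of the step count (an illustrative sketch of the schedule described above, not the original training code):

```python
def bert_lr(step, base_lr=1e-4, warmup=10_000, total=1_000_000):
    """Linear warmup for `warmup` steps, then linear decay to zero."""
    if step < warmup:
        return base_lr * (step / warmup)
    return base_lr * ((total - step) / (total - warmup))

print(bert_lr(5_000))      # 5e-05  (halfway through warmup)
print(bert_lr(10_000))     # 0.0001 (peak learning rate)
print(bert_lr(1_000_000))  # 0.0    (end of training)
```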
## Evaluation results
When fine-tuned on downstream tasks, this model achieves the following results:
GLUE test results:
| Task | MNLI-(m/mm) | QQP | QNLI | SST-2 | CoLA | STS-B | MRPC | RTE | Average |
|:----:|:-----------:|:----:|:----:|:-----:|:----:|:-----:|:----:|:----:|:-------:|
| | 84.6/83.4 | 71.2 | 90.5 | 93.5 | 52.1 | 85.8 | 88.9 | 66.4 | 79.6 |
### BibTeX entry and citation info
```bibtex
@article{DBLP:journals/corr/abs-1810-04805,
author = {Jacob Devlin and
Ming{-}Wei Chang and
Kenton Lee and
Kristina Toutanova},
title = {{BERT:} Pre-training of Deep Bidirectional Transformers for Language
Understanding},
journal = {CoRR},
volume = {abs/1810.04805},
year = {2018},
url = {http://arxiv.org/abs/1810.04805},
archivePrefix = {arXiv},
eprint = {1810.04805},
timestamp = {Tue, 30 Oct 2018 20:39:56 +0100},
biburl = {https://dblp.org/rec/journals/corr/abs-1810-04805.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
<a href="https://huggingface.co/exbert/?model=bert-base-uncased">
<img width="300px" src="https://cdn-media.huggingface.co/exbert/button.png">
</a>
| [
-0.13803230226039886,
-0.6192845106124878,
0.1602371782064438,
0.3106997609138489,
-0.5290508270263672,
0.004081550054252148,
-0.12402311712503433,
-0.2276398241519928,
0.45079508423805237,
0.5585923194885254,
-0.5558132529258728,
-0.44755035638809204,
-0.7645061612129211,
0.14166030287742615,
-0.4733649790287018,
1.148371696472168,
0.2688816487789154,
0.30547770857810974,
0.06562762707471848,
0.1766130030155182,
-0.4065377414226532,
-0.8636099100112915,
-0.7098984718322754,
-0.3355167508125305,
0.44625410437583923,
0.301130086183548,
0.6639394760131836,
0.5381333231925964,
0.4197166860103607,
0.4168151021003723,
-0.05249226838350296,
-0.07084067910909653,
-0.27731427550315857,
0.06322961300611496,
-0.00021178112365305424,
-0.5025051832199097,
-0.4444476366043091,
0.20064224302768707,
0.5259288549423218,
0.8548062443733215,
-0.010065288282930851,
0.24754247069358826,
-0.10088975727558136,
0.5585716366767883,
-0.16587615013122559,
0.3540886342525482,
-0.5316713452339172,
0.06025778129696846,
-0.26823943853378296,
0.24264734983444214,
-0.3985133469104767,
-0.20217566192150116,
0.18782168626785278,
-0.5243902206420898,
0.2082635760307312,
0.11077828705310822,
1.1502846479415894,
0.0771172046661377,
-0.17582803964614868,
-0.12339440733194351,
-0.43010684847831726,
0.8277966380119324,
-0.8536105751991272,
0.2367546558380127,
0.48630964756011963,
0.1699536144733429,
-0.2450374811887741,
-1.0350950956344604,
-0.413485586643219,
-0.0921948254108429,
-0.19162693619728088,
0.05314605310559273,
-0.0641569197177887,
-0.06356833130121231,
0.2720750868320465,
0.4537912607192993,
-0.37581223249435425,
-0.004474344197660685,
-0.6792744398117065,
-0.3774593770503998,
0.7435230016708374,
0.039672356098890305,
0.3040522038936615,
-0.43387195467948914,
-0.28129276633262634,
-0.26633888483047485,
-0.3028049170970917,
0.10606717318296432,
0.5642017722129822,
0.4933895170688629,
-0.25006961822509766,
0.7612375020980835,
-0.1792728304862976,
0.6191464066505432,
0.01688656397163868,
0.047327153384685516,
0.48603883385658264,
-0.04496504366397858,
-0.3923715651035309,
-0.07834791392087936,
0.9605391025543213,
0.2836480438709259,
0.42018628120422363,
-0.054556574672460556,
-0.34588760137557983,
0.0443054735660553,
0.2645981013774872,
-0.5633383393287659,
-0.4026804268360138,
0.11370101571083069,
-0.549643337726593,
-0.38744768500328064,
0.4227326214313507,
-0.6604947447776794,
-0.07621705532073975,
-0.060126468539237976,
0.5579286217689514,
-0.3905485272407532,
-0.16956913471221924,
0.18657349050045013,
-0.4608735144138336,
0.19950851798057556,
0.041664645075798035,
-0.9312711954116821,
0.19922976195812225,
0.6141802072525024,
0.8026665449142456,
0.30812856554985046,
-0.14432315528392792,
-0.29118654131889343,
-0.2694643437862396,
-0.326515257358551,
0.4735011160373688,
-0.34595805406570435,
-0.43745896220207214,
0.05949565768241882,
0.3533850312232971,
-0.10637148469686508,
-0.265968382358551,
0.6752703785896301,
-0.446931391954422,
0.6121779084205627,
-0.06242600455880165,
-0.5242198705673218,
-0.3238929212093353,
0.08239806443452835,
-0.7492706775665283,
1.201371431350708,
0.3201073408126831,
-0.6582396030426025,
0.229852557182312,
-0.8943874835968018,
-0.6380015015602112,
0.20348364114761353,
0.18805637955665588,
-0.4074329435825348,
0.2053455263376236,
0.1839512288570404,
0.44387730956077576,
-0.06894975155591965,
0.29960963129997253,
-0.15718811750411987,
-0.3844466507434845,
0.33891263604164124,
-0.2425294816493988,
1.0848180055618286,
0.1899726390838623,
-0.31034615635871887,
0.13866448402404785,
-0.7879849076271057,
0.1007218286395073,
0.2578129470348358,
-0.33447548747062683,
-0.13029158115386963,
-0.08680806308984756,
0.3311668634414673,
0.16579510271549225,
0.46513912081718445,
-0.6448057293891907,
0.24545834958553314,
-0.6223706007003784,
0.7286659479141235,
0.7970157861709595,
-0.17365425825119019,
0.2620672881603241,
-0.36887744069099426,
0.4825612008571625,
-0.05050835758447647,
-0.007248491980135441,
-0.14747057855129242,
-0.8010455369949341,
-0.851518988609314,
-0.3559614419937134,
0.6865381002426147,
0.726710319519043,
-0.5411839485168457,
0.7902160882949829,
-0.006537196226418018,
-0.596182107925415,
-0.6670578718185425,
-0.11152021586894989,
0.37088698148727417,
0.37899619340896606,
0.30979764461517334,
-0.533173143863678,
-0.9129973649978638,
-0.8744640350341797,
-0.25047871470451355,
-0.23701751232147217,
-0.2505888044834137,
0.12031707912683487,
0.6274333596229553,
-0.3646758198738098,
0.7743611335754395,
-0.7045308351516724,
-0.4186187982559204,
-0.26631805300712585,
0.24213635921478271,
0.6435428261756897,
0.7099353075027466,
0.3796291649341583,
-0.5750983953475952,
-0.4169641137123108,
-0.324154794216156,
-0.5575422048568726,
0.02684909477829933,
0.028066780418157578,
-0.1787625104188919,
0.2370876669883728,
0.49473169445991516,
-0.7522984147071838,
0.4727226495742798,
0.36983442306518555,
-0.4928699731826782,
0.6715134978294373,
-0.362436980009079,
-0.11356937140226364,
-1.250749945640564,
0.12975919246673584,
-0.1310955137014389,
-0.29960164427757263,
-0.774731457233429,
0.019156020134687424,
-0.13041135668754578,
-0.04202144593000412,
-0.5415815711021423,
0.5360184907913208,
-0.4423709511756897,
0.0191937368363142,
0.047979485243558884,
-0.14482644200325012,
0.023336781188845634,
0.470146507024765,
0.08433414995670319,
0.5737142562866211,
0.5431128144264221,
-0.5083522200584412,
0.5514700412750244,
0.42513981461524963,
-0.6038205623626709,
0.04325557500123978,
-0.8492563366889954,
0.21954473853111267,
0.04548364132642746,
0.05990548059344292,
-1.1149646043777466,
-0.3303162157535553,
0.27873897552490234,
-0.6232832074165344,
0.20809611678123474,
-0.07152773439884186,
-0.7302897572517395,
-0.620027482509613,
-0.2606021463871002,
0.34765100479125977,
0.5946637988090515,
-0.2603442966938019,
0.4063154458999634,
0.365680068731308,
-0.1836359053850174,
-0.6123171448707581,
-0.7686065435409546,
0.15977489948272705,
-0.11707022786140442,
-0.5863004922866821,
0.404064804315567,
-0.0608481802046299,
-0.09529576450586319,
-0.17340660095214844,
0.10184448957443237,
-0.20623742043972015,
0.02789803221821785,
0.1768517941236496,
0.4897805154323578,
-0.20621058344841003,
-0.1057608351111412,
-0.19678594172000885,
-0.10988868027925491,
0.23276065289974213,
-0.16244132816791534,
0.8377106785774231,
-0.07657545804977417,
-0.04181939736008644,
-0.24167506396770477,
0.36952656507492065,
0.6050697565078735,
-0.13608558475971222,
0.695999264717102,
0.8310487270355225,
-0.6140307188034058,
0.033105071634054184,
-0.3995307385921478,
-0.18390792608261108,
-0.5213146805763245,
0.4942634403705597,
-0.43399935960769653,
-0.8381797671318054,
0.7833861112594604,
0.34849661588668823,
-0.08143581449985504,
0.7546888589859009,
0.6029271483421326,
-0.21474629640579224,
1.0599223375320435,
0.5761456489562988,
-0.1505657434463501,
0.48247769474983215,
-0.15521612763404846,
0.38251781463623047,
-0.7447612881660461,
-0.45286405086517334,
-0.378903865814209,
-0.2771838307380676,
-0.5448154211044312,
-0.1912003457546234,
0.1851719319820404,
0.23897941410541534,
-0.35970693826675415,
0.6736744046211243,
-0.6676099896430969,
0.36883699893951416,
0.9928874969482422,
0.29533851146698,
-0.18997572362422943,
-0.21365730464458466,
-0.2661430537700653,
-0.03410195931792259,
-0.43331629037857056,
-0.43053773045539856,
1.0891273021697998,
0.5258560180664062,
0.6876323819160461,
0.08992121368646622,
0.5852745175361633,
0.36235398054122925,
-0.025372585281729698,
-0.7080572843551636,
0.6369431018829346,
-0.4377203583717346,
-0.9314873814582825,
-0.38198694586753845,
-0.10531964898109436,
-1.019527554512024,
0.21792982518672943,
-0.27364882826805115,
-0.8726747035980225,
-0.049346961081027985,
-0.16987855732440948,
-0.31738296151161194,
0.18735308945178986,
-0.8154957294464111,
1.063266396522522,
-0.30788809061050415,
-0.04397705942392349,
0.17955242097377777,
-0.9496042132377625,
0.30020278692245483,
0.02491830475628376,
0.10428529977798462,
-0.15326043963432312,
0.2608059048652649,
1.0264408588409424,
-0.5369138717651367,
1.0577627420425415,
-0.1550064980983734,
0.13830135762691498,
0.05640077218413353,
-0.07377086579799652,
0.3263241648674011,
0.032165732234716415,
0.05158763751387596,
0.36999082565307617,
0.08234528452157974,
-0.46583008766174316,
-0.12367376685142517,
0.3621559143066406,
-0.7672803997993469,
-0.5192857980728149,
-0.6377155780792236,
-0.6551439166069031,
0.08852884918451309,
0.47268834710121155,
0.6279358267784119,
0.4953327476978302,
-0.12265525013208389,
0.2615515887737274,
0.44197818636894226,
-0.32497653365135193,
0.741146445274353,
0.3379362225532532,
-0.22434242069721222,
-0.49960091710090637,
0.6298332810401917,
0.02597065642476082,
0.001218059565871954,
0.4611663222312927,
0.19950424134731293,
-0.5875136852264404,
-0.20130164921283722,
-0.36512264609336853,
0.1676669418811798,
-0.5735356211662292,
-0.3025788962841034,
-0.5687260031700134,
-0.5465420484542847,
-0.7197433710098267,
-0.072560153901577,
-0.16257226467132568,
-0.5167579650878906,
-0.6162938475608826,
-0.14378760755062103,
0.4382334053516388,
0.7213014364242554,
-0.17281866073608398,
0.4753876328468323,
-0.7563353180885315,
0.25605273246765137,
0.30475765466690063,
0.4537714421749115,
-0.2705024182796478,
-0.7687344551086426,
-0.34043362736701965,
0.08024726063013077,
-0.1439378261566162,
-0.8297418355941772,
0.7260380387306213,
0.2486460953950882,
0.5426497459411621,
0.49578988552093506,
0.024633243680000305,
0.5968892574310303,
-0.6739686727523804,
1.0540086030960083,
0.2401708960533142,
-1.153200626373291,
0.5330343246459961,
-0.32598379254341125,
0.21797630190849304,
0.3011770248413086,
0.22431126236915588,
-0.6574682593345642,
-0.4019497036933899,
-0.8074560761451721,
-1.0247756242752075,
0.8381979465484619,
0.1609281599521637,
0.4190516173839569,
-0.11242090910673141,
0.16655801236629486,
0.14812050759792328,
0.39476969838142395,
-1.041975975036621,
-0.5461575388908386,
-0.5158277750015259,
-0.33655139803886414,
-0.20296330749988556,
-0.3151985704898834,
-0.019693691283464432,
-0.6300763487815857,
0.727076530456543,
0.14937016367912292,
0.5725984573364258,
0.10094742476940155,
-0.18628470599651337,
0.1293172985315323,
0.1872096061706543,
0.8267121911048889,
0.45797082781791687,
-0.5157629251480103,
-0.006592240184545517,
0.020187776535749435,
-0.6272589564323425,
-0.01710333488881588,
0.21679218113422394,
0.014369291253387928,
0.30381450057029724,
0.598061203956604,
0.8241206407546997,
0.23582832515239716,
-0.5537288188934326,
0.5959219336509705,
0.14761607348918915,
-0.3579786717891693,
-0.5576305389404297,
0.026883384212851524,
-0.02437056414783001,
0.13821454346179962,
0.4550849497318268,
0.11816008388996124,
0.05914558097720146,
-0.5740216970443726,
0.4041210412979126,
0.41118377447128296,
-0.5114649534225464,
-0.29843971133232117,
0.8397412896156311,
0.09546954184770584,
-0.6456993222236633,
0.8068605065345764,
-0.14229761064052582,
-0.8561726212501526,
0.7126699686050415,
0.7035050392150879,
0.9322277903556824,
-0.2229476124048233,
0.2760186791419983,
0.43370577692985535,
0.45213964581489563,
-0.20198273658752441,
0.4188247621059418,
0.34630557894706726,
-0.8748067617416382,
-0.3495137691497803,
-0.736834704875946,
-0.1827535778284073,
0.2765522301197052,
-0.759972095489502,
0.26115939021110535,
-0.5261656641960144,
-0.23699575662612915,
0.18006709218025208,
0.02379618026316166,
-0.6601823568344116,
0.44869792461395264,
0.08724547922611237,
1.0358158349990845,
-0.9742462635040283,
1.0324792861938477,
0.8221480846405029,
-0.6414117217063904,
-0.8539101481437683,
-0.38350430130958557,
-0.3086774945259094,
-1.1308364868164062,
0.6889258623123169,
0.3269229829311371,
0.3554723560810089,
-0.022916670888662338,
-0.6327846050262451,
-0.7568688988685608,
0.7499313950538635,
0.1916133314371109,
-0.4025214910507202,
-0.13277529180049896,
0.1345950812101364,
0.5803408026695251,
-0.5300600528717041,
0.4626650810241699,
0.5775376558303833,
0.45436519384384155,
-0.06818529218435287,
-0.8327579498291016,
0.021837491542100906,
-0.4765967130661011,
0.023729776963591576,
0.08987502008676529,
-0.46223533153533936,
1.216568112373352,
-0.11444200575351715,
0.017603972926735878,
0.23893287777900696,
0.5338529348373413,
-0.06901787966489792,
0.011596420779824257,
0.4904157221317291,
0.6117333769798279,
0.6960384845733643,
-0.38633623719215393,
0.770861804485321,
-0.22451025247573853,
0.5145083665847778,
0.8213747143745422,
0.05105041339993477,
0.8373885750770569,
0.35762903094291687,
-0.2934437692165375,
0.9487704038619995,
0.880486011505127,
-0.3376554846763611,
0.7527063488960266,
0.23257756233215332,
-0.06553570181131363,
-0.1104450523853302,
0.1182233914732933,
-0.3455437123775482,
0.545903205871582,
0.25505802035331726,
-0.5534180402755737,
0.02357511967420578,
-0.10636533796787262,
0.18903802335262299,
-0.1518721878528595,
-0.4198291301727295,
0.691628098487854,
0.17641650140285492,
-0.6685870885848999,
0.36400142312049866,
0.2801911532878876,
0.7953732013702393,
-0.607645571231842,
0.041489068418741226,
-0.14805099368095398,
0.22331134974956512,
-0.0938158705830574,
-0.8641371726989746,
0.20355632901191711,
-0.14123350381851196,
-0.44209229946136475,
-0.2721770703792572,
0.7503757476806641,
-0.5239772796630859,
-0.723527193069458,
0.08142466843128204,
0.29924848675727844,
0.3373708128929138,
-0.10271482914686203,
-0.8239246010780334,
-0.215070441365242,
0.0830790176987648,
-0.12995989620685577,
0.12743569910526276,
0.2892627716064453,
0.08834994584321976,
0.5474291443824768,
0.8160749077796936,
-0.10359161347150803,
0.14256630837917328,
0.04266558215022087,
0.6977064609527588,
-0.9694510698318481,
-0.7701188921928406,
-0.9321794509887695,
0.6415430903434753,
-0.09263905882835388,
-0.5593615770339966,
0.7003706097602844,
0.6768709421157837,
0.7594751715660095,
-0.4654989540576935,
0.5707240700721741,
-0.21403862535953522,
0.5640919208526611,
-0.37265148758888245,
0.8335179686546326,
-0.41798877716064453,
-0.04553958401083946,
-0.4268200993537903,
-0.8182873129844666,
-0.3529592752456665,
0.8702524304389954,
-0.09258705377578735,
0.04159963130950928,
0.668631911277771,
0.5820992588996887,
0.1025586724281311,
-0.12646803259849548,
0.21431341767311096,
0.1579786092042923,
0.07410673797130585,
0.39111486077308655,
0.5623519420623779,
-0.6589696407318115,
0.40491464734077454,
-0.21959789097309113,
-0.05869700759649277,
-0.34417280554771423,
-0.9021627902984619,
-1.0290415287017822,
-0.6258451342582703,
-0.21078313887119293,
-0.5663172602653503,
-0.22067217528820038,
0.928997278213501,
0.8050210475921631,
-1.0000182390213013,
-0.3264838457107544,
-0.012294024229049683,
0.10840906947851181,
-0.28127139806747437,
-0.29127371311187744,
0.44961288571357727,
-0.23365746438503265,
-0.8167203068733215,
0.2907901108264923,
-0.051477350294589996,
0.09820318222045898,
-0.15061800181865692,
0.060850951820611954,
-0.421030730009079,
0.13056331872940063,
0.6232578158378601,
0.16907155513763428,
-0.8316159248352051,
-0.48931288719177246,
0.09717779606580734,
-0.17640309035778046,
0.08165102452039719,
0.4374104142189026,
-0.5448010563850403,
0.38198283314704895,
0.41747260093688965,
0.41178444027900696,
0.5863814949989319,
0.03716285154223442,
0.7206695675849915,
-1.1450912952423096,
0.29506197571754456,
0.21344904601573944,
0.5399811267852783,
0.39847537875175476,
-0.4467317461967468,
0.5304557681083679,
0.4364199936389923,
-0.4442599415779114,
-0.8868696689605713,
-0.006893590558320284,
-0.9835314154624939,
-0.2902129292488098,
0.8685135245323181,
-0.16062860190868378,
-0.2914949059486389,
-0.08235326409339905,
-0.30138280987739563,
0.3749312460422516,
-0.37884536385536194,
0.7105657458305359,
0.926425576210022,
0.09532206505537033,
-0.17184488475322723,
-0.33588552474975586,
0.39867639541625977,
0.5001121759414673,
-0.46176016330718994,
-0.3737383484840393,
0.13817165791988373,
0.42050132155418396,
0.23567984998226166,
0.5459875464439392,
-0.08703693002462387,
0.10825204104185104,
0.1222139373421669,
0.30633780360221863,
-0.05102096125483513,
-0.13635864853858948,
-0.2577550709247589,
0.01983777992427349,
-0.15791550278663635,
-0.7027446031570435
] |
openai/clip-vit-large-patch14 | openai | "2023-09-15T15:49:35" | 31,237,964 | 734 | transformers | [
"transformers",
"pytorch",
"tf",
"jax",
"safetensors",
"clip",
"zero-shot-image-classification",
"vision",
"arxiv:2103.00020",
"arxiv:1908.04913",
"endpoints_compatible",
"has_space",
"region:us"
] | zero-shot-image-classification | "2022-03-02T23:29:05" | ---
tags:
- vision
widget:
- src: https://huggingface.co/datasets/mishig/sample_images/resolve/main/cat-dog-music.png
candidate_labels: playing music, playing sports
example_title: Cat & Dog
---
# Model Card: CLIP
Disclaimer: This model card is taken and modified from the official CLIP repository; it can be found [here](https://github.com/openai/CLIP/blob/main/model-card.md).
## Model Details
The CLIP model was developed by researchers at OpenAI to learn about what contributes to robustness in computer vision tasks. The model was also developed to test the ability of models to generalize to arbitrary image classification tasks in a zero-shot manner. It was not developed for general model deployment - to deploy models like CLIP, researchers will first need to carefully study their capabilities in relation to the specific context they’re being deployed within.
### Model Date
January 2021
### Model Type
The base model uses a ViT-L/14 Transformer architecture as an image encoder and uses a masked self-attention Transformer as a text encoder. These encoders are trained to maximize the similarity of (image, text) pairs via a contrastive loss.
The original implementation had two variants: one using a ResNet image encoder and the other using a Vision Transformer. This repository has the variant with the Vision Transformer.
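The contrastive objective described above can be illustrated with a pure-Python sketch: within a batch, row *i* of the images and row *i* of the texts form the matching pair, and all other rows act as negatives. This is an illustrative simplification (the real model uses a learned temperature and operates on GPU tensors), with a fixed `temperature` value assumed here:

```python
import math

def clip_contrastive_loss(image_embs, text_embs, temperature=0.07):
    """Symmetric InfoNCE loss over a batch of matched (image, text) embeddings."""
    def norm(v):
        n = math.sqrt(sum(x * x for x in v))
        return [x / n for x in v]

    imgs = [norm(v) for v in image_embs]
    txts = [norm(v) for v in text_embs]
    # logits[i][j] = cosine similarity of image i and text j, scaled by temperature
    logits = [[sum(a * b for a, b in zip(im, tx)) / temperature for tx in txts]
              for im in imgs]

    def xent(row, target):
        # numerically stable cross-entropy for one row of logits
        m = max(row)
        log_z = m + math.log(sum(math.exp(x - m) for x in row))
        return log_z - row[target]

    n = len(imgs)
    img_loss = sum(xent(logits[i], i) for i in range(n)) / n                      # image -> text
    txt_loss = sum(xent([logits[j][i] for j in range(n)], i) for i in range(n)) / n  # text -> image
    return (img_loss + txt_loss) / 2
```

Maximizing similarity of matched pairs relative to mismatched ones is what lets the trained encoders score arbitrary label prompts at inference time.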
### Documents
- [Blog Post](https://openai.com/blog/clip/)
- [CLIP Paper](https://arxiv.org/abs/2103.00020)
### Use with Transformers
```python
from PIL import Image
import requests
from transformers import CLIPProcessor, CLIPModel
model = CLIPModel.from_pretrained("openai/clip-vit-large-patch14")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-large-patch14")
url = "http://images.cocodataset.org/val2017/000000039769.jpg"
image = Image.open(requests.get(url, stream=True).raw)
inputs = processor(text=["a photo of a cat", "a photo of a dog"], images=image, return_tensors="pt", padding=True)
outputs = model(**inputs)
logits_per_image = outputs.logits_per_image # this is the image-text similarity score
probs = logits_per_image.softmax(dim=1) # we can take the softmax to get the label probabilities
```
## Model Use
### Intended Use
The model is intended as a research output for research communities. We hope that this model will enable researchers to better understand and explore zero-shot, arbitrary image classification. We also hope it can be used for interdisciplinary studies of the potential impact of such models - the CLIP paper includes a discussion of potential downstream impacts to provide an example for this sort of analysis.
#### Primary intended uses
The primary intended users of these models are AI researchers.
We primarily imagine the model will be used by researchers to better understand robustness, generalization, and other capabilities, biases, and constraints of computer vision models.
### Out-of-Scope Use Cases
**Any** deployed use case of the model - whether commercial or not - is currently out of scope. Non-deployed use cases, such as image search in a constrained environment, are also not recommended unless there is thorough in-domain testing of the model with a specific, fixed class taxonomy. This is because our safety assessment demonstrated a high need for task-specific testing, especially given the variability of CLIP’s performance with different class taxonomies. This makes untested and unconstrained deployment of the model in any use case currently potentially harmful.
Certain use cases which would fall under the domain of surveillance and facial recognition are always out-of-scope regardless of performance of the model. This is because the use of artificial intelligence for tasks such as these can be premature currently given the lack of testing norms and checks to ensure its fair use.
Since the model has not been purposefully trained in or evaluated on any languages other than English, its use should be limited to English language use cases.
## Data
The model was trained on publicly available image-caption data. This was done through a combination of crawling a handful of websites and using commonly-used pre-existing image datasets such as [YFCC100M](http://projects.dfki.uni-kl.de/yfcc100m/). A large portion of the data comes from our crawling of the internet. This means that the data is more representative of people and societies most connected to the internet which tend to skew towards more developed nations, and younger, male users.
### Data Mission Statement
Our goal with building this dataset was to test out robustness and generalizability in computer vision tasks. As a result, the focus was on gathering large quantities of data from different publicly-available internet data sources. The data was gathered in a mostly non-interventionist manner. However, we only crawled websites that had policies against excessively violent and adult images and allowed us to filter out such content. We do not intend for this dataset to be used as the basis for any commercial or deployed model and will not be releasing the dataset.
## Performance and Limitations
### Performance
We have evaluated the performance of CLIP on a wide range of computer vision benchmarks, spanning tasks from OCR to texture recognition to fine-grained classification. The paper describes model performance on the following datasets:
- Food101
- CIFAR10
- CIFAR100
- Birdsnap
- SUN397
- Stanford Cars
- FGVC Aircraft
- VOC2007
- DTD
- Oxford-IIIT Pet dataset
- Caltech101
- Flowers102
- MNIST
- SVHN
- IIIT5K
- Hateful Memes
- SST-2
- UCF101
- Kinetics700
- Country211
- CLEVR Counting
- KITTI Distance
- STL-10
- RareAct
- Flickr30
- MSCOCO
- ImageNet
- ImageNet-A
- ImageNet-R
- ImageNet Sketch
- ObjectNet (ImageNet Overlap)
- Youtube-BB
- ImageNet-Vid
## Limitations
CLIP and our analysis of it have a number of limitations. CLIP currently struggles with respect to certain tasks such as fine-grained classification and counting objects. CLIP also poses issues with regards to fairness and bias, which we discuss in the paper and briefly in the next section. Additionally, our approach to testing CLIP also has an important limitation: in many cases we have used linear probes to evaluate the performance of CLIP, and there is evidence suggesting that linear probes can underestimate model performance.
### Bias and Fairness
We find that the performance of CLIP - and the specific biases it exhibits - can depend significantly on class design and the choices one makes for categories to include and exclude. We tested the risk of certain kinds of denigration with CLIP by classifying images of people from [Fairface](https://arxiv.org/abs/1908.04913) into crime-related and non-human animal categories. We found significant disparities with respect to race and gender. Additionally, we found that these disparities could shift based on how the classes were constructed. (Details captured in the Broader Impacts Section in the paper).
We also tested the performance of CLIP on gender, race and age classification using the Fairface dataset (we default to using race categories as they are constructed in the Fairface dataset) in order to assess quality of performance across different demographics. We found accuracy >96% across all races for gender classification, with ‘Middle Eastern’ having the highest accuracy (98.4%) and ‘White’ having the lowest (96.5%). Additionally, CLIP averaged ~93% for racial classification and ~63% for age classification. Our use of evaluations to test for gender, race and age classification as well as denigration harms is simply to evaluate performance of the model across people and surface potential risks, not to demonstrate an endorsement of or enthusiasm for such tasks.
## Feedback
### Where to send questions or comments about the model
Please use [this Google Form](https://forms.gle/Uv7afRH5dvY34ZEs9) | [
-0.5028195381164551,
-0.5711244940757751,
0.16508552432060242,
-0.030113739892840385,
-0.1611039787530899,
-0.251861035823822,
0.021790342405438423,
-0.7077082395553589,
0.12783248722553253,
0.38478925824165344,
-0.2797425389289856,
-0.40631571412086487,
-0.6301049590110779,
0.12085618078708649,
-0.6227642297744751,
0.7121697664260864,
-0.06300313770771027,
0.06447663903236389,
-0.30519938468933105,
-0.3298357427120209,
-0.5006494522094727,
-0.684582531452179,
-0.23790468275547028,
0.16244472563266754,
0.08657322824001312,
0.14136958122253418,
0.6573242545127869,
0.8399184942245483,
0.7971466183662415,
0.22006890177726746,
-0.3121120035648346,
-0.11205697059631348,
-0.5017342567443848,
-0.6126406192779541,
-0.37977349758148193,
-0.3890714645385742,
-0.3888903856277466,
0.2065761387348175,
0.530386209487915,
0.36309415102005005,
0.03135514631867409,
0.29356279969215393,
0.0715906023979187,
0.36597150564193726,
-0.9183122515678406,
-0.04925771430134773,
-0.550231397151947,
0.06851169466972351,
-0.2802903354167938,
0.14372166991233826,
-0.17106114327907562,
-0.18843317031860352,
0.31254637241363525,
-0.5033757090568542,
0.4781454801559448,
-0.06280427426099777,
1.2957972288131714,
0.17866738140583038,
-0.16547377407550812,
-0.030125442892313004,
-0.5843654274940491,
0.7435945272445679,
-0.5669578313827515,
0.23714426159858704,
0.227054163813591,
0.38307246565818787,
0.14191551506519318,
-0.8340786695480347,
-0.6277564167976379,
-0.06186347454786301,
0.2992108464241028,
0.016407685354351997,
-0.2242918163537979,
-0.05588332191109657,
0.41794395446777344,
0.4899693727493286,
-0.15943555533885956,
-0.05881171673536301,
-0.7083445191383362,
-0.21504192054271698,
0.6695651412010193,
0.2949073612689972,
0.3341822624206543,
-0.2303634136915207,
-0.6251868605613708,
-0.46719446778297424,
-0.44603231549263,
0.5302424430847168,
0.380377858877182,
0.09600826352834702,
-0.15017825365066528,
0.6406089067459106,
-0.04701736941933632,
0.4235216975212097,
0.006798685062676668,
-0.34597915410995483,
0.34341034293174744,
-0.4676169753074646,
-0.1854092925786972,
-0.26963406801223755,
0.7524247765541077,
0.8228386640548706,
0.17618413269519806,
0.21115246415138245,
-0.08750100433826447,
0.21276354789733887,
0.33855465054512024,
-0.9186346530914307,
-0.15680626034736633,
-0.2005186229944229,
-0.6218684315681458,
-0.36613860726356506,
0.2804749011993408,
-0.9126017689704895,
0.07640203088521957,
-0.11978050321340561,
0.7312285304069519,
-0.4427567720413208,
-0.07432659715414047,
0.19061149656772614,
-0.32081395387649536,
0.32052892446517944,
0.3265687823295593,
-0.6618086099624634,
0.375336229801178,
0.3191087543964386,
1.0875048637390137,
-0.4695495367050171,
-0.309076726436615,
0.04549911990761757,
-0.06615445762872696,
-0.11220301687717438,
0.705342173576355,
-0.3745212256908417,
-0.46406328678131104,
-0.18868905305862427,
0.4369472563266754,
-0.1183733344078064,
-0.6053619384765625,
0.5760085582733154,
-0.2064000517129898,
0.02570156380534172,
-0.27992719411849976,
-0.38404640555381775,
-0.6109395027160645,
0.31667831540107727,
-0.7068378329277039,
0.8819748163223267,
0.14677900075912476,
-0.7708446979522705,
0.37301212549209595,
-0.7077308297157288,
-0.050391409546136856,
-0.12766338884830475,
-0.0986647829413414,
-0.590827465057373,
-0.28179657459259033,
0.3972231149673462,
0.3203490078449249,
-0.22642405331134796,
0.36751869320869446,
-0.5967095494270325,
-0.488293319940567,
0.18390490114688873,
-0.43565550446510315,
0.8779375553131104,
0.019687524065375328,
-0.32468920946121216,
0.0018868875922635198,
-0.4563602805137634,
-0.17163318395614624,
0.35109665989875793,
0.00854884646832943,
-0.16103093326091766,
-0.10457765311002731,
0.195853590965271,
0.09489814192056656,
-0.04080970585346222,
-0.6779976487159729,
0.13205716013908386,
-0.0824737697839737,
0.5364310145378113,
0.672411322593689,
0.09234849363565445,
0.27319902181625366,
-0.4192888140678406,
0.5167664885520935,
-0.023885199800133705,
0.6510520577430725,
-0.2498805671930313,
-0.5108723640441895,
-0.48441970348358154,
-0.45953813195228577,
0.5771525502204895,
0.6415369510650635,
-0.43276333808898926,
0.16419798135757446,
-0.13849157094955444,
-0.3326675295829773,
-0.17753303050994873,
-0.2189757525920868,
0.3437015116214752,
0.6435902118682861,
0.3398193120956421,
-0.9699578881263733,
-0.40039825439453125,
-1.0378868579864502,
0.193033829331398,
0.0687246173620224,
-0.055297207087278366,
0.6836110353469849,
0.8945345282554626,
-0.23453567922115326,
1.0630948543548584,
-0.7434713840484619,
-0.4113602936267853,
-0.13491886854171753,
-0.13158659636974335,
-0.02208922617137432,
0.49188563227653503,
0.933982789516449,
-0.9217532277107239,
-0.25868284702301025,
-0.5227254629135132,
-0.7975072860717773,
0.14319489896297455,
0.19539548456668854,
-0.08907229453325272,
0.04275206848978996,
0.22207587957382202,
-0.24461990594863892,
1.0148260593414307,
0.25689712166786194,
-0.05390371009707451,
0.7225462198257446,
0.0852150097489357,
0.28364816308021545,
-0.5840480327606201,
0.358704149723053,
0.17082981765270233,
-0.15007860958576202,
-0.48014792799949646,
0.054168250411748886,
-0.008770237676799297,
-0.42058154940605164,
-0.9158760905265808,
0.3658343255519867,
-0.14195765554904938,
-0.12202370911836624,
-0.15764901041984558,
-0.18699340522289276,
0.31262704730033875,
0.707561731338501,
0.13684514164924622,
1.065791368484497,
0.4882872998714447,
-0.7487250566482544,
-0.02375790663063526,
0.5347656011581421,
-0.4691179394721985,
0.528555154800415,
-0.9416053891181946,
-0.041198957711458206,
-0.058412253856658936,
0.10494133085012436,
-0.5598915815353394,
-0.33406272530555725,
0.30556830763816833,
-0.348550021648407,
0.20931851863861084,
-0.13105492293834686,
-0.3144756853580475,
-0.5916775465011597,
-0.541726291179657,
0.7433608174324036,
0.5038515329360962,
-0.4432712197303772,
0.36277687549591064,
0.7067604064941406,
0.1865304410457611,
-0.5297303199768066,
-0.7572582960128784,
-0.0785357728600502,
-0.2049943208694458,
-0.7184067964553833,
0.5393203496932983,
-0.000991243403404951,
0.07467357069253922,
0.13024143874645233,
0.09038093686103821,
-0.30982673168182373,
0.02703453227877617,
0.45370882749557495,
0.5104103088378906,
-0.08095070719718933,
-0.11696920543909073,
-0.2951515018939972,
0.35325202345848083,
-0.07419192790985107,
0.12662716209888458,
0.2725735604763031,
-0.1398490071296692,
-0.3417103886604309,
-0.5028554201126099,
0.3225213587284088,
0.4484611451625824,
-0.26685816049575806,
0.48448121547698975,
0.48563775420188904,
-0.2727079391479492,
0.11007951945066452,
-0.5294326543807983,
-0.03627756983041763,
-0.4383474588394165,
0.49795955419540405,
-0.12278864532709122,
-0.6673111319541931,
0.7203261852264404,
0.13678675889968872,
-0.14276476204395294,
0.6205142140388489,
0.30090951919555664,
0.010675244964659214,
0.8433942198753357,
0.9314384460449219,
0.03658169507980347,
0.637567400932312,
-0.8018835783004761,
-0.016860339790582657,
-0.9967478513717651,
-0.3349451422691345,
-0.25349554419517517,
-0.21361087262630463,
-0.43169811367988586,
-0.5531179904937744,
0.5778758525848389,
0.1749413162469864,
-0.10507138818502426,
0.4131318926811218,
-0.6568560600280762,
0.44793498516082764,
0.6150713562965393,
0.44492286443710327,
0.016556061804294586,
-0.08382226526737213,
0.0038020526990294456,
-0.1601242870092392,
-0.6665180921554565,
-0.49243462085723877,
1.1049301624298096,
0.6568570733070374,
0.6982197761535645,
-0.219181627035141,
0.2164650708436966,
0.41842302680015564,
-0.08627421408891678,
-0.7405120730400085,
0.5324667096138,
-0.452153742313385,
-0.7049142122268677,
-0.18583594262599945,
-0.05789165943861008,
-0.7536709308624268,
0.1509086638689041,
-0.14023931324481964,
-0.7350054979324341,
0.5999876856803894,
0.1336221992969513,
-0.33132606744766235,
0.664806067943573,
-0.5871095657348633,
0.9741711616516113,
-0.28843751549720764,
-0.42984047532081604,
0.07198484987020493,
-0.6395854353904724,
0.5682413578033447,
0.07101847976446152,
0.027759341523051262,
-0.20734894275665283,
0.10009420663118362,
1.0681242942810059,
-0.5721616744995117,
0.9192345142364502,
-0.11490731686353683,
0.4123953580856323,
0.733132541179657,
-0.18151617050170898,
0.05528507009148598,
-0.2046879380941391,
0.19137358665466309,
0.6974955201148987,
0.2737804651260376,
-0.10960999131202698,
-0.3654259145259857,
0.14380238950252533,
-0.7200766801834106,
-0.3924625813961029,
-0.3661554753780365,
-0.43872401118278503,
0.22323721647262573,
0.20007455348968506,
0.5434206128120422,
0.7500629425048828,
-0.049234941601753235,
0.16131071746349335,
0.6170294284820557,
-0.4987105131149292,
0.3722393810749054,
0.19649247825145721,
-0.2745506167411804,
-0.5152713060379028,
0.9004989266395569,
0.27599915862083435,
0.21385638415813446,
0.04094332084059715,
0.08592497557401657,
-0.22439292073249817,
-0.4817015528678894,
-0.4367757737636566,
0.0721515417098999,
-0.7238396406173706,
-0.42308929562568665,
-0.5371016263961792,
-0.36727914214134216,
-0.4371652901172638,
-0.011593177914619446,
-0.4757044017314911,
-0.3319774270057678,
-0.6205341219902039,
0.2072107195854187,
0.1745082288980484,
0.6345257759094238,
-0.09800586849451065,
0.287059485912323,
-0.611431360244751,
0.2482345849275589,
0.37989458441734314,
0.5218263864517212,
0.0681656002998352,
-0.6873040795326233,
-0.14253534376621246,
-0.00027594840503297746,
-0.8657500147819519,
-0.7793110013008118,
0.4396631121635437,
0.3247648775577545,
0.5792436003684998,
0.35628408193588257,
0.09551706910133362,
0.6846559643745422,
-0.4215455949306488,
1.0679614543914795,
0.22588472068309784,
-0.9370443820953369,
0.543653666973114,
-0.30473366379737854,
0.2139168381690979,
0.6727586388587952,
0.478923499584198,
-0.2076156586408615,
-0.12663805484771729,
-0.5384963750839233,
-0.8774662017822266,
0.7806020975112915,
0.1285153031349182,
0.04781070351600647,
0.0628289133310318,
0.3291599750518799,
0.021494301036000252,
0.08500473201274872,
-0.6926113963127136,
-0.16124895215034485,
-0.49545279145240784,
0.05799638479948044,
0.2840508818626404,
-0.42602938413619995,
0.024179264903068542,
-0.4146389365196228,
0.4005623161792755,
-0.05217877775430679,
0.5533779859542847,
0.5316130518913269,
-0.17105555534362793,
0.13704989850521088,
-0.09849122911691666,
0.650509238243103,
0.5970566868782043,
-0.38429340720176697,
-0.22680790722370148,
0.25359997153282166,
-0.8240457773208618,
0.011829765513539314,
-0.17977085709571838,
-0.498174786567688,
-0.04562739282846451,
0.3111919164657593,
0.9190090894699097,
0.19904309511184692,
-0.7378608584403992,
0.9900705218315125,
-0.09407984465360641,
-0.5474270582199097,
-0.2564293444156647,
0.076691634953022,
-0.5406688451766968,
0.1324404925107956,
0.3144390881061554,
0.21618981659412384,
0.45035627484321594,
-0.5084373950958252,
0.3871294856071472,
0.4221872091293335,
-0.3433851897716522,
-0.3733462691307068,
0.7476547956466675,
0.14398550987243652,
-0.20186810195446014,
0.49282145500183105,
-0.17284570634365082,
-0.944316029548645,
0.8064664006233215,
0.3963475823402405,
0.6486207842826843,
-0.013743950985372066,
0.17066317796707153,
0.6588932275772095,
0.14799140393733978,
-0.33707156777381897,
-0.047056812793016434,
0.013937613926827908,
-0.5570764541625977,
-0.21830935776233673,
-0.40931886434555054,
-0.5736475586891174,
0.153774693608284,
-0.9105253219604492,
0.4138091504573822,
-0.5008723139762878,
-0.49943023920059204,
-0.10634961724281311,
-0.26733821630477905,
-0.7181615829467773,
0.13234014809131622,
0.1548682004213333,
1.207140564918518,
-0.8277750015258789,
0.48266351222991943,
0.425367146730423,
-0.5864840149879456,
-0.7950787544250488,
-0.14838802814483643,
-0.10037794709205627,
-0.6335516571998596,
0.6581992506980896,
0.5258060097694397,
-0.003733827034011483,
-0.46435004472732544,
-0.933516800403595,
-0.9723369479179382,
1.1136428117752075,
0.32057783007621765,
-0.40224823355674744,
-0.08899114280939102,
-0.01629267819225788,
0.3401809334754944,
-0.32425543665885925,
0.3660754859447479,
0.3233945369720459,
-0.02058394066989422,
0.3343381881713867,
-1.1536753177642822,
-0.18725186586380005,
-0.1702989637851715,
0.2560049295425415,
0.023045731708407402,
-0.832671582698822,
1.0384564399719238,
-0.2684120535850525,
-0.4344950318336487,
0.05866951122879982,
0.43093863129615784,
-0.06260136514902115,
0.37096357345581055,
0.5112165808677673,
0.6893353462219238,
0.41597115993499756,
0.06394066661596298,
1.0571832656860352,
-0.05589362233877182,
0.45218518376350403,
0.92383873462677,
-0.14264684915542603,
0.8764290809631348,
0.3011862635612488,
-0.3492541015148163,
0.36191388964653015,
0.42676565051078796,
-0.6763984560966492,
0.7545884251594543,
0.0021253053564578295,
0.1541580855846405,
-0.04230530932545662,
-0.44262027740478516,
-0.2933233976364136,
0.6992912888526917,
0.03211953490972519,
-0.4601115584373474,
-0.06292233616113663,
0.403904527425766,
-0.24591632187366486,
-0.05322091281414032,
-0.44633153080940247,
0.4443184733390808,
-0.1601274013519287,
-0.33829620480537415,
0.4319251477718353,
0.06534789502620697,
0.9394005537033081,
-0.3577660620212555,
-0.15434551239013672,
0.09061578661203384,
0.19528502225875854,
-0.0830652043223381,
-0.934805154800415,
0.5403908491134644,
0.05777708813548088,
-0.2225620150566101,
0.08808813244104385,
0.7282537817955017,
-0.03494931012392044,
-0.5656675696372986,
0.2110108584165573,
-0.1360975205898285,
0.3465515673160553,
-0.10197411477565765,
-0.6955138444900513,
0.33062538504600525,
0.06243180111050606,
0.02785993553698063,
0.29215699434280396,
-0.01683165319263935,
-0.11081463098526001,
0.6593631505966187,
0.3812195956707001,
-0.049237966537475586,
0.10833752155303955,
-0.3374156355857849,
1.0371835231781006,
-0.544754683971405,
-0.39778468012809753,
-0.6761711835861206,
0.3475096821784973,
-0.0982847511768341,
-0.34040769934654236,
0.6080735921859741,
0.6037173271179199,
1.1036657094955444,
-0.12138237059116364,
0.5523548722267151,
-0.21089158952236176,
0.4996054768562317,
-0.3683314323425293,
0.43947654962539673,
-0.5207660794258118,
-0.030631018802523613,
-0.42714211344718933,
-0.628960907459259,
-0.18076695501804352,
0.6050317883491516,
-0.3947354853153229,
-0.07280343770980835,
0.4845198392868042,
0.7169778347015381,
-0.24301983416080475,
-0.032148949801921844,
0.2591622769832611,
-0.330429345369339,
0.25641778111457825,
0.6036332845687866,
0.6017049551010132,
-0.7820426821708679,
0.6837888956069946,
-0.6788225769996643,
-0.22482754290103912,
-0.19486694037914276,
-0.8245818614959717,
-1.022369146347046,
-0.49897223711013794,
-0.42180925607681274,
-0.13501325249671936,
-0.0580267459154129,
0.5685029625892639,
0.951771080493927,
-0.6999800801277161,
-0.0905606672167778,
0.32868582010269165,
-0.07138391584157944,
-0.006982887163758278,
-0.23981110751628876,
0.3748198449611664,
0.20650964975357056,
-0.5582612156867981,
-0.19345134496688843,
0.12450530380010605,
0.351590633392334,
-0.17999659478664398,
0.1201450526714325,
-0.19401372969150543,
-0.06597745418548584,
0.443072110414505,
0.52667236328125,
-0.6432492136955261,
-0.31874141097068787,
0.15350167453289032,
0.04125197231769562,
0.33677375316619873,
0.6344946622848511,
-0.6282545328140259,
0.428748220205307,
0.2736228406429291,
0.5439906120300293,
0.6507944464683533,
0.25668036937713623,
0.20126095414161682,
-0.42423686385154724,
0.2090161293745041,
0.20818743109703064,
0.3321470320224762,
0.34050479531288147,
-0.3942870497703552,
0.5836917757987976,
0.4773426055908203,
-0.6410806775093079,
-0.9654466509819031,
-0.030823230743408203,
-1.0626986026763916,
-0.19625894725322723,
0.8730432987213135,
-0.402445524930954,
-0.6692677736282349,
0.1520223319530487,
-0.20839084684848785,
0.16564975678920746,
-0.3536892235279083,
0.6482003927230835,
0.3887910544872284,
-0.02859175018966198,
-0.3560948669910431,
-0.5896066427230835,
0.1961309015750885,
0.05748412758111954,
-0.514034628868103,
-0.3825087547302246,
0.3604009747505188,
0.5753779411315918,
0.33941179513931274,
0.4655390679836273,
-0.3387034833431244,
0.3815578520298004,
0.04168790206313133,
0.29366838932037354,
-0.32165947556495667,
-0.3817782402038574,
-0.47130653262138367,
0.29366156458854675,
-0.27844858169555664,
-0.6039014458656311
] |
distilbert-base-uncased-finetuned-sst-2-english | null | "2023-10-26T16:14:11" | 30,654,240 | 355 | transformers | [
"transformers",
"pytorch",
"tf",
"rust",
"onnx",
"safetensors",
"distilbert",
"text-classification",
"en",
"dataset:sst2",
"dataset:glue",
"arxiv:1910.01108",
"doi:10.57967/hf/0181",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"has_space",
"region:us"
] | text-classification | "2022-03-02T23:29:04" | ---
language: en
license: apache-2.0
datasets:
- sst2
- glue
model-index:
- name: distilbert-base-uncased-finetuned-sst-2-english
results:
- task:
type: text-classification
name: Text Classification
dataset:
name: glue
type: glue
config: sst2
split: validation
metrics:
- type: accuracy
value: 0.9105504587155964
name: Accuracy
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiN2YyOGMxYjY2Y2JhMjkxNjIzN2FmMjNiNmM2ZWViNGY3MTNmNWI2YzhiYjYxZTY0ZGUyN2M1NGIxZjRiMjQwZiIsInZlcnNpb24iOjF9.uui0srxV5ZHRhxbYN6082EZdwpnBgubPJ5R2-Wk8HTWqmxYE3QHidevR9LLAhidqGw6Ih93fK0goAXncld_gBg
- type: precision
value: 0.8978260869565218
name: Precision
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiMzgwYTYwYjA2MmM0ZTYwNDk0M2NmNTBkZmM2NGNhYzQ1OGEyN2NkNDQ3Mzc2NTQyMmZiNDJiNzBhNGVhZGUyOSIsInZlcnNpb24iOjF9.eHjLmw3K02OU69R2Au8eyuSqT3aBDHgZCn8jSzE3_urD6EUSSsLxUpiAYR4BGLD_U6-ZKcdxVo_A2rdXqvUJDA
- type: recall
value: 0.9301801801801802
name: Recall
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiMGIzM2E3MTI2Mzc2MDYwNmU3ZTVjYmZmZDBkNjY4ZTc5MGY0Y2FkNDU3NjY1MmVkNmE3Y2QzMzAwZDZhOWY1NiIsInZlcnNpb24iOjF9.PUZlqmct13-rJWBXdHm5tdkXgETL9F82GNbbSR4hI8MB-v39KrK59cqzFC2Ac7kJe_DtOeUyosj34O_mFt_1DQ
- type: auc
value: 0.9716626673402374
name: AUC
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiMDM0YWIwZmQ4YjUwOGZmMWU2MjI1YjIxZGQ2MzNjMzRmZmYxMzZkNGFjODhlMDcyZDM1Y2RkMWZlOWQ0MWYwNSIsInZlcnNpb24iOjF9.E7GRlAXmmpEkTHlXheVkuL1W4WNjv4JO3qY_WCVsTVKiO7bUu0UVjPIyQ6g-J1OxsfqZmW3Leli1wY8vPBNNCQ
- type: f1
value: 0.9137168141592922
name: F1
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiMGU4MjNmOGYwZjZjMDQ1ZTkyZTA4YTc1MWYwOTM0NDM4ZWY1ZGVkNDY5MzNhYTQyZGFlNzIyZmUwMDg3NDU0NyIsInZlcnNpb24iOjF9.mW5ftkq50Se58M-jm6a2Pu93QeKa3MfV7xcBwvG3PSB_KNJxZWTCpfMQp-Cmx_EMlmI2siKOyd8akYjJUrzJCA
- type: loss
value: 0.39013850688934326
name: loss
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiMTZiNzAyZDc0MzUzMmE1MGJiN2JlYzFiODE5ZTNlNGE4MmI4YzRiMTc2ODEzMTUwZmEzOTgxNzc4YjJjZTRmNiIsInZlcnNpb24iOjF9.VqIC7uYC-ZZ8ss9zQOlRV39YVOOLc5R36sIzCcVz8lolh61ux_5djm2XjpP6ARc6KqEnXC4ZtfNXsX2HZfrtCQ
- task:
type: text-classification
name: Text Classification
dataset:
name: sst2
type: sst2
config: default
split: train
metrics:
- type: accuracy
value: 0.9885521685548412
name: Accuracy
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiY2I3NzU3YzhmMDkxZTViY2M3OTY1NmI0ZTdmMDQxNjNjYzJiZmQxNzczM2E4YmExYTY5ODY0NDBkY2I4ZjNkOCIsInZlcnNpb24iOjF9.4Gtk3FeVc9sPWSqZIaeUXJ9oVlPzm-NmujnWpK2y5s1Vhp1l6Y1pK5_78wW0-NxSvQqV6qd5KQf_OAEpVAkQDA
- type: precision
value: 0.9881965062029833
name: Precision Macro
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiZDdlZDMzY2I3MTAwYTljNmM4MGMyMzU2YjAzZDg1NDYwN2ZmM2Y5OWZhMjUyMGJiNjY1YmZiMzFhMDI2ODFhNyIsInZlcnNpb24iOjF9.cqmv6yBxu4St2mykRWrZ07tDsiSLdtLTz2hbqQ7Gm1rMzq9tdlkZ8MyJRxtME_Y8UaOG9rs68pV-gKVUs8wABw
- type: precision
value: 0.9885521685548412
name: Precision Micro
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiZjFlYzAzNmE1YjljNjUwNzBjZjEzZDY0ZDQyMmY5ZWM2OTBhNzNjYjYzYTk1YWE1NjU3YTMxZDQwOTE1Y2FkNyIsInZlcnNpb24iOjF9.jnCHOkUHuAOZZ_ZMVOnetx__OVJCS6LOno4caWECAmfrUaIPnPNV9iJ6izRO3sqkHRmxYpWBb-27GJ4N3LU-BQ
- type: precision
value: 0.9885639626373408
name: Precision Weighted
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiZGUyODFjNjBlNTE2MTY3ZDAxOGU1N2U0YjUyY2NiZjhkOGVmYThjYjBkNGU3NTRkYzkzNDQ2MmMwMjkwMWNiMyIsInZlcnNpb24iOjF9.zTNabMwApiZyXdr76QUn7WgGB7D7lP-iqS3bn35piqVTNsv3wnKjZOaKFVLIUvtBXq4gKw7N2oWxvWc4OcSNDg
- type: recall
value: 0.9886145346602994
name: Recall Macro
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiNTU1YjlhODU3YTkyNTdiZDcwZGFlZDBiYjY0N2NjMGM2NTRiNjQ3MDNjNGMxOWY2ZGQ4NWU1YmMzY2UwZTI3YSIsInZlcnNpb24iOjF9.xaLPY7U-wHsJ3DDui1yyyM-xWjL0Jz5puRThy7fczal9x05eKEQ9s0a_WD-iLmapvJs0caXpV70hDe2NLcs-DA
- type: recall
value: 0.9885521685548412
name: Recall Micro
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiODE0YTU0MDBlOGY4YzU0MjY5MzA3OTk2OGNhOGVkMmU5OGRjZmFiZWI2ZjY5ODEzZTQzMTI0N2NiOTVkNDliYiIsInZlcnNpb24iOjF9.SOt1baTBbuZRrsvGcak2sUwoTrQzmNCbyV2m1_yjGsU48SBH0NcKXicidNBSnJ6ihM5jf_Lv_B5_eOBkLfNWDQ
- type: recall
value: 0.9885521685548412
name: Recall Weighted
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiZWNkNmM0ZGRlNmYxYzIwNDk4OTI5MzIwZWU1NzZjZDVhMDcyNDFlMjBhNDQxODU5OWMwMWNhNGEzNjY3ZGUyOSIsInZlcnNpb24iOjF9.b15Fh70GwtlG3cSqPW-8VEZT2oy0CtgvgEOtWiYonOovjkIQ4RSLFVzVG-YfslaIyfg9RzMWzjhLnMY7Bpn2Aw
- type: f1
value: 0.9884019815052447
name: F1 Macro
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiYmM4NjQ5Yjk5ODRhYTU1MTY3MmRhZDBmODM1NTg3OTFiNWM4NDRmYjI0MzZkNmQ1MzE3MzcxODZlYzBkYTMyYSIsInZlcnNpb24iOjF9.74RaDK8nBVuGRl2Se_-hwQvP6c4lvVxGHpcCWB4uZUCf2_HoC9NT9u7P3pMJfH_tK2cpV7U3VWGgSDhQDi-UBQ
- type: f1
value: 0.9885521685548412
name: F1 Micro
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiZDRmYWRmMmQ0YjViZmQxMzhhYTUyOTE1MTc0ZDU1ZjQyZjFhMDYzYzMzZDE0NzZlYzQyOTBhMTBhNmM5NTlkMiIsInZlcnNpb24iOjF9.VMn_psdAHIZTlW6GbjERZDe8MHhwzJ0rbjV_VJyuMrsdOh5QDmko-wEvaBWNEdT0cEKsbggm-6jd3Gh81PfHAQ
- type: f1
value: 0.9885546181087554
name: F1 Weighted
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiMjUyZWFhZDZhMGQ3MzBmYmRiNDVmN2FkZDBjMjk3ODk0OTAxNGZkMWE0NzU5ZjI0NzE0NGZiNzM0N2Y2NDYyOSIsInZlcnNpb24iOjF9.YsXBhnzEEFEW6jw3mQlFUuIrW7Gabad2Ils-iunYJr-myg0heF8NEnEWABKFE1SnvCWt-69jkLza6SupeyLVCA
- type: loss
value: 0.040652573108673096
name: loss
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiZTc3YjU3MjdjMzkxODA5MjU5NGUyY2NkMGVhZDg3ZWEzMmU1YWVjMmI0NmU2OWEyZTkzMTVjNDZiYTc0YjIyNCIsInZlcnNpb24iOjF9.lA90qXZVYiILHMFlr6t6H81Oe8a-4KmeX-vyCC1BDia2ofudegv6Vb46-4RzmbtuKeV6yy6YNNXxXxqVak1pAg
---
# DistilBERT base uncased finetuned SST-2
## Table of Contents
- [Model Details](#model-details)
- [How to Get Started With the Model](#how-to-get-started-with-the-model)
- [Uses](#uses)
- [Risks, Limitations and Biases](#risks-limitations-and-biases)
- [Training](#training)
## Model Details
**Model Description:** This model is a fine-tuned checkpoint of [DistilBERT-base-uncased](https://huggingface.co/distilbert-base-uncased), fine-tuned on SST-2.
This model reaches an accuracy of 91.3 on the dev set (for comparison, the BERT bert-base-uncased version reaches an accuracy of 92.7).
- **Developed by:** Hugging Face
- **Model Type:** Text Classification
- **Language(s):** English
- **License:** Apache-2.0
- **Parent Model:** For more details about DistilBERT, we encourage users to check out [this model card](https://huggingface.co/distilbert-base-uncased).
- **Resources for more information:**
- [Model Documentation](https://huggingface.co/docs/transformers/main/en/model_doc/distilbert#transformers.DistilBertForSequenceClassification)
- [DistilBERT paper](https://arxiv.org/abs/1910.01108)
## How to Get Started With the Model
Example of single-label classification:
```python
import torch
from transformers import DistilBertTokenizer, DistilBertForSequenceClassification
tokenizer = DistilBertTokenizer.from_pretrained("distilbert-base-uncased-finetuned-sst-2-english")
model = DistilBertForSequenceClassification.from_pretrained("distilbert-base-uncased-finetuned-sst-2-english")
inputs = tokenizer("Hello, my dog is cute", return_tensors="pt")
with torch.no_grad():
logits = model(**inputs).logits
predicted_class_id = logits.argmax().item()
model.config.id2label[predicted_class_id]
```
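The snippet above returns the index of the highest-scoring logit. To turn raw logits into class probabilities, apply a softmax; a minimal sketch in plain Python (the example logit values and the `id2label` mapping for this checkpoint are illustrative assumptions):

```python
import math

def softmax(logits):
    # Subtract the max logit for numerical stability before exponentiating.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Label mapping assumed for this checkpoint (index 0 -> negative, 1 -> positive).
id2label = {0: "NEGATIVE", 1: "POSITIVE"}

logits = [-2.1, 2.4]  # hypothetical logits for a positive-sounding sentence
probs = softmax(logits)
predicted = id2label[probs.index(max(probs))]
```

The same computation is what `torch.softmax(logits, dim=-1)` performs on the model output above.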
## Uses
#### Direct Use
This model can be used for sentiment analysis: given an English sentence, it predicts a binary positive/negative label. It is already fine-tuned on SST-2, so it can be used for inference as-is; see the model hub to look for checkpoints fine-tuned on other tasks that interest you.
#### Misuse and Out-of-scope Use
The model should not be used to intentionally create hostile or alienating environments for people. In addition, the model was not trained to be factual or true representations of people or events, and therefore using the model to generate such content is out-of-scope for the abilities of this model.
## Risks, Limitations and Biases
Based on a few experiments, we observed that this model can produce biased predictions that target underrepresented populations.
For instance, for sentences like `This film was filmed in COUNTRY`, this binary classification model will give radically different probabilities for the positive label depending on the country (0.89 if the country is France, but 0.08 if the country is Afghanistan) when nothing in the input indicates such a strong semantic shift. In this [colab](https://colab.research.google.com/gist/ageron/fb2f64fb145b4bc7c49efc97e5f114d3/biasmap.ipynb), [Aurélien Géron](https://twitter.com/aureliengeron) made an interesting map plotting these probabilities for each country.
<img src="https://huggingface.co/distilbert-base-uncased-finetuned-sst-2-english/resolve/main/map.jpeg" alt="Map of positive probabilities per country." width="500"/>
We strongly advise users to thoroughly probe these aspects on their use-cases in order to evaluate the risks of this model. We recommend looking at the following bias evaluation datasets as a place to start: [WinoBias](https://huggingface.co/datasets/wino_bias), [WinoGender](https://huggingface.co/datasets/super_glue), [Stereoset](https://huggingface.co/datasets/stereoset).
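A probe of the kind described above can be generated from a sentence template and a list of countries; a minimal sketch (the template follows the example in the text, the country list is illustrative):

```python
# Build probe sentences that differ only in the country mentioned.
template = "This film was filmed in {country}."
countries = ["France", "Afghanistan", "Brazil", "Japan"]
probes = [template.format(country=c) for c in countries]

# Each probe would then be scored with the classifier; comparing the
# positive-label probabilities across countries surfaces the bias,
# since nothing else in the sentence changes.
```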
## Training
#### Training Data
The authors fine-tuned the model on the Stanford Sentiment Treebank ([sst2](https://huggingface.co/datasets/sst2)) corpus.
#### Training Procedure
##### Fine-tuning hyper-parameters
- learning_rate = 1e-5
- batch_size = 32
- warmup = 600
- max_seq_length = 128
- num_train_epochs = 3.0
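The `warmup = 600` value above suggests a learning-rate schedule with a linear warmup phase; a minimal sketch of that shape (the exact schedule used by the authors, including any decay after warmup, is an assumption and is not documented here):

```python
def lr_at_step(step, base_lr=1e-5, warmup_steps=600):
    # Linear warmup from 0 to base_lr over the first `warmup_steps`
    # optimizer steps, then hold at base_lr. Post-warmup decay (if any)
    # is omitted because the original training script is not available.
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    return base_lr
```

With `batch_size = 32` and `num_train_epochs = 3.0`, the warmup covers roughly the first 600 of the total optimizer steps over the SST-2 training set.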
| [
-0.39455512166023254,
-0.7664994597434998,
0.17849937081336975,
0.16509194672107697,
-0.4213687777519226,
-0.0027747880667448044,
-0.1830003559589386,
-0.32761508226394653,
0.10122796148061752,
0.4247645139694214,
-0.6012458205223083,
-0.6134636402130127,
-0.9000230431556702,
-0.20188122987747192,
-0.11908093839883804,
1.4349716901779175,
0.017388345673680305,
0.19627057015895844,
0.00017649999062996358,
0.009468479081988335,
-0.42075493931770325,
-0.5870155096054077,
-0.3725033104419708,
-0.3150821924209595,
0.10815275460481644,
0.30029433965682983,
0.5856735110282898,
0.18163341283798218,
0.432076632976532,
0.3041521906852722,
-0.460679829120636,
-0.13291917741298676,
-0.579167902469635,
-0.1468837559223175,
-0.1528633087873459,
-0.43070563673973083,
-0.5061169862747192,
0.3450506627559662,
0.27162477374076843,
0.6246225833892822,
-0.0177981685847044,
0.2992369532585144,
0.10641073435544968,
0.5606906414031982,
-0.28870704770088196,
0.2982610762119293,
-0.6910584568977356,
-0.05630846321582794,
-0.24140766263008118,
0.17037494480609894,
-0.47434985637664795,
-0.2934805452823639,
0.4018411338329315,
-0.3643893897533417,
0.3918888568878174,
-0.01735459454357624,
1.0571367740631104,
0.3059127926826477,
-0.46974417567253113,
-0.20859403908252716,
-0.4667978286743164,
0.7464268803596497,
-0.5381153225898743,
0.19937442243099213,
0.46745187044143677,
0.09841255098581314,
-0.13832546770572662,
-0.6955488920211792,
-0.5082835555076599,
-0.14492523670196533,
-0.056933462619781494,
0.3131434917449951,
-0.2624976634979248,
-0.04808567836880684,
0.41369062662124634,
0.3946012854576111,
-0.43355873227119446,
-0.10022295266389847,
-0.5700158476829529,
-0.3389454185962677,
0.5678173899650574,
-0.08704841881990433,
0.183237686753273,
-0.3329114615917206,
-0.6518622636795044,
-0.14835363626480103,
-0.39933285117149353,
0.004085691180080175,
0.44765928387641907,
0.339776873588562,
-0.2690736651420593,
0.6202360391616821,
-0.14262771606445312,
0.47465792298316956,
0.39691755175590515,
-0.08770941197872162,
0.5663097500801086,
-0.23002031445503235,
-0.22228668630123138,
0.14043080806732178,
0.6923131942749023,
0.47769150137901306,
0.22459469735622406,
0.039976298809051514,
0.05367925763130188,
0.255837619304657,
0.06202545762062073,
-1.0380254983901978,
-0.3720329701900482,
0.20975321531295776,
-0.3817058801651001,
-0.523870050907135,
0.1336606740951538,
-0.6942791938781738,
-0.1382533758878708,
-0.22323037683963776,
0.43168532848358154,
-0.47211989760398865,
-0.3962588310241699,
0.13454385101795197,
-0.32779574394226074,
0.024475591257214546,
0.15776750445365906,
-0.6616174578666687,
0.11765214055776596,
0.3058165907859802,
0.7591832876205444,
-0.15608423948287964,
-0.16358636319637299,
-0.1358160823583603,
-0.2138923555612564,
-0.14429807662963867,
0.4499902129173279,
-0.1508188545703888,
-0.2629219591617584,
-0.22023296356201172,
0.194838747382164,
0.11565342545509338,
-0.31736186146736145,
0.7365245223045349,
-0.38777825236320496,
0.4955316483974457,
-0.18287153542041779,
-0.4285941421985626,
-0.16679894924163818,
0.22100473940372467,
-0.6120895743370056,
1.153340220451355,
0.356952041387558,
-1.1279330253601074,
0.4589774012565613,
-0.4551559388637543,
-0.36490577459335327,
-0.14736540615558624,
0.11841396987438202,
-0.5878855586051941,
-0.04270247370004654,
0.07398687303066254,
0.42830365896224976,
-0.1100974753499031,
0.586842954158783,
-0.2632603347301483,
-0.3029768168926239,
0.19239678978919983,
-0.4257858991622925,
1.234033226966858,
0.20410676300525665,
-0.5036609172821045,
-0.10126122832298279,
-0.7253497838973999,
-0.12105635553598404,
0.07537828385829926,
-0.23028936982154846,
-0.41175389289855957,
-0.33167922496795654,
0.4078525900840759,
0.4434869587421417,
0.19995178282260895,
-0.6331462860107422,
0.18053480982780457,
-0.3517228662967682,
0.44965896010398865,
0.6642516851425171,
-0.06137717142701149,
0.41972389817237854,
-0.06992722302675247,
0.289168119430542,
0.36694929003715515,
0.19627878069877625,
0.18166372179985046,
-0.5688483119010925,
-0.7472202181816101,
-0.3337295651435852,
0.5400806069374084,
0.5999261140823364,
-0.6553006768226624,
0.6767974495887756,
-0.03429357334971428,
-0.6780361533164978,
-0.36029618978500366,
-0.0037899843882769346,
0.466594934463501,
0.6514629125595093,
0.4097729027271271,
-0.3472115099430084,
-0.555573046207428,
-0.7802368402481079,
0.08712699264287949,
-0.27035295963287354,
0.015200113877654076,
-0.07186167687177658,
0.6681923866271973,
-0.3764742612838745,
0.8431512117385864,
-0.671248733997345,
-0.3711678385734558,
-0.21799245476722717,
0.23998060822486877,
0.2906979024410248,
0.5128815174102783,
0.5992265343666077,
-0.8669005036354065,
-0.4679395258426666,
-0.4328412711620331,
-0.7112934589385986,
0.06274774670600891,
0.1343221515417099,
-0.32666099071502686,
0.341667115688324,
0.3147895336151123,
-0.6192516088485718,
0.4280968904495239,
0.41054147481918335,
-0.5385251641273499,
0.4346429109573364,
-0.030986594036221504,
-0.10124900937080383,
-1.3468687534332275,
-0.04214511066675186,
0.28363659977912903,
-0.06364244222640991,
-0.6373583674430847,
-0.14381256699562073,
0.024322854354977608,
0.06887457519769669,
-0.5955068469047546,
0.5213138461112976,
-0.37050530314445496,
0.29095882177352905,
-0.24847491085529327,
-0.18002840876579285,
0.07978535443544388,
0.5985899567604065,
0.29926779866218567,
0.5448622703552246,
0.6490444540977478,
-0.42339855432510376,
0.17415569722652435,
0.49139267206192017,
-0.4616641700267792,
0.5676307082176208,
-0.6781067252159119,
-0.026088742539286613,
-0.26158082485198975,
0.38259828090667725,
-0.8325991630554199,
-0.14316634833812714,
0.2849254906177521,
-0.5189048647880554,
0.5533652305603027,
-0.20863518118858337,
-0.4061069190502167,
-0.47487521171569824,
-0.31006884574890137,
0.20838484168052673,
0.6192159652709961,
-0.3911108672618866,
0.2889377772808075,
0.455562025308609,
-0.1183890849351883,
-0.6643891334533691,
-0.9520659446716309,
-0.22607967257499695,
-0.46978431940078735,
-0.4692792296409607,
0.46871674060821533,
-0.09687136113643646,
-0.12541522085666656,
-0.12776519358158112,
-0.08926212042570114,
-0.027568671852350235,
-0.0006596161983907223,
0.4760342240333557,
0.5116601586341858,
0.06807110458612442,
0.21556131541728973,
0.07853883504867554,
-0.20069298148155212,
-0.034882787615060806,
-0.21040064096450806,
0.4442758560180664,
-0.3436170220375061,
0.03953030705451965,
-0.3751736283302307,
0.06808639317750931,
0.33928439021110535,
0.007712363265454769,
0.6974692344665527,
0.7895979285240173,
-0.5196906328201294,
0.13154467940330505,
-0.5187159776687622,
-0.2643071115016937,
-0.4149612486362457,
0.5589550137519836,
-0.3132442831993103,
-0.740504264831543,
0.4184603989124298,
-0.0077957818284630775,
-0.19559237360954285,
0.7585350275039673,
0.742476224899292,
-0.1839454025030136,
0.8595841526985168,
0.7068747282028198,
-0.17007175087928772,
0.42257294058799744,
-0.4833884537220001,
0.014447271823883057,
-0.8271943926811218,
-0.35072043538093567,
-0.3312658965587616,
-0.19472599029541016,
-0.8998904824256897,
-0.4508382976055145,
0.22078420221805573,
0.3153958320617676,
-0.46418192982673645,
0.5980410575866699,
-0.6847987174987793,
0.31516018509864807,
0.7736015915870667,
0.2504311800003052,
0.13605201244354248,
0.1745445728302002,
-0.07265613973140717,
-0.19171665608882904,
-0.5804450511932373,
-0.5696525573730469,
1.156151533126831,
0.6976159811019897,
0.8835609555244446,
-0.11951801925897598,
0.627177357673645,
0.35524776577949524,
0.2951600253582001,
-0.5389713048934937,
0.27447161078453064,
-0.2492537796497345,
-1.0053186416625977,
-0.3220728039741516,
-0.284976065158844,
-0.6869947910308838,
0.13774341344833374,
-0.17346921563148499,
-0.7034978866577148,
0.29756757616996765,
0.05482909828424454,
-0.22495217621326447,
0.20147234201431274,
-0.7646337151527405,
0.9558120369911194,
-0.3845798671245575,
-0.3947889804840088,
0.21290764212608337,
-0.8314300179481506,
0.34835296869277954,
0.008652866818010807,
0.03712060675024986,
-0.2491636425256729,
0.30205512046813965,
0.8259046673774719,
-0.3167484998703003,
1.0511865615844727,
-0.33720797300338745,
0.10125981271266937,
0.5311238765716553,
-0.16211140155792236,
0.38044974207878113,
0.1514843851327896,
-0.182913139462471,
0.5634748935699463,
0.03464331477880478,
-0.3477303385734558,
-0.19886454939842224,
0.5762702226638794,
-1.026541829109192,
-0.28037387132644653,
-0.792603611946106,
-0.376328706741333,
-0.09318941831588745,
0.20931139588356018,
0.6873046159744263,
0.22798196971416473,
-0.3623468279838562,
0.08413100242614746,
0.7771492004394531,
-0.23854395747184753,
0.06471190601587296,
0.3295409083366394,
-0.02086302824318409,
-0.3811272084712982,
0.7277677655220032,
0.07794269919395447,
0.23282873630523682,
0.2320481538772583,
0.27249157428741455,
-0.5555741786956787,
-0.25380465388298035,
-0.50403892993927,
0.11110781133174896,
-0.7330042123794556,
-0.3301108777523041,
-0.6725967526435852,
-0.35014107823371887,
-0.538817286491394,
0.07839109003543854,
-0.3597587049007416,
-0.474581778049469,
-0.4385147988796234,
-0.3924425542354584,
0.5394052267074585,
0.4159330129623413,
-0.06583932042121887,
0.45719701051712036,
-0.2818510830402374,
0.19009330868721008,
0.11231238394975662,
0.3308224081993103,
-0.3863844573497772,
-0.8089033961296082,
0.01585964486002922,
0.27014926075935364,
-0.5139455795288086,
-0.9419294595718384,
0.2549116909503937,
0.07589007169008255,
0.4515518248081207,
0.3587566018104553,
0.2434529811143875,
0.4237591624259949,
-0.3333512842655182,
0.6499559283256531,
0.32229533791542053,
-0.8538691401481628,
0.7514451146125793,
-0.27120962738990784,
0.22763454914093018,
0.8347126841545105,
0.708276093006134,
-0.32926034927368164,
-0.3675524592399597,
-0.7862712740898132,
-0.9274308085441589,
0.8006370067596436,
0.42276862263679504,
0.18496744334697723,
0.051794663071632385,
0.22802972793579102,
0.1851537972688675,
0.31028348207473755,
-1.054742693901062,
-0.430080384016037,
-0.48457756638526917,
-0.24460282921791077,
-0.19724616408348083,
-0.43048757314682007,
-0.09888813644647598,
-0.48284563422203064,
0.965196430683136,
0.010355142876505852,
0.31297680735588074,
0.17831766605377197,
-0.12980212271213531,
0.016689103096723557,
0.11571813374757767,
0.418028861284256,
0.3970281779766083,
-0.6394710540771484,
0.0954812616109848,
0.20128309726715088,
-0.6141774654388428,
0.09389463067054749,
0.36967793107032776,
-0.3784502446651459,
0.1527315378189087,
0.1575489342212677,
0.9898125529289246,
-0.04617052525281906,
-0.4174819886684418,
0.5550938248634338,
-0.0022055909503251314,
-0.37150824069976807,
-0.4099050760269165,
-0.13426144421100616,
0.1533985733985901,
0.24979521334171295,
0.17313478887081146,
0.1090407520532608,
0.1716473549604416,
-0.7007641792297363,
0.23879031836986542,
0.3497660458087921,
-0.6393377780914307,
-0.09963985532522202,
0.724421501159668,
0.22263377904891968,
-0.038176558911800385,
0.7059614658355713,
-0.38744476437568665,
-0.7167786955833435,
0.7166312336921692,
0.48168691992759705,
0.7874501943588257,
-0.08428660035133362,
0.43145886063575745,
0.6438069343566895,
0.4749237596988678,
-0.15937620401382446,
0.1083986759185791,
0.1479026973247528,
-0.6971874833106995,
-0.08320046216249466,
-0.7961297631263733,
-0.1428368091583252,
0.23568958044052124,
-0.6774892210960388,
0.4504639804363251,
-0.25995105504989624,
-0.42738524079322815,
0.049422234296798706,
0.14435474574565887,
-0.7880160212516785,
0.37578752636909485,
0.21199558675289154,
0.8863317966461182,
-1.1896343231201172,
0.8667819499969482,
0.6460903882980347,
-0.6439225077629089,
-0.6075469255447388,
0.022814199328422546,
0.04924188554286957,
-0.5715823769569397,
0.693006157875061,
0.4285871684551239,
0.2427816092967987,
-0.21908684074878693,
-0.5228938460350037,
-0.8018374443054199,
1.1242660284042358,
0.17316266894340515,
-0.5742180943489075,
0.00021800631657242775,
0.17416661977767944,
0.7263214588165283,
-0.23792937397956848,
0.5445670485496521,
0.5020425915718079,
0.2676754891872406,
0.3124370574951172,
-0.8095596432685852,
0.10734791308641434,
-0.18750225007534027,
0.1290258914232254,
-0.0107869952917099,
-0.7772955298423767,
0.9041999578475952,
-0.20707401633262634,
-0.02271484024822712,
-0.13135401904582977,
0.6288811564445496,
0.31770122051239014,
0.4150295853614807,
0.4585939943790436,
0.6962486505508423,
0.6728048324584961,
-0.2809520661830902,
0.7057863473892212,
-0.09499859064817429,
0.5916179418563843,
1.2501112222671509,
-0.17249907553195953,
0.6568516492843628,
0.4329962134361267,
-0.3309740126132965,
0.5949482321739197,
0.9500848650932312,
-0.23044386506080627,
0.7461426258087158,
0.2807522714138031,
-0.027449175715446472,
-0.06331096589565277,
0.0486544594168663,
-0.5780562162399292,
0.48799142241477966,
0.2853248119354248,
-0.4778502583503723,
-0.1640867292881012,
0.16355033218860626,
0.16301605105400085,
-0.17076648771762848,
-0.0943361297249794,
0.5779903531074524,
0.03205307573080063,
-0.649949312210083,
0.44750723242759705,
0.12455794215202332,
0.969384491443634,
-0.49626362323760986,
0.11619332432746887,
-0.27353107929229736,
0.2573350965976715,
-0.11288169026374817,
-0.7391131520271301,
0.2765193581581116,
0.11003995686769485,
-0.24954254925251007,
-0.20268121361732483,
0.8342147469520569,
-0.5231289267539978,
-0.8830459713935852,
0.22545881569385529,
0.2999543845653534,
0.31030574440956116,
-0.24028263986110687,
-0.9547195434570312,
0.0019217487424612045,
0.1119103655219078,
-0.39386865496635437,
0.28009936213493347,
0.3974544405937195,
-0.1442578285932541,
0.42363104224205017,
0.43116748332977295,
-0.1231466755270958,
-0.06306044012308121,
-0.03213053569197655,
0.8441619873046875,
-0.4236343502998352,
-0.37915828824043274,
-0.7179392576217651,
0.6112212538719177,
-0.19926083087921143,
-0.45497825741767883,
0.7169384956359863,
0.7449997663497925,
1.1327126026153564,
-0.19620728492736816,
0.8667671084403992,
-0.34750545024871826,
0.3689331114292145,
-0.397773414850235,
0.7360743284225464,
-0.5064845085144043,
-0.07119663804769516,
-0.45821693539619446,
-0.8655080199241638,
0.015232192352414131,
0.7159936428070068,
-0.27841541171073914,
0.159575954079628,
0.5865151882171631,
0.7578409314155579,
-0.10622061043977737,
-0.05587517470121384,
0.02962624840438366,
0.2659987211227417,
-0.04438737779855728,
0.46568575501441956,
0.5900557041168213,
-0.7263216376304626,
0.4247487783432007,
-0.623539924621582,
-0.486054003238678,
-0.24097424745559692,
-0.855069100856781,
-1.0891088247299194,
-0.6968339085578918,
-0.6068265438079834,
-0.6957365870475769,
-0.022601351141929626,
0.8182079792022705,
0.7444773316383362,
-0.8511559367179871,
-0.0179264098405838,
-0.05812692269682884,
-0.02884083427488804,
-0.046011921018362045,
-0.23507431149482727,
0.4928792417049408,
0.010456881485879421,
-0.8979527950286865,
-0.17873305082321167,
-0.11087273806333542,
0.28452351689338684,
-0.24467717111110687,
-0.046762801706790924,
-0.2674524486064911,
-0.2437964379787445,
0.6554845571517944,
0.04008529707789421,
-0.7156467437744141,
-0.06401929259300232,
-0.044698525220155716,
-0.24579370021820068,
-0.13081791996955872,
0.344691663980484,
-0.5213624238967896,
0.4219704270362854,
0.40260621905326843,
0.2642192840576172,
0.7712266445159912,
0.02693791687488556,
0.1498466283082962,
-0.7748450040817261,
0.39462152123451233,
0.14672552049160004,
0.29643991589546204,
0.31147056818008423,
-0.5258164405822754,
0.6679111123085022,
0.3662897050380707,
-0.4399418830871582,
-0.7042784094810486,
0.07325100153684616,
-1.235612154006958,
-0.359758198261261,
1.313576579093933,
-0.03101460263133049,
-0.14459317922592163,
0.09457873553037643,
-0.39994198083877563,
0.5377811789512634,
-0.3393193781375885,
0.8355849385261536,
0.919766366481781,
0.009771095588803291,
0.14140470325946808,
-0.5477043986320496,
0.524770200252533,
0.2952578365802765,
-0.6089582443237305,
-0.15377487242221832,
0.4339662194252014,
0.6385073661804199,
0.2001967579126358,
0.6460119485855103,
-0.1832330822944641,
0.007327844854444265,
0.0644063800573349,
0.2941958010196686,
-0.028973015025258064,
-0.2902163565158844,
-0.1285143345594406,
-0.11102171987295151,
-0.0635223537683487,
-0.32445254921913147
] |
gpt2 | null | "2023-06-30T02:19:43" | 21,850,807 | 1,516 | transformers | [
"transformers",
"pytorch",
"tf",
"jax",
"tflite",
"rust",
"onnx",
"safetensors",
"gpt2",
"text-generation",
"exbert",
"en",
"doi:10.57967/hf/0039",
"license:mit",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | "2022-03-02T23:29:04" | ---
language: en
tags:
- exbert
license: mit
---
# GPT-2
Test the whole generation capabilities here: https://transformer.huggingface.co/doc/gpt2-large
Pretrained model on English language using a causal language modeling (CLM) objective. It was introduced in
[this paper](https://d4mucfpksywv.cloudfront.net/better-language-models/language_models_are_unsupervised_multitask_learners.pdf)
and first released at [this page](https://openai.com/blog/better-language-models/).
Disclaimer: The team releasing GPT-2 also wrote a
[model card](https://github.com/openai/gpt-2/blob/master/model_card.md) for their model. Content from this model card
has been written by the Hugging Face team to complete the information they provided and give specific examples of bias.
## Model description
GPT-2 is a transformers model pretrained on a very large corpus of English data in a self-supervised fashion. This
means it was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots
of publicly available data) with an automatic process to generate inputs and labels from those texts. More precisely,
it was trained to guess the next word in sentences.
Inputs are sequences of continuous text of a certain length, and the targets are the same sequence shifted one token
(a word or piece of a word) to the right. The model internally uses a masking mechanism to ensure that the
prediction for token `i` uses only the inputs from `1` to `i`, never the future tokens.
This way, the model learns an inner representation of the English language that can then be used to extract features
useful for downstream tasks. However, the model is best at what it was pretrained for: generating text from a
prompt.
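The shifted-target setup described above can be illustrated without any model at all. A minimal sketch (the token strings are hypothetical stand-ins for real tokenizer output):

```python
# Toy illustration of the causal LM objective: targets are the inputs
# shifted one position to the right.
tokens = ["The", "cat", "sat", "on", "the", "mat"]
inputs, targets = tokens[:-1], tokens[1:]

# Causal mask: position i may attend to positions 0..i only.
n = len(inputs)
mask = [[1 if j <= i else 0 for j in range(n)] for i in range(n)]
```

At training time, the model's prediction at position `i` is scored against `targets[i]`, which is exactly the next token of the original sequence.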
This is the **smallest** version of GPT-2, with 124M parameters.
**Related Models:** [GPT-2 Large](https://huggingface.co/gpt2-large), [GPT-2 Medium](https://huggingface.co/gpt2-medium) and [GPT-2 XL](https://huggingface.co/gpt2-xl)
## Intended uses & limitations
You can use the raw model for text generation or fine-tune it to a downstream task. See the
[model hub](https://huggingface.co/models?filter=gpt2) to look for fine-tuned versions on a task that interests you.
### How to use
You can use this model directly with a pipeline for text generation. Since the generation relies on some randomness, we
set a seed for reproducibility:
```python
>>> from transformers import pipeline, set_seed
>>> generator = pipeline('text-generation', model='gpt2')
>>> set_seed(42)
>>> generator("Hello, I'm a language model,", max_length=30, num_return_sequences=5)
[{'generated_text': "Hello, I'm a language model, a language for thinking, a language for expressing thoughts."},
{'generated_text': "Hello, I'm a language model, a compiler, a compiler library, I just want to know how I build this kind of stuff. I don"},
{'generated_text': "Hello, I'm a language model, and also have more than a few of your own, but I understand that they're going to need some help"},
{'generated_text': "Hello, I'm a language model, a system model. I want to know my language so that it might be more interesting, more user-friendly"},
{'generated_text': 'Hello, I\'m a language model, not a language model"\n\nThe concept of "no-tricks" comes in handy later with new'}]
```
Here is how to use this model to get the features of a given text in PyTorch:
```python
from transformers import GPT2Tokenizer, GPT2Model
tokenizer = GPT2Tokenizer.from_pretrained('gpt2')
model = GPT2Model.from_pretrained('gpt2')
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
and in TensorFlow:
```python
from transformers import GPT2Tokenizer, TFGPT2Model
tokenizer = GPT2Tokenizer.from_pretrained('gpt2')
model = TFGPT2Model.from_pretrained('gpt2')
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
### Limitations and bias
The training data used for this model has not been released as a dataset one can browse. We know it contains a lot of
unfiltered content from the internet, which is far from neutral. As the OpenAI team themselves point out in their
[model card](https://github.com/openai/gpt-2/blob/master/model_card.md#out-of-scope-use-cases):
> Because large-scale language models like GPT-2 do not distinguish fact from fiction, we don’t support use-cases
> that require the generated text to be true.
>
> Additionally, language models like GPT-2 reflect the biases inherent to the systems they were trained on, so we do
> not recommend that they be deployed into systems that interact with humans unless the deployers first carry out a
> study of biases relevant to the intended use-case. We found no statistically significant difference in gender, race,
> and religious bias probes between 774M and 1.5B, implying all versions of GPT-2 should be approached with similar
> levels of caution around use cases that are sensitive to biases around human attributes.
Here's an example of how the model can have biased predictions:
```python
>>> from transformers import pipeline, set_seed
>>> generator = pipeline('text-generation', model='gpt2')
>>> set_seed(42)
>>> generator("The White man worked as a", max_length=10, num_return_sequences=5)
[{'generated_text': 'The White man worked as a mannequin for'},
{'generated_text': 'The White man worked as a maniser of the'},
{'generated_text': 'The White man worked as a bus conductor by day'},
{'generated_text': 'The White man worked as a plumber at the'},
{'generated_text': 'The White man worked as a journalist. He had'}]
>>> set_seed(42)
>>> generator("The Black man worked as a", max_length=10, num_return_sequences=5)
[{'generated_text': 'The Black man worked as a man at a restaurant'},
{'generated_text': 'The Black man worked as a car salesman in a'},
{'generated_text': 'The Black man worked as a police sergeant at the'},
{'generated_text': 'The Black man worked as a man-eating monster'},
{'generated_text': 'The Black man worked as a slave, and was'}]
```
This bias will also affect all fine-tuned versions of this model.
## Training data
The OpenAI team wanted to train this model on a corpus as large as possible. To build it, they scraped all the web
pages from outbound links on Reddit which received at least 3 karma. Note that all Wikipedia pages were removed from
this dataset, so the model was not trained on any part of Wikipedia. The resulting dataset (called WebText) weighs
in at 40GB of text but has not been publicly released. You can find a list of the top 1,000 domains present in WebText
[here](https://github.com/openai/gpt-2/blob/master/domains.txt).
## Training procedure
### Preprocessing
The texts are tokenized using a byte-level version of Byte Pair Encoding (BPE), which handles unicode characters,
with a vocabulary size of 50,257. The inputs are sequences of 1024 consecutive tokens.
The larger model was trained on 256 cloud TPU v3 cores. The training duration was not disclosed, nor were the exact
details of training.
## Evaluation results
The model achieves the following results without any fine-tuning (zero-shot):
| Dataset | LAMBADA | LAMBADA | CBT-CN | CBT-NE | WikiText2 | PTB | enwiki8 | text8 | WikiText103 | 1BW |
|:--------:|:-------:|:-------:|:------:|:------:|:---------:|:------:|:-------:|:------:|:-----------:|:-----:|
| (metric) | (PPL) | (ACC) | (ACC) | (ACC) | (PPL) | (PPL) | (BPB) | (BPC) | (PPL) | (PPL) |
| | 35.13 | 45.99 | 87.65 | 83.4 | 29.41 | 65.85 | 1.16 | 1.17 | 37.50 | 75.20 |
### BibTeX entry and citation info
```bibtex
@article{radford2019language,
title={Language Models are Unsupervised Multitask Learners},
author={Radford, Alec and Wu, Jeff and Child, Rewon and Luan, David and Amodei, Dario and Sutskever, Ilya},
year={2019}
}
```
<a href="https://huggingface.co/exbert/?model=gpt2">
<img width="300px" src="https://cdn-media.huggingface.co/exbert/button.png">
</a>
| [
-0.26800060272216797,
-0.7213711738586426,
0.30211585760116577,
-0.029406607151031494,
-0.25600627064704895,
-0.30617034435272217,
-0.3942148983478546,
-0.5192710161209106,
-0.10086898505687714,
0.30821824073791504,
-0.47000324726104736,
-0.2688843607902527,
-0.7264645099639893,
-0.032964274287223816,
-0.2630700170993805,
1.3902673721313477,
-0.1165480837225914,
0.014302617870271206,
0.07921174168586731,
0.07556425780057907,
-0.34821563959121704,
-0.4475557506084442,
-0.626145601272583,
-0.44711577892303467,
0.3113221824169159,
0.11196576803922653,
0.5982396602630615,
0.6187265515327454,
0.19330264627933502,
0.18577978014945984,
-0.1108967736363411,
0.038254812359809875,
-0.4871658384799957,
-0.1758861094713211,
-0.19592522084712982,
-0.32758116722106934,
-0.31916528940200806,
0.21157662570476532,
0.49687138199806213,
0.38406431674957275,
0.14179302752017975,
0.3340820372104645,
0.29046663641929626,
0.28503215312957764,
-0.3820275366306305,
0.29248374700546265,
-0.5449777841567993,
-0.1045522689819336,
-0.37312787771224976,
0.09404546022415161,
-0.40795403718948364,
-0.12326181679964066,
0.10076796263456345,
-0.4875422716140747,
0.40994688868522644,
-0.09439310431480408,
1.1653259992599487,
0.24578028917312622,
-0.4644908607006073,
-0.19272516667842865,
-0.699135959148407,
0.7981328368186951,
-0.6962667107582092,
0.31467753648757935,
0.46429169178009033,
0.061448149383068085,
-0.024337971583008766,
-0.843987762928009,
-0.730614185333252,
-0.16626375913619995,
-0.29180431365966797,
0.2588072717189789,
-0.06669306755065918,
-0.05579200014472008,
0.36462417244911194,
0.3002094030380249,
-0.851883590221405,
-0.012264401651918888,
-0.4588051736354828,
-0.2986520528793335,
0.48013177514076233,
-0.10887100547552109,
0.35492467880249023,
-0.3975319266319275,
-0.3588367998600006,
-0.2832241952419281,
-0.5684046149253845,
-0.07404550164937973,
0.435140997171402,
0.25759562849998474,
-0.29499393701553345,
0.6803699731826782,
-0.0020231055095791817,
0.5221033692359924,
-0.06638520210981369,
-0.14566990733146667,
0.3320988118648529,
-0.48691725730895996,
-0.12801867723464966,
-0.23632092773914337,
1.1070319414138794,
0.26391366124153137,
0.4432751536369324,
-0.11973480135202408,
-0.2031804472208023,
0.12757743895053864,
0.05614260211586952,
-0.8937644362449646,
-0.3307704031467438,
0.19552700221538544,
-0.4306071996688843,
-0.3270542323589325,
0.06010166183114052,
-0.8052879571914673,
-0.045688048005104065,
-0.15496569871902466,
0.3911561071872711,
-0.4507558047771454,
-0.4880216419696808,
-0.23811085522174835,
-0.23508352041244507,
0.23662103712558746,
-0.06764927506446838,
-1.0571582317352295,
0.14638210833072662,
0.5997335910797119,
0.8951951265335083,
0.01524875033646822,
-0.4092927873134613,
-0.14700202643871307,
-0.0276336669921875,
-0.13536247611045837,
0.49413859844207764,
-0.26020628213882446,
-0.13727353513240814,
-0.09805407375097275,
0.01550777442753315,
-0.0613178052008152,
-0.2888031005859375,
0.4719673693180084,
-0.39514440298080444,
0.6220029592514038,
-0.22662149369716644,
-0.45036840438842773,
-0.0927693173289299,
0.04494751617312431,
-0.46900320053100586,
1.2078648805618286,
0.39375951886177063,
-1.0375611782073975,
0.38671526312828064,
-0.752838134765625,
-0.39512380957603455,
-0.12920862436294556,
-0.09098172932863235,
-0.5057074427604675,
-0.05609443038702011,
0.17857997119426727,
0.297180712223053,
-0.46523356437683105,
0.40792664885520935,
-0.11918146908283234,
-0.21414433419704437,
0.13202640414237976,
-0.38899436593055725,
1.0884586572647095,
0.2816142439842224,
-0.6598138213157654,
-0.06511913239955902,
-0.5095223784446716,
0.08556293696165085,
0.4092918038368225,
-0.347924143075943,
-0.13130274415016174,
-0.06556221097707748,
0.3314511179924011,
0.35934340953826904,
0.19003872573375702,
-0.5077027082443237,
0.22835850715637207,
-0.486895352602005,
0.6538479328155518,
0.6057867407798767,
-0.12694789469242096,
0.32181406021118164,
-0.18201051652431488,
0.2849946618080139,
-0.01680440455675125,
0.0934966653585434,
-0.05431259050965309,
-0.7667784094810486,
-0.7437343597412109,
-0.11866719275712967,
0.47263744473457336,
0.7276293039321899,
-0.773720920085907,
0.44414398074150085,
-0.2846788465976715,
-0.4806286692619324,
-0.4828583896160126,
0.10368817299604416,
0.6313635110855103,
0.5151225924491882,
0.43877390027046204,
-0.22405023872852325,
-0.5790882110595703,
-0.7968162894248962,
-0.28335240483283997,
-0.3306891620159149,
-0.1796269565820694,
0.1642143279314041,
0.7243932485580444,
-0.3068493604660034,
0.9339622259140015,
-0.5619862079620361,
-0.21537292003631592,
-0.38102278113365173,
0.19944939017295837,
0.14170153439044952,
0.5326336622238159,
0.5184351205825806,
-0.6424381136894226,
-0.4989464282989502,
-0.120135597884655,
-0.8040398955345154,
-0.08032265305519104,
0.0239741038531065,
-0.1285729855298996,
0.32160380482673645,
0.2774953246116638,
-0.8789204955101013,
0.27267464995384216,
0.48421773314476013,
-0.49706339836120605,
0.6610094904899597,
-0.12826049327850342,
-0.2625684440135956,
-1.3356661796569824,
0.3465004861354828,
0.1355881243944168,
-0.12252910435199738,
-0.7671564817428589,
0.13945402204990387,
0.04276818782091141,
-0.07160677760839462,
-0.2979741394519806,
0.7875285744667053,
-0.5081168413162231,
-0.00751293171197176,
-0.19843637943267822,
0.07589121162891388,
-0.12013852596282959,
0.64971524477005,
0.014435765333473682,
0.9322414994239807,
0.411895215511322,
-0.463489830493927,
0.13715112209320068,
0.29597511887550354,
-0.4007657766342163,
0.24701716005802155,
-0.7747828364372253,
0.2775854468345642,
-0.11215464025735855,
0.21461786329746246,
-0.9456405639648438,
-0.3721058666706085,
0.3128392696380615,
-0.6385411024093628,
0.4225650131702423,
-0.17437367141246796,
-0.6571202874183655,
-0.5477219820022583,
-0.14025536179542542,
0.4077211618423462,
0.7857695817947388,
-0.43490952253341675,
0.3347192406654358,
0.4391508996486664,
-0.13149046897888184,
-0.5604321956634521,
-0.8165533542633057,
-0.05467802658677101,
-0.11488664895296097,
-0.5928171277046204,
0.3433428406715393,
0.08607473224401474,
-0.07673565298318863,
-0.03740604966878891,
0.24864710867404938,
-0.06723125278949738,
-0.0037849280051887035,
0.19491885602474213,
0.3150444030761719,
-0.13385364413261414,
0.08810748904943466,
0.0042253644205629826,
-0.17629386484622955,
-0.00936805084347725,
-0.5585521459579468,
0.725403368473053,
0.05359577015042305,
-0.05226777121424675,
-0.40876853466033936,
0.14762000739574432,
0.3351588845252991,
-0.25821423530578613,
0.7117131352424622,
1.010573387145996,
-0.4582613706588745,
0.0018501841695979238,
-0.4366331100463867,
-0.2912878692150116,
-0.4026377499103546,
0.6927090287208557,
-0.3437272906303406,
-0.8904018998146057,
0.40723150968551636,
0.3026493191719055,
0.1341148465871811,
0.8081094622612,
0.7567421793937683,
0.16100551187992096,
1.0377908945083618,
0.5130167603492737,
-0.11988598108291626,
0.4114222824573517,
-0.4062674045562744,
0.17023585736751556,
-0.906610906124115,
-0.19863179326057434,
-0.4266345798969269,
-0.1311579942703247,
-0.7718591690063477,
-0.40216314792633057,
0.23521974682807922,
0.22589705884456635,
-0.4612140953540802,
0.4834059774875641,
-0.6199231743812561,
0.33947259187698364,
0.7517991662025452,
0.07817743718624115,
-0.022676769644021988,
0.11118581146001816,
-0.1992121934890747,
-0.00350290653295815,
-0.5338294506072998,
-0.5187957882881165,
1.2871243953704834,
0.43834319710731506,
0.4167836606502533,
0.028195952996611595,
0.42767831683158875,
0.03170802444219589,
0.3277318775653839,
-0.5097234845161438,
0.34809380769729614,
-0.277678519487381,
-0.8296948075294495,
-0.317909300327301,
-0.501082181930542,
-0.8991039991378784,
0.23071198165416718,
-0.02957106940448284,
-0.8976508378982544,
-0.0639822781085968,
0.18651573359966278,
-0.18512602150440216,
0.3927589952945709,
-0.8144452571868896,
1.0133906602859497,
-0.18981565535068512,
-0.35401830077171326,
0.044074300676584244,
-0.6978777050971985,
0.449972540140152,
-0.01484525017440319,
0.07528553158044815,
0.13486096262931824,
0.09085747599601746,
0.9053322076797485,
-0.6245647072792053,
0.916256844997406,
-0.29548200964927673,
-0.005859556142240763,
0.5056777596473694,
-0.08805472403764725,
0.534161388874054,
-0.044477030634880066,
0.034415747970342636,
0.3798922598361969,
-0.22214993834495544,
-0.4134658873081207,
-0.26896917819976807,
0.6071754693984985,
-1.184957504272461,
-0.36317434906959534,
-0.44240161776542664,
-0.4715040624141693,
0.04819166660308838,
0.3914695084095001,
0.7298160791397095,
0.3964495062828064,
-0.08616654574871063,
-0.09579665213823318,
0.4138718247413635,
-0.2435775250196457,
0.4494536221027374,
0.15433833003044128,
-0.12420237064361572,
-0.4439282715320587,
0.8603357672691345,
0.1618364453315735,
0.26865407824516296,
0.35537195205688477,
0.21158486604690552,
-0.49669885635375977,
-0.3594887852668762,
-0.6087033152580261,
0.4384844899177551,
-0.5510559678077698,
-0.10174434632062912,
-0.8119768500328064,
-0.3457333743572235,
-0.6339358687400818,
0.2323945164680481,
-0.2947476804256439,
-0.4493906795978546,
-0.39651229977607727,
-0.15983577072620392,
0.2917042672634125,
0.8486515283584595,
0.0418647825717926,
0.39577656984329224,
-0.4636988341808319,
0.22048842906951904,
0.4037657082080841,
0.37119394540786743,
0.009919606149196625,
-0.7324863076210022,
-0.1855265349149704,
0.24214671552181244,
-0.4823529124259949,
-0.8280507326126099,
0.34308579564094543,
0.08988941460847855,
0.3521391749382019,
0.28812000155448914,
-0.19161829352378845,
0.4323657751083374,
-0.4473225772380829,
1.0535187721252441,
0.18685311079025269,
-0.8218963146209717,
0.50159752368927,
-0.3928958475589752,
0.2882399559020996,
0.33473658561706543,
0.2575318515300751,
-0.5158659815788269,
-0.28199172019958496,
-0.6583702564239502,
-0.8613508939743042,
0.9132770895957947,
0.4461021423339844,
0.17654310166835785,
-0.06547531485557556,
0.4186922311782837,
0.062362831085920334,
0.11308256536722183,
-1.0410906076431274,
-0.3902961313724518,
-0.5319854617118835,
-0.31022611260414124,
-0.20339155197143555,
-0.44128620624542236,
0.0660208985209465,
-0.20821432769298553,
0.7967042326927185,
-0.043552473187446594,
0.6466165781021118,
0.12722748517990112,
-0.2520384192466736,
0.04633825272321701,
0.1593116819858551,
0.6545462012290955,
0.5236353278160095,
-0.08097896724939346,
0.11724525690078735,
0.09723123908042908,
-0.7058572173118591,
0.006657379679381847,
0.2274378389120102,
-0.46653106808662415,
0.014052494429051876,
0.3627628982067108,
1.1284927129745483,
-0.17111964523792267,
-0.3689991235733032,
0.6230653524398804,
0.11818922311067581,
-0.28338831663131714,
-0.377174973487854,
-0.008642543107271194,
0.07189302891492844,
0.13362576067447662,
0.23323166370391846,
-0.04890561103820801,
-0.08191262930631638,
-0.5237978100776672,
0.15718889236450195,
0.3241625130176544,
-0.38809147477149963,
-0.510742723941803,
0.9770525097846985,
0.0853576734662056,
-0.2860836684703827,
0.8123061060905457,
-0.4817317724227905,
-0.6476997137069702,
0.5594940185546875,
0.7122218608856201,
0.9522725343704224,
-0.12782280147075653,
0.28609007596969604,
0.644853949546814,
0.4218883812427521,
-0.22604435682296753,
0.19301442801952362,
0.20214596390724182,
-0.7106346487998962,
-0.5089858770370483,
-0.6269577741622925,
0.04147981479763985,
0.46577417850494385,
-0.37330713868141174,
0.29646095633506775,
-0.3641170859336853,
-0.3079545199871063,
-0.07672257721424103,
0.06536015123128891,
-0.7903711199760437,
0.24284283816814423,
0.08035686612129211,
0.7506875395774841,
-0.9109331369400024,
0.9211437702178955,
0.6786550283432007,
-0.6857967376708984,
-0.9349444508552551,
0.1256517916917801,
-0.023530030623078346,
-0.8510869145393372,
0.638496994972229,
0.3101532459259033,
0.4045220613479614,
0.05338480696082115,
-0.515187680721283,
-0.8637335300445557,
1.1447993516921997,
0.2939298450946808,
-0.3679497241973877,
-0.26547688245773315,
0.34589192271232605,
0.6052057147026062,
-0.1192711666226387,
0.7019214630126953,
0.5606275200843811,
0.48654189705848694,
-0.17398542165756226,
-1.0288004875183105,
0.2828434109687805,
-0.330595999956131,
0.18590962886810303,
0.22017857432365417,
-0.7651781439781189,
1.168226957321167,
-0.2407006323337555,
-0.15598566830158234,
0.08569645136594772,
0.554932177066803,
0.10836654901504517,
0.037449803203344345,
0.40727147459983826,
0.6568664312362671,
0.6801350116729736,
-0.30083832144737244,
1.2677667140960693,
-0.3242199122905731,
0.6897895932197571,
1.0587784051895142,
0.07107196003198624,
0.6327437162399292,
0.3020801842212677,
-0.2821864187717438,
0.47166940569877625,
0.6130788326263428,
-0.14451247453689575,
0.549699068069458,
0.06542303413152695,
0.03683043643832207,
0.02561301179230213,
-0.021907448768615723,
-0.4058336019515991,
0.38616159558296204,
0.09967286139726639,
-0.5244377255439758,
-0.1164919063448906,
-0.03231293708086014,
0.42983561754226685,
-0.28713634610176086,
-0.053241197019815445,
0.7209063768386841,
0.09167949855327606,
-0.8592329621315002,
0.665591299533844,
0.308433473110199,
0.7687047719955444,
-0.5763344168663025,
0.16395990550518036,
-0.13765716552734375,
0.2581499218940735,
-0.1440669745206833,
-0.7767511010169983,
0.17480646073818207,
0.11221832036972046,
-0.32738468050956726,
-0.1583128124475479,
0.7199274897575378,
-0.5567746758460999,
-0.4291622042655945,
0.26953238248825073,
0.43337658047676086,
0.3087013363838196,
-0.22416310012340546,
-0.736251711845398,
-0.18669874966144562,
0.18247993290424347,
-0.454316109418869,
0.3840043246746063,
0.27320051193237305,
-0.07310883700847626,
0.37324824929237366,
0.647759199142456,
0.08566077053546906,
0.025547156110405922,
0.10052385926246643,
0.7372643947601318,
-0.5779276490211487,
-0.4512730836868286,
-0.9202864766120911,
0.5858457684516907,
-0.09076029062271118,
-0.5608783960342407,
0.657627284526825,
0.6443025469779968,
1.0220173597335815,
-0.14791184663772583,
1.013791561126709,
-0.3654012084007263,
0.44123849272727966,
-0.409017950296402,
0.7577755451202393,
-0.5141586065292358,
-0.07105136662721634,
-0.24143409729003906,
-1.0052385330200195,
-0.05019117519259453,
0.6307552456855774,
-0.3299505114555359,
0.35164305567741394,
0.739716649055481,
0.867887556552887,
-0.11490798741579056,
-0.09091459959745407,
0.01570977456867695,
0.37901660799980164,
0.32746621966362,
0.6663816571235657,
0.4530796408653259,
-0.6471797823905945,
0.6932317018508911,
-0.3478170335292816,
-0.3195335865020752,
-0.06411959230899811,
-0.5901902914047241,
-1.0248439311981201,
-0.6521384119987488,
-0.1938524693250656,
-0.5778084397315979,
0.02656811662018299,
0.8428603410720825,
0.6702060103416443,
-0.8657920360565186,
-0.22634442150592804,
-0.23799413442611694,
-0.09036734700202942,
-0.15369386970996857,
-0.28477418422698975,
0.48875635862350464,
-0.23629847168922424,
-0.7635785937309265,
-0.015300487168133259,
-0.1274365931749344,
0.19849665462970734,
-0.25950995087623596,
-0.17400768399238586,
-0.21592330932617188,
-0.18638065457344055,
0.535668134689331,
0.20458093285560608,
-0.6781095266342163,
-0.3611910343170166,
-0.037340108305215836,
-0.20583447813987732,
-0.059516869485378265,
0.6663970351219177,
-0.5570838451385498,
0.2479744404554367,
0.5099780559539795,
0.4193369448184967,
0.5908821225166321,
-0.11142013221979141,
0.4345402121543884,
-0.727213442325592,
0.21286959946155548,
-0.00995674729347229,
0.3287694752216339,
0.3667744994163513,
-0.40244582295417786,
0.6035663485527039,
0.48052337765693665,
-0.6191853880882263,
-0.6935640573501587,
0.17578984797000885,
-0.746711015701294,
-0.27755314111709595,
1.4319003820419312,
-0.11748728156089783,
-0.166074737906456,
-0.029791751876473427,
-0.18566353619098663,
0.6330386400222778,
-0.3753071129322052,
0.6690351366996765,
0.6748413443565369,
0.16290877759456635,
-0.08876632153987885,
-0.6962429881095886,
0.5650122761726379,
0.30741024017333984,
-0.6768838763237,
0.049271680414676666,
0.21262142062187195,
0.5891735553741455,
0.22511906921863556,
0.7403094172477722,
-0.1352323591709137,
0.04189683496952057,
0.043528202921152115,
0.2142183482646942,
-0.10028295964002609,
-0.16453728079795837,
-0.15367716550827026,
0.048850346356630325,
-0.07545870542526245,
-0.13926011323928833
] |
timm/mobilenetv3_large_100.ra_in1k | timm | "2023-04-27T22:49:21" | 15,846,252 | 17 | timm | [
"timm",
"pytorch",
"safetensors",
"image-classification",
"dataset:imagenet-1k",
"arxiv:2110.00476",
"arxiv:1905.02244",
"license:apache-2.0",
"has_space",
"region:us"
] | image-classification | "2022-12-16T05:38:07" | ---
tags:
- image-classification
- timm
library_name: timm
license: apache-2.0
datasets:
- imagenet-1k
---
# Model card for mobilenetv3_large_100.ra_in1k
A MobileNet-v3 image classification model. Trained on ImageNet-1k in `timm` using the recipe template described below.
Recipe details:
* RandAugment `RA` recipe. Inspired by and evolved from EfficientNet RandAugment recipes. Published as `B` recipe in [ResNet Strikes Back](https://arxiv.org/abs/2110.00476).
* RMSProp (TF 1.0 behaviour) optimizer, EMA weight averaging
* Step (exponential decay w/ staircase) LR schedule with warmup
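The warmup-plus-staircase schedule named above can be sketched as a small function. All hyperparameter values here are hypothetical illustrations, not the ones used for this checkpoint:

```python
def lr_at(step, base_lr=0.064, warmup_steps=5, decay_every=10, gamma=0.97):
    """Linear warmup followed by staircase exponential decay (hypothetical values)."""
    if step < warmup_steps:
        # linear warmup up to base_lr
        return base_lr * (step + 1) / warmup_steps
    # staircase: LR is held constant within each decay interval,
    # then multiplied by gamma at each interval boundary
    return base_lr * gamma ** ((step - warmup_steps) // decay_every)
```

The "staircase" property means the LR is piecewise constant rather than decaying smoothly at every step.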
## Model Details
- **Model Type:** Image classification / feature backbone
- **Model Stats:**
- Params (M): 5.5
- GMACs: 0.2
- Activations (M): 4.4
- Image size: 224 x 224
- **Papers:**
- Searching for MobileNetV3: https://arxiv.org/abs/1905.02244
- ResNet strikes back: An improved training procedure in timm: https://arxiv.org/abs/2110.00476
- **Dataset:** ImageNet-1k
- **Original:** https://github.com/huggingface/pytorch-image-models
## Model Usage
### Image Classification
```python
from urllib.request import urlopen
from PIL import Image
import timm
import torch
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model('mobilenetv3_large_100.ra_in1k', pretrained=True)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5)
```
### Feature Map Extraction
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'mobilenetv3_large_100.ra_in1k',
pretrained=True,
features_only=True,
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
for o in output:
# print shape of each feature map in output
# e.g.:
# torch.Size([1, 16, 112, 112])
# torch.Size([1, 24, 56, 56])
# torch.Size([1, 40, 28, 28])
# torch.Size([1, 112, 14, 14])
# torch.Size([1, 960, 7, 7])
print(o.shape)
```
### Image Embeddings
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'mobilenetv3_large_100.ra_in1k',
pretrained=True,
num_classes=0, # remove classifier nn.Linear
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # output is (batch_size, num_features) shaped tensor
# or equivalently (without needing to set num_classes=0)
output = model.forward_features(transforms(img).unsqueeze(0))
# output is unpooled, a (1, 960, 7, 7) shaped tensor
output = model.forward_head(output, pre_logits=True)
# output is a (1, num_features) shaped tensor
```
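The pooled embeddings above are typically compared with cosine similarity, e.g. for image retrieval or deduplication. A dependency-free sketch (the vectors here are toy stand-ins for real model outputs):

```python
import math

def cosine_similarity(u, v):
    # dot(u, v) / (|u| * |v|); assumes non-zero vectors
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# identical directions score 1.0; orthogonal directions score 0.0
same = cosine_similarity([1.0, 2.0, 3.0], [2.0, 4.0, 6.0])
orth = cosine_similarity([1.0, 0.0], [0.0, 1.0])
```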
## Model Comparison
Explore the dataset and runtime metrics of this model in timm [model results](https://github.com/huggingface/pytorch-image-models/tree/main/results).
## Citation
```bibtex
@inproceedings{howard2019searching,
title={Searching for mobilenetv3},
author={Howard, Andrew and Sandler, Mark and Chu, Grace and Chen, Liang-Chieh and Chen, Bo and Tan, Mingxing and Wang, Weijun and Zhu, Yukun and Pang, Ruoming and Vasudevan, Vijay and others},
booktitle={Proceedings of the IEEE/CVF international conference on computer vision},
pages={1314--1324},
year={2019}
}
```
```bibtex
@misc{rw2019timm,
author = {Ross Wightman},
title = {PyTorch Image Models},
year = {2019},
publisher = {GitHub},
journal = {GitHub repository},
doi = {10.5281/zenodo.4414861},
howpublished = {\url{https://github.com/huggingface/pytorch-image-models}}
}
```
```bibtex
@inproceedings{wightman2021resnet,
title={ResNet strikes back: An improved training procedure in timm},
author={Wightman, Ross and Touvron, Hugo and Jegou, Herve},
booktitle={NeurIPS 2021 Workshop on ImageNet: Past, Present, and Future}
}
```
| [
-0.42348533868789673,
-0.28704380989074707,
-0.060248397290706635,
0.08704610168933868,
-0.3158630132675171,
-0.41070929169654846,
-0.07284087687730789,
-0.35634562373161316,
0.38355836272239685,
0.4785507321357727,
-0.37447744607925415,
-0.741416871547699,
-0.5964348316192627,
-0.09770839661359787,
-0.1315591037273407,
0.8204724788665771,
-0.11619140952825546,
-0.07914324104785919,
-0.11129607260227203,
-0.6696323156356812,
-0.23828454315662384,
-0.3398592472076416,
-0.9126936793327332,
-0.482953280210495,
0.4982103705406189,
0.3534066677093506,
0.5794948935508728,
0.686033308506012,
0.6429165601730347,
0.43396225571632385,
-0.037324871867895126,
0.1376400589942932,
-0.09297335147857666,
-0.19031496345996857,
0.41923603415489197,
-0.7002341151237488,
-0.44241973757743835,
0.3282971680164337,
0.6296827793121338,
0.23484256863594055,
0.007683929055929184,
0.4462060332298279,
0.044905394315719604,
0.735454261302948,
-0.36965706944465637,
-0.023695332929491997,
-0.48746877908706665,
0.20347414910793304,
-0.15914268791675568,
-0.037138164043426514,
-0.2747045159339905,
-0.457736074924469,
0.11535139381885529,
-0.4381166994571686,
0.39791151881217957,
0.010497053153812885,
1.3639661073684692,
0.16708098351955414,
-0.1713085174560547,
0.024953775107860565,
-0.26268458366394043,
0.7973536849021912,
-0.7441643476486206,
0.21717171370983124,
0.41337257623672485,
0.24148185551166534,
-0.08083496987819672,
-1.0586925745010376,
-0.6307123303413391,
-0.18606649339199066,
0.0002544317103456706,
-0.021648496389389038,
-0.2164537012577057,
-0.10148067027330399,
0.2597903609275818,
0.25181880593299866,
-0.44010865688323975,
0.10682906955480576,
-0.5297539830207825,
-0.21305087208747864,
0.6085715293884277,
0.05116874352097511,
0.40953323245048523,
-0.32476845383644104,
-0.4280506372451782,
-0.38584446907043457,
-0.48898711800575256,
0.3252717852592468,
0.24587737023830414,
0.2129523903131485,
-0.6616863012313843,
0.5074856281280518,
0.05605536326766014,
0.6982414722442627,
-0.02660249173641205,
-0.43251219391822815,
0.7313666939735413,
-0.06633323431015015,
-0.4136558473110199,
-0.08313822746276855,
1.175633430480957,
0.5658135414123535,
0.1634339839220047,
0.19523297250270844,
-0.0962085947394371,
-0.4745062589645386,
-0.020654525607824326,
-1.1938143968582153,
-0.2604677677154541,
0.3521154522895813,
-0.879441499710083,
-0.48203209042549133,
0.21844063699245453,
-0.5124501585960388,
-0.2268432229757309,
0.05903937667608261,
0.4552991986274719,
-0.42989566922187805,
-0.42538610100746155,
0.05430371314287186,
-0.13325314223766327,
0.37115004658699036,
0.14203405380249023,
-0.5906597971916199,
0.1351097971200943,
0.22229062020778656,
1.3122295141220093,
0.1401495337486267,
-0.47035470604896545,
-0.26221540570259094,
-0.38237306475639343,
-0.22011491656303406,
0.4125920236110687,
-0.10161234438419342,
-0.16200295090675354,
-0.3378206193447113,
0.3388643264770508,
-0.2706914246082306,
-0.8196733593940735,
0.3650502860546112,
-0.3122868835926056,
0.19094251096248627,
-0.004904091823846102,
-0.029084281995892525,
-0.6309621930122375,
0.3008995056152344,
-0.5346730947494507,
1.402312159538269,
0.27842092514038086,
-0.9229663610458374,
0.2863938510417938,
-0.578763484954834,
-0.19145788252353668,
-0.3849440813064575,
0.005219656508415937,
-1.0957401990890503,
-0.16754208505153656,
0.2589605450630188,
0.8700389862060547,
-0.3934794068336487,
-0.07281536608934402,
-0.6129143238067627,
-0.3251670002937317,
0.31497129797935486,
0.13740669190883636,
1.0734502077102661,
0.25100648403167725,
-0.5429611206054688,
0.21784843504428864,
-0.6422085762023926,
0.19757963716983795,
0.4923722445964813,
-0.24261505901813507,
-0.1239829808473587,
-0.459246963262558,
0.10754761844873428,
0.38549572229385376,
0.0005027271108701825,
-0.564276933670044,
0.21448586881160736,
-0.1703403890132904,
0.5713128447532654,
0.43105417490005493,
-0.13112425804138184,
0.39608290791511536,
-0.46400678157806396,
0.2913931608200073,
0.2858252227306366,
0.2854510247707367,
-0.11374001204967499,
-0.5876470804214478,
-0.8015453219413757,
-0.45345422625541687,
0.39280977845191956,
0.47878149151802063,
-0.5906392335891724,
0.34106630086898804,
-0.20854663848876953,
-0.8528625965118408,
-0.4573950469493866,
0.08097974210977554,
0.4567543864250183,
0.5435879826545715,
0.30953821539878845,
-0.5093849301338196,
-0.6313554644584656,
-0.9302613735198975,
-0.05380910262465477,
0.045439112931489944,
0.011821726337075233,
0.463560551404953,
0.7264660000801086,
-0.1754179298877716,
0.6988248229026794,
-0.29281044006347656,
-0.29222217202186584,
-0.25267601013183594,
0.08841254562139511,
0.4093695282936096,
0.832099437713623,
0.8242072463035583,
-0.8466474413871765,
-0.4737040400505066,
-0.023675940930843353,
-0.9794518351554871,
0.19910240173339844,
-0.11242476850748062,
-0.07590793818235397,
0.2725590467453003,
0.27029144763946533,
-0.6408718228340149,
0.6537405252456665,
0.2488146871328354,
-0.16347216069698334,
0.3784598410129547,
-0.0691852793097496,
0.2896188795566559,
-1.2992026805877686,
0.13201113045215607,
0.48081740736961365,
-0.14780211448669434,
-0.38070738315582275,
0.07961510866880417,
0.09835832566022873,
-0.1468922197818756,
-0.602266252040863,
0.6969393491744995,
-0.5478137135505676,
-0.242793008685112,
-0.20578700304031372,
-0.12740178406238556,
-0.017239371314644814,
0.6025288105010986,
-0.15862436592578888,
0.47201821208000183,
0.7386423945426941,
-0.5644050240516663,
0.5459485650062561,
0.2805376648902893,
-0.18191249668598175,
0.3116166591644287,
-0.72276371717453,
0.1866394430398941,
0.03355417028069496,
0.38326185941696167,
-0.8627443313598633,
-0.29552558064460754,
0.4104108214378357,
-0.6978734731674194,
0.43533894419670105,
-0.658306896686554,
-0.4408513009548187,
-0.652315616607666,
-0.5554404258728027,
0.4280366897583008,
0.589364767074585,
-0.7847724556922913,
0.6407856345176697,
0.29438483715057373,
0.3754956126213074,
-0.6312875747680664,
-0.8393887877464294,
-0.19125379621982574,
-0.49849921464920044,
-0.7980324029922485,
0.456439346075058,
0.3226906955242157,
0.08638279885053635,
0.03431084752082825,
-0.1635177582502365,
-0.17195627093315125,
-0.11082285642623901,
0.7221019864082336,
0.342162162065506,
-0.27190929651260376,
-0.21018242835998535,
-0.4845534563064575,
-0.03554254025220871,
-0.015787435695528984,
-0.4046647250652313,
0.6128990054130554,
-0.34907224774360657,
-0.05122484639286995,
-0.946409285068512,
-0.21927528083324432,
0.5775348544120789,
-0.15181437134742737,
0.8220127820968628,
1.186039686203003,
-0.520042359828949,
0.1437680423259735,
-0.5133559107780457,
-0.187224879860878,
-0.5027512311935425,
0.3794330060482025,
-0.4527505338191986,
-0.4843794107437134,
0.9595658779144287,
0.033577002584934235,
-0.0014107167953625321,
0.6493582725524902,
0.3387535810470581,
-0.07680614292621613,
0.8017215132713318,
0.5736362338066101,
0.1662721186876297,
0.7273172736167908,
-0.9061659574508667,
-0.22195936739444733,
-0.952970027923584,
-0.6513695120811462,
-0.4635567367076874,
-0.5293684005737305,
-0.7712084650993347,
-0.4209119975566864,
0.40926337242126465,
0.1751173436641693,
-0.45303651690483093,
0.5451992750167847,
-0.7724483609199524,
0.10077983886003494,
0.7326833605766296,
0.6784394979476929,
-0.41388118267059326,
0.3678516745567322,
-0.35816165804862976,
-0.0011080054100602865,
-0.8235939145088196,
-0.26887571811676025,
1.1865462064743042,
0.4701661467552185,
0.5517743825912476,
-0.08954568952322006,
0.8287662267684937,
-0.31057116389274597,
0.34672003984451294,
-0.6627294421195984,
0.6230694651603699,
-0.0766533613204956,
-0.45444387197494507,
-0.03587116673588753,
-0.5132749080657959,
-1.1091132164001465,
0.1544867604970932,
-0.30484429001808167,
-0.8158209323883057,
0.18959400057792664,
0.1986398845911026,
-0.2837434411048889,
0.7722648978233337,
-0.8674749135971069,
0.916687548160553,
-0.05335213243961334,
-0.5436697602272034,
0.1319233924150467,
-0.7414104342460632,
0.39805829524993896,
0.20876777172088623,
-0.1775331199169159,
-0.11060954630374908,
0.13390310108661652,
1.0940808057785034,
-0.6632060408592224,
0.819428563117981,
-0.49431827664375305,
0.41609805822372437,
0.5961350798606873,
-0.1273093968629837,
0.40298959612846375,
-0.05176255851984024,
-0.24658703804016113,
0.27626219391822815,
-0.0171051025390625,
-0.4780831038951874,
-0.5578088760375977,
0.6930992603302002,
-0.9334447979927063,
-0.26092827320098877,
-0.3649969696998596,
-0.3462616205215454,
0.205439493060112,
0.2003602534532547,
0.5632120966911316,
0.7205398082733154,
0.3669765591621399,
0.3323966860771179,
0.5685994625091553,
-0.5302707552909851,
0.5088195204734802,
-0.09600286185741425,
-0.2660495638847351,
-0.5624807476997375,
0.9515230655670166,
0.12932747602462769,
0.02000402845442295,
0.13930918276309967,
0.17294904589653015,
-0.3384872376918793,
-0.6400954723358154,
-0.316924512386322,
0.27723509073257446,
-0.6187702417373657,
-0.5126295685768127,
-0.6424781084060669,
-0.41436922550201416,
-0.31728702783584595,
-0.04697370529174805,
-0.5686346888542175,
-0.33187776803970337,
-0.42672842741012573,
0.3242977261543274,
0.7336624264717102,
0.495635449886322,
-0.16023774445056915,
0.6234549880027771,
-0.675430417060852,
0.19333548843860626,
0.07215636968612671,
0.4662696123123169,
-0.0707777664065361,
-0.8228791356086731,
-0.27240538597106934,
-0.017414478585124016,
-0.45394137501716614,
-0.6513543128967285,
0.5143735408782959,
0.0783066675066948,
0.39860618114471436,
0.24455475807189941,
-0.2718374729156494,
0.7549176216125488,
-0.015025493688881397,
0.6250133514404297,
0.5717119574546814,
-0.5451094508171082,
0.6303005814552307,
-0.1981404423713684,
0.25684574246406555,
0.14384111762046814,
0.40357568860054016,
-0.24515387415885925,
0.13147424161434174,
-0.9019796848297119,
-0.8111575245857239,
0.8364982008934021,
0.13343656063079834,
-0.01672898605465889,
0.34152787923812866,
0.7742083072662354,
-0.13387146592140198,
-0.07237204164266586,
-0.859839677810669,
-0.45691484212875366,
-0.4137187898159027,
-0.2503059208393097,
0.1909787356853485,
-0.11073070764541626,
-0.031447045505046844,
-0.7211777567863464,
0.6884593963623047,
0.003837296972051263,
0.7919813394546509,
0.3950577676296234,
0.052759792655706406,
0.05595900118350983,
-0.47693273425102234,
0.5995327234268188,
0.26095205545425415,
-0.39490386843681335,
0.07907191663980484,
0.14345687627792358,
-0.732158899307251,
0.16230709850788116,
0.11864076554775238,
0.03070748969912529,
0.03198980912566185,
0.3613387644290924,
0.895924985408783,
-0.05238579958677292,
0.09379538893699646,
0.45496776700019836,
-0.13138389587402344,
-0.5558854341506958,
-0.3073793351650238,
0.14006726443767548,
-0.039888545870780945,
0.4232349097728729,
0.4395013749599457,
0.4674142003059387,
-0.1227404996752739,
-0.23640455305576324,
0.2831578850746155,
0.4135206937789917,
-0.2954753339290619,
-0.25651001930236816,
0.7236454486846924,
-0.09993383288383484,
-0.2277735322713852,
0.7834814786911011,
-0.13311268389225006,
-0.4863165318965912,
1.0759793519973755,
0.4622451066970825,
0.9269233345985413,
-0.08244089782238007,
0.06493154168128967,
0.8917442560195923,
0.31376805901527405,
-0.10660708695650101,
0.2619699239730835,
0.2143675684928894,
-0.7820839881896973,
0.01692471094429493,
-0.4519488215446472,
0.10996030271053314,
0.4556168019771576,
-0.5754011869430542,
0.3593224585056305,
-0.6879422664642334,
-0.5173597931861877,
0.2475656121969223,
0.2960878610610962,
-0.8942639827728271,
0.2995672821998596,
-0.15976344048976898,
0.9494009017944336,
-0.5839245319366455,
0.8114690780639648,
0.9143181443214417,
-0.5009404420852661,
-1.1185073852539062,
0.013654801063239574,
0.09442786127328873,
-0.9374832510948181,
0.7423568964004517,
0.5140979290008545,
0.011262595653533936,
0.10740146040916443,
-0.8437169790267944,
-0.7055767774581909,
1.4188138246536255,
0.42389801144599915,
-0.08796501904726028,
0.3425700068473816,
-0.16670162975788116,
0.0653371810913086,
-0.4828898310661316,
0.5531744360923767,
0.1472480446100235,
0.28993067145347595,
0.31202030181884766,
-0.7744052410125732,
0.2530193328857422,
-0.39293015003204346,
0.18893401324748993,
0.21682193875312805,
-0.8722915053367615,
0.7795101404190063,
-0.5831138491630554,
-0.1359471082687378,
0.012401042506098747,
0.6125088334083557,
0.25680088996887207,
0.3005574345588684,
0.4628185033798218,
0.7548122406005859,
0.5080176591873169,
-0.24219128489494324,
0.9473742246627808,
0.016934433951973915,
0.4970037639141083,
0.6450777053833008,
0.24751773476600647,
0.6296773552894592,
0.36367669701576233,
-0.1845729947090149,
0.41382548213005066,
1.2270251512527466,
-0.2813389301300049,
0.30661535263061523,
0.1955552101135254,
-0.07356643676757812,
-0.005383160896599293,
0.09786003082990646,
-0.47442013025283813,
0.6504635810852051,
0.1167997494339943,
-0.6259492635726929,
-0.14229576289653778,
0.09837325662374496,
0.05372624471783638,
-0.3483488857746124,
-0.2681216895580292,
0.36482152342796326,
0.0637262761592865,
-0.36496496200561523,
1.0872145891189575,
0.26650404930114746,
0.8891480565071106,
-0.2689782381057739,
0.06064436212182045,
-0.31325051188468933,
0.103676937520504,
-0.4843413531780243,
-0.7068573236465454,
0.28762343525886536,
-0.2964957356452942,
-0.034857332706451416,
0.1028909906744957,
0.741881787776947,
-0.11524251848459244,
-0.3132686913013458,
0.08478605002164841,
0.20882079005241394,
0.5144577622413635,
0.038547907024621964,
-1.236317753791809,
0.29479530453681946,
0.14158570766448975,
-0.5930477380752563,
0.32462242245674133,
0.3153114914894104,
0.08030076324939728,
0.9015547037124634,
0.6318965554237366,
-0.2159574329853058,
0.13691046833992004,
-0.24228200316429138,
0.84771728515625,
-0.6403581500053406,
-0.21756501495838165,
-0.8629745244979858,
0.6076021790504456,
-0.2022780478000641,
-0.616715133190155,
0.58740234375,
0.695358395576477,
0.8193150162696838,
0.04244224354624748,
0.5328148007392883,
-0.3225293457508087,
-0.030999181792140007,
-0.5170333385467529,
0.6721293926239014,
-0.8111162185668945,
0.09081375598907471,
-0.08174844831228256,
-0.676059901714325,
-0.44690439105033875,
0.840299129486084,
-0.2758261561393738,
0.4403124153614044,
0.544383704662323,
1.0911879539489746,
-0.450914204120636,
-0.2624810039997101,
0.06788709759712219,
-0.0032723688054829836,
-0.038936909288167953,
0.34037256240844727,
0.43732962012290955,
-0.9268322587013245,
0.4039561152458191,
-0.5546688437461853,
-0.18340513110160828,
-0.2649306058883667,
-0.7723538875579834,
-1.015716552734375,
-0.9115860462188721,
-0.553787350654602,
-0.8974683880805969,
-0.17468874156475067,
0.9823368787765503,
1.1638892889022827,
-0.5544816851615906,
-0.15976957976818085,
-0.0025378225836902857,
0.23647920787334442,
-0.18541373312473297,
-0.22402295470237732,
0.5842830538749695,
-0.03952034190297127,
-0.6286787986755371,
-0.21890637278556824,
-0.007982644252479076,
0.42943283915519714,
0.18137148022651672,
-0.238022118806839,
-0.17250889539718628,
-0.3679819703102112,
0.30880206823349,
0.4988957345485687,
-0.6253921389579773,
-0.08677980303764343,
-0.21018248796463013,
-0.22719134390354156,
0.4020480811595917,
0.563814103603363,
-0.5050827264785767,
0.2662062644958496,
0.2401396632194519,
0.36281049251556396,
0.9180933237075806,
-0.34163936972618103,
0.11988992244005203,
-0.8492425680160522,
0.6467593908309937,
-0.15102282166481018,
0.39679187536239624,
0.41327059268951416,
-0.2859653830528259,
0.6552876830101013,
0.4252198040485382,
-0.4167431890964508,
-0.9459969401359558,
-0.0806816890835762,
-1.0966153144836426,
-0.050301868468523026,
1.0355197191238403,
-0.28048214316368103,
-0.528342604637146,
0.3525335490703583,
-0.012822181917726994,
0.6232350468635559,
-0.11751161515712738,
0.4551444351673126,
0.17568954825401306,
-0.16459998488426208,
-0.7555505633354187,
-0.7522554397583008,
0.479911208152771,
0.1338600218296051,
-0.6148243546485901,
-0.5369886159896851,
-0.046340666711330414,
0.6901860237121582,
0.21824821829795837,
0.6391547322273254,
-0.2230127602815628,
0.16810233891010284,
0.07777722924947739,
0.5638473629951477,
-0.4413525462150574,
-0.03758186101913452,
-0.2584663927555084,
-0.023583868518471718,
-0.12658534944057465,
-0.7305856943130493
] |
distilgpt2 | null | "2023-04-29T12:24:21" | 14,735,848 | 275 | transformers | [
"transformers",
"pytorch",
"tf",
"jax",
"tflite",
"rust",
"coreml",
"safetensors",
"gpt2",
"text-generation",
"exbert",
"en",
"dataset:openwebtext",
"arxiv:1910.01108",
"arxiv:2201.08542",
"arxiv:2203.12574",
"arxiv:1910.09700",
"arxiv:1503.02531",
"license:apache-2.0",
"model-index",
"co2_eq_emissions",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | "2022-03-02T23:29:04" | ---
language: en
tags:
- exbert
license: apache-2.0
datasets:
- openwebtext
model-index:
- name: distilgpt2
results:
- task:
type: text-generation
name: Text Generation
dataset:
type: wikitext
name: WikiText-103
metrics:
- type: perplexity
name: Perplexity
value: 21.1
co2_eq_emissions: 149200
---
# DistilGPT2
DistilGPT2 (short for Distilled-GPT2) is an English-language model pre-trained with the supervision of the smallest version of Generative Pre-trained Transformer 2 (GPT-2). Like GPT-2, DistilGPT2 can be used to generate text. Users of this model card should also consider information about the design, training, and limitations of [GPT-2](https://huggingface.co/gpt2).
## Model Details
- **Developed by:** Hugging Face
- **Model type:** Transformer-based Language Model
- **Language:** English
- **License:** Apache 2.0
- **Model Description:** DistilGPT2 is an English-language model pre-trained with the supervision of the 124 million parameter version of GPT-2. DistilGPT2, which has 82 million parameters, was developed using [knowledge distillation](#knowledge-distillation) and was designed to be a faster, lighter version of GPT-2.
- **Resources for more information:** See [this repository](https://github.com/huggingface/transformers/tree/main/examples/research_projects/distillation) for more about Distil\* (a class of compressed models including Distilled-GPT2), [Sanh et al. (2019)](https://arxiv.org/abs/1910.01108) for more information about knowledge distillation and the training procedure, and this page for more about [GPT-2](https://openai.com/blog/better-language-models/).
## Uses, Limitations and Risks
#### Limitations and Risks
<details>
<summary>Click to expand</summary>
**CONTENT WARNING: Readers should be aware this section contains content that is disturbing, offensive, and can propagate historical and current stereotypes.**
As the developers of GPT-2 (OpenAI) note in their [model card](https://github.com/openai/gpt-2/blob/master/model_card.md), “language models like GPT-2 reflect the biases inherent to the systems they were trained on.” Significant research has explored bias and fairness issues with models for language generation including GPT-2 (see, e.g., [Sheng et al. (2021)](https://aclanthology.org/2021.acl-long.330.pdf) and [Bender et al. (2021)](https://dl.acm.org/doi/pdf/10.1145/3442188.3445922)).
DistilGPT2 also suffers from persistent bias issues, as highlighted in the demonstrative examples below. Note that these examples are not a comprehensive stress test of the model. Readers considering using the model should conduct more rigorous evaluations depending on their use case and context.
The impact of model compression techniques – such as knowledge distillation – on bias and fairness issues associated with language models is an active area of research. For example:
- [Silva, Tambwekar and Gombolay (2021)](https://aclanthology.org/2021.naacl-main.189.pdf) find that distilled versions of BERT and RoBERTa consistently exhibit statistically significant bias (with regard to gender and race) with effect sizes larger than the teacher models.
- [Xu and Hu (2022)](https://arxiv.org/pdf/2201.08542.pdf) find that distilled versions of GPT-2 showed consistent reductions in toxicity and bias compared to the teacher model (see the paper for more detail on metrics used to define/measure toxicity and bias).
- [Gupta et al. (2022)](https://arxiv.org/pdf/2203.12574.pdf) find that DistilGPT2 exhibits greater gender disparities than GPT-2 and propose a technique for mitigating gender bias in distilled language models like DistilGPT2.
```python
>>> from transformers import pipeline, set_seed
>>> generator = pipeline('text-generation', model='distilgpt2')
>>> set_seed(48)
>>> generator("The White man worked as a", max_length=20, num_return_sequences=3)
[{'generated_text': "The White man worked as a salesman at a McDonald's restaurant called Kia at the time of the"},
{'generated_text': 'The White man worked as a contractor in the Army in the late 1990s. He became a "'},
{'generated_text': 'The White man worked as a police spokesman to the US Navy in the 1930s.'}]
>>> set_seed(48)
>>> generator("The Black man worked as a", max_length=20, num_return_sequences=3)
[{'generated_text': 'The Black man worked as a shop assistant for an hour at Wal-Mart at Wal-Mart in'},
{'generated_text': 'The Black man worked as a waiter in the hotel when he was assaulted when he got out of a'},
{'generated_text': 'The Black man worked as a police spokesman four months ago...'}]
```
</details>
#### Potential Uses
Since DistilGPT2 is a distilled version of GPT-2, it is intended for similar use cases, with the added benefit of being smaller and easier to run than the base model.
The developers of GPT-2 state in their [model card](https://github.com/openai/gpt-2/blob/master/model_card.md) that they envisioned GPT-2 would be used by researchers to better understand large-scale generative language models, with possible secondary use cases including:
> - *Writing assistance: Grammar assistance, autocompletion (for normal prose or code)*
> - *Creative writing and art: exploring the generation of creative, fictional texts; aiding creation of poetry and other literary art.*
> - *Entertainment: Creation of games, chat bots, and amusing generations.*
Using DistilGPT2, the Hugging Face team built the [Write With Transformers](https://transformer.huggingface.co/doc/distil-gpt2) web app, which allows users to play with the model to generate text directly from their browser.
#### Out-of-scope Uses
OpenAI states in the GPT-2 [model card](https://github.com/openai/gpt-2/blob/master/model_card.md):
> Because large-scale language models like GPT-2 do not distinguish fact from fiction, we don’t support use-cases that require the generated text to be true.
>
> Additionally, language models like GPT-2 reflect the biases inherent to the systems they were trained on, so we do not recommend that they be deployed into systems that interact with humans unless the deployers first carry out a study of biases relevant to the intended use-case.
### How to Get Started with the Model
<details>
<summary>Click to expand</summary>
*Be sure to read the sections on in-scope and out-of-scope uses and limitations of the model for further information on how to use the model.*
Using DistilGPT2 is similar to using GPT-2. DistilGPT2 can be used directly with a pipeline for text generation. Since the generation relies on some randomness, we set a seed for reproducibility:
```python
>>> from transformers import pipeline, set_seed
>>> generator = pipeline('text-generation', model='distilgpt2')
>>> set_seed(42)
>>> generator("Hello, I’m a language model", max_length=20, num_return_sequences=5)
Setting `pad_token_id` to `eos_token_id`:50256 for open-end generation.
[{'generated_text': "Hello, I'm a language model, I'm a language model. In my previous post I've"},
{'generated_text': "Hello, I'm a language model, and I'd love to hear what you think about it."},
{'generated_text': "Hello, I'm a language model, but I don't get much of a connection anymore, so"},
{'generated_text': "Hello, I'm a language model, a functional language... It's not an example, and that"},
{'generated_text': "Hello, I'm a language model, not an object model.\n\nIn a nutshell, I"}]
```
Here is how to use this model to get the features of a given text in PyTorch:
```python
from transformers import GPT2Tokenizer, GPT2Model
tokenizer = GPT2Tokenizer.from_pretrained('distilgpt2')
model = GPT2Model.from_pretrained('distilgpt2')
text = "Replace me with any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
And in TensorFlow:
```python
from transformers import GPT2Tokenizer, TFGPT2Model
tokenizer = GPT2Tokenizer.from_pretrained('distilgpt2')
model = TFGPT2Model.from_pretrained('distilgpt2')
text = "Replace me with any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
</details>
## Training Data
DistilGPT2 was trained using [OpenWebTextCorpus](https://skylion007.github.io/OpenWebTextCorpus/), an open-source reproduction of OpenAI’s WebText dataset, which was used to train GPT-2. See the [OpenWebTextCorpus Dataset Card](https://huggingface.co/datasets/openwebtext) for additional information about OpenWebTextCorpus and [Radford et al. (2019)](https://d4mucfpksywv.cloudfront.net/better-language-models/language-models.pdf) for additional information about WebText.
## Training Procedure
The texts were tokenized using the same tokenizer as GPT-2, a byte-level version of Byte Pair Encoding (BPE). DistilGPT2 was trained using knowledge distillation, following a procedure similar to the training procedure for DistilBERT, described in more detail in [Sanh et al. (2019)](https://arxiv.org/abs/1910.01108).
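The core operation of BPE is iteratively merging the most frequent adjacent symbol pair in the corpus. A toy character-level sketch of one merge step follows (illustrative only; GPT-2's actual tokenizer operates on bytes and uses a fixed, pre-learned merge table):

```python
from collections import Counter

def bpe_merge_step(words):
    """One BPE merge step: fuse the most frequent adjacent symbol pair.
    `words` maps a tuple of symbols to its corpus frequency."""
    pairs = Counter()
    for symbols, freq in words.items():
        for a, b in zip(symbols, symbols[1:]):
            pairs[(a, b)] += freq
    if not pairs:
        return words, None
    best = max(pairs, key=pairs.get)  # most frequent adjacent pair
    merged = {}
    for symbols, freq in words.items():
        out, i = [], 0
        while i < len(symbols):
            if i + 1 < len(symbols) and (symbols[i], symbols[i + 1]) == best:
                out.append(symbols[i] + symbols[i + 1])  # fuse the pair
                i += 2
            else:
                out.append(symbols[i])
                i += 1
        merged[tuple(out)] = freq
    return merged, best

vocab = {("l", "o", "w"): 5, ("l", "o", "w", "e", "r"): 2}
vocab, merge = bpe_merge_step(vocab)  # merges a pair seen 7 times
```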
## Evaluation Results
The creators of DistilGPT2 [report](https://github.com/huggingface/transformers/tree/main/examples/research_projects/distillation) that, on the [WikiText-103](https://blog.einstein.ai/the-wikitext-long-term-dependency-language-modeling-dataset/) benchmark, GPT-2 reaches a perplexity on the test set of 16.3 compared to 21.1 for DistilGPT2 (after fine-tuning on the train set).
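Perplexity is the exponential of the average per-token negative log-likelihood, so lower is better. A minimal sketch of the computation:

```python
import math

def perplexity(token_log_probs):
    # perplexity = exp of the average negative log-likelihood per token
    return math.exp(-sum(token_log_probs) / len(token_log_probs))

# a model assigning every token probability 1/21.1 has perplexity 21.1
print(round(perplexity([math.log(1 / 21.1)] * 100), 1))  # 21.1
```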
## Environmental Impact
*Carbon emissions were estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). The hardware, runtime, cloud provider, and compute region were utilized to estimate the carbon impact.*
- **Hardware Type:** 8 16GB V100
- **Hours used:** 168 (1 week)
- **Cloud Provider:** Azure
- **Compute Region:** unavailable, assumed East US for calculations
- **Carbon Emitted** *(Power consumption x Time x Carbon produced based on location of power grid)*: 149.2 kg eq. CO2
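The reported figure is consistent with a simple power × time × grid-intensity estimate. The per-GPU power draw and grid intensity below are assumptions chosen to illustrate the calculator's formula, not values taken from the calculator itself:

```python
gpus = 8
gpu_power_kw = 0.3          # assumed ~300 W per V100 (illustrative, not measured)
hours = 168                 # 1 week
grid_kg_co2_per_kwh = 0.37  # assumed East US grid intensity (illustrative)

energy_kwh = gpus * gpu_power_kw * hours        # 403.2 kWh
emissions_kg = energy_kwh * grid_kg_co2_per_kwh
print(round(emissions_kg, 1))  # ~149.2 kg CO2eq, matching the reported estimate
```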
## Citation
```bibtex
@inproceedings{sanh2019distilbert,
title={DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter},
author={Sanh, Victor and Debut, Lysandre and Chaumond, Julien and Wolf, Thomas},
booktitle={NeurIPS EMC^2 Workshop},
year={2019}
}
```
## Glossary
- <a name="knowledge-distillation">**Knowledge Distillation**</a>: As described in [Sanh et al. (2019)](https://arxiv.org/pdf/1910.01108.pdf), “knowledge distillation is a compression technique in which a compact model – the student – is trained to reproduce the behavior of a larger model – the teacher – or an ensemble of models.” Also see [Bucila et al. (2006)](https://www.cs.cornell.edu/~caruana/compression.kdd06.pdf) and [Hinton et al. (2015)](https://arxiv.org/abs/1503.02531).
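The soft-target part of the distillation objective can be sketched in a few lines: the student is trained to match the teacher's temperature-softened output distribution. This is a toy, framework-free sketch; the actual Distil* training combines this term with the standard language-modeling loss, per Sanh et al. (2019):

```python
import math

def softmax(logits, temperature=1.0):
    # numerically stable softmax over temperature-scaled logits
    scaled = [z / temperature for z in logits]
    m = max(scaled)
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    # soft-target cross-entropy between temperature-softened distributions,
    # scaled by T^2 so gradients keep a comparable magnitude (Hinton et al., 2015)
    teacher = softmax(teacher_logits, temperature)
    student = softmax(student_logits, temperature)
    return -sum(p * math.log(q) for p, q in zip(teacher, student)) * temperature ** 2
```

A student whose logits match the teacher's incurs the minimum possible loss (the entropy of the softened teacher distribution), so minimizing this term pulls the student toward the teacher's full output distribution rather than only its argmax.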
<a href="https://huggingface.co/exbert/?model=distilgpt2">
<img width="300px" src="https://cdn-media.huggingface.co/exbert/button.png">
</a>
| [
-0.14997319877147675,
-0.7801599502563477,
0.3517022430896759,
0.20232826471328735,
-0.27720993757247925,
-0.2595542073249817,
-0.2875414192676544,
-0.4347073435783386,
-0.4004024565219879,
0.15011632442474365,
-0.325300931930542,
-0.021060721948742867,
-0.9394680857658386,
-0.013787766918540001,
-0.27776509523391724,
1.4022268056869507,
-0.1333015114068985,
0.05073832347989082,
0.17392899096012115,
0.2597130239009857,
-0.475102961063385,
-0.4987308084964752,
-0.6867436766624451,
-0.552096426486969,
0.20563437044620514,
-0.021347956731915474,
0.643539309501648,
0.41404926776885986,
0.25518548488616943,
0.25403714179992676,
-0.31431832909584045,
-0.07985579967498779,
-0.5399245023727417,
-0.021993810310959816,
-0.23764774203300476,
-0.3982561230659485,
-0.32265016436576843,
0.3191944360733032,
0.25785666704177856,
0.33511555194854736,
-0.08362041413784027,
0.24792176485061646,
0.45538488030433655,
0.5381214618682861,
-0.25141751766204834,
0.4756637215614319,
-0.5434263348579407,
-0.03335218504071236,
-0.2530502378940582,
0.2767728567123413,
-0.3942122161388397,
-0.28519466519355774,
0.2017160803079605,
-0.3945971429347992,
0.29676058888435364,
-0.1444423943758011,
0.9476772546768188,
0.32963135838508606,
-0.40798017382621765,
-0.3951435387134552,
-0.8853196501731873,
0.6894330978393555,
-0.7517397999763489,
0.08439178764820099,
0.3927401602268219,
0.2974124550819397,
-0.13229992985725403,
-1.0694612264633179,
-0.7972820997238159,
-0.13287539780139923,
-0.3512874245643616,
0.3011355698108673,
-0.2901066839694977,
-0.028507357463240623,
0.4688999652862549,
0.40967389941215515,
-0.7293018102645874,
-0.037739697843790054,
-0.530029833316803,
-0.06795495003461838,
0.3645039200782776,
-0.1986830085515976,
0.3634093999862671,
-0.16896085441112518,
-0.2901749014854431,
-0.16181528568267822,
-0.5422317385673523,
-0.21755942702293396,
0.568647563457489,
0.14989787340164185,
-0.1727900207042694,
0.5640863180160522,
0.08634120225906372,
0.3794780373573303,
-0.058830972760915756,
-0.1322142779827118,
0.21398453414440155,
-0.3432896137237549,
-0.1479506492614746,
-0.13528454303741455,
0.9538782835006714,
0.4245372712612152,
0.5075126886367798,
0.03858368098735809,
-0.21545429527759552,
0.07630909234285355,
-0.03507634252309799,
-0.9909726977348328,
-0.4909948408603668,
0.11676020175218582,
-0.43236345052719116,
-0.42257288098335266,
0.01769413612782955,
-0.8996926546096802,
-0.1593405306339264,
-0.1407867670059204,
0.2863869071006775,
-0.28539103269577026,
-0.7029407024383545,
-0.1970815807580948,
-0.5244009494781494,
0.07548017054796219,
0.0325440838932991,
-1.2639846801757812,
0.22609449923038483,
0.6543174386024475,
0.9431106448173523,
0.12918485701084137,
-0.1652517318725586,
-0.023661669343709946,
-0.08683646470308304,
-0.003590498585253954,
0.30162546038627625,
-0.1883842647075653,
-0.2577866017818451,
-0.07528146356344223,
-0.05992722138762474,
0.26135051250457764,
-0.4859956204891205,
0.4393008351325989,
-0.27610909938812256,
0.7135880589485168,
-0.28242355585098267,
-0.2662886083126068,
-0.04277747869491577,
-0.08055379241704941,
-0.5231156945228577,
1.3165557384490967,
0.47653961181640625,
-0.9761852025985718,
0.32213684916496277,
-0.7264615297317505,
-0.36023402214050293,
-0.13585127890110016,
-0.0563209243118763,
-0.4791082739830017,
0.0836762934923172,
0.033422816544771194,
0.039189159870147705,
-0.5506054162979126,
0.5194968581199646,
0.01144873071461916,
-0.25368615984916687,
0.12020478397607803,
-0.6429350972175598,
1.0244529247283936,
0.5094640851020813,
-0.7190407514572144,
-0.44916144013404846,
-0.3039834797382355,
-0.03883642703294754,
0.4651508331298828,
-0.3853660225868225,
-0.23045013844966888,
0.08843358606100082,
0.3947925269603729,
0.3873755633831024,
0.10645545274019241,
-0.3724016845226288,
0.20908674597740173,
-0.4030109643936157,
0.8357400894165039,
0.6633899211883545,
-0.3086619973182678,
0.3236278295516968,
-0.1162552759051323,
0.20233945548534393,
0.07934381812810898,
0.15241171419620514,
0.08676250278949738,
-0.7779807448387146,
-0.6793509125709534,
-0.055693112313747406,
0.37477269768714905,
0.770793080329895,
-0.8547250032424927,
0.39063599705696106,
-0.1414669305086136,
-0.44930046796798706,
-0.3667612671852112,
-0.03935248777270317,
0.6401139497756958,
0.5285472869873047,
0.3768947124481201,
-0.11183740943670273,
-0.6038926243782043,
-0.7961452007293701,
-0.278326153755188,
-0.5522162914276123,
-0.19137614965438843,
0.12226834893226624,
0.6410109996795654,
-0.16449834406375885,
1.0624302625656128,
-0.6934837102890015,
-0.2185184210538864,
-0.4302908778190613,
0.24254776537418365,
0.13325516879558563,
0.4740506410598755,
0.7157189846038818,
-0.8007349967956543,
-0.618667483329773,
-0.19172757863998413,
-0.7420084476470947,
-0.3052895665168762,
0.2382364422082901,
0.03225459158420563,
0.2526286840438843,
0.18679718673229218,
-0.6189733147621155,
0.21191060543060303,
0.5433599352836609,
-0.39850836992263794,
0.45170995593070984,
-0.2182074636220932,
-0.13060560822486877,
-1.3507941961288452,
0.19470727443695068,
0.08445140719413757,
-0.12147211283445358,
-0.9063640236854553,
-0.02200930565595627,
-0.14148397743701935,
-0.1352730542421341,
-0.44272851943969727,
0.6442310214042664,
-0.46741339564323425,
0.10310035198926926,
-0.346534788608551,
-0.023199766874313354,
-0.030462799593806267,
0.4290684163570404,
0.047529350966215134,
0.9650529026985168,
0.3986675441265106,
-0.33440640568733215,
0.2722375690937042,
0.28766360878944397,
-0.23926441371440887,
0.21911917626857758,
-0.8310061693191528,
0.37629395723342896,
-0.20203983783721924,
0.22959718108177185,
-0.8878973126411438,
-0.1616460680961609,
0.37783923745155334,
-0.5609449744224548,
0.4874541759490967,
-0.39350172877311707,
-0.6913551688194275,
-0.5773557424545288,
-0.13033851981163025,
0.37267884612083435,
1.0941001176834106,
-0.4613761901855469,
0.1880086064338684,
0.38796988129615784,
-0.3033198118209839,
-0.45711255073547363,
-1.0710840225219727,
0.001869382569566369,
-0.25190818309783936,
-0.43954071402549744,
0.2403729110956192,
0.19964554905891418,
-0.31304726004600525,
-0.03346151113510132,
0.1192031279206276,
-0.2220275104045868,
0.03264202922582626,
0.14292114973068237,
0.22378063201904297,
0.021408317610621452,
0.005548074375838041,
0.21296516060829163,
-0.271453857421875,
-0.039769671857357025,
-0.5181360840797424,
0.6497704386711121,
-0.027825644239783287,
-0.014063544571399689,
-0.35664302110671997,
0.2892766296863556,
0.28569495677948,
-0.10890933126211166,
0.5339377522468567,
0.8362244367599487,
-0.40237292647361755,
0.019858330488204956,
-0.40191900730133057,
-0.36457791924476624,
-0.4670296013355255,
0.7691959738731384,
-0.21259072422981262,
-0.9279966950416565,
0.24590148031711578,
-0.0030825326684862375,
-0.008993327617645264,
0.6719167828559875,
0.8663622140884399,
0.0211371798068285,
0.9264235496520996,
0.548044741153717,
-0.15005071461200714,
0.47111794352531433,
-0.2136121243238449,
0.38078969717025757,
-0.8045397996902466,
-0.20954887568950653,
-0.49495047330856323,
-0.22315257787704468,
-0.7720122337341309,
-0.41156330704689026,
0.22847922146320343,
0.39094680547714233,
-0.2729418873786926,
0.381623774766922,
-0.7667489051818848,
0.4491100311279297,
0.624347448348999,
-0.11586464196443558,
0.06729573011398315,
0.2163492739200592,
0.053797632455825806,
-0.1333385854959488,
-0.5026620030403137,
-0.7481796741485596,
1.1477546691894531,
0.5820018649101257,
0.5458890795707703,
0.2444564700126648,
0.4058910012245178,
0.1901218444108963,
0.26125261187553406,
-0.38663744926452637,
0.23428098857402802,
-0.40247493982315063,
-0.8927587866783142,
-0.2952089011669159,
-0.3860483169555664,
-0.6733523011207581,
0.1791820377111435,
0.2202165126800537,
-0.8062000274658203,
-0.019230734556913376,
0.28468358516693115,
-0.021234650164842606,
0.3667987585067749,
-0.9518305659294128,
0.8171706795692444,
-0.1928875595331192,
-0.25743013620376587,
0.12474622577428818,
-0.650271475315094,
0.46610963344573975,
0.02165895886719227,
0.049234434962272644,
0.007000301964581013,
0.21846294403076172,
0.6792141795158386,
-0.6675053238868713,
0.7955493330955505,
-0.4700928330421448,
-0.14919954538345337,
0.5609260201454163,
-0.0774434506893158,
0.5561556816101074,
0.10082148760557175,
-0.028308700770139694,
0.40885210037231445,
-0.1444021463394165,
-0.19300217926502228,
-0.4023778438568115,
0.5342956781387329,
-0.9189901947975159,
-0.4604566991329193,
-0.6151334643363953,
-0.37888115644454956,
0.28390151262283325,
0.1694415658712387,
0.5684319734573364,
0.258111834526062,
-0.0699101909995079,
-0.21969276666641235,
0.4766472280025482,
-0.32620754837989807,
0.5061010122299194,
0.25449666380882263,
-0.1417965441942215,
-0.29505258798599243,
0.7346757054328918,
0.11290766298770905,
0.42422541975975037,
0.1725950837135315,
0.20087723433971405,
-0.6464307308197021,
-0.3853971064090729,
-0.6379641890525818,
0.13106639683246613,
-0.5468816757202148,
-0.010849983431398869,
-0.6594603061676025,
-0.3330208957195282,
-0.5847020745277405,
0.3776516318321228,
-0.3150324821472168,
-0.501894474029541,
-0.3634456992149353,
-0.19034069776535034,
0.4611719250679016,
0.789322555065155,
-0.07828389108181,
0.34973040223121643,
-0.359024316072464,
0.2644980549812317,
0.44826358556747437,
0.37089216709136963,
-0.14211595058441162,
-0.7238242626190186,
0.11014486104249954,
0.337197482585907,
-0.5102708339691162,
-0.8676735758781433,
0.2029038369655609,
0.1414453238248825,
0.28909826278686523,
0.1719139963388443,
-0.13069525361061096,
0.3763827681541443,
-0.6116216778755188,
1.0269215106964111,
0.21332493424415588,
-0.8407469987869263,
0.5998806953430176,
-0.2154000997543335,
0.06256280094385147,
0.45755577087402344,
0.2481001764535904,
-0.6014494299888611,
-0.42143604159355164,
-0.3933826684951782,
-0.8880963921546936,
0.9779914021492004,
0.6977978348731995,
0.35553449392318726,
-0.10211772471666336,
0.43386006355285645,
0.07722976058721542,
0.13435113430023193,
-1.031292200088501,
-0.5494038462638855,
-0.5161272287368774,
-0.24672019481658936,
-0.025167908519506454,
-0.6669707894325256,
0.056286901235580444,
-0.23068518936634064,
0.6955719590187073,
0.1282198429107666,
0.5561532974243164,
0.17389380931854248,
-0.15844205021858215,
0.2117733657360077,
0.31488239765167236,
0.6522135138511658,
0.38301321864128113,
-0.08996926248073578,
0.2678183317184448,
0.10207415372133255,
-0.8013430833816528,
0.2872013449668884,
0.3473336100578308,
-0.4839618504047394,
0.13266418874263763,
0.18052205443382263,
1.0641231536865234,
-0.4316845238208771,
-0.28973010182380676,
0.5224101543426514,
0.0006533796549774706,
-0.3431776463985443,
-0.3544093668460846,
-0.004990886896848679,
0.12682844698429108,
0.24337927997112274,
0.27840596437454224,
-0.10856853425502777,
0.17004536092281342,
-0.7174732089042664,
0.10450664907693863,
0.138703852891922,
-0.4600818157196045,
-0.3717900216579437,
1.0260999202728271,
0.2628348171710968,
-0.2892743945121765,
0.80785071849823,
-0.5525161027908325,
-0.658165454864502,
0.5259265899658203,
0.7344656586647034,
0.9176239967346191,
-0.1547577679157257,
0.25435954332351685,
0.5052648782730103,
0.4772353172302246,
-0.43624138832092285,
0.1938323676586151,
0.2597699463367462,
-0.4882529079914093,
-0.27511832118034363,
-0.5890159010887146,
0.12815973162651062,
0.3435707986354828,
-0.3629325330257416,
0.27451568841934204,
-0.2989485263824463,
-0.39915141463279724,
-0.19923794269561768,
-0.11194407194852829,
-0.5237016677856445,
0.03836808726191521,
0.14243638515472412,
0.6396914720535278,
-1.0864371061325073,
0.9635149240493774,
0.520175576210022,
-0.6943855285644531,
-0.8406863808631897,
0.10404954850673676,
0.14530730247497559,
-0.5124715566635132,
0.7215794920921326,
0.17738355696201324,
0.3947673439979553,
0.04437534511089325,
-0.25011327862739563,
-0.5770787000656128,
1.242838978767395,
0.4082915484905243,
-0.6541252732276917,
-0.14843451976776123,
0.5912708044052124,
0.7015088200569153,
0.056120678782463074,
0.7250467538833618,
0.7755418419837952,
0.5597825646400452,
-0.052530769258737564,
-1.1433799266815186,
0.25824469327926636,
-0.3229130804538727,
0.4018814265727997,
0.12984447181224823,
-0.6774179339408875,
1.1452372074127197,
-0.12412158399820328,
-0.20628733932971954,
0.04827529191970825,
0.43177223205566406,
0.09465222805738449,
0.09444079548120499,
0.3435218036174774,
0.6422134041786194,
0.6340004205703735,
-0.586469829082489,
1.104017972946167,
-0.05960334837436676,
0.7416789531707764,
1.1188803911209106,
-0.188157856464386,
0.3989291191101074,
0.41967540979385376,
-0.5277469754219055,
0.31866347789764404,
0.5355705618858337,
-0.20408272743225098,
0.7661280035972595,
0.22113865613937378,
-0.13725298643112183,
0.3634769022464752,
0.043836560100317,
-0.6445651650428772,
0.2425004094839096,
-0.01855052448809147,
-0.5454530119895935,
-0.25298741459846497,
-0.12532827258110046,
0.4204944670200348,
-0.13984693586826324,
0.21826598048210144,
0.8081325888633728,
0.2217780351638794,
-0.6799306273460388,
0.524803876876831,
0.3625078797340393,
0.736051619052887,
-0.5104166269302368,
-0.07904054969549179,
-0.2734394669532776,
0.25944334268569946,
-0.07597152888774872,
-0.8402581214904785,
0.3393779397010803,
0.2911926805973053,
-0.35589149594306946,
-0.26534122228622437,
0.641334593296051,
-0.6999664306640625,
-0.48108428716659546,
0.2496575117111206,
0.3454931974411011,
0.30572474002838135,
-0.21682794392108917,
-0.7200984954833984,
-0.1303914487361908,
0.14773529767990112,
-0.3552591800689697,
0.3563537299633026,
0.39629894495010376,
-0.040259964764118195,
0.29617488384246826,
0.5899072289466858,
0.031227296218276024,
-0.04556633159518242,
0.17594049870967865,
0.8116534948348999,
-0.24042898416519165,
-0.25841832160949707,
-1.1533931493759155,
0.6338663101196289,
-0.15533366799354553,
-0.3729175329208374,
0.5352694392204285,
0.7678796052932739,
0.8613851070404053,
-0.20806367695331573,
1.1310893297195435,
-0.552790105342865,
0.2613351345062256,
-0.3940175771713257,
0.9326520562171936,
-0.3682365417480469,
0.07238960266113281,
-0.300742506980896,
-1.0913441181182861,
0.01842465251684189,
0.5549391508102417,
-0.22529268264770508,
0.362774521112442,
0.6757560968399048,
0.8145245909690857,
-0.10390648990869522,
-0.2072211056947708,
-0.04385819286108017,
0.3124942481517792,
0.5542547702789307,
0.6194531917572021,
0.6299304366111755,
-0.6800296902656555,
0.6543409824371338,
-0.305654913187027,
-0.3386359214782715,
-0.03646516799926758,
-0.7043834924697876,
-0.9335730075836182,
-0.6070002913475037,
-0.20270554721355438,
-0.5213870406150818,
0.13534249365329742,
0.6799128651618958,
0.5115624666213989,
-0.7329234480857849,
-0.07831007987260818,
-0.29116180539131165,
-0.0858360007405281,
-0.08045299351215363,
-0.2457301914691925,
0.3255467116832733,
-0.17272844910621643,
-0.996368944644928,
-0.10018344223499298,
0.02897721901535988,
0.33839526772499084,
-0.21560822427272797,
-0.22437065839767456,
-0.015680518001317978,
-0.2774626612663269,
0.635657787322998,
0.00861370749771595,
-0.7015872001647949,
-0.2592902183532715,
-0.30930623412132263,
-0.18209940195083618,
-0.0856676995754242,
0.6891862154006958,
-0.3766905665397644,
0.3161379396915436,
0.5141279697418213,
0.16241559386253357,
0.5605081915855408,
-0.09704137593507767,
0.5288156270980835,
-0.6861833333969116,
0.51784348487854,
0.11570543050765991,
0.2883676588535309,
0.26737964153289795,
-0.35448580980300903,
0.6825019717216492,
0.3844261169433594,
-0.5054682493209839,
-0.774878978729248,
0.2479102611541748,
-0.6761664748191833,
-0.30184221267700195,
1.4982080459594727,
-0.19237686693668365,
0.08299793303012848,
-0.09049441665410995,
-0.431971937417984,
0.6297547221183777,
-0.34565842151641846,
0.7438265681266785,
0.7185499668121338,
0.29133298993110657,
-0.012696397490799427,
-0.8473063111305237,
0.6948195695877075,
0.22396241128444672,
-0.6755278706550598,
0.1607436239719391,
0.263872891664505,
0.595809280872345,
-0.010340111330151558,
0.6688128113746643,
-0.3380209505558014,
-0.01875380054116249,
0.08541440218687057,
0.15490509569644928,
-0.2792567312717438,
-0.08467599749565125,
-0.23751752078533173,
-0.12379713356494904,
0.07837580144405365,
-0.033446092158555984
] |
roberta-base | null | "2023-03-06T15:14:53" | 14,521,862 | 247 | transformers | [
"transformers",
"pytorch",
"tf",
"jax",
"rust",
"safetensors",
"roberta",
"fill-mask",
"exbert",
"en",
"dataset:bookcorpus",
"dataset:wikipedia",
"arxiv:1907.11692",
"arxiv:1806.02847",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | fill-mask | "2022-03-02T23:29:04" | ---
language: en
tags:
- exbert
license: mit
datasets:
- bookcorpus
- wikipedia
---
# RoBERTa base model
Pretrained model on the English language using a masked language modeling (MLM) objective. It was introduced in
[this paper](https://arxiv.org/abs/1907.11692) and first released in
[this repository](https://github.com/pytorch/fairseq/tree/master/examples/roberta). This model is case-sensitive: it
distinguishes between english and English.
Disclaimer: The team releasing RoBERTa did not write a model card for this model so this model card has been written by
the Hugging Face team.
## Model description
RoBERTa is a transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means
it was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of
publicly available data) with an automatic process to generate inputs and labels from those texts.
More precisely, it was pretrained with the masked language modeling (MLM) objective. Taking a sentence, the model
randomly masks 15% of the words in the input, then runs the entire masked sentence through the model and has to predict
the masked words. This is different from traditional recurrent neural networks (RNNs), which usually see the words one
after the other, or from autoregressive models like GPT which internally mask the future tokens. It allows the model to
learn a bidirectional representation of the sentence.
This way, the model learns an inner representation of the English language that can then be used to extract features
useful for downstream tasks: if you have a dataset of labeled sentences, for instance, you can train a standard
classifier using the features produced by the RoBERTa model as inputs.
## Intended uses & limitations
You can use the raw model for masked language modeling, but it's mostly intended to be fine-tuned on a downstream task.
See the [model hub](https://huggingface.co/models?filter=roberta) to look for fine-tuned versions on a task that
interests you.
Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)
to make decisions, such as sequence classification, token classification or question answering. For tasks such as text
generation you should look at a model like GPT2.
### How to use
You can use this model directly with a pipeline for masked language modeling:
```python
>>> from transformers import pipeline
>>> unmasker = pipeline('fill-mask', model='roberta-base')
>>> unmasker("Hello I'm a <mask> model.")
[{'sequence': "<s>Hello I'm a male model.</s>",
'score': 0.3306540250778198,
'token': 2943,
'token_str': 'Ġmale'},
{'sequence': "<s>Hello I'm a female model.</s>",
'score': 0.04655390977859497,
'token': 2182,
'token_str': 'Ġfemale'},
{'sequence': "<s>Hello I'm a professional model.</s>",
'score': 0.04232972860336304,
'token': 2038,
'token_str': 'Ġprofessional'},
{'sequence': "<s>Hello I'm a fashion model.</s>",
'score': 0.037216778844594955,
'token': 2734,
'token_str': 'Ġfashion'},
{'sequence': "<s>Hello I'm a Russian model.</s>",
'score': 0.03253649175167084,
'token': 1083,
'token_str': 'ĠRussian'}]
```
Here is how to use this model to get the features of a given text in PyTorch:
```python
from transformers import RobertaTokenizer, RobertaModel
tokenizer = RobertaTokenizer.from_pretrained('roberta-base')
model = RobertaModel.from_pretrained('roberta-base')
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
and in TensorFlow:
```python
from transformers import RobertaTokenizer, TFRobertaModel
tokenizer = RobertaTokenizer.from_pretrained('roberta-base')
model = TFRobertaModel.from_pretrained('roberta-base')
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
### Limitations and bias
The training data used for this model contains a lot of unfiltered content from the internet, which is far from
neutral. Therefore, the model can have biased predictions:
```python
>>> from transformers import pipeline
>>> unmasker = pipeline('fill-mask', model='roberta-base')
>>> unmasker("The man worked as a <mask>.")
[{'sequence': '<s>The man worked as a mechanic.</s>',
'score': 0.08702439814805984,
'token': 25682,
'token_str': 'Ġmechanic'},
{'sequence': '<s>The man worked as a waiter.</s>',
'score': 0.0819653645157814,
'token': 38233,
'token_str': 'Ġwaiter'},
{'sequence': '<s>The man worked as a butcher.</s>',
'score': 0.073323555290699,
'token': 32364,
'token_str': 'Ġbutcher'},
{'sequence': '<s>The man worked as a miner.</s>',
'score': 0.046322137117385864,
'token': 18678,
'token_str': 'Ġminer'},
{'sequence': '<s>The man worked as a guard.</s>',
'score': 0.040150221437215805,
'token': 2510,
'token_str': 'Ġguard'}]
>>> unmasker("The Black woman worked as a <mask>.")
[{'sequence': '<s>The Black woman worked as a waitress.</s>',
'score': 0.22177888453006744,
'token': 35698,
'token_str': 'Ġwaitress'},
{'sequence': '<s>The Black woman worked as a prostitute.</s>',
'score': 0.19288744032382965,
'token': 36289,
'token_str': 'Ġprostitute'},
{'sequence': '<s>The Black woman worked as a maid.</s>',
'score': 0.06498628109693527,
'token': 29754,
'token_str': 'Ġmaid'},
{'sequence': '<s>The Black woman worked as a secretary.</s>',
'score': 0.05375480651855469,
'token': 2971,
'token_str': 'Ġsecretary'},
{'sequence': '<s>The Black woman worked as a nurse.</s>',
'score': 0.05245552211999893,
'token': 9008,
'token_str': 'Ġnurse'}]
```
This bias will also affect all fine-tuned versions of this model.
## Training data
The RoBERTa model was pretrained on the combination of five datasets:
- [BookCorpus](https://yknzhu.wixsite.com/mbweb), a dataset consisting of 11,038 unpublished books;
- [English Wikipedia](https://en.wikipedia.org/wiki/English_Wikipedia) (excluding lists, tables and headers);
- [CC-News](https://commoncrawl.org/2016/10/news-dataset-available/), a dataset containing 63 million English news
articles crawled between September 2016 and February 2019;
- [OpenWebText](https://github.com/jcpeterson/openwebtext), an open-source recreation of the WebText dataset used to
train GPT-2;
- [Stories](https://arxiv.org/abs/1806.02847), a dataset containing a subset of CommonCrawl data filtered to match the
story-like style of Winograd schemas.
Together, these datasets amount to 160GB of text.
## Training procedure
### Preprocessing
The texts are tokenized using a byte-level version of Byte-Pair Encoding (BPE) and a vocabulary size of 50,000. The inputs of
the model take pieces of 512 contiguous tokens that may span multiple documents. The beginning of a new document is marked
with `<s>` and its end with `</s>`.
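The packing step described above can be sketched in plain Python. This is an illustrative sketch, not the actual fairseq preprocessing code; `pack_sequences` and the literal token strings are hypothetical names chosen for clarity:

```python
def pack_sequences(documents, max_len=512):
    """Greedily pack tokenized documents into fixed-length training inputs
    that may span document boundaries, wrapping each document in <s>...</s>."""
    buffer, inputs = [], []
    for doc in documents:
        buffer += ["<s>"] + list(doc) + ["</s>"]
        while len(buffer) >= max_len:      # emit full max_len-token pieces
            inputs.append(buffer[:max_len])
            buffer = buffer[max_len:]
    if buffer:                             # keep the trailing, shorter piece
        inputs.append(buffer)
    return inputs
```

Because pieces may cross document boundaries, a single 512-token input can contain the end of one document and the start of the next, separated by `</s><s>`.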
The details of the masking procedure for each sentence are the following:
- 15% of the tokens are masked.
- In 80% of the cases, the masked tokens are replaced by `<mask>`.
- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.
- In the 10% remaining cases, the masked tokens are left as is.
Contrary to BERT, the masking is done dynamically during pretraining (i.e., it changes at each epoch and is not fixed).
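The 80/10/10 rule above can be sketched as follows. This is an illustrative sketch with a toy vocabulary; `dynamic_mask` is a hypothetical helper, not part of the released code:

```python
import random

MASK = "<mask>"
VOCAB = ["Ġthe", "Ġa", "Ġdog", "Ġran"]  # toy vocabulary (illustrative only)

def dynamic_mask(tokens, mask_prob=0.15, seed=None):
    """Apply the 80/10/10 masking rule to a fresh copy of `tokens`.

    Re-invoked on every epoch, so the mask pattern changes each time
    (dynamic masking), unlike BERT's fixed preprocessing-time masks.
    """
    rng = random.Random(seed)
    out = list(tokens)
    for i in range(len(out)):
        if rng.random() < mask_prob:       # select ~15% of positions
            r = rng.random()
            if r < 0.8:                    # 80%: replace with <mask>
                out[i] = MASK
            elif r < 0.9:                  # 10%: a different random token
                out[i] = rng.choice([t for t in VOCAB if t != out[i]])
            # remaining 10%: leave the token unchanged
    return out
```

Calling `dynamic_mask` anew for each epoch is what makes the masking dynamic; the model still learns to predict the original token at every selected position.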
### Pretraining
The model was trained on 1024 V100 GPUs for 500K steps with a batch size of 8K and a sequence length of 512. The
optimizer used is Adam with a learning rate of 6e-4, \\(\beta_{1} = 0.9\\), \\(\beta_{2} = 0.98\\) and
\\(\epsilon = 1e-6\\), a weight decay of 0.01, learning rate warmup for 24,000 steps and linear decay of the learning
rate after.
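The schedule above (linear warmup for 24,000 steps, then linear decay) can be written out as a small function. This is a sketch under the stated hyperparameters, assuming decay to zero at the final step; `lr_at` is a hypothetical name:

```python
def lr_at(step, peak=6e-4, warmup=24_000, total=500_000):
    """Learning rate at a given step: linear warmup to `peak` over
    `warmup` steps, then linear decay back to zero at `total` steps."""
    if step < warmup:
        return peak * step / warmup
    return peak * max(0.0, (total - step) / (total - warmup))
```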
## Evaluation results
When fine-tuned on downstream tasks, this model achieves the following results:
GLUE test results:
| Task | MNLI | QQP | QNLI | SST-2 | CoLA | STS-B | MRPC | RTE |
|:----:|:----:|:----:|:----:|:-----:|:----:|:-----:|:----:|:----:|
| | 87.6 | 91.9 | 92.8 | 94.8 | 63.6 | 91.2 | 90.2 | 78.7 |
### BibTeX entry and citation info
```bibtex
@article{DBLP:journals/corr/abs-1907-11692,
author = {Yinhan Liu and
Myle Ott and
Naman Goyal and
Jingfei Du and
Mandar Joshi and
Danqi Chen and
Omer Levy and
Mike Lewis and
Luke Zettlemoyer and
Veselin Stoyanov},
title = {RoBERTa: {A} Robustly Optimized {BERT} Pretraining Approach},
journal = {CoRR},
volume = {abs/1907.11692},
year = {2019},
url = {http://arxiv.org/abs/1907.11692},
archivePrefix = {arXiv},
eprint = {1907.11692},
timestamp = {Thu, 01 Aug 2019 08:59:33 +0200},
biburl = {https://dblp.org/rec/journals/corr/abs-1907-11692.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
<a href="https://huggingface.co/exbert/?model=roberta-base">
<img width="300px" src="https://cdn-media.huggingface.co/exbert/button.png">
</a>
| [
-0.15874525904655457,
-0.7864440083503723,
0.2143075317144394,
-0.01406010426580906,
-0.36069953441619873,
-0.06675083190202713,
-0.35677042603492737,
-0.37536904215812683,
0.27712708711624146,
0.408024400472641,
-0.5700899958610535,
-0.5863821506500244,
-0.9068484306335449,
0.016096128150820732,
-0.35417822003364563,
1.3477998971939087,
0.15623240172863007,
0.20087386667728424,
0.06800553947687149,
0.1846427321434021,
-0.30424419045448303,
-0.5910210013389587,
-0.5965970754623413,
-0.3084702789783478,
0.24352221190929413,
-0.07217985391616821,
0.5531235933303833,
0.5253962874412537,
0.3077291250228882,
0.3472326993942261,
-0.2576225697994232,
0.07259778678417206,
-0.41709104180336,
0.012396411970257759,
-0.0662979856133461,
-0.4749290943145752,
-0.3101073205471039,
0.27496975660324097,
0.31947046518325806,
0.564972460269928,
-0.06504391878843307,
0.40428921580314636,
0.17490199208259583,
0.417063444852829,
-0.25276538729667664,
0.20770277082920074,
-0.6348682045936584,
-0.0244924183934927,
-0.3315502107143402,
0.1614479124546051,
-0.37584176659584045,
-0.15675367414951324,
0.15380899608135223,
-0.39519381523132324,
0.42506900429725647,
-0.03388790413737297,
1.3685752153396606,
0.16098587214946747,
-0.26381415128707886,
-0.2746463119983673,
-0.5668419599533081,
0.9625809788703918,
-0.8623589277267456,
0.19568663835525513,
0.43717142939567566,
0.10399901866912842,
-0.12592695653438568,
-0.9214186668395996,
-0.5824969410896301,
-0.10253732651472092,
-0.24958012998104095,
0.16203950345516205,
-0.3511531352996826,
-0.1853683739900589,
0.29387804865837097,
0.42353302240371704,
-0.6683114171028137,
-0.1447506546974182,
-0.6433879733085632,
-0.29888954758644104,
0.5493419170379639,
0.0022938260808587074,
0.2629498839378357,
-0.41872933506965637,
-0.36308181285858154,
-0.2078268975019455,
-0.25006410479545593,
0.14492365717887878,
0.5394351482391357,
0.3533642590045929,
-0.24127784371376038,
0.5155608057975769,
-0.07189054787158966,
0.7403894662857056,
0.06429418176412582,
-0.28648388385772705,
0.5365337133407593,
-0.2107376605272293,
-0.2978352904319763,
-0.23597051203250885,
0.9668583273887634,
0.2658221423625946,
0.3551822006702423,
-0.06192290410399437,
-0.17604489624500275,
0.22386962175369263,
0.20983895659446716,
-0.7085675597190857,
-0.2934303283691406,
0.25163733959198,
-0.515845537185669,
-0.4759533703327179,
0.17110902070999146,
-0.7976714372634888,
0.015365427359938622,
-0.1213303804397583,
0.5736854672431946,
-0.390480637550354,
-0.19224311411380768,
0.17187798023223877,
-0.37315043807029724,
0.203351229429245,
0.06352704763412476,
-0.8573644161224365,
0.12747101485729218,
0.4722135663032532,
0.9025866389274597,
0.06312549114227295,
-0.1869470626115799,
-0.24649789929389954,
-0.10792459547519684,
0.006349640898406506,
0.44739198684692383,
-0.31369996070861816,
-0.06275366991758347,
-0.1049436405301094,
0.2847490906715393,
-0.23188038170337677,
-0.24185092747211456,
0.48084235191345215,
-0.3002033829689026,
0.6689082980155945,
0.20337238907814026,
-0.38987183570861816,
-0.3200414180755615,
0.1744949072599411,
-0.5676715970039368,
1.191321611404419,
0.2533036768436432,
-0.9026476144790649,
0.2792940139770508,
-0.6562745571136475,
-0.41829341650009155,
-0.1852799654006958,
0.1464909017086029,
-0.6917725801467896,
-0.05100441351532936,
0.31247055530548096,
0.4727456867694855,
-0.32430198788642883,
0.3682880699634552,
-0.048681046813726425,
-0.3440217673778534,
0.2940575182437897,
-0.38268956542015076,
1.3796330690383911,
0.19620521366596222,
-0.6193810105323792,
0.0009235059842467308,
-0.7901957631111145,
-0.0329165980219841,
0.4549154043197632,
-0.3787447214126587,
-0.1016029343008995,
-0.20794735848903656,
0.18133018910884857,
0.28898367285728455,
0.19154375791549683,
-0.543763279914856,
0.14926353096961975,
-0.48575180768966675,
0.696351945400238,
0.7396957874298096,
-0.12020424008369446,
0.2451099008321762,
-0.43273207545280457,
0.5591078996658325,
-0.042191699147224426,
0.1916104555130005,
-0.20709367096424103,
-0.692968487739563,
-0.7333182692527771,
-0.47308990359306335,
0.673233151435852,
0.6932870149612427,
-0.4818387031555176,
0.5464655756950378,
-0.09212487190961838,
-0.5784304738044739,
-0.8699588775634766,
-0.06346599012613297,
0.468717485666275,
0.5795320272445679,
0.45458292961120605,
-0.44897469878196716,
-0.6535660028457642,
-0.7540098428726196,
-0.2846292555332184,
0.0941663384437561,
-0.28063860535621643,
0.2805483639240265,
0.6210357546806335,
-0.2814563512802124,
0.692162811756134,
-0.605465292930603,
-0.5212743282318115,
-0.33916762471199036,
0.08663266152143478,
0.5576511025428772,
0.6780513525009155,
0.47286251187324524,
-0.6304001212120056,
-0.49981629848480225,
-0.24478134512901306,
-0.724946916103363,
0.1350666582584381,
-0.005817367695271969,
-0.14433303475379944,
0.386711984872818,
0.3547315299510956,
-0.7952368855476379,
0.5737815499305725,
0.558641791343689,
-0.34968024492263794,
0.6156887412071228,
-0.22577813267707825,
-0.06924597918987274,
-1.35037362575531,
0.21780243515968323,
0.00909310020506382,
-0.27304980158805847,
-0.795632004737854,
0.054522205144166946,
-0.19967018067836761,
-0.18967117369174957,
-0.4613361358642578,
0.5351645350456238,
-0.5793722867965698,
-0.015290437266230583,
0.032307397574186325,
0.15328708291053772,
0.1420244425535202,
0.8049898147583008,
0.010588851757347584,
0.6667850613594055,
0.5857158303260803,
-0.3313373625278473,
0.16765666007995605,
0.35758641362190247,
-0.4750540256500244,
0.18550936877727509,
-0.7488389015197754,
0.2911365330219269,
-0.11483044177293777,
0.09503638744354248,
-0.9966416358947754,
-0.1064378023147583,
0.32457005977630615,
-0.761180579662323,
0.40123510360717773,
-0.43407630920410156,
-0.4880724549293518,
-0.5555427074432373,
-0.17979194223880768,
0.1633041501045227,
0.7077209949493408,
-0.3327886760234833,
0.6630650758743286,
0.40096402168273926,
-0.07689984142780304,
-0.765596330165863,
-0.7810398936271667,
-0.02729788050055504,
-0.23340629041194916,
-0.7212831974029541,
0.46187230944633484,
0.052504945546388626,
-0.07932552695274353,
-0.10234315693378448,
0.04130878672003746,
-0.12966518104076385,
0.1263253092765808,
0.26954224705696106,
0.45463046431541443,
-0.0460829883813858,
-0.13280418515205383,
-0.20046348869800568,
-0.1418582797050476,
0.03413921594619751,
-0.4595811367034912,
0.9375596046447754,
-0.043696217238903046,
-0.047264255583286285,
-0.42396974563598633,
0.21585485339164734,
0.3764243423938751,
-0.3626015782356262,
0.8774513006210327,
1.0447720289230347,
-0.3760637044906616,
0.01888333633542061,
-0.4213905334472656,
-0.24551811814308167,
-0.44686731696128845,
0.4352600574493408,
-0.31728583574295044,
-0.846406102180481,
0.6663037538528442,
0.31882068514823914,
-0.1396031230688095,
0.7338795065879822,
0.5770137310028076,
-0.13191290199756622,
1.0038763284683228,
0.4565885663032532,
-0.12405499070882797,
0.491949200630188,
-0.6054865717887878,
0.174162819981575,
-0.8270164728164673,
-0.3345523476600647,
-0.5013993382453918,
-0.2609914243221283,
-0.6837285757064819,
-0.4196016788482666,
0.29264816641807556,
0.12485715746879578,
-0.22343024611473083,
0.4993566870689392,
-0.7471539974212646,
0.26198792457580566,
0.8502834439277649,
0.3953166902065277,
-0.024728933349251747,
0.08437538146972656,
-0.23530712723731995,
-0.03716736659407616,
-0.6153460741043091,
-0.41782307624816895,
1.3027119636535645,
0.46665215492248535,
0.5281667113304138,
0.028761180117726326,
0.6532049775123596,
0.2778868079185486,
0.02615508623421192,
-0.4331320822238922,
0.40141725540161133,
-0.2828231751918793,
-0.8693122863769531,
-0.28115007281303406,
-0.2952538728713989,
-1.0575296878814697,
0.2689662575721741,
-0.338844895362854,
-0.8679628968238831,
0.05496549606323242,
-0.05691435560584068,
-0.16372866928577423,
0.39717942476272583,
-0.6944206357002258,
1.0001769065856934,
-0.1539529263973236,
-0.3292887508869171,
-0.029785197228193283,
-0.8237899541854858,
0.3098992109298706,
0.1359819620847702,
0.09194840490818024,
0.02975970320403576,
0.37377268075942993,
0.9714796543121338,
-0.48967215418815613,
1.0000507831573486,
-0.24576525390148163,
0.10030152648687363,
0.23550526797771454,
-0.04332708939909935,
0.5753719210624695,
-0.14761529862880707,
0.0044474215246737,
0.605459451675415,
-0.18572939932346344,
-0.5006523132324219,
-0.2974969148635864,
0.3875495195388794,
-0.8795266151428223,
-0.6385089755058289,
-0.6676780581474304,
-0.5847854018211365,
0.25884389877319336,
0.37107157707214355,
0.5768234729766846,
0.5314850211143494,
0.09614178538322449,
0.06980429589748383,
0.4649685323238373,
-0.22470423579216003,
0.4613286256790161,
0.2797612249851227,
-0.06145381182432175,
-0.454635351896286,
0.7003443837165833,
0.07910331338644028,
0.21185775101184845,
0.26903676986694336,
0.0745246484875679,
-0.34686940908432007,
Dataset Card for Hugging Face Hub Model Cards with Embeddings
This dataset consists of model cards for models hosted on the Hugging Face Hub. The model cards are created by the community and provide information about the model, its performance, its intended uses, and more. This dataset is updated on a daily basis and includes publicly available models on the Hugging Face Hub.
This dataset is made available to support users who want to work with a large number of model cards from the Hub. We hope it will support research into model cards and their use, although its format may not suit every use case. If there are other features you would like to see included in this dataset, please open a new discussion.
This dataset is the same as the Hugging Face Hub Model Cards dataset but with the addition of embeddings for each model card. The embeddings are generated using the jinaai/jina-embeddings-v2-base-en model.
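Because each embedding is stored as a plain list of floats, standard vector arithmetic applies to it directly. The sketch below compares card embeddings with cosine similarity using NumPy; the three-dimensional toy vectors are stand-ins for the much longer vectors produced by the Jina model.

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    a = np.asarray(a, dtype=np.float64)
    b = np.asarray(b, dtype=np.float64)
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy stand-ins for values from the dataset's embedding column.
card_a = [0.2, -0.5, 0.8]
card_b = [0.1, -0.4, 0.9]
card_c = [-0.7, 0.6, -0.1]

print(cosine_similarity(card_a, card_b))  # close to 1.0: similar cards
print(cosine_similarity(card_a, card_c))  # negative: dissimilar cards
```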
Dataset Details
Uses
There are a number of potential uses for this dataset including:
- text mining to find common themes in model cards
- analysis of the model card format/content
- topic modelling of model cards
- analysis of the model card metadata
- training language models on model cards
- building a recommender system for model cards
- building a search engine for model cards
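As a sketch of the search-engine idea, the snippet below ranks rows by cosine similarity between their embedding and a query embedding. It runs on a small in-memory toy list; in practice the rows would come from this dataset and the query embedding from the same Jina model, which is an assumption about your setup rather than something shown here.

```python
import numpy as np

def top_k(query, rows, k=2):
    """Return the k model ids whose embeddings are closest to `query`
    by cosine similarity, best match first."""
    q = np.asarray(query, dtype=np.float64)
    q = q / np.linalg.norm(q)
    scored = []
    for model_id, emb in rows:
        e = np.asarray(emb, dtype=np.float64)
        scored.append((float(np.dot(q, e / np.linalg.norm(e))), model_id))
    scored.sort(reverse=True)
    return [model_id for _, model_id in scored[:k]]

# Toy rows shaped like (modelId, embedding); real embeddings are far longer.
rows = [
    ("org/asr-model",   [0.9, 0.1, 0.0]),
    ("org/image-model", [-0.2, 0.8, 0.5]),
    ("org/other-asr",   [0.8, 0.2, 0.1]),
]

print(top_k([1.0, 0.0, 0.0], rows))  # the two ASR-like rows rank first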
Out-of-Scope Use
[More Information Needed]
Dataset Structure
This dataset has a single split. Each row contains the following columns: modelId, author, last_modified, downloads, likes, library_name, tags, pipeline_tag, createdAt, the raw card text, and the embedding vector computed from that card.
Dataset Creation
Curation Rationale
The dataset was created to assist people in working with model cards. In particular, it was created to support research in the area of model cards and their use. It is also possible to use the Hugging Face Hub API or client library to download model cards, and this option may be preferable if you have a very specific use case or require a different format.
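If you only need a handful of cards, the huggingface_hub client library can fetch them directly. A minimal sketch (this assumes network access and an installed huggingface_hub; the repo id is only an example):

```python
from huggingface_hub import ModelCard

# Fetch one card from the Hub; any public repo id works here.
card = ModelCard.load("distilbert-base-uncased")

print(card.data.to_dict().get("license"))  # metadata parsed from the YAML header
print(card.text[:100])                     # start of the Markdown body
```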
Source Data
The source data is the README.md file for each model hosted on the Hugging Face Hub. We do not include any other supplementary files that may be stored alongside the model card.
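Each card in this dataset is the raw README.md text, which typically opens with a YAML metadata block delimited by --- lines. A minimal sketch of separating the metadata from the Markdown body, assuming that common layout:

```python
def split_card(card_text):
    """Split a raw model card into (yaml_header, markdown_body).

    Returns ("", card_text) when there is no metadata block."""
    if card_text.startswith("---\n"):
        end = card_text.find("\n---\n", 4)
        if end != -1:
            return card_text[4:end], card_text[end + 5:]
    return "", card_text

sample = "---\nlicense: apache-2.0\nlanguage: en\n---\n# My model\nDetails here.\n"
header, body = split_card(sample)
print(header)  # the YAML metadata lines
print(body)    # the Markdown body
```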
Data Collection and Processing
The data is downloaded daily using a cron job.
Who are the source data producers?
The source data producers are the creators of the model cards on the Hugging Face Hub. This includes a broad variety of people from the community ranging from large companies to individual researchers. We do not gather any information about who created the model card in this repository although this information can be gathered from the Hugging Face Hub API.
Annotations
There are no additional annotations in this dataset beyond the model card content.
Annotation process
N/A
Who are the annotators?
N/A
Personal and Sensitive Information
We make no effort to anonymize the data. Whilst we don't expect the majority of model cards to contain personal or sensitive information, it is possible that some do. Model cards may also contain links to websites or email addresses.
Bias, Risks, and Limitations
Model cards are created by the community, and we do not have any control over their content. We do not review the content of the model cards, and we make no claims about the accuracy of the information they contain. Some model cards themselves discuss bias, sometimes by providing examples of bias in either the training data or the model's responses. As a result, this dataset may contain examples of bias.
Whilst we do not directly download any images linked to the model cards, some model cards may include images. Some of these images may not be suitable for all audiences.
Recommendations
Citation
No formal citation is required for this dataset but if you use this dataset in your work, please include a link to this dataset page.
Dataset Card Authors
Dataset Card Contact