We fine-tuned on the BEEP! dataset for 10 epochs; the results are shown below.
| Split | Loss | Acc | Prec | Rec | F1 |
|---|---|---|---|---|---|
| TRAIN | 0.11 | 0.965 | 0.966 | 0.972 | 0.969 |
| VAL | 0.73 | 0.807 | 0.947 | 0.749 | 0.837 |
When classifying with a threshold of 0.5, the accuracy on the dev set is 0.85.
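These numbers come from the linked notebook; as a minimal sketch of how the threshold-0.5 metrics can be computed with scikit-learn, where `dev_probs` and `dev_labels` are hypothetical placeholders for the dev-set hate probabilities and gold labels (not the real data):

```python
# Minimal sketch: apply the 0.5 threshold and compute Acc/Prec/Rec/F1.
# `dev_probs` / `dev_labels` are hypothetical placeholders, not the real dev set.
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

dev_probs = np.array([0.92, 0.13, 0.66, 0.08])  # P(hate) per comment (placeholder)
dev_labels = np.array([1, 0, 1, 0])             # gold labels (placeholder)

dev_preds = (dev_probs >= 0.5).astype(int)      # threshold 0.5 -> hard labels
acc = accuracy_score(dev_labels, dev_preds)
prec, rec, f1, _ = precision_recall_fscore_support(dev_labels, dev_preds, average='binary')
print(f'acc={acc:.3f} prec={prec:.3f} rec={rec:.3f} f1={f1:.3f}')
```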
We also visualized the resulting embeddings with t-SNE.
https://v5.core.today/notebook/34XX0RYM4#KcELECTRA_base_beep.ipynb
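The notebook above contains the full training and visualization code. As a rough, illustrative sketch (not the notebook's actual code), the embeddings for such a t-SNE plot could be taken from the encoder's [CLS] hidden state; here the base `beomi/KcELECTRA-base` encoder is loaded directly as a stand-in, since how the fine-tuned checkpoint exposes its encoder depends on the notebook's `Model` class:

```python
# Hedged sketch: embed comments via the [CLS] hidden state and project with t-SNE.
# The base KcELECTRA encoder is used here only as a stand-in for the fine-tuned model.
import torch
import matplotlib.pyplot as plt
from sklearn.manifold import TSNE
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained('beomi/KcELECTRA-base')
encoder = AutoModel.from_pretrained('beomi/KcELECTRA-base')

def embed(texts):
    enc = tokenizer(texts, padding=True, truncation=True, return_tensors='pt')
    with torch.no_grad():
        hidden = encoder(**enc).last_hidden_state   # (batch, seq_len, hidden)
    return hidden[:, 0, :].numpy()                  # [CLS] vector per comment

sentences = ['예시 댓글'] * 50                       # placeholder; use the BEEP! comments here
points = TSNE(n_components=2, random_state=42).fit_transform(embed(sentences))
plt.scatter(points[:, 0], points[:, 1], s=5)
plt.show()
```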
For inference, load the checkpoint and wrap tokenization plus softmax in a small helper (`Model`, `latest_ckpt`, and the `tokenizer` attribute come from the notebook linked above):

```python
import torch

# Load the fine-tuned checkpoint produced by the notebook above.
model = Model.load_from_checkpoint(latest_ckpt)

def infer(x):
    # Tokenize the input, run the classifier, and return softmax class probabilities.
    inputs = model.tokenizer(x, return_tensors='pt')
    return torch.softmax(model(**inputs).logits, dim=-1)
```
```python
# "Song Joong-ki period dramas can be trusted. The first episode was fresh and good."
infer('송중기 시대극은 믿고 본다. 첫회 신선하고 좋았다.')
# tensor([[0.7414, 0.2586]], grad_fn=<SoftmaxBackward>)

infer('์ ์ด ์์ฐ์ค๋ฌ์์ง ์ฐ๊ธฐ')
# tensor([[0.7627, 0.2373]], grad_fn=<SoftmaxBackward>)
```
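To turn these probabilities into hard labels, the 0.5 threshold mentioned above can be applied to the hate-class probability. Which of the two columns corresponds to the hate class depends on the label encoding used in the notebook, so the mapping in this sketch is an assumption:

```python
# Hedged sketch: apply the 0.5 threshold to get a hard label.
# The index-to-label mapping ({0: 'none', 1: 'hate'}) is assumed, not taken from the notebook.
def classify(x, threshold=0.5):
    probs = infer(x)[0]
    return 'hate' if probs[1] >= threshold else 'none'

classify('송중기 시대극은 믿고 본다. 첫회 신선하고 좋았다.')  # -> 'none' under this mapping
```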