JordanTallon committed · Commit 2a48c96
Parent(s): 5b3ddba

Push model using huggingface_hub.

Files changed:
- README.md +223 -91
- model.safetensors +1 -1
- model_head.pkl +1 -1
README.md
CHANGED
@@ -6,23 +6,27 @@ tags:
 - text-classification
 - generated_from_setfit_trainer
 metrics:
+- f1
+- precision
+- recall
 - accuracy
 widget:
+- text: Since the start of the coronavirus outbreak, Trump has hampered efforts to
+    slow the virus’s spread and encouraged Americans’ restlessness under quarantine.
+- text: ' It has to be particularly described what he is looking for said Asha Rangappa
+    who was a counter intelligence agent for the FBI and now a Yale Law School professor
+    A judge isn t going to sign off some sort of blanket warrant that tells Facebook
+    to turn over everything '
+- text: 'Now in response to these very serious crises it seems to me that we have
+    two choices First we can throw up our hands in despair We can say I am not going
+    to get involved '
+- text: Over the past week, activists, some of who are believed to be affiliated with
+    Black Lives Matter have rioted across the country following the death of George
+    Floyd in police custody, wreaking havoc and destruction against America’s towns,
+    cities, and local communities.
+- text: Working-class Americans, like those who make up the majority of South Bend
+    residents, have secured the largest wage hikes in the nation compared to all other
+    economic demographic groups — a direct result of Trump tightening the labor market.
 pipeline_tag: text-classification
 inference: true
 base_model: BAAI/bge-small-en-v1.5
@@ -37,8 +41,17 @@ model-index:
       type: unknown
       split: test
     metrics:
+    - type: f1
+      value: 0.6952861952861953
+      name: F1
+    - type: precision
+      value: 0.6952861952861953
+      name: Precision
+    - type: recall
+      value: 0.6952861952861953
+      name: Recall
     - type: accuracy
+      value: 0.6952861952861953
       name: Accuracy
 ---

@@ -70,18 +83,18 @@ The model has been trained using an efficient few-shot learning technique that i
 - **Blogpost:** [SetFit: Efficient Few-Shot Learning Without Prompts](https://huggingface.co/blog/setfit)

 ### Model Labels
+| Label | Examples |
+|:-------|:---------|
+| center | <ul><li>'A leading economist who vouched for Democratic presidential candidate Elizabeth Warren’s healthcare reform plan told Reuters on Thursday he doubts its staggering cost can be fully covered alongside her other government programs.'</li><li>'U.S. President Donald Trump is doing well and is very healthy, White House adviser Kellyanne Conway told Fox News on Thursday, after a U.S. military official who worked at the White House was found to have been infected with the novel coronavirus.'</li><li>'Alabama has the most restrictive abortion law in the U.S., banning abortion at any stage of pregnancy and for any reason, including in cases of rape and incest.'</li></ul> |
+| left | <ul><li>'Meet the shadowy accountants who do Trump’s taxes and help him seem richer than he is'</li><li>'When did vaccines become politicized? Amid a measles outbreak, suddenly Republicans support anti-vaxxers.'</li><li>'Last summer, the Republican White House announced plans to roll back the tougher standards, making it easier for the automotive industry to sell less efficient vehicles that pollute more.'</li></ul> |
+| right | <ul><li>'Joe Biden told Wall Street donors to his campaign that he planned to reverse most of President Donald Trump’s tax cuts.'</li><li>'For far too many on the left, chaos is the point. Destruction is the goal. They prefer the unknown madness that lies ahead to whatever is still managing to (barely) hold us together in the present.'</li><li>'Cuba’s health ministry initially vowed an investigation into Paloma Dominguez Caballero’s death; last week, state media published a report essentially absolving the government of any wrongdoing, categorically stating that nothing was wrong with the vaccine Dominguez received.'</li></ul> |

 ## Evaluation

 ### Metrics
+| Label   | F1     | Precision | Recall | Accuracy |
+|:--------|:-------|:----------|:-------|:---------|
+| **all** | 0.6953 | 0.6953    | 0.6953 | 0.6953   |
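
The four identical values above are consistent with micro-averaged scoring: in single-label multiclass classification, micro precision, micro recall, and micro F1 all reduce to plain accuracy. A minimal sketch of that identity with scikit-learn, using made-up label lists rather than this model's actual predictions:

```python
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

# Hypothetical gold and predicted labels, only to illustrate the identity.
y_true = ["left", "center", "right", "left", "center"]
y_pred = ["left", "right", "right", "left", "center"]

accuracy = accuracy_score(y_true, y_pred)
precision, recall, f1, _ = precision_recall_fscore_support(y_true, y_pred, average="micro")

# Under micro averaging all four values coincide (0.8 for this toy example).
print(accuracy, precision, recall, f1)
```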

 ## Uses

@@ -101,7 +114,7 @@ from setfit import SetFitModel
 # Download from the 🤗 Hub
 model = SetFitModel.from_pretrained("JordanTallon/Unifeed")
 # Run inference
+preds = model("Since the start of the coronavirus outbreak, Trump has hampered efforts to slow the virus’s spread and encouraged Americans’ restlessness under quarantine.")
 ```
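
Beyond the single call above, the same checkpoint can score a batch of texts. A small sketch, assuming the `predict`/`predict_proba` helpers that recent setfit releases expose, with invented example sentences:

```python
from setfit import SetFitModel

model = SetFitModel.from_pretrained("JordanTallon/Unifeed")

# Hypothetical inputs; any list of strings works.
texts = [
    "The committee approved the budget after a brief debate.",
    "The administration's reckless cuts have devastated working families.",
]

labels = model.predict(texts)       # one predicted label per input, e.g. ['center', 'left']
probs = model.predict_proba(texts)  # per-class scores from the trained classification head
print(labels)
print(probs)
```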

 <!--
@@ -133,17 +146,17 @@ preds = model("Though banks have fled many low-income communities, there’s a p
 ### Training Set Metrics
 | Training set | Min | Median  | Max |
 |:-------------|:----|:--------|:----|
+| Word count   | 6   | 33.1655 | 86  |

 | Label  | Training Sample Count |
 |:-------|:----------------------|
+| center | 802                   |
+| left   | 784                   |
+| right  | 788                   |

 ### Training Hyperparameters
+- batch_size: (32, 32)
+- num_epochs: (3, 3)
 - max_steps: -1
 - sampling_strategy: oversampling
 - num_iterations: 20
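
The paired values above give separate settings for the embedding fine-tuning phase and the classifier-head phase. As a rough sketch of how they might be reproduced with setfit's 1.x `TrainingArguments`/`Trainer` API; the tiny `train_dataset` below is an invented stand-in, since the actual training split is not part of this repository:

```python
from datasets import Dataset
from setfit import SetFitModel, Trainer, TrainingArguments

# Invented two-row dataset so the sketch runs end to end.
train_dataset = Dataset.from_dict({
    "text": ["example headline one", "example headline two"],
    "label": ["left", "right"],
})

args = TrainingArguments(
    batch_size=(32, 32),               # (embedding phase, classifier phase)
    num_epochs=(3, 3),
    max_steps=-1,                      # no hard cap on optimization steps
    sampling_strategy="oversampling",
    num_iterations=20,                 # contrastive pair generation
)

model = SetFitModel.from_pretrained("BAAI/bge-small-en-v1.5")
trainer = Trainer(model=model, args=args, train_dataset=train_dataset)
trainer.train()
```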
@@ -162,66 +175,185 @@ preds = model("Though banks have fled many low-income communities, there’s a p
 ### Training Results
 | Epoch  | Step | Training Loss | Validation Loss |
 |:------:|:----:|:-------------:|:---------------:|
+| 0.0003 | 1 | 0.2552 | - |
+| 0.0168 | 50 | 0.2613 | - |
+| 0.0337 | 100 | 0.2653 | - |
+| 0.0505 | 150 | 0.2574 | - |
+| 0.0674 | 200 | 0.2455 | - |
+| 0.0842 | 250 | 0.2583 | - |
+| 0.1011 | 300 | 0.2736 | - |
+| 0.1179 | 350 | 0.2341 | - |
+| 0.1348 | 400 | 0.2524 | - |
+| 0.1516 | 450 | 0.2429 | - |
+| 0.1685 | 500 | 0.2579 | - |
+| 0.1853 | 550 | 0.2363 | - |
+| 0.2022 | 600 | 0.2789 | - |
+| 0.2190 | 650 | 0.186 | - |
+| 0.2358 | 700 | 0.2425 | - |
+| 0.2527 | 750 | 0.1963 | - |
+| 0.2695 | 800 | 0.1858 | - |
+| 0.2864 | 850 | 0.1499 | - |
+| 0.3032 | 900 | 0.2219 | - |
+| 0.3201 | 950 | 0.1376 | - |
+| 0.3369 | 1000 | 0.1115 | - |
+| 0.3538 | 1050 | 0.1205 | - |
+| 0.3706 | 1100 | 0.1398 | - |
+| 0.3875 | 1150 | 0.1585 | - |
+| 0.4043 | 1200 | 0.1328 | - |
+| 0.4212 | 1250 | 0.0954 | - |
+| 0.4380 | 1300 | 0.0707 | - |
+| 0.4549 | 1350 | 0.2214 | - |
+| 0.4717 | 1400 | 0.1351 | - |
+| 0.4885 | 1450 | 0.1249 | - |
+| 0.5054 | 1500 | 0.1656 | - |
+| 0.5222 | 1550 | 0.1573 | - |
+| 0.5391 | 1600 | 0.1103 | - |
+| 0.5559 | 1650 | 0.0787 | - |
+| 0.5728 | 1700 | 0.126 | - |
+| 0.5896 | 1750 | 0.0876 | - |
+| 0.6065 | 1800 | 0.1687 | - |
+| 0.6233 | 1850 | 0.1319 | - |
+| 0.6402 | 1900 | 0.0815 | - |
+| 0.6570 | 1950 | 0.09 | - |
+| 0.6739 | 2000 | 0.0471 | - |
+| 0.6907 | 2050 | 0.1032 | - |
+| 0.7075 | 2100 | 0.0858 | - |
+| 0.7244 | 2150 | 0.0859 | - |
+| 0.7412 | 2200 | 0.0946 | - |
+| 0.7581 | 2250 | 0.0618 | - |
+| 0.7749 | 2300 | 0.0233 | - |
+| 0.7918 | 2350 | 0.0148 | - |
+| 0.8086 | 2400 | 0.0367 | - |
+| 0.8255 | 2450 | 0.0111 | - |
+| 0.8423 | 2500 | 0.0034 | - |
+| 0.8592 | 2550 | 0.0174 | - |
+| 0.8760 | 2600 | 0.0304 | - |
+| 0.8929 | 2650 | 0.0303 | - |
+| 0.9097 | 2700 | 0.0031 | - |
+| 0.9265 | 2750 | 0.0058 | - |
+| 0.9434 | 2800 | 0.0034 | - |
+| 0.9602 | 2850 | 0.0011 | - |
+| 0.9771 | 2900 | 0.0013 | - |
+| 0.9939 | 2950 | 0.0296 | - |
+| 1.0108 | 3000 | 0.0008 | - |
+| 1.0276 | 3050 | 0.0189 | - |
+| 1.0445 | 3100 | 0.0295 | - |
+| 1.0613 | 3150 | 0.0276 | - |
+| 1.0782 | 3200 | 0.0008 | - |
+| 1.0950 | 3250 | 0.0008 | - |
+| 1.1119 | 3300 | 0.0009 | - |
+| 1.1287 | 3350 | 0.0009 | - |
+| 1.1456 | 3400 | 0.0008 | - |
+| 1.1624 | 3450 | 0.0099 | - |
+| 1.1792 | 3500 | 0.0009 | - |
+| 1.1961 | 3550 | 0.0299 | - |
+| 1.2129 | 3600 | 0.0007 | - |
+| 1.2298 | 3650 | 0.001 | - |
+| 1.2466 | 3700 | 0.0009 | - |
+| 1.2635 | 3750 | 0.0008 | - |
+| 1.2803 | 3800 | 0.001 | - |
+| 1.2972 | 3850 | 0.0009 | - |
+| 1.3140 | 3900 | 0.0008 | - |
+| 1.3309 | 3950 | 0.0007 | - |
+| 1.3477 | 4000 | 0.0007 | - |
+| 1.3646 | 4050 | 0.03 | - |
+| 1.3814 | 4100 | 0.0008 | - |
+| 1.3982 | 4150 | 0.0012 | - |
+| 1.4151 | 4200 | 0.0292 | - |
+| 1.4319 | 4250 | 0.0006 | - |
+| 1.4488 | 4300 | 0.0007 | - |
+| 1.4656 | 4350 | 0.0006 | - |
+| 1.4825 | 4400 | 0.0007 | - |
+| 1.4993 | 4450 | 0.0008 | - |
+| 1.5162 | 4500 | 0.0008 | - |
+| 1.5330 | 4550 | 0.0015 | - |
+| 1.5499 | 4600 | 0.0032 | - |
+| 1.5667 | 4650 | 0.0015 | - |
+| 1.5836 | 4700 | 0.0006 | - |
+| 1.6004 | 4750 | 0.0006 | - |
+| 1.6173 | 4800 | 0.0021 | - |
+| 1.6341 | 4850 | 0.0013 | - |
+| 1.6509 | 4900 | 0.0006 | - |
+| 1.6678 | 4950 | 0.0006 | - |
+| 1.6846 | 5000 | 0.0013 | - |
+| 1.7015 | 5050 | 0.0006 | - |
+| 1.7183 | 5100 | 0.0007 | - |
+| 1.7352 | 5150 | 0.0005 | - |
+| 1.7520 | 5200 | 0.0005 | - |
+| 1.7689 | 5250 | 0.0006 | - |
+| 1.7857 | 5300 | 0.0005 | - |
+| 1.8026 | 5350 | 0.0005 | - |
+| 1.8194 | 5400 | 0.0005 | - |
+| 1.8363 | 5450 | 0.0004 | - |
+| 1.8531 | 5500 | 0.0066 | - |
+| 1.8699 | 5550 | 0.0005 | - |
+| 1.8868 | 5600 | 0.0006 | - |
+| 1.9036 | 5650 | 0.0005 | - |
+| 1.9205 | 5700 | 0.0005 | - |
+| 1.9373 | 5750 | 0.0014 | - |
+| 1.9542 | 5800 | 0.0006 | - |
+| 1.9710 | 5850 | 0.0004 | - |
+| 1.9879 | 5900 | 0.0006 | - |
+| 2.0047 | 5950 | 0.0005 | - |
+| 2.0216 | 6000 | 0.0006 | - |
+| 2.0384 | 6050 | 0.0005 | - |
+| 2.0553 | 6100 | 0.0004 | - |
+| 2.0721 | 6150 | 0.0012 | - |
+| 2.0889 | 6200 | 0.0004 | - |
+| 2.1058 | 6250 | 0.0005 | - |
+| 2.1226 | 6300 | 0.0004 | - |
+| 2.1395 | 6350 | 0.0005 | - |
+| 2.1563 | 6400 | 0.0005 | - |
+| 2.1732 | 6450 | 0.0005 | - |
+| 2.1900 | 6500 | 0.0004 | - |
+| 2.2069 | 6550 | 0.0004 | - |
+| 2.2237 | 6600 | 0.0005 | - |
+| 2.2406 | 6650 | 0.0004 | - |
+| 2.2574 | 6700 | 0.0005 | - |
+| 2.2743 | 6750 | 0.0004 | - |
+| 2.2911 | 6800 | 0.0005 | - |
+| 2.3080 | 6850 | 0.0007 | - |
+| 2.3248 | 6900 | 0.0004 | - |
+| 2.3416 | 6950 | 0.0018 | - |
+| 2.3585 | 7000 | 0.0004 | - |
+| 2.3753 | 7050 | 0.0004 | - |
+| 2.3922 | 7100 | 0.0004 | - |
+| 2.4090 | 7150 | 0.0004 | - |
+| 2.4259 | 7200 | 0.0004 | - |
+| 2.4427 | 7250 | 0.0005 | - |
+| 2.4596 | 7300 | 0.0004 | - |
+| 2.4764 | 7350 | 0.0005 | - |
+| 2.4933 | 7400 | 0.0012 | - |
+| 2.5101 | 7450 | 0.0026 | - |
+| 2.5270 | 7500 | 0.0004 | - |
+| 2.5438 | 7550 | 0.0003 | - |
+| 2.5606 | 7600 | 0.0004 | - |
+| 2.5775 | 7650 | 0.0004 | - |
+| 2.5943 | 7700 | 0.0004 | - |
+| 2.6112 | 7750 | 0.0004 | - |
+| 2.6280 | 7800 | 0.0004 | - |
+| 2.6449 | 7850 | 0.0004 | - |
+| 2.6617 | 7900 | 0.0004 | - |
+| 2.6786 | 7950 | 0.0003 | - |
+| 2.6954 | 8000 | 0.0004 | - |
+| 2.7123 | 8050 | 0.0004 | - |
+| 2.7291 | 8100 | 0.0004 | - |
+| 2.7460 | 8150 | 0.0004 | - |
+| 2.7628 | 8200 | 0.0004 | - |
+| 2.7796 | 8250 | 0.0004 | - |
+| 2.7965 | 8300 | 0.0005 | - |
+| 2.8133 | 8350 | 0.0004 | - |
+| 2.8302 | 8400 | 0.0004 | - |
+| 2.8470 | 8450 | 0.0004 | - |
+| 2.8639 | 8500 | 0.0004 | - |
+| 2.8807 | 8550 | 0.0004 | - |
+| 2.8976 | 8600 | 0.0004 | - |
+| 2.9144 | 8650 | 0.0004 | - |
+| 2.9313 | 8700 | 0.0004 | - |
+| 2.9481 | 8750 | 0.0004 | - |
+| 2.9650 | 8800 | 0.0004 | - |
+| 2.9818 | 8850 | 0.0004 | - |
+| 2.9987 | 8900 | 0.0003 | - |

 ### Framework Versions
 - Python: 3.10.12
model.safetensors
CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
+oid sha256:34bc18b710d5938ff734c02b30b212c5f3972732c0a9ef6f687fead5bbc7db99
 size 133462128
model_head.pkl
CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
+oid sha256:3237c01d5595ea25eac4e75aeeb2f5a07cea99b9d189cd60cdcf10d6a28b6430
 size 10143
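
Both weight files are stored through Git LFS, so only their pointer files change in this commit. A minimal sketch for checking that locally downloaded copies match the updated pointers (the paths are hypothetical; the expected digests and sizes come from the pointers above):

```python
import hashlib

# Expected values taken from the updated LFS pointers in this commit.
EXPECTED = {
    "model.safetensors": ("34bc18b710d5938ff734c02b30b212c5f3972732c0a9ef6f687fead5bbc7db99", 133462128),
    "model_head.pkl": ("3237c01d5595ea25eac4e75aeeb2f5a07cea99b9d189cd60cdcf10d6a28b6430", 10143),
}

for path, (expected_digest, expected_size) in EXPECTED.items():
    with open(path, "rb") as fh:
        data = fh.read()
    digest = hashlib.sha256(data).hexdigest()
    ok = digest == expected_digest and len(data) == expected_size
    print(f"{path}: {'OK' if ok else 'MISMATCH (' + digest + ')'}")
```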