Jit Bahadur Khamcha committed on
Commit
0bf21b8
1 Parent(s): 1068021

add code and streamlit code

app.py CHANGED
@@ -1,4 +1,28 @@
 import streamlit as st
-
-x = st.slider('Select a value')
-st.write(x, 'squared is', x * x)
+import torch
+from transformers import AutoModelForQuestionAnswering, AutoTokenizer, pipeline
+
+# Load the question answering pipeline
+question_answerer = pipeline("question-answering", model="finetuning_squad/checkpoint-16000")
+
+# Streamlit app
+st.title("Question Answering App")
+
+# Text box for context
+context = st.text_area("Enter Context", "")
+
+# Text box for question
+question = st.text_input("Enter Question", "")
+
+# Button to find the answer
+if st.button("Find Answer"):
+    if context and question:
+        # Perform question answering
+        answer = question_answerer(context=context, question=question)
+
+        # Display the answer
+        st.subheader("Answer:")
+        st.write(answer)
+    else:
+        st.warning("Please enter both context and question.")
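
Note that `st.write(answer)` dumps the raw pipeline output onto the page. A transformers question-answering pipeline returns a dict with `score`, `start`, `end`, and `answer` keys, so the result can be unpacked for a cleaner display; a minimal sketch, reusing the checkpoint path committed above:

# Sketch: unpack the QA pipeline result instead of dumping the raw dict.
# Reuses the fine-tuned checkpoint path from app.py above.
from transformers import pipeline

qa = pipeline("question-answering", model="finetuning_squad/checkpoint-16000")
result = qa(context="The Grotto is a Marian place of prayer and reflection.",
            question="What is the Grotto?")
# result -> {'score': ..., 'start': ..., 'end': ..., 'answer': ...}
print(f"{result['answer']} (confidence {result['score']:.2f})")

The app itself is launched with `streamlit run app.py`.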
finetune.ipynb ADDED
@@ -0,0 +1,2720 @@
# importing libraries

import pandas as pd
from datasets import load_dataset

import torch
from transformers import AutoTokenizer, Trainer, TrainingArguments, DefaultDataCollator, AutoModelForQuestionAnswering
# download and load the dataset, which comes pre-split into train and validation sets
squad = load_dataset("squad")
squad

DatasetDict({
    train: Dataset({
        features: ['id', 'title', 'context', 'question', 'answers'],
        num_rows: 87599
    })
    validation: Dataset({
        features: ['id', 'title', 'context', 'question', 'answers'],
        num_rows: 10570
    })
})
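
The notebook's original comment mentions limiting the data to 5,000 examples, though no such cap is actually applied and the full splits are used. If a cap were wanted, a minimal sketch using the datasets API:

# Sketch: optional cap at 5,000 training examples (not applied in this notebook).
small_train = squad["train"].shuffle(seed=42).select(range(5000))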
# Print the top 5 rows
for i in range(5):
    print(squad["train"][i])

{'id': '5733be284776f41900661182', 'title': 'University_of_Notre_Dame', 'context': 'Architecturally, the school has a Catholic character. Atop the Main Building\'s gold dome is a golden statue of the Virgin Mary. Immediately in front of the Main Building and facing it, is a copper statue of Christ with arms upraised with the legend "Venite Ad Me Omnes". Next to the Main Building is the Basilica of the Sacred Heart. Immediately behind the basilica is the Grotto, a Marian place of prayer and reflection. It is a replica of the grotto at Lourdes, France where the Virgin Mary reputedly appeared to Saint Bernadette Soubirous in 1858. At the end of the main drive (and in a direct line that connects through 3 statues and the Gold Dome), is a simple, modern stone statue of Mary.', 'question': 'To whom did the Virgin Mary allegedly appear in 1858 in Lourdes France?', 'answers': {'text': ['Saint Bernadette Soubirous'], 'answer_start': [515]}}
{'id': '5733be284776f4190066117f', 'title': 'University_of_Notre_Dame', 'context': 'Architecturally, the school has a Catholic character. Atop the Main Building\'s gold dome is a golden statue of the Virgin Mary. Immediately in front of the Main Building and facing it, is a copper statue of Christ with arms upraised with the legend "Venite Ad Me Omnes". Next to the Main Building is the Basilica of the Sacred Heart. Immediately behind the basilica is the Grotto, a Marian place of prayer and reflection. It is a replica of the grotto at Lourdes, France where the Virgin Mary reputedly appeared to Saint Bernadette Soubirous in 1858. At the end of the main drive (and in a direct line that connects through 3 statues and the Gold Dome), is a simple, modern stone statue of Mary.', 'question': 'What is in front of the Notre Dame Main Building?', 'answers': {'text': ['a copper statue of Christ'], 'answer_start': [188]}}
{'id': '5733be284776f41900661180', 'title': 'University_of_Notre_Dame', 'context': 'Architecturally, the school has a Catholic character. Atop the Main Building\'s gold dome is a golden statue of the Virgin Mary. Immediately in front of the Main Building and facing it, is a copper statue of Christ with arms upraised with the legend "Venite Ad Me Omnes". Next to the Main Building is the Basilica of the Sacred Heart. Immediately behind the basilica is the Grotto, a Marian place of prayer and reflection. It is a replica of the grotto at Lourdes, France where the Virgin Mary reputedly appeared to Saint Bernadette Soubirous in 1858. At the end of the main drive (and in a direct line that connects through 3 statues and the Gold Dome), is a simple, modern stone statue of Mary.', 'question': 'The Basilica of the Sacred heart at Notre Dame is beside to which structure?', 'answers': {'text': ['the Main Building'], 'answer_start': [279]}}
{'id': '5733be284776f41900661181', 'title': 'University_of_Notre_Dame', 'context': 'Architecturally, the school has a Catholic character. Atop the Main Building\'s gold dome is a golden statue of the Virgin Mary. Immediately in front of the Main Building and facing it, is a copper statue of Christ with arms upraised with the legend "Venite Ad Me Omnes". Next to the Main Building is the Basilica of the Sacred Heart. Immediately behind the basilica is the Grotto, a Marian place of prayer and reflection. It is a replica of the grotto at Lourdes, France where the Virgin Mary reputedly appeared to Saint Bernadette Soubirous in 1858. At the end of the main drive (and in a direct line that connects through 3 statues and the Gold Dome), is a simple, modern stone statue of Mary.', 'question': 'What is the Grotto at Notre Dame?', 'answers': {'text': ['a Marian place of prayer and reflection'], 'answer_start': [381]}}
{'id': '5733be284776f4190066117e', 'title': 'University_of_Notre_Dame', 'context': 'Architecturally, the school has a Catholic character. Atop the Main Building\'s gold dome is a golden statue of the Virgin Mary. Immediately in front of the Main Building and facing it, is a copper statue of Christ with arms upraised with the legend "Venite Ad Me Omnes". Next to the Main Building is the Basilica of the Sacred Heart. Immediately behind the basilica is the Grotto, a Marian place of prayer and reflection. It is a replica of the grotto at Lourdes, France where the Virgin Mary reputedly appeared to Saint Bernadette Soubirous in 1858. At the end of the main drive (and in a direct line that connects through 3 statues and the Gold Dome), is a simple, modern stone statue of Mary.', 'question': 'What sits on top of the Main Building at Notre Dame?', 'answers': {'text': ['a golden statue of the Virgin Mary'], 'answer_start': [92]}}
### Several important fields
- answers: the starting location of the answer token and the answer text.
- context: background information from which the model needs to extract the answer.
- question: the question a model should answer.
tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
## Preprocessing Steps for Question Answering Tasks

### Dealing with Long Contexts

Some examples in a dataset may have a very long context that exceeds the maximum input length of the model. To address this:

- Truncate only the context by setting `truncation="only_second"`.

### Mapping Answer Positions to Original Context

To map the start and end positions of the answer to the original context:

- Set `return_offsets_mapping=True` during tokenization.

### Finding Start and End Tokens of the Answer

With the offset mapping, use the following steps to find the start and end tokens of the answer (illustrated in the sketch after this section):

- Use the `sequence_ids` method to determine which part of the offset corresponds to the question and which corresponds to the context.
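
A small illustration of the two mechanisms described above, on a toy question/context pair (the strings here are hypothetical; the tokenizer is the one loaded earlier): the offsets are character spans into the original strings, and `sequence_ids` separates question tokens from context tokens.

# Sketch: inspect offsets and sequence ids on a toy example.
enc = tokenizer("Who wrote it?", "Jane wrote the book.",
                truncation="only_second", return_offsets_mapping=True)
print(enc.sequence_ids(0))    # None for special tokens, 0 = question, 1 = context
print(enc["offset_mapping"])  # (start_char, end_char) pairs into each original string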
def preprocess_function(examples):
    questions = [q.strip() for q in examples["question"]]
    inputs = tokenizer(
        questions,
        examples["context"],
        max_length=384,
        truncation="only_second",
        return_offsets_mapping=True,
        padding="max_length",
    )

    offset_mapping = inputs.pop("offset_mapping")
    answers = examples["answers"]
    start_positions = []
    end_positions = []

    for i, offset in enumerate(offset_mapping):
        answer = answers[i]
        start_char = answer["answer_start"][0]
        end_char = answer["answer_start"][0] + len(answer["text"][0])
        sequence_ids = inputs.sequence_ids(i)

        # Find the start and end of the context
        idx = 0
        while sequence_ids[idx] != 1:
            idx += 1
        context_start = idx
        while sequence_ids[idx] == 1:
            idx += 1
        context_end = idx - 1

        # If the answer is not fully inside the context, label it (0, 0)
        if offset[context_start][0] > end_char or offset[context_end][1] < start_char:
            start_positions.append(0)
            end_positions.append(0)
        else:
            # Otherwise it's the start and end token positions
            idx = context_start
            while idx <= context_end and offset[idx][0] <= start_char:
                idx += 1
            start_positions.append(idx - 1)

            idx = context_end
            while idx >= context_start and offset[idx][1] >= end_char:
                idx -= 1
            end_positions.append(idx + 1)

    inputs["start_positions"] = start_positions
    inputs["end_positions"] = end_positions
    return inputs
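
Before mapping over the full dataset, the function can be sanity-checked on a tiny batch; `squad["train"][:2]` already has the batched dict-of-lists shape that `map(..., batched=True)` passes in. A minimal sketch:

# Sketch: run the preprocessing on two examples and inspect the new label fields.
sample = preprocess_function(squad["train"][:2])
print(list(sample.keys()))  # includes start_positions and end_positions
print(sample["start_positions"], sample["end_positions"])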
# applying preprocessing function to entire dataset
tokenized_datasets = squad.map(preprocess_function, batched=True)

Map: 100%|██████████| 87599/87599 [00:18<00:00, 4649.42 examples/s]
Map: 100%|██████████| 10570/10570 [00:02<00:00, 4331.36 examples/s]
tokenized_datasets['train'][0]

{'id': '5733be284776f41900661182',
 'title': 'University_of_Notre_Dame',
 'context': 'Architecturally, the school has a Catholic character. ... a simple, modern stone statue of Mary.',
 'question': 'To whom did the Virgin Mary allegedly appear in 1858 in Lourdes France?',
 'answers': {'text': ['Saint Bernadette Soubirous'], 'answer_start': [515]},
 'input_ids': [101, 2000, 3183, 2106, 1996, 6261, 2984, 9382, 3711, 1999, 8517, 1999, 10223, 26371, 2605, 1029, 102, ...],
 'attention_mask': [1, 1, 1, ...],
 'start_positions': 130,
 'end_positions': 137}

(context is shown in full in the cell above; input_ids and attention_mask each contain 384 entries, zero-padded to max_length, and are abridged here.)
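
The label positions can be checked by decoding the token span they delimit; for this example it should recover the answer text, lowercased since the tokenizer is uncased. A minimal sketch:

# Sketch: decode the labeled span back to text.
ex = tokenized_datasets["train"][0]
span = ex["input_ids"][ex["start_positions"] : ex["end_positions"] + 1]
print(tokenizer.decode(span))  # expected: saint bernadette soubirous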
# creating a batch of examples using DefaultDataCollator
data_collator = DefaultDataCollator()
# loading pre-trained model
model = AutoModelForQuestionAnswering.from_pretrained("distilbert-base-uncased")

Some weights of DistilBertForQuestionAnswering were not initialized from the model checkpoint at distilbert-base-uncased and are newly initialized: ['qa_outputs.weight', 'qa_outputs.bias']
You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.
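
The warning above refers to the randomly initialized span-prediction head that AutoModelForQuestionAnswering adds on top of the base encoder; it can be inspected directly:

# Sketch: the newly initialized QA head named in the warning above.
print(model.qa_outputs)  # e.g. Linear(in_features=768, out_features=2, bias=True)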
# fine-tuning the distilbert-base-uncased model

training_args = TrainingArguments(
    output_dir="finetuning_squad",
    evaluation_strategy="epoch",
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    num_train_epochs=3,
    weight_decay=0.01,
)

trainer = Trainer(
    model=model,
    args=training_args,
    train_dataset=tokenized_datasets['train'],
    eval_dataset=tokenized_datasets['validation'],
    tokenizer=tokenizer,
    data_collator=data_collator,
)
1088
+ {
1089
+ "cell_type": "code",
1090
+ "execution_count": 17,
1091
+ "metadata": {},
1092
+ "outputs": [
1093
+ {
1094
+ "name": "stderr",
1095
+ "output_type": "stream",
1096
+ "text": [
1097
+ " 0%| | 0/4107 [00:53<?, ?it/s]\n",
1098
+ " \n",
1099
+ " 3%|▎ | 500/16425 [02:27<1:18:36, 3.38it/s]"
1100
+ ]
1101
+ },
1102
+ {
1103
+ "name": "stdout",
1104
+ "output_type": "stream",
1105
+ "text": [
1106
+ "{'loss': 2.9417, 'learning_rate': 1.9391171993911722e-05, 'epoch': 0.09}\n"
1107
+ ]
1108
+ },
1109
+ {
1110
+ "name": "stderr",
1111
+ "output_type": "stream",
1112
+ "text": [
1113
+ " \n",
1114
+ " 6%|▌ | 1000/16425 [04:54<1:13:37, 3.49it/s]"
1115
+ ]
1116
+ },
1117
+ {
1118
+ "name": "stdout",
1119
+ "output_type": "stream",
1120
+ "text": [
1121
+ "{'loss': 1.7699, 'learning_rate': 1.8782343987823442e-05, 'epoch': 0.18}\n"
1122
+ ]
1123
+ },
1124
+ {
1125
+ "name": "stderr",
1126
+ "output_type": "stream",
1127
+ "text": [
1128
+ " \n",
1129
+ " 9%|▉ | 1500/16425 [07:18<1:10:45, 3.52it/s]"
1130
+ ]
1131
+ },
1132
+ {
1133
+ "name": "stdout",
1134
+ "output_type": "stream",
1135
+ "text": [
1136
+ "{'loss': 1.5303, 'learning_rate': 1.8173515981735163e-05, 'epoch': 0.27}\n"
1137
+ ]
1138
+ },
1139
+ {
1140
+ "name": "stderr",
1141
+ "output_type": "stream",
1142
+ "text": [
1143
+ " \n",
1144
+ " 12%|█▏ | 2000/16425 [09:42<1:08:20, 3.52it/s]"
1145
+ ]
1146
+ },
1147
+ {
1148
+ "name": "stdout",
1149
+ "output_type": "stream",
1150
+ "text": [
1151
+ "{'loss': 1.46, 'learning_rate': 1.756468797564688e-05, 'epoch': 0.37}\n"
1152
+ ]
1153
+ },
1154
+ {
1155
+ "name": "stderr",
1156
+ "output_type": "stream",
1157
+ "text": [
1158
+ " \n",
1159
+ " 15%|█▌ | 2500/16425 [12:06<1:05:58, 3.52it/s]"
1160
+ ]
1161
+ },
1162
+ {
1163
+ "name": "stdout",
1164
+ "output_type": "stream",
1165
+ "text": [
1166
+ "{'loss': 1.393, 'learning_rate': 1.69558599695586e-05, 'epoch': 0.46}\n"
1167
+ ]
1168
+ },
1169
+ {
1170
+ "name": "stderr",
1171
+ "output_type": "stream",
1172
+ "text": [
1173
+ " \n",
1174
+ " 18%|█▊ | 3000/16425 [14:27<1:00:22, 3.71it/s]"
1175
+ ]
1176
+ },
1177
+ {
1178
+ "name": "stdout",
1179
+ "output_type": "stream",
1180
+ "text": [
1181
+ "{'loss': 1.3692, 'learning_rate': 1.634703196347032e-05, 'epoch': 0.55}\n"
1182
+ ]
1183
+ },
1184
+ {
1185
+ "name": "stderr",
1186
+ "output_type": "stream",
1187
+ "text": [
1188
+ " \n",
1189
+ " 21%|██▏ | 3500/16425 [16:43<58:09, 3.70it/s]"
1190
+ ]
1191
+ },
1192
+ {
1193
+ "name": "stdout",
1194
+ "output_type": "stream",
1195
+ "text": [
1196
+ "{'loss': 1.3134, 'learning_rate': 1.573820395738204e-05, 'epoch': 0.64}\n"
1197
+ ]
1198
+ },
1199
+ {
1200
+ "name": "stderr",
1201
+ "output_type": "stream",
1202
+ "text": [
1203
+ " \n",
1204
+ " 24%|██▍ | 4000/16425 [18:59<55:51, 3.71it/s]"
1205
+ ]
1206
+ },
1207
+ {
1208
+ "name": "stdout",
1209
+ "output_type": "stream",
1210
+ "text": [
1211
+ "{'loss': 1.2416, 'learning_rate': 1.5129375951293761e-05, 'epoch': 0.73}\n"
1212
+ ]
1213
+ },
1214
+ {
1215
+ "name": "stderr",
1216
+ "output_type": "stream",
1217
+ "text": [
1218
+ " \n",
1219
+ " 27%|██▋ | 4500/16425 [21:15<53:34, 3.71it/s]"
1220
+ ]
1221
+ },
1222
+ {
1223
+ "name": "stdout",
1224
+ "output_type": "stream",
1225
+ "text": [
1226
+ "{'loss': 1.2574, 'learning_rate': 1.4520547945205482e-05, 'epoch': 0.82}\n"
1227
+ ]
1228
+ },
1229
+ {
1230
+ "name": "stderr",
1231
+ "output_type": "stream",
1232
+ "text": [
1233
+ " \n",
1234
+ " 30%|███ | 5000/16425 [23:31<51:17, 3.71it/s]"
1235
+ ]
1236
+ },
1237
+ {
1238
+ "name": "stdout",
1239
+ "output_type": "stream",
1240
+ "text": [
1241
+ "{'loss': 1.2039, 'learning_rate': 1.39117199391172e-05, 'epoch': 0.91}\n"
1242
+ ]
1243
+ },
1244
+ {
1245
+ "name": "stderr",
1246
+ "output_type": "stream",
1247
+ "text": [
1248
+ " 33%|███▎ | 5475/16425 [25:40<48:05, 3.80it/s] \n",
1249
+ "\u001b[A\n",
1250
+ "\u001b[A\n",
1251
+ "\u001b[A\n",
1252
+ "\u001b[A\n",
1253
+ "\u001b[A\n",
1254
+ "\u001b[A\n",
1255
+ "\u001b[A\n",
1256
+ "\u001b[A\n",
1257
+ "\u001b[A\n",
1258
+ "\u001b[A\n",
1259
+ "\u001b[A\n",
1260
+ "\u001b[A\n",
1261
+ "\u001b[A\n",
1262
+ "\u001b[A\n",
1263
+ "\u001b[A\n",
1264
+ "\u001b[A\n",
1265
+ "\u001b[A\n",
1266
+ "\u001b[A\n",
1267
+ "\u001b[A\n",
1268
+ "\u001b[A\n",
1269
+ "\u001b[A\n",
1270
+ "\u001b[A\n",
1271
+ "\u001b[A\n",
1272
+ "\u001b[A\n",
1273
+ "\u001b[A\n",
1274
+ "\u001b[A\n",
1275
+ "\u001b[A\n",
1276
+ "\u001b[A\n",
1277
+ "\u001b[A\n",
1278
+ "\u001b[A\n",
1279
+ "\u001b[A\n",
1280
+ "\u001b[A\n",
1281
+ "\u001b[A\n",
1282
+ "\u001b[A\n",
1283
+ "\u001b[A\n",
1284
+ "\u001b[A\n",
1285
+ "\u001b[A\n",
1286
+ "\u001b[A\n",
1287
+ "\u001b[A\n",
1288
+ "\u001b[A\n",
1289
+ "\u001b[A\n",
1290
+ "\u001b[A\n",
1291
+ "\u001b[A\n",
1292
+ "\u001b[A\n",
1293
+ "\u001b[A\n",
1294
+ "\u001b[A\n",
1295
+ "\u001b[A\n",
1296
+ "\u001b[A\n",
1297
+ "\u001b[A\n",
1298
+ "\u001b[A\n",
1299
+ "\u001b[A\n",
1300
+ "\u001b[A\n",
1301
+ "\u001b[A\n",
1302
+ "\u001b[A\n",
1303
+ "\u001b[A\n",
1304
+ "\u001b[A\n",
1305
+ "\u001b[A\n",
1306
+ "\u001b[A\n",
1307
+ "\u001b[A\n",
1308
+ "\u001b[A\n",
1309
+ "\u001b[A\n",
1310
+ "\u001b[A\n",
1311
+ "\u001b[A\n",
1312
+ "\u001b[A\n",
1313
+ "\u001b[A\n",
1314
+ "\u001b[A\n",
1315
+ "\u001b[A\n",
1316
+ "\u001b[A\n",
1317
+ "\u001b[A\n",
1318
+ "\u001b[A\n",
1319
+ "\u001b[A\n",
1320
+ "\u001b[A\n",
1321
+ "\u001b[A\n",
1322
+ "\u001b[A\n",
1323
+ "\u001b[A\n",
1324
+ "\u001b[A\n",
1325
+ "\u001b[A\n",
1326
+ "\u001b[A\n",
1327
+ "\u001b[A\n",
1328
+ "\u001b[A\n",
1329
+ "\u001b[A\n",
1330
+ "\u001b[A\n",
1331
+ "\u001b[A\n",
1332
+ "\u001b[A\n",
1333
+ "\u001b[A\n",
1334
+ "\u001b[A\n",
1335
+ "\u001b[A\n",
1336
+ "\u001b[A\n",
1337
+ "\u001b[A\n",
1338
+ "\u001b[A\n",
1339
+ "\u001b[A\n",
1340
+ "\u001b[A\n",
1341
+ "\u001b[A\n",
1342
+ "\u001b[A\n",
1343
+ "\u001b[A\n",
1344
+ "\u001b[A\n",
1345
+ "\u001b[A\n",
1346
+ "\u001b[A\n",
1347
+ "\u001b[A\n",
1348
+ "\u001b[A\n",
1349
+ "\u001b[A\n",
1350
+ "\u001b[A\n",
1351
+ "\u001b[A\n",
1352
+ "\u001b[A\n",
1353
+ "\u001b[A\n",
1354
+ "\u001b[A\n",
1355
+ "\u001b[A\n",
1356
+ "\u001b[A\n",
1357
+ "\u001b[A\n",
1358
+ "\u001b[A\n",
1359
+ "\u001b[A\n",
1360
+ "\u001b[A\n",
1361
+ "\u001b[A\n",
1362
+ "\u001b[A\n",
1363
+ "\u001b[A\n",
1364
+ "\u001b[A\n",
1365
+ "\u001b[A\n",
1366
+ "\u001b[A\n",
1367
+ "\u001b[A\n",
1368
+ "\u001b[A\n",
1369
+ "\u001b[A\n",
1370
+ "\u001b[A\n",
1371
+ "\u001b[A\n",
1372
+ "\u001b[A\n",
1373
+ "\u001b[A\n",
1374
+ "\u001b[A\n",
1375
+ "\u001b[A\n",
1376
+ "\u001b[A\n",
1377
+ "\u001b[A\n",
1378
+ "\u001b[A\n",
1379
+ "\u001b[A\n",
1380
+ "\u001b[A\n",
1381
+ "\u001b[A\n",
1382
+ "\u001b[A\n",
1383
+ "\u001b[A\n",
1384
+ "\u001b[A\n",
1385
+ "\u001b[A\n",
1386
+ "\u001b[A\n",
1387
+ "\u001b[A\n",
1388
+ "\u001b[A\n",
1389
+ "\u001b[A\n",
1390
+ "\u001b[A\n",
1391
+ "\u001b[A\n",
1392
+ "\u001b[A\n",
1393
+ "\u001b[A\n",
1394
+ "\u001b[A\n",
1395
+ "\u001b[A\n",
1396
+ "\u001b[A\n",
1397
+ "\u001b[A\n",
1398
+ "\u001b[A\n",
1399
+ "\u001b[A\n",
1400
+ "\u001b[A\n",
1401
+ "\u001b[A\n",
1402
+ "\u001b[A\n",
1403
+ "\u001b[A\n",
1404
+ "\u001b[A\n",
1405
+ "\u001b[A\n",
1406
+ "\u001b[A\n",
1407
+ "\u001b[A\n",
1408
+ "\u001b[A\n",
1409
+ "\u001b[A\n",
1410
+ "\u001b[A\n",
1411
+ "\u001b[A\n",
1412
+ "\u001b[A\n",
1413
+ "\u001b[A\n",
1414
+ "\u001b[A\n",
1415
+ "\u001b[A\n",
1416
+ "\u001b[A\n",
1417
+ "\u001b[A\n",
1418
+ "\u001b[A\n",
1419
+ "\u001b[A\n",
1420
+ "\u001b[A\n",
1421
+ "\u001b[A\n",
1422
+ "\u001b[A\n",
1423
+ "\u001b[A\n",
1424
+ "\u001b[A\n",
1425
+ "\u001b[A\n",
1426
+ "\u001b[A\n",
1427
+ "\u001b[A\n",
1428
+ "\u001b[A\n",
1429
+ "\u001b[A\n",
1430
+ "\u001b[A\n",
1431
+ "\u001b[A\n",
1432
+ "\u001b[A\n",
1433
+ "\u001b[A\n",
1434
+ "\u001b[A\n",
1435
+ "\u001b[A\n",
1436
+ "\u001b[A\n",
1437
+ "\u001b[A\n",
1438
+ "\u001b[A\n",
1439
+ "\u001b[A\n",
1440
+ "\u001b[A\n",
1441
+ "\u001b[A\n",
1442
+ "\u001b[A\n",
1443
+ "\u001b[A\n",
1444
+ "\u001b[A\n",
1445
+ "\u001b[A\n",
1446
+ "\u001b[A\n",
1447
+ "\u001b[A\n",
1448
+ "\u001b[A\n",
1449
+ "\u001b[A\n",
1450
+ "\u001b[A\n",
1451
+ "\u001b[A\n",
1452
+ "\u001b[A\n",
1453
+ "\u001b[A\n",
1454
+ "\u001b[A\n",
1455
+ "\u001b[A\n",
1456
+ "\u001b[A\n",
1457
+ "\u001b[A\n",
1458
+ "\u001b[A\n",
1459
+ "\u001b[A\n",
1460
+ "\u001b[A\n",
1461
+ "\u001b[A\n",
1462
+ "\u001b[A\n",
1463
+ "\u001b[A\n",
1464
+ "\u001b[A\n",
1465
+ "\u001b[A\n",
1466
+ "\u001b[A\n",
1467
+ "\u001b[A\n",
1468
+ "\u001b[A\n",
1469
+ "\u001b[A\n",
1470
+ "\u001b[A\n",
1471
+ "\u001b[A\n",
1472
+ "\u001b[A\n",
1473
+ "\u001b[A\n",
1474
+ "\u001b[A\n",
1475
+ "\u001b[A\n",
1476
+ "\u001b[A\n",
1477
+ "\u001b[A\n",
1478
+ "\u001b[A\n",
1479
+ "\u001b[A\n",
1480
+ "\u001b[A\n",
1481
+ "\u001b[A\n",
1482
+ "\u001b[A\n",
1483
+ "\u001b[A\n",
1484
+ "\u001b[A\n",
1485
+ "\u001b[A\n",
1486
+ "\u001b[A\n",
1487
+ "\u001b[A\n",
1488
+ "\u001b[A\n",
1489
+ "\u001b[A\n",
1490
+ "\u001b[A\n",
1491
+ "\u001b[A\n",
1492
+ "\u001b[A\n",
1493
+ "\u001b[A\n",
1494
+ "\u001b[A\n",
1495
+ "\u001b[A\n",
1496
+ "\u001b[A\n",
1497
+ "\u001b[A\n",
1498
+ "\u001b[A\n",
1499
+ "\u001b[A\n",
1500
+ "\u001b[A\n",
1501
+ "\u001b[A\n",
1502
+ "\u001b[A\n",
1503
+ "\u001b[A\n",
1504
+ "\u001b[A\n",
1505
+ "\u001b[A\n",
1506
+ "\u001b[A\n",
1507
+ "\u001b[A\n",
1508
+ "\u001b[A\n",
1509
+ "\u001b[A\n",
1510
+ "\u001b[A\n",
1511
+ "\u001b[A\n",
1512
+ "\u001b[A\n",
1513
+ "\u001b[A\n",
1514
+ "\u001b[A\n",
1515
+ "\u001b[A\n",
1516
+ "\u001b[A\n",
1517
+ "\u001b[A\n",
1518
+ "\u001b[A\n",
1519
+ "\u001b[A\n",
1520
+ "\u001b[A\n",
1521
+ "\u001b[A\n",
1522
+ "\u001b[A\n",
1523
+ "\u001b[A\n",
1524
+ "\u001b[A\n",
1525
+ "\u001b[A\n",
1526
+ "\u001b[A\n",
1527
+ "\u001b[A\n",
1528
+ "\u001b[A\n",
1529
+ "\u001b[A\n",
1530
+ "\u001b[A\n",
1531
+ "\u001b[A\n",
1532
+ "\u001b[A\n",
1533
+ "\u001b[A\n",
1534
+ "\u001b[A\n",
1535
+ "\u001b[A\n",
1536
+ "\u001b[A\n",
1537
+ "\u001b[A\n",
1538
+ "\u001b[A\n",
1539
+ "\u001b[A\n",
1540
+ "\u001b[A\n",
1541
+ "\u001b[A\n",
1542
+ "\u001b[A\n",
1543
+ "\u001b[A\n",
1544
+ "\u001b[A\n",
1545
+ "\u001b[A\n",
1546
+ "\u001b[A\n",
1547
+ "\u001b[A\n",
1548
+ "\u001b[A\n",
1549
+ "\u001b[A\n",
1550
+ "\u001b[A\n",
1551
+ "\u001b[A\n",
1552
+ "\u001b[A\n",
1553
+ "\u001b[A\n",
1554
+ "\u001b[A\n",
1555
+ "\u001b[A\n",
1556
+ "\u001b[A\n",
1557
+ "\u001b[A\n",
1558
+ "\u001b[A\n",
1559
+ "\u001b[A\n",
1560
+ "\u001b[A\n",
1561
+ "\u001b[A\n",
1562
+ "\u001b[A\n",
1563
+ "\u001b[A\n",
1564
+ "\u001b[A\n",
1565
+ "\u001b[A\n",
1566
+ "\u001b[A\n",
1567
+ "\u001b[A\n",
1568
+ "\u001b[A\n",
1569
+ "\u001b[A\n",
1570
+ "\u001b[A\n",
1571
+ "\u001b[A\n",
1572
+ "\u001b[A\n",
1573
+ "\u001b[A\n",
1574
+ "\u001b[A\n",
1575
+ "\u001b[A\n",
1576
+ "\u001b[A\n",
1577
+ "\u001b[A\n",
1578
+ "\u001b[A\n",
1579
+ " \n",
1580
+ "\u001b[A \n",
1581
+ "\n",
1582
+ " 33%|███▎ | 5475/16425 [26:39<48:05, 3.80it/s]\n",
1583
+ "\u001b[A\n",
1584
+ "\u001b[A"
1585
+ ]
1586
+ },
1587
+ {
1588
+ "name": "stdout",
1589
+ "output_type": "stream",
1590
+ "text": [
1591
+ "{'eval_loss': 1.1474684476852417, 'eval_runtime': 59.0937, 'eval_samples_per_second': 178.868, 'eval_steps_per_second': 11.186, 'epoch': 1.0}\n"
1592
+ ]
1593
+ },
1594
+ {
1595
+ "name": "stderr",
1596
+ "output_type": "stream",
1597
+ "text": [
1598
+ " \n",
1599
+ " 33%|███▎ | 5500/16425 [26:46<49:41, 3.66it/s]"
1600
+ ]
1601
+ },
1602
+ {
1603
+ "name": "stdout",
1604
+ "output_type": "stream",
1605
+ "text": [
1606
+ "{'loss': 1.1717, 'learning_rate': 1.330289193302892e-05, 'epoch': 1.0}\n"
1607
+ ]
1608
+ },
1609
+ {
1610
+ "name": "stderr",
1611
+ "output_type": "stream",
1612
+ "text": [
1613
+ " \n",
1614
+ " 37%|███▋ | 6000/16425 [29:02<46:49, 3.71it/s]"
1615
+ ]
1616
+ },
1617
+ {
1618
+ "name": "stdout",
1619
+ "output_type": "stream",
1620
+ "text": [
1621
+ "{'loss': 0.9552, 'learning_rate': 1.2694063926940641e-05, 'epoch': 1.1}\n"
1622
+ ]
1623
+ },
1624
+ {
1625
+ "name": "stderr",
1626
+ "output_type": "stream",
1627
+ "text": [
1628
+ " \n",
1629
+ " 40%|███▉ | 6500/16425 [31:18<44:30, 3.72it/s]"
1630
+ ]
1631
+ },
1632
+ {
1633
+ "name": "stdout",
1634
+ "output_type": "stream",
1635
+ "text": [
1636
+ "{'loss': 0.9877, 'learning_rate': 1.2085235920852361e-05, 'epoch': 1.19}\n"
1637
+ ]
1638
+ },
1639
+ {
1640
+ "name": "stderr",
1641
+ "output_type": "stream",
1642
+ "text": [
1643
+ " \n",
1644
+ " 43%|████▎ | 7000/16425 [33:45<46:40, 3.37it/s]"
1645
+ ]
1646
+ },
1647
+ {
1648
+ "name": "stdout",
1649
+ "output_type": "stream",
1650
+ "text": [
1651
+ "{'loss': 0.9877, 'learning_rate': 1.147640791476408e-05, 'epoch': 1.28}\n"
1652
+ ]
1653
+ },
1654
+ {
1655
+ "name": "stderr",
1656
+ "output_type": "stream",
1657
+ "text": [
1658
+ " \n",
1659
+ " 46%|████▌ | 7500/16425 [36:13<43:45, 3.40it/s]"
1660
+ ]
1661
+ },
1662
+ {
1663
+ "name": "stdout",
1664
+ "output_type": "stream",
1665
+ "text": [
1666
+ "{'loss': 0.9727, 'learning_rate': 1.08675799086758e-05, 'epoch': 1.37}\n"
1667
+ ]
1668
+ },
1669
+ {
1670
+ "name": "stderr",
1671
+ "output_type": "stream",
1672
+ "text": [
1673
+ " \n",
1674
+ " 49%|████▊ | 8000/16425 [38:33<37:53, 3.71it/s]"
1675
+ ]
1676
+ },
1677
+ {
1678
+ "name": "stdout",
1679
+ "output_type": "stream",
1680
+ "text": [
1681
+ "{'loss': 0.9713, 'learning_rate': 1.025875190258752e-05, 'epoch': 1.46}\n"
1682
+ ]
1683
+ },
1684
+ {
1685
+ "name": "stderr",
1686
+ "output_type": "stream",
1687
+ "text": [
1688
+ " \n",
1689
+ " 52%|█████▏ | 8500/16425 [40:49<35:44, 3.70it/s]"
1690
+ ]
1691
+ },
1692
+ {
1693
+ "name": "stdout",
1694
+ "output_type": "stream",
1695
+ "text": [
1696
+ "{'loss': 0.9441, 'learning_rate': 9.64992389649924e-06, 'epoch': 1.55}\n"
1697
+ ]
1698
+ },
1699
+ {
1700
+ "name": "stderr",
1701
+ "output_type": "stream",
1702
+ "text": [
1703
+ " \n",
1704
+ " 55%|█████▍ | 9000/16425 [43:05<33:23, 3.71it/s]"
1705
+ ]
1706
+ },
1707
+ {
1708
+ "name": "stdout",
1709
+ "output_type": "stream",
1710
+ "text": [
1711
+ "{'loss': 0.9527, 'learning_rate': 9.04109589041096e-06, 'epoch': 1.64}\n"
1712
+ ]
1713
+ },
1714
+ {
1715
+ "name": "stderr",
1716
+ "output_type": "stream",
1717
+ "text": [
1718
+ " \n",
1719
+ " 58%|█████▊ | 9500/16425 [45:21<31:09, 3.70it/s]"
1720
+ ]
1721
+ },
1722
+ {
1723
+ "name": "stdout",
1724
+ "output_type": "stream",
1725
+ "text": [
1726
+ "{'loss': 0.9311, 'learning_rate': 8.432267884322679e-06, 'epoch': 1.74}\n"
1727
+ ]
1728
+ },
1729
+ {
1730
+ "name": "stderr",
1731
+ "output_type": "stream",
1732
+ "text": [
1733
+ " \n",
1734
+ " 61%|██████ | 10000/16425 [47:38<28:52, 3.71it/s]"
1735
+ ]
1736
+ },
1737
+ {
1738
+ "name": "stdout",
1739
+ "output_type": "stream",
1740
+ "text": [
1741
+ "{'loss': 0.9318, 'learning_rate': 7.823439878234399e-06, 'epoch': 1.83}\n"
1742
+ ]
1743
+ },
1744
+ {
1745
+ "name": "stderr",
1746
+ "output_type": "stream",
1747
+ "text": [
1748
+ " \n",
1749
+ " 64%|██████▍ | 10500/16425 [49:54<26:39, 3.70it/s]"
1750
+ ]
1751
+ },
1752
+ {
1753
+ "name": "stdout",
1754
+ "output_type": "stream",
1755
+ "text": [
1756
+ "{'loss': 0.9639, 'learning_rate': 7.214611872146119e-06, 'epoch': 1.92}\n"
1757
+ ]
1758
+ },
1759
+ {
1760
+ "name": "stderr",
1761
+ "output_type": "stream",
1762
+ "text": [
1763
+ " 67%|██████▋ | 10950/16425 [51:56<24:02, 3.80it/s] \n",
1764
+ "\u001b[A\n",
1765
+ "\u001b[A\n",
1766
+ "\u001b[A\n",
1767
+ "\u001b[A\n",
1768
+ "\u001b[A\n",
1769
+ "\u001b[A\n",
1770
+ "\u001b[A\n",
1771
+ "\u001b[A\n",
1772
+ "\u001b[A\n",
1773
+ "\u001b[A\n",
1774
+ "\u001b[A\n",
1775
+ "\u001b[A\n",
1776
+ "\u001b[A\n",
1777
+ "\u001b[A\n",
1778
+ "\u001b[A\n",
1779
+ "\u001b[A\n",
1780
+ "\u001b[A\n",
1781
+ "\u001b[A\n",
1782
+ "\u001b[A\n",
1783
+ "\u001b[A\n",
1784
+ "\u001b[A\n",
1785
+ "\u001b[A\n",
1786
+ "\u001b[A\n",
1787
+ "\u001b[A\n",
1788
+ "\u001b[A\n",
1789
+ "\u001b[A\n",
1790
+ "\u001b[A\n",
1791
+ "\u001b[A\n",
1792
+ "\u001b[A\n",
1793
+ "\u001b[A\n",
1794
+ "\u001b[A\n",
1795
+ "\u001b[A\n",
1796
+ "\u001b[A\n",
1797
+ "\u001b[A\n",
1798
+ "\u001b[A\n",
1799
+ "\u001b[A\n",
1800
+ "\u001b[A\n",
1801
+ "\u001b[A\n",
1802
+ "\u001b[A\n",
1803
+ "\u001b[A\n",
1804
+ "\u001b[A\n",
1805
+ "\u001b[A\n",
1806
+ "\u001b[A\n",
1807
+ "\u001b[A\n",
1808
+ "\u001b[A\n",
1809
+ "\u001b[A\n",
1810
+ "\u001b[A\n",
1811
+ "\u001b[A\n",
1812
+ "\u001b[A\n",
1813
+ "\u001b[A\n",
1814
+ "\u001b[A\n",
1815
+ "\u001b[A\n",
1816
+ "\u001b[A\n",
1817
+ "\u001b[A\n",
1818
+ "\u001b[A\n",
1819
+ "\u001b[A\n",
1820
+ "\u001b[A\n",
1821
+ "\u001b[A\n",
1822
+ "\u001b[A\n",
1823
+ "\u001b[A\n",
1824
+ "\u001b[A\n",
1825
+ "\u001b[A\n",
1826
+ "\u001b[A\n",
1827
+ "\u001b[A\n",
1828
+ "\u001b[A\n",
1829
+ "\u001b[A\n",
1830
+ "\u001b[A\n",
1831
+ "\u001b[A\n",
1832
+ "\u001b[A\n",
1833
+ "\u001b[A\n",
1834
+ "\u001b[A\n",
1835
+ "\u001b[A\n",
1836
+ "\u001b[A\n",
1837
+ "\u001b[A\n",
1838
+ "\u001b[A\n",
1839
+ "\u001b[A\n",
1840
+ "\u001b[A\n",
1841
+ "\u001b[A\n",
1842
+ "\u001b[A\n",
1843
+ "\u001b[A\n",
1844
+ "\u001b[A\n",
1845
+ "\u001b[A\n",
1846
+ "\u001b[A\n",
1847
+ "\u001b[A\n",
1848
+ "\u001b[A\n",
1849
+ "\u001b[A\n",
1850
+ "\u001b[A\n",
1851
+ "\u001b[A\n",
1852
+ "\u001b[A\n",
1853
+ "\u001b[A\n",
1854
+ "\u001b[A\n",
1855
+ "\u001b[A\n",
1856
+ "\u001b[A\n",
1857
+ "\u001b[A\n",
1858
+ "\u001b[A\n",
1859
+ "\u001b[A\n",
1860
+ "\u001b[A\n",
1861
+ "\u001b[A\n",
1862
+ "\u001b[A\n",
1863
+ "\u001b[A\n",
1864
+ "\u001b[A\n",
1865
+ "\u001b[A\n",
1866
+ "\u001b[A\n",
1867
+ "\u001b[A\n",
1868
+ "\u001b[A\n",
1869
+ "\u001b[A\n",
1870
+ "\u001b[A\n",
1871
+ "\u001b[A\n",
1872
+ "\u001b[A\n",
1873
+ "\u001b[A\n",
1874
+ "\u001b[A\n",
1875
+ "\u001b[A\n",
1876
+ "\u001b[A\n",
1877
+ "\u001b[A\n",
1878
+ "\u001b[A\n",
1879
+ "\u001b[A\n",
1880
+ "\u001b[A\n",
1881
+ "\u001b[A\n",
1882
+ "\u001b[A\n",
1883
+ "\u001b[A\n",
1884
+ "\u001b[A\n",
1885
+ "\u001b[A\n",
1886
+ "\u001b[A\n",
1887
+ "\u001b[A\n",
1888
+ "\u001b[A\n",
1889
+ "\u001b[A\n",
1890
+ "\u001b[A\n",
1891
+ "\u001b[A\n",
1892
+ "\u001b[A\n",
1893
+ "\u001b[A\n",
1894
+ "\u001b[A\n",
1895
+ "\u001b[A\n",
1896
+ "\u001b[A\n",
1897
+ "\u001b[A\n",
1898
+ "\u001b[A\n",
1899
+ "\u001b[A\n",
1900
+ "\u001b[A\n",
1901
+ "\u001b[A\n",
1902
+ "\u001b[A\n",
1903
+ "\u001b[A\n",
1904
+ "\u001b[A\n",
1905
+ "\u001b[A\n",
1906
+ "\u001b[A\n",
1907
+ "\u001b[A\n",
1908
+ "\u001b[A\n",
1909
+ "\u001b[A\n",
1910
+ "\u001b[A\n",
1911
+ "\u001b[A\n",
1912
+ "\u001b[A\n",
1913
+ "\u001b[A\n",
1914
+ "\u001b[A\n",
1915
+ "\u001b[A\n",
1916
+ "\u001b[A\n",
1917
+ "\u001b[A\n",
1918
+ "\u001b[A\n",
1919
+ "\u001b[A\n",
1920
+ "\u001b[A\n",
1921
+ "\u001b[A\n",
1922
+ "\u001b[A\n",
1923
+ "\u001b[A\n",
1924
+ "\u001b[A\n",
1925
+ "\u001b[A\n",
1926
+ "\u001b[A\n",
1927
+ "\u001b[A\n",
1928
+ "\u001b[A\n",
1929
+ "\u001b[A\n",
1930
+ "\u001b[A\n",
1931
+ "\u001b[A\n",
1932
+ "\u001b[A\n",
1933
+ "\u001b[A\n",
1934
+ "\u001b[A\n",
1935
+ "\u001b[A\n",
1936
+ "\u001b[A\n",
1937
+ "\u001b[A\n",
1938
+ "\u001b[A\n",
1939
+ "\u001b[A\n",
1940
+ "\u001b[A\n",
1941
+ "\u001b[A\n",
1942
+ "\u001b[A\n",
1943
+ "\u001b[A\n",
1944
+ "\u001b[A\n",
1945
+ "\u001b[A\n",
1946
+ "\u001b[A\n",
1947
+ "\u001b[A\n",
1948
+ "\u001b[A\n",
1949
+ "\u001b[A\n",
1950
+ "\u001b[A\n",
1951
+ "\u001b[A\n",
1952
+ "\u001b[A\n",
1953
+ "\u001b[A\n",
1954
+ "\u001b[A\n",
1955
+ "\u001b[A\n",
1956
+ "\u001b[A\n",
1957
+ "\u001b[A\n",
1958
+ "\u001b[A\n",
1959
+ "\u001b[A\n",
1960
+ "\u001b[A\n",
1961
+ "\u001b[A\n",
1962
+ "\u001b[A\n",
1963
+ "\u001b[A\n",
1964
+ "\u001b[A\n",
1965
+ "\u001b[A\n",
1966
+ "\u001b[A\n",
1967
+ "\u001b[A\n",
1968
+ "\u001b[A\n",
1969
+ "\u001b[A\n",
1970
+ "\u001b[A\n",
1971
+ "\u001b[A\n",
1972
+ "\u001b[A\n",
1973
+ "\u001b[A\n",
1974
+ "\u001b[A\n",
1975
+ "\u001b[A\n",
1976
+ "\u001b[A\n",
1977
+ "\u001b[A\n",
1978
+ "\u001b[A\n",
1979
+ "\u001b[A\n",
1980
+ "\u001b[A\n",
1981
+ "\u001b[A\n",
1982
+ "\u001b[A\n",
1983
+ "\u001b[A\n",
1984
+ "\u001b[A\n",
1985
+ "\u001b[A\n",
1986
+ "\u001b[A\n",
1987
+ "\u001b[A\n",
1988
+ "\u001b[A\n",
1989
+ "\u001b[A\n",
1990
+ "\u001b[A\n",
1991
+ "\u001b[A\n",
1992
+ "\u001b[A\n",
1993
+ "\u001b[A\n",
1994
+ "\u001b[A\n",
1995
+ "\u001b[A\n",
1996
+ "\u001b[A\n",
1997
+ "\u001b[A\n",
1998
+ "\u001b[A\n",
1999
+ "\u001b[A\n",
2000
+ "\u001b[A\n",
2001
+ "\u001b[A\n",
2002
+ "\u001b[A\n",
2003
+ "\u001b[A\n",
2004
+ "\u001b[A\n",
2005
+ "\u001b[A\n",
2006
+ "\u001b[A\n",
2007
+ "\u001b[A\n",
2008
+ "\u001b[A\n",
2009
+ "\u001b[A\n",
2010
+ "\u001b[A\n",
2011
+ "\u001b[A\n",
2012
+ "\u001b[A\n",
2013
+ "\u001b[A\n",
2014
+ "\u001b[A\n",
2015
+ "\u001b[A\n",
2016
+ "\u001b[A\n",
2017
+ "\u001b[A\n",
2018
+ "\u001b[A\n",
2019
+ "\u001b[A\n",
2020
+ "\u001b[A\n",
2021
+ "\u001b[A\n",
2022
+ "\u001b[A\n",
2023
+ "\u001b[A\n",
2024
+ "\u001b[A\n",
2025
+ "\u001b[A\n",
2026
+ "\u001b[A\n",
2027
+ "\u001b[A\n",
2028
+ "\u001b[A\n",
2029
+ "\u001b[A\n",
2030
+ "\u001b[A\n",
2031
+ "\u001b[A\n",
2032
+ "\u001b[A\n",
2033
+ "\u001b[A\n",
2034
+ "\u001b[A\n",
2035
+ "\u001b[A\n",
2036
+ "\u001b[A\n",
2037
+ "\u001b[A\n",
2038
+ "\u001b[A\n",
2039
+ "\u001b[A\n",
2040
+ "\u001b[A\n",
2041
+ "\u001b[A\n",
2042
+ "\u001b[A\n",
2043
+ "\u001b[A\n",
2044
+ "\u001b[A\n",
2045
+ "\u001b[A\n",
2046
+ "\u001b[A\n",
2047
+ "\u001b[A\n",
2048
+ "\u001b[A\n",
2049
+ "\u001b[A\n",
2050
+ "\u001b[A\n",
2051
+ "\u001b[A\n",
2052
+ "\u001b[A\n",
2053
+ "\u001b[A\n",
2054
+ "\u001b[A\n",
2055
+ "\u001b[A\n",
2056
+ "\u001b[A\n",
2057
+ "\u001b[A\n",
2058
+ "\u001b[A\n",
2059
+ "\u001b[A\n",
2060
+ "\u001b[A\n",
2061
+ "\u001b[A\n",
2062
+ "\u001b[A\n",
2063
+ "\u001b[A\n",
2064
+ "\u001b[A\n",
2065
+ "\u001b[A\n",
2066
+ "\u001b[A\n",
2067
+ "\u001b[A\n",
2068
+ "\u001b[A\n",
2069
+ "\u001b[A\n",
2070
+ "\u001b[A\n",
2071
+ "\u001b[A\n",
2072
+ "\u001b[A\n",
2073
+ "\u001b[A\n",
2074
+ "\u001b[A\n",
2075
+ "\u001b[A\n",
2076
+ "\u001b[A\n",
2077
+ "\u001b[A\n",
2078
+ "\u001b[A\n",
2079
+ "\u001b[A\n",
2080
+ "\u001b[A\n",
2081
+ "\u001b[A\n",
2082
+ "\u001b[A\n",
2083
+ "\u001b[A\n",
2084
+ "\u001b[A\n",
2085
+ "\u001b[A\n",
2086
+ "\u001b[A\n",
2087
+ "\u001b[A\n",
2088
+ "\u001b[A\n",
2089
+ "\u001b[A\n",
2090
+ "\u001b[A\n",
2091
+ "\u001b[A\n",
2092
+ "\u001b[A\n",
2093
+ "\u001b[A\n",
2094
+ " \n",
2095
+ "\n",
2096
+ "\u001b[A\u001b[A \n",
2097
+ " 67%|██████▋ | 10950/16425 [52:56<24:02, 3.80it/s]\n",
2098
+ "\u001b[A\n",
2099
+ "\u001b[A"
2100
+ ]
2101
+ },
2102
+ {
2103
+ "name": "stdout",
2104
+ "output_type": "stream",
2105
+ "text": [
2106
+ "{'eval_loss': 1.0952799320220947, 'eval_runtime': 59.2147, 'eval_samples_per_second': 178.503, 'eval_steps_per_second': 11.163, 'epoch': 2.0}\n"
2107
+ ]
2108
+ },
2109
+ {
2110
+ "name": "stderr",
2111
+ "output_type": "stream",
2112
+ "text": [
2113
+ " \n",
2114
+ " 67%|██████▋ | 11000/16425 [53:09<24:24, 3.71it/s]"
2115
+ ]
2116
+ },
2117
+ {
2118
+ "name": "stdout",
2119
+ "output_type": "stream",
2120
+ "text": [
2121
+ "{'loss': 0.9442, 'learning_rate': 6.605783866057839e-06, 'epoch': 2.01}\n"
2122
+ ]
2123
+ },
2124
+ {
2125
+ "name": "stderr",
2126
+ "output_type": "stream",
2127
+ "text": [
2128
+ " \n",
2129
+ " 70%|███████ | 11500/16425 [55:29<24:05, 3.41it/s]"
2130
+ ]
2131
+ },
2132
+ {
2133
+ "name": "stdout",
2134
+ "output_type": "stream",
2135
+ "text": [
2136
+ "{'loss': 0.7817, 'learning_rate': 5.996955859969558e-06, 'epoch': 2.1}\n"
2137
+ ]
2138
+ },
2139
+ {
2140
+ "name": "stderr",
2141
+ "output_type": "stream",
2142
+ "text": [
2143
+ " \n",
2144
+ " 73%|███████▎ | 12000/16425 [57:57<21:38, 3.41it/s]"
2145
+ ]
2146
+ },
2147
+ {
2148
+ "name": "stdout",
2149
+ "output_type": "stream",
2150
+ "text": [
2151
+ "{'loss': 0.7787, 'learning_rate': 5.388127853881279e-06, 'epoch': 2.19}\n"
2152
+ ]
2153
+ },
2154
+ {
2155
+ "name": "stderr",
2156
+ "output_type": "stream",
2157
+ "text": [
2158
+ " \n",
2159
+ " 76%|███████▌ | 12500/16425 [1:00:25<17:38, 3.71it/s]"
2160
+ ]
2161
+ },
2162
+ {
2163
+ "name": "stdout",
2164
+ "output_type": "stream",
2165
+ "text": [
2166
+ "{'loss': 0.7367, 'learning_rate': 4.779299847792998e-06, 'epoch': 2.28}\n"
2167
+ ]
2168
+ },
2169
+ {
2170
+ "name": "stderr",
2171
+ "output_type": "stream",
2172
+ "text": [
2173
+ " \n",
2174
+ " 79%|███████▉ | 13000/16425 [1:02:41<15:24, 3.70it/s]"
2175
+ ]
2176
+ },
2177
+ {
2178
+ "name": "stdout",
2179
+ "output_type": "stream",
2180
+ "text": [
2181
+ "{'loss': 0.7513, 'learning_rate': 4.170471841704719e-06, 'epoch': 2.37}\n"
2182
+ ]
2183
+ },
2184
+ {
2185
+ "name": "stderr",
2186
+ "output_type": "stream",
2187
+ "text": [
2188
+ " \n",
2189
+ " 82%|████████▏ | 13500/16425 [1:04:57<13:10, 3.70it/s]"
2190
+ ]
2191
+ },
2192
+ {
2193
+ "name": "stdout",
2194
+ "output_type": "stream",
2195
+ "text": [
2196
+ "{'loss': 0.7739, 'learning_rate': 3.5616438356164386e-06, 'epoch': 2.47}\n"
2197
+ ]
2198
+ },
2199
+ {
2200
+ "name": "stderr",
2201
+ "output_type": "stream",
2202
+ "text": [
2203
+ " \n",
2204
+ " 85%|████████▌ | 14000/16425 [1:07:14<11:57, 3.38it/s]"
2205
+ ]
2206
+ },
2207
+ {
2208
+ "name": "stdout",
2209
+ "output_type": "stream",
2210
+ "text": [
2211
+ "{'loss': 0.7591, 'learning_rate': 2.9528158295281586e-06, 'epoch': 2.56}\n"
2212
+ ]
2213
+ },
2214
+ {
2215
+ "name": "stderr",
2216
+ "output_type": "stream",
2217
+ "text": [
2218
+ " \n",
2219
+ " 88%|████████▊ | 14500/16425 [1:09:43<09:28, 3.39it/s]"
2220
+ ]
2221
+ },
2222
+ {
2223
+ "name": "stdout",
2224
+ "output_type": "stream",
2225
+ "text": [
2226
+ "{'loss': 0.7491, 'learning_rate': 2.343987823439878e-06, 'epoch': 2.65}\n"
2227
+ ]
2228
+ },
2229
+ {
2230
+ "name": "stderr",
2231
+ "output_type": "stream",
2232
+ "text": [
2233
+ " \n",
2234
+ " 91%|█████████▏| 15000/16425 [1:12:12<07:00, 3.39it/s]"
2235
+ ]
2236
+ },
2237
+ {
2238
+ "name": "stdout",
2239
+ "output_type": "stream",
2240
+ "text": [
2241
+ "{'loss': 0.7394, 'learning_rate': 1.7351598173515982e-06, 'epoch': 2.74}\n"
2242
+ ]
2243
+ },
2244
+ {
2245
+ "name": "stderr",
2246
+ "output_type": "stream",
2247
+ "text": [
2248
+ " \n",
2249
+ " 94%|█████████▍| 15500/16425 [1:14:29<04:10, 3.70it/s]"
2250
+ ]
2251
+ },
2252
+ {
2253
+ "name": "stdout",
2254
+ "output_type": "stream",
2255
+ "text": [
2256
+ "{'loss': 0.7548, 'learning_rate': 1.1263318112633182e-06, 'epoch': 2.83}\n"
2257
+ ]
2258
+ },
2259
+ {
2260
+ "name": "stderr",
2261
+ "output_type": "stream",
2262
+ "text": [
2263
+ " \n",
2264
+ " 97%|█████████▋| 16000/16425 [1:16:46<01:54, 3.70it/s]"
2265
+ ]
2266
+ },
2267
+ {
2268
+ "name": "stdout",
2269
+ "output_type": "stream",
2270
+ "text": [
2271
+ "{'loss': 0.7728, 'learning_rate': 5.17503805175038e-07, 'epoch': 2.92}\n"
2272
+ ]
2273
+ },
2274
+ {
2275
+ "name": "stderr",
2276
+ "output_type": "stream",
2277
+ "text": [
2278
+ "100%|██████████| 16425/16425 [1:18:46<00:00, 3.48it/s]\n",
2623
+ "100%|██████████| 16425/16425 [1:19:51<00:00, 3.48it/s]\n",
2626
+ "100%|██████████| 16425/16425 [1:19:51<00:00, 3.43it/s]"
2627
+ ]
2628
+ },
2629
+ {
2630
+ "name": "stdout",
2631
+ "output_type": "stream",
2632
+ "text": [
2633
+ "{'eval_loss': 1.1455239057540894, 'eval_runtime': 64.9253, 'eval_samples_per_second': 162.802, 'eval_steps_per_second': 10.181, 'epoch': 3.0}\n",
2634
+ "{'train_runtime': 4791.8158, 'train_samples_per_second': 54.843, 'train_steps_per_second': 3.428, 'train_loss': 1.0782835241866437, 'epoch': 3.0}\n"
2635
+ ]
2636
+ },
2637
+ {
2638
+ "name": "stderr",
2639
+ "output_type": "stream",
2640
+ "text": [
2641
+ "\n"
2642
+ ]
2643
+ },
2644
+ {
2645
+ "data": {
2646
+ "text/plain": [
2647
+ "TrainOutput(global_step=16425, training_loss=1.0782835241866437, metrics={'train_runtime': 4791.8158, 'train_samples_per_second': 54.843, 'train_steps_per_second': 3.428, 'train_loss': 1.0782835241866437, 'epoch': 3.0})"
2648
+ ]
2649
+ },
2650
+ "execution_count": 17,
2651
+ "metadata": {},
2652
+ "output_type": "execute_result"
2653
+ }
2654
+ ],
2655
+ "source": [
2656
+ "trainer.train()"
2657
+ ]
2658
+ },
2659
+ {
2660
+ "cell_type": "code",
2661
+ "execution_count": 6,
2662
+ "metadata": {},
2663
+ "outputs": [],
2664
+ "source": [
2665
+ "question = \"how many parameters in bloom?\"\n",
2666
+ "context = \"BLOOM has 176 billion parameters and can generate text in 46 languages natural languages and 13 programming languages.\""
2667
+ ]
2668
+ },
2669
+ {
2670
+ "cell_type": "code",
2671
+ "execution_count": 7,
2672
+ "metadata": {},
2673
+ "outputs": [
2674
+ {
2675
+ "data": {
2676
+ "text/plain": [
2677
+ "{'score': 0.8450772762298584, 'start': 10, 'end': 21, 'answer': '176 billion'}"
2678
+ ]
2679
+ },
2680
+ "execution_count": 7,
2681
+ "metadata": {},
2682
+ "output_type": "execute_result"
2683
+ }
2684
+ ],
2685
+ "source": [
2686
+ "from transformers import pipeline\n",
2687
+ "question_answerer = pipeline(\"question-answering\", model=\"finetuning_squad/checkpoint-16000\")\n",
2688
+ "question_answerer(question=question, context=context)"
2689
+ ]
2690
+ },
2691
+ {
2692
+ "cell_type": "code",
2693
+ "execution_count": null,
2694
+ "metadata": {},
2695
+ "outputs": [],
2696
+ "source": []
2697
+ }
2698
+ ],
2699
+ "metadata": {
2700
+ "kernelspec": {
2701
+ "display_name": "transformer",
2702
+ "language": "python",
2703
+ "name": "python3"
2704
+ },
2705
+ "language_info": {
2706
+ "codemirror_mode": {
2707
+ "name": "ipython",
2708
+ "version": 3
2709
+ },
2710
+ "file_extension": ".py",
2711
+ "mimetype": "text/x-python",
2712
+ "name": "python",
2713
+ "nbconvert_exporter": "python",
2714
+ "pygments_lexer": "ipython3",
2715
+ "version": "3.12.1"
2716
+ }
2717
+ },
2718
+ "nbformat": 4,
2719
+ "nbformat_minor": 2
2720
+ }
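
For reference, a minimal sketch of what the pipeline("question-answering", ...) call in the final cells does under the hood, assuming the fine-tuned checkpoint directory finetuning_squad/checkpoint-16000 is available locally. The checkpoint path and the question/context strings come from the notebook; everything else is illustrative:

import torch
from transformers import AutoModelForQuestionAnswering, AutoTokenizer

checkpoint = "finetuning_squad/checkpoint-16000"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForQuestionAnswering.from_pretrained(checkpoint)

question = "How many parameters are in BLOOM?"
context = ("BLOOM has 176 billion parameters and can generate text in "
           "46 natural languages and 13 programming languages.")

# Encode the (question, context) pair and run one forward pass.
inputs = tokenizer(question, context, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# The QA head scores every token as a candidate answer start/end;
# taking the argmax of each gives the predicted span. (The real
# pipeline additionally masks question tokens and enforces start <= end.)
start = int(outputs.start_logits.argmax())
end = int(outputs.end_logits.argmax())
print(tokenizer.decode(inputs["input_ids"][0, start : end + 1]))  # "176 billion"
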
finetuning_squad/checkpoint-16000/config.json ADDED
@@ -0,0 +1,24 @@
1
+ {
2
+ "_name_or_path": "distilbert-base-uncased",
3
+ "activation": "gelu",
4
+ "architectures": [
5
+ "DistilBertForQuestionAnswering"
6
+ ],
7
+ "attention_dropout": 0.1,
8
+ "dim": 768,
9
+ "dropout": 0.1,
10
+ "hidden_dim": 3072,
11
+ "initializer_range": 0.02,
12
+ "max_position_embeddings": 512,
13
+ "model_type": "distilbert",
14
+ "n_heads": 12,
15
+ "n_layers": 6,
16
+ "pad_token_id": 0,
17
+ "qa_dropout": 0.1,
18
+ "seq_classif_dropout": 0.2,
19
+ "sinusoidal_pos_embds": false,
20
+ "tie_weights_": true,
21
+ "torch_dtype": "float32",
22
+ "transformers_version": "4.36.2",
23
+ "vocab_size": 30522
24
+ }
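
This config confirms the checkpoint is distilbert-base-uncased with a DistilBertForQuestionAnswering head: 6 transformer layers, 12 attention heads, hidden size 768, and a 30,522-token vocabulary. A quick programmatic sanity check (a sketch, reusing the checkpoint path from the notebook):

from transformers import AutoConfig

cfg = AutoConfig.from_pretrained("finetuning_squad/checkpoint-16000")
print(cfg.model_type, cfg.n_layers, cfg.n_heads, cfg.dim)  # distilbert 6 12 768
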
finetuning_squad/checkpoint-16000/model.safetensors ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:31d5b145696822a4d87bbbbfb78718c21d0e3af6d3223481ed6583cad3634933
3
+ size 265470032
finetuning_squad/checkpoint-16000/optimizer.pt ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:6ed327d5a672048f2766660157675fc6060d6a8fea4b668d5b57a8cd257717c1
3
+ size 531000890
finetuning_squad/checkpoint-16000/rng_state.pth ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:c7694ad61c6f3fdf06c3886f3b2051ac78ad82b8d781c31b56980e3472c1e84b
3
+ size 14244
finetuning_squad/checkpoint-16000/scheduler.pt ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:9606f0ea2f5eb5f76ad015d4a642c5e3726e2eb5a07b004c0b731a70b88768fb
3
+ size 1064
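
Note that model.safetensors, optimizer.pt, rng_state.pth, and scheduler.pt are Git LFS pointer files, not the binaries themselves; after cloning the repo, git lfs pull fetches the actual artifacts (~265 MB of weights and ~531 MB of optimizer state, roughly twice the model size, as expected for AdamW's two moment buffers). Inference only needs the weights plus the config and tokenizer files below; the optimizer, scheduler, and RNG states are there so training can resume from this checkpoint.
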
finetuning_squad/checkpoint-16000/special_tokens_map.json ADDED
@@ -0,0 +1,7 @@
1
+ {
2
+ "cls_token": "[CLS]",
3
+ "mask_token": "[MASK]",
4
+ "pad_token": "[PAD]",
5
+ "sep_token": "[SEP]",
6
+ "unk_token": "[UNK]"
7
+ }
finetuning_squad/checkpoint-16000/tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
finetuning_squad/checkpoint-16000/tokenizer_config.json ADDED
@@ -0,0 +1,55 @@
1
+ {
2
+ "added_tokens_decoder": {
3
+ "0": {
4
+ "content": "[PAD]",
5
+ "lstrip": false,
6
+ "normalized": false,
7
+ "rstrip": false,
8
+ "single_word": false,
9
+ "special": true
10
+ },
11
+ "100": {
12
+ "content": "[UNK]",
13
+ "lstrip": false,
14
+ "normalized": false,
15
+ "rstrip": false,
16
+ "single_word": false,
17
+ "special": true
18
+ },
19
+ "101": {
20
+ "content": "[CLS]",
21
+ "lstrip": false,
22
+ "normalized": false,
23
+ "rstrip": false,
24
+ "single_word": false,
25
+ "special": true
26
+ },
27
+ "102": {
28
+ "content": "[SEP]",
29
+ "lstrip": false,
30
+ "normalized": false,
31
+ "rstrip": false,
32
+ "single_word": false,
33
+ "special": true
34
+ },
35
+ "103": {
36
+ "content": "[MASK]",
37
+ "lstrip": false,
38
+ "normalized": false,
39
+ "rstrip": false,
40
+ "single_word": false,
41
+ "special": true
42
+ }
43
+ },
44
+ "clean_up_tokenization_spaces": true,
45
+ "cls_token": "[CLS]",
46
+ "do_lower_case": true,
47
+ "mask_token": "[MASK]",
48
+ "model_max_length": 512,
49
+ "pad_token": "[PAD]",
50
+ "sep_token": "[SEP]",
51
+ "strip_accents": null,
52
+ "tokenize_chinese_chars": true,
53
+ "tokenizer_class": "DistilBertTokenizer",
54
+ "unk_token": "[UNK]"
55
+ }
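
The tokenizer config pins the standard uncased WordPiece setup (lowercasing enabled, model_max_length 512) with the usual special-token ids: [PAD]=0, [UNK]=100, [CLS]=101, [SEP]=102, [MASK]=103. A quick check that the saved tokenizer loads as expected (a sketch):

from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("finetuning_squad/checkpoint-16000")
print(tok.pad_token_id, tok.unk_token_id, tok.cls_token_id,
      tok.sep_token_id, tok.mask_token_id)  # 0 100 101 102 103
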
finetuning_squad/checkpoint-16000/trainer_state.json ADDED
@@ -0,0 +1,229 @@
1
+ {
2
+ "best_metric": null,
3
+ "best_model_checkpoint": null,
4
+ "epoch": 2.922374429223744,
5
+ "eval_steps": 500,
6
+ "global_step": 16000,
7
+ "is_hyper_param_search": false,
8
+ "is_local_process_zero": true,
9
+ "is_world_process_zero": true,
10
+ "log_history": [
11
+ {
12
+ "epoch": 0.09,
13
+ "learning_rate": 1.9391171993911722e-05,
14
+ "loss": 2.9417,
15
+ "step": 500
16
+ },
17
+ {
18
+ "epoch": 0.18,
19
+ "learning_rate": 1.8782343987823442e-05,
20
+ "loss": 1.7699,
21
+ "step": 1000
22
+ },
23
+ {
24
+ "epoch": 0.27,
25
+ "learning_rate": 1.8173515981735163e-05,
26
+ "loss": 1.5303,
27
+ "step": 1500
28
+ },
29
+ {
30
+ "epoch": 0.37,
31
+ "learning_rate": 1.756468797564688e-05,
32
+ "loss": 1.46,
33
+ "step": 2000
34
+ },
35
+ {
36
+ "epoch": 0.46,
37
+ "learning_rate": 1.69558599695586e-05,
38
+ "loss": 1.393,
39
+ "step": 2500
40
+ },
41
+ {
42
+ "epoch": 0.55,
43
+ "learning_rate": 1.634703196347032e-05,
44
+ "loss": 1.3692,
45
+ "step": 3000
46
+ },
47
+ {
48
+ "epoch": 0.64,
49
+ "learning_rate": 1.573820395738204e-05,
50
+ "loss": 1.3134,
51
+ "step": 3500
52
+ },
53
+ {
54
+ "epoch": 0.73,
55
+ "learning_rate": 1.5129375951293761e-05,
56
+ "loss": 1.2416,
57
+ "step": 4000
58
+ },
59
+ {
60
+ "epoch": 0.82,
61
+ "learning_rate": 1.4520547945205482e-05,
62
+ "loss": 1.2574,
63
+ "step": 4500
64
+ },
65
+ {
66
+ "epoch": 0.91,
67
+ "learning_rate": 1.39117199391172e-05,
68
+ "loss": 1.2039,
69
+ "step": 5000
70
+ },
71
+ {
72
+ "epoch": 1.0,
73
+ "eval_loss": 1.1474684476852417,
74
+ "eval_runtime": 59.0937,
75
+ "eval_samples_per_second": 178.868,
76
+ "eval_steps_per_second": 11.186,
77
+ "step": 5475
78
+ },
79
+ {
80
+ "epoch": 1.0,
81
+ "learning_rate": 1.330289193302892e-05,
82
+ "loss": 1.1717,
83
+ "step": 5500
84
+ },
85
+ {
86
+ "epoch": 1.1,
87
+ "learning_rate": 1.2694063926940641e-05,
88
+ "loss": 0.9552,
89
+ "step": 6000
90
+ },
91
+ {
92
+ "epoch": 1.19,
93
+ "learning_rate": 1.2085235920852361e-05,
94
+ "loss": 0.9877,
95
+ "step": 6500
96
+ },
97
+ {
98
+ "epoch": 1.28,
99
+ "learning_rate": 1.147640791476408e-05,
100
+ "loss": 0.9877,
101
+ "step": 7000
102
+ },
103
+ {
104
+ "epoch": 1.37,
105
+ "learning_rate": 1.08675799086758e-05,
106
+ "loss": 0.9727,
107
+ "step": 7500
108
+ },
109
+ {
110
+ "epoch": 1.46,
111
+ "learning_rate": 1.025875190258752e-05,
112
+ "loss": 0.9713,
113
+ "step": 8000
114
+ },
115
+ {
116
+ "epoch": 1.55,
117
+ "learning_rate": 9.64992389649924e-06,
118
+ "loss": 0.9441,
119
+ "step": 8500
120
+ },
121
+ {
122
+ "epoch": 1.64,
123
+ "learning_rate": 9.04109589041096e-06,
124
+ "loss": 0.9527,
125
+ "step": 9000
126
+ },
127
+ {
128
+ "epoch": 1.74,
129
+ "learning_rate": 8.432267884322679e-06,
130
+ "loss": 0.9311,
131
+ "step": 9500
132
+ },
133
+ {
134
+ "epoch": 1.83,
135
+ "learning_rate": 7.823439878234399e-06,
136
+ "loss": 0.9318,
137
+ "step": 10000
138
+ },
139
+ {
140
+ "epoch": 1.92,
141
+ "learning_rate": 7.214611872146119e-06,
142
+ "loss": 0.9639,
143
+ "step": 10500
144
+ },
145
+ {
146
+ "epoch": 2.0,
147
+ "eval_loss": 1.0952799320220947,
148
+ "eval_runtime": 59.2147,
149
+ "eval_samples_per_second": 178.503,
150
+ "eval_steps_per_second": 11.163,
151
+ "step": 10950
152
+ },
153
+ {
154
+ "epoch": 2.01,
155
+ "learning_rate": 6.605783866057839e-06,
156
+ "loss": 0.9442,
157
+ "step": 11000
158
+ },
159
+ {
160
+ "epoch": 2.1,
161
+ "learning_rate": 5.996955859969558e-06,
162
+ "loss": 0.7817,
163
+ "step": 11500
164
+ },
165
+ {
166
+ "epoch": 2.19,
167
+ "learning_rate": 5.388127853881279e-06,
168
+ "loss": 0.7787,
169
+ "step": 12000
170
+ },
171
+ {
172
+ "epoch": 2.28,
173
+ "learning_rate": 4.779299847792998e-06,
174
+ "loss": 0.7367,
175
+ "step": 12500
176
+ },
177
+ {
178
+ "epoch": 2.37,
179
+ "learning_rate": 4.170471841704719e-06,
180
+ "loss": 0.7513,
181
+ "step": 13000
182
+ },
183
+ {
184
+ "epoch": 2.47,
185
+ "learning_rate": 3.5616438356164386e-06,
186
+ "loss": 0.7739,
187
+ "step": 13500
188
+ },
189
+ {
190
+ "epoch": 2.56,
191
+ "learning_rate": 2.9528158295281586e-06,
192
+ "loss": 0.7591,
193
+ "step": 14000
194
+ },
195
+ {
196
+ "epoch": 2.65,
197
+ "learning_rate": 2.343987823439878e-06,
198
+ "loss": 0.7491,
199
+ "step": 14500
200
+ },
201
+ {
202
+ "epoch": 2.74,
203
+ "learning_rate": 1.7351598173515982e-06,
204
+ "loss": 0.7394,
205
+ "step": 15000
206
+ },
207
+ {
208
+ "epoch": 2.83,
209
+ "learning_rate": 1.1263318112633182e-06,
210
+ "loss": 0.7548,
211
+ "step": 15500
212
+ },
213
+ {
214
+ "epoch": 2.92,
215
+ "learning_rate": 5.17503805175038e-07,
216
+ "loss": 0.7728,
217
+ "step": 16000
218
+ }
219
+ ],
220
+ "logging_steps": 500,
221
+ "max_steps": 16425,
222
+ "num_input_tokens_seen": 0,
223
+ "num_train_epochs": 3,
224
+ "save_steps": 500,
225
+ "total_flos": 2.508519922649395e+16,
226
+ "train_batch_size": 16,
227
+ "trial_name": null,
228
+ "trial_params": null
229
+ }
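
trainer_state.json pins down the shape of the run: 16,425 optimizer steps over 3 epochs at batch size 16, i.e. 5,475 steps per epoch, consistent with the 87,599-example SQuAD training split (ceil(87599 / 16) = 5475); logging and checkpointing every 500 steps; evaluation at the epoch boundaries (steps 5475 and 10950). The learning-rate trail is a linear decay from 2e-5: at step 500, 2e-5 × (1 − 500/16425) ≈ 1.9391e-5, exactly the first logged value. A plausible reconstruction of the setup follows as a sketch; the authoritative arguments are serialized in training_args.bin:

from transformers import TrainingArguments

# Hypothetical reconstruction inferred from trainer_state.json.
args = TrainingArguments(
    output_dir="finetuning_squad",
    evaluation_strategy="epoch",      # eval_loss logged at steps 5475 / 10950
    learning_rate=2e-5,               # linear decay to 0 over 16425 steps
    per_device_train_batch_size=16,   # "train_batch_size": 16
    num_train_epochs=3,
    logging_steps=500,
    save_steps=500,
)
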
finetuning_squad/checkpoint-16000/training_args.bin ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:ef75c14140ee1a816e11644c80a4f2ddc72be729b079d3bd065d7c138a8789ae
3
+ size 4664
finetuning_squad/checkpoint-16000/vocab.txt ADDED
The diff for this file is too large to render. See raw diff
 
finetuning_squad/runs/Feb04_15-16-16_gpu-MS-7C06/events.out.tfevents.1707039077.gpu-MS-7C06.1949275.0 ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:3e4e3b23a4abd33f5d2d3c2395f9e5369b147d6ad87af4f04663d0713cfb45f3
3
+ size 4184
finetuning_squad/runs/Feb04_15-16-59_gpu-MS-7C06/events.out.tfevents.1707039122.gpu-MS-7C06.1949275.1 ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:faa2173fe2fef60ae9786509ae0c0ef1babf79e919b2d708c3f13486475f88dd
3
+ size 4184
finetuning_squad/runs/Feb04_15-17-09_gpu-MS-7C06/events.out.tfevents.1707039130.gpu-MS-7C06.1949275.2 ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:04b6ae48515fca49b081f9198716de0654024dd6c98e7b6ea633d9359004f3f8
3
+ size 10386
requirements.txt ADDED
@@ -0,0 +1,88 @@
1
+ accelerate==0.26.1
2
+ aiohttp==3.9.3
3
+ aiosignal==1.3.1
4
+ asttokens @ file:///home/conda/feedstock_root/build_artifacts/asttokens_1698341106958/work
5
+ attrs==23.2.0
6
+ certifi==2024.2.2
7
+ charset-normalizer==3.3.2
8
+ comm @ file:///home/conda/feedstock_root/build_artifacts/comm_1704278392174/work
9
+ datasets==2.16.1
10
+ debugpy @ file:///work/perseverance-python-buildout/croot/debugpy_1698884710808/work
11
+ decorator @ file:///home/conda/feedstock_root/build_artifacts/decorator_1641555617451/work
12
+ dill==0.3.7
13
+ exceptiongroup @ file:///home/conda/feedstock_root/build_artifacts/exceptiongroup_1704921103267/work
14
+ executing @ file:///home/conda/feedstock_root/build_artifacts/executing_1698579936712/work
15
+ filelock==3.13.1
16
+ frozenlist==1.4.1
17
+ fsspec==2023.10.0
18
+ huggingface-hub==0.20.3
19
+ idna==3.6
20
+ importlib-metadata @ file:///home/conda/feedstock_root/build_artifacts/importlib-metadata_1703269254275/work
21
+ ipykernel @ file:///home/conda/feedstock_root/build_artifacts/ipykernel_1705417941265/work
22
+ ipython @ file:///home/conda/feedstock_root/build_artifacts/ipython_1706795662110/work
23
+ jedi @ file:///home/conda/feedstock_root/build_artifacts/jedi_1696326070614/work
24
+ Jinja2==3.1.3
25
+ jupyter_client @ file:///home/conda/feedstock_root/build_artifacts/jupyter_client_1699283905679/work
26
+ jupyter_core @ file:///work/perseverance-python-buildout/croot/jupyter_core_1701731747496/work
27
+ MarkupSafe==2.1.5
28
+ matplotlib-inline @ file:///home/conda/feedstock_root/build_artifacts/matplotlib-inline_1660814786464/work
29
+ mpmath==1.3.0
30
+ multidict==6.0.5
31
+ multiprocess==0.70.15
32
+ nest_asyncio @ file:///home/conda/feedstock_root/build_artifacts/nest-asyncio_1705850609492/work
33
+ networkx==3.2.1
34
+ numpy==1.26.3
35
+ nvidia-cublas-cu12==12.1.3.1
36
+ nvidia-cuda-cupti-cu12==12.1.105
37
+ nvidia-cuda-nvrtc-cu12==12.1.105
38
+ nvidia-cuda-runtime-cu12==12.1.105
39
+ nvidia-cudnn-cu12==8.9.2.26
40
+ nvidia-cufft-cu12==11.0.2.54
41
+ nvidia-curand-cu12==10.3.2.106
42
+ nvidia-cusolver-cu12==11.4.5.107
43
+ nvidia-cusparse-cu12==12.1.0.106
44
+ nvidia-nccl-cu12==2.19.3
45
+ nvidia-nvjitlink-cu12==12.3.101
46
+ nvidia-nvtx-cu12==12.1.105
47
+ packaging @ file:///home/conda/feedstock_root/build_artifacts/packaging_1696202382185/work
48
+ pandas==2.2.0
49
+ parso @ file:///home/conda/feedstock_root/build_artifacts/parso_1638334955874/work
50
+ pexpect @ file:///home/conda/feedstock_root/build_artifacts/pexpect_1706113125309/work
51
+ pickleshare @ file:///home/conda/feedstock_root/build_artifacts/pickleshare_1602536217715/work
52
+ pillow==10.2.0
53
+ platformdirs @ file:///home/conda/feedstock_root/build_artifacts/platformdirs_1706713388748/work
54
+ prompt-toolkit @ file:///home/conda/feedstock_root/build_artifacts/prompt-toolkit_1702399386289/work
55
+ psutil @ file:///work/perseverance-python-buildout/croot/psutil_1698863411559/work
56
+ ptyprocess @ file:///home/conda/feedstock_root/build_artifacts/ptyprocess_1609419310487/work/dist/ptyprocess-0.7.0-py2.py3-none-any.whl
57
+ pure-eval @ file:///home/conda/feedstock_root/build_artifacts/pure_eval_1642875951954/work
58
+ pyarrow==15.0.0
59
+ pyarrow-hotfix==0.6
60
+ Pygments @ file:///home/conda/feedstock_root/build_artifacts/pygments_1700607939962/work
61
+ python-dateutil @ file:///home/conda/feedstock_root/build_artifacts/python-dateutil_1626286286081/work
62
+ pytz==2024.1
63
+ PyYAML==6.0.1
64
+ pyzmq @ file:///croot/pyzmq_1705605076900/work
65
+ regex==2023.12.25
66
+ requests==2.31.0
67
+ safetensors==0.4.2
68
+ setuptools==68.2.2
69
+ six @ file:///home/conda/feedstock_root/build_artifacts/six_1620240208055/work
70
+ stack-data @ file:///home/conda/feedstock_root/build_artifacts/stack_data_1669632077133/work
71
+ sympy==1.12
72
+ tokenizers==0.15.1
73
+ torch==2.2.0
74
+ torchaudio==2.2.0+cu118
75
+ torchvision==0.17.0+cu118
76
+ tornado @ file:///work/perseverance-python-buildout/croot/tornado_1698866362018/work
77
+ tqdm==4.66.1
78
+ traitlets @ file:///home/conda/feedstock_root/build_artifacts/traitlets_1704212992681/work
79
+ transformers==4.37.2
80
+ triton==2.2.0
81
+ typing_extensions @ file:///home/conda/feedstock_root/build_artifacts/typing_extensions_1702176139754/work
82
+ tzdata==2023.4
83
+ urllib3==2.2.0
84
+ wcwidth @ file:///home/conda/feedstock_root/build_artifacts/wcwidth_1704731205417/work
85
+ wheel==0.41.2
86
+ xxhash==3.4.1
87
+ yarl==1.9.4
88
+ zipp @ file:///home/conda/feedstock_root/build_artifacts/zipp_1695255097490/work
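
A caveat on reproducing this environment: the @ file:///home/conda/... and @ file:///work/... entries are conda-build artifacts that resolve only on the machine the environment was exported from, so a plain pip install -r requirements.txt will fail on those lines elsewhere. A portable setup can keep just the PyPI pins that matter for this repo (transformers==4.37.2, torch==2.2.0, datasets==2.16.1, accelerate==0.26.1, tokenizers==0.15.1) and let pip resolve the remainder.
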