|
--- |
|
license: apache-2.0 |
|
widget: |
|
- text: My name is El Microondas the Wise, and |
|
example_title: El Microondas |
|
- text: Kennesaw State University is a public |
|
example_title: Kennesaw State University |
|
- text: Bungie Studios is an American video game developer. They are most famous for |
|
developing the award winning Halo series of video games. They also made Destiny. |
|
The studio was founded |
|
example_title: Bungie |
|
- text: The Mona Lisa is a world-renowned painting created by |
|
example_title: Mona Lisa |
|
- text: The Harry Potter series, written by J.K. Rowling, begins with the book titled |
|
example_title: Harry Potter Series |
|
- text: 'Question: I have cities, but no houses. I have mountains, but no trees. I |
|
have water, but no fish. What am I? |
|
|
|
Answer:' |
|
example_title: Riddle |
|
- text: The process of photosynthesis involves the conversion of |
|
example_title: Photosynthesis |
|
- text: Jane went to the store to buy some groceries. She picked up apples, oranges, |
|
and a loaf of bread. When she got home, she realized she forgot |
|
example_title: Story Continuation |
|
- text: 'Problem 2: If a train leaves Station A at 9:00 AM and travels at 60 mph, |
|
and another train leaves Station B at 10:00 AM and travels at 80 mph, when will |
|
they meet if the distance between the stations is 300 miles? |
|
|
|
To determine' |
|
example_title: Math Problem |
|
- text: In the context of computer programming, an algorithm is |
|
example_title: Algorithm Definition |
|
--- |
|
# Mixsmol-4x400M-v0.1 by Ontocord |
|
This is the third checkpoint (Epoch 3) of Mixsmol-4x400M-v0.1.
|
Note that this is an experiment in data mixing. We therefore trained the model on only 50B tokens (95% English, 5% Vietnamese) to test the following:
|
- Reasoning capabilities through pretraining on high-quality synthetic textbook data
|
- Cross-lingual understanding through machine translation and multilingual, multi-task pretraining
|
|
|
After verifying our hypotheses with this run, we will schedule a second run with more data and compute so the model can reach its full capability.
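For readers who want to try the checkpoint, here is a minimal inference sketch using 🤗 Transformers. The repo id `ontocord/Mixsmol-4x400M-v0.1` is an assumption inferred from the model name and organization; verify the actual id on the Hub before use.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed repo id (inferred from the model name); check the Hub for the real one.
MODEL_ID = "ontocord/Mixsmol-4x400M-v0.1"

def generate(prompt: str, max_new_tokens: int = 64) -> str:
    """Greedy-decode a continuation of `prompt`.

    Downloads the tokenizer and weights on first call.
    """
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID)
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)

if __name__ == "__main__":
    # One of the widget prompts from this card.
    print(generate("Kennesaw State University is a public"))
```

Because this is an early 50B-token checkpoint, expect raw base-model behavior: the model continues text rather than following instructions, as the widget examples above illustrate.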
|
|
|
## Data |
|
- Synthetic Textbooks: 8M samples |
|
- RefinedWeb: 1M samples |
|
- RedPajama-v2: 500K samples |
|
- MathPile: Everything |
|
- ThePile: MiniPile Subset |
|
- GoodWiki |
|
- The Stack Smol XL |
|
- The Vault: train_small split |
|
- Instruction Pretraining: 250k samples |