---
language: en
tags:
- torchtune
---

# My Torchtune Model 

This model is a fine-tuned version of [None](https://huggingface.co/None).

# Model description

More information needed

# Training and evaluation results

More information needed

# Training procedure

This model was trained with the [torchtune](https://github.com/pytorch/torchtune) library using the following command:

```bash
/Users/salmanmohammadi/projects/torchtune/recipes/lora_finetune_single_device.py \
    --config /Users/salmanmohammadi/projects/torchtune/recipes/configs/gemma/2B_lora_single_device.yaml \
    device=mps \
    epochs=1 \
    max_steps_per_epoch=10
```
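With a standard torchtune installation, an equivalent run can typically be launched through the `tune run` CLI instead of invoking the recipe script by its filesystem path. This is a sketch assuming the packaged `lora_finetune_single_device` recipe and the bundled `gemma/2B_lora_single_device` config correspond to the local files used above:

```bash
# Launch the packaged LoRA single-device recipe with the bundled Gemma 2B config,
# passing the same overrides as the original command.
tune run lora_finetune_single_device \
    --config gemma/2B_lora_single_device \
    device=mps \
    epochs=1 \
    max_steps_per_epoch=10
```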

# Framework versions

- torchtune 0.0.0
- torchao 0.5.0
- datasets 2.20.0
- sentencepiece 0.2.0