Habana
regisss committed
Commit b05fbed
1 Parent(s): f785e89

Change usage section

Files changed (1)
  1. README.md +26 -19
README.md CHANGED
@@ -23,23 +23,30 @@ This enables to specify:
 ## Usage
 
 The model is instantiated the same way as in the Transformers library.
-The only difference is that there are a few new training arguments specific to HPUs:
-
-```
-from optimum.habana import GaudiTrainer, GaudiTrainingArguments
-from transformers import ViTForImageClassification
-
-model = ViTForImageClassification.from_pretrained('google/vit-base-patch16-224-in21k')
-args = GaudiTrainingArguments(
-    output_dir="/tmp/output_dir",
-    use_habana=True,
-    use_lazy_mode=True,
-    gaudi_config_name="Habana/vit",
-)
-
-trainer = GaudiTrainer(
-    model=model,
-    args=args,
-)
-trainer.train()
+The only difference is that there are a few new training arguments specific to HPUs.
+
+[Here](https://github.com/huggingface/optimum-habana/blob/main/examples/image-classification/run_image_classification.py) is an image classification example script to fine-tune a model. You can run it on ViT with the following command:
+```bash
+python run_image_classification.py \
+    --model_name_or_path google/vit-base-patch16-224-in21k \
+    --dataset_name cifar10 \
+    --output_dir /tmp/outputs/ \
+    --remove_unused_columns False \
+    --do_train \
+    --do_eval \
+    --learning_rate 2e-5 \
+    --num_train_epochs 5 \
+    --per_device_train_batch_size 64 \
+    --per_device_eval_batch_size 64 \
+    --evaluation_strategy epoch \
+    --save_strategy epoch \
+    --load_best_model_at_end True \
+    --save_total_limit 3 \
+    --seed 1337 \
+    --use_habana \
+    --use_lazy_mode \
+    --gaudi_config_name Habana/vit \
+    --throughput_warmup_steps 2
 ```
+
+Check out the [documentation](https://huggingface.co/docs/optimum/habana/index) for more advanced usage and examples.
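For anyone who still wants the programmatic route this commit drops from the README, below is a minimal sketch of the same setup, reconstructed from the removed snippet with imports and comments added. Note that, like the original snippet, it omits the `train_dataset` an actual fine-tuning run would pass to `GaudiTrainer`.

```python
# Minimal sketch of the programmatic API from the removed README snippet.
# Assumes optimum-habana and transformers are installed and an HPU is available.
from optimum.habana import GaudiTrainer, GaudiTrainingArguments
from transformers import ViTForImageClassification

# Load the pretrained ViT checkpoint from the Hub.
model = ViTForImageClassification.from_pretrained("google/vit-base-patch16-224-in21k")

# HPU-specific arguments: use_habana targets Gaudi devices, use_lazy_mode
# enables lazy-mode execution, and gaudi_config_name points to the Gaudi
# configuration stored in the Habana/vit repository on the Hub.
args = GaudiTrainingArguments(
    output_dir="/tmp/output_dir",
    use_habana=True,
    use_lazy_mode=True,
    gaudi_config_name="Habana/vit",
)

trainer = GaudiTrainer(
    model=model,
    args=args,
    # train_dataset=...,  # supply a dataset here for an actual fine-tuning run
)
trainer.train()
```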