---
library_name: transformers
license: apache-2.0
base_model: google/vit-base-patch16-224-in21k
tags:
- image-classification
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: finetuned-fake-food
  results: []
---

# finetuned-fake-food

This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the indian_food_images dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2326
- Accuracy: 0.8987

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- training_steps: 9000
- mixed_precision_training: Native AMP
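The original training script is not included in this card. For reference, the sketch below shows one way to express the hyperparameters above with the `transformers` `Trainer` API; `train_ds`, `eval_ds`, and `num_labels` are hypothetical placeholders for the preprocessed indian_food_images splits and class count, which are not documented here, and the 100-step eval/logging cadence is inferred from the results table below.

```python
# Hedged reconstruction of the training setup from the listed hyperparameters;
# this is NOT the original script, just a sketch of an equivalent configuration.
import numpy as np
from transformers import (
    AutoModelForImageClassification,
    Trainer,
    TrainingArguments,
)

model = AutoModelForImageClassification.from_pretrained(
    "google/vit-base-patch16-224-in21k",
    num_labels=num_labels,  # placeholder: number of classes in the dataset
)

args = TrainingArguments(
    output_dir="finetuned-fake-food",
    learning_rate=2e-4,             # learning_rate: 0.0002
    per_device_train_batch_size=8,  # train_batch_size: 8
    per_device_eval_batch_size=8,   # eval_batch_size: 8
    seed=42,
    max_steps=9000,                 # training_steps: 9000
    lr_scheduler_type="linear",
    fp16=True,                      # mixed_precision_training: Native AMP
    eval_strategy="steps",
    eval_steps=100,                 # assumed from the 100-step cadence in the table
    logging_steps=100,
)
# The default AdamW optimizer already uses betas=(0.9, 0.999) and epsilon=1e-08,
# matching the optimizer line above.

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    return {"accuracy": (np.argmax(logits, axis=-1) == labels).mean()}

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=train_ds,  # placeholder: preprocessed training split
    eval_dataset=eval_ds,    # placeholder: preprocessed evaluation split
    compute_metrics=compute_metrics,
)
trainer.train()
```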
### Training results

| Training Loss | Epoch  | Step | Validation Loss | Accuracy |
|:-------------:|:------:|:----:|:---------------:|:--------:|
| 0.5991        | 0.0505 | 100  | 0.6129          | 0.7028   |
| 0.6593        | 0.1011 | 200  | 0.4338          | 0.8364   |
| 0.4908        | 0.1516 | 300  | 0.4490          | 0.8099   |
| 0.4756        | 0.2021 | 400  | 0.7639          | 0.7003   |
| 0.547         | 0.2527 | 500  | 0.4253          | 0.8335   |
| 0.4702        | 0.3032 | 600  | 0.3864          | 0.8446   |
| 0.5099        | 0.3537 | 700  | 0.4819          | 0.7755   |
| 0.5484        | 0.4042 | 800  | 0.3940          | 0.8264   |
| 0.6263        | 0.4548 | 900  | 0.6219          | 0.7118   |
| 0.5453        | 0.5053 | 1000 | 0.4548          | 0.7888   |
| 0.5431        | 0.5558 | 1100 | 0.4210          | 0.8084   |
| 0.5678        | 0.6064 | 1200 | 0.4946          | 0.8038   |
| 0.3266        | 0.6569 | 1300 | 0.4538          | 0.8264   |
| 0.4225        | 0.7074 | 1400 | 0.4366          | 0.8088   |
| 0.32          | 0.7580 | 1500 | 0.5586          | 0.7884   |
| 0.473         | 0.8085 | 1600 | 0.4805          | 0.7974   |
| 0.4557        | 0.8590 | 1700 | 0.3707          | 0.8371   |
| 0.408         | 0.9096 | 1800 | 0.4968          | 0.7999   |
| 0.4979        | 0.9601 | 1900 | 0.4432          | 0.7898   |
| 0.4115        | 1.0106 | 2000 | 0.3722          | 0.8392   |
| 0.3421        | 1.0611 | 2100 | 0.5450          | 0.7401   |
| 0.5165        | 1.1117 | 2200 | 0.4611          | 0.7988   |
| 0.4066        | 1.1622 | 2300 | 0.3226          | 0.8725   |
| 0.5085        | 1.2127 | 2400 | 0.5858          | 0.7762   |
| 0.4814        | 1.2633 | 2500 | 0.3981          | 0.7766   |
| 0.4554        | 1.3138 | 2600 | 0.5076          | 0.7816   |
| 0.2816        | 1.3643 | 2700 | 0.4732          | 0.8127   |
| 0.2516        | 1.4149 | 2800 | 0.4315          | 0.8074   |
| 0.2903        | 1.4654 | 2900 | 0.3845          | 0.8557   |
| 0.3493        | 1.5159 | 3000 | 0.4921          | 0.7977   |
| 0.4251        | 1.5664 | 3100 | 0.3855          | 0.8231   |
| 0.3356        | 1.6170 | 3200 | 0.4012          | 0.8328   |
| 0.3597        | 1.6675 | 3300 | 0.3308          | 0.8496   |
| 0.257         | 1.7180 | 3400 | 0.4104          | 0.8138   |
| 0.3709        | 1.7686 | 3500 | 0.2769          | 0.8879   |
| 0.3393        | 1.8191 | 3600 | 0.3412          | 0.8643   |
| 0.4151        | 1.8696 | 3700 | 0.3078          | 0.8747   |
| 0.3043        | 1.9202 | 3800 | 0.3424          | 0.8650   |
| 0.3302        | 1.9707 | 3900 | 0.3513          | 0.8335   |
| 0.4033        | 2.0212 | 4000 | 0.3371          | 0.8511   |
| 0.3386        | 2.0718 | 4100 | 0.3402          | 0.8396   |
| 0.3661        | 2.1223 | 4200 | 0.3277          | 0.8561   |
| 0.2914        | 2.1728 | 4300 | 0.3065          | 0.8650   |
| 0.4444        | 2.2233 | 4400 | 0.3207          | 0.8493   |
| 0.2922        | 2.2739 | 4500 | 0.2968          | 0.8686   |
| 0.3464        | 2.3244 | 4600 | 0.4151          | 0.8070   |
| 0.2684        | 2.3749 | 4700 | 0.3810          | 0.8385   |
| 0.3779        | 2.4255 | 4800 | 0.3368          | 0.8514   |
| 0.4462        | 2.4760 | 4900 | 0.2677          | 0.8965   |
| 0.3766        | 2.5265 | 5000 | 0.3732          | 0.8439   |
| 0.4971        | 2.5771 | 5100 | 0.3266          | 0.8618   |
| 0.3795        | 2.6276 | 5200 | 0.3380          | 0.8607   |
| 0.4205        | 2.6781 | 5300 | 0.3436          | 0.8618   |
| 0.3652        | 2.7287 | 5400 | 0.3483          | 0.8518   |
| 0.3999        | 2.7792 | 5500 | 0.2603          | 0.8908   |
| 0.2909        | 2.8297 | 5600 | 0.3080          | 0.8693   |
| 0.3703        | 2.8802 | 5700 | 0.2950          | 0.8808   |
| 0.4048        | 2.9308 | 5800 | 0.3191          | 0.8500   |
| 0.3333        | 2.9813 | 5900 | 0.3773          | 0.8443   |
| 0.2917        | 3.0318 | 6000 | 0.3731          | 0.8432   |
| 0.4204        | 3.0824 | 6100 | 0.3783          | 0.8528   |
| 0.3832        | 3.1329 | 6200 | 0.3009          | 0.8693   |
| 0.32          | 3.1834 | 6300 | 0.3690          | 0.8367   |
| 0.3761        | 3.2340 | 6400 | 0.3398          | 0.8392   |
| 0.4041        | 3.2845 | 6500 | 0.2726          | 0.8761   |
| 0.3373        | 3.3350 | 6600 | 0.3735          | 0.8285   |
| 0.2869        | 3.3855 | 6700 | 0.2326          | 0.8987   |
| 0.3381        | 3.4361 | 6800 | 0.2562          | 0.8933   |
| 0.2193        | 3.4866 | 6900 | 0.2605          | 0.8912   |
| 0.2685        | 3.5371 | 7000 | 0.2592          | 0.8822   |
| 0.2867        | 3.5877 | 7100 | 0.3182          | 0.8636   |
| 0.318         | 3.6382 | 7200 | 0.2988          | 0.8743   |
| 0.3088        | 3.6887 | 7300 | 0.2870          | 0.8768   |
| 0.3531        | 3.7393 | 7400 | 0.2924          | 0.8697   |
| 0.2605        | 3.7898 | 7500 | 0.2942          | 0.8704   |
| 0.419         | 3.8403 | 7600 | 0.3634          | 0.8485   |
| 0.264         | 3.8909 | 7700 | 0.2996          | 0.8629   |
| 0.2349        | 3.9414 | 7800 | 0.2417          | 0.8937   |
| 0.2726        | 3.9919 | 7900 | 0.3228          | 0.8518   |
| 0.3398        | 4.0424 | 8000 | 0.2684          | 0.8897   |
| 0.1933        | 4.0930 | 8100 | 0.2657          | 0.8919   |
| 0.435         | 4.1435 | 8200 | 0.2455          | 0.8972   |
| 0.2373        | 4.1940 | 8300 | 0.2929          | 0.8690   |
| 0.3151        | 4.2446 | 8400 | 0.2745          | 0.8761   |
| 0.2258        | 4.2951 | 8500 | 0.2486          | 0.8922   |
| 0.2592        | 4.3456 | 8600 | 0.2696          | 0.8801   |
| 0.2301        | 4.3962 | 8700 | 0.2719          | 0.8811   |
| 0.1388        | 4.4467 | 8800 | 0.2617          | 0.8879   |
| 0.3242        | 4.4972 | 8900 | 0.2543          | 0.8915   |
| 0.1693        | 4.5478 | 9000 | 0.2602          | 0.8879   |

### Framework versions

- Transformers 4.44.2
- Pytorch 2.4.1+cu121
- Tokenizers 0.19.1
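## How to use

A minimal inference sketch using the `transformers` pipeline API. The repo id `your-username/finetuned-fake-food` is a placeholder; substitute the actual Hub id of this checkpoint, or a local path to the saved model directory.

```python
from transformers import pipeline

# "your-username/finetuned-fake-food" is a placeholder repo id, not the
# confirmed location of this checkpoint.
classifier = pipeline("image-classification", model="your-username/finetuned-fake-food")

# The pipeline accepts a local file path, a URL, or a PIL.Image.
predictions = classifier("example_dish.jpg")  # placeholder image path
for p in predictions:
    print(f"{p['label']}: {p['score']:.3f}")
```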