Insane model size on disk
#3
by
vetka925
- opened
Why does 13B model take up 55 GB of disk space?
The model is stored in full precision (fp32), where each parameter takes 4 bytes: 13 billion parameters × 4 bytes ≈ 52 GB, which (with some overhead) accounts for the ~55 GB on disk. Loading it in half precision (fp16, 2 bytes per parameter) roughly halves the memory footprint to ~27 GB.
```python
import torch
from transformers import AutoModelForCausalLM

# Load the weights in half precision (fp16) to roughly halve the memory footprint
model = AutoModelForCausalLM.from_pretrained(
    "inception-mbzuai/jais-13b-chat",
    torch_dtype=torch.float16,
)
```
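As a back-of-envelope check of the sizes above (assuming a round 13B parameter count; the real checkpoint is slightly larger, which explains the extra few GB):

```python
# Estimate model memory from parameter count and bytes per parameter.
n_params = 13e9  # assumed round count; the actual checkpoint is a bit larger

fp32_gb = n_params * 4 / 1e9  # fp32: 4 bytes per parameter
fp16_gb = n_params * 2 / 1e9  # fp16: 2 bytes per parameter

print(f"fp32: ~{fp32_gb:.0f} GB, fp16: ~{fp16_gb:.0f} GB")  # ~52 GB vs ~26 GB
```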
sunilitggu
changed discussion status to
closed