How to freeze layers when fine-tuning

#172
by Sarwg - opened

Hi everyone,
I’m trying to fine-tune the model, and I’d like to freeze some of the layers, but I’m not sure how to do it.

Thanks for your help! :)

For example, if you'd like to freeze the encoder, you can do:

```python
# Freeze the encoder
for param in model.model.encoder.parameters():
    param.requires_grad = False
```
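If you only want to freeze part of the encoder (say, the lower layers), you can filter on the parameter names instead. Here's a minimal sketch using a toy `nn.ModuleDict` as a stand-in, since the actual layer names and depth depend on your model (check `model.named_parameters()` to see them):

```python
import torch.nn as nn

# Toy stand-in for a model with named encoder/decoder layers
# (real parameter names vary by model architecture).
model = nn.ModuleDict({
    "encoder": nn.ModuleList([nn.Linear(4, 4) for _ in range(6)]),
    "decoder": nn.ModuleList([nn.Linear(4, 4) for _ in range(2)]),
})

# Freeze only the first 4 encoder layers by matching the name prefix.
prefixes_to_freeze = [f"encoder.{i}." for i in range(4)]
for name, param in model.named_parameters():
    if any(name.startswith(prefix) for prefix in prefixes_to_freeze):
        param.requires_grad = False
```

The same name-prefix trick works on a real Transformers model; you just swap in the prefixes that show up in its `named_parameters()` output.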

You can also count the total trainable and frozen parameters:

```python
trainable_params = 0
frozen_params = 0

for name, param in model.named_parameters():
    if param.requires_grad:
        trainable_params += param.numel()
        print(f"Trainable: {name} | shape: {param.shape}")
    else:
        frozen_params += param.numel()
        # print(f"Frozen: {name} | shape: {param.shape}")

total_params = trainable_params + frozen_params

print("Total trainable parameters:", trainable_params)
print("Total frozen parameters:", frozen_params)
print("Total parameters:", total_params)
print("Ratio of trainable params to total params:", trainable_params / total_params)
```
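One related detail when training with frozen layers: you can pass only the trainable parameters to the optimizer. PyTorch optimizers won't update parameters that never receive gradients anyway, but filtering keeps the optimizer state smaller. A minimal sketch with a toy two-layer model (the model, layer sizes, and learning rate here are just placeholders):

```python
import torch
import torch.nn as nn

# Toy model: freeze the first layer, train the second.
model = nn.Sequential(nn.Linear(4, 4), nn.Linear(4, 2))
for param in model[0].parameters():
    param.requires_grad = False

# Give the optimizer only the parameters that still require gradients.
optimizer = torch.optim.AdamW(
    (p for p in model.parameters() if p.requires_grad), lr=1e-4
)

# Snapshot weights so we can see which layer actually changes.
w0_before = model[0].weight.clone()
w1_before = model[1].weight.clone()

# One dummy training step: only the second layer gets updated.
loss = model(torch.randn(8, 4)).sum()
loss.backward()
optimizer.step()
```

After the step, `model[0].weight` is unchanged while `model[1].weight` has moved, which confirms the freeze worked end to end.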

Thanks a lot, I'll try this as soon as possible :)
