
Mean and standard deviation of the latitude/longitude values:

  • lat_mean: 39.95177538047139
  • lat_std: 0.000688423824245344
  • lon_mean: -75.19147811784511
  • lon_std: 0.0006632296829719546
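
If these statistics were used to z-score normalize the coordinate targets during training (an assumption; the card does not state this explicitly), predictions can be mapped back to degrees with a small helper like this sketch:

```python
# Assumed denormalization: pred * std + mean for each coordinate.
lat_mean, lat_std = 39.95177538047139, 0.000688423824245344
lon_mean, lon_std = -75.19147811784511, 0.0006632296829719546

def denormalize(pred_lat: float, pred_lon: float) -> tuple[float, float]:
    """Convert a normalized (lat, lon) prediction back to raw degrees."""
    return pred_lat * lat_std + lat_mean, pred_lon * lon_std + lon_mean
```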

Implemented a ResNet50-based model using PyTorch:

```python
import torch
import torch.nn as nn
from torchvision.models import resnet50

class CustomResNet50(nn.Module):
    def __init__(self, num_classes=2):
        super().__init__()
        # ResNet50 backbone without pretrained weights; the final fc layer is
        # replaced so the network outputs two values (latitude, longitude).
        self.model = resnet50(pretrained=False)
        num_features = self.model.fc.in_features
        self.model.fc = nn.Linear(num_features, num_classes)

    def forward(self, x):
        return self.model(x)
```
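
As a quick sanity check, the class above can be instantiated and run on a random batch (a minimal sketch; the 224×224 input size is the usual ResNet50 default, not something the card specifies):

```python
model = CustomResNet50(num_classes=2)
model.eval()

dummy = torch.randn(1, 3, 224, 224)   # one RGB image at the standard ResNet input size
with torch.no_grad():
    out = model(dummy)                # shape (1, 2): normalized (latitude, longitude)
print(out.shape)
```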

Run the following code to access the model:

```python
from huggingface_hub import hf_hub_download
import torch
import torch.nn as nn
from torchvision.models import resnet50

repo_id = "ImageGPSProj/ResNet50Model"
filename = "custom_resnet50.pth"
model_path = hf_hub_download(repo_id=repo_id, filename=filename)

# Re-instantiate the architecture
loaded_model = resnet50(pretrained=False)
num_features = loaded_model.fc.in_features
loaded_model.fc = nn.Linear(num_features, 2)

# Load the state_dict
state_dict = torch.load(model_path, map_location=torch.device('cpu'))
loaded_model.load_state_dict(state_dict)

loaded_model.eval()
```
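
Putting the pieces together, an end-to-end inference sketch might look like the following. The preprocessing (224×224 resize, ImageNet normalization), the `example.jpg` path, and the final denormalization step are assumptions about the training pipeline, not something this card documents:

```python
from PIL import Image
from torchvision import transforms

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],   # ImageNet stats (assumed)
                         std=[0.229, 0.224, 0.225]),
])

image = Image.open("example.jpg").convert("RGB")        # hypothetical input image
batch = preprocess(image).unsqueeze(0)

with torch.no_grad():
    pred = loaded_model(batch)[0]                       # normalized (lat, lon)

# Assumed denormalization using the statistics listed above.
lat = pred[0].item() * 0.000688423824245344 + 39.95177538047139
lon = pred[1].item() * 0.0006632296829719546 + (-75.19147811784511)
print(lat, lon)
```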

```yaml
dataset_info:
  features:
    - name: image
      dtype: image
    - name: Latitude
      dtype: float64
    - name: Longitude
      dtype: float64
  splits:
    - name: train
      num_bytes: 6747451504
      num_examples: 825
    - name: test
      num_bytes: 928890377
      num_examples: 105
    - name: val
      num_bytes: 791887265
      num_examples: 102
  download_size: 7405818019
  dataset_size: 8468229146
configs:
  - config_name: default
    data_files:
      - split: train
        path: data/train-*
      - split: test
        path: data/test-*
      - split: val
        path: data/val-*
```
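
The splits described above can be loaded with the `datasets` library. The repository id below is a placeholder, since the card does not name the dataset repository:

```python
from datasets import load_dataset

# "ImageGPSProj/<dataset-repo>" is a placeholder -- substitute the real dataset repo id.
ds = load_dataset("ImageGPSProj/<dataset-repo>")
print(ds)                     # DatasetDict with train / test / val splits
example = ds["train"][0]      # {'image': PIL.Image, 'Latitude': float, 'Longitude': float}
```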