---
license: cc-by-4.0
pretty_name: Ground-based 2d images assembled in Maireles-González et al.
tags:
  - astronomy
  - compression
  - images
dataset_info:
  - config_name: full
    features:
      - name: image
        dtype:
          image:
            mode: I;16
      - name: telescope
        dtype: string
      - name: image_id
        dtype: string
    splits:
      - name: train
        num_bytes: 3509045373
        num_examples: 120
      - name: test
        num_bytes: 970120060
        num_examples: 32
    download_size: 2240199274
    dataset_size: 4479165433
  - config_name: tiny
    features:
      - name: image
        dtype:
          image:
            mode: I;16
      - name: telescope
        dtype: string
      - name: image_id
        dtype: string
    splits:
      - name: train
        num_bytes: 307620695
        num_examples: 10
      - name: test
        num_bytes: 168984698
        num_examples: 5
    download_size: 238361934
    dataset_size: 476605393
---

# GBI-16-2D-Legacy Dataset

GBI-16-2D-Legacy is a Hugging Face dataset wrapper around a compression dataset assembled by Maireles-González et al. (Publications of the Astronomical Society of the Pacific, 135:094502, September 2023; doi: https://doi.org/10.1088/1538-3873/acf6e0). It contains 226 FITS images from 5 different ground-based telescope/camera combinations, with varying amounts of entropy per image.

## Usage

You first need to install the `datasets`, `astropy`, and `Pillow` packages:

```bash
pip install datasets astropy Pillow
```

There are two configurations, `tiny` and `full`, each with `train` and `test` splits. The `tiny` configuration contains 10 2D images in the train split and 5 in the test split; the `full` configuration contains all of the images in the `data/` directory.
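Since the underlying files are plain FITS images, you can also inspect one directly with `astropy` before going through the `datasets` loader. This is a minimal sketch; the file name is a hypothetical placeholder, so substitute any FITS file that actually exists under `data/`:

```python
from astropy.io import fits

# Hypothetical path -- use any FITS file present under data/
with fits.open("data/example_image.fits") as hdul:
    hdul.info()          # list the HDUs in the file
    data = hdul[0].data  # 2D pixel array (check hdul.info() if the image lives in another HDU)
    print(data.shape, data.dtype)
```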

## Local Use (RECOMMENDED)

You can clone this repo and use it directly without connecting to Hugging Face:

```bash
git clone https://huggingface.co/datasets/AnonAstroData/GBI-16-2D-Legacy
cd GBI-16-2D-Legacy
git lfs pull
```

Then start Python from inside the repo:

```python
from datasets import load_dataset

dataset = load_dataset(
    "./GBI-16-2D-Legacy.py",
    "tiny",
    data_dir="./data/",
    writer_batch_size=1,
    trust_remote_code=True,
)
ds = dataset.with_format("np")
```

Now you should be able to use the `ds` variable like:

ds["test"][0]["image"].shape # -> (4200, 2154)

Note that it will take a long time to download and convert the images into the local cache for the `full` configuration. Afterward, usage should be quick because the files are memory-mapped from disk. If you run into issues downloading the `full` configuration, try setting `num_proc` in `load_dataset` to a value greater than 1 (e.g. 5). You can also raise `writer_batch_size` to roughly 10-20.
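A sketch of loading the `full` configuration with those knobs turned up; the exact values are illustrative, not tuned:

```python
from datasets import load_dataset

# Illustrative values -- tune num_proc and writer_batch_size for your machine
dataset = load_dataset(
    "./GBI-16-2D-Legacy.py",
    "full",
    data_dir="./data/",
    num_proc=5,            # prepare the dataset with several worker processes
    writer_batch_size=10,  # write larger batches to the Arrow cache
    trust_remote_code=True,
)
```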

## Use from Hugging Face Directly

This method may only be a realistic option when accessing the `tiny` version of the dataset.

To use this data directly from Hugging Face, you'll want to log in on the command line before starting Python:

```bash
huggingface-cli login
```

or

```python
import huggingface_hub
huggingface_hub.login(token=token)
```

Then in your Python script:

```python
from datasets import load_dataset

dataset = load_dataset(
    "AstroCompress/GBI-16-2D-Legacy",
    "tiny",
    writer_batch_size=1,
    trust_remote_code=True,
)
ds = dataset.with_format("np")
```
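As a quick check that the remote load worked, you can print the splits and inspect one example (the field names follow the dataset metadata above):

```python
print(dataset)                        # shows the train/test splits and column names
print(ds["train"][0]["image"].shape)  # dimensions of the first training image
print(ds["train"][0]["telescope"])    # telescope/camera that produced it
```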

## Demo Colab Notebook

We provide a demo Colab notebook to get started with the dataset here.

## Utils scripts

Note that utils scripts such as `eval_baselines.py` must be run from the parent directory of `utils/`, i.e. `python utils/eval_baselines.py`.