# GBI-16-2D-Legacy Dataset
GBI-16-2D-Legacy is a Hugging Face dataset wrapper around a compression dataset assembled by Maireles-González et al. (Publications of the Astronomical Society of the Pacific, 135:094502, September 2023; doi: https://doi.org/10.1088/1538-3873/acf6e0). It contains 226 FITS images from five different ground-based telescope/camera combinations, with a varying amount of entropy per image.
## Usage
You first need to install the `datasets`, `astropy`, and `Pillow` packages:

```bash
pip install datasets astropy Pillow
```
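As a quick sanity check that the dependencies resolved (a minimal sketch; the printed versions will vary with your environment):

```python
# All three packages should import without error after installation.
import datasets
import astropy
import PIL

print(datasets.__version__, astropy.__version__, PIL.__version__)
```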
There are two dataset configurations: `tiny` and `full`, each with `train` and `test` splits. The `tiny` configuration has five 2D images in the `train` split and one in the `test` split. The `full` configuration contains all the images in the `data/` directory.
## Local Use (RECOMMENDED)
You can clone this repo and use it directly without connecting to the Hugging Face Hub:

```bash
git clone https://huggingface.co/datasets/AnonAstroData/GBI-16-2D-Legacy
cd GBI-16-2D-Legacy
git lfs pull
```

Then start Python and load the dataset:
```python
from datasets import load_dataset

dataset = load_dataset("./GBI-16-2D-Legacy.py", "tiny", data_dir="./data/", writer_batch_size=1, trust_remote_code=True)
ds = dataset.with_format("np")
```
Now you should be able to use the `ds` variable like:

```python
ds["test"][0]["image"].shape  # -> (4200, 2154)
```
Note that downloading and converting the images into the local cache will take a long time for the `full` configuration. Afterward, usage should be quick, as the files are memory-mapped from disk. If you run into issues downloading the `full` configuration, try setting `num_proc` in `load_dataset` to a value greater than 1 (e.g., 5). You can also set `writer_batch_size` to roughly 10–20; see the sketch below.
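For example, a hypothetical invocation for the `full` configuration might look like this (the parameter values are starting points to tune, not requirements):

```python
from datasets import load_dataset

# num_proc parallelizes download/preparation; writer_batch_size batches Arrow writes.
dataset = load_dataset(
    "./GBI-16-2D-Legacy.py",
    "full",
    data_dir="./data/",
    num_proc=5,
    writer_batch_size=10,
    trust_remote_code=True,
)
```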
## Use from Hugging Face Directly
This method may only be practical for the `tiny` configuration. To use this data directly from the Hugging Face Hub, log in on the command line before starting Python:

```bash
huggingface-cli login
```

or, in Python:

```python
import huggingface_hub
huggingface_hub.login(token=token)  # token holds your Hugging Face access token
```

Then in your Python script:
```python
from datasets import load_dataset

dataset = load_dataset("AnonAstroData/GBI-16-2D-Legacy", "tiny", writer_batch_size=1, trust_remote_code=True)
ds = dataset.with_format("np")
```
## Demo Colab Notebook

We provide a demo Colab notebook to get started with the dataset here.
## Utils scripts

Note that utils scripts such as `eval_baselines.py` must be run from the parent directory of `utils/`, i.e. `python utils/eval_baselines.py`.