---
license: cc-by-4.0
pretty_name: Ground-based 2d images assembled in Maireles-González et al.
tags:
- astronomy
- compression
- images
dataset_info:
- config_name: full
  features:
  - name: image
    dtype:
      image:
        mode: I;16
  - name: telescope
    dtype: string
  - name: image_id
    dtype: string
  splits:
  - name: train
    num_bytes: 3509045373
    num_examples: 120
  - name: test
    num_bytes: 970120060
    num_examples: 32
  download_size: 2240199274
  dataset_size: 4479165433
- config_name: tiny
  features:
  - name: image
    dtype:
      image:
        mode: I;16
  - name: telescope
    dtype: string
  - name: image_id
    dtype: string
  splits:
  - name: train
    num_bytes: 307620695
    num_examples: 10
  - name: test
    num_bytes: 168984698
    num_examples: 5
  download_size: 238361934
  dataset_size: 476605393
---

# GBI-16-2D-Legacy Dataset

GBI-16-2D-Legacy is a Hugging Face `dataset` wrapper around a compression dataset assembled by Maireles-González et al. (Publications of the Astronomical Society of the Pacific, 135:094502, September 2023; doi: [10.1088/1538-3873/acf6e0](https://doi.org/10.1088/1538-3873/acf6e0)). It contains 226 FITS images from five different ground-based telescope/camera combinations, with varying amounts of entropy per image.

# Usage

You first need to install the `datasets` and `astropy` packages:

```bash
pip install datasets astropy Pillow
```

There are two configurations: `tiny` and `full`, each with `train` and `test` splits. Per the `dataset_info` above, the `tiny` configuration has 10 2D images in `train` and 5 in `test`. The `full` configuration contains all the images in the `data/` directory.

## Local Use (RECOMMENDED)

You can clone this repo and use it directly without connecting to the Hugging Face Hub:

```bash
git clone https://huggingface.co/datasets/AnonAstroData/GBI-16-2D-Legacy
cd GBI-16-2D-Legacy
git lfs pull
```

Then start Python:

```python
from datasets import load_dataset

# Load the `tiny` config via the local loader script; writer_batch_size=1 keeps
# memory usage low while the FITS images are converted into the local cache.
dataset = load_dataset("./GBI-16-2D-Legacy.py", "tiny", data_dir="./data/",
                       writer_batch_size=1, trust_remote_code=True)
ds = dataset.with_format("np")
```

Now you should be able to use the `ds` variable like:

```python
ds["test"][0]["image"].shape # -> (4200, 2154)
```
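
The metadata above lists the image mode as `I;16`, so with the NumPy format the pixels should come back as 16-bit integers. A minimal sanity check (the exact dtype can depend on your `datasets`/Pillow versions):

```python
img = ds["test"][0]["image"]
print(img.dtype, img.min(), img.max())  # expect a 16-bit integer dtype
```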

Note that downloading and converting the images into the local cache takes a long time for the `full` dataset. Afterward, usage should be quick, as the files are memory-mapped from disk. If you run into issues downloading the `full` dataset, try setting `num_proc` in `load_dataset` to a value greater than 1 (e.g., 5). You can also raise `writer_batch_size` to roughly 10-20.
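
For example, a sketch combining both suggestions for the `full` config (the values here are illustrative, not tuned):

```python
from datasets import load_dataset

# num_proc parallelizes download/preparation; writer_batch_size batches the
# cache writes. Tune both for your machine.
dataset = load_dataset("./GBI-16-2D-Legacy.py", "full", data_dir="./data/",
                       num_proc=5, writer_batch_size=10, trust_remote_code=True)
```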



## Use from Hugging Face Directly

This method is likely only practical for the `tiny` configuration of the dataset.

To use this data directly from the Hugging Face Hub, log in on the command line before starting Python:

```bash
huggingface-cli login
```

or, from within Python:

```python
import huggingface_hub
huggingface_hub.login(token=token)  # `token` is your Hugging Face access token
```

Then in your python script:

```python
from datasets import load_dataset

# Load the `tiny` config from the Hub; the repo ships its own loader script,
# hence trust_remote_code=True.
dataset = load_dataset("AstroCompress/GBI-16-2D-Legacy", "tiny",
                       writer_batch_size=1, trust_remote_code=True)
ds = dataset.with_format("np")
```
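
The example fields follow the `dataset_info` in the header above (`image`, `telescope`, `image_id`):

```python
example = ds["train"][0]
print(example["telescope"], example["image_id"], example["image"].shape)
```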


## Demo Colab Notebook
We provide a demo Colab notebook to get started with the dataset [here](https://colab.research.google.com/drive/1wcz7qMqSAMST2kXFlL-TbwpYR26gYIYy?usp=sharing).


## Utility scripts
Note that utility scripts such as `eval_baselines.py` must be run from the parent directory of `utils`, i.e. `python utils/eval_baselines.py`.