luisroque committed on
Commit 6c96549
1 Parent(s): 0659855

use the datasets package to get the datasets

Files changed (1)
  1. README.md +14 -27
README.md CHANGED
@@ -48,23 +48,13 @@ The prediction data has a similar structure to the training data and is used for
 Below is an example of how to load and use the datasets using the `datasets` library:
 
 ```python
-import pickle
+from datasets import load_dataset
 
-def load_pickle(file_path):
-    with open(file_path, 'rb') as file:
-        data = pickle.load(file)
-    return data
-
-# Paths to your datasets
-m5_path = 'path/to/m5.pkl'
-police_path = 'path/to/police.pkl'
-prison_path = 'path/to/prison.pkl'
-tourism_path = 'path/to/tourism.pkl'
-
-m5_data = load_pickle(m5_path)
-police_data = load_pickle(police_path)
-prison_data = load_pickle(prison_path)
-tourism_data = load_pickle(tourism_path)
+# Load the datasets from Hugging Face Hub
+m5_data = load_dataset('zaai-ai/hierarchical_time_series_datasets', 'm5')
+police_data = load_dataset('zaai-ai/hierarchical_time_series_datasets', 'police')
+prison_data = load_dataset('zaai-ai/hierarchical_time_series_datasets', 'prison')
+tourism_data = load_dataset('zaai-ai/hierarchical_time_series_datasets', 'tourism')
 
 # Example: Accessing specific data from the datasets
 print("M5 Data:", m5_data)
@@ -73,14 +63,14 @@ print("Prison Data:", prison_data)
 print("Tourism Data:", tourism_data)
 
 # Access the training data
-train_data = prison_data["train"]
+train_data = prison_data['train']
 
 # Access the prediction data
-predict_data = prison_data["predict"]
+predict_data = prison_data['predict']
 
 # Example: Extracting x_values and data
-x_values = train_data["x_values"]
-data = train_data["data"]
+x_values = train_data['x_values']
+data = train_data['data']
 
 print(f"x_values: {x_values}")
 print(f"data shape: {data.shape}")
@@ -88,14 +78,11 @@ print(f"data shape: {data.shape}")
 
 ### Steps to Follow:
 
-1. **Clone the Repository:**
+1. **Install the datasets library:**
    ```sh
-   git clone https://huggingface.co/datasets/zaai-ai/hierarchical_time_series_datasets.git
-   cd hierarchical_time_series_datasets
+   pip install datasets
    ```
-2. **Update the File Paths:**
-   - Ensure the paths to the .pkl files are correct in your Python script.
+2. **Load the Datasets:**
+   - Use the `load_dataset` function from the `datasets` library to load the datasets directly from the Hugging Face Hub.
 
-3. **Load the Datasets:**
-   - Use the `pickle` library in Python to load the `.pkl` files.
 
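For readers following the updated README, here is a minimal end-to-end sketch of the new workflow after `pip install datasets`. It assumes the `prison` configuration and the `train`/`predict` split names with `x_values` and `data` fields shown in the diff above; the NumPy conversion is added here only so that the shape inspection works on the column returned by the `datasets` library.

```python
# Minimal sketch of the updated workflow (after `pip install datasets`).
# Assumes the 'prison' config and the 'train'/'predict' splits with
# 'x_values' and 'data' fields, as described in the README excerpt above.
from datasets import load_dataset
import numpy as np

prison_data = load_dataset('zaai-ai/hierarchical_time_series_datasets', 'prison')

train_data = prison_data['train']      # training split
predict_data = prison_data['predict']  # prediction split (name assumed from the README)

x_values = train_data['x_values']      # column access on a Dataset returns a list
data = np.asarray(train_data['data'])  # convert to NumPy so .shape is available

print(f"x_values: {x_values}")
print(f"data shape: {data.shape}")
```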