Questions about aws configure

#7
by Riddlevv - opened

Hi, I keep getting the error 'botocore.exceptions.ClientError: An error occurred (InvalidAccessKeyId) when calling the GetObject operation: The AWS Access Key Id you provided does not exist in our records.' when using this dataset. I have tried every method I could find on Stack Overflow but still cannot figure it out. Many people seem to have the same problem and have not been able to resolve it either.

Also, I think there are some blob_ids that don't exist in Software Heritage, which causes an error when mapping over the dataset. For example, "s3://softwareheritage/content/aff1a9263e183610f403a4d6a7f27b45eacb7ff2" doesn't exist.

If these errors were caused by incorrect usage on my part, please let me know.

Thanks very much🤗

BigCode org

Hi, it turns out you don't need AWS credentials to access a public bucket such as Software Heritage. This works:

import boto3
from botocore import UNSIGNED
from botocore.client import Config
from smart_open import open
from datasets import load_dataset


# UNSIGNED makes anonymous (credential-free) requests, which is all a
# public bucket needs -- this avoids the InvalidAccessKeyId error.
s3 = boto3.client('s3', region_name='us-east-1', config=Config(signature_version=UNSIGNED))

def download_contents(blob_id, src_encoding):
    # Blobs are stored gzip-compressed under their Software Heritage blob_id;
    # smart_open handles the decompression via compression=".gz".
    s3_url = f"s3://softwareheritage/content/{blob_id}"

    with open(s3_url, "rb", compression=".gz", transport_params={"client": s3}) as s3bucket:
        content = s3bucket.read().decode(src_encoding)

    return {"content": content}


if __name__ == "__main__":
    for r in load_dataset("bigcode/the-stack-v2-dedup", name='Python', split="train", streaming=True):
        row = download_contents(r["blob_id"], r["src_encoding"])
        print(row["content"])
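On the second point (blob_ids that fail to resolve), a defensive variant of the download helper can skip missing keys instead of crashing a map over the whole dataset. This is only a sketch, not part of the dataset's official tooling: `download_contents_safe` and its `opener` parameter are hypothetical names, and since the exact exception raised for a missing key can vary across smart_open/boto3 versions, both `ClientError` and `OSError` are caught.

```python
try:
    from botocore.exceptions import ClientError
except ImportError:  # allow the sketch to run without boto3 installed
    class ClientError(Exception):
        pass


def download_contents_safe(blob_id, src_encoding, opener):
    """Fetch one blob; return {"content": None} when the key is missing.

    `opener` is any callable returning a readable file-like object for an
    s3:// URL -- e.g. smart_open.open bound to the unsigned client above:

        opener = lambda url: open(url, "rb", compression=".gz",
                                  transport_params={"client": s3})
    """
    s3_url = f"s3://softwareheritage/content/{blob_id}"
    try:
        with opener(s3_url) as f:
            return {"content": f.read().decode(src_encoding)}
    except (ClientError, OSError):
        # Missing or unreadable blob: return a sentinel instead of raising,
        # so a streaming loop or datasets.map can filter these rows out.
        return {"content": None}
```

Rows with `content is None` can then be dropped with a simple filter after mapping, rather than aborting the whole pass on the first missing blob.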
