
timbrooks/instruct-pix2pix packaged for deployment with Hugging Face Inference Endpoints.

Expected payload:

import base64
import requests as r

ENDPOINT_URL = "https://<your-endpoint-url>"  # URL of your deployed Inference Endpoint

def predict(path_to_image, prompt):
    # Read the input image and base64-encode it for the JSON payload
    with open(path_to_image, "rb") as i:
        b64 = base64.b64encode(i.read()).decode()
    payload = {
        "inputs": b64,
        "parameters": {
            "prompt": prompt
        }
    }
    response = r.post(
        ENDPOINT_URL, json=payload, headers={"Content-Type": "application/json"}
    )
    return response.json()
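
If the endpoint is deployed as protected rather than public, the request additionally needs a Hugging Face access token in the Authorization header. A minimal sketch, reusing ENDPOINT_URL and payload from above and assuming HF_TOKEN holds your own token:

HF_TOKEN = "hf_..."  # your Hugging Face access token (only needed for protected endpoints)

headers = {
    "Content-Type": "application/json",
    "Authorization": f"Bearer {HF_TOKEN}",  # authenticate against the protected endpoint
}
response = r.post(ENDPOINT_URL, json=payload, headers=headers)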

Call it with:

from io import BytesIO
from PIL import Image

resp = predict(
    path_to_image="car.png",
    prompt="make the car green"
)
# The endpoint returns the edited image as a base64-encoded string
img = Image.open(BytesIO(base64.b64decode(resp)))
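
From here the decoded result is an ordinary PIL image and can be saved or previewed; the output filename below is just an example:

img.save("car_green.png")  # persist the edited image
img.show()                 # or open it in a local viewer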