WorldlyLabs committed
Commit e584dab
1 Parent(s): dbe3f9b

Update README.md

Files changed (1): README.md (+6 -6)
README.md CHANGED
@@ -1,16 +1,16 @@
  ---
- license: apache-2.0
+ license: mit
  ---

  # worldly-v1: Bias Mitigation Script for Image Generation

  ## Overview

- **worldly-v1** is a bias mitigation script designed to modify prompts before sending them to an image generation model. It introduces diverse ethnicities and other demographic characteristics into prompts that contain vague references to "people," "person," or related terms, helping ensure more equitable representation in generated images. This version is demonstrated with the **FluxPipeline** model, but it can be used with any image generation model that accepts text-based prompts.
+ **worldly** is a bias mitigation script designed to modify prompts before sending them to an image generation model. It introduces diverse ethnicities and other demographic characteristics into prompts that contain vague references to "people," "person," or related terms, helping ensure more equitable representation in generated images. This version is demonstrated with the **FluxPipeline** model, but it can be used with any image generation model that accepts text-based prompts.

  ## Purpose

- The goal of **worldly-v1** is to mitigate bias in AI-generated imagery by diversifying the representation of people in prompts. The script dynamically modifies prompts by injecting randomly selected ethnicities or demographic details, so that different ethnicities have an equal chance of being represented in the generated images.
+ The goal of **worldly** is to mitigate bias in AI-generated imagery by diversifying the representation of people in prompts. The script dynamically modifies prompts by injecting randomly selected ethnicities or demographic details, so that different ethnicities have an equal chance of being represented in the generated images.

  ## How It Works

@@ -32,7 +32,7 @@ pip install torch diffusers huggingface_hub Pillow

  1. **Download the Script**

- You can download and integrate the **worldly-v1** script into your image generation pipeline. Use the `huggingface_hub` library to fetch the script:
+ You can download and integrate the **worldly** script into your image generation pipeline. Use the `huggingface_hub` library to fetch the script:

  ```python
  from huggingface_hub import hf_hub_download
@@ -50,7 +50,7 @@ spec.loader.exec_module(worldly_v1)

  2. **Integrating into Image Generation**

- Once downloaded, you can use **worldly-v1** to modify prompts before image generation. Below is an example of how to integrate the script into an image generation pipeline using the **FluxPipeline** model from Diffusers.
+ Once downloaded, you can use **worldly** to modify prompts before image generation. Below is an example of how to integrate the script into an image generation pipeline using the **FluxPipeline** model from Diffusers.

  ### Example Script

@@ -156,4 +156,4 @@ if __name__ == "__main__":

  ## License

- The **worldly-v1** script is licensed under the MIT License. You are free to use, modify, and distribute this script, as long as the original copyright and permission notice is retained.
+ The **worldly** script is licensed under the MIT License. You are free to use, modify, and distribute this script, as long as the original copyright and permission notice is retained.
 
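The worldly-v1 script itself is not touched by this commit, so its prompt-rewriting code does not appear above. As a rough illustration of the behavior the Overview describes (detecting vague person references and injecting a randomly chosen demographic descriptor with equal probability), here is a hedged sketch; the function name `modify_prompt`, the word list, and the descriptor list are assumptions, not the script's actual implementation:

```python
import random
import re

# Illustrative descriptor list; the real worldly-v1 script defines its own.
DESCRIPTORS = [
    "Black", "East Asian", "South Asian", "Hispanic",
    "Middle Eastern", "White", "Indigenous",
]

def modify_prompt(prompt: str) -> str:
    """Prefix vague person references with a randomly drawn demographic descriptor."""
    pattern = re.compile(r"\b(people|person|man|woman|men|women)\b", re.IGNORECASE)
    descriptor = random.choice(DESCRIPTORS)  # uniform draw: each group is equally likely
    return pattern.sub(lambda m: f"{descriptor} {m.group(1)}", prompt)

print(modify_prompt("A photo of a person reading in a park"))
# e.g. "A photo of a Hispanic person reading in a park"
```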
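On the download step, the hunk headers above preserve two useful context lines: the `pip install torch diffusers huggingface_hub Pillow` requirement and `spec.loader.exec_module(worldly_v1)`, which indicates the fetched file is loaded with the standard `importlib` pattern. A minimal sketch of that pattern, assuming a hypothetical repo id `WorldlyLabs/worldly-v1` and filename `worldly_v1.py` (neither is confirmed by this diff):

```python
import importlib.util

from huggingface_hub import hf_hub_download

# Fetch the script from the Hub; repo_id and filename are assumptions for illustration.
script_path = hf_hub_download(repo_id="WorldlyLabs/worldly-v1", filename="worldly_v1.py")

# Load the downloaded file as a module, matching the README's
# `spec.loader.exec_module(worldly_v1)` context line.
spec = importlib.util.spec_from_file_location("worldly_v1", script_path)
worldly_v1 = importlib.util.module_from_spec(spec)
spec.loader.exec_module(worldly_v1)
```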
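The README's FluxPipeline example is truncated by the diff context, but a generation call with a diversified prompt looks roughly like the sketch below. The checkpoint id, the sampler settings, and the `modify_prompt` helper (standing in for whatever function the real script exposes) are assumptions for illustration:

```python
import torch
from diffusers import FluxPipeline

# Any Flux checkpoint the pipeline supports works; FLUX.1-schnell is only an example.
pipe = FluxPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-schnell", torch_dtype=torch.bfloat16
)
pipe.to("cuda")

prompt = modify_prompt("A person walking through a busy market")  # diversified prompt
image = pipe(prompt, num_inference_steps=4, guidance_scale=0.0).images[0]
image.save("worldly_example.png")
```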