stazizov committed on
Commit 67d2b26 · verified · 1 Parent(s): a94a369

Update README.md

Files changed (1): README.md (+8 −5)

README.md CHANGED
@@ -41,11 +41,14 @@ To try our models, you have 2 options:
 2. Use our custom nodes for ComfyUI and test it with provided workflows (check out folder /workflows)
 
 ## Instruction for ComfyUI
-1. Update x-flux-comfy with `git pull` or reinstall it.
-2. Download Clip-L `model.safetensors` from [OpenAI VIT CLIP large](https://huggingface.co/openai/clip-vit-large-patch14), and put it to `ComfyUI/models/clip_vision/*`.
-3. Download our IPAdapter from [huggingface](https://huggingface.co/XLabs-AI/flux-ip-adapter/tree/main), and put it to `ComfyUI/models/xlabs/ipadapters/*`.
-4. Use `Flux Load IPAdapter` and `Apply Flux IPAdapter` nodes, choose right CLIP model and enjoy your genereations.
-5. You can find example workflow in folder workflows in this repo.
+1. Go to `ComfyUI/custom_nodes`.
+2. Clone this repo; the path should be `ComfyUI/custom_nodes/x-flux-comfyui/*`, where `*` is all the files in this repo.
+3. Go to `ComfyUI/custom_nodes/x-flux-comfyui/` and run `python setup.py`.
+4. Update x-flux-comfy with `git pull` or reinstall it.
+5. Download the CLIP-L `model.safetensors` from [OpenAI VIT CLIP large](https://huggingface.co/openai/clip-vit-large-patch14) and put it in `ComfyUI/models/clip_vision/*`.
+6. Download our IPAdapter from [huggingface](https://huggingface.co/XLabs-AI/flux-ip-adapter/tree/main) and put it in `ComfyUI/models/xlabs/ipadapters/*`.
+7. Use the `Flux Load IPAdapter` and `Apply Flux IPAdapter` nodes, choose the right CLIP model, and enjoy your generations.
+8. You can find an example workflow in the `workflows` folder in this repo.
 
 ### Limitations
 The IP Adapter is currently in beta.
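The install steps added above (1–4) can be sketched as shell commands. This is a minimal sketch, assuming a ComfyUI checkout in the current directory and assuming the repo URL from the XLabs-AI org; these commands touch the network and filesystem, so adjust paths to your own install:

```shell
# Steps 1-2: clone the custom nodes into ComfyUI/custom_nodes
# (repo URL is an assumption based on the XLabs-AI organization)
cd ComfyUI/custom_nodes
git clone https://github.com/XLabs-AI/x-flux-comfyui.git

# Step 3: run the setup script from inside the cloned repo
cd x-flux-comfyui
python setup.py

# Step 4: update later with git pull from the same directory
git pull
```

After this, restart ComfyUI so it picks up the new custom nodes; the downloaded CLIP and IPAdapter weights from steps 5–6 go under `ComfyUI/models/` as listed above.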