Windows installation (watch my YouTube video here for more info: )
git lfs install
git clone https://huggingface.co/Aitrepreneur/FLUX-Prompt-Generator
Inside the FLUX-Prompt-Generator folder, create and activate a new virtual env:
python -m venv env
env\Scripts\activate
Install the requirements:
pip install -r requirements.txt
Then run the application:
python app.py
Inside the app.py file, on line 337 you will find: self.groq_client = Groq(api_key="YOUR-GROQ-API-KEY"). Replace YOUR-GROQ-API-KEY with your Groq API key if you want to use Groq for text generation.
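Rather than hardcoding the key into app.py, you could read it from an environment variable. A minimal sketch (the variable name GROQ_API_KEY and the helper function are assumptions, not part of the original app.py):

```python
import os

def get_groq_api_key(default="YOUR-GROQ-API-KEY"):
    # Hypothetical helper: prefer the GROQ_API_KEY environment variable,
    # falling back to the placeholder so the app still starts without it.
    return os.environ.get("GROQ_API_KEY", default)

# Line 337 of app.py could then become:
# self.groq_client = Groq(api_key=get_groq_api_key())
```

This keeps the key out of the source file, which matters if you ever push your copy of the repo anywhere public.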
IF YOU WANT TO USE EVERYTHING LOCALLY:
- Download and install ollama: https://ollama.com/
- Once Ollama is running in the background, download the Llama 3 8B model by running this command in a new cmd window: ollama run llama3
- Once the Llama 3 8B model is downloaded, go to the FLUX Prompt Generator web UI, check the "Use Ollama (local)" checkbox, and you are good to go
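Under the hood, the app talks to Ollama's local REST API, which listens on port 11434 by default. A minimal sketch of such a call (the helper names are illustrative, not the actual code in app.py):

```python
import json
import urllib.request

# Ollama's default local endpoint for one-shot text generation.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_ollama_request(prompt, model="llama3"):
    # Build the JSON payload Ollama's /api/generate endpoint expects;
    # stream=False asks for a single complete response instead of chunks.
    return {"model": model, "prompt": prompt, "stream": False}

def generate_local(prompt, model="llama3"):
    # Send the prompt to the local Ollama server and return its reply.
    # This only works while Ollama is running in the background.
    data = json.dumps(build_ollama_request(prompt, model)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

If the checkbox has no effect, checking that this endpoint answers (for example with curl http://localhost:11434) is a quick way to confirm Ollama is actually running.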