I had a backlog of LoRA model weights for SDXL that I decided to prioritize and publish this weekend. I know many people are using SD3 right now, but if you have the time to try these, I hope you enjoy them.
Starting next week, I intend to write more fully about the thought process behind my approach to curating data and training style and subject finetunes.
Thank you for reading this post! You can find the models on my page and I'll drop a few previews here.
Mixtral or Llama 70B in Google Sheets, thanks to Hugging Face's Serverless Inference API 🤗
The add-on is now available in the "Journalists on Hugging Face" repo on the Hub and enables rapid generation of synthetic data, automatic translation, question answering, and more, directly from spreadsheet cells.
Although this tool was initially developed for journalists, it should appeal to a much wider audience of everyday users of the Google suite, and there are still many use cases left to explore.
Only a free Hugging Face API key is required to start using this no-code extension.
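For context, here is a minimal sketch of the kind of request a spreadsheet cell could trigger under the hood, calling Hugging Face's Serverless Inference API with a free API token. The model ID, prompt, and parameters below are illustrative assumptions, not the add-on's actual implementation:

```python
# Minimal sketch: calling Hugging Face's Serverless Inference API directly.
# This is NOT the add-on's code; the model ID, prompt, and parameters are
# illustrative assumptions. Only a free HF API token is required.
import requests

HF_TOKEN = "hf_xxx"  # replace with your free Hugging Face API token
MODEL_ID = "mistralai/Mixtral-8x7B-Instruct-v0.1"  # assumed example model
API_URL = f"https://api-inference.huggingface.co/models/{MODEL_ID}"

def ask_model(prompt: str, max_new_tokens: int = 128) -> str:
    """Send a prompt to the Serverless Inference API and return the generated text."""
    response = requests.post(
        API_URL,
        headers={"Authorization": f"Bearer {HF_TOKEN}"},
        json={"inputs": prompt, "parameters": {"max_new_tokens": max_new_tokens}},
        timeout=60,
    )
    response.raise_for_status()
    # Text-generation endpoints return a list of {"generated_text": ...} dicts.
    return response.json()[0]["generated_text"]

if __name__ == "__main__":
    # The kind of cell-level task the add-on supports, e.g. translation:
    print(ask_model("Translate to English: Bonjour, comment allez-vous ?"))
```

The add-on wraps this kind of call behind spreadsheet formulas, so in practice you never have to write any code yourself.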
Do not hesitate to submit ideas for features that we could add!
Thanks to @fdaudens for initiating this development.