victor posted an update Jun 15
Together MoA is a really interesting approach built on open-source models!

"We introduce Mixture of Agents (MoA), an approach to harness the collective strengths of multiple LLMs to improve state-of-the-art quality. And we provide a reference implementation, Together MoA, which leverages several open-source LLM agents to achieve a score of 65.1% on AlpacaEval 2.0, surpassing prior leader GPT-4o (57.5%)."

Read more here: https://www.together.ai/blog/together-moa

PS: they provide some demo code (https://github.com/togethercomputer/MoA/blob/main/bot.py) - if someone releases a Space for it, it could go 🚀
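
For context, the core pattern is easy to sketch: several "proposer" models answer the prompt independently, then an "aggregator" model synthesizes their drafts into one final answer. Here's a minimal Python sketch against Together's OpenAI-compatible endpoint - this is not the official bot.py, and the model names and the `TOGETHER_API_KEY` env var are my assumptions:

```python
# Minimal sketch of the MoA pattern (NOT the official bot.py):
# several "proposer" models answer independently, then an "aggregator"
# model synthesizes their drafts. Model names and the TOGETHER_API_KEY
# env var are assumptions; Together's endpoint is OpenAI-compatible.
import os
from openai import OpenAI

client = OpenAI(
    api_key=os.environ["TOGETHER_API_KEY"],
    base_url="https://api.together.xyz/v1",
)

PROPOSERS = [  # hypothetical pick of open-source reference models
    "Qwen/Qwen2-72B-Instruct",
    "mistralai/Mixtral-8x22B-Instruct-v0.1",
]
AGGREGATOR = "meta-llama/Llama-3-70b-chat-hf"

def ask(model: str, prompt: str) -> str:
    resp = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": prompt}],
        max_tokens=512,
    )
    return resp.choices[0].message.content

def moa(prompt: str) -> str:
    # Layer 1: independent drafts from the reference models.
    drafts = [ask(m, prompt) for m in PROPOSERS]
    # Layer 2: the aggregator merges the drafts into one answer.
    merged = "\n\n".join(f"Candidate {i + 1}:\n{d}" for i, d in enumerate(drafts))
    return ask(
        AGGREGATOR,
        "Synthesize these candidate answers into a single, "
        f"high-quality answer.\n\n{merged}\n\nQuestion: {prompt}",
    )

print(moa("Explain Mixture of Agents in two sentences."))
```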

Neat! Took a look at the huggingchat repo, and it looks fairly straightforward to implement MoA without API keys (with any HF chat models as the aggregator/references - it would be interesting to see how they'd benchmark!), similar to this quick shot at it: https://huggingface.co/spaces/SixOpen/MoA-implementation-on-OllamaUI :)
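
For what it's worth, a rough sketch of that keyless variant could use huggingface_hub's serverless InferenceClient; the model choices below are placeholders on my part, not a benchmarked setup:

```python
# Rough keyless variant on Hugging Face's serverless Inference API.
# Anonymous calls work but are rate-limited; pass a token to raise limits.
# Model choices below are placeholders, not a benchmarked MoA setup.
from huggingface_hub import InferenceClient

PROPOSERS = [
    "mistralai/Mistral-7B-Instruct-v0.3",
    "HuggingFaceH4/zephyr-7b-beta",
]
AGGREGATOR = "mistralai/Mixtral-8x7B-Instruct-v0.1"

def ask(model: str, prompt: str) -> str:
    out = InferenceClient(model).chat_completion(
        messages=[{"role": "user", "content": prompt}],
        max_tokens=512,
    )
    return out.choices[0].message.content

def moa(prompt: str) -> str:
    drafts = [ask(m, prompt) for m in PROPOSERS]  # proposer layer
    merged = "\n\n".join(f"Candidate {i + 1}:\n{d}" for i, d in enumerate(drafts))
    # Aggregator layer: synthesize the drafts into a single answer.
    return ask(
        AGGREGATOR,
        f"Synthesize these candidates into one answer.\n\n{merged}\n\nQuestion: {prompt}",
    )
```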
