Arcee/MergeKit launch Model Merging Hackathon

Arcee & MergeKit advance model merging with the launch of the Model Merging Hackathon, co-sponsored by AWS. Submit your model merging research, experiments, and results for a chance to win cash prizes!

If you've tried it, you know it's hard to stop.

Model merging–in the words of AI researcher Maya Akim in our Small Language Model Show this week–can get a bit addictive.

Well, we're giving you a chance to feed that addiction and earn some cash prizes at the same time.

Arcee is launching a Model Merging Hackathon, starting tomorrow (April 19) and running through May 13 (yes, that's an extended deadline–we've decided to give you all an extra week 🙌🏼!).

We're thrilled to have AWS as a Co-Sponsor in our valiant mission to get the world merging. (Side note: it's time to design a merging emoji 🤔.)

We'll be giving out a total of $9k in cash prizes, across the following categories:

  • Best new merge
  • Best integration with other ecosystems
  • Merge that’s most upsetting to the natural order

We want you to get creative, and–most importantly–continue to have fun with your model merging research.

Submissions will be judged by our in-house Mergers-in-Chief, MergeKit founder Charles Goddard and Arcee CEO Mark McQuade.

You can submit your work here.


More about Arcee, MergeKit, & Model Merging

Arcee is the industry leader in specialized language models for enterprise generative AI.

We're not the only ones saying that–we have a host of incredible customers, including Thomson Reuters and Guild.

And if you've been following Arcee news over the past few months, you already know that, once we realized just how revolutionary model merging could become, we joined forces with Charles Goddard, one of the world's top experts in model merging and the creator of the GitHub repo MergeKit.

Model merging is a state-of-the-art technique for combining multiple large language models (LLMs), each fine-tuned on a distinct task, into a single model. It's extremely cost-effective, enabling organizations to build more performant LLMs while dramatically reducing the need for GPUs.
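
To make that concrete, here's a minimal sketch (not MergeKit itself) of the simplest flavor of merging: a plain 50/50 weight average of two fine-tunes that share a base architecture. The model names and weights are illustrative assumptions:

```python
from transformers import AutoModelForCausalLM

# Two checkpoints that share the same base architecture and tokenizer.
# The model names here are hypothetical placeholders.
model_a = AutoModelForCausalLM.from_pretrained("org/finetune-task-a")
model_b = AutoModelForCausalLM.from_pretrained("org/finetune-task-b")

state_a, state_b = model_a.state_dict(), model_b.state_dict()

# Average every parameter tensor-by-tensor: no gradients, no GPU training.
merged = {name: 0.5 * state_a[name] + 0.5 * state_b[name] for name in state_a}

model_a.load_state_dict(merged)
model_a.save_pretrained("./merged-model")
```

Real merge methods (SLERP, TIES, DARE, and friends) are smarter than a flat average, but the economics are the same: you combine already-trained weights instead of paying for another training run.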

If you're just discovering model merging, don't let that hold you back from participating in the hackathon–it's very easy to get started merging. Just read up on the MergeKit repo, and also check out the easy-to-use MergeKit GUI that we launched as a Hugging Face Space last week.
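
As a rough sketch of what a first merge might look like, the snippet below writes a small MergeKit YAML config (a linear merge, with placeholder model names and weights) and hands it to the mergekit-yaml CLI; check the repo's examples for configs that are known to work well:

```python
# Sketch of a first MergeKit run: write a YAML merge config, then
# invoke the mergekit-yaml CLI on it. Model names are placeholders.
import subprocess
from pathlib import Path

config = """\
models:
  - model: org/finetune-task-a   # hypothetical checkpoint
    parameters:
      weight: 0.5
  - model: org/finetune-task-b   # hypothetical checkpoint
    parameters:
      weight: 0.5
merge_method: linear
dtype: float16
"""

Path("merge-config.yml").write_text(config)

# mergekit-yaml <config> <output-directory>
subprocess.run(
    ["mergekit-yaml", "merge-config.yml", "./merged-model"],
    check=True,
)
```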

If you're feeling extra ambitious, you can even delve into Evolutionary Model Merging with this tutorial we just released.
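
If you're curious what "evolutionary" means here before diving into the tutorial, the toy sketch below shows the core idea: treat the merge weights as a genome and let an evolutionary loop search for the combination that scores best. The objective function is a hypothetical stand-in for "merge, then benchmark," and real recipes use stronger optimizers such as CMA-ES:

```python
import random

# Stand-in objective: in a real run this would merge the models with
# the given weights and score the merge on a benchmark. Here it's a
# toy function so the sketch runs end to end.
TARGET = [0.6, 0.3, 0.1]  # pretend these weights make the best merge

def score_merge(weights):
    return -sum((w - t) ** 2 for w, t in zip(weights, TARGET))

def normalize(weights):
    total = sum(weights) or 1.0
    return [max(0.0, w) / total for w in weights]

def evolve(n_models=3, generations=20, pop_size=8):
    # Start from a population of random weightings that sum to 1.
    population = [
        normalize([random.random() for _ in range(n_models)])
        for _ in range(pop_size)
    ]
    for _ in range(generations):
        ranked = sorted(population, key=score_merge, reverse=True)
        parents = ranked[: pop_size // 2]  # keep the fittest half
        children = [
            normalize([w + random.gauss(0, 0.05) for w in p])
            for p in parents  # mutate the survivors
        ]
        population = parents + children
    return max(population, key=score_merge)

print(evolve())  # converges toward the (toy) optimal merge weights
```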


Here are some other great resources to get you merging

• Background on the recent merger (pun intended) of Arcee and MergeKit

• Merge LLMs with MergeKit, by Maxime Labonne 

• Create Mixtures of Experts with MergeKit, by Maxime Labonne 

• Create Your Own Mixture of Experts Model with Mergekit and Runpod, by Plaban Nayak

• Arcee’s MergeKit: A Toolkit for Merging Large Language Models, academic paper by Arcee's research team, on arXiv

• Case Study: How Arcee is Innovating Domain Adaptation, through Continual Pre-Training and Model Merging, by Arcee's research team (download here)

• All about Arcee's collaboration with the AWS Trainium team on the efficient training of LLMs: Revolutionizing large language model training with Arcee and AWS Trainium

• And last but not least, we devoted the first and third episodes of the Small Language Model (SLM) Show to all things merging.

Questions about the Hackathon?


Feel free to direct any questions to maccarthy@arcee.ai. Happy merging!