March is Merge Madness To celebrate Arcee’s recent merger with mergekit, we’re bringing you a month of resources and knowledge on model merging.
Arcee takes a leading role in model merging innovations Arcee's recent merger with mergekit has made us a leader in model merging research and development. Check out our video interview with mergekit founder Charles Goddard, who has come on board as a Senior Research Engineer.
When should I use LLMs vs. SLMs? When it comes to the world of language models and Gen AI, a key question for companies looking to adopt these innovations is which model(s) to use. As if it's not already complicated enough with the plethora of foundation models out there, it is now even more daunting…
Introducing Arcee’s SLM Adaptation System At Arcee, we believe in a world of smaller, specialized models that we call SLMs. The “S” stands for smaller, specialized, scalable, and secure. These models are grounded in your data, run entirely in your own environment, and are infinitely scalable for all your use cases. We feel these…
Arcee and mergekit unite Several months ago, I stumbled upon an innovative technique in the world of language model training known as model merging. This SOTA approach involves the fusion of two or more LLMs into a singular, cohesive model, presenting a novel and experimental method for creating sophisticated models at a fraction of…
What is an SLM (Small Language Model)? The world of LLMs (Large Language Models) has cooked up a storm in recent years, with the rise of OpenAI’s GPT and the increasing proliferation of open-source language models. Much excitement abounds, and virtually everyone and their grandma is mesmerized by the fact that a chat-based LLM can…
24 AI influencers to follow for 2024 2024 is poised to be the year of AI adoption across businesses and the consumer sector. At Arcee, our goal is to continue sharing insights from both our team and other domain experts, so that we empower every business with the knowledge and ability to leverage this phenomenal technology. As…