
Arcee AI Signs Strategic Collaboration Agreement with AWS to Accelerate the Deployment of Smaller, Specialized Language Models

Strategic collaboration empowers organizations of all sizes with efficient, high-performance generative AI solutions

Arcee AI, a pioneer in artificial intelligence (AI) and language models, announced today the signing of a Strategic Collaboration Agreement (SCA) with Amazon Web Services (AWS). The collaboration aims to deliver advanced, tailored small language models (SLMs) to the market, enabling organizations of all sizes to harness the power of AI with efficiency and effectiveness.

As part of this collaboration, Arcee AI will deploy its proven AI models on AWS and extend the success it’s already seeing with existing customers:

“A recent success with a Fortune 500 financial services customer improved their internal benchmark rankings by 23% and reduced deployment costs by 96% in the first iteration,” said Arcee AI CEO and Co-Founder Mark McQuade, who added, “Similarly, a top global Property and Casualty insurance client boosted model performance by 63% while cutting deployment costs by 82%.”

Mission-Critical AI Success

Guild Education, a career advancement program with millions of customers, is building its enterprise AI strategy and deployment around Arcee AI’s SLMs, which have already helped accelerate customer onboarding and scale Guild’s services across operating regions.

Guild’s Head of Artificial Intelligence, Matthew Bishop, says the SLMs far exceed commercially available foundation models (FMs).

“We worked with Arcee AI to build one of the world's best career coaching AI tools. They enabled us to take the collective intelligence of all our career coaches across half a million conversations, and train an SLM that embodies Guild's brand, tone, values, and expertise in the career mobility domain. We evaluated the model against existing closed-source large language models (LLMs) with retrieval-augmented generation (RAG) solutions and fine-tuned commercial LLMs, and more than 90% of the time our staff selected the output from the Arcee AI-trained SLM. Arcee AI has enabled us to have a true one-of-a-kind competitive advantage, while keeping our total cost of ownership (TCO) lower than any closed-source model – with higher security, privacy, and quality than any other solution on the market,” continued Bishop.

Arcee AI customers can deploy and test models in minutes thanks to Amazon SageMaker JumpStart. For production, they can seamlessly deploy them privately within their AWS infrastructure, benefiting from AWS's security, scalability, and reliability. Models including SuperNova and tools like Swarm can be deployed directly via Amazon SageMaker in AWS Marketplace.
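For readers curious what that deployment path looks like, here is a minimal sketch using the SageMaker Python SDK. The model ID, instance type, and request payload below are illustrative placeholders, not confirmed values from Arcee AI or AWS; consult the actual Arcee AI listing in SageMaker JumpStart or AWS Marketplace for the exact identifiers and payload format.

```python
# Minimal sketch: deploying a JumpStart-listed model with the SageMaker Python SDK.
# The model_id is a hypothetical placeholder -- look up the real Arcee AI listing
# in SageMaker JumpStart or AWS Marketplace for your account and region.
from sagemaker.jumpstart.model import JumpStartModel

model = JumpStartModel(model_id="arcee-supernova-placeholder")  # hypothetical ID
predictor = model.deploy(
    initial_instance_count=1,
    instance_type="ml.g5.2xlarge",  # pick an instance type per the listing's requirements
)

# Send a prompt to the private endpoint running in your own AWS account.
# The payload schema depends on the specific model's inference contract.
response = predictor.predict({"inputs": "Summarize our onboarding policy in two sentences."})
print(response)

# Tear down the endpoint when finished to avoid ongoing charges.
predictor.delete_model()
predictor.delete_endpoint()
```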

"We’re thrilled to collaborate with Arcee AI to bring innovative and efficient AI solutions to a broad set of customers and industries," says Jon Jones, AWS Vice President of Startups. “This collaboration aligns with our mission to provide access to powerful AI tools, enabling organizations of all sizes to leverage language models for their specific needs. Together, we can ensure that our generative AI solutions are scalable, secure, and state-of-the-art."

Sign up for a demo today to experience the power of SLMs delivered in partnership with AWS. You can also meet us at AWS re:Invent, booth 1406.

About Arcee AI

Arcee AI is at the forefront of AI innovation and specializes in developing and deploying language models. The company focuses on creating highly specialized, efficient, and effective AI solutions tailored to the needs of various industries. Through continuous research and development, Arcee AI is committed to pushing the boundaries of what AI can achieve.

For more information, visit Arcee.ai and AWS.

Contacts

Media Contact:

Donna Loughlin

LMGPR for Arcee AI

donna@lmgpr.com

(408) 393-5575

Data & News supplied by www.cloudquote.io
Stock quotes supplied by Barchart
Quotes delayed at least 20 minutes.
By accessing this page, you agree to the following
Privacy Policy and Terms and Conditions.