Snowflake Introduces Arctic: Leading The Way In Open, Enterprise-Grade Large Language Models
Snowflake has announced the release of Snowflake Arctic, a cutting-edge large language model (LLM) that is highly open and designed for enterprise use. With its Mixture-of-Experts (MoE) architecture, Arctic delivers top-tier intelligence with exceptional efficiency at scale. The model is optimized for complex enterprise workloads and outperforms industry benchmarks in SQL generation and instruction following. Snowflake is also setting a new standard for openness by releasing Arctic's weights under an Apache 2.0 license and sharing details of the research behind its training. Arctic is part of the Snowflake Arctic model family, which also includes practical text-embedding models for retrieval use cases.
According to a recent report by Forrester, nearly half of global enterprise AI decision-makers are using existing open source LLMs to incorporate generative AI into their AI strategies. Snowflake aims to empower all users by providing them with industry-leading open LLMs through its data foundation. With the launch of Arctic, Snowflake offers a powerful and truly open model under an Apache 2.0 license. Users can leverage code templates and choose from flexible inference and training options, including NVIDIA NIM, vLLM, and Hugging Face. Arctic is immediately available for serverless inference in Snowflake Cortex and will also be available on Amazon Web Services (AWS) along with other model gardens and catalogs.

Snowflake's AI research team, comprising leading researchers and system engineers, built Arctic in less than three months and at a fraction of the cost of similar models. The team used Amazon Elastic Compute Cloud (Amazon EC2) P5 instances to train Arctic, setting a new benchmark for training state-of-the-art open, enterprise-grade models quickly and cost-effectively. Arctic's MoE design enhances both training systems and model performance, with a carefully curated data composition tailored to enterprise needs. The model achieves high-quality results by activating only a fraction of its parameters, surpassing leading open models in coding, SQL generation, and general language understanding while using significantly fewer parameters.
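The efficiency described above comes from sparse activation: an MoE layer routes each token through only a few of its many expert sub-networks, so most parameters sit idle on any given forward pass. The toy sketch below illustrates that routing idea in NumPy; all names, shapes, and the top-k softmax weighting are illustrative assumptions, and Arctic's production architecture is far more elaborate than this.

```python
import numpy as np

def moe_forward(x, gate_w, expert_ws, top_k=2):
    """Toy Mixture-of-Experts layer: route each token to its top-k experts.

    Illustrative sketch only -- not Arctic's actual implementation.
    x: (tokens, d) activations; gate_w: (d, n_experts) router;
    expert_ws: (n_experts, d, d) one weight matrix per expert.
    """
    logits = x @ gate_w                              # router scores per expert
    top = np.argsort(logits, axis=-1)[:, -top_k:]    # indices of top-k experts
    out = np.zeros_like(x)
    for t in range(x.shape[0]):
        sel = logits[t, top[t]]
        weights = np.exp(sel - sel.max())
        weights /= weights.sum()                     # softmax over selected experts
        for w, e in zip(weights, top[t]):
            out[t] += w * (x[t] @ expert_ws[e])      # only k experts run per token
    return out

rng = np.random.default_rng(0)
d, n_experts, tokens = 8, 4, 3
y = moe_forward(rng.normal(size=(tokens, d)),
                rng.normal(size=(d, n_experts)),
                rng.normal(size=(n_experts, d, d)))
print(y.shape)  # (3, 8)
```

With `top_k=2` of 4 experts, each token touches only half the expert parameters per pass; at Arctic's scale that same principle is what keeps training and inference compute well below that of a dense model of comparable total size.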
Snowflake continues to provide enterprises with the data foundation and AI building blocks necessary to create powerful AI and machine learning applications. Arctic, when deployed in Snowflake Cortex, enables customers to build production-grade AI apps at scale securely and within the governance perimeter of the Data Cloud. In addition to Arctic, Snowflake offers the Arctic embed family of text embedding models to the open-source community under an Apache 2.0 license.
These models, optimized for retrieval performance, are available on Hugging Face and will soon be accessible as part of the Snowflake Cortex embed function. Snowflake also offers customers access to other powerful LLMs in the Data Cloud and has expanded its partnership with NVIDIA to enhance AI innovation.
AI experts have praised Snowflake's release of Arctic as a significant step towards driving AI access, democratization, and innovation. Partners such as AI21 Labs, Coda, Hugging Face, Lamini, Landing AI, Microsoft, Perplexity, Reka, and Together AI have expressed excitement about the transparency and collaboration Snowflake brings to the AI field.
The launch of Arctic enables fine-tuning, evaluation, and innovation on cutting-edge models, driving value for end users and contributing to the advancement of open-source AI. With its commitment to open innovation, Snowflake continues to push the boundaries of what AI can accomplish and make it accessible to all users.