
Snowflake Puts Industry-Leading Large Language and AI Models in the Hands of All Users with Snowflake Cortex


Snowflake, the Data Cloud company, announced at its Snowday 2023 event new innovations that enable all users to securely tap into the power of generative AI with their enterprise data — regardless of their technical expertise. Snowflake is simplifying how every organization can securely derive value from generative AI with Snowflake Cortex (private preview), Snowflake’s new fully managed service that enables organizations to more easily discover, analyze, and build AI apps in the Data Cloud.

Snowflake Cortex gives users instant access to a growing set of serverless functions that include industry-leading large language models (LLMs) such as Meta AI’s Llama 2 model, task-specific models, and advanced vector search functionality. Using these functions, teams can accelerate their analytics and quickly build contextualized LLM-powered apps within minutes. Snowflake has also built three LLM-powered experiences leveraging Snowflake Cortex to enhance user productivity including Document AI (private preview), Snowflake Copilot (private preview), and Universal Search (private preview).

“Snowflake is helping pioneer the next wave of AI innovation by providing enterprises with the data foundation and cutting-edge AI building blocks they need to create powerful AI and machine learning apps while keeping their data safe and governed,” said Sridhar Ramaswamy, SVP of AI, Snowflake. “With Snowflake Cortex, businesses can now tap into the power of large language models in seconds, build custom LLM-powered apps within minutes, and maintain flexibility and control over their data — while reimagining how all users tap into generative AI to deliver business value.”

Customers Can Easily Develop LLM-Powered Apps Using Serverless Functions with Snowflake Cortex

As a fully managed service, Snowflake Cortex gives all customers the building blocks they need to harness LLMs and AI without AI expertise or complex GPU-based infrastructure management. This includes a growing set of serverless functions available through a single function call in SQL or Python code. Users of all skill sets can access these functions to quickly analyze data or build AI apps, all running on Snowflake Cortex's cost-optimized infrastructure. These new functions span general-purpose LLMs, such as Meta AI's Llama 2, alongside task-specific models and vector search functionality.
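As a sketch only, a serverless Cortex call from SQL might look something like the following. The function name, model identifier, table, and columns here are illustrative assumptions; Cortex was in private preview at the time of this announcement, so exact signatures may differ.

```sql
-- Hypothetical sketch: invoking a serverless Cortex LLM function from SQL.
-- Function, model, table, and column names are assumptions for illustration.
SELECT
    ticket_id,
    SNOWFLAKE.CORTEX.COMPLETE(
        'llama2-70b-chat',  -- assumed model identifier
        CONCAT('Summarize this support ticket in one sentence: ', ticket_text)
    ) AS summary
FROM support_tickets;
```

The point of the serverless model is that this reads like any other SQL function call: no GPU provisioning, model hosting, or infrastructure management on the user's side.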

Streamlit in Snowflake (public preview) further accelerates the creation of custom LLM-powered apps, enabling users to quickly turn their data, AI models, and analytic and app functions into interactive apps written in Python. More than 10,000 apps had been developed using Streamlit in Snowflake as of September 2023, with organizations including Priority Health, the health plan of Corewell Health, AppFolio, Braze, TransUnion, and more creating production-ready apps.


Snowflake Cortex Unlocks Native LLM Experiences to Increase Productivity in the Data Cloud

Snowflake is also unveiling new LLM-powered experiences built on Snowflake Cortex as the underlying service. These complete experiences pair user interfaces with high-performance LLMs fully hosted and managed by Snowflake Cortex, making them ideal for business teams and analysts across organizations. To further improve productivity across the Data Cloud, Snowflake's new LLM experiences include Document AI (private preview), Snowflake Copilot (private preview), and Universal Search (private preview).

Snowflake Empowers Users to Fully Customize Their LLM Apps with Virtually No Limits

For more advanced users who want to fully customize their LLM apps, Snowflake offers Snowpark Container Services (public preview soon in select AWS regions), which simplifies the secure deployment and management of containerized workloads in Snowflake. Using Snowpark Container Services, developers have the flexibility to run sophisticated third-party apps, including commercial LLMs and vector databases, entirely within their Snowflake account. Organizations can also easily deploy, fine-tune, and manage any open source LLM within the Data Cloud.
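To make the container workflow concrete, a deployment might be sketched as follows. All object names, the instance family, and the specification file are illustrative assumptions; the feature was entering public preview at the time, so the exact syntax may differ from what is shown.

```sql
-- Hypothetical sketch of deploying a containerized LLM workload with
-- Snowpark Container Services. Names and parameters are assumptions.

-- Provision compute for the containers to run on.
CREATE COMPUTE POOL llm_pool
  MIN_NODES = 1
  MAX_NODES = 1
  INSTANCE_FAMILY = GPU_NV_S;  -- assumed GPU instance family name

-- Launch a service from a container specification stored in a stage.
CREATE SERVICE llm_service
  IN COMPUTE POOL llm_pool
  FROM @service_stage
  SPECIFICATION_FILE = 'llm_service_spec.yaml';  -- assumed spec file
```

Because the service runs inside the customer's Snowflake account, the model and the data it processes stay within the same governance boundary.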

SOURCE: BusinessWire
