At its Snowday 2023 event, Snowflake, the Data Cloud company, announced new innovations that enable all users to securely tap into the power of generative AI with their enterprise data, regardless of their technical expertise. Snowflake is simplifying how every organization can securely derive value from generative AI with Snowflake Cortex (private preview), Snowflake’s new fully managed service that enables organizations to more easily discover, analyze, and build AI apps in the Data Cloud.
Snowflake Cortex gives users instant access to a growing set of serverless functions that include industry-leading large language models (LLMs) such as Meta AI’s Llama 2, task-specific models, and advanced vector search functionality. Using these functions, teams can accelerate their analytics and build contextualized LLM-powered apps within minutes. Snowflake has also built three LLM-powered experiences on Snowflake Cortex to enhance user productivity: Document AI (private preview), Snowflake Copilot (private preview), and Universal Search (private preview).
“Snowflake is helping pioneer the next wave of AI innovation by providing enterprises with the data foundation and cutting-edge AI building blocks they need to create powerful AI and machine learning apps while keeping their data safe and governed,” said Sridhar Ramaswamy, SVP of AI, Snowflake. “With Snowflake Cortex, businesses can now tap into the power of large language models in seconds, build custom LLM-powered apps within minutes, and maintain flexibility and control over their data — while reimagining how all users tap into generative AI to deliver business value.”
Customers Can Easily Develop LLM-Powered Apps Using Serverless Functions with Snowflake Cortex
As a fully managed service, Snowflake Cortex gives all customers the necessary building blocks to easily harness LLMs and AI, without the need for AI expertise or complex GPU-based infrastructure management. This includes a growing set of serverless functions available with a function call in SQL or Python code. Users of all skill sets can access these functions to quickly analyze data or build AI apps, all running on Snowflake Cortex’s cost-optimized infrastructure. These new functions include:
- Specialized Functions (private preview): A set of task-specific functions that leverage cost-effective language and AI models to accelerate everyday analytics. For any given input text, these models can detect sentiment, extract an answer, summarize the text, and translate it into a selected language (an illustrative code sketch follows this list). Specialized functions also cover Snowflake’s existing machine learning-powered functions: forecasting (generally available soon), anomaly detection (generally available soon), contribution explorer (public preview), and classification (private preview soon).
- General-Purpose Functions (private preview): A set of conversational functions that leverage industry-leading open source LLMs (private preview), including Meta AI’s Llama 2 model, and high-performance Snowflake LLMs (private preview soon), including a text-to-SQL model, so users can easily “chat” with their data across a broad range of use cases. These functions also include vector embedding and search functionality (private preview soon), so users can contextualize model responses with their own data and create customized apps in minutes. Snowflake is also adding vector as a native data type within the Data Cloud, helping users run these functions against their data in Snowflake more efficiently.
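To make the programming model concrete, the sketch below shows how such calls might be issued from Snowpark Python. This is an illustrative sketch only: the SNOWFLAKE.CORTEX.* function names and the llama2-70b-chat model identifier reflect the surface Cortex later exposed publicly and are not confirmed by this announcement, and the support_tickets table and connection parameters are hypothetical.

```python
from snowflake.snowpark import Session

# Placeholder credentials; supply your own account, user, and authentication details.
connection_parameters = {
    "account": "<account_identifier>",
    "user": "<user>",
    "password": "<password>",
    "warehouse": "<warehouse>",
    "database": "<database>",
    "schema": "<schema>",
}
session = Session.builder.configs(connection_parameters).create()

# Hypothetical table of support tickets with a free-text BODY column.
enriched = session.sql("""
    SELECT
        body,
        SNOWFLAKE.CORTEX.SENTIMENT(body)              AS sentiment,       -- task-specific: sentiment score
        SNOWFLAKE.CORTEX.SUMMARIZE(body)              AS summary,         -- task-specific: summarization
        SNOWFLAKE.CORTEX.TRANSLATE(body, 'en', 'de')  AS body_de,         -- task-specific: translation
        SNOWFLAKE.CORTEX.COMPLETE(
            'llama2-70b-chat',
            CONCAT('Suggest a one-line reply to this ticket: ', body)
        )                                             AS suggested_reply  -- general-purpose LLM completion
    FROM support_tickets
""")
enriched.show()
```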
Streamlit in Snowflake (public preview) further helps accelerate the creation of custom LLM-powered apps, enabling users to quickly turn their data, AI models, and analytic and app functions into interactive apps written in Python. More than 10,000 apps had been developed using Streamlit in Snowflake as of September 2023, with organizations including Priority Health, the health plan of Corewell Health, AppFolio, Braze, TransUnion, and more creating production-ready apps.
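For illustration, a minimal Streamlit in Snowflake app that wires a text box to an LLM completion might look like the sketch below. This is a sketch under assumptions: get_active_session() is the standard Snowpark helper available inside Snowflake, while the SNOWFLAKE.CORTEX.COMPLETE call and the llama2-70b-chat model name are the same illustrative placeholders used above.

```python
import streamlit as st
from snowflake.snowpark.context import get_active_session

# Inside Streamlit in Snowflake, an authenticated session is already available.
session = get_active_session()

st.title("Ask your data")
question = st.text_input("Question", "Which region grew fastest last quarter?")

if st.button("Ask"):
    # Escape single quotes before embedding the question in a SQL string literal.
    prompt = question.replace("'", "''")
    rows = session.sql(
        f"SELECT SNOWFLAKE.CORTEX.COMPLETE('llama2-70b-chat', '{prompt}') AS answer"
    ).collect()
    st.write(rows[0]["ANSWER"])
```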
Snowflake Cortex Unlocks Native LLM Experiences to Increase Productivity in the Data Cloud
Snowflake is also unveiling new LLM-powered experiences built on Snowflake Cortex as the underlying service. These complete experiences include user interfaces and high-performance LLMs fully hosted and managed by Snowflake Cortex, making them ideal for business teams and analysts across organizations. To further improve productivity across the Data Cloud, Snowflake’s new LLM experiences include:
- Snowflake Copilot (private preview): Snowflake Copilot, Snowflake’s new LLM-powered assistant, brings generative AI to everyday Snowflake coding tasks through natural language. Users can ask questions of their data in plain text, generate SQL queries against relevant data sets, refine queries to filter down to insights, and more.
- Universal Search (private preview): With Universal Search, Snowflake is unveiling new LLM-powered search functionality so users can more quickly find, and start getting value from, the most relevant data and apps for their use cases. Search spans a customer’s Snowflake account, including databases, views, and Iceberg Tables (public preview soon), as well as data and Snowflake Native Apps available on Snowflake Marketplace.
- Document AI (private preview): Serving as Snowflake’s first LLM experience, Document AI helps enterprises use LLMs to easily extract content such as invoice amounts or contractual terms from documents and fine-tune results using a visual interface and natural language. Customers are using Document AI to help their teams be smarter about their businesses and increase efficiency in secure and scalable ways.
Snowflake Empowers Users to Fully Customize Their LLM Apps with Virtually No Limits
For more advanced users who want to fully customize their LLM apps, Snowflake offers Snowpark Container Services (public preview soon in select AWS regions), which simplifies the secure deployment and management of containerized workloads in Snowflake. Using Snowpark Container Services, developers have the flexibility to run sophisticated third-party software, including commercial LLMs and vector databases, entirely within their Snowflake account. Organizations can also easily deploy, fine-tune, and manage any open source LLM within the Data Cloud.
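As a rough sketch, deploying a containerized workload with Snowpark Container Services from Snowpark Python could look like the following. The compute pool settings, GPU instance family, service specification, and image path are illustrative placeholders, and the DDL shown reflects syntax Snowflake later documented rather than anything confirmed in this announcement.

```python
from snowflake.snowpark import Session

# Placeholder connection details, as in the earlier sketch.
connection_parameters = {"account": "<account_identifier>", "user": "<user>", "password": "<password>"}
session = Session.builder.configs(connection_parameters).create()

# Provision a small GPU compute pool for the containerized LLM workload.
session.sql("""
    CREATE COMPUTE POOL IF NOT EXISTS llm_pool
      MIN_NODES = 1
      MAX_NODES = 1
      INSTANCE_FAMILY = GPU_NV_S
""").collect()

# Launch a service from a container image pushed to the account's image repository.
session.sql("""
    CREATE SERVICE IF NOT EXISTS llm_service
      IN COMPUTE POOL llm_pool
      FROM SPECIFICATION $$
        spec:
          containers:
          - name: llm
            image: /my_db/my_schema/my_repo/open-llm:latest
      $$
""").collect()
```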
SOURCE: BusinessWire