Sunday, December 22, 2024

Introducing Snorkel Custom, a New Offering to Accelerate Enterprise AI Development


Snorkel Custom combines AI know-how from Snorkel experts with its programmatic AI data development platform, Snorkel Flow, to accelerate enterprise AI adoption.

Snorkel AI announced Snorkel Custom, a new services and platform offering that helps enterprises use their data to adapt Large Language Models (LLMs) and quickly deliver production quality AI. Enterprises are realizing that LLMs generally do not work “off the shelf” for custom use cases, but instead must be evaluated and tuned on accurately labeled and curated data to achieve production quality. Snorkel Custom combines Snorkel’s pioneering AI data development platform, Snorkel Flow, with hands-on support from Snorkel’s machine learning experts to programmatically develop data and evaluate, tune, and serve LLMs for enterprises’ custom AI use cases.

“Production quality AI is entirely dependent on the data used to tune and align models. To cross the chasm from flashy AI demo to production ROI, enterprises must master curating, labeling and developing data for their specific use cases and settings,” said Alex Ratner, CEO and co-founder of Snorkel AI. “For over a decade, the Snorkel team has been pioneering approaches for AI data development. Snorkel Custom gives enterprises direct access to our experience developing LLMs for Fortune 100 partners, and to Snorkel Flow, our platform for programmatic AI data development. It is a unique offering in that we are accelerating delivery of production quality AI that is custom-tuned for customers’ use cases and data, while also sharing the knowledge and platform that creates a path to full self-sufficiency and ownership.”

Snorkel Custom enables enterprises to adapt any LLM for their unique use cases. To make it easier for customers to use any model, Snorkel is committed to providing a broad library of native integrations with best-of-breed LLMs. As part of this initiative, Snorkel also announced that it has added native integration for Google Cloud's Gemini models.

Accelerating Production Quality Generative AI
The recent explosion of Generative AI interest has created enterprise urgency to deliver production-ready AI applications. However, achieving acceptable performance on custom use cases generally requires tuning LLMs on an enterprise's unique data. Done manually, this data development work can be time-consuming, expensive, and error-prone, and in some cases can violate data regulations. Snorkel Flow is a modern alternative to manual AI data development that makes key data operations such as labeling, sampling, filtering, and slicing programmatic, just like software development.
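To give a sense of what "programmatic" labeling means in practice, the following is a minimal sketch in plain Python, not Snorkel Flow's actual API: instead of hand-annotating every record, analysts write small rule functions that each vote on a label (or abstain), and the votes are aggregated. The rules, labels, and example texts below are all hypothetical.

```python
# Illustrative sketch of programmatic data labeling (not Snorkel Flow's API):
# small rule functions emit votes, and a majority vote aggregates them.

SPAM, HAM, ABSTAIN = 1, 0, -1

def lf_contains_offer(text):
    # Rule: promotional wording suggests spam.
    return SPAM if "limited offer" in text.lower() else ABSTAIN

def lf_has_greeting(text):
    # Rule: a personal greeting suggests a legitimate message.
    return HAM if text.lower().startswith(("hi", "hello")) else ABSTAIN

LABELING_FUNCTIONS = [lf_contains_offer, lf_has_greeting]

def label(text):
    """Aggregate labeling-function votes by majority, ignoring abstains."""
    votes = [v for v in (lf(text) for lf in LABELING_FUNCTIONS) if v != ABSTAIN]
    if not votes:
        return ABSTAIN
    return max(set(votes), key=votes.count)

examples = [
    "Limited offer!!! Click now",
    "Hi team, notes from today's meeting attached",
]
labels = [label(t) for t in examples]
```

Because the rules are code, a whole dataset can be relabeled in minutes when requirements change, which is the key economic advantage over manual annotation.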

Snorkel Custom combines Snorkel Flow with a structured engagement process, which includes:

  • Evaluation and benchmark workshops conducted collaboratively to create a custom benchmark for each use case, combining Snorkel’s experience in LLM evaluation and data operations with customers’ requirements and domain knowledge, leading to more insightful evaluations of LLM performance.
  • Snorkel-led data and LLM development to support end-to-end delivery of LLMs that are fine-tuned and aligned to meet production-level performance as measured against the custom benchmark.
  • Model cost optimization and serving to optionally distill LLMs into specialized “small language models” (SLMs) that improve enterprise task-specific accuracy while also dramatically reducing cost.
  • Implementation of Snorkel Flow as the foundation for programmatic AI data development, supporting ongoing adaptation, maintenance, and auditability of the developed LLMs, with optional enablement for teams wanting to move to a fully self-serve model.
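The distillation step described above, compressing an LLM into a task-specific "small language model", typically follows the standard soft-target recipe: the small model is trained to match the large model's temperature-softened output distribution. The sketch below shows that loss computation numerically; the logits and temperature are illustrative assumptions, not details of Snorkel's implementation.

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax; higher temperature yields softer targets."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """Cross-entropy between teacher soft targets and student predictions."""
    t = softmax(teacher_logits, temperature)
    s = softmax(student_logits, temperature)
    return -sum(ti * math.log(si) for ti, si in zip(t, s))

teacher = [4.0, 1.0, 0.2]   # large model's logits for one input (illustrative)
student = [3.5, 1.2, 0.1]   # small model's logits for the same input
loss = distillation_loss(teacher, student)
```

Minimizing this loss over a dataset pushes the small model's predictions toward the large model's, which is why a distilled SLM can approach the teacher's task-specific accuracy at a fraction of the serving cost.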


Using Snorkel Custom, Wayfair deploys thousands of product tags in months vs years
To ensure that relevant products appear in customer searches, Wayfair relies on over 10,000 product tags across more than 30 million products. Wayfair’s traditional tagging process burdened suppliers with providing extensive product details and relied on machine learning models powered by manually annotated data.

Needing a new approach, Wayfair turned to Snorkel. Together, Wayfair and Snorkel’s team implemented Snorkel Flow to fit Wayfair’s specific needs with a programmatic data development approach. Now the machine learning team can build AI services 10x faster and quickly respond to new trends with programmatic data operations. Improved model precision enables Wayfair to surface relevant products for customers, which has translated to improved cart performance and higher conversion rates.

“Snorkel is not just a vendor but a partner on our AI journey,” said Margaret Pierson, Director, Machine Learning, Wayfair. “With Snorkel, we can build significantly higher performing AI services in days as opposed to months, and we can update models programmatically as part of a full model lifecycle with Snorkel Flow at its core. When new trends emerge, we can immediately refresh our catalog to reflect items that we already have that match the new style. That is a true value unlock from a business point of view.”

Source: PR Newswire
