Saturday, November 16, 2024

NetApp drives the future of AI with intelligent data infrastructure

Innovations across the NetApp portfolio and collaborations with industry leaders like NVIDIA drive business results with AI

NetApp, the intelligent data infrastructure company, announced new developments in its collaboration with industry leaders to accelerate AI innovation. By delivering the intelligent data infrastructure required to power GenAI, NetApp is helping organizations capitalize on one of the most significant developments for business and IT in the past decade.

GenAI supports practical and highly visible business use cases such as content creation, summarizing large volumes of information, and answering questions. Gartner predicts that spending on AI software will grow to $297.9 billion by 2027, with GenAI accounting for more than a third of that total. The key to success in the AI era is mastering data that is controllable, trustworthy, and traceable.

Yesterday, NetApp CEO George Kurian kicked off NetApp INSIGHT 2024 with a comprehensive vision for this era of data intelligence. Much of the AI challenge is a data challenge, and Kurian described how an intelligent data infrastructure can ensure that relevant data is secure, governed, and always current to feed a unified, integrated GenAI stack.

Today at NetApp INSIGHT, NetApp will unveil additional innovations in intelligent data infrastructure, including a transformative vision for AI running on NetApp ONTAP®, the leading operating system for unified storage. Specifically, NetApp’s vision includes:

  • NVIDIA DGX SuperPOD Storage Certification for NetApp ONTAP: NetApp has begun the NVIDIA certification process for NetApp ONTAP storage on the AFF A90 platform with NVIDIA DGX SuperPOD AI infrastructure, which will enable enterprises to leverage industry-leading data management capabilities for their largest AI projects. This certification will complement and build upon the existing certification of NetApp ONTAP with NVIDIA DGX BasePOD. NetApp ONTAP solves the challenges of data management for large language models (LLMs), eliminating the need for data management compromises for AI training workloads.
  • Creating a global metadata namespace to securely and compliantly explore and manage data across a customer’s hybrid multi-cloud environment, enabling feature extraction and data classification for AI. Separately, NetApp today announced a new integration with NVIDIA AI software that can leverage the global metadata namespace with ONTAP to drive retrieval augmented generation (RAG) for agent-based AI in the enterprise.
  • Directly integrated AI data pipeline that enables ONTAP to automatically and iteratively prepare unstructured data for AI by capturing incremental changes to the customer’s data set, performing policy-driven data classification and anonymization, generating highly compressible vector embeddings, and storing them in a vector database integrated into the ONTAP data model, ready for large-scale semantic search and low-latency retrieval augmented generation (RAG) inference (a generic sketch of this kind of pipeline follows this list).
  • A disaggregated storage architecture that enables full sharing of the storage backend, making full use of network and flash performance while reducing infrastructure costs. This significantly improves performance and saves rack space and power for very large-scale, compute-intensive AI workloads such as LLM training. The architecture will be an integral part of NetApp ONTAP, delivering the benefits of disaggregation while retaining ONTAP’s proven capabilities in resiliency, data management, security, and governance.
  • New capabilities for native cloud services to drive AI innovation in the cloud. NetApp is working to provide an integrated and centralized data platform for collecting, discovering, and cataloging data across all of its native cloud services. NetApp is also integrating its cloud services with data warehouses and developing data processing services to visualize, prepare, and transform data. The prepared data sets can then be securely shared and used with cloud providers’ AI and machine learning services, including third-party solutions. NetApp will also announce a planned integration that will enable customers to use Google Cloud NetApp Volumes as data storage for BigQuery and Vertex AI.
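
NetApp has not published implementation details for the integrated data pipeline described above, so the following is only a minimal, generic sketch of the pattern that bullet names: pick up incremental changes, apply policy-driven classification and anonymization, generate embeddings, and serve semantic search for RAG. Everything in it (the file-scanning loop, the toy redaction pattern, the hashed bag-of-words "embedding", and the in-memory vector store) is an illustrative stand-in, not an ONTAP or NetApp API.

```python
# Illustrative sketch only: incremental ingest -> classify/anonymize -> embed ->
# vector store -> semantic search. No NetApp or ONTAP APIs are used here.
import hashlib
import math
import re
from pathlib import Path

PII_PATTERN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")  # toy SSN-style pattern

seen_hashes: dict[str, str] = {}           # path -> content hash (incremental change capture)
vector_store: dict[str, list[float]] = {}  # path -> embedding vector


def anonymize(text: str) -> str:
    """Policy-driven redaction stand-in: mask anything matching the PII pattern."""
    return PII_PATTERN.sub("[REDACTED]", text)


def embed(text: str, dims: int = 64) -> list[float]:
    """Toy deterministic embedding (hashed bag of words), standing in for a real model."""
    vec = [0.0] * dims
    for token in re.findall(r"\w+", text.lower()):
        vec[int(hashlib.md5(token.encode()).hexdigest(), 16) % dims] += 1.0
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]


def ingest_changed(root: Path) -> None:
    """Pick up only files whose contents changed since the last pass."""
    for path in root.rglob("*.txt"):
        digest = hashlib.sha256(path.read_bytes()).hexdigest()
        if seen_hashes.get(str(path)) == digest:
            continue  # unchanged since last pass, skip
        seen_hashes[str(path)] = digest
        clean = anonymize(path.read_text(errors="ignore"))
        vector_store[str(path)] = embed(clean)


def semantic_search(query: str, k: int = 3) -> list[tuple[str, float]]:
    """Cosine-similarity retrieval over stored vectors (the retrieval step in RAG)."""
    q = embed(query)
    scored = [(p, sum(a * b for a, b in zip(q, v))) for p, v in vector_store.items()]
    return sorted(scored, key=lambda x: x[1], reverse=True)[:k]


if __name__ == "__main__":
    ingest_changed(Path("./data"))
    for doc, score in semantic_search("quarterly revenue summary"):
        print(f"{score:.3f}  {doc}")
```

In the ONTAP vision above, these stages would run inside the storage layer itself, keeping the vector index in sync with the data as it changes rather than requiring a separate copy-and-prepare step.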

“Companies of all sizes are experimenting with GenAI to increase efficiency and accelerate innovation,” said Krish Vitaldevara, senior vice president, Platform at NetApp. “NetApp enables companies to realize the full potential of GenAI to drive innovation and create value across multiple line-of-business applications. By providing a secure, scalable, and high-performance intelligent data infrastructure that integrates with other industry-leading platforms, NetApp helps customers overcome barriers to implementing GenAI. With these solutions, companies can use their data in GenAI applications faster and more efficiently and outperform their competitors.”

NetApp continues to innovate in the AI ecosystem:

  • Domino Data Lab Selects Amazon FSx for NetApp ONTAP: To advance the state of machine learning operations (MLOps), NetApp has partnered with Domino Data Lab, underscoring the importance of seamless integration into AI workflows. Starting today, Domino is using Amazon FSx for NetApp ONTAP as the underlying storage for Domino datasets running on the Domino Cloud Platform, providing cost-effective performance, scalability, and faster model development. In addition to Domino’s use of FSx for NetApp ONTAP, Domino and NetApp have also begun joint development to integrate Domino’s MLOps platform directly with NetApp ONTAP, simplifying data management for AI workloads (a minimal provisioning sketch follows this list).
  • AIPod with Lenovo for NVIDIA OVX General Availability: Announced in May 2024, the NetApp AIPod with Lenovo ThinkSystem servers for NVIDIA OVX converged infrastructure solution is now generally available. This infrastructure solution is designed for organizations looking to leverage generative AI and RAG capabilities to increase productivity, optimize operations, and unlock new revenue opportunities.
  • New features for FlexPod AI: NetApp is releasing new features for its FlexPod AI solution, the hybrid infrastructure and operations platform that accelerates the delivery of modern workloads. FlexPod AI with RAG simplifies, automates, and secures AI applications, enabling organizations to realize the full potential of their data. With Cisco compute, Cisco networking, and NetApp storage, customers benefit from lower costs, efficient scaling, faster time to value, and reduced risk.
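
Because the Domino integration runs on a managed AWS service, the storage side can be stood up with standard AWS tooling. The boto3 sketch below is a generic example (not Domino’s actual configuration) of provisioning an Amazon FSx for NetApp ONTAP file system, a storage virtual machine, and a volume that datasets could live on; the subnet ID, security-group ID, names, and sizes are placeholders.

```python
# Generic provisioning sketch with boto3; IDs, names, and sizes are placeholders,
# and this is not Domino's or NetApp's published configuration.
import time

import boto3

fsx = boto3.client("fsx", region_name="us-east-1")

# 1. Create the FSx for NetApp ONTAP file system (single-AZ, 1 TiB, 128 MBps).
fs = fsx.create_file_system(
    FileSystemType="ONTAP",
    StorageCapacity=1024,                        # GiB
    SubnetIds=["subnet-0123456789abcdef0"],      # placeholder subnet
    SecurityGroupIds=["sg-0123456789abcdef0"],   # placeholder security group
    OntapConfiguration={
        "DeploymentType": "SINGLE_AZ_1",
        "ThroughputCapacity": 128,               # MBps
    },
)["FileSystem"]

# 2. Wait for the file system to become AVAILABLE before adding an SVM.
while True:
    status = fsx.describe_file_systems(FileSystemIds=[fs["FileSystemId"]])
    if status["FileSystems"][0]["Lifecycle"] == "AVAILABLE":
        break
    time.sleep(30)

# 3. Create a storage virtual machine (SVM) to serve the data.
svm = fsx.create_storage_virtual_machine(
    FileSystemId=fs["FileSystemId"],
    Name="datasets-svm",
)["StorageVirtualMachine"]

# 4. Create an ONTAP volume for datasets, exported at /datasets.
fsx.create_volume(
    VolumeType="ONTAP",
    Name="datasets",
    OntapConfiguration={
        "StorageVirtualMachineId": svm["StorageVirtualMachineId"],
        "JunctionPath": "/datasets",
        "SizeInMegabytes": 102400,               # 100 GiB
        "StorageEfficiencyEnabled": True,        # deduplication and compression
    },
)
```

In a real deployment these steps would typically be expressed as infrastructure as code, and the resulting volume would then be mounted over NFS by the hosts that need the datasets.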

“Implementing AI requires a number of finely tuned pieces of technology infrastructure that must work perfectly together,” said Mike Leone, Practice Director, Data Analytics & AI, Enterprise Strategy Group, part of TechTarget. “NetApp offers robust storage and data management capabilities to help customers run and support their AI data pipelines. But storage is only one piece of the puzzle. By collaborating with other industry-leading AI infrastructure providers, NetApp customers can be confident that their compute, networking, storage and AI software solutions will integrate seamlessly to drive AI innovation.”

Source: Businesswire
