Friday, January 31, 2025

Oracle Cloud Infrastructure Utilized by Microsoft for Bing Conversational Search

Oracle announced a multi-year agreement with Microsoft to support the explosive growth of AI services. Microsoft is using Oracle Cloud Infrastructure (OCI) AI infrastructure, along with Microsoft Azure AI infrastructure, for inferencing of AI models that are being optimized to power Microsoft Bing conversational searches daily. Leveraging the Oracle Interconnect for Microsoft Azure, Microsoft is able to use managed services like Azure Kubernetes Service (AKS) to orchestrate OCI Compute at massive scale to support increasing demand for Bing conversational search.
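The announcement does not detail how that orchestration works, but as a rough illustration of driving GPU-backed inference capacity through the Kubernetes API from an AKS cluster, the short Python sketch below scales a Deployment up or down. The deployment name, namespace, and replica count are hypothetical placeholders, not details from Oracle or Microsoft.

    # Illustrative sketch only -- the announcement does not describe Microsoft's setup.
    # Assumes the active kubeconfig context points at an AKS cluster (e.g., after
    # `az aks get-credentials`) and that a Deployment named "bing-inference"
    # (hypothetical) fronts the GPU-backed inference workers.
    from kubernetes import client, config

    def scale_inference_workers(replicas: int,
                                deployment: str = "bing-inference",  # hypothetical name
                                namespace: str = "search") -> None:  # hypothetical namespace
        """Patch the Deployment's replica count via the Kubernetes scale subresource."""
        config.load_kube_config()  # reuse the local kubeconfig context
        apps = client.AppsV1Api()
        apps.patch_namespaced_deployment_scale(
            name=deployment,
            namespace=namespace,
            body={"spec": {"replicas": replicas}},
        )

    if __name__ == "__main__":
        scale_inference_workers(64)  # e.g., scale out ahead of a traffic peak

In practice, scaling at the level described in the announcement would be handled by autoscalers and controllers rather than a script; the point is only that a managed service like AKS exposes standard Kubernetes APIs for this kind of orchestration.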

Bing conversational search requires powerful clusters of computing infrastructure to support the evaluation and analysis of search results conducted by Bing’s inference model.

“Generative AI is a monumental technological leap and Oracle is enabling Microsoft and thousands of other businesses to build and run new products with our OCI AI capabilities,” said Karan Batta, senior vice president, Oracle Cloud Infrastructure. “By furthering our collaboration with Microsoft, we are able to help bring new experiences to more people around the world.”


“Microsoft Bing is leveraging the latest advancements in AI to provide a dramatically better search experience for people across the world,” said Divya Kumar, global head of marketing for Search & AI at Microsoft. “Our collaboration with Oracle and use of Oracle Cloud Infrastructure, along with our Microsoft Azure AI infrastructure, will expand access to customers and improve the speed of many of our search results.”

Inference models require thousands of compute and storage instances and tens of thousands of GPUs that can operate in parallel as a single supercomputer over a multi-terabit network.
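The article gives no implementation specifics, but the basic idea of running inference in parallel across many GPUs can be sketched at toy scale in Python with PyTorch. The model replication and round-robin batching below are illustrative assumptions; a production system of the scale described would sit behind dedicated serving and networking layers.

    # Toy illustration of data-parallel inference across local GPUs (requires CUDA devices).
    import copy
    import torch

    def replicate_across_gpus(model: torch.nn.Module) -> list[torch.nn.Module]:
        """Place one independent copy of the model on each visible GPU."""
        n = torch.cuda.device_count()
        assert n > 0, "this sketch assumes at least one CUDA device"
        return [copy.deepcopy(model).eval().to(f"cuda:{i}") for i in range(n)]

    @torch.no_grad()
    def parallel_infer(replicas: list[torch.nn.Module],
                       batches: list[torch.Tensor]) -> list[torch.Tensor]:
        """Round-robin batches over the replicas; asynchronous CUDA launches let
        work on different devices overlap."""
        outputs = []
        for i, batch in enumerate(batches):
            replica = replicas[i % len(replicas)]
            device = next(replica.parameters()).device
            outputs.append(replica(batch.to(device)))
        return outputs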

SOURCE: PRNewswire
