The new integration makes it easier for developers to access Claude models directly from their Anthropic account
Elastic, the Search AI Company, announced that the Elasticsearch Open Inference API now integrates with Anthropic, giving developers seamless access to Anthropic’s Claude models, including Claude 3.5 Sonnet, Claude 3 Haiku and Claude 3 Opus, directly from their Anthropic account.
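As a rough illustration of what this looks like in practice, the sketch below registers a Claude-backed completion endpoint through the inference API and then calls it. The cluster URL, credentials, endpoint name, and the exact service and task settings shown here are assumptions for illustration, not details from the announcement; the Elasticsearch inference API documentation has the authoritative schema.

```python
import requests

ES_URL = "https://localhost:9200"     # assumed local Elasticsearch cluster
ES_AUTH = ("elastic", "changeme")     # placeholder credentials
ANTHROPIC_API_KEY = "sk-ant-..."      # API key from the developer's Anthropic account

# Register a completion endpoint backed by Claude 3.5 Sonnet.
# Service name, settings fields, and model identifier are illustrative.
endpoint_config = {
    "service": "anthropic",
    "service_settings": {
        "api_key": ANTHROPIC_API_KEY,
        "model_id": "claude-3-5-sonnet-20240620",
    },
    "task_settings": {"max_tokens": 1024},
}

resp = requests.put(
    f"{ES_URL}/_inference/completion/claude-completion",
    json=endpoint_config,
    auth=ES_AUTH,
    verify=False,
)
resp.raise_for_status()

# Run an inference request against the newly registered endpoint.
answer = requests.post(
    f"{ES_URL}/_inference/completion/claude-completion",
    json={"input": "Summarize the key themes in this quarter's support tickets."},
    auth=ES_AUTH,
    verify=False,
)
print(answer.json())
```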
“The integration of Claude with Elasticsearch Open Inference API allows engineers to analyze proprietary data in real time and generate important context like signals, business insights, or metadata with our frontier model family,” said Michael Gerstenhaber, vice president, Product at Anthropic. “Supporting inference during ingestion pipelines provides more flexibility for users, particularly with features that generate and store answers to frequently asked questions to minimize latency and cost. This integration will help our common customers build efficient, reliable and beneficial AI applications.”
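To show how inference during ingestion might be wired up, the following sketch defines an ingest pipeline that calls the Claude-backed endpoint registered above while documents are indexed, storing a generated answer alongside each question so it can be served later without another model call. The pipeline name, index name, field names, and processor options are assumptions for illustration only.

```python
import requests

ES_URL = "https://localhost:9200"     # assumed local Elasticsearch cluster
ES_AUTH = ("elastic", "changeme")     # placeholder credentials

# Ingest pipeline that runs the "claude-completion" endpoint on each document
# at index time and writes the result to a separate field.
pipeline = {
    "description": "Generate and store an answer for each incoming question",
    "processors": [
        {
            "inference": {
                "model_id": "claude-completion",
                "input_output": {
                    "input_field": "question",
                    "output_field": "generated_answer",
                },
            }
        }
    ],
}

resp = requests.put(
    f"{ES_URL}/_ingest/pipeline/faq-answers",
    json=pipeline,
    auth=ES_AUTH,
    verify=False,
)
resp.raise_for_status()

# Index a document through the pipeline; the stored answer avoids a
# per-query model call, trading a one-time ingest cost for lower latency.
requests.post(
    f"{ES_URL}/my-faq-index/_doc?pipeline=faq-answers",
    json={"question": "How do I rotate my API keys?"},
    auth=ES_AUTH,
    verify=False,
)
```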
“The pace and sophistication of Anthropic’s innovation in building reliable AI systems is inspiring,” said Shay Banon, founder and chief technology officer at Elastic. “Anthropic’s Claude models are a welcome addition to the simple and powerful abstraction the Elasticsearch Open Inference API provides for developers.”
Elastic, the Search AI Company, enables everyone to find the answers they need in real time, using all their data, at scale. Elastic’s solutions for search, observability and security are built on the Elastic Search AI Platform, the development platform used by thousands of companies.
Source: Elastic