Elasticsearch Open Inference API and Playground Now Support Amazon Bedrock
Developers now have more LLMs to choose from when iterating and building production-ready RAG applications
Posted: Friday, Jul 12

Elastic announced support for Amazon Bedrock-hosted models in the Elasticsearch Open Inference API and Playground. Developers now have the flexibility to choose any large language model (LLM) available on Amazon Bedrock to build production-ready RAG applications.
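For developers who want a sense of what this looks like in practice, the sketch below shows how an Amazon Bedrock-hosted model might be registered as a completion endpoint through the Open Inference API and then invoked. It is a minimal illustration only: the cluster URL, API key, AWS credentials, and model ID are placeholders, and the service settings shown (access_key, secret_key, region, provider, model) should be checked against the Inference API documentation for your Elasticsearch version.

    import requests

    ES_URL = "http://localhost:9200"  # placeholder cluster URL
    HEADERS = {
        "Authorization": "ApiKey <elasticsearch-api-key>",  # placeholder credentials
        "Content-Type": "application/json",
    }

    # Register a completion endpoint backed by an Amazon Bedrock-hosted model.
    resp = requests.put(
        f"{ES_URL}/_inference/completion/bedrock-completion",
        headers=HEADERS,
        json={
            "service": "amazonbedrock",
            "service_settings": {
                "access_key": "<aws-access-key>",
                "secret_key": "<aws-secret-key>",
                "region": "us-east-1",
                "provider": "anthropic",
                "model": "anthropic.claude-3-haiku-20240307-v1:0",
            },
        },
    )
    resp.raise_for_status()

    # Call the endpoint with a prompt.
    answer = requests.post(
        f"{ES_URL}/_inference/completion/bedrock-completion",
        headers=HEADERS,
        json={"input": "Summarise our return policy in one sentence."},
    )
    print(answer.json())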

"Our latest integration with Amazon Bedrock continues our focus on making it easier for AWS developers to build next-generation search experiences," said Shay Banon, founder and chief technology officer at Elastic. "By leveraging Elasticsearch and Amazon Bedrock's extensive model library, developers can deliver transformative conversational search."

Developers using Elasticsearch and models hosted on Amazon Bedrock can now store and use embeddings, refine retrieval to ground answers with proprietary data, and more. Amazon Bedrock models are also available in the low-code Playground experience, giving developers more choice when A/B testing LLMs.
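As an illustration of the embedding workflow, the sketch below registers a Bedrock-backed text-embedding endpoint and maps it to a semantic_text field so embeddings are generated and stored automatically at index time, then retrieves grounded context with a semantic query. This assumes an Elasticsearch version recent enough to support the semantic_text field type and semantic query; the index name, model ID, and credentials are placeholders.

    import requests

    ES_URL = "http://localhost:9200"  # placeholder cluster URL
    HEADERS = {
        "Authorization": "ApiKey <elasticsearch-api-key>",  # placeholder credentials
        "Content-Type": "application/json",
    }

    # Register a text_embedding endpoint backed by an Amazon Titan embedding model.
    requests.put(
        f"{ES_URL}/_inference/text_embedding/bedrock-embeddings",
        headers=HEADERS,
        json={
            "service": "amazonbedrock",
            "service_settings": {
                "access_key": "<aws-access-key>",
                "secret_key": "<aws-secret-key>",
                "region": "us-east-1",
                "provider": "amazontitan",
                "model": "amazon.titan-embed-text-v2:0",
            },
        },
    ).raise_for_status()

    # Map a semantic_text field to the endpoint so embeddings are generated
    # and stored automatically when documents are indexed.
    requests.put(
        f"{ES_URL}/product-docs",
        headers=HEADERS,
        json={
            "mappings": {
                "properties": {
                    "content": {
                        "type": "semantic_text",
                        "inference_id": "bedrock-embeddings",
                    }
                }
            }
        },
    ).raise_for_status()

    # Retrieve grounded context for a RAG prompt with a semantic query.
    hits = requests.post(
        f"{ES_URL}/product-docs/_search",
        headers=HEADERS,
        json={
            "query": {
                "semantic": {"field": "content", "query": "How do refunds work?"}
            }
        },
    )
    print(hits.json())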

Support for Amazon Bedrock is available today; read the Inference API and Playground blogs to get started.
