Latest collaboration continues the companies’ work to unlock developers’ ability to create innovative AI-driven applications and experiences
LangChain’s OpenGPTs, an open-source initiative, introduces a more flexible approach to generative AI. It allows users to choose their models, control data retrieval, and manage where their data is stored. Integrated with LangSmith for advanced debugging, logging, and monitoring, OpenGPTs offers a uniquely user-controlled experience. “The OpenGPTs project is bringing the same ideas of an agent to open source but allowing for more control over what model you use, how you do retrieval, and where your data is stored,” said Harrison Chase, Co-Founder and CEO of LangChain.

Redis Cloud is foundational to OpenGPTs, serving its persistent storage needs. Chase adds, “We’re using Redis Cloud for everything persistent in OpenGPTs, including as a vector store for retrieval and a database to store messages and agent configurations. The fact that you can do all of those in one database from Redis is really appealing.”
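Chase’s description translates roughly into the following minimal sketch of one Redis database covering all three persistence roles. It assumes the `langchain_community` Redis integrations and the `langchain_openai` embedding package (with `OPENAI_API_KEY` set); the connection string, index name, and keys are placeholders, not OpenGPTs’ actual configuration.

```python
import json

import redis
from langchain_community.chat_message_histories import RedisChatMessageHistory
from langchain_community.vectorstores import Redis as RedisVectorStore
from langchain_openai import OpenAIEmbeddings

# Placeholder connection string for a Redis Cloud database.
REDIS_URL = "redis://default:<password>@<redis-cloud-host>:6379"

# 1) Vector store for retrieval.
vectorstore = RedisVectorStore.from_texts(
    texts=["Redis Cloud handles everything persistent in OpenGPTs."],
    embedding=OpenAIEmbeddings(),
    redis_url=REDIS_URL,
    index_name="opengpts-docs",
)
retriever = vectorstore.as_retriever()

# 2) Conversation messages, keyed by a thread/session id.
history = RedisChatMessageHistory(session_id="thread-123", url=REDIS_URL)
history.add_user_message("What does OpenGPTs keep in Redis?")

# 3) Assistant/agent configuration stored as JSON in the same database.
client = redis.Redis.from_url(REDIS_URL)
client.set("assistant:example:config", json.dumps({"model": "gpt-4", "retrieval": True}))
```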
“OpenGPTs is a wonderful example of the kind of AI applications developers can build using Redis Cloud to solve challenges like retrieval, conversational LLM memory, and semantic caching,” said Yiftach Shoolman, Co-Founder and Chief Technology Officer of Redis. “This great development by LangChain shows how our customers can address these pain points within one solution at real-time speed that is also cost-effective. We’re working across the AI ecosystem to support up-and-coming startups like LangChain to drive forward the opportunity generative AI offers the industry.”
Redis Cloud: Enhancing OpenGPTs’ Capabilities
Redis Cloud offers several advantages for the OpenGPTs project, including:
- Versatility: Redis offers multi-model data structure support and efficient processing capabilities, adding significant value to LangChain’s OpenGPTs project.
- Performance: Redis’ renowned data handling performance ensures scalability and speed, qualities essential in the fast-paced world of real-time AI applications.
- Customization and Control: This partnership reflects a growing trend in AI application development towards more tailored and user-defined applications.
- Trust in Redis: LangChain’s choice of Redis for a critical role in OpenGPTs demonstrates Redis’ proven enterprise-hardened capabilities and relevance in supporting advanced AI projects.
Innovative Integrations: Redis and LangChain
This collaboration is an important step for Redis and LangChain. It not only demonstrates Redis Cloud’s adaptability and robustness but also encourages the AI community to explore new avenues in AI application development. The OpenGPTs project builds on Redis’ long-standing partnership with LangChain, which already includes integrations of Redis as a vector store, semantic cache, and conversational memory.
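For instance, the semantic cache integration mentioned above can be enabled globally in a few lines. This is a hedged sketch, assuming the `langchain_community` cache module, an OpenAI embedding model, and a placeholder Redis URL:

```python
from langchain.globals import set_llm_cache
from langchain_community.cache import RedisSemanticCache
from langchain_openai import OpenAIEmbeddings

# Prompts that are semantically similar to a previously answered one are served
# from Redis instead of triggering another LLM call.
set_llm_cache(
    RedisSemanticCache(
        redis_url="redis://localhost:6379",  # placeholder URL
        embedding=OpenAIEmbeddings(),
    )
)
```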
Redis is known for being easy to use and for simplifying the developer experience. Redis and LangChain are making it even easier to build AI-powered apps with LangChain Templates. The RAG template, powered by Redis’ vector search and OpenAI, helps developers build and deploy a chatbot application over, for example, a set of public-company financial PDFs. RAG supplies large language models (LLMs) with the relevant, domain-specific external data they need to ground their responses, producing reliable, fast, and accurate answers. LangChain Templates integrate directly with another new LangChain offering, LangServe, which makes it possible to quickly ship these services as a REST API backed by FastAPI.
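As a rough illustration of how such a template is exposed over REST, here is a hedged sketch of a LangServe/FastAPI server. It assumes the rag-redis template package has been added to the app (for example via `langchain app new my-app --package rag-redis`) and that its chain is importable under the path shown, which is an assumption rather than a guarantee.

```python
from fastapi import FastAPI
from langserve import add_routes

# The chain import path below follows the rag-redis template layout and is an
# assumption; adjust it to match the package pulled into your app.
from rag_redis.chain import chain as rag_redis_chain

app = FastAPI(title="Financial PDF RAG", version="0.1.0")

# Exposes /rag-redis/invoke, /rag-redis/batch, and /rag-redis/stream endpoints.
add_routes(app, rag_redis_chain, path="/rag-redis")

# Run locally with, e.g.: uvicorn server:app --reload
```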
SOURCE: BusinessWire