
Couchbase to pioneer vector search in AI software industry

Mon, 4th Mar 2024

Couchbase, the cloud database platform provider, has announced plans to introduce vector search as a new feature across its product range. The company describes the move as an industry first and says it could reshape the AI application landscape by allowing such applications to run anywhere: on-premises, across clouds, on mobile devices, and on IoT devices at the edge of the network.

The firm has added vector search to its Couchbase Capella Database-as-a-Service (DBaaS) and Couchbase Server to help enterprises accelerate development of AI-powered adaptive applications: software that engages users in hyper-personalised, context-aware interactions and is gaining prominence in the market.

Scott Anderson, SVP of product management and business operations at Couchbase, celebrated the groundbreaking move, stating: "Adding vector search to our platform is the next step in enabling our customers to build a new wave of adaptive applications, and our ability to bring vector search from cloud to edge is game-changing. Couchbase is seizing this moment, bringing together vector search and real-time data analysis on the same platform."

Use cases for adaptive applications powered by vector search span a range of sectors. A shopper could, for example, upload a photo of an outfit and narrow an online search for complementary products by brand, customer rating, price range, and availability in a particular geographical area. Couchbase positions these hyper-personalised, high-performing adaptive applications as the future of AI development.

The addition of vector search lets businesses consolidate workloads onto a single cloud database platform, simplifying architecture and speeding the development of trustworthy adaptive applications. It also enables hybrid search, which combines text, vector, range, and geospatial queries in one place; because all of these search patterns are served from a single index, response latency is reduced.
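To make the hybrid idea concrete, the sketch below combines a vector similarity ranking with range (price) and geospatial (distance) filters over a toy in-memory catalogue. It is purely illustrative Python and does not use Couchbase's SDK; the Product records, haversine filter, and brute-force cosine ranking are assumptions standing in for what a single hybrid index would do server-side.

```python
from dataclasses import dataclass
from math import radians, sin, cos, asin, sqrt

import numpy as np


@dataclass
class Product:
    name: str
    embedding: np.ndarray   # vector from an image/text embedding model (toy data here)
    price: float
    lat: float
    lon: float


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def haversine_km(lat1, lon1, lat2, lon2) -> float:
    # Great-circle distance, used for the geospatial filter.
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    h = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(h))


def hybrid_search(query_vec, products, max_price, centre, radius_km, top_k=3):
    # Range and geospatial predicates narrow the candidates,
    # then vector similarity ranks whatever remains.
    candidates = [
        p for p in products
        if p.price <= max_price
        and haversine_km(p.lat, p.lon, *centre) <= radius_km
    ]
    return sorted(candidates,
                  key=lambda p: cosine_similarity(query_vec, p.embedding),
                  reverse=True)[:top_k]


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    catalogue = [
        Product(f"item-{i}", rng.normal(size=8), price=20 + i * 15,
                lat=51.5 + rng.normal(scale=0.2), lon=-0.1 + rng.normal(scale=0.2))
        for i in range(10)
    ]
    query = rng.normal(size=8)  # stands in for the embedding of an uploaded photo
    for hit in hybrid_search(query, catalogue, max_price=80,
                             centre=(51.5, -0.1), radius_km=30):
        print(hit.name, round(hit.price, 2))
```

In a production system the filtering and ranking would happen inside the database's index rather than in application code; the point of the sketch is only to show how vector, range, and geospatial criteria combine in one query.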

Couchbase also plans to extend its AI partner ecosystem with LangChain and LlamaIndex integrations, further boosting developer productivity. The integrations are intended to speed up query prompt assembly, improve response validation, and simplify the building of retrieval-augmented generation (RAG) applications.

"Retrieval has become the predominant way to combine data with large language models. Many LLM-driven applications demand user-specific data beyond the model's training dataset, relying on robust databases to feed in supplementary data and context from different sources. Our integration with Couchbase provides customers another powerful database option for vector store so they can more easily build AI applications," said Harrison Chase, CEO and co-founder of LangChain.

Doug Henschen, vice president and principal analyst at Constellation Research, comments on the industry shift: "The next generation of apps will be incredibly advanced as organisations put AI in the driver's seat of their innovation." Henschen also highlights the pivotal role of companies like Couchbase in providing the necessary infrastructure and reducing complexity, furthering the development of adaptive applications.

Couchbase, affirming its efforts to lead industry innovation, plans to make the new capabilities available in the first quarter of its fiscal year 2025 across Capella and Couchbase Server, with mobile and edge support in beta.
