Last year, Google Cloud and LangChain announced integrations that give generative AI developers access to a suite of LangChain Python packages. This allowed application developers to leverage Google Cloud’s database portfolio in their gen AI applications to drive the most value from their private data.
Today, we are expanding language support for our integrations to include Go, Java, and JavaScript.
Each package will have up to three LangChain integrations:
Vector stores to enable semantic search for our databases
Chat message history to enable chains to recall previous conversations
Document loader for loading documents from your enterprise data
Developers now have the flexibility to create intricate workflows and easily interchange underlying components (like a vector database) as needed to align with specific use cases. This technology unlocks a variety of applications, including personalized product recommendations, question answering, document search and synthesis, customer service automation, and more.
In this post, we’ll share more about the integrations – and code snippets to get started.
LangChain is known for its popular Python package; however, your team’s expertise and services may not be in Python. Java and Go are commonly used programming languages for production-grade and enterprise-scale applications. Developers may prefer JavaScript and TypeScript for their asynchronous programming support and compatibility with front-end frameworks like React and Vue.
In addition to Python developers, the LangChain developer community encompasses developers proficient in Java, JavaScript, and Go. It is an active and supportive community centered around the LangChain framework, which facilitates the development of applications powered by large language models (LLMs).
Google Cloud is dedicated to providing secure, easy-to-use database integrations for your gen AI applications. Our integrations embed Google Cloud connectors that create secure connections, handle SSL certificates, and support IAM authorization and authentication. The integrations are optimized for PostgreSQL databases (AlloyDB for PostgreSQL, AlloyDB Omni, Cloud SQL for PostgreSQL) to ensure proper connection management, flexible table schemas, and improved filtering.
JavaScript support
JavaScript developers can utilize LangChain.js, which provides tools and building blocks for developing applications leveraging LLMs. LangChain simplifies the process of connecting LLMs to external data sources and enables reasoning capabilities in applications. Other Google Cloud integrations, such as Gemini models, are available within LangChain.js, allowing seamless interaction with Google Cloud resources.
| Resources (Cloud SQL for PostgreSQL support only) | Links |
| --- | --- |
| Documentation | |
| How-to guides | |
| Quick start guide | |
| GitHub | |
Below are the integrations and their code snippets to get started.
Install the dependency:
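A minimal install command; the package name below matches the LangChain.js Cloud SQL for PostgreSQL integration referenced in the documentation links above, so double-check it against the linked docs:

```bash
npm install @langchain/google-cloud-sql-pg
```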
Engine
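A minimal sketch of creating a connection engine for a Cloud SQL for PostgreSQL instance. The class and method names (`PostgresEngine.fromInstance`) mirror the Python package’s naming and are assumptions; the project, region, instance, and database values are placeholders.

```javascript
import { PostgresEngine } from "@langchain/google-cloud-sql-pg";

// Create a secure connection pool to a Cloud SQL for PostgreSQL instance.
// The engine handles connectivity and IAM-based authentication for you.
const engine = await PostgresEngine.fromInstance(
  "my-project-id", // Google Cloud project (placeholder)
  "us-central1",   // region (placeholder)
  "my-instance",   // Cloud SQL instance name (placeholder)
  "my-database"    // database name (placeholder)
);
```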
Use this package with AlloyDB for PostgreSQL and AlloyDB Omni by customizing your Engine to connect to your instance. You will need the AlloyDB Auth Proxy to make authorized, encrypted connections to AlloyDB instances.
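For example, you can run the AlloyDB Auth Proxy locally and point your Engine at it (the instance URI values below are placeholders):

```bash
# Start the AlloyDB Auth Proxy; it listens on 127.0.0.1:5432 by default.
./alloydb-auth-proxy \
  "projects/my-project-id/locations/us-central1/clusters/my-cluster/instances/my-instance"
```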
Vector store
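A minimal sketch of creating a vector store table and running a similarity search, continuing from the Engine snippet above. The table-initialization and factory method names (`initVectorstoreTable`, `PostgresVectorStore.initialize`) follow the Python package’s naming and are assumptions; check the how-to guides for exact signatures.

```javascript
import { PostgresVectorStore } from "@langchain/google-cloud-sql-pg";
import { VertexAIEmbeddings } from "@langchain/google-vertexai";

// Create a table sized for the embedding model's output dimension.
await engine.initVectorstoreTable("my_vector_table", 768);

const vectorStore = await PostgresVectorStore.initialize(
  engine,
  new VertexAIEmbeddings({ model: "text-embedding-004" }),
  "my_vector_table"
);

// Add documents, then run a semantic search over them.
await vectorStore.addDocuments([
  { pageContent: "LangChain.js now supports Cloud SQL for PostgreSQL", metadata: { source: "blog" } },
]);

const results = await vectorStore.similaritySearch("Which databases does LangChain.js support?", 4);
```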
Chat message history
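A minimal sketch of persisting conversation history, again assuming names that mirror the Python package (`initChatHistoryTable`, `PostgresChatMessageHistory.initialize`); the session ID and table name are placeholders.

```javascript
import { PostgresChatMessageHistory } from "@langchain/google-cloud-sql-pg";
import { AIMessage, HumanMessage } from "@langchain/core/messages";

// Create the backing table, then record and read back a conversation.
await engine.initChatHistoryTable("chat_history");

const history = await PostgresChatMessageHistory.initialize(
  engine,
  "session-id-123", // conversation/session identifier (placeholder)
  "chat_history"
);

await history.addMessage(new HumanMessage("Hi! What can you do?"));
await history.addMessage(new AIMessage("I can answer questions about your data."));
const messages = await history.getMessages();
```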
Loader
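A minimal sketch of loading rows from an existing table as LangChain documents. The `PostgresLoader` options shown here (content and metadata columns) mirror the Python package and are assumptions; the table and column names are placeholders.

```javascript
import { PostgresLoader } from "@langchain/google-cloud-sql-pg";

const loader = await PostgresLoader.initialize(engine, {
  tableName: "products",             // existing table (placeholder)
  contentColumns: ["description"],   // columns used as document content
  metadataColumns: ["id", "price"],  // columns stored as document metadata
});

const docs = await loader.load();
```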
Java support
For Java developers, there’s LangChain4j, a Java implementation of LangChain that lets them build LLM-powered applications within a familiar ecosystem. In LangChain4j, you can also access the full array of Vertex AI Gemini models.
*Note: Cloud SQL integrations will be released soon.
Below are the integrations and their code snippets to get started.
For Maven in pom.xml:
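A sketch of the dependency block. The coordinates below are an assumption; copy the exact groupId, artifactId, and version from the LangChain4j documentation or repository.

```xml
<!-- Coordinates are illustrative; use the values published by the project. -->
<dependency>
  <groupId>dev.langchain4j</groupId>
  <artifactId>langchain4j-alloydb-pg</artifactId>
  <version>${langchain4j.version}</version>
</dependency>
```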
Engine
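A minimal sketch of building a connection engine for an AlloyDB instance. The `AlloyDBEngine` class, its package, and the builder methods are assumptions modeled on the other language integrations; all resource identifiers are placeholders.

```java
// Package path and builder methods are assumptions; see the integration's Javadoc.
import dev.langchain4j.engine.AlloyDBEngine;

AlloyDBEngine engine = new AlloyDBEngine.Builder()
    .projectId("my-project-id") // Google Cloud project (placeholder)
    .region("us-central1")      // region (placeholder)
    .cluster("my-cluster")      // AlloyDB cluster (placeholder)
    .instance("my-instance")    // AlloyDB instance (placeholder)
    .database("my-database")    // database name (placeholder)
    .build();
```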
Embedding store
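A minimal sketch of storing and searching embeddings, continuing from the Engine snippet above. `AlloyDBEmbeddingStore` and its package are assumptions; `EmbeddingStore`, `TextSegment`, `Embedding`, and `EmbeddingMatch` are standard LangChain4j types, and `embeddingModel` stands in for any LangChain4j EmbeddingModel (for example, a Vertex AI embedding model).

```java
import java.util.List;

import dev.langchain4j.data.embedding.Embedding;
import dev.langchain4j.data.segment.TextSegment;
import dev.langchain4j.store.embedding.EmbeddingMatch;
import dev.langchain4j.store.embedding.EmbeddingStore;
// The AlloyDB store class and package are assumptions; check the integration's Javadoc.
import dev.langchain4j.store.embedding.alloydb.AlloyDBEmbeddingStore;

// `engine` comes from the Engine snippet above.
EmbeddingStore<TextSegment> store = new AlloyDBEmbeddingStore.Builder(engine, "my_vector_table")
    .build();

// Embed a text segment and store it alongside its embedding.
TextSegment segment = TextSegment.from("LangChain4j now supports AlloyDB for PostgreSQL");
Embedding embedding = embeddingModel.embed(segment).content();
store.add(embedding, segment);

// Semantic search: embed the query and retrieve the closest matches.
Embedding query = embeddingModel.embed("Which databases does LangChain4j support?").content();
List<EmbeddingMatch<TextSegment>> matches = store.findRelevant(query, 3);
```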
Document loader
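A minimal sketch of loading rows from a table as LangChain4j `Document`s; the loader class, its package, and its builder options are assumptions, and the table and column names are placeholders.

```java
import java.util.List;

import dev.langchain4j.data.document.Document;
// Loader class and package are assumptions; check the integration's Javadoc.
import dev.langchain4j.data.document.loader.alloydb.AlloyDBLoader;

AlloyDBLoader loader = new AlloyDBLoader.Builder(engine)
    .tableName("products")                  // existing table (placeholder)
    .contentColumns(List.of("description")) // columns used as document content
    .build();

List<Document> documents = loader.load();
```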
Go support
LangchainGo is the Go programming language port of LangChain.
The LangChain framework was designed to support the development of sophisticated applications that connect language models to data sources and enable interaction with their environment. The most powerful and differentiated applications go beyond simply using a language model via an API; they are data-aware and agentic.
Last year, Google’s SDKs were added as providers for LangchainGo, making it possible to use the capabilities of the LangChain framework with Google’s Gemini models as LLM providers.
We now have AlloyDB and Cloud SQL for PostgreSQL support in LangchainGo.
| Resources | Links |
| --- | --- |
| How-to guides | |
| Quick start guides | |
| GitHub | |
Below are the integrations and their code snippets to get started.
Install the dependency:
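The module path below is LangchainGo’s main module; whether the AlloyDB and Cloud SQL packages ship inside it or as separate modules is covered in the linked guides, so treat this as a starting point:

```bash
go get github.com/tmc/langchaingo
```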
Engine
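A minimal sketch of creating a `PostgresEngine` for an AlloyDB instance. The `alloydbutil` package path and its option functions are assumptions based on the integration’s naming; all resource names and credentials are placeholders.

```go
package main

import (
	"context"
	"log"

	// Package path is an assumption; see the linked how-to guides.
	"github.com/tmc/langchaingo/util/alloydbutil"
)

func main() {
	ctx := context.Background()

	// Build a secure connection pool to an AlloyDB for PostgreSQL instance.
	engine, err := alloydbutil.NewPostgresEngine(ctx,
		alloydbutil.WithAlloyDBInstance("my-project-id", "us-central1", "my-cluster", "my-instance"),
		alloydbutil.WithDatabaseName("my-database"),
		alloydbutil.WithUser("postgres"),
		alloydbutil.WithPassword("my-password"),
	)
	if err != nil {
		log.Fatal(err)
	}
	defer engine.Close()
}
```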
Vector Store
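A minimal sketch of creating a vector store table, adding documents, and searching, continuing from the Engine example above. The `alloydb` vector store package and the `InitVectorstoreTable` helper are assumptions; `AddDocuments` and `SimilaritySearch` follow LangchainGo’s standard vector store interface, and `embedder` stands in for any `embeddings.Embedder`.

```go
import (
	"github.com/tmc/langchaingo/schema"
	// Package paths and helper names are assumptions; see the linked guides.
	"github.com/tmc/langchaingo/util/alloydbutil"
	"github.com/tmc/langchaingo/vectorstores/alloydb"
)

// Create a table sized for the embedding model's output dimension.
err = engine.InitVectorstoreTable(ctx, alloydbutil.VectorstoreTableOptions{
	TableName:  "my_vector_table",
	VectorSize: 768,
})

// `embedder` is any embeddings.Embedder built with LangchainGo's embeddings package.
store, err := alloydb.NewVectorStore(engine, embedder, "my_vector_table")

// Add documents, then run a semantic search over them (handle err after each call).
_, err = store.AddDocuments(ctx, []schema.Document{
	{PageContent: "LangchainGo now supports AlloyDB for PostgreSQL"},
})

docs, err := store.SimilaritySearch(ctx, "Which databases does LangchainGo support?", 4)
```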
Chat message history
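A minimal sketch of persisting chat history, continuing from the Engine example. The `alloydb` chat history package and the `InitChatHistoryTable` helper are assumptions, while `AddUserMessage`, `AddAIMessage`, and `Messages` follow LangchainGo’s chat message history interface.

```go
import (
	// Package path is an assumption; see the linked guides.
	alloydbchat "github.com/tmc/langchaingo/memory/alloydb"
)

// Create the backing table, then record and read back a conversation
// (handle err after each call).
err = engine.InitChatHistoryTable(ctx, "chat_history")

history, err := alloydbchat.NewChatMessageHistory(ctx, engine, "chat_history", "session-id-123")

err = history.AddUserMessage(ctx, "Hi! What can you do?")
err = history.AddAIMessage(ctx, "I can answer questions about your data.")

messages, err := history.Messages(ctx)
```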
*Note: Code is shown for AlloyDB. See the links above for Cloud SQL for PostgreSQL examples.
The LangChain Vector stores integration is available for Google Cloud databases with vector support, including AlloyDB, Cloud SQL for PostgreSQL, Firestore, Memorystore for Redis, and Spanner.
The Document loaders and Memory integrations are available for all Google Cloud databases, including AlloyDB; Cloud SQL for MySQL, PostgreSQL, and SQL Server; Firestore; Datastore; Bigtable; Memorystore for Redis; El Carro for Oracle databases; and Spanner. Below are a few resources to get started.
Resources: