In our new age of low-code and no-code application development, AI has become the tool of choice for rapidly extending, powering, and modernizing applications. With an ever-shifting technology landscape bringing new ways to connect and engage with customers, or to infuse insights into existing experiences, leading organizations are racing to build new applications faster. Whether it’s to embrace generative AI technologies or to maintain a competitive advantage, AI-infused application development is quickly becoming a necessity to compete in today’s market.
In this blog, we will discuss how to use Gradio, an open-source frontend framework, with Vertex AI Conversation. Vertex AI Conversation lets developers with limited machine learning skills tap into the power of conversational AI technologies and quickly develop gen AI proof-of-concept (PoC) applications. With these two tools, organizations can deploy a PoC with an engaging, low-lift generative AI experience that wows your customers and inspires your development team.
Gen AI-powered chatbots can hold powerful, relevant conversations by learning from your company’s own unstructured data. The Gradio front-end framework provides an intuitive interface for building custom, interactive applications, allowing developers to easily share and demo ML models.
One of the Gradio framework’s main capabilities is creating demo apps on top of your models with a friendly web interface, so that anyone can use them and give your organization immediate feedback. Integrating a Gradio app with a generative AI agent built on Vertex AI Conversation unlocks key features that you can tweak and tune to your individual needs and to feedback from users. Using the power of programmability, you can drive deep personalization and contextualization into your chatbot’s conversations with your customers using your organization’s data, and demo the results rapidly.
With the unprecedented boom in generative AI, businesses need an accessible, seamless interface to validate their machine learning models, APIs, or data science workflows. Chatbots are a popular application of large language models (LLMs). Because interaction with LLMs feels natural and intuitive, businesses are turning to conversational interfaces such as voice-activated chatbots, or voice bots. Voice bots are gaining popularity because of the convenience they bring; it’s much easier to speak than to type.
Gradio is an open-source Python framework that makes it easy to build quick interfaces like chatbots, voice-activated bots, and even full-fledged web applications to share your machine learning model, API or data science workflow with clients or collaborators. With Gradio, you can build quick demos and share them, all in Python with just a few lines of code. You can learn more about Gradio here.
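As an illustration of how little code a Gradio demo needs, here is a minimal sketch, assuming Gradio is installed (pip install gradio); the echo-style respond function is a placeholder you would swap for a call to your own model or agent.

```python
import gradio as gr

# Placeholder response function; in a real app this would call your
# model, API, or agent instead of echoing the user's message.
def respond(message, history):
    return f"You said: {message}"

# ChatInterface wraps the function above in a ready-made chat UI.
demo = gr.ChatInterface(fn=respond, title="Quick Gradio demo")

if __name__ == "__main__":
    # launch() serves the app locally; share=True would create a shareable public link.
    demo.launch()
```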
Vertex AI Conversation’s data ingestion tools parse your content to create a virtual agent powered by LLMs. Your agent can then generate conversations using your organization’s data to provide a contextual and personal interaction with end-users. Seamless deployment through a web browser means demonstrating your application is easier than ever with the Gradio framework.
Gradio can be used to build chatbots that answer user questions using a variety of data sources. To do this, you can build a middleware layer that uses Vertex AI Conversation to process the user’s input and generate a response from an agent. The agent can then search for answers in a data store of documents, such as your company’s knowledge base.
When the agent finds an answer, it can summarize it and present it to the user in the Gradio app. The agent can also provide links to the sources of the answer so that the user can learn more.
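To make this concrete, here is a minimal sketch of such a middleware call, assuming the agent is reached through the Dialogflow CX API that backs Vertex AI Conversation agents; the project, location, and agent IDs are placeholders, and the exact fields carrying source links depend on how your agent and data store are configured.

```python
import uuid

from google.cloud import dialogflowcx_v3 as cx

# Placeholder identifiers -- replace with your own project, region, and agent.
PROJECT_ID = "your-project-id"
LOCATION = "global"
AGENT_ID = "your-agent-id"


def ask_agent(user_text, session_id=None):
    """Send a user message to the Vertex AI Conversation (Dialogflow CX) agent
    and return the agent's text response."""
    session_id = session_id or uuid.uuid4().hex

    # Regional agents need a regional endpoint, e.g. us-central1-dialogflow.googleapis.com.
    client_options = None
    if LOCATION != "global":
        client_options = {"api_endpoint": f"{LOCATION}-dialogflow.googleapis.com"}
    client = cx.SessionsClient(client_options=client_options)

    session_path = client.session_path(PROJECT_ID, LOCATION, AGENT_ID, session_id)
    request = cx.DetectIntentRequest(
        session=session_path,
        query_input=cx.QueryInput(
            text=cx.TextInput(text=user_text),
            language_code="en",
        ),
    )
    response = client.detect_intent(request=request)

    # Concatenate the agent's text replies; source links, when enabled for your
    # agent, are also returned in the query result and can be surfaced alongside.
    replies = [
        " ".join(msg.text.text)
        for msg in response.query_result.response_messages
        if msg.text.text
    ]
    return "\n".join(replies)
```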
Here is a more detailed explanation of each step:
1. The user sends a text or voice message through the Gradio interface.
2. The middleware forwards the message to the Vertex AI Conversation agent.
3. The agent searches the data store of documents for relevant content.
4. The agent summarizes the answer and returns it along with links to its sources.
5. The Gradio app displays the response and the source links to the user.
The following diagram shows a high-level architecture that can be used as a foundational building block for an MVP with core functionality.
The chatbot architecture consists of the following components: the Gradio front end that users interact with, the middleware layer that relays requests, the Vertex AI Conversation agent that generates responses, and the data store holding your organization’s documents.
Once the application is launched, the interface will look like the image below.
Record from microphone: Allows users to send voice-based messages.
Start a new conversation: Erases chat history and initiates a new conversation.
Source: Displays links to the sources of the response, such as the user manual.
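The following is a minimal sketch of how these interface elements might be wired together with Gradio Blocks, assuming a recent Gradio 4.x release; ask_agent and transcribe are hypothetical placeholder helpers standing in for the middleware call described above and a speech-to-text step.

```python
import gradio as gr

# Hypothetical placeholder helpers: in a real app, ask_agent() would call the
# middleware sketched earlier and return the answer plus source links, and
# transcribe() would run a speech-to-text step on the recorded audio.
def ask_agent(message):
    answer = f"(agent answer for: {message})"
    links = "Source: [user manual](https://example.com/manual)"
    return answer, links

def transcribe(audio_path):
    return "(transcribed text from the recording)"

with gr.Blocks(title="Gen AI chatbot demo") as demo:
    chatbot = gr.Chatbot(label="Conversation")
    sources = gr.Markdown()                        # links to the sources of the response
    text_in = gr.Textbox(label="Type a message")
    mic_in = gr.Audio(sources=["microphone"], type="filepath",
                      label="Record from microphone")
    new_chat = gr.Button("Start a new conversation")

    def on_text(message, history):
        answer, links = ask_agent(message)
        history = history + [(message, answer)]
        return history, links, ""                  # clear the textbox after sending

    def on_audio(audio_path, history):
        message = transcribe(audio_path)
        history, links, _ = on_text(message, history)
        return history, links, None                # reset the audio input

    text_in.submit(on_text, [text_in, chatbot], [chatbot, sources, text_in])
    mic_in.stop_recording(on_audio, [mic_in, chatbot], [chatbot, sources, mic_in])
    new_chat.click(lambda: ([], ""), None, [chatbot, sources])  # erase chat history

demo.launch()
```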
You can find an example implementation in the GitHub repo genai-gradio-example.
The code illustrates a deployable PoC application that demonstrates basic functionality implemented with Vertex AI and complemented by a custom Gradio UI/UX portal. As a next step, we recommend exploring and generating ideas for user-centric products in your company that can be powered by Vertex AI. Google can help.
In this post, we have discussed how to integrate your Gradio conversations with a generative AI agent using Vertex AI Conversation. You can use this approach to build rapid generative AI PoC applications and to begin discussions within your organization about how to harness the power of generative AI. We have also provided a high-level architecture for your application and sample code to get you started right away. We hope this information is helpful to developers looking to rapidly build gen AI-powered applications. While still in its early stages, gen AI is already changing how organizations connect with, engage, and support their customers, and with such fast-shifting technologies, fortune favors the bold.