Amazon Bedrock Flows offers an intuitive visual builder and a set of APIs to seamlessly link foundation models (FMs), Amazon Bedrock features, and AWS services to build and automate user-defined generative AI workflows at scale. Amazon Bedrock Agents offers a fully managed solution for creating, deploying, and scaling AI agents on AWS. With Flows, you can provide explicitly stated, user-defined decision logic to execute workflows, and add Agents as a node in a flow to use FMs to dynamically interpret and execute tasks based on contextual reasoning for certain steps in your workflow.
Today, we’re excited to announce multi-turn conversation with an agent node (preview), a powerful new capability in Flows. This capability enhances the agent node, enabling dynamic, back-and-forth conversations between users and flows within a single flow execution, similar to a natural dialogue.
With this new feature, when an agent node requires clarification or additional context from the user before it can continue, it can intelligently pause the flow’s execution and request user-specific information. After the user sends the requested information, the flow seamlessly resumes the execution with the enriched input, maintaining the executionId of the conversation.
This creates a more interactive and context-aware experience, because the node can adapt its behavior based on user responses. The following sequence diagram shows the flow steps.
Multi-turn conversations make it straightforward for developers to create agentic workflows that can adapt and reason dynamically. This is particularly valuable for complex scenarios where a single interaction might not be sufficient to fully understand and address the user’s needs.
In this post, we discuss how to create a multi-turn conversation and explore how this feature can transform your AI applications.
Consider ACME Corp, a leading fictional online travel agency developing an AI-powered holiday trip planner using Flows. They face several challenges in their implementation:
Let’s explore how the new multi-turn conversation capability in Flows addresses these challenges and enables ACME Corp to build a more intelligent, context-aware, and efficient holiday trip planner that truly enhances the customer’s travel planning experience.
The flow offers two distinct interaction paths. For general travel inquiries, users receive instant responses powered by an LLM. However, when users want to search or book flights and hotels, they are connected to an agent that guides them through the process, collecting essential information while maintaining the session until completion. The workflow is illustrated in the following diagram.
For this example, you need the following:
To create a multi-turn conversation flow, complete the following steps:
Create a flow named ACME-Corp-trip-planner. For detailed instructions on creating a flow, see Amazon Bedrock Flows is now generally available with enhanced safety and traceability.
Amazon Bedrock provides different node types to build your flow.
Add a prompt node that classifies the user query, returning categoryLetter=A if the user wants to search or book a hotel or flight, and categoryLetter=B if the user is asking for destination information. If you’re using Amazon Bedrock Prompt Management, you can select the prompt from there. For this node, we use the following message in the prompt configuration:
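The exact wording is flexible; a classification message along the following lines (with {{input}} as the prompt variable that receives the user query) illustrates the idea:

```
You are a query classifier. Analyze the user query in {{input}} and classify it into exactly one category:

Category A: The user wants to search or book a flight or hotel.
Category B: The user is asking for destination or general travel information.

Respond with only the matching letter in the format: categoryLetter=<letter>
```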
For our example, we chose Amazon’s Nova Lite model and set the temperature inference parameter to 0.1 to minimize hallucinations and enhance output reliability. You can select other available Amazon Bedrock models.
You’re now ready to test the flow through the Amazon Bedrock console or API. First, we ask for information about Paris. In the response, you can review the flow traces, which provide detailed visibility into the execution process. These traces help you monitor and debug response times for each step, track the processing of customer inputs, verify if guardrails are properly applied, and identify any bottlenecks in the system. Flow traces offer a comprehensive overview of the entire response generation process, allowing for more efficient troubleshooting and performance optimization.
Next, we continue our conversation and ask to book a trip to Paris. With the new multi-turn support in Flows, our agent node can now ask follow-up questions to gather all the required information and make the booking.
We continue talking to our agent, providing all required information, and finally, the agent makes the booking for us. In the traces, you can check the executionId that maintains the session for the multi-turn requests.
After the confirmation, the agent has successfully completed the user request.
You can also interact with flows programmatically using the InvokeFlow API, as shown in the following code. During the initial invocation, the system automatically generates a unique executionId, which maintains the session for 1 hour. This executionId is essential for subsequent InvokeFlow API calls, because it provides the agent with contextual information necessary for maintaining conversation history and completing actions.
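The following is a minimal sketch of the initial invocation using the AWS SDK for Python (Boto3); the flow and alias IDs are placeholders, and the input node name assumes the default FlowInputNode:

```python
import boto3

client = boto3.client("bedrock-agent-runtime")

# Initial invocation: no executionId is passed, so Amazon Bedrock generates one
response = client.invoke_flow(
    flowIdentifier="REPLACE_WITH_FLOW_ID",              # placeholder
    flowAliasIdentifier="REPLACE_WITH_FLOW_ALIAS_ID",   # placeholder
    inputs=[
        {
            "content": {"document": "I want to book a trip to Paris"},
            "nodeName": "FlowInputNode",
            "nodeOutputName": "document",
        }
    ],
)

# Keep this ID for follow-up calls; the session is maintained for 1 hour
execution_id = response["executionId"]
```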
If the agent node in the flow decides that it needs more information from the user, the response stream (responseStream) from InvokeFlow includes a FlowMultiTurnInputRequestEvent event object. The event has the requested information in the content (FlowMultiTurnInputContent) field.
The following is an example FlowMultiTurnInputRequestEvent JSON object:
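The node name and message text below are illustrative placeholders for our trip planner agent node:

```json
{
    "nodeName": "TripPlannerAgent",
    "nodeType": "AgentNode",
    "content": {
        "document": "Before I can book your trip to Paris, I need a few more details. What are your travel dates and how many travelers are going?"
    }
}
```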
Because the flow can’t continue until more input is received, the flow also emits a FlowCompletionEvent event. A flow always emits the FlowMultiTurnInputRequestEvent before the FlowCompletionEvent. If the value of completionReason in the FlowCompletionEvent event is INPUT_REQUIRED, the flow needs more information before it can continue.
The following is an example FlowCompletionEvent JSON object:
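Here, completionReason indicates that the flow is paused and waiting for user input:

```json
{
    "completionReason": "INPUT_REQUIRED"
}
```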
Send the user response back to the flow by calling the InvokeFlow API again. Be sure to include the executionId for the conversation.
The following is an example JSON request for the InvokeFlow API, which provides additional information required by an agent node:
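In this sketch, the flow identifiers and executionId are placeholders, the input is routed to the agent node that asked the question, and the agentInputText input name assumes the agent node’s default input:

```json
{
    "flowIdentifier": "REPLACE_WITH_FLOW_ID",
    "flowAliasIdentifier": "REPLACE_WITH_FLOW_ALIAS_ID",
    "executionId": "REPLACE_WITH_EXECUTION_ID",
    "inputs": [
        {
            "content": {
                "document": "We want to travel from March 15 to March 22, 2 adults"
            },
            "nodeName": "TripPlannerAgent",
            "nodeInputName": "agentInputText"
        }
    ]
}
```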
This back-and-forth continues until no more information is needed and the agent has everything required to complete the user’s request. When no more information is needed, the flow emits a FlowOutputEvent event, which contains the final response.
The following is an example FlowOutputEvent JSON object:
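The response text below is a placeholder; the node name corresponds to the flow’s output node:

```json
{
    "nodeName": "FlowOutputNode",
    "nodeType": "FlowOutputNode",
    "content": {
        "document": "Your flight and hotel in Paris have been booked for March 15 to March 22 for 2 adults. Have a great trip!"
    }
}
```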
The flow also emits a FlowCompletionEvent event. The value of completionReason is SUCCESS.
The following is an example FlowCompletionEvent JSON object:
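This time, completionReason indicates that the flow finished successfully:

```json
{
    "completionReason": "SUCCESS"
}
```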
To get started with multi-turn invocation, use the following example code. It handles subsequent interactions using the same executionId and maintains context throughout the conversation. You need to specify your flow’s ID in FLOW_ID and its alias ID in FLOW_ALIAS_ID (refer to View information about flows in Amazon Bedrock for instructions on obtaining these IDs).
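The following is a minimal sketch assuming the flow’s input node is named FlowInputNode and the agent node accepts follow-up input on agentInputText; adapt the node names to your flow configuration:

```python
import boto3

FLOW_ID = "REPLACE_WITH_FLOW_ID"              # your flow's ID
FLOW_ALIAS_ID = "REPLACE_WITH_FLOW_ALIAS_ID"  # your flow alias ID

client = boto3.client("bedrock-agent-runtime")


def invoke_flow(flow_input, execution_id=None):
    """Invoke the flow and collect the relevant events from the response stream.

    Passing the same executionId on follow-up calls keeps the conversation session alive.
    """
    request = {
        "flowIdentifier": FLOW_ID,
        "flowAliasIdentifier": FLOW_ALIAS_ID,
        "inputs": [flow_input],
    }
    if execution_id is not None:
        request["executionId"] = execution_id

    response = client.invoke_flow(**request)
    execution_id = response["executionId"]

    completion_reason = None
    agent_question = None
    final_response = None
    for event in response["responseStream"]:
        if "flowMultiTurnInputRequestEvent" in event:
            agent_question = event["flowMultiTurnInputRequestEvent"]
        elif "flowOutputEvent" in event:
            final_response = event["flowOutputEvent"]["content"]["document"]
        elif "flowCompletionEvent" in event:
            completion_reason = event["flowCompletionEvent"]["completionReason"]

    return execution_id, completion_reason, agent_question, final_response


# First turn goes through the flow's input node
flow_input = {
    "content": {"document": input("Enter your request: ")},
    "nodeName": "FlowInputNode",
    "nodeOutputName": "document",
}
execution_id = None

while True:
    execution_id, reason, agent_question, final_response = invoke_flow(flow_input, execution_id)

    if reason == "INPUT_REQUIRED" and agent_question is not None:
        # The agent node paused the flow and is asking for more details
        answer = input(f"{agent_question['content']['document']}\n> ")
        flow_input = {
            "content": {"document": answer},
            "nodeName": agent_question["nodeName"],
            "nodeInputName": "agentInputText",  # assumed default input name of the agent node
        }
    else:
        print(final_response)
        break
```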
The system will prompt for additional input as needed, using the executionId to maintain context across multiple interactions, providing a coherent and continuous conversation flow while executing the requested actions.
To clean up your resources, delete the flow, agent, AWS Lambda functions created for the agent, and knowledge base.
The introduction of multi-turn conversation capability in Flows marks a significant advancement in building sophisticated conversational AI applications. In this post, we demonstrated how this feature enables developers to create dynamic, context-aware workflows that can handle complex interactions while maintaining conversation history and state. The combination of the Flows visual builder interface and APIs with powerful agent capabilities makes it straightforward to develop and deploy intelligent applications that can engage in natural, multi-step conversations.
With this new capability, businesses can build more intuitive and responsive AI solutions that better serve their customers’ needs. Whether you’re developing a travel booking system, a customer service assistant, or another conversational application, multi-turn conversation with Flows provides the tools needed to create sophisticated AI workflows with minimal complexity.
We encourage you to explore these capabilities on the Amazon Bedrock console and start building your own multi-turn conversational applications today. For more information and detailed documentation, visit the Amazon Bedrock User Guide. We look forward to seeing the innovative solutions you will create with these powerful new features.