This post is cowritten with James Luo from BGL.
Data analysis is emerging as a high-impact use case for AI agents. According to Anthropic’s 2026 State of AI Agents Report, 60% of organizations rank data analysis and report generation as their most impactful agentic AI applications, and 65% of enterprises cite it as a top priority. In practice, businesses face two common challenges:
Like many other businesses, BGL faced similar challenges with its data analysis and reporting use cases. BGL is a leading provider of self-managed superannuation fund (SMSF) administration solutions that help individuals manage the complex compliance and reporting of their own or a client’s retirement savings, serving over 12,700 businesses across 15 countries. BGL’s solution processes complex compliance and financial data through over 400 analytics tables, each representing a specific business domain, such as aggregated customer feedback, investment performance, compliance tracking, and financial reporting. BGL’s customers and employees need to find insights from this data, asking questions such as “Which products had the most negative feedback last quarter?” or “Show me investment trends for high-net-worth accounts.” Working with Amazon Web Services (AWS), BGL built an AI agent using Claude Agent SDK hosted on Amazon Bedrock AgentCore. By using the AI agent, business users can retrieve analytic insights through natural language while aligning with the security and compliance requirements of financial services, including session isolation and identity-based access controls.
In this blog post, we explore how BGL built its production-ready AI agent using Claude Agent SDK and Amazon Bedrock AgentCore. We cover three key aspects of BGL’s implementation:
When engineering teams implement an AI agent for analytics use cases, a common anti-pattern is to have the agent handle everything: understanding database schemas, transforming complex datasets, sorting out business logic for analyses, and interpreting results. An agent burdened with all of this is likely to produce inconsistent results, joining tables incorrectly, missing edge cases, or producing incorrect aggregations.
BGL used its existing mature big data solution, powered by Amazon Athena and dbt Labs, to process and transform terabytes of raw data across various business data sources. The extract, transform, and load (ETL) process builds analytic tables, each of which answers a specific category of business questions. Those tables are aggregated, denormalized datasets (with metrics and summaries) that serve as a business-ready single source of truth for business intelligence (BI) tools, AI agents, and applications. For details on how to build a serverless data transformation architecture with Athena and dbt, see How BMW Group built a serverless terabyte-scale data transformation architecture with dbt and Amazon Athena.
Because complex data transformation stays within the data system, the AI agent’s role narrows to interpreting the user’s natural language question, translating it, and generating SQL SELECT queries against well-structured analytic tables. When needed, the AI agent writes Python scripts to further process results and generate visualizations. This separation of concerns significantly reduces the risk of hallucination and offers several key benefits:
“Many people think the AI agent is so powerful that they can skip building the data platform; they want the agent to do everything. But you can’t achieve consistent and accurate results that way. Each layer should solve complexity at the appropriate level”
– James Luo, BGL Head of Data and AI
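One practical way to keep this boundary firm is to validate the agent’s generated SQL before it reaches Athena. The following is a minimal, illustrative sketch of such a read-only guard; the table allow-list, function name, and regular expressions are assumptions for illustration, not BGL’s implementation.

```python
import re

# Hypothetical allow-list of analytic tables the agent may query.
ALLOWED_TABLES = {"customer_feedback_summary", "investment_performance_daily"}

# Keywords that indicate a data-modifying or DDL statement.
FORBIDDEN = re.compile(r"\b(insert|update|delete|merge|drop|alter|create|grant)\b", re.IGNORECASE)


def is_safe_select(sql: str) -> bool:
    """Accept only a single read-only SELECT (or CTE) against allowed tables."""
    statement = sql.strip().rstrip(";")
    if ";" in statement:  # reject multi-statement queries
        return False
    if not statement.lower().startswith(("select", "with")):
        return False
    if FORBIDDEN.search(statement):
        return False
    # Naive table extraction: every identifier following FROM or JOIN.
    tables = re.findall(r"\b(?:from|join)\s+([a-z0-9_.]+)", statement, re.IGNORECASE)
    return all(t.split(".")[-1].lower() in ALLOWED_TABLES for t in tables)


print(is_safe_select("SELECT product, COUNT(*) FROM customer_feedback_summary GROUP BY product"))  # True
print(is_safe_select("DROP TABLE customer_feedback_summary"))  # False
```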
BGL’s development team has been using Claude Code powered by Amazon Bedrock as its AI coding assistant. This integration uses temporary, session-based access to mitigate credential exposure, and integrates with existing identity providers to align with financial services compliance requirements. For integration details, see Guidance for Claude Code with Amazon Bedrock.
Through its daily use of Claude Code, BGL recognized that its core capabilities extend beyond coding: it can reason through complex problems, write and execute code, and interact with files and systems autonomously. Claude Agent SDK packages the same agentic capabilities into a Python and TypeScript SDK, so that developers can build custom AI agents on top of Claude Code. For BGL, this meant the team could build an analytics AI agent with:
- A CLAUDE.md file for project context and Agent Skills for product line domain-specific expertise

Analytics queries often return thousands of rows, sometimes amounting to megabytes of data. Standard tool use, function calling, and Model Context Protocol (MCP) patterns often pass retrieved data directly into the context window, which quickly reaches model context window limits. BGL implemented a different approach: the agent writes SQL to query Athena, then writes Python code to process the CSV results directly in its file system. This enables the agent to handle large result sets, perform complex aggregations, and generate charts without reaching context window limits. You can learn more about code execution patterns in Code execution with MCP: Building more efficient agents.
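To make the pattern concrete, here is a minimal sketch of the kind of Python the agent might write inside its workspace; the file paths and column names are hypothetical, not BGL’s actual schema.

```python
import pandas as pd
import matplotlib

matplotlib.use("Agg")  # headless rendering inside the agent's sandboxed session
import matplotlib.pyplot as plt

# Hypothetical paths; in practice CLAUDE.md tells the agent where results and outputs live.
RESULTS_CSV = "intermediate/athena_query_results.csv"
OUTPUT_CHART = "outputs/negative_feedback_by_product.png"

# Load the full Athena result set from disk instead of passing it through the model's context window.
df = pd.read_csv(RESULTS_CSV)

# Aggregate outside the model: count negative feedback items per product.
summary = (
    df[df["sentiment"] == "negative"]
    .groupby("product")["feedback_id"]
    .count()
    .sort_values(ascending=False)
)

# Save a chart to the output folder so the user can download it.
summary.plot(kind="bar", title="Negative feedback by product (last quarter)")
plt.tight_layout()
plt.savefig(OUTPUT_CHART)

print(summary.head(10))
```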
To handle BGL’s diverse product lines and complex domain knowledge, the implementation uses a modular approach with two key configuration types that work together seamlessly.
The CLAUDE.md file provides the agent with global context—the project structure, environment configuration (test, production, and so on), and critically, how to execute SQL queries. It defines which folders store intermediate results and final outputs, making sure files land in a defined file path that users can access. The following diagram shows the structure of a CLAUDE.md file:
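As an illustration only, a project context file along these lines might look like the following; the sections and paths below are assumptions based on the description above, not BGL’s actual CLAUDE.md.

```markdown
# Analytics agent project context (illustrative)

## Environments
- Default to the test environment; use production only when the user asks for it explicitly.

## Executing SQL
- Run read-only SELECT queries against the analytic tables in Amazon Athena.
- Save raw query results as CSV files under `intermediate/`.

## Outputs
- Write final tables, summaries, and charts to `outputs/` so users can access them.
```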
BGL organizes its agent domain knowledge by product line using SKILL.md configuration files. Each skill acts as a specialized data analyst for a specific product. For example, the BGL CAS 360 product has a skill called CAS360 Data Analyst agent, which handles company and trust management with ASIC compliance alignment, while BGL’s Simple Fund 360 product has a skill called Simple Fund 360 Data Analyst agent, which carries SMSF administration and compliance domain knowledge. A SKILL.md file defines three things:
By using SKILL.md files, the agent can dynamically discover and load the right skill to gain domain-specific expertise for corresponding tasks.
At runtime, the agent combines the selected SKILL.md with the global CLAUDE.md file into a single prompt. This allows the agent to simultaneously apply project-wide standards (for example, always save to disk) while using domain-specific knowledge (such as mapping user questions to a group of tables). As shown in the preceding figure, agent skills are organized per product line: each product folder contains a SKILL.md definition file and a references directory with additional domain knowledge and support materials that the agent loads on demand.
For details about Anthropic Agent Skills, see the Anthropic blog post Equipping agents for the real world with Agent Skills.
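The following is a minimal sketch of how such an agent could be invoked with the Claude Agent SDK for Python. The option values, working directory, and tool list are assumptions for illustration, not BGL’s configuration, and option names may differ across SDK versions.

```python
import asyncio

from claude_agent_sdk import ClaudeAgentOptions, query


async def ask_analyst(question: str) -> None:
    options = ClaudeAgentOptions(
        # Run inside the analytics project so CLAUDE.md and the per-product skills are discoverable.
        cwd="/workspace/analytics-agent",
        # Let the agent read and write files and run SQL and Python through shell tools.
        allowed_tools=["Read", "Write", "Bash"],
        permission_mode="acceptEdits",
        max_turns=25,
    )
    async for message in query(prompt=question, options=options):
        print(message)


if __name__ == "__main__":
    asyncio.run(ask_analyst("Which products had the most negative feedback last quarter?"))
```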
To deliver a more secure and scalable text-to-SQL experience, BGL uses Amazon Bedrock AgentCore to host Claude Agent SDK while keeping data transformation in the existing big data solution.
The preceding figure illustrates a high-level architecture and workflow. The analytic tables are pre-built daily using Athena and dbt, and serve as the single source of truth. A typical user interaction flows through the following stages:
Deploying an AI agent that executes arbitrary Python code requires significant infrastructure considerations. For instance, you need isolation to help ensure that there’s no cross-session access to data or credentials. Amazon Bedrock AgentCore provides fully managed, stateful execution sessions; each session has its own isolated microVM with separate CPU, memory, and file system. When a session ends, the microVM terminates and its memory is sanitized, helping to ensure no remnants persist into future sessions. BGL found this service especially valuable:
“There’s Gateway, Memory, Browser tools, a whole ecosystem built around it. I know AWS is investing in this direction, so everything we build now can integrate with these services in the future.”
– James Luo, BGL Head of Data and AI.
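To give a sense of the hosting side, here is a minimal sketch of wrapping the agent in an AgentCore Runtime entrypoint, assuming the bedrock-agentcore Python SDK; the handler, paths, and payload fields are illustrative, not BGL’s service code.

```python
import asyncio

from bedrock_agentcore.runtime import BedrockAgentCoreApp
from claude_agent_sdk import ClaudeAgentOptions, query

app = BedrockAgentCoreApp()


@app.entrypoint
def invoke(payload: dict) -> dict:
    """Handle one invocation; AgentCore Runtime runs each session in its own isolated microVM."""
    question = payload.get("prompt", "")

    async def run() -> list[str]:
        options = ClaudeAgentOptions(
            cwd="/app/analytics-agent",  # hypothetical project path baked into the container image
            allowed_tools=["Read", "Write", "Bash"],
        )
        return [str(message) async for message in query(prompt=question, options=options)]

    return {"messages": asyncio.run(run())}


if __name__ == "__main__":
    app.run()  # local test server; AgentCore Runtime invokes the entrypoint when deployed
```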
BGL is already planning to integrate AgentCore Memory for storing user preferences and query patterns.
For BGL’s more than 200 employees, this represents a significant shift in how they extract business intelligence. Product managers can now validate hypotheses instantly without waiting for the data team. Compliance teams can spot risk trends without learning SQL. Customer success managers can pull account-specific analytics in real time during client calls. This democratization of data access helps transform analytics from a bottleneck into a competitive advantage, enabling faster decision-making across the organization while freeing the data team to focus on strategic initiatives rather than one-off query requests.
BGL’s journey demonstrates how combining a strong data foundation with agentic AI can democratize business intelligence. By using Amazon Bedrock AgentCore and the Claude Agent SDK, BGL built a more secure and scalable AI agent that empowers employees to tap into their data to answer business questions. Here are some key takeaways:
If you’re ready to build similar capabilities for your organization, get started by exploring the Claude Agent SDK and a short demo of Deploying Claude Agent SDK on Amazon Bedrock AgentCore Runtime. If you have a similar use case or need support designing your architecture, reach out to your AWS account team.