Biomanufacturing of Tomorrow Requires a Connected Company Today

Next-generation cell-based therapies and gene-editing therapeutics are set to revolutionize medicine. However, to harness the scientific potential of these drugs, biomanufacturing organizations must develop new systems to handle the increased complexity of process development, manufacturing operations, and quality assurance associated with these new therapeutics.

In this blog post, we discuss the shortcomings of legacy software systems at biomanufacturing organizations and explore the benefits of a more connected approach that leverages the Palantir Foundry Ontology.

The Challenge: Aging infrastructure encounters increased complexity

Biomanufacturing relies on tightly controlled processes subject to rigorous monitoring, testing, and release requirements. Currently, the underlying software enabling operations and quality assurance is often a patchwork of legacy systems. The different stages of the manufacturing process are frequently siloed and disconnected. This is particularly true for cross-organization data collection, processing, and end-to-end analytics across the life cycle of a therapy. Critical activities, ranging from process development to root cause investigation of deviations, still require highly manual curation of data from fragmented systems. At best, this slows the speed of process improvement, but it can also result in unnecessary production delays or loss of drug product batches, potentially costing millions.

For next-generation therapeutics, these challenges are compounded. Personalized medicines and treatments for rare diseases are characterized by high process variability across manufacturing and require smaller production scales, each with specialized production requirements. Scaling a portfolio of such therapies efficiently therefore necessitates greater visibility and collaboration across teams than traditional laboratory information management systems (LIMS) and manufacturing execution systems (MES) can provide on their own.

Specifically, process operating systems fit for the development of novel biologic medicines must be able to provide a consolidated view of:

  • Suppliers and raw materials: Any deviation of composition, purity, or quality in raw materials from different vendors needs to be traceable. Results of validation assays must be communicated rapidly to limit impact on batch yield.
  • Upstream processing and scale-up history: Key parameters of a production batch must be logged and easily accessible with historical context to make processes reproducible and enable effective root cause analysis.
  • Downstream processing: Enrichment, purification, and product composition must have robust monitoring to proactively identify conditions predictive of required maintenance before they affect product quality.
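To make the consolidated view above concrete, the linked records it requires could be modeled roughly as below. This is a minimal, hypothetical sketch in Python; all class, field, and lot names are illustrative and do not reflect any particular LIMS or MES schema.

```python
from dataclasses import dataclass, field

@dataclass
class RawMaterialLot:
    """A vendor lot with the result of its incoming validation assay."""
    lot_id: str
    vendor: str
    purity_pct: float

@dataclass
class Batch:
    """A production batch linking raw materials to process parameters."""
    batch_id: str
    raw_material_lots: list                                # RawMaterialLot instances consumed
    upstream_params: dict = field(default_factory=dict)    # e.g. pH, temperature, seed density
    downstream_qc: dict = field(default_factory=dict)      # e.g. final purity, composition

def batches_using_lot(batches, lot_id):
    """Trace every batch that consumed a given raw-material lot."""
    return [b.batch_id for b in batches
            if any(lot.lot_id == lot_id for lot in b.raw_material_lots)]

lot = RawMaterialLot("LOT-042", "VendorA", purity_pct=98.7)
batches = [Batch("B-001", [lot], {"pH": 7.2}), Batch("B-002", [], {})]
print(batches_using_lot(batches, "LOT-042"))  # → ['B-001']
```

With the lot-to-batch links captured explicitly, a deviation in a single vendor lot can be traced forward to every affected batch in one query rather than a manual reconciliation across systems.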

In addition, monitoring clinical responses to personalized therapies often involves the collection of more granular, multidimensional data, which presents an opportunity to link individual patient data and clinical trials assays with the manufacturing processes. This convergence can inform a greater understanding of the root cause of safety events and can help tease out nuanced variance in treatment response for different patient subpopulations.

The Vision: A Connected Approach

A connected biomanufacturing company can address the challenges presented by operational and data silos by enabling every part of the organization to consume data from the same source. Rather than relying on periodic, curated snapshots, the connected company provides individuals with full visibility into operations at any point in time, helping to inform strategy and action.

For instance, when faced with diminished yield or quality in recent production, an operations manager could pull raw material records from relevant ERP systems, batch records, and QMS data with a few clicks, forming a data mesh accessible across the enterprise. A digital twin can then be built across every aspect of the manufacturing process, empowering teams, both analytical and operational, to focus on what really matters: identifying root cause and achieving consensus on remediation.
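The "few clicks" above amount to a join across systems keyed by batch. A minimal sketch of such a consolidated batch view, using hypothetical in-memory extracts in place of real ERP, MES, and QMS connections:

```python
# Hypothetical extracts from three siloed systems, keyed by batch ID.
erp_raw_materials = {"B-101": {"lot": "LOT-7", "vendor": "VendorA"}}
batch_records     = {"B-101": {"yield_pct": 61.0, "bioreactor": "BR-2"}}
qms_deviations    = {"B-101": [{"id": "DEV-9", "type": "temperature excursion"}]}

def batch_360(batch_id):
    """Assemble one consolidated view of a batch from all source systems."""
    return {
        "batch_id": batch_id,
        "raw_materials": erp_raw_materials.get(batch_id, {}),
        "record": batch_records.get(batch_id, {}),
        "deviations": qms_deviations.get(batch_id, []),
    }

view = batch_360("B-101")
```

In a root cause investigation, this single view places the vendor lot, the process record, and the open deviation side by side, which is the starting point the narrative above describes.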

A connected approach can also support a range of planning and operational activities such as quality assurance reporting automation, collaborative process development, accelerated scaling from development to production, and deviation monitoring and alerting. This disciplined data curation also serves as a prerequisite for effective deployment of operational AI and predictive modeling. Bolstered by a central data foundation, these alerts and models can then be used to accomplish predictive maintenance, proactive parameter adjustment for yield optimization, and automated regulatory reporting.
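As one illustration of the deviation monitoring and alerting mentioned above, a simple statistical-process-control check flags batches whose yield falls outside control limits derived from historical runs. This is a deliberately simplified sketch; the yield figures and batch IDs are invented.

```python
from statistics import mean, stdev

def control_limits(history, k=3.0):
    """Compute mean ± k·σ control limits from historical batch yields."""
    m, s = mean(history), stdev(history)
    return m - k * s, m + k * s

def flag_deviations(history, recent, k=3.0):
    """Flag recent batches whose yield falls outside the control limits."""
    low, high = control_limits(history, k)
    return [(batch, y) for batch, y in recent if not (low <= y <= high)]

historical_yields = [71.2, 70.8, 72.1, 71.5, 70.9, 71.8, 71.3, 72.0]
recent = [("B-201", 71.4), ("B-202", 61.0)]
print(flag_deviations(historical_yields, recent))  # → [('B-202', 61.0)]
```

A production system would of course use richer models than a three-sigma rule, but the pattern is the same: a central data foundation supplies the history, and the alert fires automatically rather than waiting for a manual review.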

Implementation: Transforming pharmaceutical production

Building a connected pharmaceutical production program requires a digital backbone that is able to:

  • Link highly fragmented source data, whether structured or unstructured and in highly variable formats, into a single repository.
  • Ensure trust with access and governance tooling to enforce role and/or purpose-based access control, and allow for efficient audits of code, data, and decisions.
  • Provide a semantic layer (‘ontology’) that transforms digital assets — including data, models, and processes — into a dynamic, actionable representation of the business for all users to leverage.
  • Enable reproducible, collaborative analytics for trend identification, dashboards, key performance metric tracking, and management of evolving artificial intelligence and machine learning models.
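The semantic layer in the third bullet can be pictured as a mapping from raw source rows to named object types with declared properties and links. The sketch below is our own toy illustration of that idea, not the Foundry Ontology API; every name in it is hypothetical.

```python
# A minimal semantic layer: each object type declares its properties
# and its links to other types. Raw rows are projected onto a type.
ONTOLOGY = {
    "Batch": {"properties": ["batch_id", "yield_pct"],
              "links": {"uses": "RawMaterialLot"}},
    "RawMaterialLot": {"properties": ["lot_id", "vendor"], "links": {}},
}

def to_object(object_type, row):
    """Project a raw source-system row onto the properties of an ontology type."""
    schema = ONTOLOGY[object_type]
    return {"type": object_type,
            **{p: row.get(p) for p in schema["properties"]}}

raw_row = {"batch_id": "B-301", "yield_pct": 70.2, "extra_col": "ignored"}
obj = to_object("Batch", raw_row)
print(obj)  # → {'type': 'Batch', 'batch_id': 'B-301', 'yield_pct': 70.2}
```

The value of such a layer is that applications are written against "Batch" and "RawMaterialLot" rather than against the column names of any one source system, so sources can change without breaking downstream workflows.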

How can a truly connected digital transformation be achieved using existing infrastructure? While modern production facilities are often built with digital infrastructure and IoT sensors, transforming a legacy setup remains a challenging task.

Most modernization initiatives focus on renewing existing infrastructure, aiming to replace a fragmented legacy landscape through step-by-step modernization of point solutions: replacing manual environmental assays with real-time IoT equivalents, updating an MES, or harmonizing multiple ERP systems. These investments, while highly important, rarely aim to build a fully connected system, and such initiatives often perpetuate the very siloed landscapes they were intended to overcome.

The development of a true connective infrastructure requires a coordination layer integrating legacy investments and modern point solutions. An enterprise operating system like Palantir Foundry does exactly that. Foundry provides a connective tissue on top of existing infrastructure, aggregating data from point solutions and harmonizing divergent data models.

The Ontology facilitates systematic mapping of data to meaningful semantic concepts that empower the rapid development of applications, reports, and workflows. Application developers and data scientists can spend more time on interfaces, models, and workflows, unburdened by the need to seek out and transform data and maintain deployment infrastructure. This also opens the door for the utilization of AI to monitor operational efficiency, optimize yield, and predict deviations.

The result is an organizational capacity for data-driven decision-making above and beyond what is typically possible with traditional manufacturing operations management (MOM), MES, or ERP systems.

Within this context, a connected digital infrastructure powered by Foundry can help an organization achieve:

  • A shared view of the world — Curating data from diverse systems and data formats enables a common view into the health of processes and progression of batches. With integrated orchestration of computation, even non-tabular data such as imaging or immunofluorescence-based assays can be incorporated into one coherent digital twin of the product.
  • Adaptive operations — Rapid authoring and deployment of applications on top of the Ontology enables organizations to adapt quickly. From lab or process notebooks and quality-deviation alerting dashboards to rolling out workflows to the “edge” of the production floor, organizations can deploy complex data- and model-driven workflows with greater confidence.
  • Improved decision-making — Scenario-driven investigations and collaborative analytical views empower individuals across the organization to make data-driven decisions informed by a synthesis of integrated systems.

These foundational capabilities enable a wide range of process development and manufacturing operations activities, including end-to-end quality control, product or batch 360, root cause investigations, collaborative strategic resource allocation planning, and provenance tracing of treatment efficacy.

Where to begin: Identify a clear business problem and a clear outcome

The biggest challenge to achieving a truly connected company is finding where to start. Digital initiatives are often either too ambitious and broad, only working in a greenfield setting, or are too small to make a real impact.

Our philosophy is to build iteratively toward a truly connected company by starting with a clear business problem and clear outcomes to achieve. By focusing on one concrete problem at a time while leveraging a common data foundation, we establish the momentum of a successful digital transformation.

To begin today, visit our website.

Authors
Matthew Owens, BSE & MSE, Palantir Life Sciences
Tilman Flock, PhD, Palantir Life Sciences


Biomanufacturing of Tomorrow Requires a Connected Company Today was originally published in Palantir Blog on Medium.