Next-generation cell-based therapy and gene editing therapeutics are set to revolutionize medicine. However, to harness the scientific potential of these drugs, biomanufacturing organizations must develop new systems to handle the increased complexity of process development, manufacturing operations, and quality assurance associated with these new therapeutics.
In this blog post, we discuss the shortcomings of legacy software systems at biomanufacturing organizations and explore the benefits of a more connected approach that leverages the Palantir Foundry Ontology.
Biomanufacturing relies on tightly controlled processes subject to rigorous monitoring, testing, and release requirements. Currently, the underlying software enabling operations and quality assurance is often a patchwork of legacy systems. The different stages of the manufacturing process are frequently siloed and disconnected. This is particularly true for cross-organization data collection, processing, and end-to-end analytics across the life cycle of a therapy. Critical activities, ranging from process development to root cause investigation of deviations, still require highly manual curation of data from fragmented systems. At best, this slows the speed of process improvement, but it can also result in unnecessary production delays or loss of drug product batches, potentially costing millions.
For next-generation therapeutics, these challenges are compounded. Personalized medicines and treatments for rare diseases are characterized by high process variability across manufacturing and require smaller production scales, each with specialized production requirements. Scaling a portfolio of such therapies efficiently therefore necessitates greater visibility and collaboration across teams than traditional laboratory information management systems (LIMS) and manufacturing execution systems (MES) can provide on their own.
Specifically, process operating systems fit for the development of novel biologic medicines must be able to provide a consolidated view of:
In addition, monitoring clinical responses to personalized therapies often involves the collection of more granular, multidimensional data, which presents an opportunity to link individual patient data and clinical trials assays with the manufacturing processes. This convergence can inform a greater understanding of the root cause of safety events and can help tease out nuanced variance in treatment response for different patient subpopulations.
A connected biomanufacturing company can address the challenges presented by operational and data silos by enabling every part of the organization to consume data from the same source. Rather than relying on periodic, curated snapshots, the connected company provides individuals with full visibility into operations at any point in time, helping to inform strategy and action.
For instance, when faced with diminished yield or quality in recent production, an operations manager could pull raw material records from relevant ERP systems, along with batch records and QMS data, in a few clicks, forming a data mesh accessible across the enterprise. A digital twin can then be built across every aspect of the manufacturing process, empowering both analytical and operational teams to focus on what really matters: identifying root cause and achieving consensus on remediation.
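To make the idea concrete, here is a minimal sketch (plain Python, not Foundry's APIs) of what joining hypothetical ERP, batch-record, and QMS extracts on a shared batch ID might look like, so a root cause investigation can see all three side by side. All field names and values are illustrative assumptions.

```python
# Hypothetical extracts from three siloed systems, keyed by batch ID.
erp_materials = [
    {"batch_id": "B-102", "raw_material_lot": "RM-77", "supplier": "Acme"},
    {"batch_id": "B-103", "raw_material_lot": "RM-78", "supplier": "Acme"},
]
batch_records = [
    {"batch_id": "B-102", "yield_pct": 61.0},
    {"batch_id": "B-103", "yield_pct": 88.5},
]
qms_deviations = [
    {"batch_id": "B-102", "deviation": "temperature excursion in hold step"},
]

def join_batch_view(*sources):
    """Merge rows from multiple systems into one per-batch view."""
    view = {}
    for source in sources:
        for row in source:
            view.setdefault(row["batch_id"], {}).update(row)
    return view

view = join_batch_view(erp_materials, batch_records, qms_deviations)
# The low-yield batch now carries its material lot and its deviation together.
print(view["B-102"])
```

The point is not the join itself but where it runs: once this merge happens on a shared foundation rather than in a one-off spreadsheet, every team investigates from the same per-batch view.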
A connected approach can also support a range of planning and operational activities such as quality assurance reporting automation, collaborative process development, accelerated scaling from development to production, and deviation monitoring and alerting. This disciplined data curation also serves as a prerequisite for effective deployment of operational AI and predictive modeling. Bolstered by a central data foundation, these alerts and models can then be used to accomplish predictive maintenance, proactive parameter adjustment for yield optimization, and automated regulatory reporting.
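As a hedged illustration of the deviation monitoring and alerting mentioned above, the sketch below flags any batch whose yield falls outside a 3-sigma control band computed from a historical baseline. This is a generic statistical-process-control pattern, not a description of any specific Foundry feature; the yields and field names are invented for the example.

```python
import statistics

# Hypothetical historical yields (%) for a stable process.
historical_yields = [88.1, 90.3, 89.6, 91.0, 87.9, 90.5, 89.2, 88.8]

mean = statistics.mean(historical_yields)
sigma = statistics.stdev(historical_yields)
lower, upper = mean - 3 * sigma, mean + 3 * sigma

def check_batch(batch_id, yield_pct):
    """Return an alert record if the batch falls outside control limits."""
    if not (lower <= yield_pct <= upper):
        return {"batch_id": batch_id, "yield_pct": yield_pct,
                "limits": (round(lower, 2), round(upper, 2))}
    return None

candidates = [check_batch("B-104", 89.7), check_batch("B-105", 61.0)]
alerts = [a for a in candidates if a is not None]
```

In a connected setup, the baseline would be recomputed continuously from the central data foundation, and the alert would route directly to the responsible operations team.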
Building a connected pharmaceutical production program requires a digital backbone that is able to:
How can a truly connected digital transformation be achieved on top of existing infrastructure? While modern production facilities are often built with digital infrastructure and IoT sensors, transforming a legacy setup remains a challenging task.
Most modernization initiatives focus on renewing existing infrastructure, aiming to replace a fragmented legacy landscape through step-by-step modernization of point solutions: replacing manual environmental assays with real-time IoT equivalents, updating an MES, or harmonizing multiple ERP systems. These investments, while highly important, rarely aim to build a fully connected system. As a result, such initiatives often perpetuate the very siloed landscapes they were intended to overcome.
The development of a true connective infrastructure requires a coordination layer integrating legacy investments and modern point solutions. An enterprise operating system like Palantir Foundry does exactly that. Foundry provides a connective tissue on top of existing infrastructure, aggregating data from point solutions and harmonizing divergent data models.
The Ontology facilitates systematic mapping of data to meaningful semantic concepts that empower the rapid development of applications, reports, and workflows. Application developers and data scientists can spend more time on interfaces, models, and workflows, unburdened by the need to seek out and transform data and maintain deployment infrastructure. This also opens the door for the utilization of AI to monitor operational efficiency, optimize yield, and predict deviations.
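The core idea behind an ontology layer can be sketched in plain Python (this is an analogy, not the Foundry Ontology itself): rows with divergent source schemas are mapped once into a shared semantic object, so applications are written against that object rather than against system-specific columns. The MES and LIMS schemas below are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Batch:
    """Shared semantic concept consumed by applications and models."""
    batch_id: str
    product: str
    yield_pct: float

def from_mes(row):
    # Hypothetical MES export schema.
    return Batch(row["BATCH_NO"], row["PROD_CODE"], float(row["YIELD"]))

def from_legacy_lims(row):
    # Hypothetical legacy LIMS schema with different names and types.
    return Batch(row["id"], row["product_name"], float(row["final_yield"]))

batches = [
    from_mes({"BATCH_NO": "B-201", "PROD_CODE": "mAb-X", "YIELD": 90.2}),
    from_legacy_lims({"id": "B-202", "product_name": "mAb-X",
                      "final_yield": "88.7"}),
]

# Downstream logic is written once against Batch, regardless of source system.
low_yield = [b.batch_id for b in batches if b.yield_pct < 89.0]
```

The mapping functions absorb schema divergence at the edge; everything downstream, from dashboards to predictive models, sees one consistent concept.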
The result is an organizational capacity for data-driven decision-making above and beyond what is typically possible with traditional MOM, MES, or ERP systems.
Within this context, a connected digital infrastructure powered by Foundry can help an organization achieve:
These foundational capabilities enable a wide range of process development and manufacturing operations activities, including end-to-end quality control, product or batch 360, root cause investigations, collaborative strategic resource allocation planning, and provenance tracing of treatment efficacy.
The biggest challenge to achieving a truly connected company is finding where to start. Digital initiatives are often either too ambitious and broad, only working in a greenfield setting, or are too small to make a real impact.
Our philosophy is to build iteratively toward a truly connected company by starting with a clear business problem and clear outcomes to achieve. By focusing on one concrete problem at a time while leveraging a common data foundation, we establish the momentum of a successful digital transformation.
To begin today, visit our website.
Authors
Matthew Owens, BSE&MSE, Palantir Life Sciences
Tilman Flock, PhD, Palantir Life Sciences
Biomanufacturing of Tomorrow Requires a Connected Company Today was originally published in Palantir Blog on Medium.