Life science companies involved in the discovery, development and commercialization of new therapeutics face increasing business challenges after a decade in which few R&D projects have converted into new molecular entities of large commercial value. This situation is exacerbated by the operational bottom-line losses drug makers will experience as their products’ patents expire and generics gain market share.
As a result, many of the top 50 pharma/biotech companies have shifted their mid-to-long-term global commercialization strategies. Today we see a steady stream of reductions in R&D budgets, manufacturing facility closures and the outsourcing of research, development and manufacturing initiatives to a host of “partners” on a global scale.
Most notable is the growing reliance on contract research organizations (CROs), contract manufacturing organizations (CMOs) and a newer category, contract development and manufacturing organizations (CDMOs). This shift to outsourcing of critical-path activities demands a cohesive data management and communication strategy between the outsourcing agent (CDMO) and the source life science company.
All too often, key intellectual property (data and informatics) is contained in a hybrid of paper-based and IT-based “systems” with poor transparency and timeliness between the partners. In addition, as the project matures to cGMP commercialization, the compliance requirements further challenge these traditional hybrid systems, resulting in compliance deviations and regulatory scrutiny. This overarching issue falls under the umbrella of Scientific Information Life-Cycle Management (SILM) and has significant implications for Quality by Design (QbD) initiatives, operational excellence programs and improved regulatory processes in quality control/quality assurance.
The CMO trends
The global CMO industry has enjoyed double-digit growth over the past two decades, and that growth is expected to continue for the next five years, exceeding $60 billion by 2017 from a base of approximately $30 billion in 2011. In the past, the use of outsourced manufacturing was largely the result of companies needing capacity support or specific capabilities not available internally. As the above-mentioned global business challenges continue, the use of these externalization partners is now part of a major strategic plan to:
• Reduce overall R&D costs
• Reduce the associated risks of multiple product/projects
• Reduce overall investments in production research
• Provide a more flexible and lean capacity environment for successful product launches and routine production
Data - an intellectual property asset
A key challenge when engaging with an external partner is the effective capture, cataloguing and transparent utilization of all the data assets. Often the data is housed in silos composed of paper-based binders, logbooks, notebooks and information management systems that contain only a portion of the data asset stream. Data from development (formulations, synthesis or biological processes, analytical method development, etc.) are valuable both for a successful product and for the lessons learned from those that fail. In fact, recent regulatory initiatives from the FDA embrace the QbD principle that quality cannot be inspected into products but should be an integral part of the development-to-commercialization plan.
By creating a “design space” for agile adjustments to the critical process parameters (CPPs), the plant can assure that the critical quality attributes (CQAs) stay in line with the specifications. This data set can only be determined by the development experiments and must be catalogued in a suitable database for real-time use. Thus a comprehensive electronic data capture and management system is needed for new product development and commercialization. The development database is also a source of product/process information that can be used to resolve production “events” in as close to real time as possible, supporting operational excellence and cost control while assuring product quality.
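The design-space idea above can be pictured as a simple table of proven parameter ranges consulted at run time. The following is a minimal, hypothetical sketch: the parameter names and limits are illustrative assumptions, not taken from any actual product or system.

```python
# Hypothetical sketch: checking critical process parameters (CPPs) against
# a QbD design space recorded during development. Parameter names and
# ranges below are illustrative assumptions only.

DESIGN_SPACE = {
    # CPP name: (lower limit, upper limit) proven acceptable in development
    "granulation_temp_C": (22.0, 28.0),
    "mixing_speed_rpm":   (180.0, 240.0),
    "drying_time_min":    (45.0, 75.0),
}

def check_batch(cpp_readings):
    """Return (parameter, value, limits) for every CPP reading that has
    drifted outside the proven design space; empty list means in-spec."""
    excursions = []
    for name, value in cpp_readings.items():
        lo, hi = DESIGN_SPACE[name]
        if not (lo <= value <= hi):
            excursions.append((name, value, (lo, hi)))
    return excursions

readings = {"granulation_temp_C": 29.1,
            "mixing_speed_rpm": 200.0,
            "drying_time_min": 60.0}
print(check_batch(readings))  # flags the out-of-range temperature
```

In a real deployment the ranges would come from the development database described above rather than a hard-coded dictionary, so the plant can adjust within the proven space without triggering a deviation.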
Scientific Information Life-Cycle Management (SILM) – Electronic Data Capture, Management, Reporting and Visualization
cGMP data management technology has matured over the last few decades and is, in fact, converging from point-to-point custom-coded integration schemes that are hard to maintain and validate toward a cohesive, tiered IT infrastructure purpose-configured for QbD and production/QC operational excellence.
The SILM environment begins early in the product development cycle, once the product structure is known and proven to be effective (usually in animal models or early human trials). The experimental processes to determine the optimal synthesis or biological process, the formulation and the associated analytical test methods are all managed in an electronic lab notebook (ELN). ELNs are structured to be agile and open, capturing intellectual property for both successful and failed experiments. Both successes and failures are critical to the ultimate use of the data for QbD and event-based triage in production. This is true for the molecule as well as for the new analytical methods used to monitor the conditions. As the molecule or biologic advances to final pilot and early commercial operations, the process conditions become more fixed, and the analytical methods are ruggedized and become routine.
This situation then warrants a rigid procedure-execution data capture technology, referred to as a lab execution system (LES) in the QC lab or an electronic batch-record system (EBR) in the production environment. The common theme of this electronic environment is capturing all data and, most importantly, all metadata associated with routine operations for compliance reporting and future regulatory auditing. The LES and EBR technology present the method or process recipe “under glass” (see figure 1), with all data, metadata and the compliance state of all instruments and devices used in the process confirmed prior to use.
This assures that no deviations occur during execution of defined and rigid procedures/SOPs. Should a process or method step fall outside its defined parameters, a real-time alert is presented to the analyst/operator before a deviation can occur. After all procedural steps are finalized, the software presents a dashboard for review and approval at both the local and QA levels, enabling zero deviations on procedure execution for QC labs and operations batch records.
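The “under-glass” gating described above amounts to two checks at every step: is the instrument in a compliant state, and does the result sit inside the SOP limits, with all metadata captured for later audit. The sketch below is a hypothetical illustration of that pattern; the class names, fields and rules are assumptions, not the design of any actual LES/EBR product.

```python
# Hypothetical sketch of LES-style step execution: each SOP step confirms
# the instrument's compliance state and the SOP limits before the result
# is accepted, so a deviation is blocked in real time rather than found
# at review. All names and rules here are illustrative assumptions.

from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class Instrument:
    instrument_id: str
    calibrated: bool          # stand-in for the full compliance state

@dataclass
class StepResult:
    step: str
    value: float
    metadata: dict = field(default_factory=dict)

def execute_step(step, instrument, value, low, high):
    # Gate 1: confirm instrument compliance state prior to use.
    if not instrument.calibrated:
        raise RuntimeError(f"{instrument.instrument_id} out of calibration: step blocked")
    # Gate 2: confirm the measured value sits inside the SOP limits.
    if not (low <= value <= high):
        raise RuntimeError(f"{step}: value {value} outside limits [{low}, {high}]")
    # Capture the data plus the metadata an auditor would need later.
    return StepResult(step=step, value=value, metadata={
        "instrument": instrument.instrument_id,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "limits": (low, high),
    })

balance = Instrument("BAL-007", calibrated=True)
result = execute_step("weigh_sample_mg", balance, 101.3, 95.0, 105.0)
print(result.value, result.metadata["instrument"])
```

Because every accepted result carries its instrument identity, limits and timestamp, the review-and-approval dashboard described above can be built directly from these records.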
Should a production “event” occur that requires process engineering intervention (i.e. CPP and CQA misalignment), the analytics processes of the software platform will outline corrective actions based on QbD design-space correlations.
Implementation options – on site or Cloud
The technology to automate the capture and cataloguing of critical-path development data, operational QC/QA data and batch records can be implemented in a variety of ways. Many global life science companies opt for localized or regionalized data centres for direct control of data management. Others opt for a cloud-based solution that does not require investment in localized hardware/servers and IT/IS resources. Today, the latter scenario generally applies to smaller organizations such as generics companies and CMO/CDMO organizations; however, as the broader life science industry adopts cloud-based operations, the platform can be implemented entirely in off-site “cloud” data centres. The general consensus today favours on-site or regional deployment, but cloud implementation is at the forefront of major pharma/biotech considerations. Either way, the technology can be readily implemented today.
Implementation metrics – users of ELNs and LESs in development and QC
At the annual IMACS (International Meeting on Automation Compliance Systems) conference, implementers of ELN/LES/EBR automation outlined key compliance and operational metrics obtained upon implementation of a platform strategy for development and QC/QA/production data capture environments. Likewise, several life science companies and CRO/CMO organizations presented operational metrics that outline the benefits of such a platform technology. Figure 2 shows the operational improvements in product development to commercialization for new chemical/biological entities with respect to analytical development R&D and ruggedization for NDA submission. Figure 3 shows the significant reduction in lab deviations across several dozen global manufacturing sites upon implementation of an LES, while figure 4 outlines the compliance benefits obtained. Figure 5 outlines the expected outcomes of implementation for a CRO/CMO based on an initial pilot program: a 15% reduction in laboratory hours, a 25% reduction in QA hours, a 50% reduction in report writing, a 23% reduction in archiving hours and a 40% reduction in printing costs for traditional paper-based notebooks.
Operational excellence is a key mantra for modern life science companies and the CMO/CDMO organizations that drive the adoption of new drug products in the marketplace. A key consideration is the value of the data obtained during the life cycle from development through commercial operations. A coherent data management strategy spanning early molecular development through final product production covers development experimentation, QbD data sets, production and recipe execution, and QC/QA test method execution. Sharing this data with the host company through a shared data exchange portal allows timely data capture, analysis, interpretation and knowledge development for effective partnering. OpEx benefits have been demonstrated to include a 20%+ cost improvement, a 50%+ cycle-time reduction and overall compliance risk reduction for cGMP product development, production and QC. Development and production/QC data management under a SILM strategy is indeed a critical-path operational excellence contributor for 21st century life science businesses.
John P. Helfrich, VP, Strategic Programs, Accelrys Inc.