5 MINUTES WITH: Mikael Hagstroem, CEO, LabVantage Solutions

DCAT Value Chain Insights’ “5 Minutes With,” part of the DCAT Member Company Community section, features interviews with business and industry leaders on issues impacting the bio/pharmaceutical manufacturing value chain.

Mikael Hagstroem
CEO
LabVantage Solutions

This “5 Minutes With” features Mikael Hagstroem, CEO of LabVantage Solutions, a provider of laboratory informatics solutions, who provides insights on the key issues impacting data integrity, data management, and data sharing in the bio/pharmaceutical manufacturing value chain, including the future of artificial intelligence in these functions.

Question: Data management and data integrity for analytical testing and quality control are crucial in the supply of raw materials, intermediates, active pharmaceutical ingredients, and drug products. What would you identify as three key trends in data management, how that data may be shared, and/or other trends in information technology impacting how information is developed and shared among partners in the bio/pharmaceutical manufacturing value chain?

Hagstroem (LabVantage): The trend for many years has been to replace the multiple, siloed, and aging legacy systems in the lab and consolidate them into one modern, centrally hosted laboratory information management system (LIMS). Where in the past various distributed systems served each company location, consolidation of data provides one system of record for all quality and research data. More up-to-date systems also provide an opportunity to harmonize workflows, testing methods, and policies and procedures, and allow for more complete data-integrity compliance.

Once data are in a centrally hosted platform serving the enterprise, those data are available to view, trend, and analyze with algorithms. Greater use of powerful analytics solutions that can ingest data from the lab and other data sources allows companies to use artificial intelligence/machine learning (AI/ML) capabilities to predict possible outcomes or future events that can affect success.

This, of course, would benefit from international standards for data. At the risk of oversimplifying, sharing data can be risky, and the solution is to ensure that parties share a common framework for the privacy, protection, and control of shared data. Once established, many different technologies might be used to meet this goal, with an appropriate emphasis placed on the electronic delivery of data that is both real-time and fosters self-service through shared electronic systems and advanced analytics.

Question: During the pandemic, remote/virtual customer audits and remote/virtual inspections by regulatory agencies came into play in the bio/pharmaceutical industry. From a data-management and data-sharing perspective, what additional practices/technologies were applied to enable these types of audits/inspections? Do you see them having continued use?

Hagstroem (LabVantage): When an auditor cannot physically sit at a table to review documents and data, as was the case during the pandemic, having clear, accurate, and fully accessible information that can be shared remotely becomes essential. Moving critical business systems to the cloud and accessing them via the web was the result of having digital technology integrated into all areas of the lab and rationalized to tell a data story that is accurate and transparent.

Seen as a lever to force change, the pandemic brought with it considerable urgency. An upside to that change was moving digital transformation from an aspiration to a necessity whose reward is well worth maintaining into the future. Based on our experience, we anticipate that the majority of audits going forward will be remote.

Question: Real-time data sharing between sponsor companies (i.e., bio/pharma companies) and CDMOs/CMOs for a given project is a goal, but how is that best enabled between both parties—what technologies, systems, and user practices need to be implemented? What are the challenges and benefits?

Hagstroem (LabVantage): Exchanging data between two separate corporate entities (i.e., pharma companies and CDMOs) is possible but challenging. Differences in enterprise systems, terminology, and methodologies require a translation layer between the two businesses to properly exchange data. Transport technologies, such as XML, HL7, or agreed-to file formats, are useful. Our approach has been to provide a secure web portal to the LIMS, thereby enabling external staff to interact securely and appropriately with internal business applications.
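The translation layer described above can be sketched in a few lines. This is a minimal, hypothetical illustration: the XML element names, the mapping table, and the function are invented for the example and do not reflect LabVantage's actual schemas or any specific HL7 profile.

```python
# Hypothetical sketch of a translation layer between a CDMO's XML result
# export and a sponsor's LIMS field names. All element names and mappings
# below are illustrative assumptions.
import xml.etree.ElementTree as ET

# Agreed-to field mapping: CDMO terminology -> sponsor LIMS terminology
FIELD_MAP = {
    "SampleID": "sample_id",
    "AssayName": "test_method",
    "ResultValue": "result",
    "Units": "unit",
}

def translate_result(xml_text: str) -> dict:
    """Parse one CDMO result record and rename its fields for the sponsor LIMS."""
    root = ET.fromstring(xml_text)
    record = {}
    for child in root:
        if child.tag in FIELD_MAP:
            record[FIELD_MAP[child.tag]] = child.text
    return record

cdmo_export = """<Result>
  <SampleID>API-2023-0042</SampleID>
  <AssayName>HPLC Purity</AssayName>
  <ResultValue>99.2</ResultValue>
  <Units>%</Units>
</Result>"""

print(translate_result(cdmo_export))
# {'sample_id': 'API-2023-0042', 'test_method': 'HPLC Purity', 'result': '99.2', 'unit': '%'}
```

In practice the mapping table itself is the hard part: it encodes the agreed terminology between the two businesses, which is why a shared standard (or a secure portal into one system of record) reduces so much of the friction.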

Question: Looking forward over the next five years, what do you see as the three most important technologies/systems or practices emerging for data management and data sharing as it applies to manufacturing and related analytical testing?

Hagstroem (LabVantage): We believe in independent industry groups, such as BioPhorum, which are attempting to provide standards for data representation that would foster the sharing of data between key business systems and business organizations. With these standards, data-sharing barriers will begin to shrink. Analytics solutions with AI/ML capabilities embedded in critical business applications like LIMS will move closer to intelligent solutions. Once the lab is fully digitized and analytics are applied for informed decision-making and predicting outcomes, the lab of the future can be realized. Lastly, emerging technologies that provide augmented reality at the lab bench will speed work execution, improve data integrity, and better support an IoT [Internet of Things] strategy.

Question: Longer term, what role will emerging technologies, such as AI or other technologies, play in the day-to-day operations in data management and ensuring data integrity in manufacturing and related analytical testing? Are there uses, not yet in play, that would be in a vision for the future?

Hagstroem (LabVantage): The application of artificial intelligence—including machine learning—will make a significant impact on laboratory operations. Currently, a human reviewer is required to spend a considerable amount of time examining not only results against specifications, but also the behavioral patterns of lab personnel during testing and quality processes. Reviewers are trained to examine data for procedural variations, inadvertent errors, and even the intentional falsification of data. While we may not be ready to allow computers to make all of our review and approval decisions in the laboratory, systems utilizing machine intelligence can be taught to quickly identify patterns in data that require additional investigation and distinguish them from cases where both data and behavior carry the least risk. This is becoming increasingly important due to the robust manner in which modern laboratory systems track and store larger volumes of data, including audit records against individual samples or specimens. AI/ML capabilities can monitor lab operations to optimize work plans and also predict process or system failures, so corrective actions can be taken to reduce waste and downtime.
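The idea of machines triaging which results need a human reviewer can be illustrated with a deliberately simple statistical screen. This is only a sketch standing in for the richer ML models described above; the assay values and the z-score threshold are invented for the example.

```python
# Illustrative sketch only: a simple z-score screen that flags results
# for additional human review, standing in for the more sophisticated
# AI/ML pattern detection described in the interview. Data and
# threshold are invented.
from statistics import mean, stdev

def flag_for_review(results, threshold=2.0):
    """Return indices of results whose z-score exceeds the threshold."""
    mu, sigma = mean(results), stdev(results)
    return [i for i, x in enumerate(results)
            if sigma > 0 and abs(x - mu) / sigma > threshold]

# Historical assay values with one result that warrants a closer look
assay_values = [99.1, 99.3, 99.0, 99.2, 99.4, 95.6, 99.2, 99.1]
print(flag_for_review(assay_values))
# [5]  (the 95.6 result is flagged; the rest pass without review)
```

A real system would of course learn far subtler patterns, including reviewer behavior and audit-trail anomalies, but the triage principle is the same: concentrate scarce human review time on the records most likely to carry risk.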

Improving intra- and inter-organizational lab processes is required, and such processes can be designed top-down, resulting in execution of local processes in a distributed manner. However, implementing data flow in such a scenario is highly complex because of the combinatorial explosion of possible solutions and potentially conflicting goals for data flow. Leveraging an AI/heuristic algorithm can overcome those complexities.

Mikael Hagstroem is CEO of LabVantage Solutions, a provider of laboratory informatics solutions. He is a well-respected strategist in the fields of analytics, digital transformation, and artificial intelligence. Formerly the Chief Operating Officer of McKinsey Analytics and President at SAS International, he was CEO and President at MetricStream prior to joining LabVantage in March 2021.