The pharmaceutical industry could become more efficient by making better use of the data produced by bioprocessing. That’s according to Michael Sokolov, PhD, co-founder and COO of DataHow.

In an exclusive interview, he explained that the pharma industry needs “to use fewer resources while operating in a more efficient way” to master the well-documented pressures of an aging population, biosimilars, and new therapeutics.

And since, he argues, companies use only an estimated 6% of their data, “data must become a central pillar in terms of [use of] resources.” Amgen, for example, he says, invested more than $100 million in digitizing their data, and, he argues, the business case for digitization will become clearer over the next few years.

DataHow, a spinoff company from ETH Zurich, has developed an algorithmic toolbox to analyze biomanufacturing processes. The aim, he says, is to help process operators with decision planning and to aid automation as part of Industry 4.0. The company acquired its first customer in 2017, he says, and now works with large pharmaceutical clients such as Sanofi and Merck.

The toolbox, which will soon be released as a software package called DataHowLab, works on top of simple data sources or the sophisticated data management systems often provided by big technology suppliers, Dr. Sokolov explains. It can be used for decision planning, or to actively monitor, control, and optimize a process, work for which DataHow collaborates with TU Berlin.

“We don’t believe biopharma has a big data problem,” he says. “The amount of data created during process development over months to years is in the terabyte range – that’s the same as Facebook in a day!”

Rather than a size problem, he believes biopharma has a problem with data complexity. Although companies analyze a few important process variables, this isn’t implemented consistently and doesn’t account for interactions between multiple variables, he says.
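To make the interaction point concrete, here is a minimal, hypothetical sketch (not DataHow’s method) using synthetic data in which product titer depends on the interaction of temperature and feed rate: a model that treats each variable additively sees almost nothing, while one that can represent interactions recovers the signal. All variable names and values are invented for illustration.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

# Synthetic bioprocess data (hypothetical): titer is driven by the
# *interaction* of temperature and feed rate, plus noise, so neither
# variable is informative on its own.
n = 500
temp = rng.normal(36.5, 0.5, n)   # culture temperature, degC
feed = rng.normal(2.0, 0.3, n)    # feed rate, mL/h
titer = 0.8 * (temp - 36.5) * (feed - 2.0) + rng.normal(0, 0.05, n)

X = np.column_stack([temp, feed])

# An additive linear model (no interaction terms) finds almost no signal...
print("additive R^2:", round(LinearRegression().fit(X, titer).score(X, titer), 2))

# ...while a model that captures variable interactions recovers it.
forest = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, titer)
print("interaction R^2:", round(forest.score(X, titer), 2))
```

Monitoring temperature or feed rate in isolation, as a univariate control chart would, misses this kind of effect entirely.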

He compares the difference to a nuclear power plant. “The controls there are very robust because they’ve had sophisticated know-how built into their control scheme for decades. But, in a biopharmaceutical plant, the controls are very basic and don’t reflect all the know-how already available.”

“Our tool tries to analyze all the data,” he says. “We compensate for the lack of data, and the complexity problem, by teaching our engineering knowledge to the software – we’re all biotechnologists and chemical engineers, and we bring in experts in industrial processes as well.” This knowledge is combined with AI to derive non-intuitive patterns in the data, he explains.
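The combination he describes resembles what is often called hybrid modeling: a first-principles term paired with a machine-learned correction. The sketch below is a rough, hypothetical illustration of that general idea, not DataHow’s implementation. A textbook Monod growth model stands in for the engineering knowledge, and an ML model is trained only on the residuals the mechanistic term leaves unexplained; all names and values are invented.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

def mechanistic_growth(substrate, mu_max=0.5, ks=1.0):
    """Monod kinetics: a textbook first-principles model of growth rate."""
    return mu_max * substrate / (ks + substrate)

rng = np.random.default_rng(1)
n = 400
substrate = rng.uniform(0.1, 10.0, n)   # g/L, synthetic
ph = rng.uniform(6.5, 7.5, n)           # synthetic

# "True" process: Monod growth modulated by a pH effect that the
# mechanistic model ignores, plus measurement noise.
observed = (mechanistic_growth(substrate) * (1 - 0.4 * np.abs(ph - 7.0))
            + rng.normal(0, 0.01, n))

# Train the ML component only on what the physics leaves unexplained.
residual = observed - mechanistic_growth(substrate)
features = np.column_stack([substrate, ph])
ml_correction = GradientBoostingRegressor(random_state=0).fit(features, residual)

def hybrid_predict(substrate, ph):
    """Hybrid prediction: mechanistic backbone plus learned correction."""
    features = np.column_stack([substrate, ph])
    return mechanistic_growth(substrate) + ml_correction.predict(features)
```

Because the mechanistic backbone carries most of the structure, the learned correction needs far less data than a pure black-box model, which matches the “compensate for the lack of data” framing.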
