It’s well known that biotherapeutic drugs (also known as biologics) are more difficult to produce consistently than our old friends, small-molecule drugs. Manufacturers spend enormous sums making sure that each batch of a biologic not only performs exactly the same way but also meets the strict regulatory requirements for consistency.

By the way, here is a very informative nine-minute video by Professor DeMasi at MCPHS University, Boston (USA), titled Therapeutic Applications of Monoclonal Antibodies, that is well worth a watch.


When Human Clinical Trials Go Bad

With all this in mind, surely nothing ever goes wrong, right? Unfortunately, that’s not the case. Possibly the riskiest stage of a drug’s development is the human clinical trial, in which the drug is given to human volunteers, albeit at very low doses to begin with. For some drugs, early testing can be done on animals, but animals (even primates) are not always a good proxy for humans, and I think we’ll save the ethics conversation for another day.

There’s a well-known and well-documented example of a human clinical trial going bad: the testing of TGN1412 (link to article), a so-called “superagonist” monoclonal antibody developed by TeGenero Immuno Therapeutics (link to article about company). Although the drug had performed safely and effectively in animal tests, during the first human tests in 2006 (at very low doses) it caused catastrophic organ failure in four of the test subjects. The inquiry (still underway) suggests that the trial protocols may not be to blame, and that the cause could be an “unforeseen biological action in humans.”

So, what about TeGenero? You may not be surprised to hear that they went bankrupt later that year. The concerning addendum to this story is that the commercial rights to the drug were purchased by another biotech company (link to company website), which plans to restart clinical trials this year. Reassuringly, they plan to use much lower dose levels than TeGenero did.

TGN1412 was a specific and rather unlucky case of how it can all go wrong even when regulatory guidelines are strictly followed. But what happens when the regulations are not followed?


Complying with Biopharma Regulatory Guidelines

Another well-documented example concerns Cetero Research (link to article about the company). In 2011, the U.S. Food and Drug Administration (FDA), part of the U.S. Department of Health and Human Services, identified significant and widespread evidence of data falsification during drug trials. Amazingly, you can read the actual letter sent to Cetero by the U.S. Government (link to letter on FDA website). For instrument companies such as Thermo Fisher Scientific Inc., one of the critical points made in the letter is that the company “fail(ed) to demonstrate that the analytical method used in an in vivo… …is accurate and of sufficient sensitivity to measure, with appropriate precision, the actual concentration of the active drug ingredient or therapeutic moiety.”

In essence, the FDA found that the company could not demonstrate that the analytical technology used in its biopharmaceutical characterization was accurate and sensitive enough for the job. So where did this leave Cetero? You guessed it: huge debts and, in 2013, a Chapter 7 bankruptcy filing.


Worryingly, Cetero is far from the only example. Another company, MDS Pharma (link to article about company), faced the consequences of contamination problems in a bioequivalence study. They were required to repeat the study at a different bioanalytical facility, and this forced them out of business. It even happens to big companies: in 2006, inspectors found deviations from the FDA cGMP regulations (link to regulations) at the generics manufacturer Ranbaxy. Investigations found impurities in the drugs and apparent inconsistencies in dosage levels. As a result, the share price crashed, and the FDA fined the company half a billion dollars.

And to top it all off, before you think this only applies to the commercial producers, earlier this year the National Institutes of Health (link to website) in the U.S. was forced to close its own Clinical Center for Pharmaceutical Development (link to article) due to “the discovery of serious manufacturing problems and lack of compliance with standard operating procedures.”

So what’s to be done? I spoke to my friend Christian Luber at a well-known biopharmaceutical company, and he told me that what biotherapeutic manufacturers need is confidence. What he meant by that is the reassurance that the analytical tools they are using will give them the right answer every time, and the same answer every time (that is, reproducibility).


Analytical Solutions for Biopharmaceutical Reproducibility

From discovery to lot testing, a handful of analytical techniques are used at almost every stage. A great example is liquid chromatography, now most commonly in the form of UHPLC. Just this month, we launched a new UHPLC system (Thermo Scientific Vanquish Flex UHPLC System) that could help with the reproducibility issue: this robust, inert and flexible system was designed to deliver the most repeatable separations possible in the demanding environment of biotherapeutic production.

As I listened to some excellent presentations at the recent HPLC conference in Geneva last month, it struck me that sample preparation, digestion and clean-up are potentially bigger sources of reproducibility variation than the UHPLC system or the chromatography column. That got me thinking about our latest tool for protein digestion (Thermo Scientific SMART Digest kit), designed specifically for biotherapeutic proteins. If you’re not familiar with protein characterization, one of the major steps is digestion, which breaks the protein into its constituent peptides for analysis. Done conventionally, this is a slow, labor-intensive technique that introduces significant sample-to-sample variation.

Put another way, digestion is a weak link in the characterization process, from both a time perspective and a data-consistency perspective. The kit requires far less manual handling, takes vastly less time, and delivers demonstrably better consistency of protein digestion. The product is still very new, but we believe it represents a major step forward for biotherapeutic characterization workflows such as peptide mapping.

In conclusion, the issues with drug variability won’t be solved overnight, but we are working tirelessly to improve the lot of the laboratory scientist laboring over proteins that are unstable, heterogeneous, and difficult to characterize. Here’s to more solutions just around the corner!


Do you have any insight into biotherapeutics, especially ADCs or biosimilars? Or, do you have an interesting story about the regulated environment that these drugs are produced in? If so, please let us know.