Why nanonisation is vital for bioavailable nanoparticles

Dr Niklas Sandler, chief technology officer at Nanoform, explains how nanonisation and artificial intelligence (AI) can help pharmaceutical manufacturers unlock the added value of bioavailable nanoparticles.

Nanonisation, the process of manufacturing nanometre-scale particles, is a powerful technique for increasing the bioavailability of drug substances by enhancing their dissolution rate and solubility. An estimated half of all drug substances currently working their way through pharma pipelines exhibit poor solubility. This can all too easily result in failure during clinical development, as poorly soluble compounds are absorbed less efficiently into the body. Overcoming low solubility by simply increasing the dose is often not an option: it can be practically unfeasible, or result in increased side effects.

Nanoparticles have a clear advantage over microparticles in terms of bioavailability, which is best explained by the difference in relative surface area between particles on the nanometre versus the micrometre scale. A 100-fold decrease in the size of a cubic particle increases its surface-area-to-volume ratio 100-fold. This dramatically increases the particle’s exposure to solvent and raises its dissolution rate, which correlates with increased solubility and better absorption into the body.
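The surface-area arithmetic can be checked with a short calculation. This is a minimal sketch: the two particle sizes are arbitrary illustrative values, chosen 100-fold apart.

```python
# For a cube of side length a, surface area / volume = 6a^2 / a^3 = 6/a,
# so shrinking a by 100x raises the ratio 100x.

def sa_to_volume_ratio(side_m: float) -> float:
    """Surface area divided by volume for a cube (units: 1/m)."""
    return 6.0 * side_m**2 / side_m**3

micro = sa_to_volume_ratio(10e-6)   # 10 micrometre particle
nano = sa_to_volume_ratio(100e-9)   # 100 nanometre particle

print(nano / micro)  # ratio of ~100
```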

One drug on the market for which this increased relative surface area was crucial to successful commercialisation is aprepitant, used to treat chemotherapy-induced nausea. Aprepitant is a largely non-polar compound that is very poorly soluble in water and is absorbed through only a small section of the upper gastrointestinal tract. Only through the manufacture of nanoparticles could the drug be made sufficiently soluble to take advantage of this narrow absorption window. However, manufacturing these nanoparticles required a complex milling process, which is not reproducible for many drug substances in development and introduces numerous excipients into the final formulation. New technologies must therefore be investigated to better overcome the industry’s formulation challenges and enable more promising drug candidates to reach the market.

Bringing together advanced tech, expert knowledge and machine learning

Recent advances in nanonisation technology, such as controlled expansion of supercritical solutions (CESS), can reproducibly generate pure drug particles, free from excipients. In the CESS process, drug substances are dissolved under high pressure in supercritical CO2, and then recrystallised through a controlled step-wise reduction in pressure. After careful optimisation of parameters, the methodology enables unprecedented control over the thermodynamic processes of particle nucleation and growth. A deeper understanding of the specific parameters that are likely to lead to successful nanonisation of a given drug compound will allow further gains in quality-by-design, and help pharma companies to focus on winning compounds.

Manufacturing effective nanoparticles requires knowledge at the intersection of biology, chemistry and physics. A significant facet of this is understanding where a given candidate for nanonisation sits in chemical space, and the numerous physical and chemical properties that will affect its performance in experiments. These properties are many and varied, including the overall size of the molecule, its functional groups and its hydrogen-bonding network. As drug substances go through the nanonisation process, the reaction data collected can help elucidate how different molecular properties interact to influence nanoparticle manufacture. Given the sheer number of variables that can affect nanonisation success, however, a machine learning approach has the potential to be extremely valuable in truly understanding how a molecule will behave. Smartly integrated machine learning systems can complement advanced technology and expert knowledge by delivering the insights needed for process optimisation.

AI for all

AI solutions currently being rolled out within the pharma industry are typically based on machine learning algorithms trained on big data. For optimising drug particle nanonisation, however, this approach is of limited use. Firstly, the input data is generally drawn from many different sources and used in such a way that its accuracy can be difficult to verify. Secondly, datasets of the required scale are something even the largest pharma companies struggle to acquire. An alternative is therefore to leverage sparse data AI, which allows for process optimisation with far fewer experimental data points. With sparse data AI, smaller datasets are directly augmented with detailed expert knowledge to allow for probabilistic prediction of factors such as how a given molecule will behave under certain conditions. The approach is a promising way to help identify the compounds most likely to be successfully nanonised, and the conditions that will best facilitate the process.
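As an illustration of the kind of probabilistic prediction described above, the sketch below fits a Gaussian process to a handful of data points and returns both a prediction and an uncertainty estimate. All descriptors and outcomes are invented placeholders, and scikit-learn’s Gaussian process is a generic stand-in for whatever sparse-data method a real system would use.

```python
# Illustrative only: a probabilistic model trained on five (hypothetical)
# experiments, returning a mean prediction plus an uncertainty estimate --
# the kind of output sparse-data approaches rely on.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Hypothetical scaled molecular descriptors (e.g. size, polarity) and an
# observed outcome (e.g. median particle size in nm) for five experiments.
X_train = np.array([[0.1, 0.9], [0.4, 0.2], [0.5, 0.5],
                    [0.8, 0.3], [0.9, 0.8]])
y_train = np.array([320.0, 150.0, 180.0, 90.0, 260.0])

gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5),
                              alpha=1e-2, normalize_y=True)
gp.fit(X_train, y_train)

# Predict mean and standard deviation for a new, untested candidate.
mean, std = gp.predict(np.array([[0.6, 0.4]]), return_std=True)
print(f"predicted size: {mean[0]:.0f} nm +/- {std[0]:.0f} nm")
```

The point of the uncertainty estimate is that, with so few data points, knowing how much to trust a prediction matters as much as the prediction itself.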

A deeper understanding of the factors influencing the nanonisation process allows more precise design of nanoparticle attributes, in turn allowing more formulation challenges to be overcome. To expand this understanding, sparse data AI can use available in vivo and in vitro experimental data to ground and guide in silico experiments that investigate how various parameter settings affect different molecules. This enables more rapid decisions about which molecules are likely to succeed, and provides a good indication of the process parameters likely to work. As a molecule moves through the iterative stage of process optimisation, AI can further be used to drive decisions about parameters such as temperature, flow rate and pressure, in order to achieve optimal nanoparticle products.
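One common way such an iterative loop can drive parameter decisions is a Bayesian-optimisation-style acquisition step: fit a probabilistic surrogate to the runs completed so far, then choose the untried settings with the best upper-confidence bound. The sketch below is purely illustrative; the process settings and yields are hypothetical, and the Gaussian process surrogate is an assumption, not a description of any specific system.

```python
# A minimal sketch of one acquisition step in an iterative optimisation loop.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

# Settings already tried: (temperature, pressure), scaled to [0, 1], with the
# observed fraction of particles landing in the target size range.
tried = np.array([[0.2, 0.3], [0.5, 0.7], [0.8, 0.4]])
yields = np.array([0.35, 0.62, 0.48])

gp = GaussianProcessRegressor().fit(tried, yields)

# Candidate settings not yet run.
candidates = np.array([[0.3, 0.5], [0.6, 0.6], [0.9, 0.9]])
mean, std = gp.predict(candidates, return_std=True)

# Upper-confidence bound: favour high predicted yield AND high uncertainty,
# balancing exploitation of good regions with exploration of untested ones.
ucb = mean + 2.0 * std
best = candidates[np.argmax(ucb)]
print("next settings to test:", best)
```

After the chosen run is executed, its result is added to the training data and the loop repeats, so each experiment sharpens the surrogate model.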

Helping drug candidates reach the clinic

The use of state-of-the-art machine learning to predict a drug substance’s likelihood of successful nanonisation will help pharma companies unlock added value across their preclinical, clinical and late-phase programmes. By delivering constant process improvements through consistent learning, the technique is poised to support the reliable production of high-quality, highly soluble, highly bioavailable nanoparticles. In turn, this will help solve the industry’s challenging particle manufacturing problems and enable more molecules to reach the clinic. Through the application of expert knowledge, cutting-edge technology and sparse data AI, advanced nanonisation processes have the potential to minimise problems with solubility and double the number of drugs reaching the market.
