
5 Reasons Why Zetasizer Is Still The Most Widely Used Dynamic Light Scattering (DLS) System

Dynamic light scattering (DLS) is now a ubiquitous tool in many laboratories, offering an accessible and accurate way to determine hydrodynamic size distributions in minutes. The technique is non-invasive, requires very little sample and is easy to use for operators of all experience levels.

The newly released Zetasizer Lab, Pro and Ultra are the latest additions to the Zetasizer range. Adding ease and performance to the popular Zetasizer Nano range, they offer updated measurement features, hardware capabilities and software intelligence that are unmatched. In this article we take a deep dive into the top 5 features and updates of the Zetasizer Pro and Ultra systems and share some insights into how you can get the most out of your DLS instrument – and your analysis.

An overview: how DLS technology works

In DLS, the speed at which particles diffuse due to Brownian motion is measured. This is done by directing laser light at a sample contained in a cell. For dilute samples, most of the laser light passes through the sample, but some light is scattered by particles in all directions. A detector is used to measure the intensity of the scattered light. In the Zetasizer Advance series, the detector is positioned at either 173° (non-invasive backscatter), 90° (side scattering) or 13° (forward scattering).

The intensity of scattered light must be within a specific range for the detector to successfully measure it. If too much light is detected, then the detector will become saturated. To overcome this, an attenuator is used to reduce the intensity of the laser source and hence reduce the intensity of scattering. For samples that do not scatter much light, such as very small particles or samples of low concentration, the amount of scattered light must be increased. In this situation, the attenuator will allow more laser light through to the sample. 

The scattering intensity signal from the detector is passed to a correlator which compares the scattering intensity at successive time intervals to derive the rate at which the intensity is varying. This correlator information is then passed to the Zetasizer software to analyse the data and derive size information.
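The chain from correlogram to size can be sketched in a few lines. The snippet below is an illustrative simplification only (a single-exponential fit followed by the Stokes-Einstein relation, with assumed instrument parameters: 633 nm laser, 173° detection, water at 25 °C); the Zetasizer software's real analysis handles polydispersity and noise far more robustly.

```python
import numpy as np

# Assumed instrument/sample parameters (illustrative: 633 nm laser,
# 173 deg backscatter detection, water at 25 C).
KB = 1.380649e-23      # Boltzmann constant, J/K
T = 298.15             # temperature, K
ETA = 0.8872e-3        # viscosity of water at 25 C, Pa*s
N_MEDIUM = 1.33        # refractive index of water
WAVELENGTH = 633e-9    # laser wavelength, m
THETA = np.deg2rad(173)

def scattering_vector():
    """Magnitude of the scattering vector q for this geometry."""
    return 4 * np.pi * N_MEDIUM / WAVELENGTH * np.sin(THETA / 2)

def hydrodynamic_diameter(g2, tau):
    """Fit g2(tau) = 1 + beta * exp(-2*Gamma*tau), then apply Stokes-Einstein."""
    # Linearise: ln(g2 - 1) = ln(beta) - 2*Gamma*tau
    slope, _ = np.polyfit(tau, np.log(g2 - 1), 1)
    gamma = -slope / 2                              # decay rate, 1/s
    diff_coeff = gamma / scattering_vector() ** 2   # diffusion coefficient, m^2/s
    return KB * T / (3 * np.pi * ETA * diff_coeff)  # hydrodynamic diameter, m

# Synthetic, noise-free correlogram for a 100 nm particle
d_true = 100e-9
gamma_true = KB * T / (3 * np.pi * ETA * d_true) * scattering_vector() ** 2
tau = np.linspace(1e-6, 1e-3, 200)   # lag times, s
g2 = 1 + 0.8 * np.exp(-2 * gamma_true * tau)

print(f"{hydrodynamic_diameter(g2, tau) * 1e9:.1f} nm")  # recovers 100.0 nm
```

Note that the fit recovers the input size exactly only because the synthetic data is noise-free and monodisperse; real correlograms require regularised inversion methods.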

Zetasizer: the standard in light scattering for over 40 years

What sets the Zetasizer apart from other similar systems is its performance, reliability and ease of use. Over 40 years ago, the Malvern correlator opened the door to a field of research and development investigating ever smaller particles, and it has advanced continuously since then. Since its launch over two decades ago, the Zetasizer Nano series has been the standard for performing dynamic light scattering (DLS) measurements on a wide range of particles and materials. The Zetasizer Nano was the first system to combine dynamic, electrophoretic and static light scattering in one instrument, and it quickly gained well-deserved attention for core features including fast, simple-to-use yet sophisticated software with built-in guidance. Most significantly, the Zetasizer is known not only for providing the highest sensitivity but also the widest concentration range, thanks to the then-novel Non-Invasive Back Scattering (NIBS). NIBS reduces multiple scattering – the effect where light scattered by one particle is scattered again by other particles – by moving the focusing lens to change the measurement position. Light then passes through a shorter path length of the sample, allowing higher concentrations and turbid or opaque samples to be measured.

A point worth clarifying: NIBS and backscatter are not the same

NIBS is a key feature with unique functionality that separates the Zetasizer from other DLS systems. The patented NIBS technology enables the highest sensitivity for both small and large particles, even for the most concentrated samples. This ability to perform at the highest level no matter the application has made the Zetasizer Nano the most popular instrument for sizing by DLS, with over 100,000 peer-reviewed publications.

DLS has traditionally used a 90° detection angle. Adding a backscatter angle provides several benefits, allowing higher sensitivity and a larger measurable size range thanks to the increased scattering volume. Backscatter measurements are also less sensitive to large particulates such as dust, removing the need for the time-consuming sample preparation that traditional 90° measurements require. However, the benefits of backscatter come with compromises: the increased volume reduces the upper concentration limit, increased flare creates more noise, and reduced sensitivity may fail to detect the presence of important aggregates.

These are overcome in the Zetasizer using NIBS at a detection angle of 173°. For aggregate detection, a forward angle of 13° is employed to detect the presence of aggregates at much lower concentrations (with higher sensitivity) than backscatter or 90°.

NIBS automatically determines the optimum measurement position within the cuvette and the correct laser attenuation for the sample being measured. When analysing very low concentrations or weakly scattering particles, NIBS positions the detector optics at the centre of the cell to maximise the scattering volume. As the concentration or scattering intensity increases, it avoids multiple scattering by moving the optics across the cell in small increments. At high concentrations the optics are positioned at the cell wall, reducing the path length and therefore minimising multiple scattering. Together with the attenuator, which automatically adjusts to ensure the optimum amount of light is used, this ensures that whatever the concentration, size and scattering efficiency, optimal results are reached across the broadest range of applications. These features make NIBS unique, providing extremely useful functionality unavailable on other instruments (even those using backscatter detection).

New Zetasizer Advance series – top 5 features

The Zetasizer Ultra has multiple features that help to reduce the time taken for measurements while providing much more detail on sample properties.

  1. Faster sizing and better size data with Adaptive Correlation (AC): measurements take less time, and data can now be obtained from samples that were previously too noisy.

Adaptive Correlation is a new approach for capturing and processing DLS data. It uses statistical models to highlight any captured data that is not representative of the sample, such as rare dust particles. Multiple short sub-runs are performed and the resulting correlation functions are averaged to reduce the effects of noise. AC allows the characterisation of consistent, steady size components without the data being skewed by intermittent or transient scatterers. In this way, measurements can exclude the effects of dust while also increasing measurement speed and repeatability. There is also less need for filtering of samples and dispersants, simplifying sample preparation procedures.
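A toy version of the idea, not Malvern's patented algorithm: collect many short sub-runs, flag the ones whose amplitude is a statistical outlier (for example, a dust transit), and average only the rest. The outlier rule used here (median absolute deviation) is an assumption for illustration.

```python
import numpy as np

def adaptive_average(subruns, k=4.0):
    """Average sub-run correlograms, excluding statistical outliers.

    subruns: array of shape (n_subruns, n_lags), one correlogram per
    short sub-run. Sub-runs whose mean amplitude deviates from the
    median by more than k median-absolute-deviations (e.g. a dust
    transit) are treated as transient and left out of the steady-state
    average. Illustrative outlier rule only - not Malvern's algorithm.
    """
    amps = subruns.mean(axis=1)
    med = np.median(amps)
    mad = np.median(np.abs(amps - med)) or 1e-12  # guard against mad == 0
    keep = np.abs(amps - med) <= k * mad
    return subruns[keep].mean(axis=0), keep

# Ten clean sub-runs plus one skewed by a transient dust particle
rng = np.random.default_rng(0)
tau = np.linspace(0, 5, 50)
clean = 1 + 0.8 * np.exp(-tau) + rng.normal(0, 0.005, (10, 50))
dusty = clean[0] + 0.5              # dust inflates the whole correlogram
runs = np.vstack([clean, dusty])

avg, keep = adaptive_average(runs)
print(f"{keep.sum()} of {len(runs)} sub-runs retained")
```

The dusty sub-run is rejected while the steady scatterers are retained, so the averaged correlogram reflects only the consistent size components.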

  2. AI-guided, neural-network-based size data quality advice: even a novice without any prior light scattering knowledge can make sense of sizing data.

The new ZS Xplorer software offers intuitive, guided workflows that make setting up a method and performing a measurement easy and straightforward. Using an artificial intelligence (AI) led approach to data quality assessment, it brings attention to any potential measurement issues and provides guidance on how to resolve them.

  3. A fluorescence filter wheel allows measurement of fluorescent samples, which can otherwise cause large background noise in the data. The fluorescence filter eliminates that noise and makes a measurement possible despite the presence of fluorescence.

For fluorescent samples such as quantum dots, light emitted by the sample other than laser scattering decreases the signal-to-noise ratio. The Zetasizer has an option that can minimise the effect of this (incoherent, and thus undesirable for dynamic light scattering) fluorescent light: the fluorescence filter eliminates most light that is not very close to the laser wavelength. The filter is an optical component consisting of glass with a special coating that reflects light outside the designated wavelength range, so only a select wavelength range is transmitted.

  4. A polarisation filter: both vertical and horizontal polarisation components can be detected, potentially giving insight into particle rotational diffusion.

Adding a polarising filter can clean up the optical signal by removing any depolarised light which can be a source of noise in measurements caused by multiple scattering. This feature provides versatility to measure over a wide concentration range, improving signal-to-noise without impairing overall system sensitivity. In addition to a vertical polariser, the Zetasizer Pro and Ultra have a horizontal polariser which measures depolarised light. A depolarised DLS signal can be used to detect rotational diffusion which indicates differences in particle shape and whether particles are spherical or have surface differences.

  5. A novel 3 µL low-volume size cell lowers sample volume and extends the concentration range at 90°: even turbid samples can now be measured at 90°, which was previously not possible and required NIBS backscattering.

As particle size increases, thermal Brownian motion is no longer sufficient to keep particles in suspension, and samples may sediment over time, meaning the motion of the particles is no longer random. In addition, measurements of particles over 1 micron in size may show increased variability as a function of temperature, suggesting that thermal effects influence the artefacts seen in the measured correlation functions. The geometry of the 1 mm capillary used in the low-volume disposable sizing cell helps prevent the formation of convection currents and thus allows accurate measurements, without modification of the sample dispersant, over the entire measurable size range for DLS. Repeatability for polydisperse samples is improved over comparable measurements in a standard cuvette. The cell also eliminates the errors associated with multiple scattering, allowing samples to be measured over a wider dynamic concentration range than would normally be possible at side scatter (90°).
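The sedimentation point can be made quantitative with the classic Stokes settling velocity, v = (ρp − ρf)gd²/18η. The sketch below uses assumed values for polystyrene-like particles in water to show how settling grows with the square of diameter, which is why micron-sized particles are the problem case.

```python
# Stokes settling velocity: v = (rho_p - rho_f) * g * d^2 / (18 * eta).
# Assumed values: polystyrene-like particles (1050 kg/m^3) in water at 25 C.
G = 9.81          # gravitational acceleration, m/s^2
ETA = 0.8872e-3   # viscosity of water at 25 C, Pa*s
RHO_P = 1050.0    # particle density, kg/m^3 (assumed)
RHO_F = 997.0     # fluid (water) density, kg/m^3

def settling_velocity(d_m):
    """Terminal settling velocity (m/s) for a sphere of diameter d_m."""
    return (RHO_P - RHO_F) * G * d_m ** 2 / (18 * ETA)

# Velocity scales with diameter squared: a 1 micron particle settles
# 100x faster than a 100 nm one.
for d_nm in (100, 1000):
    v = settling_velocity(d_nm * 1e-9)
    print(f"{d_nm:>4} nm: {v * 1e9:6.1f} nm/s")
```

For a 100 nm particle the settling is negligible over a measurement, while a micron-sized particle drifts tens of nanometres per second, comparable to its Brownian motion, which is where the capillary geometry helps.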

Contact us for more information or to book a demonstration with one of our specialists.

Additional resources

Next Generation LNP Manufacturing

Ease of scalability

Producing COVID-19 vaccines for the entire world population rapidly exposed the scale-up roadblock in microfluidic development. Discovery formulations developed using microfluidics had to be scaled up using alternative, higher-throughput formulating technology with different operating parameters.

With growing demand for mRNA-LNP-based vaccines, which have proven highly effective, there is currently a need to facilitate faster and cheaper global deployment of a high-throughput manufacturing method. A GMP-compliant approach is needed from the earliest possible stage through to manufacturing, one that is also tuneable in size to access various tissues or specific drug targets.

Advanced cross flow mixing (AXF) from Micropore Technologies allows seamless scale-up with consistent physics, mechanisms, conditions and geometry across its equipment range. Ultimately, Micropore's AXF technology can be scaled up to a device with over 10 million pores and a potential throughput of up to 20 litres per minute, all from a device that would still fit inside a briefcase. AXF provides controlled, low-shear, precision continuous-flow mixing from nano to micro formulations for when you want to avoid roadblocks on your product development journey. The same shear, same physics and same technology apply from lab bench to manufacturing scale, enabling scale-up with confidence.

Micropore technology addresses key concerns for adoption

  • Expertise, availability of skilled personnel, associated risks, and the need for a reliable supply chain. 

Micropore is a technology provider with global experience in manufacturing many different vaccine modalities, which can further ensure a cost-effective, high-quality process. Partnering with Micropore enables a stronger benchmark; its in-depth expertise and ability to leverage novel technologies also help reduce risk and shorten timelines.

  • There is a need to develop standard analytical methods for quality characterisation and reference material that lend insight into the mechanisms of stability or degradation of mRNA and LNPs containing it. 

The determinants of the stability of mRNA in LNP formulations – what portions are predicated on the mRNA payload, what portions on the lipid nanoparticles themselves, and what portions on the freeze-drying cycle used – can be related to its size and its secondary structure. Messenger RNA poses a unique manufacturing challenge because of its large size. Other RNA entities, such as siRNA and guide RNA for clustered regularly interspaced short palindromic repeats (CRISPR) technology, are typically produced using chemical synthesis, which can be performed in a relatively controlled environment. But mRNAs are larger, with complex three-dimensional structures that are not yet fully understood. The Malvern Zetasizer (DLS) enables particle size and stability measurements, while the RedShiftBio Aurora (MMS) system enables secondary structure (HOS) determination.

  • The need to move to larger scale manufacturing of mRNA-LNPs without an extensive setup or excess capacity to convert.

With mRNA vaccine production requiring relatively less space than other approaches, new facilities may be more feasible and affordable. Micropore technology can enable localised production of vaccines and thus accelerate access to a much larger population. In locations with limited or no infrastructure, the Micropore approach can be the shortest route to production and can reflect the exact needs of the organisation at minimal cost.

  • The need for future vaccine manufacturing facilities to use closed, continuous processing, plus equipment connectivity and communication.

Micropore offers a minimal-cost model achieved through the AXF advanced cross flow technology platform. The flexibility of mRNA-based vaccines manufactured using this single piece of stainless steel equipment with no consumables means it requires the least capital investment. The scalability of production (from 200 μL up to 1500 L/hr) reduces facility design complexity and means more doses can be manufactured in a continuous process, which eliminates the variability of a batch-to-batch approach. As such, this vaccine modality combined with the Micropore mixing platform can be a robust, low-risk starting point for production.

  • The need to overcome current barriers to a fully digitalised manufacturing process, as regulatory authorities rely on data and parameters recorded during production for verification and approval.

Micropore cross flow technology demonstrates predictable scalability, which is especially favourable for GMP manufacturing because automated process controls – process analytical technologies (PAT) – can be introduced. Automated analysis of properties such as online particle size makes it possible to detect any deviation in size and feed that back to control the pumps, restoring the correct size. This increases confidence in production quality, meaning throughput can be increased further.

Continuous Formation & Stabilisation of LNPs Minimises Risk of mRNA Degradation

Batch mode vs continuous mode manufacturing: while impingement jet mixing (IJM) and T-mixers are currently the most widespread manufacturing methods, highly turbulent mixing combined with high pressure and shear stress can compromise LNP stability and affect the overall performance of the product. The process also suffers from high batch variability and high wastage due to holding or prolonged processing steps.

Micropore technologies advanced cross flow mixing device for manufacturing LNPs at scale.

Micropore Technologies offers continuous manufacturing: Micropore Technologies employs laminar flow mixing across a permanent stainless-steel membrane to produce reproducible, scalable LNPs. The outer dispersed phase is continuously mixed with the aqueous inner compartment to form LNPs. Size-controlled, uniform particles are generated in continuous flow at a capacity of up to 1500 L/hour, by far the fastest production rate in the LNP industry. This translates to roughly 58,000 doses of vaccine every minute, an important capability when faced with the demands of global disease emergencies.
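As a back-of-envelope check on the two figures above (both taken from this article), dividing the 1500 L/hour throughput by 58,000 doses per minute implies a filled volume of roughly 0.43 mL per dose, consistent with typical sub-millilitre vaccine fill volumes:

```python
# Sanity-check the throughput claim: 1500 L/hour vs ~58,000 doses/minute.
FLOW_L_PER_HOUR = 1500
DOSES_PER_MINUTE = 58_000

ml_per_minute = FLOW_L_PER_HOUR * 1000 / 60          # 25,000 mL/min
implied_dose_ml = ml_per_minute / DOSES_PER_MINUTE   # implied volume per dose

print(f"{ml_per_minute:.0f} mL/min -> {implied_dose_ml:.2f} mL per dose")
```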

Micropore Technologies: 2nd-generation vaccine manufacturing process


Micropore’s Membrane Technology: How it works

Micropore’s equipment is uniquely suited to the production of complex nanomedicines in a scalable manner – from small 200 µL samples through to 1 billion doses.

Micropore Technologies differs from conventional LNP mixing techniques by treating the process not as discrete, separate unit operations but as one single, whole process.



• High efficiency, scalable and reproducible.

• Reduced manufacturing time and costs.

• Customisation for diverse LNP formulations.

• Tunable particle size with narrow size distribution (PDI).

The Micropore system comprises a membrane inside a housing. Lipid mixtures added from the top are mixed with the RNA/buffer solution entering the membrane from the side to produce monodisperse particles. The system operates at very low pressure – approximately 2 bar – which makes the size of the particles created very predictable as the flow rate changes. T-mixer devices operate at much higher pressures of about 30 bar.

Micropore technologies advanced cross flow mixing device for controlled manufacturing LNPs at scale.

The graphs above show comparison control curves for AXF vs. T-mixer.

Advanced Cross Flow (AXF) mixing


Micropore’s equipment is uniquely suited to the production of complex nanomedicines in a scalable manner – from mL batches up to tonnes.

A lipid/organic phase passes through the 100,000s of membrane pores into the flow of aqueous continuous phase passing through the centre of the membrane tube. Gentle, laminar-flow-based mixing allows good preservation of sensitive materials, while precision-engineered equipment delivers tight size distributions even at scale, allowing precise targeting of a specified distribution.


Process robustness and high encapsulation efficiency

Here we present data from a formulation prepared using the AXF Pathfinder 20.

The conditions used are listed below:

  • Formulation ratio: DDAB:DSPC:cholesterol:DMS-PEG2000, 40:10:48:2 mol%
  • Lipid concentration: 3.5 mg/mL
  • N:P ratio: 1:6
  • PolyA concentration: 0.046 mg/mL
  • Flow rate ratio: 3:1
  • Total flow rate: 100 mL/min
  • Continuous phase: Tris buffer, 10 mM, pH 7.4

The Pathfinder delivered highly repeatable, monodisperse LNPs that were 70 nm in size with a narrow PDI of 0.13. The experiment was repeated in triplicate and samples were measured using the Malvern Zetasizer DLS system.

Encapsulation efficiency is about 97–98%. The AXF mini uses cross flow mixing, a gentle mixing technique that prevents RNA degradation.

Process tunability and predictability

The two graphs above demonstrate the superior capability of the AXF mini to control the size of nanoparticles produced which can be critically important, especially for GMP manufacturing.

The plot on the right shows two formulations, plotted in red and green. Differences arise from, for example, the concentration of DMG-PEG200 in the range 0.004–0.12 μmol, which reduces size from 200 nm to 30 nm, yet both exhibit the same curve shape and therefore the same control. The PDI is quite tight in both formulations, but the formulation plotted in green is considerably larger (100–160 nm) than the formulation in red (50–110 nm). From this experiment you can quickly see it is possible to produce a 70 nm particle using a total flow rate of around 100 mL/min.

The plot on the left shows an experiment where the AXF mini was used to create two different formulations using two different flow rates. At 20 mL/min the LNP size was 107 nm. When the total flow rate was increased to 200 mL/min, the LNP size reduced to 55 nm with a remarkably low PDI of 0.06.
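Using just the two operating points quoted above, a simple two-point power-law interpolation (an illustrative sketch, not Micropore's actual control curve) suggests a flow rate of roughly 85–90 mL/min for a 70 nm target, in reasonable agreement with the roughly 100 mL/min noted for the 70 nm Pathfinder formulation:

```python
import numpy as np

# The two operating points reported in the text for the AXF mini:
flow = np.array([20.0, 200.0])   # total flow rate, mL/min
size = np.array([107.0, 55.0])   # resulting LNP size, nm

def flow_for_target(target_nm):
    """Estimate the flow rate for a target size by power-law interpolation.

    A two-point log-log fit is an illustrative sketch of the flow-rate/size
    trend, not Micropore's actual control curve.
    """
    slope, intercept = np.polyfit(np.log(flow), np.log(size), 1)
    return float(np.exp((np.log(target_nm) - intercept) / slope))

print(f"~{flow_for_target(70):.0f} mL/min for a 70 nm target")
```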

This data demonstrates the predictable scalability that is especially favourable for GMP manufacturing and regulators, meaning automated process controls – process analytical technologies (PAT) – can be introduced. Automated analysis of properties such as online particle size makes it possible to detect any deviation in size and feed that back to control the pumps, restoring the correct size. This increases confidence in production quality, meaning throughput can be increased further. ATA Scientific offers several technologies to characterise LNPs.

Contact us for a demo today!

A Fundamental History of RNA

Whilst the planet soon learnt these few letters, mRNA, thanks to COVID and the lifesaving vaccine approach it afforded, RNA is arguably a molecular fossil – a piece of evolution that likely existed before modern cells.

The RNA world hypothesis suggests that life on Earth began with a simple RNA molecule that could copy itself without help from other molecules. How could we hypothesise this? Think about hereditary expression and how it is conferred. If this were indeed the case, then RNA would have to both store genetic information and catalyse the reactions of the cell. Curiously, RNA persists in catalysing many of the fundamental processes in cells today. How did DNA rise to be the genetic material, how did the code arise, and when did proteins become the main catalysts? If we have such a small handle on the history, then the body of present knowledge clearly lacks depth; a world of wonder is yet to be discovered. Insight can be found in “The RNA World and the Origins of Life”1.

Understanding RNA

All living cells contain RNA (ribonucleic acid) in its many forms; whilst there are structural similarities to DNA, there are clear differences. Most RNA is single-stranded, with a backbone of alternating phosphate groups and a sugar called ribose. Each of these sugars has one of four bases attached: adenine (A), uracil (U), cytosine (C) or guanine (G). There are many forms of RNA, such as messenger RNA (mRNA), transfer RNA (tRNA), ribosomal RNA (rRNA), small interfering RNA (siRNA) and small RNA (sRNA), and the list keeps expanding. RNA is ubiquitous. It is involved in gene expression, and some viruses use it as their genomic material. RNA is a fundamental building block of life.

We are familiar with the use of mRNA for its role in our immune responses to pathogens such as COVID-19. Its success is largely the culmination of an enormous body of knowledge: hundreds of scientists had worked on mRNA vaccines for decades before the coronavirus pandemic brought a breakthrough. Many Australian scientists, including those at the Australian National University (ANU), made important contributions towards understanding the role of RNA. In the 1970s the Shine-Dalgarno sequence was discovered, which tells bacteria where to start protein synthesis so that the genetic code in mRNA is read correctly. This insight has enabled scientists to use bacteria as biofactories to make a host of different proteins that are now in use as drugs, such as antibiotics, vaccines and cancer therapies, or that form part of important biotechnology processes to develop yeasts, pesticides, enzymes, fuels and solvents.

Traditionally, vaccines can take around 10 years to develop and consist of entire pathogens that have been killed or weakened so that they cannot cause disease. The recent COVID mRNA vaccines were developed in under a year and work by defining the genetic code of the target – easy now but before 1990 and the start of the Human Genome Project, this would have been particularly arduous. mRNA delivers the instructions your body needs to recognise the virus and fight it off. Cells then break down the mRNA and get rid of it. This gives cells the opportunity to change the type and number of proteins made based on demand which is key to allowing living things to grow, change, and react to their environment.

RNA is generally synthesised from DNA (often bacterial) by the enzyme RNA polymerase through a method called transcription (think trans-scribe, i.e. to write), where not a copy but the complementary RNA sequence to the DNA template is produced. Protein production is handled by the ribosome; the proteins are released, the body sees them as foreign and mounts an antibody response – voila, a vaccine. There is no need to grow batches of cells in bioreactors or infect millions of eggs. The RNAs involved in this procedure are mRNA, tRNA and rRNA. With self-amplifying mRNA (samRNA), not only are we attempting to defeat a virus, we also hijack part of another virus's genetic machinery to aid in it. An alphavirus is manipulated by replacing its viral structural proteins with the gene of interest. The genes encoding the alphavirus RNA replication machinery are retained; these translate into an RNA-dependent RNA polymerase responsible for creating many copies of the sub-genomic RNA, resulting in the translation of multiple antigens and thus reducing the initial dose requirements.

Plunge yourself back into the dark days of 2020. Close your eyes and transport yourself to New York: picture the refrigerated trucks lined up with scores of dead as the morgues overflowed, and the palpable fear that anyone could be carrying your death sentence. Then, as Sandra Lindsay (director of nursing critical care at Long Island Jewish Medical Center) explained, “My whole life just changed tremendously in that one moment in time,” adding, “What was going through my mind is, I cannot wait for this needle to pierce my arm”2. Sandra was the first person in the U.S. to get a COVID vaccine outside a clinical trial. Notably, COVID vaccines prevented more than 3.2 million deaths and 18.5 million hospitalisations in the U.S. from December 2020 through November 2022². In this context the mRNA vaccines were tremendous; developed and created in record time, they are likely the simplest, safest vaccines ever produced. Such speed can only be appreciated in context: Moderna's COVID vaccine development was initiated after the SARS-CoV-2 genome was posted on January 10, 2020; manufacture and delivery of clinical trial material was completed within 45 days; and the first trial participants were vaccinated on March 16, 2020, just 66 days after the genomic sequence of the virus was posted. Arguably a ‘vaccine’ material could have been produced in about a week; in contrast, a cell-based or cultured vaccine would take months to produce, and scale-up to pandemic levels is another massive undertaking.

Bivalent vaccines that protect against two strains are now commonplace and have proven to offer greater effectiveness. The possibility of loading an array of vaccines into one shot was postulated early on: ponder having an annual shot for COVID, flu and whatever else is lurking out there to mess with our day. RNA certainly lends itself to this, though vaccines are just one modality of use for RNA.

Things become less clear when defining the role of long non-coding RNAs (lncRNAs). Largely thought of as ‘Junk’ RNA in the past, the last decade has seen evidence mount for the lncRNAs3 having key roles in gene regulation and studies are noting a divergence of biogenesis of lncRNA compared to mRNA. Localisation of the lncRNA and their interactions with proteins, DNA and RNA appears to give insight into their roles be it interfering with signalling pathways, chromatin modulation, affecting stability and translation of mRNA in the cytoplasm along with the regulation and function of membraneless nuclear bodies. Such processes have a knock-on effect in gene expression impacting a varied array of physiopathological and biological conditions including cancer, immune responses, and neuronal disorders. Their localisation and condition specific patterns are gaining interest as biomarkers for disease states. So much for Junk!

RNA Therapeutics

Before we dive into what therapeutics are possible with RNA, it is best to understand how this can be achieved. Consider the process of DNA forming RNA, then forming proteins: there are three distinct ways to use this process to prevent disease. 1) Gene knockout: completely remove the DNA; 2) prevent or alter the transcription of RNA from DNA; 3) prevent or alter the translation of the protein.

Transcriptional Silencing is not as simple as once assumed. The most highly studied phenomenon in epigenetic modifications by far is DNA methylation, which typically refers to covalently attaching a methyl group (CH3) to the 5th position of the cytosine nucleotide by means of a group of specific enzymes called DNA methyltransferases (DNMTs) using S-adenosyl-L-methionine (SAM) as substrate4.
Resolving genetic defects at this point can achieve a “global” or complete rectification of the disorder; importantly, other processes may be at play, permitting alternative modes of action such as the terrific ‘R-loop’ research in Fragile X syndrome5.

One of the most important advances in biology has been the discovery that siRNA (small interfering RNA) is able to regulate the expression of genes, through a phenomenon known as RNAi (RNA interference). siRNA is a double-stranded non-coding RNA, typically 20–27 base pairs in length, whose primary and secondary modes of action are the inhibition of translation and mRNA cleavage.

The major difference between siRNAs and microRNAs (miRNAs) is that the former are highly specific, with only one mRNA target, whereas the latter have multiple targets. miRNA controls gene expression mainly by binding with mRNA in the cell cytoplasm. Instead of being translated quickly into a protein, the marked mRNA will be destroyed and recycled, or it will be preserved and translated later. If an miRNA is under-expressed, the level of the protein it normally regulates may be high, and vice versa6.

miRNA is often used in cancer diagnosis, cancer prognosis and drug discovery research, given its use in determining the function of a protein or gene in a cell. miRNA-based therapeutics can be categorised into two types: miRNA mimics and miRNA inhibitors. The former are double-stranded RNA molecules that mimic miRNAs, while the latter are single-stranded RNA oligos designed to interfere with miRNAs. For example, there are clinical trials of miRNA mimics to treat blood cancers and fibrosis, and of a tumour-suppressor miRNA for solid tumours, just to show the breadth of application. For more about RNA therapeutics see Fig. 1 from “RNA-based therapeutics: an overview and prospectus”8.

The term “undruggable” may one day be a distant relic in an era of genetic medicines. The growing understanding of RNA functions and their crucial roles in disease lends weight to broadening the range of therapeutic targets. Research is only scratching the surface of what is possible, and if the body of research into non-coding RNA is any indicator, this scratch will likely resolve any itch; Slack and Chinnaiyan's review in Cell7 removes all doubt. RNA, due to its distinct physiological and physiochemical properties, can theoretically target any gene of interest provided the correct nucleotide sequence is selected. The enormity of the genome thus holds great prospects for such therapeutics, diagnostics and silencers. Pitch this against the mere 0.05% of the genome drugged by currently approved protein-targeted small-molecule chemicals and antibodies. Besides, around 85% of proteins lack specific clefts and pockets for small-molecule binding8.

Despite the infancy of this technology, the runs on the board are encouraging. Fomivirsen was the first FDA-approved antisense oligonucleotide (ASO) drug, for treating cytomegalovirus (CMV) retinitis in patients with AIDS. Patisiran was the first FDA-approved RNAi-based drug, for treating familial amyloid polyneuropathy, also called transthyretin-related hereditary amyloidosis. Sarepta Therapeutics’ Eteplirsen for Duchenne Muscular Dystrophy (DMD) has completed clinical trials. Notably, Nusinersen, an ASO targeting the CNS (central nervous system), was approved in 2016 for spinal muscular atrophy, bringing new hope for a previously undruggable disease; it is classed as an orphan drug given the rarity of the condition. Translate Bio’s MRT5005, an inhalable mRNA treatment for cystic fibrosis, also showed promise. Unfortunately, there was no pattern of increase in ppFEV1 (percent predicted forced expiratory volume in 1 second), a measure of lung function. Even so, a great deal can be gleaned from this trial: multiple mRNA doses were well tolerated, the mRNA appeared to be absorbed into the blood after inhalation, and evaluation of immunogenicity markers showed no clear pattern of anti-CFTR antibodies, anti-PEG antibodies or T-cell sensitisation to CFTR.

The mRNA was delivered in a lipid nanoparticle (LNP) and, promisingly, no patients had detectable levels of lipid in the blood10. Sanofi acquired Translate Bio in August 2021, after the announcement of the clinical trial results. Today, there are already 18 clinically approved RNA-based therapeutics, including the two vaccines that made mRNA a household word during the COVID-19 pandemic11.

Initially, RNA-based therapeutics were embraced with clear rationales for diseases in oncology, neurology, and infection. Research has since advanced to the point where around eight ASOs, three siRNAs and two mRNA drugs are FDA approved. Ito et al9 conducted a systematic review of clinical trials with ncRNAs; to give a sense of the depth of research, a PubMed search returned a staggering 757,348 published articles, 321,672 of them since January 2017. Among the 108 trials that survived their exclusion criteria there is a clear trend: 95% used ncRNAs as observational tools, and only a few were interventional. I expect the number of interventional ncRNA therapeutics in clinical trials to explode in the near future, fuelled largely by small biotechs and academic spin-outs.

RNA Diagnostics

The development of the Nobel Prize-winning Polymerase Chain Reaction (PCR) technique is an exemplar of how previous discoveries pave the path for something special. By 1980 all the components required to perform PCR amplification were known; it was only a matter of time before Kary Mullis put the pieces together to create the thermo-cycled PCR amplification we know today.
Reverse transcription PCR (RT-PCR), quantitative real-time PCR (qPCR), reverse transcription quantitative real-time PCR (RT-qPCR), and RNA sequencing (RNA-seq) today form the basis of RNA diagnostics. While the applications of RNA described above focus on the production of a protein or a mechanism for gene manipulation, it is clear that RNA is dynamic, with diverse and essential roles throughout the entire genome. The broad distribution and utility of the biomolecule have made it a focus for diagnostic, prognostic, and biomarker applications; however, the translation to clinical diagnostics has unearthed significant challenges, particularly in the realm of liquid biopsies.

Historically, the detection of specific mutations in cell-free DNA (cfDNA) has been the main thrust of research. A few cfDNA tests have been approved by the FDA for diagnosis, and since 2020 only one has been based on next-generation sequencing. There are currently no approved cell-free RNA (cfRNA) clinical diagnostic tests.

What is promising is the pot of gold that awaits the inventor of a technology that becomes the gold standard for diagnostics: the liquid biopsy market is expected to eclipse 5.8 billion dollars by 202612. Notwithstanding the lack of standardised methods to collect, prepare, screen and analyse a sample, plus the contamination issues posed by cellular RNA and DNA, the technology problem may be dwarfed by the enormity of the project. Consider the conundrum: because the point of cfRNA targeting is early detection – before a tumour has developed and cfDNA is floating about – a population of otherwise asymptomatic individuals must be screened to gather the required data. Historical retention samples are currently of little use, as the mode of preservation has been shown to contaminate results. Without standardised methods that respect the inherent biases, and cohorts large enough to assess, the requirements are likely beyond the grasp of even the most collaborative research projects. Data interpretation will also be challenging and arduous for medical staff inexpert in bioinformatics and statistical analysis. miRNA may yet be surpassed by long RNAs, given some recent promising results with newly identified disease-associated RNAs; although at a very early stage, these may be the next revolution in screening and diagnosis.


CRISPR

Better known to the layperson as genetic scissors, the discovery13 of this remarkable tool is a triumph of collaborative research, conference meetings, and the quest to decipher the unknown. Bacteria have a mechanism not only for fighting off viruses but also for remembering infections for the future – and this memory is passed down through generations, not confined to the individual bacterium. It is an adaptive immune system that detects viral DNA and destroys it, using CRISPR (Clustered Regularly Interspaced Short Palindromic Repeats). What is interesting is that it is programmable. In the original study, Doudna and Charpentier13 combined purified Cas9 protein with a crRNA (CRISPR RNA) strand to see if they could duplicate the phenomenon by finding and cutting a unique DNA sequence – this failed. Given the abundance of another RNA in bacteria and its vicinity to the CRISPR protein, the collaborators postulated it might be needed. When they included tracrRNA, the DNA was cleaved. The next step was to purify the tracrRNA, fuse it with the crRNA, and design experiments in which different genetic codes were programmed in to cut DNA at specific sites. This was hugely successful, earning a Nobel Prize.
An explosion of research to utilise this ensued, while the lay media seized on the unintended use of genetically engineering babies to be smarter, faster and stronger, totally misinterpreting the intent of the research and its potential to treat genetic diseases.

Putting the ethics aside, the use of RNA here is pivotal. Using a guide RNA, CRISPR has transformed the world of genomics; it is now possible to target a disease such as Duchenne Muscular Dystrophy with the aim of resolving the genetic defect. CRISPR-Cas9 can be used to disrupt a sequence, inactivating a gene; alternatively, larger sections of DNA can be cleaved either side of the desired deletion, followed by a cellular repair process that joins the strands and thereby deletes the gene.

Another approach is to correct a gene by homology-directed repair: the DNA is cleaved as for deletions, but the cell uses a supplied DNA template to repair the break, thereby replacing the faulty DNA sequence or even inserting a new gene14.
‘In short, it’s only slightly hyperbolic to say that if scientists can dream of a genetic manipulation, CRISPR can now make it happen. At one point during the human gene-editing summit, Charpentier described its capabilities as “mind-blowing.” It’s the simple truth. For better or worse, we all now live in CRISPR’s world’15

Evolution of Nanomedicine Production Technology 

As RNA science continues to make headlines around the world as the new way of making safer, more targeted medicines, it has sparked a wave of new studies. With this comes the need for improved methods that deliver faster, more efficient and scalable drug manufacturing. Currently, mRNA-LNP drugs are produced using methods such as lipid hydration, extrusion or impingement jet mixing (IJM), but these suffer from turbulent mixing and multiple harsh processing steps, which compromise stability and give high batch-to-batch variability. Microfluidic mixing offers rapid formulation with low polydispersity but likewise cannot accommodate high-volume production.

Micropore Pathfinder offers seamless scalability from initial R&D (0.2 mL) to pandemic-scale GMP manufacturing (1,500 L/h). This translates to 58,000 doses of vaccine every minute from a device small enough to fit in the palm of the hand. Collaborations with the University of Strathclyde and Professor Yvonne Perrie demonstrated mRNA encapsulation efficiencies of over 97% in LNP production using Micropore’s AXF advanced crossflow mixing, showing that the Micropore Pathfinder can support efficient mass production of a new generation of RNA-based therapeutics. The series is designed to be easy to operate, highly reproducible and stable in operation.

Book a demonstration with us today and try it yourself!


  1. ‘The RNA World and the Origins of Life’ Bruce Alberts 2002. Site accessed 15 May 2023.
  2. ‘Two years after Covid vaccines rolled out, researchers are calling for newer, better options’ Aria Bendix, NBC News 14 Dec 2022. Site accessed 23 May 2023.
  3. ‘Gene regulation by long non-coding RNAs and its biological functions’ Luisa Statello et al 2020. Site accessed 17 May 2023.
  4. ‘DNA Methylation Signature of Aging: Potential Impact on the Pathogenesis of Parkinson’s Disease’ Yazar V, Dawson VL, Dawson TM, Kang SU. 2023. Site accessed 22 May 2023.
  5. ‘Site-specific R-loops induce CGG repeat contraction and fragile X gene reactivation’ Hun-Goo Lee et al, Cell 2023. Site accessed 23 May 2023.
  6. ‘The Limitless Future of RNA Therapeutics’ Damase et al 2021. Site accessed 24 May 2023.
  7. ‘The Role of Non-coding RNAs in Oncology’ Frank J. Slack and Arul M. Chinnaiyan, Cell 2019. Site accessed 25 May 2023.
  8. ‘RNA-based therapeutics: an overview and prospectus’ Zhu et al, Cell Death and Disease 2022. Site accessed 25 May 2023.
  9. ‘Current clinical trials with non-coding RNA-based therapeutics in malignant diseases: A systematic review’ Ito et al, Translational Oncology 2023. Site accessed 25 May 2023.
  10. ‘Translate Bio Announces Results from Second Interim Data Analysis from Ongoing Phase 1/2 Clinical Trial of MRT5005 in Patients with Cystic Fibrosis (CF)’ Translate Bio, Inc. Site accessed 25 May 2023.
  11. ‘RNA therapeutics’ Michelle L. Hastings & Adrian R. Krainer, RNA (2023) Vol. 29, No. 4.
  12. ‘Current challenges and best practices for cell-free long RNA biomarker discovery’ Cabús, L., Lagarde, J., Curado, J. et al. Biomark Res 10, 62 (2022). Site accessed 30 May 2023.
  13. ‘Discovery Story: Genome Engineering with CRISPR-Cas9’ (Doudna, Jinek, Charpentier) May 2017. Site accessed 31 May 2023.
  14. ‘CRISPR/Cas9 – a specific, efficient and versatile gene-editing technology we can harness to modify, delete or correct precise regions of our DNA’ CRISPR Therapeutics 2023. Site accessed 31 May 2023.

  15. ‘And Science’s 2015 Breakthrough of the Year is…’ John Travis 2015. Site accessed 31 May 2023.

Brownian Motion

What is Brownian Motion and Why Is It Important?

In 1827, Scottish botanist Robert Brown looked through a microscope at pollen grains suspended in water and discovered that the tiny particles moved in a random fashion – they did not slow or stop, but were in constant motion. This phenomenon, which we now call Brownian motion, is not unique to pollen but is commonly observable in daily life. Nor is it specific to biology; it has been described mathematically and arises from physics. Dust particles dancing in a ray of light in a dark room, the diffusion of pollutants or smoke in the air, and the diffusion of calcium in bones are all examples of Brownian motion.

Brownian motion is the random movement of particles due to bombardment by the molecules that surround them. Understanding Brownian motion is important because it provided evidence that atoms exist. Einstein’s mathematical model of Brownian motion from 1905 is one of his less well-known but most important contributions to physics. It described how tiny visible particles suspended in a liquid are bombarded by the invisible water molecules around them, causing them to jiggle. The model explained this motion in detail and accurately predicted the irregular random motion of the particles, which could be directly observed under a microscope. Einstein’s theory of Brownian motion thus offered a way to prove that molecules exist, even though they are far too small to be seen directly. Soon after, the French physicist Jean-Baptiste Perrin conducted a series of experiments that confirmed Einstein’s predictions. The theory also clarified how particle size is related to speed of movement.

What causes Brownian motion?

While Brownian motion of small particles is easily observed using a light microscope and has been studied for nearly 200 years, the mechanism that drives it was long debated. We now know that Brownian motion is caused by the structure and physics of fluids, i.e., liquids and gases. According to the kinetic theory proposed by J.C. Maxwell, L. Boltzmann and R.J.E. Clausius, all matter is in motion; the atoms and molecules of liquids and gases in particular are in constant vibrating motion. These particles travel in straight lines until redirected by a collision, so the molecules of a fluid are constantly moving, colliding, and tending toward equilibrium.

Four main factors affect Brownian motion: temperature, particle number, particle size, and viscosity. The larger the particle or molecule and the more viscous the dispersion medium, the slower the Brownian motion. Smaller particles are “kicked” further by the solvent molecules and move more rapidly. In addition, a higher temperature and a larger number of particles both increase the rate of motion.
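These dependencies follow directly from the Stokes-Einstein relation discussed later in this article. The short Python sketch below is illustrative only – the function names are my own, the values assume water at 25 °C with viscosity about 0.89 mPa·s, and the random walk is a toy model rather than instrument software:

```python
import math
import random

K_B = 1.380649e-23  # Boltzmann constant, J/K

def diffusion_coefficient(d_m, T=298.15, eta=8.9e-4):
    """Translational diffusion coefficient D = kT / (3*pi*eta*d), in m^2/s."""
    return K_B * T / (3 * math.pi * eta * d_m)

def brownian_track_1d(d_m, n_steps=1000, dt=1e-3):
    """Toy 1D random walk: each step is Gaussian with variance 2*D*dt."""
    D = diffusion_coefficient(d_m)
    x, track = 0.0, [0.0]
    for _ in range(n_steps):
        x += random.gauss(0.0, math.sqrt(2 * D * dt))
        track.append(x)
    return track

# A 10x smaller particle diffuses 10x faster (inverse proportionality):
print(diffusion_coefficient(10e-9) / diffusion_coefficient(100e-9))  # ~10
```

Note that temperature enters twice in a real measurement: directly through kT and indirectly through the strong temperature dependence of viscosity, which is one reason DLS instruments control temperature so tightly.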

How do you measure Brownian motion?

Because a particle’s speed of movement under Brownian motion can be correlated with its size, various analytical measurement techniques have been developed to exploit this relationship.

Dynamic Light Scattering (DLS) – Malvern Zetasizer

Dynamic light scattering measures Brownian motion and relates it to the size of the particles. DLS, sometimes referred to as Photon Correlation Spectroscopy or Quasi-Elastic Light Scattering (QELS), is a non-invasive, well-established technique for measuring the size and size distribution of molecules and particles dispersed in a liquid, typically in the submicron region and extending below 1 nm with the latest technology pioneered by the manufacturer Malvern Panalytical. Typical applications of dynamic light scattering are the characterisation of particles, emulsions or molecules which have been dispersed or dissolved in a liquid. Common samples analysed by DLS include colloidal silica, titanium dioxide, ceramics, carbon dots, lipid nanoparticles, proteins and adeno-associated virus (AAV). The sensitivity of modern systems is such that they can also measure the size and concentration of macromolecules in solution with little dilution, using sample volumes as small as 3 µL.

The Malvern Zetasizer, which utilises sophisticated DLS technology, works by determining the rate at which the intensity of the scattered light fluctuates at the detector. Briefly, a cuvette containing particles in suspension (moving under Brownian motion) is illuminated by a laser, and the light is scattered at fluctuating intensities. Small particles cause the intensity to fluctuate more rapidly than large ones. Analysis of the intensity fluctuations yields the velocity of the Brownian motion and hence the particle size, reported as the hydrodynamic diameter.
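In practice, the fluctuation analysis is done through the intensity autocorrelation function: for monodisperse spheres it decays as a single exponential whose rate is set by the diffusion coefficient and the scattering vector. The sketch below is a simplified model, not Malvern software; the 633 nm wavelength, water refractive index and 173° backscatter angle are my assumptions for illustration:

```python
import math

def scattering_vector(n=1.33, wavelength=633e-9, theta_deg=173):
    """Magnitude of the scattering vector q = (4*pi*n/lambda) * sin(theta/2)."""
    return 4 * math.pi * n / wavelength * math.sin(math.radians(theta_deg) / 2)

def g2(tau, D, q, beta=0.9):
    """Model intensity autocorrelation for monodisperse spheres:
    g2(tau) = 1 + beta * exp(-2 * D * q**2 * tau).
    Faster decay (larger D) corresponds to smaller particles."""
    return 1 + beta * math.exp(-2 * D * q ** 2 * tau)

q = scattering_vector()  # backscatter detection geometry
D = 4.9e-12              # m^2/s, roughly a 100 nm sphere in water at 25 C
print(g2(0.0, D, q))     # 1 + beta at zero delay
print(g2(1.0, D, q))     # decays towards 1 at long delay times
```

Fitting the measured correlogram to a model of this form yields D, from which the hydrodynamic diameter follows via the Stokes-Einstein equation.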

The speed of movement of the particles, or the velocity of the Brownian motion, is defined by a property known as the translational diffusion coefficient. The diameter obtained by DLS indicates how a particle diffuses within a fluid: it is the diameter of a sphere that has the same translational diffusion coefficient as the particle.

The size of a particle is calculated from the translational diffusion coefficient by using the Stokes-Einstein equation:

d(H) = kT / (3πηD)

where:

d(H) = hydrodynamic diameter

D = translational diffusion coefficient

k = Boltzmann’s constant

T = absolute temperature

η = viscosity
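As a worked example, the equation can be evaluated directly. The snippet below assumes water at 25 °C (viscosity ≈ 0.89 mPa·s); the function name is illustrative, not part of any instrument software:

```python
import math

K_B = 1.380649e-23  # Boltzmann's constant, J/K

def hydrodynamic_diameter(D, T=298.15, eta=8.9e-4):
    """Stokes-Einstein: d(H) = kT / (3*pi*eta*D).
    D in m^2/s, T in kelvin, eta in Pa*s; returns diameter in metres."""
    return K_B * T / (3 * math.pi * eta * D)

# A measured diffusion coefficient of 4.9e-12 m^2/s corresponds to ~100 nm:
print(f"{hydrodynamic_diameter(4.9e-12) * 1e9:.0f} nm")
```

The inverse relationship is clear from the formula: doubling the diffusion coefficient halves the reported hydrodynamic diameter.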

Factors that can affect the velocity of Brownian motion

A number of factors can affect the accuracy and precision of DLS measurements, including temperature stability and accuracy. An accurately known temperature is necessary for DLS because the analysis requires knowledge of the viscosity, and the viscosity of a liquid depends on its temperature. The temperature also needs to be stable; otherwise convection currents in the sample will cause non-random movements that compromise the correct interpretation of size.

The measured translational diffusion coefficient depends not only on the size of the particle “core”, but also on any surface structure that affects the diffusion speed, as well as on the concentration and type of ions in the medium. The ions in the medium and the total ionic concentration can affect the particle diffusion speed by changing the thickness of the electric double layer, known as the Debye length. A low-conductivity medium produces an extended double layer of ions around the particle, reducing the diffusion speed and resulting in a larger apparent hydrodynamic diameter. Conversely, a higher-conductivity medium suppresses the electrical double layer, reducing the measured hydrodynamic diameter. Any change to the particle surface that affects the diffusion speed will correspondingly change the apparent size. For example, an adsorbed polymer layer projecting out into the medium will reduce the diffusion speed more than a polymer lying flat on the surface. The nature of the surface and the polymer, as well as the ionic concentration of the medium, can affect the polymer conformation, which in turn can change the apparent size by several nanometres.
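The effect of ionic strength on the double layer can be put in numbers with a common rule of thumb for a 1:1 electrolyte in water at 25 °C, where the Debye length in nanometres is approximately 0.304/√I (I in mol/L). A quick sketch, with the caveat that this approximation only holds for those specific conditions:

```python
import math

def debye_length_nm(ionic_strength_molar):
    """Approximate Debye length for a 1:1 electrolyte in water at 25 C:
    lambda_D (nm) ~ 0.304 / sqrt(I), with I in mol/L."""
    return 0.304 / math.sqrt(ionic_strength_molar)

print(round(debye_length_nm(0.001), 1))  # ~9.6 nm in 1 mM salt
print(round(debye_length_nm(0.1), 2))    # ~0.96 nm in 100 mM salt
```

This is one reason size measurements are often performed in a dilute salt solution rather than in pure water: at very low ionic strength the extended double layer inflates the apparent hydrodynamic diameter.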

DLS is not applicable when particle motion is not random. The maximum particle size that can be measured reliably by DLS is therefore sample dependent and is normally defined by the onset of particle sedimentation. All particles sediment, at a rate that depends on the particle size and the relative densities of the particles and suspending medium. For successful DLS measurements, the rate of sedimentation should be much slower than the rate of diffusion; this is a particular concern for large particles, whose slow diffusion requires long measurement times. The presence of sedimentation can be detected on the Malvern Zetasizer by checking the stability of the count rate over repeat measurements of the same sample: count rates that decrease with successive measurements indicate sedimentation, and the Expert Advice system will highlight this to the user.
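The count-rate stability check described above can be sketched in a few lines. This is a hypothetical helper for illustration, not part of the Zetasizer software; the 2% tolerance and the example count rates are my own assumptions:

```python
def sedimentation_suspected(count_rates_kcps, tolerance=0.02):
    """Return True if every repeat measurement's mean count rate falls by
    more than `tolerance` (fractional) relative to the previous repeat,
    suggesting particles are settling out of the scattering volume."""
    pairs = list(zip(count_rates_kcps, count_rates_kcps[1:]))
    return bool(pairs) and all(b < a * (1 - tolerance) for a, b in pairs)

print(sedimentation_suspected([310.2, 296.5, 281.0, 264.3]))  # True: steady fall
print(sedimentation_suspected([305.1, 307.8, 304.9, 306.2]))  # False: stable
```

A stable count rate across repeats is also a useful general health check before trusting a size result.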

Nanoparticle Tracking Analysis (NTA) – New Malvern NanoSight Pro

The advent of advanced computer technology with video analysis has allowed scientists to make automated measurements with visual validation to understand the dynamics of the motion and more accurately quantify the particles in a suspension. 

Nanoparticle tracking analysis (NTA) utilises the properties of both light scattering and Brownian motion to obtain the particle size distribution of samples in liquid suspension. Simply put, a laser beam is passed through the sample chamber, and particles suspended in the path of the beam scatter light such that they can easily be visualised with a microscope onto which a camera is mounted. The camera captures video of the particles moving under Brownian motion within the field of view. Intuitive software simultaneously identifies and tracks the centre of each observed particle and determines its average speed of movement. From this the particle diffusion coefficient is determined and, if the sample temperature and solvent viscosity are known, the sphere-equivalent hydrodynamic diameter of each particle can be calculated using the Stokes-Einstein equation.
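The chain from tracked positions to size can be sketched as follows. This is a deliberately simplified illustration assuming 2D tracks and water at 25 °C – real NTA software applies corrections for track length, drift and camera blur, and the function name here is my own:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def size_from_track(positions, dt, T=298.15, eta=8.9e-4):
    """Estimate sphere-equivalent hydrodynamic diameter from one 2D track.
    For 2D Brownian motion the mean squared displacement per frame is 4*D*dt;
    the diameter then follows from the Stokes-Einstein equation."""
    steps = [(x2 - x1) ** 2 + (y2 - y1) ** 2
             for (x1, y1), (x2, y2) in zip(positions, positions[1:])]
    msd = sum(steps) / len(steps)  # mean squared displacement per frame, m^2
    D = msd / (4 * dt)             # translational diffusion coefficient, m^2/s
    return K_B * T / (3 * math.pi * eta * D)

# Synthetic track whose per-frame displacement matches D = 4.9e-12 m^2/s
# at 30 frames per second; recovers a diameter of roughly 100 nm:
dt = 1 / 30
step = math.sqrt(4 * 4.9e-12 * dt)
track = [(i * step, 0.0) for i in range(50)]
print(f"{size_from_track(track, dt) * 1e9:.0f} nm")
```

Because each particle is sized individually, averaging over many such tracks is what gives NTA its number-weighted, peak-resolved distribution.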

Both NanoSight NTA and Zetasizer DLS measure the diffusion coefficient and derive size from it. DLS provides excellent population statistics: an average size (by intensity) and an average size distribution or polydispersity index. NTA, on the other hand, provides single-particle tracking for a highly peak-resolved number-weighted distribution, combined with concentration determination, while a fluorescence mode allows differentiation of suitably labelled particles. In biomedical research, a fluorescently tagged drug molecule makes it possible to determine how many drug delivery nanoparticles have successfully been loaded with drug molecules. Integrating both DLS and NTA systems lets you take advantage of the complementary information the two techniques provide.

The new NanoSight Pro nanoparticle tracking analysis (NTA) system from Malvern Panalytical integrates advanced engineering with machine learning to provide the most detailed NTA solution for the characterisation of bio- and nanomaterials. Smart tools built into the software automate workflows and help remove subjectivity, generating highly accurate and reproducible size and concentration data. An upgraded temperature controller allows stress and aggregation studies to be performed at up to 70°C. Advances in fluorescence measurement provide powerful insights into sample specificity, opening new possibilities in diagnostics, biomarker analysis and therapy applications. Previous limitations with small biological particles and other low scatterers are overcome by NanoSight Pro, which is optimised for samples including exosomes, viruses, vaccines, and drug delivery systems.

Ensure High Quality in Analytical Characterisation with ATA Scientific

NanoSight is already trusted by scientists around the world for its superior data quality and ease of use, with thousands of publications referring to NanoSight NTA data. As the world continues on its journey to develop better products that improve our daily lives, especially in research focused on drug delivery, viruses and vaccines, high-quality analytical characterisation is more important than ever. Contact us for a free demonstration to discover how we can help you achieve more.

A Simple Guide For Preparing Samples For SEM Imaging

Scanning electron microscopes (SEMs) are versatile instruments that can do much more than you might expect. An SEM can provide key information such as structure, morphology and elemental composition about the surface or near-surface region of a sample. For this reason, it has become the tool of choice in fields ranging from materials science to forensics, battery development, additive manufacturing and more. Desktop SEMs have now made on-site SEM imaging and analysis faster and easier than ever.

Good sample preparation is a critical step when a high-quality SEM image is needed. Some samples can be quite challenging to image, particularly if they are non-conductive. This guide provides a few helpful tips and tricks for preparing samples for imaging. Aimed at those who are approaching scanning electron microscopy for the first time, or are relatively new to it, it will help you obtain good results and the most detailed information from your samples. The content is valid for small to larger samples of various compositions. For more detailed information on specific kinds of samples, please contact us.

Basic sample preparation

Every SEM is equipped with a sample holder or a loading chamber where the sample can be inserted.

To load a sample into an SEM, the use of aluminium stubs is recommended. These come in different standard sizes and are readily available commercially.

Sample adhesion to the surface of the stub is crucial before placing it in the sample holder or stage. This prevents pieces of the sample being dislodged under vacuum and contaminating the SEM column, which can affect the final image quality and may even damage the SEM imaging system, which can be expensive to repair.

TIP 1: Stick the sample securely to the pin stub by using:

  • Double-sided carbon sticker
  • Conductive paint
  • Conductive tape
  • Special clamps
  • A combination of the above.

TIP 2: Remove all loose particles from your sample after adhering it to the pin stub by:

  • Holding the aluminium stub with tweezers, tilting it by 90° and gently tapping its side.
  • Spraying dry air over the sample.

TIP 3: Use tweezers when handling the pin stub

  • This should be done in order to prevent contamination.

TIP 4: Make sure that the mounting procedure is solid

  • This is so that you do not introduce mechanical vibrations due to incorrect mounting.

TIP 5: DO NOT spray dry air in the direction of any electronics

  • Or into the scanning electron microscope, because the propellant might be flammable.

TIP 6: Make sure there is no condensed liquid in your spray air straw 

  • You can do this by first spraying away from your sample.

These precautions will help reduce the risk of contaminating your system and sample holder and guarantee better performance over time. Below we discuss best-practice sample preparation techniques for five common sample types: non-conductive samples, magnetic samples, beam-sensitive samples, powders and particles, and moist or outgassing samples.

Non-Conductive samples

When a non-conductive material like a biological sample is imaged, the electrons fired onto the sample surface don’t have a path to the ground potential, causing them to accumulate on the surface. The image will become increasingly bright or entirely white until details are no longer visible. Mild movement can also be detected, caused by the mutual interaction of the electrons. This will cause blurriness in the collected image. 

Several solutions are widely used:

  • Conductive tapes or paints

By covering part of the sample with a piece of conductive tape (e.g. copper tape) or some conductive paint, a bridge to the surface of the aluminium stub is created. This allows the sample to partially discharge and is enough to image mildly non-conductive samples when imaging areas close to the tape edge. (Figures: SEM image of a sugar cube charging; SEM image of the same sample in low vacuum.)

  • Low vacuum

Introducing an atmosphere into the sample chamber allows the beam to interact with air molecules. Positive ions are generated and attracted by the large number of electrons on the sample surface. The ions further interact with the electrons, discharging the sample. While this technique adds some noise to the final image, you can analyse the sample faster and at lower cost without further processing.

The charge reduction sample holder is designed to eliminate additional preparation of non-conductive samples, allowing materials such as paper, polymers, organic materials, ceramics, glass, and coatings to be imaged in their original state. It contains a pressure-limiting aperture which admits a controlled amount of air into the sample chamber to raise the pressure around the sample. The leakage rate is designed for optimal charge reduction while maintaining a high vacuum in the column for stable system operation. Compared to standard holders, the charge reduction sample holder can be used to obtain significantly higher magnification images from non-conductive materials.

  • Sputter coating

By using a sputter coater such as the LUXOR series, it is possible to create a thin layer of a conductive material on the sample surface. This creates a connection between the sample surface and the ground potential via the aluminium pin. The choice of coating material depends strongly on the kind of analysis to be performed. Gold and platinum are ideal materials for high-resolution imaging because both have extremely high conductivity. Lighter elements, like carbon, can be used when Energy Dispersive Spectroscopy (EDS) analysis of non-organic samples is required. Indium tin oxide (ITO) can create transparent, conductive layers, suitable for making optical glasses SEM-compatible.

However, there are disadvantages to using a sputter coater: additional instrumentation is required, the analysis becomes more time-consuming, and the sample undergoes more pumping cycles. Any advantage of using a backscattered electron detector (BSD) to image the sample is also lost, as the contrast becomes very homogeneous and there is no longer a difference in grey intensity between different elements. The option of EDS elemental analysis is lost as well.

Magnetic samples

Samples that generate a magnetic field can interfere with the accuracy of the electron beam, reshaping it and producing deformed, blurry images, usually elongated along one axis.

This problem is known as stigmation alteration and consists of an increase in the eccentricity of the beam cross-section, i.e., how far it deviates from a circle: the larger the eccentricity, the less circular the beam.

Stigmation correction 

All SEMs offer the chance to tune the stigmation. Certain instruments require the user to fix stigmation values every time, while others can store standard values that are valid for most samples.

The procedure alters the magnetic field of the lenses, which reshapes the beam. When the shape is circular again, the best image can be produced. After changing the stigmation, it might be necessary to fine-tune the focus again.

Beam-sensitive samples

Delicate samples, like thin polymeric foils or biological samples, can be damaged by the electron beam due to the heat generated in the interaction area or the rupture of chemical bonds.

This will result in either a hole in the surface or a progressive deformation of the scanned area.

Beam settings

The easiest way to reduce this effect is to use lower values for voltage and current. In these cases, the smallest possible values are recommended.

Sputter coating

In the worst cases, a thin coating layer can be applied to the sample to shield the sensitive surface. Increased conduction will also improve image resolution.


Temperature control

Thermal effects can be reduced by using a temperature-controlled device. Removing the heat generated by the beam protects the sample from thermally induced surface modifications.


Scanning speed

Spending a long time on a specific spot will progressively damage the sample. Being quick during the analysis prevents excessive alteration, but might not produce the best results in terms of image quality.


Magnification

Zooming in means the same number of electrons is shot onto a smaller area. Thermal drift increases and deformation effects become more evident. When possible, low magnification is recommended.

Powders and particles


When imaging particles, information such as particle size and shape is important in the design of the process flow. The easiest way to prepare a powder or particle sample is to collect a small amount with a spoon, let it fall onto a double-sided carbon sticker, and then use spray air to remove the excess particles.

Unfortunately, this method will cause many particles to overlap, hiding important features, or to be blown off, inducing errors in particle counting routines.

Particles disperser

The best way to prepare a powder sample is by using a particle disperser unit such as our Nebula. This will allow an even distribution of the sample on the sticker, reducing the incidence of overlapping particles and generating a pattern that can be used to study granulometry. Operational parameters, such as the vacuum level and the amount of sample needed, depend largely on the nature of the powder. Factors to consider:

  • Fine powders require a smaller amount of sample.
  • Delicate samples might break due to strong pressure outburst.
  • Hydrophilic samples might need a higher vacuum burst to be separated.

Moist or outgassing samples

When electron microscopes operate at high vacuum, any wet sample loaded into the imaging chamber will immediately start to outgas.

Certain samples have microstructures that will resist the phase change, providing excellent results without major concerns.

A typical example is a fresh leaf. A sample without a rigid structure can be imaged if force drying or critical point drying is used to prepare it.

Force drying

To verify whether the sample will withstand the vacuum, the use of another instrument, such as a desiccator or a sputter coater, is recommended. Any changes in the sample should be immediately noticeable.

Critical point drying

Also known as supercritical drying, this technique forces the liquids in the sample to evaporate while maintaining a low temperature. The evaporation is driven by the pressure, which is brought below the vapor pressure of the liquid in the sample. During this process, the escaping liquids can create fractures in the sample, causing modifications to the structure.


Freezing

This is an alternative to drying techniques that preserves the structure of the sample completely intact by freezing it. If the phase change is quick enough, the liquids in the sample will not form crystals and the structure will be perfectly preserved. It is important to consider that the phase change is not permanent: prolonged exposure to high vacuum will increase the evaporation rate.

Low vacuum

If the sample does not have a particularly high moisture content, using a small amount of sample at a reduced vacuum level can be enough to collect images. The overall image quality will be lower, but the sample can be imaged in its original state.

Small amount of sample

Using a small quantity of sample is sometimes enough to contain the effects of vacuum and evaporation. The sample can be collected with a toothpick and a thin film of it deposited on the stub. This technique is particularly effective with gels and emulsions.

Sample preparation is just the beginning for faster and better analysis. Learn how to improve your process even more by speaking with an SEM expert. Contact us today


3 Factors to Consider for Automated Live-Cell Imaging

Live-cell imaging is important for many applications; however, limitations of conventional methods have constrained its routine use. Reliable live-cell imaging requires an environment that keeps the cells functioning during the experiment, while also ensuring the experimental method is not perturbing the cells and affecting the interpretation of the results. Here we discuss the growing popularity of automated live-cell imaging systems and highlight some key features to look for when selecting one.

What is live-cell imaging?

Live-cell imaging is a microscopy-based technique used to examine living cells in real-time. It offers deeper insights into dynamic cellular processes such as migration, confluency and signaling and can reveal findings that might otherwise have been overlooked. Both brightfield and fluorescence-based live-cell imaging modalities support a range of different analysis needs.

How is live-cell imaging used?

Applications of live-cell imaging span basic research through to biopharmaceutical manufacturing. In a research setting, live-cell imaging can be used during cell culture to help define the best time for harvest, determine the senescence status of cells, assess drug treatments for cytotoxicity, or detect and monitor phagocytosis. Phagocytosis, for example, is a process by which certain live cells, called phagocytes, internalise foreign matter. This defensive reaction against infection is key in the study of immunology and plays an important role in immune responses, tissue homeostasis, and the continuous clearance of apoptotic cells. Generally, phagocytotic activity is assayed using flow cytometry. However, this only provides end-point quantitative data and no means to monitor phagocytosis in real time. Performing fluorescence-based assays on the CELENA® X High Content Imaging System with pH-sensitive fluorescent particles, like pHrodo™ Green, can be an effective and efficient way to quantify and monitor phagocytic activity. For biopharmaceutical manufacturing, live-cell imaging has broad utility for process development and control throughout the production of biologic drugs and vaccines.
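As a hedged sketch of how such an assay is quantified once cells have been segmented: pH-sensitive particles fluoresce only after internalisation, so the fraction of cells whose signal exceeds a positivity threshold estimates phagocytic activity. The threshold and intensity values below are hypothetical, not output from any particular imaging software.

```python
# Hypothetical per-cell mean fluorescence intensities (arbitrary units)
# from a pHrodo-style assay; real values come from image segmentation.

def phagocytic_fraction(intensities, threshold):
    """Fraction of cells whose signal exceeds the positivity threshold."""
    positive = sum(1 for i in intensities if i > threshold)
    return positive / len(intensities)

cell_intensities = [12, 85, 430, 510, 23, 610, 47, 390]
print(phagocytic_fraction(cell_intensities, threshold=100))  # -> 0.5
```

Repeating this per time point is what turns a static flow-cytometry-style readout into a real-time phagocytosis curve.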

Limitations of conventional live-cell imaging methods

Historically, live-cell imaging has involved manually monitoring cells by culturing them in a CO2 incubator. The culture vessel is removed several times to take images of cells over time using a digital microscope. This approach is labor-intensive and highly prone to human error, largely because it offers no means of finding the same position in the culture vessel. Fluctuating environmental conditions can also cause cellular stresses, which can compromise results. While benchtop imaging systems improve on this method, they are bulky and cumbersome, and often struggle to maintain a stable environment.

3 Factors to consider when choosing an automated live-cell imaging system

Automated live-cell imaging systems like the CELENA® X offer a flexible design that is smaller, faster and easier to use to meet both the demands of the drug discovery industry and the basic research needs of the smaller laboratory.

Multiple imaging modes that are affordable

Live-cell imaging systems that offer both brightfield and fluorescence options for either time-lapse or real-time monitoring provide maximum flexibility. The CELENA® X integrates an automated fluorescence microscope with quantitative image analysis software to process large datasets at an affordable cost. Its interface allows a user to run multi-well or multi-spectral experiments, with capacity for multi-point imaging, in only a few clicks. The microscope supports a multitude of fluorescence cell imaging possibilities with objective magnifications from 1.25x to 100x, both brightfield and phase contrast illumination, and LED filter cubes.

Stable scanning performance and compatibility

The system is compatible with a wide range of cell culture vessels, such as multi-well plates, dishes, flasks and slides, to cover a wide variety of assay types. Depending on the application, either image-based or laser-based autofocus can be used. With brightfield image-based autofocusing, the CELENA® X has been used to image confluency of McCoy cells seeded in 96-well plates over 48 hours, demonstrating how the system can be applied as a high-throughput method for various cell-based assays. With laser-based autofocusing and multiple filter cubes, the CELENA® X has been used to image the dose-dependent effects of anti-cancer drugs throughout the cell cycle in HeLa cells seeded in 96-well plates, demonstrating how the system can be used in a multivariate drug screening process.

User friendly interface and 3D modeling

Analysis of high-content images with large datasets can cause problems for most types of analytical software. Each new assay requires the creation of new modules, which can be challenging for lab staff and often involves IT support. The CELENA® X provides easy-to-use, modular-design analysis software based on the powerful CellProfiler engine. Tens of thousands of images can be analysed automatically to obtain quantitative information without complex software settings or optimisation.

Three-dimensional (3D) cell models can provide a more accurate representation of real cell environments than 2D cell culture systems, but they require a different imaging and analysis strategy. Organoids, for example, are organ-specific 3D cell models derived from human stem cells, designed to mimic the functionality and structure of human organs, while 3D spheroids can reproduce the gradient of nutrients and oxygen between cells in the outer and inner layers, which is more relevant to physiological environments. These 3D models are notably useful for studying various types of cancers.

The challenge when imaging organoid and spheroid assays comes from organoids spanning multiple focal planes, making it difficult to acquire in-focus images of multiple organoids at once. For live/dead viability of a single organoid, a different analysis strategy is also required, since the individual cells in an organoid do not share a single live/dead status. To address this, the CELENA® X employs a MergeFocus software module after acquiring Z-stack images from multi-channel fluorescence.
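The idea behind a focus-merging step like the MergeFocus module mentioned above can be sketched in a few lines. This is a generic extended-depth-of-field approach, not the vendor's actual algorithm: for each pixel, keep the value from the Z-slice where the local contrast is highest.

```python
import numpy as np

def merge_focus(stack: np.ndarray) -> np.ndarray:
    """stack: (z, h, w) grayscale slices -> (h, w) extended-focus image."""
    gy, gx = np.gradient(stack.astype(float), axis=(1, 2))
    sharpness = gx ** 2 + gy ** 2            # local contrast in each slice
    best = np.argmax(sharpness, axis=0)      # sharpest slice index per pixel
    rows, cols = np.indices(best.shape)
    return stack[best, rows, cols]

# Toy stack: one high-contrast slice and one defocused (flat) slice.
in_focus = np.arange(16, dtype=float).reshape(4, 4)
blurred = np.full((4, 4), 7.5)
print(np.array_equal(merge_focus(np.stack([in_focus, blurred])), in_focus))
```

Production implementations typically smooth the sharpness map and blend across slice boundaries, but the per-pixel "pick the sharpest plane" principle is the same.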

We invite you to use the CELENA® X automated live-cell imaging system today and compare it for yourself.

Contact us to arrange a free trial.

ATA Scientific Pty Ltd
+61 2 9541 3500 


5 Step Guide to Improving the Accuracy of Automated Cell Counting

Cell counting underpins numerous applications, spanning basic research through to the development and production of cell therapies. In recent years, manual cell counting methods have been replaced by the use of automated cell counters, which are both faster and more accurate, especially for complex sample types. Here we discuss 5 factors to consider when choosing an automated cell counter and share useful tips for instrument use to ensure accurate results.

1. Match the instrument specifications to the cell type

While cell-based research has traditionally relied on immortalised cell lines, it is increasingly common for more complex sample types to be used. These include primary cells, peripheral blood cells, stem cells, dissociated tumor cells, and even engineered T cells, which vary in terms of size, shape, and aggregation properties. To improve cell counting accuracy, many users have switched from manual, haemocytometer-based methods to counting cells with automated platforms. However, automated cell counters can only provide accurate results if they feature the right specifications for the cell type in question. Although an automated cell counter equipped with brightfield microscopy optics and a low-magnification objective represents a budget-friendly option for counting well-isolated, homogenous cell lines, an instrument capable of measuring fluorescence is often a better choice for counting clumpy samples or smaller cell types such as peripheral blood mononuclear cells (PBMCs).

2. Determine an appropriate sample dilution range

So, the question is, what factors should you consider when using an automated cell counter? A primary concern is the optimal dilution range, typically provided as cells/mL. This differs among instruments based on the size of the field of view (FOV), the magnification and the image sensor size. For counting accuracy, the sample should be diluted such that it falls within this range—if too dilute, counts will be inaccurate; if too concentrated, the instrument software will struggle to distinguish individual cells.
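The dilution arithmetic can be sketched in a few lines. The optimal range used below (5e5 to 5e6 cells/mL) is a hypothetical example; always check your instrument's actual specification.

```python
import math

# Minimal, instrument-agnostic sketch: pick the smallest whole-number
# fold-dilution that brings a concentrated stock to a target concentration.

def dilution_factor(stock_cells_per_ml: float, target_cells_per_ml: float) -> int:
    """Fold-dilution to reach the target (rounded up to a whole fold)."""
    return max(1, math.ceil(stock_cells_per_ml / target_cells_per_ml))

# Aim for the middle of a hypothetical 5e5-5e6 cells/mL optimal range:
target = (5e5 + 5e6) / 2                  # 2.75e6 cells/mL
print(dilution_factor(2.0e7, target))     # -> 8 (2e7 / 8 = 2.5e6, in range)
```

Targeting the middle of the range rather than its upper edge leaves headroom for pipetting error in the dilution itself.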

3. Consider using fluorescence staining for viability measurements

Most automated cell counters allow viability measurements to be performed using trypan blue staining and brightfield imaging. But, while this approach provides accurate results for homogenous samples like cancer cell lines, it is less reliable for more diverse sample types. For example, primary cells are often contaminated with large numbers of red blood cells, which will be mistakenly classified as dead cells after staining with trypan blue. Cellular debris and non-cellular particles can also be misidentified in this way, leading to data being artificially skewed. Fluorescence-based methods such as acridine orange/propidium iodide (AO/PI) staining provide greater accuracy than trypan blue, regardless of cell type, and are fast becoming a preferred method for measuring cell viability.

4. Maximise the counting volume

Once samples have been prepared for counting, they are loaded onto a chamber slide; this functions to provide a fixed volume measurement based on the chamber height. Because the chamber height is set by the slide manufacturer, scanning multiple FOVs is the best way of increasing cell counting volume to achieve greater accuracy and precision. Modern instruments equipped with an automated scanning stage can count volumes of up to 5.1 µL, which is 10-fold higher than a conventional haemocytometer measurement.
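The precision gain from counting a larger volume follows directly from counting statistics. Cell counts are approximately Poisson-distributed, so the relative error (CV) of a count of N cells is roughly 1/sqrt(N); the figures below are illustrative, not instrument specifications.

```python
import math

# Sketch: Poisson counting error as a function of counted volume.

def poisson_cv(concentration_per_ml: float, volume_ul: float) -> float:
    """Approximate CV (fractional) of a count over the given volume."""
    cells_counted = concentration_per_ml * volume_ul / 1000.0  # uL -> mL
    return 1.0 / math.sqrt(cells_counted)

conc = 1.0e6  # cells/mL
small = poisson_cv(conc, 0.5)   # ~haemocytometer-scale volume
large = poisson_cv(conc, 5.1)   # scanning-stage volume
print(round(small / large, 2))  # -> 3.19: ~sqrt(10.2)x tighter CV
```

This is why scanning multiple FOVs (a larger effective volume) is the most direct route to lower CVs, especially at low cell concentrations.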

5. Optimise the counting protocol

The most common error when counting cells is incorrect focusing, which, for brightfield microscopy, can lead to poor discrimination of live cells from dead cells. If an automated cell counter offers only manual focusing, it is recommended that users obtain a bright spot at the center of each live cell for brightfield imaging to produce an accurate count. Fortunately, most modern automated cell counters feature an autofocusing function, which can minimise focus-related issues when running in brightfield mode. Alternatively, since fluorescence imaging is relatively insensitive to focus, using a fluorescence cell counter with an autofocus function will eliminate this problem.

Another widespread mistake is failure to optimise the cell counting algorithm, also known as the counting protocol. This is proprietary to each automated cell counter and is used for image preprocessing, object finding, and object classification. As it would be impossible for a single counting protocol to cover every cell type, most algorithms allow the end user to perform further optimisation. Factors to consider here include the cell size, spot brightness, cell detection sensitivity, roundness, noise reduction, and fluorescence intensity, all of which should be carefully tailored to the diversity of cell types within the sample.
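To make the object-classification step concrete, here is a hedged illustration of what tuning a counting protocol amounts to. The parameter names and thresholds are hypothetical, not any vendor's protocol format: detections are filtered by size and roundness so that debris and clumps are excluded from the count.

```python
# Hypothetical detections as produced by an image-analysis step;
# each dict holds per-object measurements.

def classify(objects, min_diameter_um=8.0, min_roundness=0.6):
    """Keep only detections that look like intact single cells."""
    return [o for o in objects
            if o["diameter_um"] >= min_diameter_um
            and o["roundness"] >= min_roundness]

detections = [
    {"diameter_um": 12.0, "roundness": 0.9},   # typical cell: kept
    {"diameter_um": 3.0,  "roundness": 0.8},   # debris: too small
    {"diameter_um": 15.0, "roundness": 0.3},   # clump: not round enough
]
print(len(classify(detections)))  # -> 1
```

Optimising a real protocol means adjusting exactly these kinds of thresholds (plus brightness, sensitivity and declustering settings) against a sample of known composition.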

Introducing the LUNA™ Automated Cell Counter

The LUNA™ automated cell counter family from Logos Biosystems provides fast, accurate counting for a broad range of cell types, including PBMCs and CAR T cells. Dilution of highly concentrated samples and replicate counts at low cell concentrations are no longer necessary with the LUNA series of automated cell counters.

The LUNA-FX7™ is the newest member of the LUNA family, providing unmatched cell counting accuracy, dual fluorescence and brightfield detection, an advanced declustering algorithm, precision autofocus and 21 CFR Part 11 compliance. It has built-in quality control features and precise validation slides for monitoring QC and bioprocesses. The LUNA-FX7™ meets a broad range of counting applications, from confirming cell quantities for single-cell sequencing to highly accurate cell therapy dosing.

Key benefits:

  • Larger counting volumes, up to 5.1 µL for high accuracy (1% CV)
  • High throughput using either 3- or 8-channel slides.
  • Bioprocess package that can monitor individual and specific batches.
  • Innovative CountWire™ package for 21 CFR Part 11 compliance.

Offering a variety of slide options, the LUNA-FX7™ utilises a counting volume of up to 5.1 µL, lowering error and CV for each count.

We invite you to use the LUNA-FX7 Automated Cell Counter today and compare it for yourself.

Contact us to arrange a free trial.

ATA Scientific Pty Ltd
+61 2 9541 3500 


The Essential Tools That Are Driving Advanced Additive Manufacturing

Australia’s additive manufacturing (AM) industry is off and running, transforming the way we produce and distribute goods. Parts which once required multiple components to be assembled manually can now be produced more viably using AM in a one-step build process. AM removes the need for complex shipping arrangements to move instruments from place to place, relying instead on digital files to print products on demand. At the height of the COVID-19 pandemic, 3D printing stepped up to become a vital technology, providing solutions to severe supply chain disruptions ranging from personal protective equipment (PPE) to emergency dwellings for isolating patients. From aerospace to automotive engineering, and from the medical to the dental industry, AM is an evolving technology revolutionising industries across the country.1

The Australian Government’s ‘Modern Manufacturing Strategy’, together with the funding available for the sector, has enabled new manufacturing-focused research facilities that work alongside industry.2 With new opportunities to deliver cutting-edge R&D in AM and materials processing, highly complex or previously unachievable products can be created quickly and efficiently for the global market. CSIRO established Lab22 as Australia’s Centre for Additive Innovation with a vision to grow a new manufacturing industry, citing a recent focus on critical minerals and hybrid manufacturing.3 The University of Sydney and GE Additive have also joined forces to collaborate on R&D topics and demonstrate AM technology via the new Sydney Manufacturing Hub.4 These and many other AM research facilities in Australia underpin a growing AM industry, helping to build sovereign capabilities. Instead of sending processed ores overseas and importing them back as powders, Australia can forge on, finding ways to turn minerals into new AM innovations.

But as exciting as the possibilities are in AM, the process itself is not without its challenges. Problems with final product consistency and a narrow range of expensive raw materials are some of the biggest obstacles to the widespread adoption of AM. In processes that use powder as the raw material, for example, just one particle could contaminate the rest of the material, impacting the overall quality of the end product. To maintain consistent high quality in these components, producers need to ensure that their input materials are carefully monitored and optimised. 

Why particle characterisation is critical 

Key to developing and manufacturing high quality materials with the required functionality and performance is understanding the relationship between material structure and material properties. From metals and polymers to composites and ceramics, monitoring the particle size and shape is important to ensure the powder supply is consistent and meets specifications. Beyond quality control, it also plays a vital role when investigating novel alloys or composites or developing a new AM process.

Below are two key analytical tools that support additive manufacturers with material characterisation.

The use of laser diffraction for particle size distribution

Particle size distribution is critical for powder bed AM processes since it affects powder bed packing and flowability, which in turn impact build quality and final component properties. The Malvern Mastersizer 3000 uses laser diffraction, an established technique for measuring the particle size distribution of metal, ceramic and polymer powders for additive manufacturing, and is employed by powder producers, component manufacturers and machine manufacturers worldwide to qualify and optimise powder properties. A complete high-resolution particle size distribution is provided in a matter of minutes (from 10 nm to 3.5 mm) using either wet or dry dispersion. The technique can also be integrated into a process line to provide real-time particle sizing.5
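Laser diffraction results are usually summarised as Dv10/Dv50/Dv90 percentiles of the cumulative volume distribution. As a sketch of how those numbers relate to the raw distribution (the size bins and cumulative values below are made-up example data, and real software typically interpolates on a logarithmic size axis; linear interpolation keeps the sketch simple):

```python
import numpy as np

def dv_percentiles(sizes_um, cumulative_pct, targets=(10, 50, 90)):
    """Interpolate particle sizes at given cumulative-volume percentiles."""
    return {f"Dv{t}": float(np.interp(t, cumulative_pct, sizes_um))
            for t in targets}

sizes = [10, 20, 30, 45, 60, 90]   # um, bin upper edges (example data)
cum = [2, 15, 48, 75, 92, 100]     # cumulative volume % (example data)
print(dv_percentiles(sizes, cum))
```

A powder specification for powder bed fusion is often written directly in these terms, e.g. requiring Dv10 and Dv90 to fall inside a stated window batch after batch.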

Automated Image Analysis for particle shape and composition

Powder bed density and powder flowability are influenced by particle size and shape. Particle morphology is therefore another important metric for powder bed additive manufacturing, with smooth, regular-shaped particles preferable as they can flow and pack more easily than those with a rough surface and irregular shape. The Malvern Morphologi 4-ID provides automated optical image analysis to classify and quantify the size and shape of metal, ceramic and polymer powders. The fully integrated Raman spectrometer also enables component-specific morphological descriptions of chemical species.

The Phenom ParticleX AM is a specialised high-resolution desktop scanning electron microscope (SEM) dedicated to optimising AM metal powders and final product quality. By combining an imaging resolution of <8 nm and magnifications up to 200,000x with X-ray analysis (EDS) for elemental composition, properties such as structural integrity, print resolution, surface uniformity, phases and the presence of impurities or defects can be determined, contributing unique insights not possible with other systems. A scanning area of 100 x 100 mm grants a large degree of freedom to image and assess the size and shape of whole parts or sections of a larger component simultaneously. This fully integrated system is simple to operate and eliminates the need to outsource quality checks, speeding up time-to-market.

We can go further together

At ATA Scientific, we don’t just sell our instruments – through collaboration with a broad range of industries and academic institutions, we play a key role in the AM ecosystem. We support our customers by providing optimal material characterisation techniques used in AM together with key insights into the application, measurements and analysis to fully understand material behaviour. 

Contact us for more information today!

ATA Scientific Pty Ltd
+61 2 9541 3500


  1. Additive manufacturing: could it drive global success for Australian businesses?
  2. Additive manufacturing and critical minerals come together at CSIRO’s Lab22 – CSIRO
  3. Additive manufacturing and critical minerals come together at CSIRO’s Lab22 – CSIRO
  4. $25M Sydney manufacturing hub launches to drive state wide innovation
  5. Characterising material properties for powder additive manufacturing

The Importance of RNA Research in Australia

As RNA science continues to make headlines around the world as the new way of making safer, more targeted medicines, it has sparked a wave of new studies that offer significant potential not only as broad-spectrum vaccines but also as treatments for cancer, genetic and autoimmune diseases. This is quite extraordinary given that, prior to the COVID-19 outbreak, few people outside the RNA research community were aware this technology even existed. That Australia is still unable to manufacture mRNA vaccines – despite their proven success against the SARS-CoV-2 virus – presents a huge opportunity.

The COVID-19 pandemic brought together world-leading researchers in a mass response from across many fields, diverting resources to deliver a vaccine that could help save millions of lives. Scientists stepped up and worked tirelessly despite myriad challenges, the biggest of which was vaccine supply. Recognising this challenge, and the need to build local capability, led a humble yet exceptionally talented scientist from Iceland to step up and run the first RNA institute in Australia.

Australia’s RNA capability strengthens as UNSW RNA Institute opens

Professor Pall (Palli) Thordarson, an award-winning researcher and chemistry professor at UNSW Science, is set to kickstart RNA capabilities here in NSW and finally launch our Genetic Medicine Ecosystem. Fuelled by his interest in the interface between chemistry and biology, Palli has always been fascinated by the role of RNA and now, this molecule is the focal point of his work. RNA science seems like an overnight success, but it has been decades in the making and holds huge potential for making critical contributions toward advancing human health. The facility will allow scientists to connect and network with industry partners and other collaborators to meet research and manufacturing needs.

The importance of driving onshore advances in RNA research and therapies

Today, the NSW Government, together with the 14 universities that constitute the NSW RNA Bioscience Alliance and the dozen research organisations within the NSW RNA Production Research Network, are involved.2 The goal of creating a national Genetic Medicine manufacturing facility in Australia was captured during an early meeting of the Australian RNA Production Consortium (ARPC). It is inspiring to see so much occurring around the nation, knowing each of the original members of the ARPC is working tirelessly to build this, not by simply replicating facilities in each state, but by building a collaborative web of skills and infrastructure. These people have changed the future of scientific research in Australia – thank goodness for these cognisant few!

It seemed puzzling to find a chemistry expert amongst so many RNA giants; however, as time passed, the sheer value someone like Palli brings to such a forum became clear. Palli’s fascination with RNA emerged early in his PhD: “I remember looking at RNA and RNA-based systems biology as a PhD student and thinking, ‘These are the most fascinating chemical machines in the world’.” Palli has long tried to connect the dots between the pure chemistry world and the biological sciences, and to understand just how that chemical machine results in a biological translation.

As we spend inordinate effort educating and developing a constant crop of scientists, from high school to early career researchers, we hear the same things from PhD students today as we did back in the 1980s, and I suspect before: “we can’t find work as we are either too inexperienced or over-qualified!” Such enormous investment into this ecosystem can only help resolve much of this.

Whilst there is a great deal of investment in Victoria and some in Queensland, there are synergies between the research institutes and many collaborations. The global nature of science has throughout history proven there are no boundaries, no borders and only one tribe, the science tribe; we see beyond nationality, and we travel the planet in a quest for knowledge, going to where the research is. To be part of this global community, Australia needs vision such as that shown by Palli. Without it, we could never attract the best of the best to Australia, nor would we retain our global giants.

Next generation technologies will nurture a wide range of genetic medicines

Being privy to much of the research, and seeing technologies you supply have material impact on the future lives of potentially millions of people, is inspirational. Consider that this technological revolution answers a question we didn’t even know existed, solving a problem we didn’t know we had. As Palli often states, the missing link in all this manufacturing ecosystem is the humble lipid nanoparticle. Now, “humble” should not suggest it is simple; far from it. However, it has been made a great deal simpler by the NanoAssemblr platform.3 These technologies are unique: they create very small particles, quickly and repeatably, and encapsulate a payload, such as mRNA, a peptide, protein, or small molecule, for delivery to the cell by stealth. Imagine you are a cancer researcher and you have figured out the mechanism of how cancer cells replicate and where they stem from. Using a Nobel prize-winning technique called CRISPR, you edit the RNA – how do you do this? Hitch a ride inside a lipid nanoparticle, slip into the tumour and stop it in its tracks! This is elite science happening right here in Sydney, with team collaborations across the nation. This is the key: we can do wondrous science that will save lives, and this is what drives the scientist. The government sees the billions in revenue and the possibility of jobs, jobs, and jobs. It makes ethical and economic sense!

The benefits of RNA research are far-reaching

For those who assume it is just medical, think again. The agricultural industry is set to benefit from multi-million-dollar investment, with Palli’s help, in biosecurity, disease prevention and control – think lumpy skin disease or foot and mouth disease in cattle. It is hard to see any downside to what Palli has orchestrated. It is visionary, and all power to UNSW for backing him to run down this path. Nice to see Palli back on the farm.

ATA Scientific collaborates with thousands of scientists throughout Australia and New Zealand providing solutions for scientific challenges. We are constantly adding novel technologies that help pose the obscure questions needed to advance science. Collaborate with us today


1) Website accessed 19 July 2022
2) Website accessed 19 July 2022
3) Website accessed 19 July 2022

An Introduction to Battery Research and Manufacturing

The need to build new energy storage solutions to address the increasing global demand has helped drive a power revolution in battery research and technology. Lithium-ion (Li-ion) batteries are predicted to play a key role in the trend toward renewable and sustainable industrial electrification solutions. As fossil fuels are phased out and CO2 regulations become more stringent, the increase in demand to provide ever more lightweight, low-cost, safe, high-power and fast-charging batteries has accelerated advances in battery technology.

Access to the right tools and technologies can help optimise R&D and production cycles, investigate causes of battery failure, improve safety, and speed up time-to-market, to keep technological progress moving in sync with modern global demands. Here we discuss a complementary set of physical, chemical, and structural analysis solutions designed to enable rapid, high-precision analysis of particle size and shape distribution plus elemental composition of battery materials for the entire process from research through to production.

With the Mastersizer 3000, particle size can be rapidly analyzed with ease, while the Morphologi 4 can image and classify thousands of particles automatically with high statistical accuracy. Our Zetasizer can analyze the zeta potential of a dispersion as well as the size and agglomerate state of nanosized materials. The Phenom XL G2 scanning electron microscope (SEM) allows users to observe the 3D structure of powders and electrodes and also identify elements and the presence of contaminants.

The importance of particle size measurement

The performance of a battery can be characterised according to the amount of energy that it can store or the amount of power that it can produce. The maximum battery power can be increased by decreasing the particle size of the electrode material and increasing the surface area. Battery power is determined by the rate of reaction between the electrodes and the electrolyte, while storage capacity is a function of the volume of electrolyte within the cell. These properties are intrinsically linked to the intercalation structure and particle size of the electrode particles, which determine how well the mobile ions are taken up and released by the electrode. Particle size distribution and particle shape influence particle packing, hence the volume of electrolyte that can be accommodated within the interstitial voids of the electrode, which affects storage capacity. As a result, a mixture of coarse and fine particles is often used in the electrodes to increase surface area, whilst also controlling the overall packing fraction of the electrode material to allow good contact between the electrode and the electrolyte.
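The inverse relationship between particle size and surface area can be sketched quantitatively. For ideal spheres, the specific surface area is SSA = 6 / (rho x d); real powders deviate from spheres, and the ~4.8 g/cm3 cathode density used below is a nominal assumption, so treat this strictly as an estimate.

```python
# Back-of-the-envelope sketch: specific surface area of monodisperse
# spherical particles as a function of diameter.

def specific_surface_area(diameter_um: float, density_g_cm3: float) -> float:
    """SSA in m^2/g for ideal spheres of the given diameter and density."""
    d_m = diameter_um * 1e-6              # um -> m
    rho = density_g_cm3 * 1000.0          # g/cm^3 -> kg/m^3
    return 6.0 / (rho * d_m) / 1000.0     # m^2/kg -> m^2/g

# Assumed nominal cathode-material density of ~4.8 g/cm^3:
print(specific_surface_area(10.0, 4.8))   # coarse 10 um powder
print(specific_surface_area(1.0, 4.8))    # fine 1 um powder: 10x the SSA
```

This is the arithmetic behind blending coarse and fine fractions: the fines raise reactive surface area while the coarse fraction controls packing.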

Particle sizing of electrode materials is commonly performed using the Mastersizer 3000 which uses automated laser diffraction technology. With a measurement range that runs from 0.01 to 3500 µm, the Mastersizer is the particle sizing technology of choice for most battery manufacturing applications – starting from precursor to the final milled electrode materials.

Figure 1: Particle size distribution of three batches of NCM cathode materials synthesised with different processing parameters

Figure 2: Particle size distribution of three batches of synthetic graphite synthesised with different heating conditions

The Malvern Insitec online process systems deliver real-time monitoring of particle size for automated process control. They can be used either to monitor particle size evolution in the precursor slurry or to control electrode material size directly after the mill. Smaller particles in electrode slurry production can be prone to agglomeration and/or flocculation, resulting in uneven electrode coatings and ultimately compromising electrochemical performance. Aggregation and stability can be monitored by measuring zeta potential (particle charge) using the Malvern Zetasizer Ultra: a low zeta potential indicates particles that are likely to aggregate, whereas particles with a high zeta potential will form a stable dispersion. The Malvern Zetasizer Ultra builds on the legacy of the industry-leading Zetasizer Nano Series, adding high-resolution sizing (Multi-Angle Dynamic Light Scattering) and particle concentration capabilities.
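The "low versus high" zeta potential distinction is often applied as a rule of thumb, with ±30 mV a commonly cited stability threshold. The following sketch shows how such a guideline might be encoded; the thresholds and category labels are illustrative conventions from the colloid literature, not Zetasizer outputs:

```python
# Rule-of-thumb dispersion stability from zeta potential magnitude.
# The +/-30 mV threshold is a widely cited guideline in colloid science;
# the exact boundaries and wording here are illustrative assumptions.

def stability_guideline(zeta_mV: float) -> str:
    """Classify colloidal stability from a zeta potential measurement (mV)."""
    magnitude = abs(zeta_mV)  # sign shows surface charge polarity, not stability
    if magnitude >= 30:
        return "likely stable (electrostatic repulsion dominates)"
    if magnitude >= 10:
        return "borderline - flocculation possible"
    return "likely to aggregate"

print(stability_guideline(-45.0))  # e.g. a well-dispersed cathode slurry
print(stability_guideline(-8.0))   # weakly charged: agglomeration expected
```

In practice, stability also depends on pH, ionic strength and any steric stabilisers present, so a threshold like this is a screening aid rather than a guarantee.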

The importance of measuring porosity

Porosity is an important parameter both for the separator and for the electrolyte, which transports lithium ions between the anode and cathode. By controlling porosity, higher intra-electrode conductivity can be achieved, ensuring adequate electron exchange as well as sufficient void space for electrolyte access and lithium-ion transport for intercalation at the cathode. Higher porosity means less heat generated in the cell and greater energy density. However, excessive porosity hinders the ability of the pores to close, which is vital for the separator to shut down an overheating battery. Understanding the porosity of the electrode materials is therefore important to guarantee the right ion accessibility and charging speed.

Recognised as the most advanced instrument in the field for material surface characterisation, the Micromeritics 3Flex has become a crucial tool for the battery industry. The 3Flex is a high-performance adsorption analyser designed for measuring the surface area, pore size and pore volume of powders and particulate materials. Analysis of BET surface area, pore volume and pore size distribution helps to optimise battery components.
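The BET surface area mentioned above comes from fitting the linearised BET equation to a gas adsorption isotherm, typically over the 0.05–0.30 relative-pressure range. As a rough illustration of the arithmetic (a simplified sketch, not the 3Flex's own analysis software):

```python
import numpy as np

# Linearised BET equation: p/(v*(p0-p)) = 1/(vm*c) + ((c-1)/(vm*c)) * (p/p0)
# A straight-line fit yields the monolayer capacity vm, from which the
# specific surface area follows. 0.162 nm^2 is the standard N2 cross-section.

def bet_surface_area(p_rel, v_ads, cross_section_nm2=0.162):
    """Return (surface area in m^2/g, BET C constant) from an isotherm.

    p_rel : relative pressures p/p0 (dimensionless, ideally 0.05-0.30)
    v_ads : quantity adsorbed at each pressure (cm^3 STP per gram)
    """
    p_rel = np.asarray(p_rel, dtype=float)
    v_ads = np.asarray(v_ads, dtype=float)
    y = p_rel / (v_ads * (1.0 - p_rel))        # BET transform
    slope, intercept = np.polyfit(p_rel, y, 1) # linear least-squares fit
    vm = 1.0 / (slope + intercept)             # monolayer capacity, cm^3/g STP
    c = 1.0 + slope / intercept                # BET C constant
    n_avogadro = 6.022e23
    v_molar_stp = 22414.0                      # molar volume at STP, cm^3/mol
    area_m2_g = vm / v_molar_stp * n_avogadro * cross_section_nm2 * 1e-18
    return area_m2_g, c
```

Each cm³/g of monolayer capacity corresponds to roughly 4.35 m²/g of nitrogen-accessible surface, which is why even modest isotherms resolve sub-m²/g differences between electrode batches.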

The Micromeritics AutoPore V Series utilises Mercury Porosimetry, a technique based on the intrusion of mercury into a porous structure under controlled pressures, to calculate pore size distributions, total pore volume, total pore surface area, median pore diameter and sample densities.
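Mercury porosimetry converts each applied pressure into a pore diameter via the Washburn equation, d = −4γcosθ/P. The sketch below uses commonly quoted default values for mercury's surface tension and contact angle (assumed here, not instrument-specific settings):

```python
import math

# Washburn equation used in mercury porosimetry:
#   d = -4 * gamma * cos(theta) / P
# gamma ~0.485 N/m (mercury surface tension) and theta ~140 degrees
# (contact angle) are the commonly used default values, assumed here.

def washburn_pore_diameter(pressure_pa: float,
                           gamma: float = 0.485,
                           contact_angle_deg: float = 140.0) -> float:
    """Pore diameter (m) intruded at a given applied mercury pressure (Pa)."""
    theta = math.radians(contact_angle_deg)
    return -4.0 * gamma * math.cos(theta) / pressure_pa

# Higher pressure intrudes progressively smaller pores:
for p_mpa in (0.1, 10.0, 400.0):
    d_nm = washburn_pore_diameter(p_mpa * 1e6) * 1e9
    print(f"{p_mpa:6.1f} MPa -> {d_nm:10.1f} nm")
```

Sweeping the pressure from around 0.1 MPa up to several hundred MPa is what lets a porosimeter cover pores from tens of micrometres down to a few nanometres in a single run.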

The importance of measuring surface area

Increasing the surface area of the electrode improves the efficiency of the electrochemical reaction and facilitates ion exchange between electrode and electrolyte. However, lower surface area materials are better suited to improved cycling performance, resulting in longer battery life: high surface area brings limitations, as degradation reactions between the electrolyte and the electrode surface cause capacity loss and reduced thermal stability. Nanoparticles hold promise for increasing surface area without capacity loss by providing shorter diffusion paths for lithium ions between the graphite particles, which facilitates fast charging, more efficient discharge rates and improved battery capacity.

The Micromeritics TriStar II Plus is an automated, three-station, surface area and porosity analyser. MicroActive software allows the user to overlay a mercury porosimetry pore size distribution with a pore size distribution calculated from gas adsorption isotherms to rapidly view micropore, mesopore, and macropore distributions in one easy-to-use application.

The importance of measuring particle shape

Shape affects the electrode coating in terms of packing density, porosity and uniformity. Spherical particles pack more densely than fibrous or flake-shaped particles. The average strain energy density stored in a particle increases with increasing sphericity, so fibrous and flake-shaped particles are expected to have a lower tendency for mechanical degradation than spherical particles. Automated imaging using the Malvern Morphologi 4 is commonly employed for particle shape analysis of electrode materials, and can also be coupled with Raman spectroscopy to give particle-specific structural and chemical information.

The importance of analysing chemical composition 

Deviations in chemical composition or impurities in electrode materials can significantly affect final battery performance. For this reason, chemical composition and elemental impurity analysis are an integral part of the battery manufacturing process. Simple to operate and fast to learn, the Phenom XL G2 scanning electron microscope (SEM) allows users to observe the 3D structure of electrodes after production; the size and granulometry of raw powders; the size of pores and fibres in insulating membranes; and the response of materials to electrical or thermal stress. Using fully integrated X-ray analysis (Energy Dispersive Spectroscopy, EDS), the distribution and identity of elements, including the presence of contaminants in the battery sublayers, can quickly be revealed.

The Phenom XL G2 is the only SEM that can be placed within an argon-filled glovebox, allowing users to perform research on air sensitive lithium battery samples.

The future of batteries

Driven by the need to reduce greenhouse gas emissions through renewable energy, and by demand for portable communication devices, the Li-ion battery market is growing at a 14% compound annual growth rate (CAGR). Deutsche Bank forecasts that lithium-ion batteries will account for 97% of battery use in energy storage by 2025. Most automotive companies are now investing in batteries and are in a race to patent critical next-generation battery technologies and battery management systems. Companies in fossil energy and mining are also entering the battery value chain.

Australia is well positioned to capitalise on the significant opportunities presented: it holds the world's third-largest reserves of lithium and is the largest producer of spodumene (the mineral source of lithium). Australia currently produces nine of the 10 mineral elements required to produce most lithium-ion battery anodes and cathodes, and has commercial reserves of graphite, the remaining element. Australia has secure access not only to all the chemicals required for lithium-ion battery production (precursor, anode, cathode and electrolyte materials) but also to the know-how, through various research groups at our universities and institutions such as the CSIRO, meaning more advanced batteries can be manufactured locally. However, as demand for lithium batteries continues to increase, it will eventually outstrip supply, so we need to think beyond lithium to other technologies that can deliver safer, more abundant and cheaper materials (such as sodium, zinc or vanadium) to store renewable energy.

At the University of Wollongong, the Smart Sodium Storage Solution (S4) Project aims to develop sodium-ion batteries for renewable energy storage. This ARENA-funded project builds upon previous research undertaken at the University of Wollongong and involves three key battery manufacturing companies in China. Gelion, a spin-off company from the University of Sydney, is developing gel-based zinc-bromine batteries. The technology uses a unique gel electrode that transforms zinc-bromide technology into a high-efficiency non-flow battery.

Energise your battery research with ATA Scientific

Whether you are a battery component manufacturer looking for greater process efficiency and better quality control, or a researcher striving to determine the performance parameters of newly emerging battery materials, our solutions will offer you the new levels of insight and control needed to power the production of superior quality batteries. Contact us via phone (+61 2 9541 3500), or through our website for a demonstration or quote today!