Correlative Studies

Nanoparticle research increasingly uses correlative approaches because no single technique captures all particle properties. Combining complementary methods supports formulation optimization, quality control, and mechanistic insight. Transmission electron microscopy (TEM) reveals morphology, nanoparticle tracking analysis (NTA) measures size and concentration in native solution, dynamic light scattering (DLS) provides rapid ensemble averages, and atomic force microscopy (AFM) maps individual particle topography. Together, these methods reveal subpopulations, low-level aggregation, and differences between solution and dried states that isolated measurements often miss.

At Hyperion Analytical, we designed the Envision system to integrate readily into these multi-modal workflows. Envision NTA fits naturally into correlative studies because it works in the liquid phase: you can measure the same sample that you’ll later image with TEM or analyze with AFM. The technique tracks individual particles in real time, giving you size distribution and concentration data that electron microscopy alone can’t provide.

Challenges in Accurate Correlative Microscopy Nanoparticle Analysis

Running correlative studies isn’t straightforward, because sample preparation and measurement conditions differ dramatically between techniques. Here are some of the challenges involved.

  • Varying prerequisites: Different techniques measure particles under vastly different conditions. TEM requires high vacuum and dried samples, while AFM demands immobilization on solid substrates. These preparation steps can alter particle structure, size, and aggregation state.
  • Data integration issues: Each instrument outputs different file formats and uses different sizing algorithms. Comparing results requires careful consideration of what each technique actually measures. TEM gives you a projected 2D area. Dynamic light scattering provides a hydrodynamic radius. NTA tracks Brownian motion to calculate an equivalent spherical diameter. Because each technique reports a different quantity, results rarely agree numerically, and reconciling them requires understanding what each value actually represents.
  • Sample heterogeneity: This adds a layer of complexity as nanoparticles aren’t uniform. A batch contains size distributions, shape variations, and potentially aggregates. Taking a representative subsample for each analysis method while maintaining statistical validity takes planning. You also need enough material, which can be limiting for precious samples such as early-stage pharmaceutical formulations.
  • Timing: Nanoparticles can aggregate, dissolve, or undergo surface chemistry changes over hours or days. If your correlative microscopy nanoparticles workflow takes a week to complete, the particles themselves, and therefore your results, may change, especially for biologics or unstable formulations.
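The sizing mismatch described above has a physical root: NTA and DLS both derive size from diffusion via the Stokes-Einstein relation, while TEM measures dried geometry directly. A minimal sketch of that conversion, using illustrative constants for water at 25 °C (this is the textbook relation, not Envision’s actual processing pipeline):

```python
import math

def hydrodynamic_diameter_nm(diff_coeff_m2_s, temp_k=298.15,
                             viscosity_pa_s=8.9e-4):
    """Stokes-Einstein: equivalent spherical (hydrodynamic) diameter
    from a measured diffusion coefficient.

    diff_coeff_m2_s -- translational diffusion coefficient, m^2/s
    temp_k          -- absolute temperature, K (default: 25 deg C)
    viscosity_pa_s  -- solvent viscosity, Pa*s (default: water at 25 deg C)
    """
    K_B = 1.380649e-23  # Boltzmann constant, J/K
    d_m = K_B * temp_k / (3 * math.pi * viscosity_pa_s * diff_coeff_m2_s)
    return d_m * 1e9  # metres -> nanometres

# A particle diffusing at ~4.9e-12 m^2/s in water corresponds to a
# hydrodynamic diameter of roughly 100 nm.
print(round(hydrodynamic_diameter_nm(4.9e-12)))
```

Because this diameter includes the hydration shell and any surface coating, it is expected to exceed the dry size TEM reports for the same particles.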

Benefits of Using NTA for Studying Correlative Microscopy Nanoparticles

Despite these challenges, correlative studies deliver insights that no single technique can offer. For instance, NTA enables rapid concentration measurements across a wide size range. When you combine these values with the morphological detail from electron microscopy or the chemical information from Raman spectroscopy, each dataset validates and contextualizes the others. Here are some benefits of running such correlative studies with our Envision NTA system.

  • Small sample volumes: The Envision system simplifies the workflow because it requires very little material; 20-50 microliters is enough for a complete measurement. This means you can split a sample across multiple techniques without worrying about running out.
  • True number weighted distributions: Envision tracks individual particles in liquid suspension, generating true number-weighted size distributions. Each particle contributes equally regardless of size, so minor populations remain visible. This prevents the bias seen in ensemble techniques where larger particles dominate the signal.
  • Simultaneous Size and Concentration Validation: The system provides simultaneous size and concentration measurements. Concentration data validates that sample dilution for other techniques hasn’t altered the particle state. For instance, if a formulation shows 2×10^11 particles/mL by NTA but appears aggregated under TEM, the researcher can investigate whether the dilution step for TEM induced aggregation.
  • Broad Particle Size Coverage: Envision operates across a broad size range (approximately 30-1000 nm for most particles, with lower limits of 10-30 nm for highly scattering materials). This span captures primary particles, small aggregates, and larger contaminants in a single measurement. Researchers can track how populations shift between techniques. Does cryo-TEM reveal structures below NTA’s detection limit? Does AFM capture aggregates that appear as a tail in the NTA distribution?
  • Fluorescence Mode for Selective Population Tracking: Fluorescence mode enables selective tracking of labelled particles in mixed populations. When running correlative studies on targeted nanoparticles, fluorescence NTA can track only the ligand-conjugated fraction. This pairs naturally with fluorescence microscopy, providing size and concentration metrics that complement imaging data.
  • Rapid Screening Optimizes Resource Allocation: The measurement speed (3-5 minutes per sample) permits screening large sample sets before committing to low-throughput techniques.
  • High Sensitivity for Weakly Scattering Particles: The optical design delivers high signal-to-noise imaging even for weakly scattering particles. Lipid nanoparticles, polymer micelles, and protein aggregates often scatter light poorly. Envision’s sensitivity extends correlative analysis to these challenging systems where other optical methods fail.
  • Seamless Data Integration with Analysis Software: Data export formats integrate easily with analysis software used for microscopy and other techniques. Size distributions, particle tracks, and concentration measurements can be imported into plotting tools, statistical packages, or custom scripts. Overlaying NTA histograms with TEM size distributions becomes straightforward, facilitating direct visual comparison.
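The overlay step described in the last point mostly comes down to binning both exports on common axes. A minimal pure-Python sketch (the function name and example values are hypothetical; real inputs would come from each instrument’s CSV export):

```python
from collections import Counter

def shared_histogram(diameters_nm, bin_width=10, lo=0):
    """Bin particle diameters into fixed-width bins and normalise to a
    fraction, so distributions from different techniques can be
    overlaid on the same axes."""
    counts = Counter((int(v) - lo) // bin_width for v in diameters_nm)
    total = len(diameters_nm)
    return {lo + b * bin_width: counts[b] / total for b in sorted(counts)}

# Hypothetical exports: NTA (hydrodynamic) vs TEM (dry) diameters, nm.
nta = [82, 95, 101, 98, 110, 103, 97, 89, 105, 99]
tem = [78, 85, 90, 88, 92, 95, 80, 87, 91, 84]

nta_hist = shared_histogram(nta)
tem_hist = shared_histogram(tem)
```

Feeding both exports through the same binning makes the typical NTA-vs-TEM offset (hydrodynamic size larger than dry size) directly visible when the two dictionaries are plotted together.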

Key Applications

Here are some key applications of correlative studies across scientific research segments and industries.

  • mRNA Lipid Nanoparticle Development: Lipid nanoparticles for mRNA vaccines need thorough characterization. Combining NTA with cryo-EM gives you both the population statistics and the detailed structure. Process engineers use NTA to monitor batch-to-batch consistency while formulation scientists use correlative studies to understand how composition affects particle properties. This matters when you’re scaling from lab bench to commercial manufacturing.
  • Extracellular Vesicle Research: Extracellular vesicles are heterogeneous and require correlative characterization. NTA quantifies particle size and concentration, including marker-specific subpopulations via fluorescence. TEM confirms vesicle morphology and excludes contaminants. Correlating NTA with flow cytometry defines reliable size ranges for exosomes and microvesicles and improves experimental interpretation.
  • Virus and Viral Vector Characterization: Viral vector analysis combines NTA concentration measurements with structural and genomic assays. TEM differentiates full and empty capsids, while NTA provides rapid total particle counts. Correlating NTA with qPCR yields full-to-empty ratios. Aggregate detection by NTA is validated using ultracentrifugation or complementary light scattering methods.
  • Nanomedicine Formulation Screening: Correlative workflows balance throughput and detail during formulation screening. NTA identifies candidates with acceptable size, distribution, and concentration. TEM confirms morphology, DLS evaluates reproducibility, and zeta potential assesses surface charge. Correlating stability changes measured by NTA with microscopy reveals failure mechanisms during stress testing.
  • Nanoparticle Targeting and Surface Modification: Surface modification studies use NTA to detect size shifts and aggregation after ligand conjugation. Fluorescence NTA confirms labeled populations and size-dependent binding. Correlating NTA with microscopy and binding assays links physical particle changes to targeting performance and functional outcomes. 
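The full-to-empty estimate mentioned for viral vectors reduces to a simple ratio. A hedged sketch, assuming one genome per full capsid and matched dilutions between the two assays (the function name is illustrative, not part of any instrument software):

```python
def full_particle_fraction(nta_total_per_ml, qpcr_genomes_per_ml):
    """Estimate the fraction of genome-containing (full) capsids by
    dividing the qPCR genome titer by the NTA total particle count.

    Assumes one genome per full capsid and that both values refer to
    the same (or correctly back-calculated) dilution.
    """
    if nta_total_per_ml <= 0:
        raise ValueError("NTA total particle count must be positive")
    if qpcr_genomes_per_ml > nta_total_per_ml:
        # More genomes than particles signals a dilution or assay error.
        raise ValueError("genome titer exceeds particle count; check dilutions")
    return qpcr_genomes_per_ml / nta_total_per_ml

# e.g. 2e11 total particles/mL by NTA, 5e10 genome copies/mL by qPCR
# -> roughly 25% full capsids.
```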

Frequently Asked Questions (FAQs)

How do I maintain sample integrity between different correlative measurements?

Run NTA first since it’s non-destructive and requires liquid samples. Store aliquots at appropriate temperatures (usually 4°C) in low-binding tubes. For time-sensitive samples, complete all liquid-phase measurements within the same day. Document the time elapsed between measurements, especially for aggregation-prone systems. Consider running a second NTA measurement after other techniques to verify the stability of the sample.

Can I use the same dilution for NTA and dynamic light scattering in a correlative study?

Usually, no. NTA works optimally at 10^7 to 10^9 particles per milliliter, while DLS often needs higher concentrations for good signal. Prepare separate dilutions from the same stock so each measurement can be optimized independently. Always run concentration measurements at their ideal dilution, then back-calculate to the original concentration. This gives more accurate results than forcing both techniques to share a single dilution.
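The back-calculation in this answer is a single multiplication; a small sketch with hypothetical numbers:

```python
def stock_concentration(measured_per_ml, dilution_factor):
    """Back-calculate the stock particle concentration from a
    measurement made at an instrument-optimal dilution."""
    return measured_per_ml * dilution_factor

# e.g. 4e8 particles/mL measured by NTA at a 1:500 dilution
# implies 2e11 particles/mL in the original stock.
print(stock_concentration(4e8, 500))
```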

What’s the minimum sample volume needed to complete a full correlative panel including NTA, TEM, and DLS?

NTA needs 20-50 µL per measurement, TEM requires about 5 µL for grid preparation, and DLS typically uses 50-100 µL depending on your cuvette. Always prepare 3-5x more than the theoretical minimum to account for pipetting losses, reruns, and validation measurements. For truly limited samples, prioritize which techniques give you the most critical information first.
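The volume planning above is simple arithmetic. A sketch using the upper-bound volumes quoted in this answer and a safety factor of 4 (the mid-range of the suggested 3-5x; the dictionary keys and factor are illustrative):

```python
def required_volume_ul(per_technique_ul, safety_factor=4):
    """Total sample volume (uL) to prepare for a correlative panel,
    padded for pipetting losses, reruns, and validation measurements."""
    return safety_factor * sum(per_technique_ul.values())

# Upper-bound per-measurement volumes from the text, in microliters.
panel = {"NTA": 50, "TEM": 5, "DLS": 100}
print(required_volume_ul(panel))  # 4 * 155 uL
```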

Should I run correlative studies on every batch or just during initial development?

During development, run full correlative panels to establish relationships between techniques and understand your system thoroughly. Once validated, use NTA as your primary routine QC tool with periodic correlative validation (quarterly or whenever you see unexpected results). This approach balances thoroughness with practicality. If you’re submitting regulatory filings, plan correlative studies on formal stability and validation batches. For research publications, at least one full correlative dataset significantly strengthens your characterization section.