**You only get what you pay for! Are low-budget instruments worth what you pay for them?**

“Nano” is one of the most attractive and modern terms in research, industry and marketing.

Thanks to the excellent definition work of the ISO committee TC 229, since August 2008 there has been a precise determination of what “nano” really means (ISO/TS 27687). Development in this field, as well as the body of knowledge, is also growing daily at an accelerating speed.

Only the size-measurement technologies and evaluation methods have not kept pace with these changing trends. Today most sizing instruments are based on the more than 30-year-old PCS (photon correlation spectroscopy) technique, applying micrometer liquid layers for the detection of highly concentrated samples in order to minimize the influence of multiple scattering. Even with today's incomparably greater computing power, they still stick to the simplest 2^{nd}-order cumulant evaluation methods or, in some cases, to highly damped evaluation modes in the style of CONTIN or NNLS (non-negative least squares). All these instruments merely present result graphs or data without checking their relevance, i.e. without proving that the issued result represents the measured data correctly. Simplified figures such as an “error of fit” cannot replace the direct comparison of the measured correlation function with the one representing the issued result. The popular pursuit of Olympic targets, “faster, wider, higher”, has also led to an increasing number of easy-to-handle but inaccurate nano-size measuring instruments.
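To make the limitation of the simple 2^{nd}-order cumulant evaluation concrete, here is a minimal sketch: the logarithm of a field correlation function is fitted with a quadratic polynomial. All sample values are synthetic assumptions for illustration, not data from any instrument.

```python
# Minimal sketch of a 2nd-order cumulant fit: ln g1(tau) fitted with a
# quadratic polynomial. All data below are synthetic, assumed values.
import numpy as np

tau = np.linspace(1e-5, 3e-4, 100)                          # lag times / s (assumed)
g1 = 0.6 * np.exp(-2e3 * tau) + 0.4 * np.exp(-6e3 * tau)    # bimodal sample (assumed)

# Cumulant expansion: ln g1 = -Gamma*tau + (mu2/2)*tau^2 + ...
c2, c1, c0 = np.polyfit(tau, np.log(g1), 2)
gamma_mean = -c1                    # mean decay rate / s^-1
pdi = 2 * c2 / gamma_mean**2        # polydispersity index mu2 / Gamma^2

# Note: the fit collapses the bimodal sample into a single mean decay
# rate plus a width parameter -- exactly the loss of information the
# text criticizes for non-monomodal samples.
```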

In most cases the increased speed is gained by pre-selecting the measured data, skipping signals that would disturb the smooth progression of the correlation curves. All these old-fashioned evaluation procedures stem from the fact that badly conditioned sets of equations could not be evaluated properly with the computing capacity of earlier days. Nowadays this is possible if one takes a little effort to adjust the evaluation range correctly, a task the software itself already assists in a close-to-perfect way. Not to mention the incorrectness of using an intensity matrix instead of a volume matrix for the evaluation.
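To illustrate the undamped NNLS idea, here is a hedged sketch of a non-negative least-squares inversion of a synthetic correlation function using SciPy; the lag-time range, decay-rate grid and bimodal sample composition are illustrative assumptions, not any instrument's actual model.

```python
# Undamped NNLS inversion of a synthetic field correlation function.
# Grid, lag times and the bimodal sample are assumed for illustration.
import numpy as np
from scipy.optimize import nnls

tau = np.logspace(-6, -1, 200)        # lag times / s (assumed)
gamma = np.logspace(1, 6, 60)         # decay-rate grid / s^-1 (assumed)
A = np.exp(-np.outer(tau, gamma))     # kernel: g1(tau) = sum_i x_i exp(-Gamma_i tau)

# Synthetic bimodal "measurement": two well-separated decay rates
g1 = 0.5 * np.exp(-1e3 * tau) + 0.5 * np.exp(-1e5 * tau)

x, rnorm = nnls(A, g1)                # non-negative amplitudes, no damping
g1_fit = A @ x                        # back-calculated correlation function

# The relevance check advocated in the text: compare the measured curve
# directly with the curve representing the issued result.
residual = float(np.max(np.abs(g1 - g1_fit)))
```

In a real evaluation the kernel would of course be built from the instrument's scattering model and, as argued above, converted to a volume weighting; the direct comparison of `g1` with `g1_fit` is the relevance check the text calls for.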

In general terms, applying PCCS technology together with undamped NNLS evaluation based on a volume matrix can today deliver the highest sensitivity and the most correct results.

Nevertheless, the market is flooded with less sensitive and less accurate instruments at the lowest cost. Many of them promise additional user value by also measuring the zeta potential in the same instrument at no extra cost. But is this kind of zeta-potential measurement worthwhile?

According to Smoluchowski’s formula as applied for zeta-potential determination (for the figures, please see the attached PDF file), the migration speed, and thus the zeta potential, depends on the particle size. Commonly the particle size used for the zeta-potential calculation is the so-called **Z-average diameter** or **cumulative mean diameter**. From this point of view alone, the achieved results are highly doubtful, as many samples are not monomodal, so using a mean diameter is incorrect.
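For orientation, Smoluchowski's relation itself is simple, ζ = ημ/ε; a minimal worked example with standard textbook properties for water at 25 °C and an assumed electrophoretic mobility:

```python
# Smoluchowski relation zeta = eta * mu / epsilon. The mobility value is
# an assumed example; the water properties are standard textbook values.
eta = 0.89e-3            # viscosity of water at 25 C / Pa s
eps = 78.5 * 8.854e-12   # permittivity of water at 25 C / F m^-1
mu = 2.0e-8              # electrophoretic mobility / m^2 V^-1 s^-1 (assumed)

zeta = eta * mu / eps    # zeta potential / V (about 26 mV here)
```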

The *Smoluchowski theory* is very powerful because it works for dispersed particles of any shape at any concentration. Unfortunately, it has limits to its validity, since in real systems other properties also take effect, e.g. the hydrodynamic hull, the ionic atmosphere and the degree of dissociation of the electrolyte. It does, for instance, not include the *Debye length* *1/κ*, the ratio between the core size and the diameter of the ionic layer, or the contribution of surface conductivity, expressed as the condition of a small *Dukhin number*. Approximations try to overcome this:

The *Hückel approximation* for the Henry function takes care of small particles in media of low dielectric constant (e.g. non-aqueous media); in this case *f(κa)* becomes 1.0. The *Smoluchowski approximation* for the Henry function takes care of particles larger than about 200 nm in electrolytes containing more than 10^{-3} mol of salt; in this case *f(κa)* is set to 1.5 (for the figures, please see the attached PDF file).
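A hedged numerical sketch of these rules: the Debye length 1/κ for a symmetric 1:1 electrolyte and the resulting choice between the Hückel and Smoluchowski limits. The ionic strength and particle radius are assumed example values.

```python
# Debye length 1/kappa for a symmetric 1:1 electrolyte and the choice of
# Henry-function limit per the rules above. I and a are assumed examples.
import math

eps = 78.5 * 8.854e-12     # permittivity of water at 25 C / F m^-1
kB, T = 1.381e-23, 298.15  # Boltzmann constant / J K^-1, temperature / K
e, NA = 1.602e-19, 6.022e23  # elementary charge / C, Avogadro number / mol^-1
I = 1e-3                   # ionic strength / mol L^-1 (assumed)
a = 100e-9                 # particle radius / m (assumed, i.e. 200 nm diameter)

# 1/kappa = sqrt(eps*kB*T / (2*NA*e^2*I)), with I converted to mol m^-3
debye = math.sqrt(eps * kB * T / (2 * NA * e**2 * I * 1000))
ka = a / debye             # dimensionless kappa*a (about 10 here)

# Large kappa*a in an aqueous electrolyte -> Smoluchowski limit f = 1.5;
# small particles in low-dielectric media -> Hueckel limit f = 1.0
f = 1.5 if ka > 1 else 1.0
```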

This means that a simple PCS instrument (mostly delivering only a monomodal 2^{nd}-order cumulant result, or calculating a mean diameter), combined with the rough estimates described above, will lead to a zeta-potential value of the lowest precision. If this is paired with a simple dilution of the sample, as is necessary for PCS detection, the potential may even be changed before the measurement due to non-iso-ionic dilution.

*This is why the use of a separate acoustic zeta-meter instead of PCS-based instruments is highly recommended.*

*Again, with respect to size determination, only PCCS instruments with the most modern evaluation algorithms can supply the highest-resolution results and can directly prove their relevance.*

For more details you may refer to http://www.sympatec.com/EN/PCCS/PCCS.html

Full text as PDF: Size- & Zeta-pot.-measurement you only get what you pay for!