The accuracy of the Conductivity measurement depends on 4 elements: the error of the device itself, the error of the electrode, the accuracy of the calibration standards, and the temperature compensation. This will all be explained in the following paragraphs.
This error indicates the possible deviation of the actual conductivity of a non-calibrated measurement when using an ideal electrode. The device error is the sum of 2 different errors: the repetition error and the linearity error.
The error of the device is given in the technical specifications. A Conductivity meter usually has several measurement ranges, therefore the error is usually given as a percentage of the full scale (% f.s.) of the range. This makes it possible to calculate the exact absolute device error for each range separately.
Error specification: 0.5% f.s. of range
This does not mean that when you measure a conductivity of 1000 µS/cm, the device might indicate 10 µS/cm more or less in the same solution a few moments later. The device remains quite stable and has a very small repetition error.
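To illustrate how a % f.s. specification translates into an absolute error, the short sketch below applies the 0.5% f.s. figure from above to a few full-scale values. The ranges listed are hypothetical examples, not the ranges of any specific meter:

```python
# Absolute device error per measurement range for a 0.5% f.s. specification.
# The full-scale values below are hypothetical examples.
ERROR_FS = 0.005  # 0.5% of full scale

ranges_uS = [20.0, 200.0, 2000.0, 20000.0]  # full-scale values in µS/cm

for full_scale in ranges_uS:
    abs_error = full_scale * ERROR_FS
    print(f"Range up to {full_scale:g} µS/cm: ±{abs_error:g} µS/cm")
```

On the hypothetical 2000 µS/cm range, for example, the maximum device error works out to ±10 µS/cm, which is where a deviation of 10 µS/cm on a 1000 µS/cm reading could come from.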
The device error can be minimized by a proper calibration and by measuring near the calibration points. The further the measurements deviate from the calibration points, the more the possible error increases again towards the maximum device error. This is because of the possible linearity error.
A Conductivity electrode is a passive element which also remains rather stable. Its error mostly comes from impurities or small construction differences. The measurement error is here also the sum of small repetition errors and larger linearity errors. The linearity error increases for most electrodes when measuring in higher ranges. Plastic-body cells are mostly not suitable for higher conductivity measurements.
This error is eliminated by choosing calibration point(s) near the measurements to be done.
The standards used for the calibration are the largest cause of error that cannot be eliminated. The solutions mostly have an accuracy of 1% or 0.5%; this needs to be checked with the supplier or manufacturer of the solutions.
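The tolerance of the standard can also be expressed in absolute terms. The sketch below assumes a 0.01 M KCl standard with a nominal value of 1413 µS/cm at 25°C (the value used later in this article) and the 1% and 0.5% solution accuracies mentioned above:

```python
# Absolute uncertainty introduced by the tolerance of a calibration standard.
NOMINAL = 1413.0  # µS/cm, nominal value of a 0.01 M KCl standard at 25 °C

for tolerance in (0.01, 0.005):  # 1% and 0.5% solution accuracies
    uncertainty = NOMINAL * tolerance
    print(f"{tolerance:.1%} standard: {NOMINAL:g} ±{uncertainty:.1f} µS/cm")
```

A 1% standard thus already carries an uncertainty of about ±14 µS/cm, which cannot be calibrated away.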
Careful measurement practice keeps the remaining errors low: errors due to contamination of the solutions, temperature differences, or not taking enough time.
Many people notice a value other than expected immediately after calibration in a KCl solution. The following explains why.
A Conductivity meter can show the absolute conductivity (i.e. not temperature compensated) or the relative conductivity, which is the absolute measurement recalculated towards the reference temperature. To obtain the relative value, the meter uses a certain temperature-dependent method to convert the absolute measurement. The Consort meters use the generally accepted international standard EN 27888 to do so. This temperature compensation assumes the measured solution behaves as a natural water. The measured products mostly have a different temperature behaviour. This leads to calculation differences, which increase as the temperature deviates further from the reference temperature.
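As a simplified illustration of what such a conversion does, the sketch below uses a linear model with a coefficient of about 2 %/°C, a common rough approximation; note that this is not the actual EN 27888 function, which is non-linear:

```python
# Simplified LINEAR temperature compensation (illustration only;
# EN 27888 itself prescribes a non-linear function for natural waters).

def to_reference(kappa_abs, temp_c, ref_c=25.0, alpha=0.02):
    """Convert an absolute conductivity reading taken at temp_c (µS/cm)
    to the reference temperature ref_c, assuming a linear temperature
    coefficient alpha (fraction per °C, here 2 %/°C)."""
    return kappa_abs / (1.0 + alpha * (temp_c - ref_c))

# A hypothetical absolute reading of 1330 µS/cm at 22 °C:
print(round(to_reference(1330.0, 22.0), 1))
```

At the reference temperature itself the conversion changes nothing; the further the sample temperature is from 25°C, the larger the correction, and the larger any error in the assumed coefficient becomes.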
When the temperature of a 0.01 M KCl solution is 22°C, the meter will show 1421 µS/cm in this same solution, while 1413 µS/cm is expected at a reference temperature of 25°C. This is hardly noticed when the calibration is done at or close to the reference temperature.
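A reading close to the 1421 µS/cm of this example can be reproduced with a simple linear model in which the meter compensates with a natural-water coefficient while KCl actually follows a slightly smaller one. The coefficients 1.9 %/°C and 2.1 %/°C below are illustrative assumptions only, not values taken from EN 27888:

```python
# Why a compensated meter can show more than 1413 µS/cm in 0.01 M KCl
# at 22 °C. Linear model with ILLUSTRATIVE coefficients (assumptions,
# not the EN 27888 function).
ALPHA_KCL = 0.019    # assumed temperature coefficient of KCl, per °C
ALPHA_WATER = 0.021  # assumed natural-water coefficient used by the meter
KAPPA_25 = 1413.0    # µS/cm, 0.01 M KCl at the 25 °C reference

temp_c = 22.0
# Absolute conductivity the KCl solution really has at 22 °C:
kappa_abs = KAPPA_25 * (1.0 + ALPHA_KCL * (temp_c - 25.0))
# The meter converts it back to 25 °C using the (wrong) water coefficient:
kappa_shown = kappa_abs / (1.0 + ALPHA_WATER * (temp_c - 25.0))
print(round(kappa_shown))
```

With these assumed coefficients the model lands in the low 1420s, close to the 1421 µS/cm of the example; the small mismatch between the two coefficients, applied over a 3°C difference, is enough to explain the deviation.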