Improved instrumentation, greater sensitivity and faster analysis

Chromatography is one of the main tools of the environmental analytical chemist, providing detailed information on hundreds of different organic compounds which may be present in the environment. A general introduction to chromatography was provided in this journal last year (Dec 06/Jan 07 issue), describing the basic principles of chromatography and a number of the different techniques available in the modern laboratory.

However, technology moves ever onwards, and three of the main drivers for improvements in chromatography are:

  • Faster running/processing times
  • Lower limits of detection
  • Increased confidence in target analyte identification

Faster running/processing times

The pressure from end users on environmental laboratories is always to provide data as quickly as possible. For example, most site investigations are conducted because of commercial pressures – the site is to be sold, or a developer is seeking planning permission, and a site investigation is required before these can proceed. The owner or developer is therefore keen for the information to be available within a fairly short time frame. This places pressure on both the environmental consultant and the laboratory, but the laboratory can only operate within the physical constraints of how quickly a sample can be prepared (drying for twenty-four hours may be needed), extracted with solvent, and then analysed by the appropriate method. Another example of a requirement for a rapid response is when laboratories monitor outflow samples to ensure manufacturing or processing plants are compliant with their discharge consent limits – if these are breached, the regulatory authority needs to know very quickly in order to take action.

Laboratories also have to operate according to strict quality guidelines – every batch of samples (fifteen to twenty) must include a blank, an AQC (analytical quality control) standard, and possibly a process sample. Acceptance criteria on these standards must be met, and if there is a failure for any reason, the whole batch must be repeated. This can also add to the overall sample processing time.

Consequently, improvements to chromatography systems which speed up run times are welcomed by the industry. One example is racer columns, which can be fitted to gas chromatography systems, allowing run times to be cut by more than 50%. A good example is the analysis of EPH (extractable petroleum hydrocarbons) by GC-FID (gas chromatography with flame ionisation detection), where the standard run time is 25–30 minutes.

When a racer column is fitted, the run time can be reduced to less than ten minutes for a similar hydrocarbon range.

Generally, with any form of chromatography, the downside of shortening the run times is a loss of resolution in the separation of the compounds. This is true of racer column analysis, where there will be some co-elution of compounds. This may not be a problem, depending upon how the technology is applied – for example, it is extremely useful in providing a screen for TPH (total petroleum hydrocarbons), which can assist the environmental consultant in deciding where the hotspots are on a site, and where remediation is most needed. It can also be useful in monitoring the ongoing progress of bioremediation, when an indication of how successful the treatment is may be required.

However, it would not be suitable for assessing the sixteen PAHs (polycyclic aromatic hydrocarbons) used as priority pollutant markers, or for the detailed CWG (Criteria Working Group) analysis, as co-elution does occur.

Notwithstanding these limitations, the use of racer columns provides faster analysis at cheaper rates, and is now firmly established within the industry.

Other procedures for improving the analytical speed of samples include the use of more automated processes. Autosampler carousels, which allow samples to be loaded during the day and run unattended overnight, have been in use for many years now. These are very popular with laboratories, as not only do they allow more samples to be processed in a twenty-four hour period, but they also maximise the use of very expensive analytical instrumentation. It would be unlikely now for an instrument to be designed and sold without an autosampler.

One more recent improvement is the use of automated sample preparation stations, which sit on the chromatography instruments and automatically add the internal standards (used to quantify the compounds of interest) and surrogates (used to monitor the recovery of these compounds). Performing these additions automatically on-line saves preparation time, and also provides more consistent and precise analytical data. An example of one such system is shown in Figure 4.

Improved software for data processing has also become available over the last few years; previously, it could take the analyst longer to assess the chromatography and process the data than it took the instrument to run the sample. Several software packages are now available, such as Galaxie and Chromeleon, which have significantly improved these processing times, again reducing them by over 50% in many cases.

Lower Limits of Detection

A major driver in the last few years has been the need for laboratories to achieve ever-decreasing detection limits for priority pollutants, particularly in water samples. Much of the pressure for this comes from European Union legislation, a good example of which is the Water Framework Directive. The phrase used in the directive for many of the pollutants of concern is that they should ‘not be discernible’. The regulatory bodies (specifically the Environment Agency in the UK) will stipulate MRVs (minimum reporting values) which they consider should be achievable, and this places significant pressure upon the laboratories to achieve limits of detection at the nanogram/litre level of concentration.

Analyte                      MRV (ug/l)   LoD (ug/l)
1,1,2-Trichloroethane        0.1          0.01
2,4-dichloropropanol         0.1          0.01
Aldrin                       0.003        0.00003
Atrazine                     0.03         0.003
Azinphos-methyl              0.001        0.0001
Benzene                      1            0.1
Cadmium                      0.1          0.01
Carbon tetrachloride         0.1          0.01
Chlorfenvinphos              0.001        0.0001
Chloroform                   0.1          0.01
PCB (individual congener)    0.001        0.0001
Diazinon                     0.001        0.0001
Hexachlorobutadiene          0.005        0.0005
Mecoprop                     0.04         0.004
Mercury                      0.01         0.001
DDT                          0.002        0.0002
Permethrin                   0.001        0.0001
Simazine                     0.03         0.003
Toluene                      4            0.4
TBT/TPT                      0.001        0.0001
Xylenes                      3            0.3

Table 1: List of selected analytes and MRVs from Appendix 7 of the EA policy document on Hydrogeological Risk Assessment on Landfills

To achieve a reasonable level of certainty in reporting these values, it is usually ideal to reach an LoD ten times lower than the MRV, so it is obvious that these levels will pose significant problems for environmental laboratories, and that careful sample handling by skilled and experienced analysts is needed to ensure data are accurate and meaningful – this is not bucket chemistry. Even with everything in its favour, it is not currently possible to achieve many of these LoDs with existing equipment. The EA itself has recognised that these may only be achievable on clean water samples, and that LoDs on ‘dirty’ waste waters or effluents will be much higher.
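The factor-of-ten rule of thumb can be sketched in a few lines of Python; this is purely illustrative (the function name is invented for the example), with the MRVs taken from Table 1:

```python
# Rule of thumb from the text: the target LoD is one tenth of the MRV,
# giving a reasonable level of certainty when reporting at the MRV itself.
def target_lod(mrv_ug_per_l: float) -> float:
    """Return the target limit of detection for a given MRV (both in ug/l)."""
    return mrv_ug_per_l / 10.0

# Selected analytes and MRVs (ug/l) from Table 1
mrvs = {"Benzene": 1.0, "Mercury": 0.01, "Diazinon": 0.001}
for analyte, mrv in mrvs.items():
    print(f"{analyte}: MRV {mrv} ug/l -> target LoD {target_lod(mrv)} ug/l")
```

Note that for benzene this gives 0.1 ug/l, i.e. 100 ng/l – comfortably into the nanogram/litre territory described above.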

There are a number of methods laboratories can use to attempt to achieve these concentrations – some based on practical procedures, and some on new equipment. However, procedures for improving detection limits are often in direct conflict with the need for faster results, so these two requirements are not always compatible.

  • Use larger volumes of water for extraction – the larger the volume used, the lower the detection limit reached, e.g. if the method normally uses 500 ml and reaches an LoD of 10 ug/l, then using 1000 ml will provide an LoD of 5 ug/l. This can be a problem where the sample volume is limited, and the consultant has experienced difficulties in obtaining larger volumes of water, or where multiple tests are needed on the same sample
  • Using concentration techniques such as the Turbovap – water samples are extracted using 20 ml or more of solvent, and this extract can be reduced in volume to 1 ml by using compressed air or nitrogen to evaporate the solvent, producing a twenty-fold improvement on the LoD compared with analysing the original solvent extract. Care must be taken to avoid cross contamination from solvent vapour, or the loss of the more volatile compounds
  • Using large volume injectors (LVIs) fitted to the chromatography system – most GC or GC-MS systems use a sample introduction loop that injects only one or two microlitres of solvent extract, whereas these systems enable up to 100 microlitres of sample to be added directly onto the column. This has the twofold benefit of improving the LoD and reducing the volume of water required for extraction. A downside of these systems is the problems which may occur with heavily contaminated or ‘dirty’ samples: the increased volume of solvent injected may cause an unacceptable build-up of debris in the injector or on the column itself, so some sample pre-treatment may be necessary
  • Improved instrument sensitivity – newer systems, particularly mass spectrometry systems, have improved significantly over the last few years, allowing nanogram concentrations to be achieved more easily. However, there is significant capital expense in replacing older mass spectrometers with newer versions, as the average system now costs in excess of £60,000 (€84,000)
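The first two approaches above both scale the LoD in simple proportion, and their effects multiply. A minimal sketch, assuming the worked figures given in the bullets (500 ml giving 10 ug/l, and a 20-fold Turbovap concentration); the function name and parameters are illustrative only:

```python
def effective_lod(base_lod_ug_l: float,
                  base_volume_ml: float,
                  sample_volume_ml: float,
                  concentration_factor: float = 1.0) -> float:
    """Scale a method's LoD by the extra water volume extracted and by
    any solvent concentration step (e.g. a Turbovap blow-down)."""
    volume_gain = sample_volume_ml / base_volume_ml
    return base_lod_ug_l / (volume_gain * concentration_factor)

# Example from the text: 500 ml gives 10 ug/l, so 1000 ml gives 5 ug/l
print(effective_lod(10.0, 500.0, 1000.0))        # 5.0
# Adding a 20-fold Turbovap concentration (20 ml extract reduced to 1 ml)
print(effective_lod(10.0, 500.0, 1000.0, 20.0))  # 0.25
```

Even combined, these two practical steps only gain a factor of forty here, which shows why more sensitive instrumentation is still needed to reach the lowest MRVs in Table 1.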

Increased confidence in target analyte identification

We currently have thirty-three listed hazardous priority pollutants, which are considered to be the most toxic of the thousands of compounds potentially present in the environment. Many of the other compounds are very similar in mass or polarity, and are difficult to resolve and isolate. Improvements in mass spectrometer systems, such as triple quadrupole detection and ion traps, allow the use of MS-MS identification of target suites. This technique ionises the molecule as in normal MS operation, then subjects one or more of the fragment ions to further fragmentation, which provides a secondary mass spectrum. With the correct selection of characteristic ions, this becomes an extremely valuable tool in target compound identification, as the probability of two compounds having the same retention time and producing the same ions upon ionisation is low, and is even lower when you move to optimised secondary ions. This provides even greater confidence in the results, further reducing the number of false positive or false negative results produced on simpler or older MS systems.

For target and non-target species detection, time-of-flight (TOF) mass analysers can significantly help in identifying trace levels of materials in difficult matrices by utilising very sophisticated deconvolution software, which allows very similar compounds to be separated, detected and quantified.

The implementation of triple quadrupoles and TOFs has resulted in a new generation of software platforms to deal with the increasingly complex, information-rich data produced by these instruments, and many manufacturers are therefore taking the opportunity to provide combined analytical processing platforms that include GC, LC and MS data.

Conclusion

In conclusion, scientific and technological advances are attempting to keep pace with legislative requirements, but it will require increasing investment from laboratories to maintain their place at the forefront. A possible scenario for the future is that laboratories may become more specialised in particular fields of analysis, such as pesticides, rather than attempting to cover the full range of environmental analysis. Currently in the UK, most environmental chromatography laboratories are fully stretched meeting the requirements of the market, and are under constant pressure to provide high quality data in more rapid timeframes for an increasing range of parameters. It will be interesting to see whether any polarisation of the market takes place over the next five to ten years, as the provision of centres of excellence for specific analyses would seem a logical route to follow.

Published: 10th Mar 2008 in AWE International