Gas and liquid chromatography coupled with mass spectrometric detection are the most powerful tools for contaminant analysis in the environment.
Two different ways of applying these methods lead us either towards ever lower concentration ranges, eventually down to the natural background, or towards an increasingly complete overview of which contaminations we actually need to be concerned with.
Making atoms and molecules visible
Analysis of chemical compounds which have been released into the environment is based on technical methods by which atoms and molecules can be specifically detected. The challenge is to detect them in small amounts and to distinguish them from the general background of the immeasurable universe of atoms and molecules of which our world consists. The task is to find a way to see exclusively this one sort of atom or this one sort of molecule. This is generally accomplished by a judicious choice from our comprehensive arsenal of spectroscopic methods, which recognise the building blocks of matter by their individual responses to radiation – colour – or by their behaviour in magnetic fields according to their mass. A helpful, or more often indispensable, prerequisite in such an analytical procedure is the isolation, purification, and enrichment of the wanted compound – the target – from the background, also called the matrix.
In inorganic chemistry, the analysis of chemical elements, i.e. atoms, is the most straightforward case. The sample is completely destroyed and broken down to its single atoms in a plasma chamber; these are then sorted by their mass while flying through a modulating magnetic field and eventually detected as electronic impulses in an electron multiplier. We can thus count how many atoms of which sort are contained in our sample and, by comparison with standard solutions of known content, identify and quantify the metals in our sample [1]. In the next, more complicated analytical approach, the so-called sum parameters, classes of molecules are extracted and purified from the sample by specifically discriminating procedures. This yields information, for example, on extractable organically bound halogens (EOX [2]), eventually detected at a silver electrode, or on total organic carbon (TOC [3]), detected as carbon dioxide after combustion under controlled conditions.

Separation according to molecular properties
As opposed to these often quite brutal approaches, in organic chemistry we need to rely on the extraction of intact molecules of our target analyte from the sample matrix. Much gentler, and thus more sophisticated, methods are required for the extraction and separation of intact molecules from environmental samples. Traditionally, environmental analysis centred on relatively inert and very non-polar compounds like petroleum hydrocarbons [4], PCBs [5], and organochlorine compounds, which are easily analysed with one-step hexane extractions and basic gas chromatography. Nowadays our scope has widened towards, e.g., phenols [6] and organotin compounds [7], which need a more complex derivatising extraction, and polar and sensitive pesticides [8] and polyfluorinated compounds [9], which are favourably analysed by liquid chromatography. The EU Industrial Emissions Directive [10] now demands the analysis of industrial chemicals in the environment. These compounds are often reactive intermediates of chemical processes, sometimes unstable even during extraction procedures, and some need to be detected as their transformation products.
Contributions on chromatography in this journal display quite a few approaches to explaining what chromatography actually is. For the expert, it is beautiful and inspiring to read how colleagues conceive of and regard our subject, and the layman can choose among these descriptions whichever best suits their understanding. So let us add our own perception and concept.
The principle of chromatography was observed when drops of ink fell on blotting paper and started spreading. Prof. Friedlieb Ferdinand Runge in the 19th century called this phenomenon Professorenkleckse [11], professor blobs. During migration outwards, the mixture of pigments is separated into individual rings of the component colours, and the term chromatography was coined from the Greek words for drawing with colours. The effect is based on the paper as stationary phase (don’t tell me that this is where the word stationery for paper products is derived from) and the solvent of the ink as the mobile phase. Separation results from the chemical properties of the pigments, which are either more soluble in the solvent or have a higher affinity to the paper. Chromatography is now any separating method in which a mobile phase moves along a stationary phase while the substances in the system constantly diffuse between the two phases. As a macroscopically visible result, they move with an individual overall velocity which depends on their specific affinity to each of the two phases. The maximum velocity will be one (completely soluble and moving with the solvent front) and the minimum will be zero (so sticky that it is not moving at all).
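For readers who like a formula, this overall velocity can be written compactly as the retardation factor familiar from partition chromatography – a minimal formal sketch, assuming a simple two-phase equilibrium with distribution constant K and stationary- and mobile-phase volumes V_s and V_m:

```latex
R \;=\; \frac{u_{\mathrm{analyte}}}{u_{\mathrm{mobile\ phase}}}
  \;=\; \frac{1}{1 + k},
\qquad k \;=\; K\,\frac{V_s}{V_m},
\qquad 0 \le R \le 1 .
```

R = 1 corresponds to a substance travelling with the solvent front, R = 0 to one that never leaves the stationary phase; everything in between is what produces a separation.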

The time after which a compound appears at the end of the chromatographic system is highly reproducible and is a specific property resulting from its individual molecular structure. This retention time is thus a first identification characteristic. Further identifiers are obtained by placing a suitable physical detector at the chromatographic outlet. This may be a specific wavelength at which the molecule absorbs or emits radiation – colour – or a fingerprint-like array of mass fragments into which it breaks, or adducts which it forms, when hit by charged particles in an ion source.
Method development in analytical chromatography
The development of such a chromatographic method typically starts at the end. We need the wanted compound – the target – as material, at least a few drops or crystals, in our hands. The first step is to prepare a nice, stable, concentrated stock solution to which we can always have recourse. To find useful ways of detection, we initially try different types of chromatography and record spectra of the substance we want to detect in suitable dilutions. Then a range of concentrations is checked to find where the detector signal is proportional to the input amount of substance and can thus be calibrated. Such a calibration may yield statistical performance characteristics of the method and give a limit of detection or a limit of quantification – the smallest concentration at which we can find the substance at all. Then we look for typical or representative samples from the context in which we need to apply our procedure. These can be garden soil, water from under the bridge at our river, sludge, industrial waste, or simply tap water or purified sea sand. The test samples are spiked with the substance, and we start looking for appropriate extraction conditions to get most of our spiked material back. Eventually the procedure can be communicated to other laboratories, validated in a round-robin measurement, and then standardised.
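To make the calibration step concrete, here is a minimal sketch in Python of fitting a linear calibration and estimating detection limits with the common 3.3·s/slope and 10·s/slope conventions; the data points and variable names are invented for illustration and do not come from any particular standard:

```python
import numpy as np

# Invented calibration series: spiked concentrations (µg/L) and detector responses (peak areas)
conc = np.array([0.5, 1.0, 2.0, 5.0, 10.0, 20.0])
area = np.array([1020.0, 2110.0, 4080.0, 10250.0, 20100.0, 40900.0])

# Linear least-squares fit: area = slope * conc + intercept
slope, intercept = np.polyfit(conc, area, 1)

# Residual standard deviation of the calibration function
residuals = area - (slope * conc + intercept)
s_res = np.sqrt(np.sum(residuals**2) / (len(conc) - 2))

# Common conventions: LOD = 3.3 * s / slope, LOQ = 10 * s / slope
lod = 3.3 * s_res / slope
loq = 10.0 * s_res / slope
print(f"slope = {slope:.1f}, LOD = {lod:.2f} µg/L, LOQ = {loq:.2f} µg/L")
```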


During method development, the whole procedure is thus established from end to beginning, starting with the compound in question. This was easy decades ago with hydrocarbons and organochlorine pesticides, when we just placed a few grams of sample and some toluene in a sealed vial in the ultrasonic bath and put the extract on the machine. A long series of new editions and releases of the standard recipes later, and with a few really nasty compounds lately lining our path, today we are looking at quite a development process. When, say, five brains unite as a working group, distribute tasks, regularly gather their results, and repeat each other’s work, besides their normal everyday duties, we expect three years until a feasible process and another two years for a validation round robin and the final edition of a standard procedure. The scope is typically a class of 5 – 25 compounds that become detectable by the same principle. In this way, the world’s experts on analytical chemistry have established an extensive compendium of standard methods (ISO, CEN, and national standards) for the quantitative trace analysis of 4000 – 5000 chemical compounds over the last 40 years. Most of these compounds are pesticides in drinking water.
How many of his sheep can a shepherd know by name?
The number of chemical compounds known to humanity can be estimated from the registry of the Chemical Abstracts Service (CAS), the American Chemical Society’s database on compounds and research, founded in 1907. The registry currently contains 204 million entries [12]. Many of these are everyday compounds or biological components like water (CAS 7732-18-5), sugars (glucose: CAS 50-99-7), or sodium chloride (CAS 7647-14-5). Most others possibly exist in only a few milligrams in one research lab somewhere in the world.
The Stockholm Resilience Centre, a research institute of the Royal Swedish Academy of Sciences, estimates that there are 350,000 different types of manufactured chemicals on the global market [13]. If our pace in method development remains stable, if competent and enthusiastic experts continue to come up, and if no further chemicals are invented, we may hope to possess standard procedures for all relevant chemicals by the mid-60th century.
In 2020, the European Union published its “Chemicals Strategy for Sustainability – towards a toxic-free environment” (CSS [14]) as part of the Green Deal. The European Environmental Bureau [15], an association of environmental citizens’ organisations, has found that the REACh process for the specific risk assessment of a chemical requires 5 to 10 years, and it concludes that the only way to attain the CSS’s goal within the lifetime of the planet is a change of strategy towards the so-called generic approach to risk assessment (GRA). Generic in this context is the opposite of substance-specific and means that a chemical compound can be regulated according to its affiliation within a chemical classification instead of an individual assessment of its toxicity. The chemical industry, on the other hand, regards the generalisations implied by this generic approach as a direct route into a planned economy [16, 17].

ECHA, in turn, which is responsible for handling the REACh dossiers for all the chemical compounds to be regulated, has noticed that its one-compound-one-dossier concept, the specific risk assessment (SRA), is too slow to catch up with, let alone keep pace with, the development of the market. It has therefore developed a likewise generic approach based on the grouping of substances [18]. Part of this is again an evaluation of substance classes via lead compounds, e.g. short-chain or long-chain alkylated phenols.
Some apparently clean water
While in hazard prevention we seemingly have to choose between a sort of “inaccurate” protection and, for a vast number of substances, no protection at all, we are obviously facing a similar dilemma in environmental analysis. The environment will not be toxic-free merely because the few compounds which we can analyse with trace methods at less than µg/kg levels are being monitored.
An especially rude example of misled chemical analysis hit our laboratory last year. Nearly all available trace-analytical methods were ordered. We saw a clear water sample, but with a noticeably high refraction, a yellow colour, and a prominent musty stench. Upon extraction with solvents, a slimy suspension formed (Fig. 2) which was hard to separate. Even such an extract can be clarified, diluted, and subjected to gas chromatography. Preferably, the sample is diluted with tap water, and the reported limits of quantification (LOQ) must be raised accordingly.
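The bookkeeping for such a dilution is trivial but easy to get wrong in a report; a one-line sketch (the helper function and numbers are ours, purely illustrative):

```python
def reported_loq(method_loq_ug_per_l: float, dilution_factor: float) -> float:
    """LOQ to report when the sample had to be diluted before measurement."""
    return method_loq_ug_per_l * dilution_factor

# A method LOQ of 0.1 µg/L and a 1:50 dilution give a reported LOQ of 5.0 µg/L
print(reported_loq(0.1, 50))
```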
BTEX, chlorinated solvents, PFAS, PCBs, nonylphenol, organotin compounds, chlorobenzenes, chlorinated pesticides, brominated diphenyl ethers, phthalic acid esters, chloroalkanes, dioxins, and ethylene oxide were all below the LOQ. Among the polyaromatic hydrocarbons, acenaphthene to pyrene were slightly elevated; otherwise the sample was clean according to the test report.
The polyfluorinated plethora
The dilemma is classically represented by the current debate in the PFAS field. Per- and polyfluoroalkyl substances (PFAS) are a voluminous and very diverse group of industrial chemicals which are derived from normal hydrocarbons by replacement of all or most hydrogen atoms in the backbone with fluorine [19]. The resulting molecules have the same dimensions and behave similarly in many respects, but they have extremely convenient properties as water repellents on outdoor garments and as surfactants in extinguishing foams. Biochemically, hydrogen-to-fluorine substitution in a molecule yields a suicide substrate which fatally inactivates any enzyme acting on this compound. As with many of its gifts to humanity, chemistry had to face the discovery of the PFASs’ toxic properties, their ban, and the development of analytical procedures for environmental samples. The current standard DIN 38407-42 specifies the analysis of 27 compounds like perfluorooctanoic acid and perfluorooctane sulfonic acid, while 4730 PFAS have so far been developed, registered, and patented [20]. The debate is now: why should anyone continue to use one of the compounds which are known to be monitored, and are there any possibilities to make more representatives of this group analytically visible?

As discussed in the introductory paragraphs, there are so-called sum parameters which permit the measurement of whole substance classes by exploiting some of their common properties. A measurement of the total fluorine bound in organic molecules (TOF) is now established. Regulators, however, are reluctant to use it because it is less sensitive than chromatographic trace analysis and is considered unable to detect PFAS at low but toxicologically relevant concentrations. The advantages of both approaches can, however, be combined in a new procedure by which many PFAS are oxidatively degraded to some of the known and easily analysable compounds. This so-called total oxidisable precursor (TOP) assay [21] is currently being standardised in Germany by a DIN working group.
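The evaluation behind the TOP assay boils down to simple bookkeeping: the sum of the known, measurable perfluoroalkyl acids is compared before and after oxidation, and the increase is attributed to precursors that the target list alone would have missed. A sketch of this arithmetic with invented concentrations (the actual procedure and its evaluation are defined in the method under standardisation):

```python
# Invented concentrations (µg/L) of target perfluoroalkyl acids measured
# in the same sample before and after oxidative digestion.
before = {"PFBA": 0.02, "PFOA": 0.05, "PFOS": 0.10}
after  = {"PFBA": 0.35, "PFOA": 0.40, "PFOS": 0.11}

sum_before = sum(before.values())
sum_after = sum(after.values())

# The increase is attributed to oxidisable precursors.
precursor_bound = sum_after - sum_before
print(f"Sum before: {sum_before:.2f} µg/L, after: {sum_after:.2f} µg/L, "
      f"attributable to precursors: {precursor_bound:.2f} µg/L")
```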
We think that, in view of the specific-vs.-generic dilemma described, the virtue of sum parameters in chemical analysis should be eagerly reconsidered. More mediating approaches like the TOP assay for PFAS should be devised and investigated as to whether they can provide a viable compromise. One of these ideas involves non-target mass spectrometry screenings in gas, liquid, and, somewhere in the future, possibly thin-layer chromatography.
Relax, unfocus, and see the whole picture
As already mentioned, molecules can be detected spectrometrically by separating them in magnetic fields according to their mass. To this end they need to be ionised in an ion source. The older technique, called electron impact ionisation (EI), bombards the molecules with electrons in high vacuum and is conveniently placed at the end of a gas chromatograph. The development of ion sources which function under atmospheric pressure was the main challenge in coupling liquid chromatography to mass spectrometry; this coupling is younger and still much more a work in progress.
The gas chromatography/mass spectrometry approach with electron impact ionisation (GC/MS-EI), however, is much simpler and works with exactly the same normalised conditions in all labs. Molecules in the ion source decay into the same fragments in the same ratios every time; this leads to fingerprint-like mass spectra which are stored in searchable databases. From the mass spectra obtained from unknown sample extracts, it is thus possible to completely or partly recognise the compounds contained without ever having had a reference standard in the lab. Moreover, unknown substances can be detected and their spectra documented for future comparison.
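What “fingerprint-like mass spectra stored in searchable databases” means in practice can be illustrated with a toy library search: spectra are compared as intensity vectors over their m/z values, and the best-scoring library entry is proposed as the identification. Real search algorithms are considerably more refined, and the spectra below are invented:

```python
import numpy as np

def spectral_similarity(spec_a: dict, spec_b: dict) -> float:
    """Cosine similarity between two EI mass spectra given as {m/z: intensity}."""
    mzs = sorted(set(spec_a) | set(spec_b))
    a = np.array([spec_a.get(mz, 0.0) for mz in mzs])
    b = np.array([spec_b.get(mz, 0.0) for mz in mzs])
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Invented example: an unknown peak from the chromatogram and two library entries
unknown = {39: 8, 65: 12, 91: 100, 92: 60}
library = {
    "library entry A": {39: 9, 65: 11, 91: 100, 92: 62},
    "library entry B": {43: 80, 57: 100, 71: 40},
}
for name, spectrum in library.items():
    print(name, round(spectral_similarity(unknown, spectrum), 3))
```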
This non-target screening has been criticised for being too vague and inaccurate, but the Swiss Federal Office for the Environment [22] and the German Water Chemistry Society [23] have issued guidelines, and U.S. EPA methods 8260 and 8270 contain provisions for reporting screening results. We recently succeeded in fully standardising a procedure for GC/MS screening of soil samples with surprisingly well interpretable quantitative results. The German standard DIN 3599 [24] describes extraction, measurement, and reliable documentation of screening results. It also contains a quantification concept against plain standard compounds which was found to be very reproducible in a large interlaboratory trial. Results were much less inaccurate than anticipated [25].
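The general idea of quantifying against plain standard compounds can be sketched with a single-point response factor: the unknown is assumed to respond roughly like the reference compound it is compared with. This is our own simplified illustration, not the calibration procedure prescribed in DIN 3599, and all numbers are invented:

```python
# Semi-quantitative estimate of a tentatively identified compound against a plain standard.
ref_conc = 10.0          # µg/mL of the reference compound in the calibration solution
ref_area = 250_000.0     # peak area of the reference compound
unknown_area = 75_000.0  # peak area of the tentatively identified compound

response_factor = ref_area / ref_conc           # area per µg/mL, assumed transferable
estimated_conc = unknown_area / response_factor
print(f"Estimated concentration: {estimated_conc:.1f} µg/mL (semi-quantitative)")
```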

The method development described in Figure 1 can be dramatically simplified for non-target screenings. A simple and reproducible standard extraction procedure defines which classes of chemical compounds can be captured and thus expected to be seen if present (the generic aspect). The database, depending on the quality and significance of the measured mass spectra, will identify signals at least as members of substance classes, and often as specific individual compounds. We will have to put up with somewhat less accurate and less sensitive measurements, but what sacrifice is that for being able to see a much larger portion of the whole truth?
Non-target GC/MS screening of the sample extracts shown in Figure 2 revealed the sources of the sensory obtrusiveness (Fig. 3). In a chromatogram full of unknown and unidentified peaks, two prominent signals were found: 2,4,7,9-tetramethyldec-5-yne-4,7-diol and dicyclohexylamine [26].
Conclusion
The current approach in environmental analysis – in which a substance is identified, toxicologically evaluated, an analytical procedure is developed, and the compound is listed in a complex legislative process – is too slow to deliver a toxic-free environment. The material and staff resources are not available. Users will evade regulation and monitoring by making new compounds, as has been bitterly learned in doping control.
Future analytical strategies will have to include a well-considered combination of generic and specific approaches. Ideally, quick toxicological tests should also be included [27]. When sum parameters and screenings indicate new pollution, target and trace analysis can be developed specifically for the actual cases. The law will have to devise ways to deal with such spontaneous findings appropriately.
References
- [1] DIN EN ISO 17294-2:2017-01 and DIN EN 16171:2017-01
- [2] DIN 38414-17:2017-01
- [3] DIN EN 15936:2012-11
- [4] DIN EN 14039:2005-01
- [5] DIN EN 16167:2019-06
- [6] DIN 38407-27:2012-10
- [7] DIN EN ISO 23161:2019-04
- [8] DIN 38407-36 (F36):2014-09
- [9] DIN 38414-14 (S14):2011-08 and DIN 38407-42 (F42):2011-03
- [10] EU Directive 2010/75/EU
- [11] L. S. Ettre & K. I. Sakodynskii: M. S. Tswett and the discovery of chromatography I: Early work (1899–1903). Chromatographia 35 (1993): 223–231
- [12] www.cas.org/cas-data/cas-registry, cited 06-01-2023
- [13] Stockholm Resilience Centre, www.stockholmresilience.org/research/research-news/2022-01-18-safe-planetary-boundary-for-pollutants-including-plastics-exceeded-say-researchers.html, cited 11-01-2023
- [14] environment.ec.europa.eu/strategy/chemicals-strategy_en
- [15] eeb.org/wp-content/uploads/2022/05/EEB-Comments_CA44_GARM-1.pdf
- [16] www.vci.de/vci-online/themen/chemikaliensicherheit/eu-chemikalienstrategie/eu-chemicals-strategy-for-sustainability-vci-position.jsp
- [17] Camphanic acid chloride is a valuable tool for the separation of stereoisomers. It was not available in South Africa in 1996 because carboxylic acid chlorides could not be shipped by air freight. Stereoisomer separation was instead accomplished on three buckets of silica within five weeks with 50 L of dichloromethane. The author CB owes part of his mental condition to this experiment. Liver damage was prevented by indulgent perseverance.
- [18] echa.europa.eu/en/support/registration/how-to-avoid-unnecessary-testing-on-animals/grouping-of-substances-and-read-across
- [19] H. U. Dahme: Bewertungsschwierigkeiten der PFAS-Belastungssituation und deren Auswirkungen auf den Wirkungspfad Boden-Grundwasser [Difficulties in assessing the PFAS contamination situation and its effects on the soil–groundwater pathway]. Altlastenspektrum 31 (2022): 157–204
- [20] OECD Environment, Health and Safety Publications, Series on Risk Management No. 39 (2018): Toward a new comprehensive global database of per- and polyfluoroalkyl substances (PFASs). ENV/JM/MONO
- [21] E. F. Houtz and D. L. Sedlak: Oxidative Conversion as a Means of Detecting Precursors to Perfluoroalkyl Acids in Urban Runoff. Environmental Science and Technology 46 (17) (2012): 9342–9349
- [22] Screeninganalysen bei Abfall- und Altlastenuntersuchungen [Screening analyses in waste and contaminated-site investigations]. BAFU, Bern 2013
- [23] W. Schulz, T. Lucke et al.: Non-Target Screening in Water Analysis – Guideline regarding the application of LC-ESI-HRMS for Screening (2019). Download at www.wasserchemische-gesellschaft.de
- [24] DIN 3599:2022-02
- [25] The merry notion that organic compounds in solid samples could be measured accurately and sensitively (and both at the same time) should in any case be very sincerely reviewed (S. Uhlig, B. Colson, P. Gowik (2022): Measurement Uncertainty Interval in Case of a Known Relationship between Precision and Mean. Preprints; www.preprints.org/manuscript/202208.0179/v1).
- [26] An ink-jet surfactant and an anti-corrosive for gear oil, not listed in any environmental legislation known to us. An immediate ban of these two would have a similar effect to the Corona measures, and an unlimited continuation of their free emission would render environmental analyses a travesty.
- [27] I. Klingelhöfer, N. Hockamp, G. E. Morlock: Non-targeted detection and differentiation of agonists vs antagonists, directly in bioprofiles of everyday products. Analytica Chimica Acta 1125 (2020): 288–298