
Tag: gas analyzer

Practical Considerations for Quantitative Gas Analysis with Quadrupole Mass Spectrometers


Many factors must be considered when comparing the overall suitability of different quadrupole-based gas analyzers for any given application, and the list can appear daunting and confusing. Much of this confusion stems from inconsistencies in the way different manufacturers define specifications or, in some cases, omit them altogether.

These factors can be categorized into two main areas: (i) inlet interface suitability and (ii) quadrupole mass analyzer suitability. This article aims to remove some of this confusion and define and present those practical specifications which are critical for repeatable and reliable quantitative gas analysis.

The suitability of the inlet and interface determines how well the gas analyzer can capture, condition or transfer the gas sample without altering it and for it to be measured on an appropriate timescale, which could be milliseconds or hours. The inlet and interface can include both the upstream transfer elements and the downstream pumping and gas handling elements.

Quadrupole Mass Spectrometer
Assuming the inlet and interface are properly designed and equal between systems, then the quadrupole mass spectrometer is the critical element determining the overall precision, stability, and detection limits of the gas analyzer. The quadrupole mass spectrometer includes the ionization method, the transmission characteristics, and the quality of the driving electronics.

Precision, stability, and detection limit are often misrepresented in commercial literature. This misrepresentation can be addressed by directly comparing two different classes of quadrupole analyzers: a 6 mm rod diameter, RGA-type instrument, typical of many currently on the market, and a higher-performance 19 mm rod diameter instrument used in more demanding research and industrial applications. The two systems are compared under nominally identical inlet/transfer conditions, so that only the mass spectrometer performance is under consideration. This presents a direct comparison of the practical range of precision, stability, and detection limit in each case, so that potential users of this powerful analytical technique are better equipped to make meaningful comparisons between suppliers.

The MAX300-CAT is typical of the high-end RGA based gas analyzers, based upon 6mm quadrupole rod technology, whereas the MAX300-LG is a higher performing analyzer based on 19mm quadrupole rod technology and more sophisticated electronics.

Detection Limit Comparison
The specified figure of detection limit can be very misleading. Often it will be a calculated figure, or it may reflect data that has been averaged and smoothed for long periods of time to give a best possible case which is often not achievable in practical situations. Nonetheless, the ultimate detection limit is a good starting point to begin to define the practical capabilities of the analyzer.

Speed of Analysis
Analysis speed is a key factor in quantitative gas analysis. Applications such as catalysis, reaction monitoring or kinetics, and evolved gas monitoring all require faster capture of process changes than QA/QC applications, while a breath measurement application needs to report quantitative differences on the millisecond scale. Note that speed here refers to the analyzer's ability to measure raw signals with the desired level of accuracy and then analyze them within a given timeframe, taking spectral interferences into account, in order to output the result of a single analysis. The rate at which an analyzer scans directly influences data quality: slower scanning or more averaging yields more repeatable results and lower detection limits.

Analysis Precision
Analysis precision (or short-term repeatability) represents the standard deviation of analysis results over short time periods. Repeatability can be improved by slowing analysis scan speed or averaging more scans.
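As a back-of-the-envelope illustration (a simulation of ours, not a vendor specification), averaging improves short-term repeatability roughly as the square root of the number of scans averaged, at the cost of analysis time:

```python
import random
import statistics

def simulated_scan(rng, true_value=100.0, noise_sd=5.0):
    """One raw analyzer reading: the true signal plus Gaussian noise."""
    return true_value + rng.gauss(0.0, noise_sd)

def averaged_reading(rng, n_scans):
    """Average n_scans raw scans into a single reported result."""
    return sum(simulated_scan(rng) for _ in range(n_scans)) / n_scans

rng = random.Random(42)
single = [simulated_scan(rng) for _ in range(500)]
averaged = [averaged_reading(rng, 16) for _ in range(500)]

# Precision (standard deviation) improves roughly as 1/sqrt(n_scans),
# at the cost of a 16x longer analysis time per reported value.
print(round(statistics.stdev(single), 2))    # roughly 5
print(round(statistics.stdev(averaged), 2))  # roughly 5/4
```

Averaging 16 scans cuts the scatter by about a factor of four, which is exactly the precision-versus-speed trade-off described above.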

Analysis Stability
Analysis stability is a representation of drift or fluctuations over long-term data collection. It is a critical factor influencing longer analyses such as process control, slow-heating TGA and thermal analysis, and air monitoring, but it also impacts general instrument operation. Good stability provides accurate results over time, reduced calibration frequency, and confidence in the day-to-day repeatability of the analyzer.

Dynamic Range
Large dynamic measurement range is an essential requirement of quantitative gas analysis and becomes especially apparent in applications such as solvent drying, where species must be monitored from high to low concentrations with accuracy and repeatability.

The MAX300-CAT, a high-end RGA-based gas analyzer using 6 mm quadrupole rod technology, demonstrates detection limits as low as approximately 5 ppb at slow scan speeds. Increasing the scan speed to a typical quantitative analysis rate of 2 seconds per component raises the detection limit to 0.5 ppm; 2 seconds per component is also this instrument's maximum speed for quantitative scans. While scan speed changes the instrument's precision, the stability remains constant. At this rate, the dynamic range of the MAX300-CAT allows for an analysis range from 1×10⁻⁶ to 5×10⁻¹³ Torr (100% down to 0.5 ppm).

The MAX300-LG, a higher-performing analyzer based on 19 mm quadrupole rod technology and more sophisticated electronics, displays extremely low detection limits of <1 ppb at slow scan speeds. Increasing the scan speed to a typical quantitative analysis rate of 400 milliseconds per component results in only a moderate increase of the detection limit, to <10 ppb; the maximum speed is 5 milliseconds per component in quantitative scans. This instrument offers exceptional precision and stability, a result of combining the large quadrupole with high-performance electronics. Its dual-detector setup yields a very large dynamic range of 1×10⁻⁶ to <1×10⁻¹⁴ Torr (100% down to <10 ppb) while scanning at 400 milliseconds per component.
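The Torr figures quoted for both instruments map onto concentrations via the simple ratio of the species partial pressure to the pressure corresponding to 100%. A quick sketch of that conversion, using the pressures quoted above:

```python
def concentration_ppm(partial_pressure_torr, full_scale_torr=1e-6):
    """Mole fraction in ppm, taking full_scale_torr as the 100% level."""
    return partial_pressure_torr / full_scale_torr * 1e6

# MAX300-CAT floor at its quantitative scan rate: 5e-13 Torr vs 1e-6 Torr
print(concentration_ppm(5e-13))         # ~0.5 ppm
# MAX300-LG floor: 1e-14 Torr, i.e. ~0.01 ppm = 10 ppb
print(concentration_ppm(1e-14) * 1000)  # ~10 ppb
```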


Using a Mass Spec in Semiconductor Fabrication


Ultra-pure gases are a necessity for semiconductor device fabrication and the continuous monitoring of bulk gas purity can ensure maximum production. Contamination is costly. Semiconductor manufacturers need the ability to continuously verify the purity of process gases in real-time and detect trace contamination at concentrations in the low parts-per-trillion (ppt).

Our ultra-high purity gas analyzers have the speed, sensitivity, and ease-of-use to continuously monitor Nitrogen, Argon, Helium, Oxygen, and Hydrogen supply streams and rapidly report ppt-level contamination to protect the electronics fabrication process. The Process Insights VeraSpec APIMS combines Atmospheric Pressure Ionization (API) technology with a high-performance mass spectrometer optimized over five decades in industrial gas analysis. Process Insights is the only mass spectrometer manufacturer in the world that utilizes a 19mm, tri-filter quadrupole mass filter in semiconductor gas analysis for the very best performance, reliability, and uptime.


  • Confident supply of UHP production gases
  • One analyzer for all contaminants
  • Fully automated, real-time contamination alerts
  • Reliable 24-7 process protection
  • Maximized wafer yields

Atmospheric pressure ionization is a technique that gives a mass spectrometer the very highest sensitivity for trace gas analysis in UHP samples. A corona discharge needle is used to ionize the molecules of the bulk gas sample. The resulting ions readily transfer their charge to contaminant molecules with lower ionization potentials. The approach yields ionization efficiencies approaching 100%, ensuring exceptional detection limits.

VeraSpec APIMS for Continuous Semiconductor Bulk Gas Purity Verification

While APIMS allows for high ion currents, resulting in low detection limits, the technique is limited to species whose ionization energy is less than that of the bulk gas, or components with sufficient proton affinity to be ionized. The VeraSpec APIMS system combines both EI and API ionization sources. Having two ionization techniques allows for the complete analysis of all components in the pure gas sample with one system.
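The charge-transfer rule described above (only species with an ionization energy below that of the bulk gas are ionized directly by APIMS) can be sketched with tabulated first ionization energies. The helper function below is illustrative, not part of any instrument software; the values are standard NIST figures, rounded:

```python
# First ionization energies in eV (NIST values, rounded)
IE_EV = {
    "N2": 15.58, "Ar": 15.76, "He": 24.59, "H2": 15.43,
    "O2": 12.07, "H2O": 12.62, "CO": 14.01, "CO2": 13.78, "CH4": 12.61,
}

def api_detectable(bulk_gas, contaminants):
    """Contaminants whose ionization energy lies below the bulk gas's can
    accept charge from the bulk-gas ions and be seen by APIMS directly."""
    bulk_ie = IE_EV[bulk_gas]
    return [c for c in contaminants if IE_EV[c] < bulk_ie]

# In a nitrogen matrix, O2, H2O, CO, CO2 and CH4 charge-transfer readily,
# but He and Ar do not: they need the EI source instead.
print(api_detectable("N2", ["O2", "H2O", "CO", "CO2", "CH4", "He", "Ar"]))
```

This is precisely why a dual EI/API source matters: in an N2 matrix, noble gases like He and Ar fall outside the API charge-transfer window.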

The Questor5 process control software that drives the VeraSpec APIMS System is designed for continuous gas monitoring in a process environment. The intuitive web-based interface allows the user to check instrument status, review data, or run an acquisition from anywhere on the network, while maintaining government and industry security standards for login and electronic record keeping.


Validating CRDS for Moisture Analysis in Medical Oxygen


Medical oxygen is one of the most commonly used gases in the healthcare industry, from giving O2 to critical care patients, providing the basis for anesthesia, to supplementing O2 to patients with chronic lung diseases, such as COPD.

To ensure that the oxygen meets the necessary quality to prevent harm to patients, strict standards set limits for a variety of possible impurities in the gas, one of them being water vapor (H2O). One of the most common standards for medical oxygen is the European Pharmacopoeia (EP) standard. Here we demonstrate analytical equivalency between Process Insights' Tiger Optics' Spark Cavity Ring-Down Spectroscopy (CRDS) analyzers and traditional electrolytic moisture analyzers, so that the Spark can be used as a more modern and powerful alternative.

Proving Equivalency to European Pharmacopoeia
The EP standard dictates that the maximum water vapor content in medical and pharmaceutical grade gas must be less than 67 parts per million (ppm), and the recommended method for analysis of moisture content in medical gases is electrolytic-based sensors. Since this standard was published in 1999, gas manufacturers have significantly improved their process efficiency, resulting in considerably higher purity product; at the same time, the state of the art in analytical technologies for moisture measurement has evolved. The combination of improved analytical capabilities and higher purity product creates an opportunity for gas manufacturers to maximize the return on oxygen by qualifying it for multiple uses in a single validation step.

Based on powerful, proven CRDS, the Process Insights' Tiger Optics Spark H2O offers a wide dynamic range, from single-digit parts-per-billion to one thousand ppm for analysis of moisture in oxygen. This low-cost, fast and accurate analyzer features self-zeroing and auto-verification, eliminating the need for field calibration and saving time & money on labor and consumables. In addition to qualifying oxygen, the same analyzer can service nitrogen, argon, helium, hydrogen, clean dry air, and many other gases and mixtures. In support of the proposed use of the Tiger Optics Spark for qualification of medical oxygen, we present the following validation data, demonstrating equivalency in accuracy of the Spark H2O with two EP-approved electrolytic moisture analyzers.

The Tiger Optics Spark analyzer allows accurate measurement of moisture in oxygen to within ±4% or 6 ppb, whichever is greater, as demonstrated in the present validation data. It thereby demonstrates equivalency with the European Pharmacopoeia standard, which mandates a relative accuracy of better than ±20%. The Spark also affords a significant performance advantage over the incumbent electrolytic-based sensors, including lower detection limits, wider dynamic range, higher accuracy, and faster speed of response. This allows for better throughput and simplified product qualification, ultimately saving end-users time and money. The ability to conduct one-step qualification of pure oxygen for multiple applications provides significant additional value.
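The two accuracy figures above (the Spark's ±4%-or-6-ppb specification and the EP's ±20% relative requirement) reduce to simple arithmetic. A small illustrative sketch; the function names are ours, not Tiger Optics':

```python
def spark_uncertainty_ppb(reading_ppb):
    """Stated Spark H2O accuracy: +/-4% of reading or 6 ppb, whichever is greater."""
    return max(0.04 * reading_ppb, 6.0)

def meets_ep_accuracy(reading_ppb, ep_relative_limit=0.20):
    """Check the spec against the EP +/-20% relative-accuracy requirement."""
    return spark_uncertainty_ppb(reading_ppb) / reading_ppb <= ep_relative_limit

print(spark_uncertainty_ppb(1000.0))  # the 4% term dominates here
print(spark_uncertainty_ppb(50.0))    # the 6 ppb floor dominates here
print(meets_ep_accuracy(67000.0))     # at the 67 ppm EP limit
```

At the 67 ppm EP limit the Spark's relative uncertainty is about 4%, comfortably inside the ±20% the standard requires.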


A Single-Supplier Solution For Impurity Monitoring In Fuel-Cell Hydrogen


Global efforts to reduce the impact of harmful emissions on the environment have increasingly focused on lowering carbon emissions.  The key to meeting this challenge lies in replacing fossil fuels with alternative, renewable fuel sources, particularly to power vehicles.

Fuel cells offer a uniquely flexible solution in this market and can be used for a wide range of applications, powering systems from laptop computers to utility power stations.  The move to a hydrogen economy is widely regarded as the next step in the global transition towards a zero-emission energy sector. A hydrogen fuel cell uses chemical energy to cleanly and efficiently produce electricity; the only byproducts are water and heat. It can also be combined with electric motors to power a zero-emission vehicle.

While most hydrogen (H2) today is still produced from fossil sources, an established H2 infrastructure allows a future seamless transition to renewable and truly carbon-free H2 production. Fuel cells are also far more efficient than conventional combustion engines while offering a similar range (typically, about 500-700 km). H2 can also be refilled quickly at a fueling station, avoiding the delays of charging a battery-electric vehicle.

However, the performance of a fuel cell is dependent on the purity of the hydrogen. There are multiple impurities that can affect the fuel cell, and their presence and concentration levels depend largely upon the method used to generate the H2.  For example, most hydrogen is produced through steam methane reforming. This process can generate several contaminants ranging from methane and moisture to carbon monoxide and carbon dioxide (CO2). If H2 is created through electrolysis, splitting water into hydrogen and oxygen, then moisture and oxygen are the most common contaminants. Additionally, many impurities can come from the atmosphere, mostly nitrogen, oxygen, and moisture.

Eliminating impurities from H2 altogether is not practical. Maintaining an efficient fuel cell requires the presence of these contaminants to be limited to specific levels, set by international purity standards such as ISO 14687 or SAE J2719.

It is essential to monitor the wide range of impurities listed in the table above to prevent performance being compromised, or worse, irreparable damage to the fuel cell.  In the past, a monitoring solution for H2 purity involved a complex setup using as many as seven different analyzers from multiple providers, with no integration.  A typical arrangement requires:

  • A gas chromatograph for total sulfur and total halogenates
  • Another gas chromatograph for helium, nitrogen & argon, and methane/THC
  • An electrochemical analyzer for oxygen
  • An FTIR to measure CH2O2, CO and CO2
  • Three separate CRDSs, measuring water, ammonia, and CH2O

Process Insights provides a full portfolio of solutions for monitoring hydrogen purity throughout its supply chain, from production, through transport and storage, to fueling.  The contaminant detection limits of these instruments are ideally suited to qualify H2 for compliance with global purity standards.  The real benefit, however, lies in Process Insights’ total solution.  Only three analyzers are required, and these can be fully integrated into one single-provider system.

Our total solution:

  1. Prismatic 3 multi-species CRDS Analyzer – for CO, CO2, water, and ammonia
  2. MAX300-LG Mass Spectrometer – for measurements of He, N2, Ar, CH4, O2, CH2O, and CH2O2
  3. Sulfur & THC Multi-Species GC Analyzer – for total sulfur and total hydrocarbons

The Prismatic 3 multi-species Cavity Ring-Down Spectroscopy (CRDS) analyzer allows simultaneous detection of up to four impurities with wide dynamic ranges, delivering simple, real-time, direct measurements. Highly reliable and with low maintenance requirements, the Prismatic 3 is ideal for remote operations and has a low cost of ownership. Tiger Optics' analyzers are used to support fuel-cell hydrogen quality control by organizations including the Korean Gas Safety Corporation (KGSC) and California's Division of Measurement Standards, and are approved by ISO.

The MAX300-LG is a compact benchtop mass spectrometer that delivers complete, real-time sample analysis from 100% concentration down to parts-per-billion (ppb) levels. It is supported by our Extrel easy-to-use Questor5 process control software, which is designed for continuous gas monitoring with automated calibrations, analysis sequences, and data outputs.

The combination of Process Insights’ Prismatic 3 analyzer and MAX300-LG mass spectrometer with a Sulfur & THC GC Analyzer to provide the remaining measurements provides a powerful solution for monitoring contaminants in fuel-cell hydrogen.  This system dramatically reduces the number of instruments required for measuring hydrogen samples and reduces the cost of manpower and operation through ease of use and low maintenance needs. It also means operators only deal with a single provider, reducing the complexity of ordering, integration, and usage.


Parts-Per-Trillion Moisture Detection in Electronic-Grade Bulk Gases


The semiconductor market is seeing a new era of innovation, with new applications fueling demand for advanced semiconductor devices. These devices face more stringent requirements on reliability, power-handling capability, and power consumption, while packing more functionality into a smaller package at ever-decreasing technology nodes.

Among these demanding applications are increasingly more powerful smartphones and tablets that aim—at the same time—to improve battery life. More recently, automotive sensor systems, the Internet of Things (IoT), the next generation of wireless communication (5G), and smart power grids have emerged as applications with enormous expansion potential over the coming years and decades. Future self-driving vehicles, for instance, require massive amounts of computing power to process the input from cameras and sensors in real-time; and the necessary high-performance processors must be both reliable and power efficient.

The Demand for Higher-Quality Gases and Better Analytics
To meet the challenges of these new applications, the semiconductor industry's International Roadmap for Devices and Systems (IRDS) outlines manufacturing quality as one key aspect. Semiconductor device manufacturers are therefore implementing more stringent control across all aspects of the manufacturing process, from the cleanroom environment and the wafer processing tools to the raw materials used for production, many of which are gases. Consequently, improved gas quality control is one of the most important measures employed by semiconductor fabs to increase yields and reduce failure rates.

With the need to monitor and ensure stricter and more consistent gas quality comes a demand for more sensitive and accurate analytical technologies. At the same time, speed of response has become more important as well, as fab operators rely heavily on real-time process control.

In many state-of-the-art semi fabs, Cavity Ring-Down Spectroscopy (CRDS) analyzers are the gold standard for ensuring quality of the major bulk gases that are used in the manufacturing process, which are typically N2, CDA, O2, H2, Ar, and He.


HCl Continuous Emissions Monitoring


Hydrogen Chloride (HCl) is a major atmospheric pollutant associated with the combustion of fossil fuels, such as coal and heavy oils, and also with a number of manufacturing processes, including cement production.

HCl in the atmosphere has an adverse effect on both human health and the wider environment. The inhalation of even low concentrations of HCl can cause irritation of the respiratory tract in healthy individuals and exacerbate symptoms associated with conditions such as asthma and emphysema.

Dissolved HCl is a contributor to acid rain pollution, the results of which include damage to building materials and reduced crop yields.  Atmospheric HCl pollution is also a factor in the production of photochemical smog. The economic impact makes the reduction of HCl pollution a priority for regulators and industry. HCl is generated by multiple industrial processes, with combustion of coal and oil for household and industrial power generation as the primary source. Here, chlorides present in the fuel are converted to HCl in the combustion process and emitted with other by-products. In addition, industrial processes emit HCl as a result of chlorides present in raw materials that are converted to HCl during production. In cement production, for instance, raw materials, including calcium carbonate, silica, clays, and ferrous oxides, all contain chlorides, resulting in generation of HCl.

Regulators worldwide dictate strict emissions limits for many atmospheric pollutants, including HCl. In the United States, the Environmental Protection Agency (EPA) has recently reduced emissions limits to further lessen the impact of these pollutants. These limits require HCl emitters to monitor and report the level of the gas present in stack emissions and to take steps to ensure that emissions fall below the specified limits. This may require the emitter either to refine their process, via the use of cleaner fuels, for example, or to add abatement technology downstream of the process to reduce emissions of HCl.

Current analytical methods for HCl CEM applications include GFC/NDIR, FTIR, and cross-stack TDLAS. These methods have, to date, been adequate to monitor HCl emissions, based on existing emissions limits. The detection limits for some of these techniques will not be sufficiently low, however, to meet the revised limits, and so alternative techniques will be necessary.

CRDS gas analysis technology offers the performance and range to cope with these regulations, delivering accurate measurements at levels far below the new limits in diluted stack gas.
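The CRDS measurement principle behind these analyzers relates analyte absorption to the change in the optical cavity's ring-down time: the absorption coefficient is alpha = (1/c)(1/tau - 1/tau0). A minimal sketch with purely illustrative numbers, not instrument specifications:

```python
C_LIGHT = 299_792_458.0  # speed of light in m/s

def absorption_coeff(tau_s, tau0_s):
    """CRDS absorption coefficient (1/m) from ring-down times: with the
    absorber present (tau_s) and for the empty cavity (tau0_s)."""
    return (1.0 / C_LIGHT) * (1.0 / tau_s - 1.0 / tau0_s)

def number_density(tau_s, tau0_s, cross_section_m2):
    """Absorber number density (molecules per m^3) via alpha = sigma * N."""
    return absorption_coeff(tau_s, tau0_s) / cross_section_m2

# Illustrative only: a 20 us empty-cavity ring-down shortening to 19.9 us
alpha = absorption_coeff(19.9e-6, 20.0e-6)
print(alpha)  # a small positive absorption, on the order of 1e-6 per metre
```

Because the measurement depends on decay times rather than absolute intensities, CRDS is inherently self-referencing, which is why these analyzers can operate without field calibration.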


Next-Generation Monitoring of Airborne Molecular Contaminants in Cleanrooms


Airborne Molecular Contaminants in Cleanrooms

The dust-free and controlled environment of a cleanroom is vital to prevent product failure in semiconductor manufacturing, but controlling particulates alone is not sufficient to protect semiconductor devices with structures at the molecular scale. Micro-particles are the main source of physical contamination, but modern air-filtration systems, cleanroom attire, and specialized surface materials have been successful in keeping particle contamination under control.

There are, however, molecular contaminants, which can range from small inorganic molecules like acids and bases, to more complex organic species like VOCs. These molecules are known to cause chemical contamination (i.e. they react with the materials and surfaces of the semiconductor devices to cause oxidation, unintended doping, dislocations, and more).

This class of contaminants is collectively known as Airborne Molecular Contaminants (AMCs), and they are harder to control than particles. Firstly, the molecules are much smaller than particles and require more advanced chemical filtration of the cleanroom air. Secondly, and more importantly, many AMCs are created within the cleanroom itself. For instance, HF is used for wafer cleaning within the process, and NH3 is emitted naturally by any person present in the cleanroom. As feature sizes have decreased over the decades, both devices and equipment have become more and more susceptible to significant damage from AMCs, which can have various detrimental effects: acids, such as HF or HCl, can cause micro-corrosion and accelerate oxidation; bases, such as NH3, can cause hazing and attack coatings on the optics of UV lithography equipment; and hydrides, such as AsH3, PH3, or B2H6, can cause unintended doping, which can dramatically impact a device's functionality.

Monitoring AMCs is Key to Contamination Prevention
Because many AMCs are generated within the cleanroom, they cannot be completely avoided, even though wafers are shielded as much as possible from exposure nowadays, e.g., by transporting them in closed pods (FOUPs) between process steps. Precise monitoring of the cleanroom environment for these molecules has emerged as a key for semi fabs to further mitigate the threat of AMCs. By immediately detecting the presence of harmful molecules, engineering controls can be implemented to protect devices and processes from exposure.

Due to the low concentration of AMCs and the need for fast detection, measurement instruments have to fulfill stringent requirements. As a result, ultra-sensitive laser-based analyzers have emerged as the primary class of instrument used to detect small AMCs, such as HF, HCl, or NH3. Semiconductor fabs require all of these molecules to be measured at levels below 1 part per billion (ppb), with a response within 1-3 minutes to any presence of the molecules. More importantly, the instrument's readings have to return to baseline as quickly as possible after the molecules' presence is eliminated, to minimize delays in the manufacturing process. The International Roadmap for Devices and Systems (IRDS) even calls out detection requirements below 0.1 ppb as part of future technological challenges.

Introducing our NEW T-I Max Next-Generation AMC Monitors
To address this challenge, Process Insights manufactures the TIGER OPTICS™ T-I Max™ series of cleanroom analyzers for detecting AMCs like HF, HCl, and NH3 with unprecedented speed and sensitivity. The T-I Max uses powerful Cavity Ring-Down Spectroscopy (CRDS) technology on an all-new electronic and optical platform that enables lower measurement noise and up to ten times faster measurement rates. This platform takes CRDS to the next level and dramatically lowers detection limits compared to previous-generation analyzers.


Cost-Effective Purity Analysis in the Cryogenic Air Separation Process


Did you know you can improve safety and your process efficiency by using CRDS gas analyzers?   

CRDS analyzers offer many opportunities to improve the air separation process by saving time and money and alerting plant operators quickly in case of unsafe impurity levels.  Key advantages of using CRDS gas analyzers include:

  • Freedom from calibration
  • No consumables or service gases required
  • All solid-state design, no moving parts
  • Plug-and-play, easy to operate
  • Accurate detection of H2O, CO2, CH4, C2H2 and H2
  • Fast speed of response, ideal for process control

Cryogenic Air Separation
Cryogenic air separation units (ASUs) are the gas industry’s workhorses for the production of gaseous and liquid high purity nitrogen, oxygen and argon. The cryogenic process can be modified to manufacture a range of desired products and mixes.

Controlling Impurities to Ensure Safe ASU Operation
Following compression, the air pre-treatment step consists of cooling and purification to remove process contaminants, such as H2O, CO2 and others. The most common purification methods are Temperature Swing Adsorption (TSA), which exploits the difference in adsorption capacity of adsorbents at different temperatures, and Pressure Swing Adsorption (PSA), which operates similarly via pressure variations.


Monitoring Gasification with a Mass Spec Gas Analyzer


Research in the field of biomass gasification is increasingly important as industry continues to find new uses for syngas. At the Energy & Environmental Research Center (EERC) an Extrel MAX300-RTG process mass spectrometer was used to monitor the exit stream of a Fluid Bed Gasifier. The quadrupole mass spectrometer provided fast, quantitative analysis of the syngas composition.

Over the last several years, concern about the economic and environmental impact of traditional fossil fuel combustion and petrochemicals has led to a search for viable alternatives with gasification emerging as a powerful technique for generating fuel and hydrocarbons. The gasification process makes use of materials such as coal, biomass, and waste to produce synthesis gas, or syngas. Syngas is a combustible mixture of hydrogen, carbon monoxide and carbon dioxide that generally contains a small amount of methane and some trace contaminants. Syngas is used as a fuel source to generate power and heat, or converted into products like hydrogen, for use in fuel cells or fertilizer generation, or liquid fuels via a Fischer-Tropsch reaction.

Gasification and chemical processes utilizing syngas rely upon the ability to obtain information about the composition of the gas stream exiting the reactor. The MAX300-RTG is a 7th generation process mass spectrometer capable of performing quantitative analysis on a wide variety of compounds at concentrations ranging from 100% down to 10 ppb. The 19 mm quadrupole mass filter used by the system allows for high analytical repeatability and long-term stability.

The MAX300-RTG demonstrated that it has the flexibility to quickly characterize and quantify syngas mixtures. It has the sensitivity to detect trace components at ppm levels and below, and the speed to perform each measurement in under 0.4 seconds. The ability to analyze the complete array of syngas components exiting the gasifier, from 100% down to ppm levels, makes the MAX300-RTG an instrument capable of replacing complicated analysis systems involving multiple devices and technologies. The speed of the mass spectrometer means that the MAX300-RTG can be automated to monitor gas composition at several sample points, delivering a complete set of concentrations at 20 seconds per point.

At the EERC, additional sampling at the ports downstream of the reactor could yield important insight into the operation and efficiency of the fixed beds, or be used to analyze hydrogen membrane separation or a Fischer-Tropsch product. The speed and flexibility of the MAX300-RTG, combined with the capability to run 24/7 in rugged and hazardous industrial environments, make it ideal for monitoring production-scale gasification and any associated chemical processes downstream. At large facilities that utilize syngas, like ammonia plants, the MAX300-RTG and its predecessors have set the standard for analyzer automation and process control over the last several decades.


Thermogravimetric Analysis/Mass Spectrometry (TGA-MS)


An Extrel MAX300-EGA was coupled with a NETZSCH TG 209 F1 Libra to Perform Evolved Gas Analysis

The heated transfer line of the MAX300-EGA™, a quadrupole mass spectrometer designed for evolved gas analysis, was connected to the off-gas port of a NETZSCH® TG 209 F1 Libra® thermobalance. A variety of samples were analyzed and the combination of the two technologies allowed for simultaneous thermal characterization and quantitative analysis of the compounds in the furnace exhaust.

Thermogravimetric analysis (TGA) is a powerful technique that has been used for many years to characterize solid and liquid samples. The mass of the sample material is monitored while it is heated. By using a high precision balance and carefully controlling the heating process, researchers are able to plot mass loss as a function of temperature. TGA is widely used in the study of polymers, pharmaceuticals and petrochemicals to determine degradation temperatures, characterize thermal decomposition, and monitor solvent and moisture content.
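The mass-loss-versus-temperature curve described above is commonly differentiated (derivative thermogravimetry, DTG) to locate decomposition steps. A toy sketch of that calculation, using made-up numbers rather than measured results:

```python
def dtg(temperatures_c, masses_mg):
    """Derivative thermogravimetry: finite-difference mass-loss rate
    (mg per degree C) over each interval of a TGA temperature/mass series."""
    rates = []
    for i in range(1, len(masses_mg)):
        d_temp = temperatures_c[i] - temperatures_c[i - 1]
        d_mass = masses_mg[i] - masses_mg[i - 1]
        rates.append(d_mass / d_temp)
    return rates

# Toy data: a single mass-loss step centred near 300 C
T = [100, 200, 250, 290, 310, 330, 400, 600]
m = [0.94, 0.94, 0.93, 0.90, 0.60, 0.20, 0.05, 0.05]
rates = dtg(T, m)

# The steepest (most negative) rate marks the main decomposition step:
step_index = min(range(len(rates)), key=lambda i: rates[i])
print(T[step_index + 1])  # temperature at the fastest mass loss
```

Pairing this DTG peak with the mass-spectrometer trace recorded at the same temperature is what lets EGA assign a molecular identity to each weight-loss step.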

Additional information about sample composition and thermal behavior can be obtained by analyzing the gases that leave the material as it is heated. This allows the researcher to determine not only the temperature at which a mass loss occurs, but also the molecular structures involved. Evolved Gas Analysis (EGA) is commonly carried out via a variety of analytical techniques, but in all cases the integrity of the gas stream must be protected. It must be kept hot and moved quickly to the gas analyzer to prevent condensation and chemical interactions.

The NETZSCH TG 209 F1 Libra is a vacuum-tight TGA, making it ideal for coupling to a mass spectrometer. The Libra is equipped with an automatic sample changer, can reach temperatures up to 1100°C, and measures sample mass to a resolution of 0.1 μg. The Libra's heated adapter was connected to the transfer line of the Extrel MAX300-EGA. The interface is differentially pumped for rapid clearing and heated to 200°C to prevent condensation; it provides a low-volume, chemically inert sample path from the TGA all the way into the mass spectrometer's ionizer.

The MAX300-EGA is a quadrupole mass spectrometer optimized for evolved gas analysis in a laboratory setting. It scans from 1 to 500 amu and features the Extrel 19 mm mass filter for high analytical repeatability and long-term stability. The Questor5 software allows the system to perform qualitative analysis for sample characterization, or quantitative analysis, measuring concentrations from 100% down to 10 ppb. In addition to the transfer line, the MAX300-EGA is equipped to import a start-of-heating signal from the TGA, and it can be configured to perform calculations and trend the data, or to export the data for viewing and manipulation on another platform.
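As a simplified illustration of how quantitative concentrations are recovered from overlapping mass spectra, the calibration-matrix approach commonly used in quadrupole gas analysis can be sketched as follows. The sensitivity values and ion currents below are invented for the example and are not Questor5 calibration data.

```python
import numpy as np

# Rows: ion current at m/z 28, 44, 2.
# Columns: relative sensitivity of CO, CO2, H2 at that m/z.
# CO2 also fragments to m/z 28 (~11% here), so the channels overlap.
S = np.array([
    [1.00, 0.11, 0.00],   # m/z 28: CO base peak + CO2 fragment
    [0.00, 1.00, 0.00],   # m/z 44: CO2 base peak
    [0.00, 0.00, 1.00],   # m/z 2:  H2 base peak
])

i_measured = np.array([0.311, 0.100, 0.500])   # ion currents, arbitrary units

# Solve S @ c = i_measured for the component concentrations (least squares).
c, *_ = np.linalg.lstsq(S, i_measured, rcond=None)
# c -> CO, CO2, H2 mole fractions once the fragment overlap is removed.
```

The least-squares form also handles over-determined cases, where more m/z channels are monitored than there are components.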

Polystyrene Decomposition: Detection of High-Mass Fragments

The furnace of the Libra was loaded with 0.94 mg of polystyrene and heated to over 600°C. The breakdown of the sample was monitored to determine the MAX300's sensitivity to the small signals generated by high-mass hydrocarbons in the off-gas. Although the TGA records the decomposition of the polystyrene as a single weight loss beginning at 290°C, the MAX300 shows that several distinct compounds evolved over that range.

It is generally difficult to keep larger molecules from dropping out of an evolved sample once it has left the furnace, but the mass spectrum at 39.75 minutes clearly shows the presence of styrene in the off-gas (Fig. 3B), as well as the much smaller signal generated by methyl styrene.

Calcium Oxalate Decomposition: Quantitative Separation of Evolved Gases

In a second experiment, a calcium oxalate sample was heated and the mass of each evolved component was calculated for comparison to data from the TGA's balance. Even the relatively small, 60 μg loss that occurred as moisture left the sample was easily measured and quantified by the mass spectrometer. The MAX300 was also able to individually determine the amounts of carbon monoxide and carbon dioxide that, combined, produced the second mass loss. While the thermal breakdown of calcium oxalate is well documented, the ability of the MAX300 to perform this kind of quantitative separation can be used to better understand a complex decomposition featuring the simultaneous evolution of multiple unknown compounds.
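To illustrate the kind of cross-check described above, an evolved-gas concentration trace can be converted into a mass for comparison with the balance, assuming a known purge-gas flow rate. All numbers below are synthetic; the water peak is merely scaled to land near the 60 μg figure mentioned in the text.

```python
import numpy as np

M_H2O = 18.015        # g/mol, molar mass of water
V_M = 24465.0         # mL/mol, molar volume of an ideal gas at 25 °C, 1 atm
FLOW_ML_MIN = 20.0    # assumed purge flow through the furnace, mL/min

# Synthetic water concentration peak from the mass spectrometer, in ppm.
t_min = np.linspace(0.0, 10.0, 601)                       # time, minutes
c_ppm = 1630.0 * np.exp(-0.5 * (t_min - 5.0) ** 2)

# Evolved mass = integral of (mole fraction) x (volumetric flow / molar volume) x M dt
dt = t_min[1] - t_min[0]                                  # minutes per sample
mol_per_min = (c_ppm * 1e-6) * FLOW_ML_MIN / V_M          # mol/min of H2O
mass_ug = mol_per_min.sum() * dt * M_H2O * 1e6            # micrograms
```

Integrating each component's trace this way gives a per-species mass that can be checked against the single combined loss recorded by the balance.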

Further Applications for the MAX300-EGA

The data gathered from the effluent of the TG 209 F1 Libra indicates that the MAX300-EGA is a powerful tool for evolved gas analysis. The sensitivity, resolution, and quantitation demonstrated during these tests point to the instrument's potential for other evolved-gas applications. In its standard configuration, or equipped with the 300 or 400°C transfer line upgrades, the MAX300-EGA could be used to quantify solvent loss in a pharmaceutical sample, detect trace VOCs, or monitor the gas exiting a microreactor.
