Rapid Nondestructive Spectral Imaging Technologies for Online Food Safety Inspection

Moon S. Kim, Kevin Chao, and Alan M. Lefcourt

Agricultural Research Service (ARS), United States Department of Agriculture (USDA), Bldg. 303, BARC-East, 10300 Baltimore Avenue, Beltsville, MD 20705, USA

Foodborne illness presents a public health challenge in the USA. There is an urgent need for the federal government and the food industry to expand efforts to prevent any food contamination that could potentially be harmful to human health. The Food Safety Laboratory (FSL), ARS, USDA, is one of the leading laboratories for the development of optoelectronic sensing technologies and methodologies, having successfully demonstrated several cutting-edge systems for detection and inspection of food quality, safety, sanitation, and security. The sensing technologies and systems include Raman, fluorescence, and visible/near-infrared reflectance spectroscopies, as well as hyperspectral and multispectral imaging. A brief overview of the FSL approaches to food safety research and development is presented, along with applications of rapid hyperspectral and multispectral image-based online safety inspection for apples and chicken carcasses.

Development of a Field-Portable Nucleic Acid-Based Sensor

Carl Batt, Scott Stelick, Clarissa Lui, and Matthew Kennedy

Cornell University, 312 Stocking Hall, Ithaca, NY 14853, USA

We have developed a portable breadboard instrument and the associated protocols, creating a system for the detection of pathogens in food products. Existing detection techniques are not adequate to allow for the timely detection of pathogenic microorganisms in the food supply and, more importantly, require significant operator assistance. The instrument that we have developed has automatic fluid and thermal control as well as integrated fluorescence detection, all incorporated into a desktop platform. Homogeneous detection of target is accomplished in “real time” by continuously monitoring the fluorescence during PCR using the DNA binding dye SYBR Green or, alternatively, the TaqMan 5′ nuclease system; the latter allows for multiple-target (and internal control) integration. Nucleic acid purification and PCR amplification/detection have been combined and optimized on a single monolithic silicon microchip, with magnetic bead capture in a separate but linked region of the chip. Optical detection is being advanced to eventually incorporate the simultaneous detection of multiple emission wavelengths. Detection levels of less than 10² Bacillus anthracis cells are routinely obtained with a total detection time of less than one hour. The robust nature of this system has been demonstrated by the development of detection systems for Leishmania, Staphylococcus, Listeria, and other target pathogens.

Optimization of Peak Capacity Productivity in LC-LC through Design of High-Speed Gradient Elution Chromatography

Peter W. Carr, Dwight R. Stoll, Xiapoing Li, and Xiaoli Wang

Department of Chemistry, University of Minnesota, 207 Pleasant St., Minneapolis, MN 55455, USA

A new approach to high-speed comprehensive 2DLC (LCxLC) based on the use of ultrafast high-temperature gradient elution reversed-phase chromatography is described. Entirely conventional gradient elution instrumentation and columns are assembled into a system that develops a total peak capacity of about 2000 in 30 minutes; this is equivalent to nearly 1 peak/second. Each first-dimension peak is sampled two or three times, as evidenced by the presence of the corresponding peaks in two or three consecutive chromatograms from the second-dimension column. Application to the separation of the low-molecular-weight components of wild-type and mutant maize seedlings indicates the presence of more than 200 peaks in a 2D separation carried out on a time scale of 30 minutes. Compelling illustrations of the analytical prowess of fast high-temperature 2DLC are evident in the clear presence of seven distinct peaks in a single second-dimension chromatogram from a single, quite narrow first-dimension peak, and in the great power of 2DLC to solve the “analytic dynamic range” problem inherent in the measurement of small peaks neighboring a gigantic peak. A number of problems remain, including the general question of optimizing peak capacity per unit time, choosing the right pairs of columns, and achieving desirable baseline characteristics. The major challenge will be how to deal with the huge amount of data generated in a short period of time and turn it into information relevant to the problem at hand. Applications to the separation of a variety of complex mixtures (coffee, wine, urine) will be shown. The principal novel finding is that, in as short a time as 10 minutes, 2DLC produces higher peak capacity, and more actual peaks in a real mixture become evident, than fully optimized gradient elution 1DLC.
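
As a back-of-the-envelope check on the figures quoted above, the peak-capacity arithmetic of comprehensive LCxLC can be sketched in a few lines; the per-dimension capacities below are illustrative assumptions, not values reported by the authors.

    # Illustrative peak-capacity arithmetic for comprehensive 2DLC (LCxLC).
    # The per-dimension capacities are assumed for illustration only.

    first_dim_peak_capacity = 40    # assumed peak capacity of the slow first dimension
    second_dim_peak_capacity = 50   # assumed peak capacity of each fast second-dimension run
    analysis_time_min = 30.0        # total analysis time

    # In the ideal case, the 2D peak capacity is the product of the two dimensions.
    total_peak_capacity = first_dim_peak_capacity * second_dim_peak_capacity

    # Peak-capacity production rate (units of capacity generated per second of analysis).
    peaks_per_second = total_peak_capacity / (analysis_time_min * 60.0)

    print(f"2D peak capacity ~ {total_peak_capacity}")           # ~2000
    print(f"production rate ~ {peaks_per_second:.2f} peaks/s")   # ~1 peak/s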

Making Data Available to Clients in Real Time

Jon S. Kauffman

Lancaster Laboratories, 2425 New Holland Pike, Lancaster, PA 17605, USA

A secure extranet application has been developed to serve our pharmaceutical clients. This browser-based utility will allow clients to view sample information, reports, and test data online as soon as results have been authorized. This includes scanned copies of actual notebook pages associated with authorized tests as well as chromatographic data. The application was designed in conjunction with our clients to maintain security, facilitate the flow of information, and provide unlimited access to data in a timely manner.

On-Chip Liquid Chromatography Using In-Channel Electrochemical Detection

Pei Ling Leow, Bhavik A. Patel, and Danny O'Hare

Imperial College London, 438 Bessemer Building, South Kensington, London SW7 2AZ, UK

A micro total analysis system (µTAS) is an analytical device that miniaturizes laboratory functions within one device. These devices are often used in environmental analysis, genomics, proteomics, and biomedical analysis. Reducing the size of the analytical system reduces analyte consumption, improves throughput, and enables inexpensive mass production of microanalytical instruments [1]. In addition, µTAS provides the possibility of performing separations and detection within a single device. To date, detection methods such as spectroscopic, electrochemical, and electrochemiluminescence detection are widely used as end- or post-column detectors [2, 3].

We are investigating the use of in-channel electrode detection for monitoring the progress of separations within the channel, which will aid in improving our understanding of chromatographic separations. We have been using electrochemical detection for in-channel detection as it has excellent limits of detection (10⁻¹⁵ mol dm⁻³) compared to spectroscopic methods, which scale linearly with path length.

An array of gold wires is embedded on a polyethylene terephthalate (PET) wafer (Hanyang University, South Korea), and eight pairs of the printed gold wires serve as the in-channel electrochemical microelectrodes. A 500 µm wide, 3 cm long polyester microchannel is bonded perpendicularly on top of the PET wafer across the gold wires. The channel will be packed with a silica-based stationary phase. Studies of the packed and unpacked channel will be carried out, and the progress of the separation will be monitored using in-channel electrochemical detection.

Simultaneous Determination of Ascorbic Acid and Hydrogen Peroxide Using Layered Metallopolymer and Protein Films

Amos Mugweru

Rowan University, 201 Mullica Hill Road, Glassboro, NJ 08028, USA

Hydrogen peroxide is one form of reactive oxygen species and is highly toxic to cells. It causes oxidative stress and is a marker of many kinds of pathological conditions. Ascorbic acid is important as an antioxidant. In this work, a film made using a layer-by-layer method was assembled using hemoglobin and poly[4-vinylpyridine Os(bipyridine)2Cl]-co-ethylamine (POs-EA). The hybrid film will be used to simultaneously determine the concentrations of ascorbic acid and hydrogen peroxide. The film formation will be ascertained by electrochemistry, the quartz crystal microbalance, and other spectroscopic methods. Determination of hydrogen peroxide using this sensor does not interfere with determination of ascorbic acid. A linear dependence of peak current on concentration was obtained for both ascorbic acid and hydrogen peroxide. The limits of detection of this sensor for both analytes will be presented.
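
Since the sensor response is described as a linear dependence of peak current on concentration, a minimal calibration sketch may help fix ideas; the concentrations, currents, and blank noise below are hypothetical, and the detection limit uses the common 3σ/slope convention rather than the value determined for this film.

    import numpy as np

    # Hypothetical calibration: peak current (µA) versus analyte concentration (µM).
    conc = np.array([0.0, 10.0, 20.0, 50.0, 100.0])     # µM, assumed
    current = np.array([0.05, 0.52, 1.01, 2.48, 5.02])  # µA, assumed

    slope, intercept = np.polyfit(conc, current, 1)

    # Common convention: LOD = 3 * (standard deviation of the blank) / slope.
    blank_sd = 0.02  # µA, assumed noise of repeated blank measurements
    lod = 3 * blank_sd / slope

    print(f"sensitivity = {slope:.3f} µA/µM, LOD ≈ {lod:.2f} µM")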

Collection, Real-Time Imaging, and Storage of Pancreatic Islet Secretions Using Droplet-Based Microfluidics

Christopher J. Easley, Jonathan V. Rocheleau,  W. Steven Head, and David W. Piston

Vanderbilt University, 747 Light Hall, 21st Avenue South, Nashville, TN 37232, USA

Insulin is stored in the secretory granules of beta cells in pancreatic islets, where it is cocrystallized with zinc ions in a 6:2 ratio (insulin:Zn²⁺) (see Figure 1). Upon secretion from normal islets, insulin and zinc ions are released into the extracellular space in this 6:2 ratio. In order to image real-time insulin and zinc secretion from islets, droplet microfluidics is utilized to collect secretions with minimal dilution. Through a novel approach combining lock-in spatial filtering and droplet fluidics, highly sensitive measurements are made via confocal fluorescence microscopy. Using a competitive zinc-binding assay between EDTA and the fluorescent zinc indicator FluoZin-3, this approach results in sensitive and quantitative real-time imaging of zinc as it is secreted from live pancreatic islets during glucose-stimulated insulin secretion (GSIS). Furthermore, time traces of droplet-confined secretions can be stored in tubes and later used to quantify insulin using radioimmunoassays (RIAs) or enzyme-linked immunosorbent assays (ELISAs). We are utilizing these methods to investigate insulin storage defects that occur in diabetic mice, particularly those with mutations of the SLC30A8 gene, which encodes the zinc transporter ZnT-8 present exclusively in secretory vesicles of pancreatic beta cells. Deletion of this gene leads to impaired glucose tolerance in knockout mouse models.
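
For readers unfamiliar with fluorescent zinc indicators, the sketch below shows how a simple 1:1 binding isotherm maps the fraction of bound indicator to free Zn²⁺; the dissociation constant is an assumed, order-of-magnitude value, and the competition with EDTA used in the actual assay is deliberately omitted for simplicity.

    # Minimal sketch of a 1:1 zinc indicator response. The Kd is an assumed value
    # for illustration, not a calibration constant from this work, and EDTA
    # competition is ignored.

    KD_NM = 15.0  # assumed indicator/Zn2+ dissociation constant, nM

    def fraction_bound(free_zn_nM):
        """Fraction of indicator in the Zn-bound (bright) form for 1:1 binding."""
        return free_zn_nM / (KD_NM + free_zn_nM)

    def free_zn_from_fraction(f):
        """Invert the isotherm: free Zn2+ (nM) from the measured bound fraction."""
        return KD_NM * f / (1.0 - f)

    # Example: a droplet in which 40% of the indicator is zinc-bound.
    print(f"free Zn2+ ≈ {free_zn_from_fraction(0.40):.1f} nM")  # ≈ 10 nM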

Integration and Optimization of Hardware and Software for a Differential Flow-Modulated Comprehensive Gas Chromatography System

James D. McCurry, Roger L. Firor, Chun-xiao Wang, and Michael J. Feeney

Agilent Technologies, 2850 Centerville Road, Wilmington, DE 19808, USA

Differential flow-modulated GCxGC, as developed by Seeley, has been shown to provide a simple and robust alternative to thermally modulated systems [4]. This paper will present a differential flow GCxGC system where the modulator hardware has been integrated into a simple monolithic device using a new approach called capillary flow technology. Further, the key pneumatics hardware, electronics hardware, and software have been optimized around this modulator to provide a completely integrated system for easy operation and method development. Test results will be presented along with some key applications that will demonstrate system performance.

Achieving Optimum UHPLC Column Performance by Measuring and Reducing Overall System Dispersion

Richard Henry,  Hillel Brandes, and Russel Gant

Supelco, 595 North Harrison Road, Bellefonte, PA 16823, USA

The rapid introduction of ultra-high performance LC (UHPLC) columns containing particles that are smaller than 3 µm has created very narrow peak widths that can no longer be faithfully measured by all HPLC instruments. In addition, Fused-Core particles have become available, which can deliver peak widths comparable to sub-2 µm particles with flow resistance comparable to 3 µm particles. Their moderate operating pressure allows UHPLC performance to be achieved with older and traditional instruments that demonstrate adequately low instrument dispersion or bandwidth; however, users need a simple way to qualify HPLC instruments, other than by pressure rating, for use with UHPLC columns.

In this paper, origins of peak dispersion or bandspreading will be examined in detail, including (1) dispersion within the column particle bed, (2) dispersion from column elements such as fittings and frits, and (3) extra-column dispersion from HPLC instrument volume elements such as injector, precolumn tubing, postcolumn tubing, and detector flow cell. Additional system operating elements that can increase sample bandspreading such as excessive sample injection size, nonuniform column temperature, and slow detector response time will also be covered. Rapid techniques for measuring HPLC instrument dispersion will be described with the objective of establishing bandwidth data for every HPLC instrument in the laboratory. Practical suggestions for improving bandwidth of older instruments will be offered, and a new high-performance fitting will be described. Performance of both conventional (400 bar) and UHPLC instruments (600–1000 bar) will be compared, and preferred terms for description of chromatographic dispersion will be recommended. Examples of both high-speed and high-resolution UHPLC applications will be included.
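
Because peak variances from independent dispersion sources are additive, the practical effect of extra-column volume on a narrow UHPLC peak can be estimated in a few lines; the peak standard deviations below are assumed values for illustration, not measurements from the paper.

    import math

    # Assumed peak standard deviations expressed in volume units (µL).
    sigma_column = 4.0        # intrinsic column peak SD, assumed
    sigma_extra_column = 3.0  # injector + tubing + detector cell contribution, assumed

    # Variances from independent dispersion sources add.
    sigma_observed = math.sqrt(sigma_column**2 + sigma_extra_column**2)

    # Fraction of the theoretical column plate count actually observed.
    efficiency_retained = sigma_column**2 / sigma_observed**2

    print(f"observed peak SD ≈ {sigma_observed:.1f} µL")
    print(f"plate count retained ≈ {efficiency_retained:.0%}")  # ≈ 64%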

Rapid, Calibrated, High-Resolution Spectral Imaging Using a Tunable Laser Source

Eli K. Margalith and Lam K. Nguyen

OPOTEK, Inc., 2233 Faraday Avenue, Suite E, Carlsbad, CA 92008, USA

NIR chemical imaging is recognized as an important analytical tool for a wide variety of industries, including agriculture, medicine, chemical, and pharmaceutical development and production. Conventional NIR imaging technologies utilize filtering or dispersion of a broadband light source in order to achieve wavelength selection, which imposes several key performance limitations.

We present a spectral imaging instrument based on tunable laser technology. Specifically, we utilize an optical parametric oscillator (OPO), which can provide high output power and narrow bandwidth over a broad range of wavelengths. By replacing the broadband source and tunable filters of a typical NIR imaging instrument, several advantages are realized, including a large field of view, fast scan rates, and the ability to use optical fiber for efficient and flexible delivery of light to the sample. Because of these advantages, the instrument requires only a few seconds to acquire high-resolution, calibrated hyperspectral data over the NIR range. Actual wavelengths are recorded, and the reflectance signal is calibrated and corrected for linearity at each wavelength in real time, without the need for a premeasurement calibration.
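
As a generic illustration of per-wavelength correction in a laser-scanned hyperspectral stack, the sketch below normalizes each frame by a simultaneously recorded source-power reading after dark subtraction; this is an assumed, simplified scheme, not the instrument's actual real-time calibration routine.

    import numpy as np

    # Sketch: normalize each wavelength frame by dark level and laser power so the
    # reflectance scale stays consistent as the OPO tunes. All inputs are made up.

    def normalize_stack(frames, power_readings, dark_frame):
        """frames: (n_wavelengths, h, w); power_readings: (n_wavelengths,)."""
        frames = np.asarray(frames, dtype=float) - dark_frame
        power = np.asarray(power_readings, dtype=float)
        return frames / power[:, None, None]

    frames = np.random.rand(5, 64, 64) * 1000    # assumed raw NIR frames
    power = [0.8, 1.0, 1.2, 0.9, 1.1]            # assumed laser power per wavelength
    dark = np.full((64, 64), 50.0)               # assumed dark frame

    print(normalize_stack(frames, power, dark).shape)  # (5, 64, 64)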

Advances in Automated Sample Preparation

Bruce Richter, Sheldon Henderson, Eric Francis,  Richard Carlson, Brett Murphy, Brian Dorich, and Jennifer Peterson

Dionex, SLCTC, 1515 W. 2200 S., Suite A, Salt Lake City, UT 84119, USA

Accelerated solvent extraction (ASE) is now widely used in the environmental, food, and polymer analysis areas to increase the efficiency of the sample preparation process. Using ASE, sample preparation times can be shortened, and the required amount of extraction solvent is dramatically reduced. While saving time and solvent is always favorable, the ability to rapidly and quantitatively extract contaminants from a variety of matrices is very important as well. ASE methodology is specified in environmental methods such as USEPA Method 3545A as part of the Contract Laboratory Program (CLP SOW OLM04.2). ASE methodology is also specified in methods in Germany (German Method L00.00-34) and China (Chinese Method GB/T 19649-2005), as well as in ASTM standard practice D 7210.

ASE can be used to automatically extract samples without user intervention. Recent advances in the use of adsorbents in the extraction cell have enhanced the capability of this technique. For example, adsorbents have been used to retain a wide variety of interfering species including lipids, ionic compounds, and colored compounds such as chlorophyll and others. Sequential extractions with solvents of varying polarity can also achieve selective extractions and fractionations. For example, samples can be extracted with nonpolar solvents first to remove nonpolar interferences prior to the extraction of polar analytes.

This paper will discuss the use of adsorbents in the extraction cell to automatically produce extracts that can be analyzed without additional sample pretreatment. Automatic sequential extraction of samples to produce unique fractionation and selectivity in ASE will also be presented.

Identification of Anthocyanins in Cactaceae by LC/ESI/MS-MS

James M. Chapman, Charles A. Johnson,  Paul A. Campbell, Mindy Walker, and Chad M. Scholes

Rockhurst University, Science Center 320-C, 1100 Rockhurst Road, Kansas City, MO 64110, USA

Anthocyanins and betalains are water-soluble vacuolar pigments. In flowers, anthocyanin and betalain pigments function as pollinator attractants, and in fruits, the colorful skins attract animals which eat the fruits and disperse the seeds. In photosynthetic tissues (such as plant leaves or the stems of cacti), anthocyanins and betalains have been shown to act as a “sunscreen” protecting cells against photodamage by absorbing UV and blue-green light, thereby protecting the tissues from photoinhibition or high light stress. They are synthesized exclusively by organisms of the plant kingdom and have been observed to occur in all tissues of higher plants, providing color in leaves, stems, roots, flowers, and fruits. While the majority of land plants contain anthocyanins, there are a few examples of plants producing betacyanins, as in the Caryophyllaceae, Cactaceae, and Rubiaceae families. This work began as the characterization of betalain pigments from Beehive cactus (Mammillaria vivapara var. vivapara) flower petals by LC/ESI/MS-MS. In addition to the expected betalains, several anthocyanins were unexpectedly identified in the flower petals. The identification of these anthocyanins in Mammillaria vivapara var. vivapara is a novel discovery in the species. Additional work has since been carried out on the flower petals of 20 different cacti encompassing five genera, all of which have been found to contain anthocyanins. Approximately 40 different anthocyanins have been identified in the extracts of cactus flower petals at this stage of the work. Comparisons to anthocyanin standards obtained from plants known to contain anthocyanins have resulted in the identification of 15 of these pigments to this point.

Amino Acids Analysis by HPLC/PITC Precolumn Derivatization

Liao Haiming and Dingzhong Yang

National Institute for the Control of Pharmaceutical and Biological Products, No. 2 Tian Tan Xi Li, Chongwen District, Beijing 100050, China

In this paper, we have developed a new amino acid analysis method based on HPLC with precolumn derivatization. The method is simple, reliable, and cost-effective. It can be performed on a regular HPLC system with UV detection at 254 nm, with no need for an expensive amino acid analyzer. The method uses PITC (phenylisothiocyanate) for amino acid derivatization and UV detection. By optimizing the operational conditions and instrument parameters, we are able to obtain good reproducibility and results comparable to other commercialized methods. It features the following advantages: low cost for instruments and reagents; simultaneous analysis of all primary and secondary amino acids; and quick turnaround. Many examples of applications in food and pharmaceuticals will be presented.

Validation of Automated Liquid-Liquid Extraction of β-Carotene

Sikander Gill, Rajwant Gill, Dong Liang, and Richard Zuk

Aurora Biomed Inc., 1001 East Pender Street, Vancouver, BC, Canada V6A 1W2

To provide an automated solution to the liquid-liquid extraction process for downstream applications, Aurora Biomed Inc. has validated its liquid-liquid extraction workstation. In this validation, water-alcohol (1:1) samples were spiked with β-carotene at 850 µg/mL. The sample (liquid phase) and the hexane solvent (organic phase) were mixed either by the autoshaker provided on the deck of the workstation or by the auto-pipet action of the workstation. The extraction profile showed that 90.1% and 9.3% of the active compound were extracted in the first and second extractions, respectively. The third, fourth, and fifth extractions had 0.3%, 0.1%, and 0.02% efficiency, respectively. The mixing of the sample and the solvent was effectively carried out by the autoshaker at 1100 rpm. The performance of the shaker at 700 rpm was also compared with that at 1100 rpm; the latter speed was observed to be more effective than the former. The extraction profile of the automated operation was found to be better than that of the manual procedure.
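
The reported step-by-step efficiencies follow the usual pattern for repeated equilibrium extractions; the short sketch below simply sums the reported percentages as a cumulative-recovery check and then reproduces the pattern with an assumed constant per-step efficiency.

    # Cumulative recovery from the reported sequential extractions.
    reported = [90.1, 9.3, 0.3, 0.1, 0.02]                # % extracted in steps 1-5
    print(f"cumulative recovery ≈ {sum(reported):.1f}%")  # ≈ 99.8%

    # Idealized model: each extraction removes the same fraction E of what remains.
    E = 0.90          # assumed single-step extraction efficiency
    remaining = 100.0
    for step in range(1, 6):
        extracted = remaining * E
        remaining -= extracted
        print(f"extraction {step}: {extracted:.2f}% of the original amount")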

Identification of Explosives Using a Combination of Ion Mobility Spectrometer and Other Detectors

Wolf Muenchmeyer, Andreas Walte, and Bert Ungethuem

Airsense Analytics GmbH, Hagenower Str. 73, 19061 Schwerin, Germany

The gas detector array GDA2 was developed for the fast identification of toxic industrial chemicals and chemical warfare agents. The detector array consists of a combination of an ion mobility spectrometer (IMS), a photoionization detector, two metal oxide sensors, and an electrochemical cell.

Due to the low temperatures in the IMS, some military explosives cannot be detected. This is because some explosives have almost no vapor pressure and require special sampling methods and higher operating temperatures in the detector. Many explosives are based on nitrogen-oxide compounds, which can be detected in the negative mode of the IMS. Newer peroxide-based explosives, now often used in terrorist attacks because of their simple manufacture, have a much higher vapor pressure but cannot be detected in the negative mode of the IMS.

Sampling methods and modifications needed for the simultaneous detection of all kinds of explosives will be discussed. The identification and warning capabilities of the system, compared to an ion mobility spectrometer, will be shown.

Continuous Monitoring of Nitrogen Compounds in Wastewater with Wireless Data Transmission

John N. Driscoll, Walter Johnson, Pol Perov,  Patricia Hogan, Nicholas Hennigan, Brian Muccioli,  John Hamm, George Heufelder, and Keith Mroczka

PID Analyzers LLC, 780 Corporate Park Dr., Pembroke, MA 02359, USA

Denitrification of wastewater on Cape Cod is an important process because of the nitrogen problems in this area. As a result, a number of denitrification systems are in use and are being tested at Otis Air Force Base on Cape Cod.

The continuous measurement of nitrogen compounds in wastewater is important in determining the long-term effectiveness of control techniques. It is difficult to judge the long-term performance of any system with only grab samples (weekly/biweekly). The continuous systems will send a signal wirelessly to a PC in a nearby trailer that is connected to the internet. The data will be available to Suffolk University, the Town of Barnstable, and the vendor of the wastewater system. The advantage of the wireless system is that built-in diagnostics (calibration, pump, etc.) will improve uptime as well as the quality and quantity of the data. We will be adding MODBUS (bidirectional RS232) communications to the analyzer to further enhance the diagnostics. If MODBUS were added to the control system, remote tuning of the control system would be possible.

The analyzer will be a PID Model 610 that uses ion-selective electrodes for ammonia, nitrite, and nitrate. Each of the sensors will have a separate pump for the addition of ionic strength adjustment buffers prior to the measurement. A single meter will be used to display the results and convert the output to a linear voltage proportional to concentration. Samples of wastewater (24-hour integrated) will be collected daily and run by standard methods at the Barnstable County water labs. We will compare the 24-hour integrated sample results with the results from the continuous analyzers, and determine the effectiveness of these electrochemical techniques for continuous monitoring of wastewater.
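
For context, an ion-selective electrode of this kind typically follows a Nernstian (logarithmic) response, and back-calculating concentration from the measured potential is a one-line operation; the slope and intercept below are assumed calibration values, not parameters of the Model 610.

    # Typical ISE response: E = E0 + S * log10(concentration).
    # E0 and S are assumed two-point calibration values for a monovalent ion.

    E0 = 120.0   # mV, assumed potential at 1 mg/L
    S = 57.0     # mV/decade, assumed near-Nernstian slope

    def concentration_from_mV(E_mV):
        """Back-calculate concentration (mg/L of the calibrated species)."""
        return 10 ** ((E_mV - E0) / S)

    print(f"{concentration_from_mV(177.0):.1f} mg/L")  # one decade above the 1 mg/L point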

Characterization of Wood Used as Biomass Fuel by Organic Elemental Analysis

Liliana Krotz and Guido Giazzi

Thermo Fisher Scientific, Strada Rivoltana, 20090 Rodano, Italy

Biomass is organic material made from plants and animals. Some examples of biomass fuels are crops, manure, garbage, and, most commonly, wood. In the past, wood was burned for heating and cooking and was the main source of energy.

Many manufacturing plants in the wood and paper products industry use wood waste to produce their own steam and electricity. This saves these companies money because they do not have to dispose of their waste products or buy as much electricity.

To calculate the energy value of wood, it is necessary to know its elemental composition. Therefore, accurate analytical techniques, preferably automated, are required. The FlashEA 1112 CHNS/O Analyzer permits the quantitative determination of carbon, nitrogen, hydrogen, sulfur, and oxygen, and the dedicated Eager 300 software allows automatic gross and net heat value calculation. The system, which is based on the dynamic combustion of the sample, provides simultaneous CHNS determination in a single run and oxygen determination by pyrolysis in a second run. To perform sulfur determination at trace levels, the analyzer has been coupled with a flame photometric detector (FPD). The method combines the advantages of the elemental analyzer with the sensitivity, selectivity, and robustness of the FPD. The coupling is simple, and it allows for sulfur determination without matrix interference.
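
One way elemental data of this kind can be turned into a heating value is through an empirical correlation such as Dulong's formula; the sketch below uses that classical correlation with an assumed dry-wood composition and is not the calculation implemented in the Eager 300 software.

    # Dulong-type estimate of gross and net heat value from CHNS/O data (MJ/kg).
    # The composition is an assumed dry-wood analysis; results are rough estimates.

    C, H, O, S = 50.0, 6.0, 43.0, 0.02   # weight percent, assumed

    # Classical Dulong correlation with elements in weight percent.
    gross_heat = 0.3383 * C + 1.443 * (H - O / 8.0) + 0.0942 * S

    # Net (lower) heat value: subtract the latent heat of the water formed from
    # hydrogen (about 8.94 kg water per kg H, latent heat ~2.442 MJ/kg).
    net_heat = gross_heat - 2.442 * 8.94 * (H / 100.0)

    print(f"gross ≈ {gross_heat:.1f} MJ/kg, net ≈ {net_heat:.1f} MJ/kg")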

This paper presents CHNS/O data of wood samples to show repeatability obtained with the system and the calorific values automatically calculated by Eager 300 software.

Purge and Trap-GC/MS Analysis of Volatile Organic Compounds in Drinking Water Using Simultaneous Measurements of Scan and Selected Ion Monitoring

Yuki Sakamoto, Koki Tanaka, Haruhiko Miyagawa,  Katsuhiro Nakagawa, Melissa Waller, Richard Whitney, and Mark Taylor

Shimadzu Corporation, 1 Nishinokyo-Kuwabaracho, Nakagyoku, Kyoto 6150005, Japan

Hazardous volatile organic compounds (VOCs) in drinking water are of particularly high interest all over the world and are regulated in many countries. Although P&T-GC/MS has been used to determine VOCs, methods with higher sensitivity and sample throughput are still required. The scan/SIM (selected ion monitoring) measurement can acquire both datasets in the same run by alternate switching. It can perform precise quantitative analysis of target compounds at low concentrations using the acquired SIM data, while target compounds are confirmed using mass spectra from the acquired scan data. In this study, the applicability of scan/SIM using P&T-GC/MS was investigated for EPA Method 8260C analysis to improve sensitivity.

The Eclipse 4660 purge and trap sample concentrator (O I Analytical) and a GCMS-QP2010S (Shimadzu Corporation) equipped with a shorter capillary column (Rtx-624, 30 m × 0.25 mm i.d., df = 1.40 µm) were used. The column temperature program started at 35°C, was held for 0.5 minute, was increased to 220°C at 20°C/min, and was held for 2.75 minutes.

The high-speed column temperature program of the GCMS-QP2010S allowed the total analysis time to be shortened without loss of separation efficiency, and three samples could be analyzed in one hour. The S/N values of these compounds in scan/SIM were enhanced by factors ranging from 3.7 to 10.7 compared with the S/N values in scan mode. The reproducibilities of all compounds at 0.2 µg/L (n = 7) using scan/SIM were less than 8% relative standard deviation and improved by approximately one half compared with the reproducibilities using scan mode. The calibration curves of all compounds in scan/SIM showed linearity ranging from 0.1 µg/L to 200 µg/L, and the dynamic range in scan/SIM was twice as wide as the dynamic range in scan mode.

The scan/SIM measurement improved the sensitivity for twenty-six VOCs, and mass spectra could be used for the confirmation of target compounds.

Monitoring Perchlorate in Water Using Microchip Capillary Electrophoresis

Brian M. Dressen, Don Cropek, and Charles S. Henry

Colorado State University, 1872 Campus Delivery, Fort Collins, CO 80523, USA

Perchlorate inhibits the uptake of iodide into the thyroid gland, leading to irregular production of thyroid hormone and giving rise to developmental problems, neurological disorders, reduced intelligence, and cerebral palsy. Human exposure to perchlorate has not been quantified, and current environmental exposures are unknown. Perchlorate is widespread as a result of both natural and anthropogenic sources and has been detected in drinking water, food, and both human and cow milk. Sensitive and selective methods for in-field monitoring would aid in mitigating exposure as well as tracing remediation efforts. Currently, the EPA has set reporting limits at 4 ppb. The most common methods for perchlorate detection are ion chromatography coupled to conductivity detection and liquid chromatography coupled to mass spectrometry. While these methods are very capable, their size, complexity, and cost limit their use to well-equipped and well-funded laboratories. Inexpensive and compact analyzers capable of field measurements are needed. Here we present a simple, rapid, and inexpensive system for monitoring perchlorate levels in water samples using microchip capillary electrophoresis with conductivity detection. Microchip devices enjoy low-cost fabrication and instrumentation as well as portability. Sub-ppb detection limits will be shown as well as separation of perchlorate from other water contaminants. Furthermore, separation occurs in 65 seconds, providing near real-time perchlorate measurements. Samples can be run without pretreatment, aside from filtration for turbid waters. In-field monitoring systems will also be presented that are capable of drawing samples directly from surface water and analyzing them on site (see Figure 2).
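
The 65-second separation time is consistent with simple electrophoretic bookkeeping; the channel geometry, voltage, and mobility in the sketch below are order-of-magnitude assumptions for illustration, not the device's actual operating parameters.

    # Order-of-magnitude estimate of migration time in microchip CE.
    # All geometry, voltage, and mobility values are assumptions.

    L_total = 5.0       # cm, assumed channel length between reservoirs
    L_detect = 4.0      # cm, assumed injection-to-detector distance
    V = 1000.0          # V, assumed separation voltage
    mu_apparent = 5e-4  # cm^2 V^-1 s^-1, assumed apparent mobility (EOF + electrophoretic)

    E = V / L_total                    # field strength, V/cm
    velocity = mu_apparent * E         # cm/s
    t_migration = L_detect / velocity  # s

    print(f"field ≈ {E:.0f} V/cm, migration time ≈ {t_migration:.0f} s")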

Automatic Dynamic Headspace Sampler for the Determination of VOCs in Water

Manuela Bergna and Roberta Lariccia

DANI Instruments S.p.A., Viale Brianza 87, 20093 Cologno Monzese, Italy

Due to their toxicity and persistence in the environment, volatile organic compounds (VOCs) are particularly important pollutants. Some of them are mutagens, teratogens, or carcinogens. For this reason, government agencies require these contaminants to be monitored at progressively lower levels. The qualitative confirmation, quantitative accuracy, and precision required in current regulations demand high-performing analytical solutions.

Because the static headspace technique does not allow all compounds to reach the minimum detectable levels required by the regulations in force (EPA 8260/524 and the Italian D.M. 471/1999 and D.L. 31/2001), dynamic headspace sampling is the preferable technique, as it makes it possible to reach the required sensitivity.

In this paper, the use of a dynamic headspace sampler coupled to a capillary GC for the determination of VOCs in water is presented.

The water sample, placed in a standard 20 mL vial, is loaded into a dynamic headspace sampler that automatically performs all of the following operations. The sample, heated if necessary, is first purged with a flow of inert gas for a defined time; the inert gas sweeps the sample and strips out the volatile compounds; the purge gas, enriched in VOCs and water, passes through a cold focusing trap where the compounds are concentrated. Finally, the trap is heated in backflush mode, and the desorbed gas passes through the “Dew Stop”, a device especially designed to remove humidity before the gas enters the GC or GC/MS system.

Data will be reported including chromatographic parameters, method detection limits, calibrations, and efficiency of the humidity removal system.

Optimization of RPC Separation of Metabolites of Hydrophilic Solutes

Rudolf Laufer, Georg Petroianu, and Rudolf Laufer

Department of Pharmacology, Semmelweis University, Nagyvarad Ter 4, 1089 Budapest, Hungary

Therapeutic drug monitoring may be routinely performed if standards of both the parent drug and its metabolites are available for analysis. Selective detection greatly facilitates the evaluation of metabolism; either the mass-selective mode of RPC-MS or another specific and sensitive monitoring technique may be used. Scouting for tentative metabolites, however, is a rather more complicated case.

Bis-pyridinium aldoxime-type cholinesterase reactivators are extremely hydrophilic compounds. Microsomal treatment of pyridinium aldoximes is the method of choice for studying their in vitro metabolism. Both the lipophilicity and the retention characteristics of the metabolites generally decrease during the metabolic processes. A series of experiments was devoted to modeling the separation of the metabolites from the chromatographic peaks of the background (blank) microsomes.

Both RP-8 and RP-18 stationary phases, with methanol, acetonitrile, or tetrahydrofuran mobile phases, can give adequate separation of the chromatographic peaks to identify the generated metabolites using their mass spectra. The chromatographic separation is generally improved by the use of ion-pairing agents such as trifluoroacetic acid. Monitoring of the separation was done at 286 nm (ultraviolet detection of the HPLC separation). RPC-MS analyses are evaluated using parallel detection at 286 nm, total ion current (TIC), and characteristic single ion monitoring (SIM). In vitro metabolic studies suggest metabolism of bis-pyridinium aldoximes by oxidation, such as either aliphatic hydroxylation or aliphatic epoxidation, depending on the length of the side chain.

This project was financially supported by OTKA grant T049492.

2D Online Chromatography Separation for Complex Samples

Yiwei Dong, Jinli Huang, and Wan Wang

Chinese Academy of Agricultural Sciences, Zhongguancun Street 12, Haidian District, Beijing 100083, China

Even with MS or MS/MS, chemists still have problems with some very complex sample matrices. For example, in the analysis of toxic compounds from plastic toys or fibers, it is still hard to avoid matrix effects, even when using MS. GPC has been used for food sample cleanup for many years, but it is not online and is very time-consuming. In this study, we explored 2D HPLC for minimizing matrix effects. In particular, we developed a scheme using a combination of HILIC and RP columns to perform continuous separation. For GPC/LC and GPC/GC combinations, we developed a general method to perform online cleanup followed by LC/MS or GC/MS analysis.

Automated System for Collecting Atmospheric Gas Emissions from Soil

Spencer L. Arnold, R. Scott Tubbs, James Schepers,  Nicholas S. Arnold, and Alan E. Walker

United States Department of Agriculture (USDA), Agricultural Research Service (ARS), Agroecosystem Management Research Unit (AMRU), University of Nebraska, 188 Plant Science, Lincoln, NE 68583, USA

An automated collector of terrestrial system (ACTS) device was designed as an inexpensive means of sample collection, while reducing human error from adverse sampling conditions and fatigue. Field and laboratory testing with greenhouse gases demonstrated the versatility and reliability of the programmable ACTS device. Field testing took place at the University of Nebraska-Lincoln East Campus, and treatments included drawing samples manually with a simple spring-loaded device and with the automated ACTS device. Results showed strong correlation (r² = 0.81–1.00) between sampling methods. Testing continues with various peripherals to further evaluate agroecosystem management applications, such as soil respiration related to residue management and methane emissions from livestock waste.
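
The method comparison behind the quoted r² values amounts to correlating paired manual and ACTS measurements; the sketch below shows that computation with made-up concentrations.

    import numpy as np

    # Paired-method comparison with hypothetical gas concentrations (ppm).
    manual = np.array([0.82, 1.10, 1.45, 2.03, 2.60])  # assumed manual grab samples
    acts   = np.array([0.80, 1.14, 1.40, 2.10, 2.55])  # assumed ACTS results

    r = np.corrcoef(manual, acts)[0, 1]
    print(f"r^2 = {r**2:.2f}")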

An Internal Verification Check Approach for Online Moisture Analysis in Hydrocarbon Gas Streams

Mike Fuller, Airat Amerov, and Bob Fiore

AMETEK, 150 Freeport Road, Pittsburgh, PA 15238, USA

Tunable diode laser absorption spectroscopy (TDLAS) is a rapidly growing approach in online process analysis. Combining highly specific absorption laser wavelengths with a wavelength modulation spectroscopy technique provides a high degree of selectivity and sensitivities below 1 ppm for many small gas molecules in natural gas and other hydrocarbon streams. While the noncontact TDLAS approach does not typically require calibration updates, it is important to verify that the online system is performing properly and that the results are valid. The new AMETEK 5100 NCM system for the analysis of moisture in natural gas uses a novel approach for performance validation. The system contains a sealed water reference cell which provides verification that the laser is “locked” on the selected water absorption line. The water reference cell is also used to perform a reliability check on the quantitative measurement of the water measured in the sample cell. This is done by carefully measuring the temperature and pressure of the reference cell block; thermodynamic expressions are then used to calculate the water vapor concentration in the reference cell. If the calculated concentration for the water in the reference cell based on the laser line absorption matches the theoretical value, the performance of the system is verified. If there is a mismatch between the expected and calculated concentrations, an error is reported. The performance of this system for the analysis of moisture in natural gas in the 5–2000 ppm range will be described.
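
The ideal-gas bookkeeping behind such a reference-cell check can be sketched briefly; the sealed-in water content and the measured cell conditions below are assumed values, and the analyzer's actual thermodynamic expressions may differ.

    # Expected water number density in a sealed reference cell from its measured
    # temperature and pressure (ideal gas). All inputs are assumed values.

    R = 8.314462          # J mol^-1 K^-1
    AVOGADRO = 6.02214e23

    water_mole_fraction = 500e-6   # assumed water content sealed into the cell
    T_cell = 318.15                # K, measured cell temperature (assumed 45 deg C)
    P_cell = 110000.0              # Pa, measured cell pressure (assumed)

    # Number density of water molecules per cm^3, the quantity the line absorption sees.
    n_water = water_mole_fraction * P_cell / (R * T_cell) * AVOGADRO / 1e6

    print(f"expected water number density ≈ {n_water:.2e} cm^-3")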

Teaching Analytical Chemistry with Personal Response Systems

Grace Zoorob

Vanderbilt University, 5153 Hereford Ct, Brentwood, TN 37027, USA

A personal response system (PRS) is a form of technology that offers the instructor the opportunity to ask in-class questions during lecture, receive responses from every student, and obtain immediate feedback. It is generally used in freshman lectures. In recent years, it has been used in a junior-level analytical chemistry lecture to maximize the classroom learning environment. Advantages and disadvantages of the technology will be presented and compared to learning in a traditional lecture.

Real-Time Imaging in X-Ray Fluorescence and X-Ray Diffraction

Kenji Sakurai  and Mari Mizusawa

National Institute for Materials Science, 1-2-1 Sengen, Tsukuba 305-0047, Japan

The present paper will describe a novel and powerful imaging approach for X-ray fluorescence (XRF) and X-ray diffraction (XRD). So far, scanning-type imaging has been widely used in those techniques. Though recent progress in high-spatial-resolution imaging using synchrotrons is impressive, there has been a clear limit: because of the step scan, the imaging requires a long measuring time. In many scientific applications, X-ray imaging that is much more rapid (i.e., capable of high temporal resolution rather than high spatial resolution) can be extremely important. As shown in Figure 3, it is possible to do X-ray imaging without performing any scans. Here, the method uses a quite wide beam, which illuminates the whole sample surface in a low-angle-incidence arrangement (0.5–3 deg). The detector used is a CCD camera working at 30 frames/s, equipped with a collimator inside, and the distance between the sample surface and the detector is set extremely close in order to enhance both spatial resolution and efficiency. Note that the imaging is done with one shot. In the case of XRF imaging, the ability to distinguish elements is required and, therefore, most of the experiments were performed with monochromatic or quasi-monochromatic X-rays. The procedure for XRD imaging uses a combination of exposure and incident X-ray energy scanning (or just tuning). Since the present experiment employs a fixed small-angle incidence and also a fixed diffraction angle of around 90 deg, the diffraction plane here is inclined at about 45 deg from the surface of the specimen. By scanning the energy of the incident X-rays, one obtains a diffraction peak which corresponds to the lattice spacing. Further instrumental details and many applications will be presented.
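
The link between the energy scan and the lattice spacing is just Bragg's law evaluated at the fixed diffraction angle; the example photon energy below is arbitrary and chosen only to give a familiar metallic d-spacing.

    import math

    # Bragg's law with an energy scan at a fixed diffraction angle (2-theta ~ 90 deg).

    def d_spacing_angstrom(energy_keV, two_theta_deg=90.0):
        wavelength = 12.398 / energy_keV   # Angstrom (hc ≈ 12.398 keV·Å)
        return wavelength / (2.0 * math.sin(math.radians(two_theta_deg / 2.0)))

    # A diffraction peak appearing at 4.2 keV in the energy scan corresponds to
    # d ≈ 2.09 Å, close to a Cu(111)-like spacing (example value only).
    print(f"d ≈ {d_spacing_angstrom(4.2):.2f} Å")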

Strategies for Increasing Bioanalytical Throughput for Drug Discovery and Development Support

Patrick J. Rudewicz, Young Shin, and Qin Yue

Genentech, Inc., 1 DNA Way, South San Francisco, CA 94080, USA

In an effort to keep pace with the speed of drug discovery and development, bioanalytical laboratories are continually searching for new sample preparation and LC/MS/MS technologies to increase sample throughput. As part of drug discovery, LC/MS/MS methods are used for early PK screening, lead optimization studies, and lead qualification bridging studies. Once a compound is selected for development, more robust methods are developed and validated in compliance with GLP regulations to support several activities in drug development including formulation optimization, GLP toxicology studies, and clinical studies.

In this presentation, recent LC/MS/MS technologies that have been implemented in pharmaceutical laboratories to increase the speed and efficiency of quantitative LC/MS/MS analysis will be described. One approach that has been used successfully is online sample preparation, such as the Prospekt system in which disposable extraction columns are utilized for sample cleanup. Turbulent flow chromatography (TFC) has also been used for the online extraction of samples. The TFC column may serve as both the sample purification and the analytical column. Serial introduction of multiple LC column effluents into a single mass spectrometer ion source can also provide high-throughput capability, particularly in cases where the useful runtime is a fraction of the total analysis time.

Instrumentation advances that increase the efficiency and throughput of a bioanalytical laboratory also include innovative mass analyzer designs. Although the conventional triple-stage quadrupole mass spectrometer used at unit resolution in the SRM mode provides excellent sensitivity and selectivity for quantitative analysis, there are instances when interference from matrix or metabolites may be reduced or eliminated using high-resolution instrumentation. This may be achieved using several types of mass spectrometers including a Q-TOF or an LTQ-Orbitrap.

High-Throughput Bioanalysis by LC/MS/MS in Pharmaceutical Industry

Perry G. Wang

Teleflex Incorporated, 1001 Hill Avenue, Wyomissing, PA 19610, USA

Bioanalysis is a technique which is used for the quantitative determination of drugs and their metabolites in biological matrices.

Hyphenated instrumentation, such as liquid chromatography-mass spectrometry (LC-MS), is an essential tool in the pharmaceutical industry. Due to its high selectivity and sensitivity, it plays a crucial role in drug discovery and development.

One of the most important factors for drug discovery and development is the availability of high-throughput analytical approaches. The introduction and implementation of automated 96-well and even 384-well extraction techniques have made such approaches more realistic. The automated extraction techniques include protein precipitation, solid-phase extraction, and liquid-liquid extraction. Additional high-throughput techniques include online extraction, the application of pierceable caps for biological sample tubes, and the so-called nanostream technique, which has recently been introduced into the pharmaceutical industry.

The combination of automated 96-well sample preparation with liquid chromatography (especially UPLC or fast HPLC) coupled with tandem mass spectrometry (LC/MS/MS) has enabled bioanalysts to face the high-throughput challenges with greater confidence.

High-throughput assays can also be improved by using a parallel approach, for example, connecting multiple HPLC systems to one MS system. A real-case comparison will be presented in which two HPLC systems are connected to one MS system. The most recent advances in the bioanalytical field will also be reviewed in this presentation.

Nonhazardous Automated Colorimetric Method for Nitrate Analysis

Craig R. Chinchilla

Systea Scientific, LLC, 900 Jorie Blvd., Suite 35, Oak Brook, IL 60523, USA

Several methods exist for the determination of nitrate in aqueous solutions; however, the most commonly performed automated colorimetric methods utilize toxic substances and generate hazardous waste (hydrazine and cadmium). There is no hazardous waste generated when performing the method presented and disposal costs are minimized or eliminated. The method has been specifically developed for discrete analysis which enables it to truly run unattended, thus greatly reducing labor and improving laboratory productivity.

The procedure for the determination of nitrate utilizes a reaction in which nitrate is reduced to nitrite by a proprietary reagent, “R1”. The reaction is slow and requires more than 12 minutes for 100% reduction of nitrate to nitrite. The reduced nitrate is then treated with sulfanilamide and N-(1-naphthyl)ethylenediamine dihydrochloride under acidic conditions to form a highly colored soluble dye which is measured colorimetrically between 520 and 550 nm. The final product measured represents the nitrite ion originally present plus that formed from the reduction of nitrate (nitrate + nitrite). In order to determine the true nitrate concentration, the sample must also be analyzed separately for nitrite to determine the amount originally present in the sample. The value obtained for nitrite is then subtracted from the nitrate + nitrite value to determine the true value for nitrate. Regardless of the sample matrix, recovery of nitrate as nitrite is consistently between 95% and 105%, which is a dramatic improvement over traditional automated colorimetric methods. After extensive testing on various matrices, no matrix interference problems have been observed.
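
The nitrate bookkeeping described above reduces to a simple subtraction plus a recovery check; the concentrations in the sketch below are made-up example values.

    # Worked example of the nitrate + nitrite correction (all values are made up).

    nox = 2.40       # mg N/L measured after reduction: nitrate + nitrite
    nitrite = 0.15   # mg N/L measured separately without the reduction step

    nitrate = nox - nitrite
    print(f"nitrate ≈ {nitrate:.2f} mg N/L")

    # Recovery check of the reduction step using a nitrate standard (assumed values).
    standard_added = 2.00      # mg N/L nitrate spiked
    standard_recovered = 1.96  # mg N/L found as nitrite after reduction
    print(f"reduction recovery ≈ {100 * standard_recovered / standard_added:.0f}%")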

Mathematical Model of Current Polarized Ionophore-Based Ion-Selective Membranes: Large-Current Chronopotentiometry

Erno Lindner, Justin Zook, Róbert E. Gyurcsányi, and Richard P. Buck

Department of Biomedical Engineering, The University of Memphis, 330 Engineering Technology, Memphis, TN 38138, USA

A mathematical model is presented to describe the effects of constant current on ion-selective membranes using theta functions. The model provides exact analytic solutions for calculating the concentration polarization of the ionophore, the ionophore-ion complex, and the charged mobile sites in space and time within the membrane. It also predicts the time course of the membrane potential and the electric field inside the membrane following the application of constant current. This analytic solution is faster to compute than the numerical simulations, and it provides the solution for any given time or position directly. The simulated concentration profiles compared favorably with concentration profiles recorded experimentally using spectroelectrochemical microscopy (SpECM), and allowed the determination of the diffusion coefficients of the ionophore, the ion-ionophore complex, and the charged mobile sites inside an ion-selective membrane. The extension of the model to large-current chronopotentiometry accurately predicts the experimentally recorded breakpoint time in the voltage-time transients. The diffusion coefficients calculated from the breakpoint times and from the initial ohmic resistance of the membranes are compared to those calculated by fitting curves to the SpECM measurements.
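
As a rough illustration of how a breakpoint time can be turned into a diffusion coefficient, the sketch below applies the classical Sand relation for a transition time under constant current; treating the membrane breakpoint this way, and every numerical value used, are assumptions for illustration rather than the authors' exact model.

    import math

    # Sand-equation estimate of a membrane diffusion coefficient from the
    # breakpoint (transition) time of a current-polarized membrane.

    F = 96485.0   # C/mol
    n = 1         # charge of the transported ion
    A = 0.2       # cm^2, assumed membrane area
    c = 5e-6      # mol/cm^3, assumed concentration of mobile species (5 mM)
    i = 1e-6      # A, assumed applied constant current
    tau = 73.0    # s, assumed breakpoint time read from the E-t transient

    # Sand equation: tau^(1/2) = n F A c sqrt(pi D) / (2 i), solved for D.
    D = 4 * i**2 * tau / (math.pi * (n * F * A * c)**2)
    print(f"D ≈ {D:.1e} cm^2/s")  # ≈ 1e-8 cm^2/s, typical of plasticized membranes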

Characterization of Catecholamine Secretion in Murine Adrenal Slices Using Fast-Scan Cyclic Voltammetry and Constant Potential Amperometry

Jelena Petrovic and Mark Wightman

Department of Chemistry, The University of North Carolina at Chapel Hill, Chapel Hill, NC 27599, USA

Fast-scan cyclic voltammetry (FSCV) has previously been used to study physiological preparations such as brain tissue slices. FSCV is a powerful technique since it exhibits chemical selectivity based on an analyte's signature oxidation and reduction potentials. Furthermore, FSCV at carbon-fiber microelectrodes offers spatial and millisecond temporal resolution, allowing close monitoring of catecholamine release dynamics. In this work, electrically stimulated catecholamine release in murine adrenal slices was studied using FSCV. Our interest in adrenal slices stems from the neuronal origin of the adrenal chromaffin cells and their use as neuronal cell models. Upon electrical stimulation, chromaffin cells rapidly secrete the catecholamines epinephrine and norepinephrine. Electrically stimulated catecholamine release was found to be pulse-, frequency-, and calcium-dependent. Application of a sodium channel blocker (tetrodotoxin) was found to completely abolish release. In addition, the release profiles suggested an uptake mechanism. To characterize uptake, adrenal slices were incubated with varying concentrations of the uptake inhibitor cocaine. Cocaine slowed the rate of uptake in a concentration-dependent manner. Further studies revealed rapid spontaneous catecholamine release in a number of slice preparations. In order to study spontaneous release, constant-potential amperometry (CPA) was employed due to its sub-millisecond temporal resolution. CPA revealed spontaneous events whose amperometric spike characteristics, such as half width and area, closely resemble those observed for vesicular release in isolated chromaffin cells. Spontaneous events were reversibly blocked by the nicotinic acetylcholine receptor antagonist hexamethonium, implying mediation of spontaneous release via nicotinic receptors. The CPA data suggest a vesicular nature for the spontaneous release in slices.

This work is funded by NIH (NS-38879).

Characterization of Macromolecular Protein Assemblies by Surface-Induced Dissociation: Expanding the Role of Mass Spectrometry in Structural Biology

Christopher M. Jones, Richard L. Beardsley,  Asiri S. Galhena, Eman Basha, Elizabeth Vierling, and Vicki H. Wysocki

The University of Arizona, 1306 E University Blvd, P.O. Box 210041, Tucson, AZ 85721, USA

The vast majority of biological processes are carried out by intricate assemblies of proteins, working in unison to carry out functions not afforded by individual polypeptide chains. While the investigation of quaternary protein structure has long been the domain of X-ray crystallography and electron microscopy, electrospray ionization (ESI) mass spectrometry has recently emerged as a powerful method for probing the structure of intact protein complexes. A single stage of mass analysis reveals the molecular weight of the assembly and stoichiometry of the subunits, while tandem mass spectrometry holds the potential to elucidate sub-oligomeric structural information through the dissociation of subunits from the intact complex. However, slow heating methods such as collision-induced dissociation (CID) invariably result in the asymmetric ejection of a single unfolded monomer from the complex and a complementary (n-1)-mer, limiting the amount of structural insight that can be gained.

We have recently implemented surface-induced dissociation (SID) within a Q-TOF mass spectrometer for the study of protein assemblies. The sudden activation provided by SID results in a more symmetric product ion distribution and fragments other than monomer and (n-1)-mer, indicating that SID may yield additional information about the organization of subunits within a complex. Results are presented for SID of bacteriophage Cro proteins, bovine insulin oligomers, and heat shock proteins from several organisms. These complexes demonstrate how the surface-induced dissociation of protein assemblies is affected by their molecular weight, subunit number, interfacial contact area, and subunit conformation (see Figure 4).

Simultaneous Real-Time Detection of pH and Histamine Release from Gastric Glands in the Stomach

Eleni Bitziou, Bhavik A. Patel, and Danny O'Hare

Biosensor Research Group, Department of Bioengineering, Imperial College London, South Kensington Campus, 438 Bessemer Building, London SW7 2AZ, UK

Acid secretion is utilized by the stomach to process food. This process is complex, with a network of cells working simultaneously to promote acid secretion and cause muscular contraction or relaxation. One cell type that has been noted to influence the parietal cells is the enterochromaffin-like (ECL) cell. ECL cells directly activate histamine receptors on the parietal cells by releasing histamine [5]. To date, there are no real-time direct measurements that can monitor histamine levels and pH changes due to acid secretion simultaneously. The ability to measure both will provide important mechanistic information on the cellular network of the stomach, yielding better knowledge of the physiology of the stomach and an understanding of how this mechanism changes in disease states.

To obtain spatial and temporal resolution, we have utilized two sensing devices. For the measurement of histamine, we utilized a boron-doped diamond (BDD) microelectrode with amperometric detection at 1.4 V versus Ag|AgCl, which has been shown to be extremely stable for detection of neurotransmitters in vitro [6]. For the measurement of pH, gold microelectrodes have been developed as solid-state pH sensors based on anodically electrodeposited iridium oxide films (AEIROF) [7].

The BDD microelectrode was characterized and calibrated for the detection of histamine, showing good limits of detection. For pH measurements, calibration and stability data showed excellent sensitivity with super-Nernstian behavior exceeding 70 mV/pH unit. Good reproducibility with prolonged use has demonstrated a reliable and robust pH sensor for biological applications. Some preliminary data obtained from tissue samples will be shown.

Advanced Automation of SPE Methods

Naomi Reid, Robert Johnson, and Tom Hall

Horizon Technology, Inc., 45 Northwestern Drive, Salem, NH 03079, USA

Many of the EPA 8000 Series methods for the analysis of organics in solid waste and wastewater samples involve an extraction procedure. Solid phase extraction (SPE) has several clearly demonstrated advantages over labor-intensive liquid-liquid extraction (LLE). Switching from LLE to SPE reduces the solvent consumption and labor required for the sample preparation step. This directly impacts the profitability of the laboratory.

Automating the SPE process adds the additional benefits of further reducing labor, the most costly component of the process, and improving the reproducibility of the extraction. Consistency of the extraction process is enhanced, and operator-to-operator variability is eliminated with automation. Automating the SPE process also minimizes exposure to solvents.

This paper will focus on the benefits of using automated SPE for 8000 Series wastewater samples. Data showing precision, accuracy, and recovery from EPA Method 8270 will be presented. A cost analysis comparing manual and automated techniques will be shown.

Certified for Automation: A Replacement for EPA 1664A

Naomi Reid, Jay Rowden, and Wilson Braulio

Horizon Technology, Inc., 45 Northwestern Drive, Salem, NH 03079, USA

Oil and grease analysis using the new EPA Method 1664A presents new challenges for laboratories moving away from Freon 113-based liquid-liquid extractions. The “performance-based” Method 1664 utilizes n-hexane rather than Freon as the extraction solvent. Either liquid-liquid n-hexane extraction (LLE) or the less cumbersome solid phase extraction (SPE) can be used.

SPE reduces solvent use and eliminates emulsions, and automating SPE dramatically reduces labor costs. Compared to labor-intensive manual SPE extractions, automation increases productivity, improves consistency, and provides a safer work environment. Automating the procedure allows the analyst to accomplish multiple tasks while extractions are taking place. Multiple technicians can achieve higher precision due to the extraction consistency, and reducing direct contact with solvents creates a safer environment. With the newly available 90 mm SPE disks, handling samples with high particulate content is no longer a problem.

Direct Determination of Metal Ions in Wine and Fruit Juice Samples Using Internal Standardization and Fast Sequential Multielement Flame Atomic Absorption Spectrometry

Sergio L. Costa Ferreira, Anderson S. Souza,  Brandão C. Geovani, Hadla S. Ferreira,  Walter L. dos Santos,  Erik G. da Silva,  Lindomar A. Portugal,  Geraldo D. Matos, and Fernanda A. de Santana

Universidade Federal da Bahia (UFBA), Campus Universitário De Ondina, Salvador 40170-290, Brazil

The internal standardization technique is used to overcome matrix effects that might influence the analytical signal in the quantification step of a method. It requires simultaneous or fast sequential analytical measurements, and it has therefore been widely used in methods employing ICP OES and ICP-MS. The use of internal standardization in FAAS has become feasible due to the introduction of a fast sequential (FS) system that allows sequential multielement determination by FAAS.
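
The correction itself is simple: a matrix effect that attenuates the analyte and internal-standard signals proportionally cancels when the two signals are ratioed. The short Python sketch below illustrates this with entirely hypothetical absorbances and concentrations; it is not the authors' data.

# Illustrative only: ratio-based internal-standard calibration for FS-FAAS.
import numpy as np

conc = np.array([0.5, 1.0, 2.0, 4.0])                 # standard concentrations, mg/L
a_analyte = np.array([0.052, 0.101, 0.205, 0.398])    # analyte absorbances
a_istd = np.array([0.210, 0.208, 0.212, 0.209])       # internal-standard absorbances

slope, intercept = np.polyfit(conc, a_analyte / a_istd, 1)

# A sample matrix that suppresses both signals by ~15% leaves the ratio unchanged:
sample_ratio = (0.150 * 0.85) / (0.210 * 0.85)
print(round((sample_ratio - intercept) / slope, 2))   # recovered concentration, mg/L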

The present paper proposes use of the internal standardization technique for correction of matrix effects in a procedure for the direct determination of manganese and iron in wine and copper in fruit juices employing FS-FAAS. The elements tested as internal standards were cobalt, silver, nickel, and indium. The results demonstrated that cobalt and indium are efficient internal standards for the determination of manganese and iron in wine samples, respectively. For the quantification of copper in fruit juices, indium can also be used as the internal standard. For analysis, the samples of wines and fruit juices were acidified with 1 mol L⁻¹ nitric acid solution.

The proposed method was applied to the determination of manganese and iron in sixteen wine samples and to the quantification of copper in several fruit juice samples. All analytical results were compared with the results obtained by analysis of these samples after complete mineralization using acid digestion and determination by FAAS.

Novel Techniques for Identification of Mixtures Using FTIR and Raman Spectroscopy

Mike C. Garry and Scot Ellis

Thermo Fisher Scientific, 5225 Verona Road, Building 4, Madison, WI 53711, USA

One of the most common applications of infrared and Raman spectroscopies is to aid in the identification of unknown materials. While the technique works well when applied to pure compounds, results can be ambiguous when samples containing a mixture of compounds are analysed. Spectral subtraction is a useful technique to help characterize mixtures by mathematically removing the spectral features of one or more of the suspected constituents. However, subtraction requires skill on the part of the analyst and is limited due to the distortions it causes in the spectral data. Expanding the use of infrared and Raman spectroscopies for identification of unknown materials demands that unskilled operators can effectively use the technique to obtain actionable results. Key components of the technology that are required for this expansion are improved spectral searching algorithms and user interfaces which allow operators to obtain results with a high degree of confidence.

In this presentation, insight will be provided into novel multicomponent searching techniques and a unique operator interface used to aid in the identification of samples containing mixtures of chemical compounds. Specific examples will be provided for FTIR and Raman instrumentation in polymer, forensic, and pharmaceutical applications.

The Simultaneous Determination of Hundreds of Petroleum Components through the Use of Spectral Accuracy with FI-TOF Mass Spectrometry

Michael T. Cheng, Ming Gu, and Yongdong Wang

Chevron Research and Technology, 100 Chevron Way, Richmond, CA 94802, USA

In petroleum applications, it is often required to perform the analysis of several hundred hydrocarbon components in a single sample, presenting a unique challenge for analytical problem solving. When these components are a mixture of various hydrocarbons with different carbon numbers and different degrees of saturation, the hope of separating them in time through gas chromatography prior to detection is largely dampened, leaving the heavy burden of differentiating these components to the detection system alone. Fortunately, with the commercial availability of high-resolution mass spectrometry such as time of flight (TOF) and its combination with soft ionization techniques such as field ionization (FI), it is now possible to detect and differentiate hundreds of these hydrocarbons through mass spectrometry (MS).

Even at the 10,000 resolving power of TOF MS, however, there exist spectral interferences located only a few mDa apart, rendering reliable analysis difficult. Combined with the frequent mass drift on TOF MS, accurate mass information alone cannot be relied upon for component identification. On the other hand, each of these petroleum components has characteristic isotope distributions uniquely given by its elemental composition, which can be utilized to tackle both the interference and the mass drift problems as long as the TOF mass spectral peak shape function is known. This paper will describe a new approach to self-calibrate TOF MS in terms of both accurate mass and peak shape function, which makes it possible to take full advantage of the characteristic isotope distribution information for the elucidation of complex TOF MS data. In fact, a single-step solution involving more than 240 unknown components simultaneously can be obtained to achieve both qualitative identification and quantitative analysis.
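
A simplified illustration of the underlying deconvolution step is sketched below in Python; the masses, isotope abundances, peak width, and mixing amounts are assumed for demonstration and do not represent the authors' calibration procedure. Once the peak-shape function is known, interfering components only a few mDa apart can be recovered by least-squares fitting of their theoretical isotope profiles.

# Illustrative only: resolving two overlapping components from their isotope patterns.
import numpy as np

mz = np.linspace(226.0, 230.0, 400)                   # local m/z window (arbitrary)

def peak(center, area, sigma=0.01):                   # assumed Gaussian peak shape
    return area * np.exp(-0.5 * ((mz - center) / sigma) ** 2)

def component(mono_mz, iso_abund):
    """Profile built from a monoisotopic m/z and relative isotope abundances."""
    return sum(peak(mono_mz + k * 1.00336, a) for k, a in enumerate(iso_abund))

basis = np.column_stack([component(226.266, [1.0, 0.18, 0.016]),   # hypothetical pair
                         component(226.229, [1.0, 0.17, 0.015])])  # a few mDa apart
measured = basis @ np.array([3.0, 1.2]) + np.random.default_rng(0).normal(0, 0.01, mz.size)

amounts, *_ = np.linalg.lstsq(basis, measured, rcond=None)
print(np.round(amounts, 2))                           # recovers ~[3.0, 1.2]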

Characterization of Hydrocarbon Fractions in Petrochemical Samples by Automated Online HPLC + HRGC-MS Multidimensional System

Josep M. Gibert, Ariadna Galve, Nieves Sarrión,  José A. Muñoz, and Roger Gibert

KONIK Instruments, 12221 Sw 129 Ct., Sant Cugat Del Vallés, 08190 Barcelona, Spain

Characterizations of petrochemical streams are used to predict properties and/or behavior in processes or during application. Although density can give indicative data about crude oil aromaticity [8], more information can be obtained by a separation based on the molecular properties of the components in the sample. Nowadays, GC × GC [9] makes it possible to obtain highly structured chromatograms of petrochemical samples based on separations along the distributions of volatility and polarity. However, GC × GC fails to give the systematic distribution of aromatic and naphthenic classes. The addition of a preceding LC step has allowed a group-type separation of the sample into compound classes with an equal number of aromatic rings [10].

In this work, a new application of the patented TOTAD Interface for online coupling of HPLC + HRGC is presented. The interface coupling an HPLC to an HRGC in the KONIK K2 HPLC + HRGC system allows the direct separation of petroleum fractions (aliphatic hydrocarbons; mono-, di-, and polyaromatic series) without overlap between them before their analysis by GC-MS or GC × GC-MS. The different hydrocarbon series were separated first by HPLC using an NH2 column and pentane, hexane, or heptane as mobile phases. Afterwards, hydrocarbons were eluted by groups from the column, and the fraction of interest was transferred to the GC-MS system. With the addition of the KONIK Robokrom HPLC autosampler and full control through the Konikrom software, the complete analysis can be easily automated and performed in a few minutes, limiting the use of solvents while protecting sample integrity.

Characterization of Phosphonium Ionic Liquids through a Linear Solvation Energy Relationship and Their Use as GLC Stationary Phases

Zachary S. Breitbach, Junmin Huang, and Daniel W. Armstrong

Department of Chemistry and Biochemistry, The University of Texas at Arlington, 700 Planetarium Place, P.O. Box 19065, Arlington, TX 76019, USA

In recent years, room-temperature ionic liquids (RTILs) have proven to be of great interest to analytical chemists. One important development is the use of RTILs as highly thermally stable GLC stationary phases. To date, nearly all of the RTIL stationary phases have been nitrogen-based (ammonium, pyrrolidinium, imidazolium, etc.). In this work, eight new monocationic and three new dicationic phosphonium-based RTILs are used as GLC stationary phases. The solvation properties of the phosphonium RTILs are studied using an inverse GC linear solvation energy model. This model describes the multiple solvation interactions that the phosphonium RTILs can undergo and is useful in understanding their properties. In addition, the phosphonium-based stationary phases are used to separate complex analyte mixtures by GLC. Results show that the small differences in the solvent properties of the phosphonium ILs compared to those of the ammonium-based ILs allow for different and unique separation selectivities. Also, the phosphonium-based stationary phases tend to be more thermally stable than nitrogen-based ILs, which is an advantage in many GC applications.
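
For readers unfamiliar with the inverse-GC approach, an Abraham-type LSER takes the form log k = c + eE + sS + aA + bB + lL and is fit by multiple linear regression of retention data against probe-solute descriptors. The Python sketch below uses hypothetical descriptors and retention factors purely to show the fitting step; it does not reproduce the authors' measurements.

# Illustrative only: fitting an Abraham-type LSER by least squares.
import numpy as np

# Probe descriptors [E, S, A, B, L] and measured log k (all values hypothetical).
X = np.array([[0.601, 0.52, 0.00, 0.48, 2.89],
              [0.803, 0.87, 0.00, 0.28, 3.35],
              [0.742, 0.88, 0.67, 0.22, 3.61],
              [0.235, 0.42, 0.37, 0.48, 2.60],
              [0.589, 0.50, 0.00, 0.44, 3.10],
              [0.820, 0.90, 0.00, 0.45, 3.94],
              [0.305, 0.40, 0.12, 0.56, 2.75]])
logk = np.array([1.21, 1.78, 2.05, 1.02, 1.39, 2.21, 1.15])

A = np.column_stack([np.ones(len(logk)), X])          # prepend the intercept term c
coeffs, *_ = np.linalg.lstsq(A, logk, rcond=None)
print(dict(zip("c e s a b l".split(), np.round(coeffs, 3))))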

On-site Vapor Detection of Chemical Warfare Agents by Monitoring Tape Method

Yasuo Seto, Mieko Kanamori-Kataoka, Shintaro Yamaguchi, Ryuji Asada, Takeshi Ohmori, Isaac Ohsawa, Nobuo Nakano, Tetsuya Kawabe, and Satomi Abe

National Research Institute of Police Science, 6-3-1 Kashiwanoha, Kashiwa, Chiba 277-0882, Japan

In incidents of chemical terrorism, various kinds of toxic substances may be used, and on-site detection by first responders is required as a countermeasure to minimize the damage. Our previous research on verification of commercial on-site chemical warfare detection equipment showed that it is difficult to detect chemical warfare agent vapors satisfactorily in terms of sensitivity, rapidity, and ease of operation, and we have therefore developed a new field detection method for chemical warfare agents based on the monitoring tape method. In this presentation, the monitoring tape method was further improved for detecting cyanogen chloride (CK, blood agent), chloropicrin (PS, choking agent), and sarin (GB, nerve agent). CK could be detected with a limit of detection (LOD) of 0.2 mg/m3 and a sampling time of 30 seconds, using 4-benzylpyridine and barbituric acid as coloring reagents. PS could be detected with an LOD of 0.01 mg/m3 and a sampling time of 30 seconds, using a transmission-type apparatus with a pyrolyzer (400°C) and 4-(p-nitrobenzyl)pyridine as the coloring reagent. GB could be detected with an LOD of 0.5 mg/m3 and a sampling time of 1 minute, using the transmission-type apparatus with methyl yellow and methyl cellosolve as coloring reagents.

Online Detection of Contaminations in Drinking Water by a Combined Electronic Micronose and Optospectrometric System

Joachim Goschnick and Martin Sommer

Institut für Mikrostrukturtechnik (IMT), Forschungszentrum Karlsruhe GmbH, Postfach 3640, 76344 Eggenstein-Leopoldshafen, 76021 Karlsruhe, Germany

Drinking water is vulnerable to a variety of threats such as technical imperfections of the water supply, accidents, or criminal attacks. Consequently, a fast and broadband detection system for water contaminations is of primary relevance. Since the threats are not confined to a certain part of the supply network, the highest security is only available if the water quality is checked at the point of use. There, conventional water-analytical laboratory equipment is inappropriate because neither the costs, nor the size, nor the labor input is acceptable. New autonomously working low-cost monitoring systems are required that provide long-term stability and small size for system integration at the point of use. To meet these requirements, a novel broadband water-analytical system is being developed. Volatile components of the water are extracted online by a gas-permeable membrane and transferred to an electronic nose microsystem. The latter is based on a gradient gas sensor microarray sensitive to all volatiles except inert gases. Highly inexpensive fabrication is achieved using a single SnO2 layer subdivided only by parallel electrode strips to form 38 conductive gas sensor segments. A temperature gradient and an inhomogeneous gas-permeable coating differentiate the segments' properties, providing gas-characteristic conductivity patterns on which the analysis of volatiles is based. Nonvolatile water components are detected optically with a minispectrometer analyzing light absorption as well as stray light or fluorescence. Not only can simple chemicals be detected by this combined technique, but also particulate material and even biological contaminations such as bacteria, via stray light and metabolism products.

Development and Optimization of Molecular Beacons as Biosensors for Surface Hybridization Using Locked Nucleic Acid

Karen Martinez, Maria Carmen Estevez,  Joseph A. Phillips, and Weihong Tan

University of Florida, P.O. Box 117200, Gainesville, FL 32611, USA

Biosensors based on DNA hybridization have been used in various studies for many years. However, biosensors based on molecular beacons (MBs) have not fully realized their potential. In biosensors utilizing MBs, the probes are immobilized on a surface, after which a complementary target is added over the immobilized probe. The complementary sequence can then bind to the MB, forming a hybrid on the surface. This allows fast, sensitive, and selective detection of nucleic acid targets. However, MBs have shown a relatively low fluorescence enhancement when immobilized onto a surface compared to that in solution. One concern when immobilizing MBs is that the hairpin structure may interact with the surface, which partially degrades it. These interactions can change the electrostatic properties and local environment of the immobilized MB. Consequently, once the structure of the beacon is compromised, low quenching efficiency and therefore high background are observed. To overcome these concerns, MBs using locked nucleic acid (LNA) bases have been designed for immobilization onto a glass surface. The excellent affinity and stability that LNA offers, combined with the detection capabilities of MBs, promise an outstanding tool for hybridization studies on surfaces. These properties will also allow for better immobilization efficiency, with a relatively low background signal and high stability of the immobilized beacon.

Automated Parallel Chiral Scouting System

Joan M. Stevens, Mark E. Crawford, and Ziqiang Wang

Gilson, Inc., 3000 Parmenter Street, Middleton, WI 53562, USA

A Gilson parallel analytical chiral system optimizes chiral separations in a fraction of the time required by conventional chiral screening. The system screens chiral compounds on four separate chiral columns simultaneously, increasing throughput by 300%: four analytical separations with chiral detection are completed in 15 minutes, versus more than 60 minutes for chiral analyses run in series. The liquid handler automatically dilutes each sample to a specified concentration. Various solvents can be accessed for the dilution through a syringeless solvent delivery system, and samples are solubilized on an orbital shaker. The liquid handler simultaneously introduces the sample to four parallel chiral columns. Each separation is monitored by UV/Vis and chiral detection. The system is capable of gradient and isocratic mobile phases and mobile phase solvent selection. Only one chiral detector is required for the system, drastically reducing capital investment. The system also monitors flow rate changes for each column through pressure sensors. The system's flow and sample are split four ways by a manifold prior to the columns. After the sample has been chromatographed, the flows are combined into a single stream by a second manifold before the chiral detector, allowing determination of the optical rotation of the separated peaks. Automatic overlays of the four UV traces relative to the optical detector predict the best conditions for the separation based on chiral column performance, mobile phase composition, and optical rotation within a single chromatographic run. Both gradient and isocratic profiles were extremely reproducible. The system was designed to maintain the integrity of the separation through the parallel columns and into the chiral detector, which is accomplished by minimizing tubing lengths and keeping internal diameters equal throughout the entire system. The results from the analytical chiral scouting analysis can then be implemented in preparative purification (see Figure 5).

Determination and Validation of N-Methylpyrrolidine in Cefepime for Injection by Capillary Electrophoresis

Sigamani J. Prasanna, Hemant K. Sharma, and Khagga Mukkanti

APL Research Centre, 313 Bachupally, Qutubullapur Mandal, Hyderabad, Andra Pradesh 500072, India

A simple indirect UV photometric capillary electrophoresis method was developed and validated for determining N-methylpyrrolidine in cefepime for injection. The electrophoretic system consists of a 5 mmol/L imidazole background electrolyte adjusted to pH 5.1 with 3 M acetic acid solution. The applied voltage was 20 kV and the temperature was 20°C. The runtime for the analysis was 8 minutes. Preconditioning was performed with 4 bar pressure at the inlet of the capillary for 3 minutes. A bare fused-silica extended-light-path capillary with an effective length of 56 cm and an internal diameter of 50 μm was used. Indirect UV detection was performed with the signal at 240 nm and the reference signal at 210 nm. A very good baseline was achieved. The method was validated for linearity, specificity, limit of quantification, limit of detection, repeatability, robustness, and accuracy. The limit of detection and limit of quantification were derived by the residual standard deviation method; the values were 5.8 μg/mL and 16.3 μg/mL, with relative standard deviations of 10.9% and 5.2%, respectively. The recovery was estimated to be between 98.1% and 103.2%. The overall recovery was 100.4%, and the relative standard deviation was 1.8%.
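
The residual-standard-deviation approach referred to above conventionally takes LOD = 3.3σ/S and LOQ = 10σ/S, where σ is the residual standard deviation of the calibration regression and S is its slope; the short Python sketch below illustrates the calculation with hypothetical calibration data rather than the reported values.

# Illustrative only: LOD/LOQ from the residual standard deviation of a calibration line.
import numpy as np

conc = np.array([10.0, 20.0, 40.0, 80.0, 160.0])      # standards, ug/mL (hypothetical)
resp = np.array([0.98, 2.05, 3.94, 8.10, 15.90])      # corrected peak areas

slope, intercept = np.polyfit(conc, resp, 1)
residuals = resp - (slope * conc + intercept)
sigma = residuals.std(ddof=2)                         # residual SD (n - 2 degrees of freedom)

print(round(3.3 * sigma / slope, 2), "ug/mL LOD")
print(round(10.0 * sigma / slope, 2), "ug/mL LOQ")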

Compared with the reported GC method, this method is less time-consuming, involves simple sample preparation with no extraction procedure, and offers high buffering capacity; it also provides a very good baseline compared with the reported CE method.

The results of the various validation experiments demonstrate that the method is specific, linear, sensitive, repeatable, and accurate. It is suitable for routine analysis of N-methylpyrrolidine in cefepime for injection.

Acknowledgments

The authors gratefully acknowledge the Aurobindo Pharma Ltd. Research Centre at Hyderabad for providing the analytical support to pursue this work, and they are also grateful to colleagues who helped them in this work.

A Novel Method for Simultaneous Determination of Peptides, Lipids, and Lipid-Impurities in the Synthetic Pulmonary Surfactant Formulation

Darwin J. Asa, Rosa Bonilla, John Rech, Victoria Scott,  Charlotte Baker, Gerald Orehostky, and Michelle DeCrosta

ESA Biosciences, Inc., 22 Alpha Road, Chelmsford, MA 01824, USA

Surfactant deficiency and dysfunction have been associated with numerous pulmonary conditions. In premature newborns, surfactant deficiency is the primary pathophysiologic mechanism of the neonatal respiratory distress syndrome (RDS). Exogenous surfactants comprised of phospholipids and proteins from animal sources are currently used to treat RDS. Lucinactant (Surfaxin; Discovery Labs, Pa, USA), a precision-engineered surfactant that contains a synthetic peptide in place of animal-derived proteins, has been developed.

Various approaches have confirmed the fundamental roles of the peptides and phospholipids in the surfactant system. Currently, industry publications suggest that combined analysis of proteins and lipids is not recommended based upon their inherent structural characteristics. Lucinactant contains four active ingredients: the peptide Sinapultide (also known as KL4) and three lipid ingredients including two phospholipids, Dipalmitoylphosphatidyl Choline (DPPC) and Palmitoyloleoylphosphatidyl Glycerol (POPG), and the fatty acid Palmitic Acid (PA). In lucinactant, the four active ingredients form liposome-like aggregates in a tris(hydroxymethyl)aminomethane (Tris) buffer medium.

Presented here is a gradient HPLC-CAD (charged aerosol detection) method capable of resolving the peptide KL4 and the lipids DPPC, POPG, and PA, together with their corresponding lipid-related degradants, in < 35 minutes on a C18 reversed-phase column. The method is sensitive (LOD ~0.47 μg/mL), shows good linearity (R2 > 0.99 for actives; R2 > 0.98 for impurities), and can readily measure the low levels of the degradants (23.31 μg on column).

Multivariate Analysis of Deep Subsurface Raman Spectra

Neil A. Macleod

Central Laser Facility, Rutherford Appleton Laboratory, Oxford, Oxfordshire OX11 0QX, UK

The chemical specificity of Raman spectroscopy makes it an ideal technique to noninvasively characterise an extensive variety of sample types. Recent developments utilising the diffusion of photons through turbid media (spatially offset Raman spectroscopy (SORS) [11] and transmission Raman (TR) spectroscopy [12]) have overcome the inherent bias of conventional Raman spectroscopy towards surface layers, and allowed the identification and chemical characterization of subsurface layers. Potential application areas include the detection of bone disease and breast cancer, online monitoring of pharmaceutical production lines, and security screening (see Figure 6).

The complexity of multilayered or multicomponent systems can be unravelled by a battery of techniques based on statistical analysis. Applications of such multivariate techniques (including principal component analysis (PCA), partial least squares (PLS), and spectral deconvolution) will be demonstrated in a number of areas including quantitative analysis of the composition of drug tablets and capsules, detection of explosives and illicit materials concealed in common containers, and the determination of the spectrum of bone measured through an overlayer of organic material.
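
As a small illustration of the first of these techniques, the Python sketch below performs PCA on a synthetic stack of two-component spectra via the singular value decomposition; the spectra are simulated and are not data from this work.

# Illustrative only: PCA of simulated two-component Raman spectra.
import numpy as np

rng = np.random.default_rng(1)
shifts = np.linspace(200, 1800, 600)                  # Raman shift axis, cm^-1
band = lambda c, w: np.exp(-0.5 * ((shifts - c) / w) ** 2)
pure = np.vstack([band(1001, 8) + 0.4 * band(1600, 12),   # two hypothetical components
                  band(960, 10) + 0.6 * band(1450, 15)])
frac = rng.uniform(0, 1, (40, 1))
spectra = np.hstack([frac, 1 - frac]) @ pure + rng.normal(0, 0.01, (40, 600))

centered = spectra - spectra.mean(axis=0)
U, s, Vt = np.linalg.svd(centered, full_matrices=False)
print(np.round((s**2 / (s**2).sum())[:3], 3))         # variance explained by the first PCs
scores = U[:, :2] * s[:2]                             # sample scores on PC1 and PC2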

Determination of Methyl Mercury in Blood and Urine Samples with Automated Dynamic Headspace Sampling and Plasma Emission Detection

Eike Kleine-Benne, Oliver Lerch, and Hans-Wolfgang Hoppe

GERSTEL GmbH & Co. KG, Aktienstrasse 232-234, Muelheim, 45473 North-Rhine-Westphalia, Germany

An instrumental strategy for measuring methyl mercury in blood and urine samples is presented. The instrumentation is based on gas chromatography with a new dynamic headspace sampler (DHS), which allows the use of individual traps for each sample, coupled to a selective plasma emission detector (PED) with a robust and simple microwave-induced helium plasma source.

Method performance parameters and results for real samples are presented. Reference measurements were done according to a standardized method published by the German Research Foundation [13]. It is based on derivatizing methyl mercury with sodium tetraethylborate and on determining the mercury species with headspace-GC/MS.

Dynamic headspace sampling methods, in contrast to static headspace methods, promise exhaustive enrichment of the analyte. On the one hand, this provides lower detection limits. On the other hand, exhaustive methods are independent of the equilibrium partitioning of analytes between the sample and gas phases, which can differ from sample to sample. This is important for urine samples, whose composition can vary. For this study, a recently developed DHS system with replaceable adsorbent traps was used. The clean adsorbent traps are stored in a sealed tray on the x-y-z robotic sampler, which transports them to the sample vessel and then directly to the integrated thermal desorber. This design enables automated optimization of trapping conditions, including the choice of adsorbent, and avoids transfer lines.

Detection was done with a recently developed robust and simple-to-use plasma emission detector (PED) for gas chromatography, based on an interference filter to select the emission wavelength of the mercury line at 253.65 nm excited in a microwave-induced helium plasma. Spectral background correction is performed by oscillating the bandpass filter between two defined angles.

Characterization and Profiling of N- and O-Linked Oligosaccharides from Glycoproteins with the Corona CAD

Darwin J. Asa

ESA Biosciences, Inc., 22 Alpha Road, Chelmsford, MA 01824, USA

Profiling and analysis of native and recombinant glycoproteins are crucial for many companies, but methods to perform this important operation are often complex and require a great deal of expertise. Here, we discuss profiling methods for N- and O-glycans that are ideally suited for routine analysis utilizing an LC-MS system equipped with a Corona CAD detector.

In these studies, N-linked glycans were released by N-glycanase or endo-H. O-glycans were chemically released either by reductive β-elimination (RBE) or by nonreductive ammonia/ammonium carbonate (NAC). Two HPLC stationary phases for separation of the released oligosaccharides were investigated. Both methodologies utilized charged aerosol detection (CAD), a highly sensitive universal detection technique, for the routine analysis of unlabeled glycans.

HPLC profiling of RBE-released O-glycan alditols was accomplished using a PGC column with CAD detection and simultaneous MS analysis with flow splitting. In a typical experiment, approximately 300 μg of glycoprotein are processed, and a volume equivalent to 25–50 μg is injected for HPLC using a 9:1 CAD:MS flow splitting ratio.

Released N-glycans were profiled on PGC as native structures with CAD detection using a simplified sample workup only involving ultrafiltration prior to HPLC. All expected oligosaccharide components were well resolved. Further sample workup via borohydride reduction gives a single peak for N-glycan alditols on PGC although a loss of sensitivity is evident. Retention of heavily sialylated oligosaccharides is an advantage of PGC, allowing the profiling of tri- and tetra-antennary glycans.

The concentrated buffer-free fractions obtainable using this approach allow unusual or low-abundance N-glycans and O-glycans to be characterized offline by MS and the CAD with maximum sensitivity. This profiling method is also amenable to further oligosaccharide structural analysis by MS.

Automated Solid Phase Extraction of Lipids in Biological Tissue Extracts

Landon A. Wiest, Katherine N. Biggs, Josiah Moulton,  Steven G. Wood, Craig Thulin, and Matthew R. Linford

Brigham Young University, 341 North 500 East, Provo, UT 84606, USA

The separation of lipids in tissues into three important biological classes (neutral lipids, fatty acids, and phosphatidylcholines), and their identification, are promising as a tool for the detection of certain diseases. To date, separation attempted using manual SPE has had various problems, including inconsistent results between users and poor reproducibility in general; that is, detection of diseased versus control tissues has been accomplished but with large scatter between test groups. A more reproducible method has yet to be developed.

This study develops such an automated, reproducible SPE method for fractionating the lipids in biological tissue, based on a modified literature procedure (Sorbent Extraction Technology Handbook). An automated SPE method using a Gilson GX-271 ASPEC was developed with a test solution containing cholesterol, linoleic acid, and a phosphatidylcholine, which were eluted with chloroform, 2% acetic acid in diethyl ether, and methanol, respectively. Breakthrough concentrations on amino SPE cartridges (Varian, BondElut, NH2, 3 mL, 50 mg) were determined for these analytes. Elution of analytes was confirmed with ESI-TOF-MS. This method was tested on extracts of biological samples. This automated approach yielded more reproducible data than the manual method.

Analysis of Water for Pesticides at Low Parts per Trillion Levels Using Direct Injection into an Online SPE-LC/MS/MS System

Andre Schreiber, Stephen Lock, Nadine Anderson, and David Evans

Applied Biosystems, 71 Four Valley Drive, Concord, ON, Canada L4K4V8

The provision of clean, uncontaminated drinking water is of paramount importance to the water industry. In recent times, the requested limits of detection for pesticides have been decreasing as methodologies improve. Typically, water companies need to be able to perform quantification below 1 μg/L. These low levels often mean that samples have to be extracted to concentrate contaminants to a level at which they can be detected. Sample pretreatment can often be time-consuming and can add cost to the analysis.

Data presented were acquired on the 3200 QTRAP LC/MS/MS system, on which pesticides have been detected at low ppt levels. High injection volumes were used with online solid phase extraction before LC/MS/MS analysis. Reproducibility (RSD) was less than 15% at low ppt levels, and calibration over the range 20–1000 ppt was observed to be linear. For confirmation, MRM-triggered enhanced product ion scans with a collision energy spread were used to enhance the spectral quality.

Nitrate Reductase: A Green-Chemistry Replacement for Toxic Cadmium in Automated and Manual Colorimetric Methods for Determining Nitrate in Water

Charles J. Patton and Jennifer R. Kryskalla

U.S. Geological Survey, National Water Quality Laboratory, P.O. Box 25046, MS 407, Denver, CO 80225, USA

Nitrate is one of the most universally determined anions in natural water and drinking water because it can promote eutrophication and it is toxic to fetuses, the young of livestock, and humans at concentrations that exceed about 10 mg-N/L. In water, nitrate is usually determined by reduction to nitrite, which is subsequently determined colorimetrically with Griess reagents—acidic sulfanilamide (SAN) and N-(1-Naphthyl) ethylenediamine (NED). Longstanding reference methods such as EPA 353.2 and U.S. Geological Survey I-2545-90 use granular cadmium in the form of small packed-bed reactors to reduce nitrate to nitrite. Widespread acceptance and application of cadmium-reduction nitrate determination methods notwithstanding, “optimum” reactor geometry, activation procedures, and reagent formulations remain a topic of perennial discussion among environmental analytical chemists.

Ease-of-use, toxicity, and waste-disposal issues associated with cadmium-reduction nitrate determination methods led us to explore soluble nontoxic reducing agents. Bispecific nitrate reductase (NAD(P)H:NaR; Enzyme Commission # 1.7.1.2, from Pichia angusta) and NADH:nitrate reductase (NADH:NaR; EC 1.7.1.1, formerly EC 1.6.6.1 produced by recombinant expression of nitrate reductase from Arabidopsis thaliana in the yeast Pichia pastoris) are nontoxic and more selective than cadmium. This paper summarizes our considerable analytical experience with these enzymes, and describes automatic, semiautomatic, and manual approaches for using them as replacements for cadmium in routine colorimetric nitrate determinations in water.

Selective Mercury Removal from Wastewater by Sulfathiazole-Based Hydrogel

Ece Kok Yetimoglu

İTÜ Fen-Edebiyat Fakültesi Kimya Bölümü, Goztepe Kampusu, Istanbul 34722, Turkey

Mercury is used in chemicals and in equipment such as switches, gauges, thermometers, fluorescent and specialty lamps, and batteries. Because of the high toxicity of all mercury compounds, the extraction of mercuric ions from aqueous wastes is of special environmental importance. The selective sorption of metal ions by polymeric adsorbents has recently gained much attention. Hg(II) has an extremely high affinity for thiol-containing compounds and forms linear complexes. The coordination between Hg(II) and sulfathiazole would be through the thiazolic nitrogen, with two hydroxyl groups bonding to the Hg atom in a local tetrahedral geometry [14].

In this study, a new mercury-selective monomer was synthesized from the reaction between sulfathiazole and glycidyl methacrylate (GMA). Sulfathiazole-based hydrogel was prepared by irradiating the mixtures of sulfathiazole-modified glycidyl methacrylate monomer, polyethylene glycol diacrylate (cross-linker), and photoinitator by UV irradiation at ambient temperature [15].

The optimization of the procedure was performed under competitive and noncompetitive conditions using a factorial design including pH, amount of hydrogel, contact time, temperature, ionic strength, wet/dry state of the hydrogel, and the initial and final concentrations of sample and eluent. Equilibrium isotherms were determined to assess the maximum sorption capacity of the sorbents. Elution of the metal ions was investigated in acid media. It was concluded that the desorbed hydrogel could be reused to remove mercury without loss of adsorption capacity.

Use of Near Infrared Spectroscopy for Online Quality Control of Peanuts

Mirta Golic

Peanut Company of Australia, P.O. Box 26, Kingaroy, Queensland 4610, Australia

Peanut Company of Australia has been using near infrared spectroscopy (NIRS) for predicting the quality of peanuts online since the beginning of the year. We have been using NIRS for prediction of moisture and oil contents, with predictions of fatty acid contents, peroxide value, and free fatty acid contents at the final testing stages before implementation in online quality control.

The advantages of online NIRS analysis over conventional chemical methods of analysis are as follows: it is nondestructive to samples; it is fast; it produces multiple results at once; it allows analysis of each one-tonne peanut bag; it reduces the cost of chemical testing (by approximately 0.8 million AU$ per year) and increases production efficiency; and it enables real-time decisions.

Optimization of the Thermal Modulation in Comprehensive Two-Dimensional Gas Chromatography

Gianluca Stani  and Armando Miliazza

SRA Instruments Italia, Viale Assunta 101, 20063 Cernusco Sul Naviglio, Italy

Comprehensive two-dimensional gas chromatography employs an interface device (the modulator) between the first and second columns. The thermal modulator uses hot and cold jets of gaseous nitrogen to continuously and efficiently trap and inject portions of the peaks eluting from the primary column into the secondary column.

The thermal processes are determined by the nitrogen cold flow and the temperature/time of the hot pulse. To obtain an optimal modulation ratio of 3-4, the cold jet flow and the duration of the hot jet pulse must be changed during the GC run for applications that require the simultaneous determination of both very volatile and high-boiling compounds. The optimized combination of these two parameters improves the efficiency of the modulation by preventing breakthrough of the high-volatility compounds and avoiding excessive trapping of the semivolatile compounds, which would otherwise degrade the modulation ratio and cause peak tailing.
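
For reference, the modulation ratio is commonly taken as the first-dimension peak base width divided by the modulation period, M_R ≈ 4σ/P_M. The trivial Python sketch below, with assumed peak widths and period, shows why narrow early-eluting peaks and broad high-boiling peaks place different demands on the jet settings.

# Illustrative only: modulation ratio M_R = 4*sigma / modulation period (assumed values).
def modulation_ratio(peak_sigma_s, modulation_period_s):
    return 4.0 * peak_sigma_s / modulation_period_s

print(round(modulation_ratio(2.0, 4.0), 1))   # 2.0: a narrow volatile peak falls below the 3-4 target
print(round(modulation_ratio(4.0, 4.0), 1))   # 4.0: a broader high-boiling peak sits within the target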

A microprocessor-programmable device to control and optimize the thermal processes of a commercial dual-stage thermal loop modulator is presented, together with some examples. It can be coupled with any brand of GC and time-programmed to follow a GC run program. The cold nitrogen flow optimization and a specific programmable function allow an important reduction of liquid nitrogen consumption in this type of modulator.

Automated Dynamic Headspace Sampling of Aqueous Samples Using Replaceable Adsorbent Traps

John Stuff, Jackie Whitecavage, and Andreas Hoffmann

GERSTEL, 1510 Caton Center Drive, Suite H, Baltimore, MD 21227, USA

Static (equilibrium) headspace injection is commonly used for GC determination of volatiles in solid and liquid samples. Since this technique relies on the analyte partitioning between the sample and headspace and uses a fixed injection volume, it may not provide adequate detection limits, particularly for higher molecular weight and higher-boiling analytes.

In this study, we describe the use of a new automated dynamic headspace sampler for determination of volatiles in high water content solids and aqueous samples. This sampler uses a two-needle design to flush the headspace of standard headspace vials onto replaceable adsorbent traps that can be thermostatted to control interference from water vapor. After sample collection, the adsorbent traps can be automatically dry-purged to further eliminate trace water before introduction into the integrated thermal desorber. This design enables automated optimization of trapping conditions including choice of adsorbent, and has the potential for automated internal standard addition and automated calibration.

Performance of the new system was compared to traditional static headspace analysis using high water content solid samples like fruits and vegetables, and also beverages. To illustrate the versatility of the new design, several sample types with high water content were tested with a series of adsorbent traps to choose optimal trapping conditions. Better detection limits were obtained with dynamic headspace for all sample types.

Automation of Genomic DNA Isolation with Nucleic Acid Isolation Workstation Using Promega's Blood DNA Isolation Kit

Sikander Gill, Rajwant Gill, Alicia Davis, and Dong Liang

Aurora Biomed Inc., 1001 East Pender Street, Vancouver, BC, Canada V6A1W2

To provide an automated solution to the bottleneck of obtaining high-quality genomic DNA for downstream applications, Aurora Biomed Inc. has validated its nucleic acid isolation workstation. Using Promega's MagneSil Blood Genomic kit, genomic DNA was isolated in 96-well format from the human embryonic kidney 293 cell line, a Chinese hamster ovary cell line, human buccal (cheek) cells, and human saliva. The high-molecular-weight isolated DNA ran close to the 48.5 kb band of the hyper-DNA ladder in an ethidium bromide detection system. The fine DNA bands without any streaking indicated no shearing of the DNA molecules in the automated process. The isolated DNA presented all panel genes for which amplification was carried out with specific primers. The automated process was highly efficient, as no DNA was detected in the wash and extra elution steps, only in the actual elution step. The isolated DNA yield was 4.9 μg per 500 μL of human saliva with an OD260/280 of 1.74–1.89.

Automation of High-Throughput Recombinant Protein Purification Using Versa Liquid Handling Workstation

Sikander Gill, Rajwant Gill, Alicia Davis, Anirudh Mally, and Dong Liang

Aurora Biomed Inc., 1001 East Pender Street, Vancouver, BC, Canada V6A1W2

The goal of much of the biotechnology industry is to prepare large volumes of purified native or recombinant proteins. The purity of the proteins is vital for such applications. Since proteins vary from each other in size, shape, charge, hydrophobicity, solubility, and biological activity, these differential characteristics make purification a very cumbersome process. However, the development of novel cloning vectors and affinity resins has contributed significantly to the overexpression and purification of recombinant proteins in high quality and with ease. Aurora Biomed Inc. has developed a workstation that employs Ni-NTA magnetic bead-based technology for the purification of His-tagged recombinant proteins in high-throughput format. The data from the automated process will be presented in terms of purity, efficiency, throughput, and ease of purification of a recombinant protein.

Identification and Quantification of Oxidative Degradation Products of Aldicarb by Various Oxidation Systems Using HPLC-Ion Trap Mass Spectrometry

Tongwen Wang, Chuan Wang, Evelyn Chamberlain,  Honglan Shi, Craig Adams, and Yinfa Ma

University of Missouri - Rolla, 345 Schrenk Hall, 1870 Miner Circle, Rolla, MO 65409, USA

Carbamate pesticides are derived from carbamic acid and kill insects in a similar fashion to organophosphate insecticides. They are widely used in homes, gardens, and agriculture. Their mode of action is inhibition of cholinesterase enzymes, affecting nerve impulse transmission. Carbamate pesticides have attracted increasing attention because of their environmental and human health impacts. It has been reported that a large proportion of applied carbamate pesticides enters natural water systems, leading to human intake of these harmful substances. Furthermore, during disinfection with different oxidants in water treatment plants, these pesticides can be further degraded into smaller and more hydrophilic molecules that can be even more toxic to humans. Identification and monitoring of the degradation products formed therefore become crucial in providing critical information for establishing disinfection strategies in water treatment plants. A comprehensive study was performed to identify the degradation products of aldicarb in water under different treatment conditions, such as free chlorine, monochloramine (MCA), chlorine dioxide, permanganate, ozone, and hydrogen peroxide. The main technique used was high-performance liquid chromatography coupled with ion trap mass spectrometry. Aldicarb sulfoxide was detected as a degradation product of aldicarb after treatment with free chlorine, MCA, ozone, and hydrogen peroxide, and aldoxycarb was detected as a degradation product of aldicarb after treatment with permanganate. Aldicarb sulfoxide and aldoxycarb were further treated with these oxidants; aldoxycarb was detected as a degradate of aldicarb sulfoxide after treatment with permanganate, and no other degradates were detected with the other oxidants. N-chloro-aldoxycarb was detected as a degradate of aldoxycarb when it was treated with free chlorine.

Mobile Instruments: Fast, Cheap, and under Wireless Control

Vassili Karanassios

Department of Chemistry, University of Waterloo, 200 University Avenue West, Waterloo, ON, Canada N2L 3G1

Fueled by demands for inexpensive instruments that provide rapid real-time analytical results on-site (i.e., in-the-field, where needed the most) and driven by developments in the micro-, nano-, and wireless-electronics industries, a powerful trend toward development of portable analytical systems has emerged. Such mobile instruments will cause a paradigm shift in classical chemical analysis and metrology by allowing users to carry (part of) the lab to the sample.

In this breadth-rather-than-depth presentation, and following a brief overview of the field, two classes of examples from the author's laboratory will be used as the means with which to illustrate the power of mobile micro- and nanoinstruments.

One class involves a “patient” as the sample and an ingestible wireless capsule-size spectrograph for fluorescence-based cancer diagnosis of the gastrointestinal (GI) tract as (part of) “the lab.” The other involves the “environment” as the sample and a couple of mobile battery-operated instruments, often fabricated on inexpensive plastic substrates using soft lithography or microfluidic channels [16], that can be used either for optical spectrometry or mass spectrometry [17] and that utilize a palm-size personal organizer with a wireless interface for data acquisition and signal processing as (part of) “the lab.”

To illustrate how to electrically power such instruments (and thus further enhance their portability), mobile energy issues will be briefly addressed [18]. Particular emphasis will be paid to current and anticipated future applications, to the effect of scaling laws, to the scientific and technological roadblocks that remain to be overcome, and to the integration of technologies that may prove essential in developing the next generation of disruptive-technologies mobile micro- and nanoinstruments that will be fast, cheap, and under wireless control.

Totally Self-Contained Gas Chromatograph-Toroidal Mass Spectrometer for Field Application

Stephen Lammert, Jesse A. Contreras, Jacolin A. Murray, H. Dennis Tolley, Samuel E. Tolley, Edgar D. Lee,  Milton L. Lee, and Douglas W. Later

Torion Technology, 2400 North 180 West, Provo, UT 84062, USA

The development of portable mass spectrometers for field analysis applications depends less on progress in the miniaturization of the mass spectrometer analyzer components than on the reduction of the overall system utilities and vacuum requirements. In addition, approaches must be taken to address the reduced ion capacity that accompanies miniaturization in order to maintain system performance specifications that are comparable to laboratory-based systems. Finally, the same requirements that drive miniaturization also drive an implicit requirement of reduced operating complexity. In other words, use of mass spectrometers in the field will likely be done by nonmass spectrometrists, and the operation and control of the system must accommodate this.

We have been developing a field-portable, self-contained gas chromatograph-toroidal ion trap mass spectrometer (GC-TMS) system over the last few years. The current system, called GUARDION-7, is a ~12.5 kg package that contains everything needed for 4 hours of operation (battery power and gas) in a Pelican case that measures 47 cm × 35.7 cm × 17.6 cm. Analysis of gas or liquid samples is accomplished using a custom SPME sample injector and a fast (ca. 3 minutes) chromatographic separation on a 5 m × 0.1 mm i.d. capillary column coated with a 5%-phenyl-95%-methyl polysiloxane stationary phase. The TMS operates in the electron-impact mode over a mass range of ~50–500 Da. Analysis turnaround time is under 5 minutes. Custom system operation software guides the user through all of the data acquisition steps after power-up. Acquired spectra can be compared to on-board specialized libraries (i.e., chemical threat agents, toxic industrial chemicals) or to the NIST library.

Analytical performance has been tested on many classes of chemical compounds, including environmental chemicals (both volatile and semivolatile), explosives, drugs, chemical threat agents, and chemical agent simulants. Performance measures (i.e., spectral integrity, detection limits, spectral mass resolution, and dynamic range) will be shown for many of these application areas. Furthermore, the GUARDION-7 has undergone preliminary environmental tests including temperature, electrostatic discharge (ESD), shake and vibration, and drop testing with the goal of conforming to the MIL-STD 810E specification. In each of these areas, the instrument sustained only minor failures. Examples of these results will also be presented.

Life on a Chip: Integrating the Animate and Inanimate Worlds

James Castracane

CNSE, University at Albany, 255 Fuller Road, Albany, NY 12203, USA

Recent years have been marked by the emergence of new disruptive physical, chemical, and biological innovations that are driven by the vast scientific and technical capabilities provided by nanotechnology. The essence of nanotechnology is the ability to engineer the individual building blocks of matter at the molecular level, atom by atom, to form a link between the nanoscale and the micro- and even macroscales with precisely controlled functionality and customized properties/performance. Through the exploitation of intrinsic and engineered behaviors of such materials, unique biosensors can be created. Integrating cells/tissues/biomolecules with computer chip platforms leverages both nature and state-of-the-art IC fabrication methods. The linkage between animate and inanimate components provides the key to expanding the range of possible experiments and resulting mobile sensors. Examples range from the use of innovative micro-/nanoscale chip developments for point-of-care disease screening (TB, Botulism, etc.) to the creation of implantable multifunctional probes which can be used for documenting cancer cell metastatic dynamics. Recent results from a selection of ongoing bio-/nanotechnology research projects at the University at Albany's College of Nanoscale Science and Engineering (CNSE) will be presented against a backdrop of the infrastructure development at CNSE.

The Development of Automated DNA Sequencing

Lloyd M. Smith

Department of Chemistry, University of Wisconsin-Madison, 1101 University Avenue, Madison, WI 53706-1396, USA

In April 1982, I arrived in Pasadena to begin postdoctoral work with Leroy Hood at Caltech. I had acquired a solid background in synthetic chemistry, fluorescence detection, and instrumentation development during my graduate work in the Chemistry Department at Stanford. However, I knew nothing about nucleic acids and the emerging field of molecular biology, and hoped to remedy that shortcoming as a postdoc. In short order, I found myself working long hours at the tedious business of DNA sequencing. This gave me a great appreciation of its laborious nature, and spurred my interest in attempting automation of the process. Over the course of some months, colleagues and I developed the basic idea of 4-dye fluorescence-based DNA sequencing, and I undertook the successive challenges of developing a chemical means for tagging the DNA molecules, selecting a set of four appropriate spectrally resolved fluorophores, designing and building a prototype instrument, and obtaining the first proof-of-principle data demonstrating the concept. The work was published in Nature in 1986, and the first commercial instrument was sold by Applied Biosystems in 1987. Over the ensuing 20 years, very substantial investments by both the public and private sectors led to the evolution of these instruments into the powerful workhorses that we see today. These machines were the engine behind the success of the Human Genome Project and the concomitant transformation of biology into an “information science” that we are undergoing today.

Online SPE-LC-MS for Ultratrace Analysis of Antibiotics in River and Ground Water

Frank Steiner, Frank Arnold, Verena Fraas,  Markus Martin, and Christian Huber

Dionex Corporation, Dornierstrasse 4, 82110 Germering, Germany

Endocrine disrupting compounds (EDCs) are suspected of interfering with the endocrine systems of humans and wildlife. They cause adverse health effects such as cancer, behavioral changes, or reproductive abnormalities in mammals, fish, and other species. Typical EDC compounds are common pharmaceuticals such as antibiotics. They are of special importance because their presence in the aquatic environment is also a potential cause of antibiotic resistance. The concentrations of these compounds in environmental samples are usually very low, at the microgram-per-liter level and below (ppb to ppt range), which makes significant enrichment prior to the subsequent analysis mandatory.

An instrumental online SPE-LC-MS solution for the analysis of selected antibiotics in aqueous matrices like river water or ground water is presented. It comprises a sample cleanup and preconcentration procedure combined with high-performance liquid chromatography (HPLC) for the separation, and mass spectrometry (MS) for detection with highest sensitivity and selectivity. The sample preparation step is performed using online solid phase extraction (online SPE). The intended system is fully automated and operated under single-point software control. The high performance of the system is demonstrated by analyzing a selection of representative antibiotics, mostly of the tetracycline and macrocyclic types.

Preconcentration and Separation of Cr(III) and Cr(VI) Using Sawdust as a Sorbent

Saima Q. Memon

University of Sindh, Hitech Central Resources Lab, Jamshoro, Sindh 92, Pakistan

A simple, inexpensive method based on solid phase extraction (SPE) on sawdust from Cedrus deodara has been developed for the speciation of Cr(III) and Cr(VI) in environmental water samples. Because different exchange capacities were observed for the two forms of chromium at different pH values (Cr(III) was selectively retained at pH 3 to 4, whereas Cr(VI) was retained at pH 1), complete separation of the two forms of chromium is possible. Retained species were eluted with 2.5 mL of 0.1 mol L⁻¹ HCl and 0.1 mol L⁻¹ NaOH. Detection limits of 0.05 and 0.04 μg mL⁻¹ were achieved for Cr(III) and Cr(VI), respectively, with enrichment factors of 100 and 80. Recovery was quantitative using a 250 mL sample volume for Cr(III) and 200 mL for Cr(VI). Different kinetic and thermodynamic properties that affect sorption of the chromium species on the sawdust were also determined. Metal ion concentration was measured as the Cr(VI)-diphenylcarbazide complex by UV-visible spectroscopy. The method was successfully applied to the speciation of chromium in environmental and industrial water samples.
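
As a simple consistency check, and assuming (as is conventional when recovery is quantitative) that the enrichment factor is the ratio of sample volume to final eluent volume, the reported factors follow directly from the stated volumes; a one-line Python sketch:

# Illustrative check: enrichment factor taken as sample volume / eluent volume (assumption).
def enrichment_factor(sample_volume_ml, eluent_volume_ml):
    return sample_volume_ml / eluent_volume_ml

print(enrichment_factor(250, 2.5))   # Cr(III): 100, matching the reported factor
print(enrichment_factor(200, 2.5))   # Cr(VI): 80, matching the reported factor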

Optimization of Antibody-Conjugated Magnetic Nanoparticles for Immunoassays

Joshua E. Smith, Kim E. Sapsford, Frances S. Ligler, and Weihong Tan

Department of Chemistry and Physics, Armstrong Atlantic State University, 2nd Floor Science Center, 11935 Abercorn Street, Savannah, GA 31419, USA

Biosensors based on antibody recognition have been used for monitoring biological targets in clinical, environmental, homeland security, and food analysis. However, these devices suffer from poor limits of detection for certain pathogens. We have developed antibody-modified magnetic nanoparticles (MNPs) to improve their performance. The MNPs are iron oxides coated with silica. These particles were then conjugated with antibodies fluorescently labeled with Alexa 647. The antibody-conjugated MNPs (Alexa647-chick-MNPs) were used in a direct immunoassay binding format, and the assay was evaluated by measuring the Alexa647-chick-MNPs bound to antichick IgG-modified slides. The Alexa647-chick-MNP binding to the slides was optimized by altering the nanoparticle composition, surface modification, and concentration conditions. These prepared samples were evaluated using the NRL array biosensor to monitor the immunoassay under flow conditions. This study demonstrates the proof of concept and successful optimization of MNPs labeled with fluorescent proteins used simultaneously for target concentration and detection.

Lab-on-Chip Biosensor for Glucose Based on a Packed Immobilized Enzyme Reactor

Carlos D. Garcia, Lucas Blanes, Maria F. Mora,  Claudimir L. do Lago, and Arturo Ayon

The University of Texas at San Antonio, One UTSA Circle, San Antonio, TX 78249, USA

In this work, the development of a packed immobilized enzyme reactor (IMER) and its integration into a capillary electrophoresis microchip are described. The applied procedure involves the separation of the target analyte by capillary electrophoresis (CE), which is then coupled to a postcolumn IMER that produces H2O2. The H2O2 produced is finally detected downstream at the surface of a working electrode. Glucose was detected above 100 μM by packing particles modified with glucose oxidase at the end of the separation channel. The analytical performance of the microchip-CE was demonstrated by performing the separation and detection of glucose and noradrenaline. Additions of fructose showed no effect on either the peak position or the peak magnitude of glucose. The microchip-CE-IMER was also used to quantify glucose in carbonated beverages, with good agreement with other reports. The present microchip design differs from others in many aspects. First, the presented design allows analyses to be performed with or without the enzyme-coated particles. Second, the presented microchip can detect different analytes just by changing the material used to pack the IMER. By packing modified particles at the end of the separation channel, the challenge of controlling the position and the size of the patch of immobilized enzymes is also avoided. Additionally, because of the larger surface area of the particles, larger amounts of enzyme can be immobilized. Furthermore, the use of packed IMERs allows the use of any substrate material to fabricate the chip, and the enzyme is easily replaceable if activity is lost.

This project was financially supported by The University of Texas at San Antonio. L. Blanes and C. L. do Lago also thank FAPESP and CNPq for the scholarship granted.

Optimization of Laser-Based Sampling and Immunolabeling for Single-Cell Analysis by Capillary Electrophoresis

Rob Brown, Rano Matta, and Julie Audet

University of Toronto, 11th Floor 160 College St., Toronto, ON, Canada M5S 3E1

Capillary electrophoresis (CE) is a powerful tool capable of high-resolution separations of analytes from small volume samples. Due to the small diameters of capillary which are available and the high detection sensitivity of laser-induced fluorescence detection systems, this technique is well suited for single-cell analysis. However, in order to apply this tool towards cellular analysis, several hurdles must be overcome. First, cells must be sampled in a manner which will not affect the signaling state of analytes in the cell; second, analytes must be fluorescently labeled for detection.

We have assembled a CE system with laser-induced fluorescence detection. This system has been coupled with a pulsed YVO4 nanosecond laser for rapid (sub-millisecond) cell lysis. We have further characterized the cell sampling efficiency of this technique with varying laser parameters using GFP-transfected cells. Optimal GFP sampling was obtained at low laser pulse energies (P = .057, 2 μJ versus 9 μJ) focused directly under the cell (P = .055 versus a 10 μm offset in the xy plane), resulting in a sampling efficiency of 0.67 ± 0.08. Fluorescent immunolabeling of target proteins is currently being investigated as a labeling strategy. We are currently optimizing conditions for separations of antibody-analyte complexes using several different capillary coatings, including acrylamide, polyvinyl pyrrolidone, and polybrene/poly(vinyl sulfonate) coatings, as well as various electrophoretic buffers. With high-resolution separation of antibody and antibody-analyte complex, it will be possible to perform sensitive analysis of proteins in single cells.

Parallel Monitoring of Cellular Secretions on a Microfluidic Chip

John F. Dishinger and Robert Kennedy

University of Michigan, 930 N. University Avenue, Ann Arbor, MI 48109, USA

The use of microfluidic systems for cellular studies has been greatly advanced by the creation of devices for high throughput and parallel analyses. When compared to single-sample microchips for biological analysis, parallel microfluidic systems greatly increase the speed at which experiments can be performed and datasets completed. A 15-sample microfluidic chip has been developed for continuous monitoring of hormone secretion from live cells using electrophoretic immunoassays. On the chip, perfusate from pancreatic islets is mixed online with immunoassay reagents. After ~2 minutes of reaction time, the reaction mixture is analyzed by capillary electrophoresis with fluorescence detection. Parallel fluorescence detection (typically achieved with complicated laser scanning devices) was performed using a standard fluorescence microscope and CCD camera. The system has a 1 nM LOD for insulin. It can record 15 immunoassays every 10 seconds, and has an RSD of concentration determination of less than 1% for a single channel. The sensitivity is sufficient to monitor insulin release with 10-second temporal resolution from 15 islets simultaneously. The chip produced reproducible insulin secretion plots, showing a basal secretion rate RSD of 9% between 15 islets.

This microdevice has been tested on real-world samples by analyzing insulin secretion from genetically modified islets. In these experiments, the 15-sample chip was used to examine the insulin secretion properties of islets from mice lacking leptin receptors in the pancreas. It was found that the leptin receptor knockout mice had higher overall rates of insulin secretion than control islets, as well as no response from exposure to leptin (which was found to inhibit insulin secretion in control islets). While only two experiments were required with the 15-sample chip in this study, 22 would have been needed using a single-sample device. Additional experiments examined islet response to BSA and GLP-1.

Development of an Automated High-Throughput Microfluidic Device for the Study of Single-Cell Kinase Activity

Amy D. Hargis, Christopher E. Sims, Nancy L. Allbritton, and J. Michael Ramsey

University of North Carolina at Chapel Hill, 28 Holland Drive, Chapel Hill, NC 27514, USA

A microfluidic device is being developed to analyze kinase enzyme activity at the single-cell level to elucidate the role these enzymes play in various intracellular signal transduction pathways. Microfluidic devices are well-suited to address single-cell assays because of their ability to precisely manipulate the subpicoliter volume contained within a cell. The research to date involves the development of a new microfluidic network capable of rapidly trapping and lysing individual cells. The cells are pulled from a flow stream and trapped at the top of a separation channel using pressure in a modified patch-clamp technique. The trapped cell is then rapidly lysed using a voltage pulse which, when sufficiently high, causes permanent disruption of the cellular membrane. The cellular contents are electrophoretically injected into the separation channel where separation and detection of the desired intracellular compounds occur. To study kinase activity, membrane-permeable fluorescently labeled reporter peptides that are substrates for specific kinase enzymes of interest have been developed. The resulting intracellular fluorescently tagged substrate and the product formed through kinase enzyme modification can be separated and detected on-chip in less than six seconds, providing a qualitative assessment of the kinase activity within each cell. The rapid cell lysis and separation capabilities of the device will allow for the analysis of at least 10 cells per minute per separation channel. This provides at least a 1000-fold increase in throughput as compared to the current technique for studying single-cell kinase activity.

Automation of Spectrophotometric Titrations

John A. Lynch, William M. McGee, and Ivan P. Zubkov

Department of Chemistry, The University of Tennessee at Chattanooga, 615 Mccallie Avenue, Chattanooga, TN 37403-2598, USA

Today, spectrophotometric titrations are widely used by inorganic chemists to determine metal-ligand stoichiometries and formation constants. Unfortunately, a laborious mix-and-measure approach is used almost universally, with no attempt at continuous titrant addition. Experimental control, data acquisition, and data analysis are the hallmarks of method automation. For spectrophotometric titrations, the use of a modern instrument with a fiber-optic probe and CCD array detector, together with the addition of a syringe pump, solves most control and acquisition issues. Data analysis remains, but, as will be shown, there is a rich body of titration theory and software technology to draw upon. Developing algorithms and using multivariable curve fitting to extract information therefore become quite feasible.

Example studies used to illustrate the versatility of this automated approach were titrations of eriochrome black T with Mg2+, 1,10-phenanthroline with Fe2+, and 1,10-phenanthroline with Cu2+. Algorithms appropriate for each reaction were developed using the mathematical software Maple. Commercially available software was then used to fit the algorithms to titration data and to assess the quality of their fit by means of a point-by-point comparison of data(measured) − data(fit). Formation constants obtained were in good agreement with literature values. Interesting details of stepwise reaction profiles were apparent in the contours of these automated titration curves. For Cu2+, changing the roles of titrant and analyte also reverses the order of the reaction steps. All in all, a lot was learned from reactions occurring in small vessels and taking only minutes to perform.
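
As an illustration of the kind of model fitting described above, the sketch below fits a simple 1:1 binding isotherm to absorbance-versus-titrant data to recover a formation constant. It is a minimal Python example rather than the Maple/commercial-software workflow used in the study; the concentrations, absorptivities, and simulated data are hypothetical, and dilution during titrant addition is neglected.

```python
# A minimal sketch (not the Maple/commercial-software workflow of the study):
# fit a hypothetical 1:1 binding isotherm to absorbance-vs-titrant data to
# recover a formation constant Kf. Dilution during titrant addition is neglected.
import numpy as np
from scipy.optimize import curve_fit

C_M = 5.0e-5  # total analyte concentration (M), assumed constant (hypothetical)

def ml_conc(C_L, Kf):
    """[ML] from the 1:1 equilibrium Kf = [ML]/([M][L])."""
    b = Kf * (C_M + C_L) + 1.0
    return (b - np.sqrt(b**2 - 4.0 * Kf**2 * C_M * C_L)) / (2.0 * Kf)

def absorbance(C_L, Kf, eps_M, eps_ML):
    """Beer-Lambert absorbance (1 cm path) of free analyte plus complex."""
    ml = ml_conc(C_L, Kf)
    return eps_M * (C_M - ml) + eps_ML * ml

# hypothetical titration data: total ligand added vs. measured absorbance
C_L = np.linspace(0.0, 2.0e-4, 25)
A_obs = absorbance(C_L, 5.0e4, 2.0e3, 1.8e4) \
        + np.random.default_rng(0).normal(0.0, 2e-3, C_L.size)

popt, _ = curve_fit(absorbance, C_L, A_obs, p0=[1e4, 1e3, 1e4])
print(f"fitted Kf = {popt[0]:.3g} M^-1")
```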

A Continuous Flow Microfluidic Reactor for Gene Expression Analysis and Quantification at the Single-Molecule Level

Zhiyong Peng and Steven A. Soper

Department of Chemistry, Louisiana State University, Baton Rouge, LA 70803, USA

Messenger RNA (mRNA) is an important biomarker for gene expression analysis. MMP-7 and MMP-9 mRNAs are two extensively investigated targets, which have significant clinical value for the early detection of colorectal cancers. In conventional approaches, low-abundance mRNA molecules are reverse-transcribed and then amplified by PCR (RT-PCR) to generate enough copies for further interrogation. In this research, the time-consuming PCR step is eliminated, and the mRNA molecules are directly quantified using an allele-specific ligase detection reaction (LDR) via a single-pair FRET assay. The mRNA isolated from whole cells was captured on a solid phase and reverse-transcribed into its complementary DNA (cDNA). A pair of primers was designed based on the sequence of the target cDNA strands and end-labeled with Cy5 (donor) and Cy5.5 (acceptor). In the presence of the target cDNA, these two primers join together in an LDR to form a molecular beacon. The two dyes in the molecular beacon are brought into close proximity and undergo fluorescence resonance energy transfer (FRET). A continuous flow microfluidic reactor was designed and fabricated using a PMMA substrate to carry out the LDR. This microfluidic device possessed serpentine channels to allow denaturation and thermal ligation to take place sequentially in different isothermal zones. The resulting molecular beacons were examined using a laser-induced fluorescence (LIF) system, and the captured mRNAs could be directly counted at the single-molecule level to obtain the quantitative information required to read out the expression levels of these targets.

Identification and Examination of Thermal Degradation Products of Pharmaceuticals by HPLC-TOF-MS

Wanlong Zhou and Roger K. Gilpin

Brehm Research Laboratory, Wright State University, 3821 Colonel Glenn Hwy, University Park, Fairborn, OH 45324-2031, USA

During the development of new drug substances and drug formulations, there are a number of important chemical and physical questions that must be answered in order to satisfy FDA regulations. Many of these are related to the stability of a new drug, and require a detailed evaluation of potential degradation products as well as elucidation of decomposition mechanisms. HPLC coupled to a high-resolution time-of-flight mass spectrometer (TOF-MS) has emerged as a practical approach for detecting and identifying unknown degradation products. High mass accuracy, around 5 ppm, allows molecular formulas to be determined with the help of dedicated software tools, and potential structures may be revealed through searches of available formula databases. High full-mass-range sensitivity permits the detection of degradation products at very low levels. In addition, TOF-MS is capable of producing accurate isotopic ratios, thereby significantly reducing the number of possible compounds by eliminating those with nonmatching isotopic ratios.
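
To make the role of the 5 ppm mass-accuracy window concrete, the short sketch below filters a few candidate elemental compositions against a measured mass; the mass, the candidate formulas, and the tolerance handling are illustrative only and are not taken from the study, which additionally exploits isotopic ratios and database searching.

```python
# Hypothetical illustration of the 5 ppm mass-accuracy filter: compare a measured
# neutral mass against candidate elemental compositions. Values are invented for
# the example; a real search also uses isotopic ratios and database lookups.
MONO = {"C": 12.0, "H": 1.007825, "N": 14.003074, "O": 15.994915}

def mono_mass(comp):
    """Neutral monoisotopic mass of a composition dict such as {'C': 15, 'H': 19, ...}."""
    return sum(MONO[el] * n for el, n in comp.items())

def ppm_error(measured, theoretical):
    return (measured - theoretical) / theoretical * 1e6

measured = 261.1358  # hypothetical neutral mass of an unknown degradant
candidates = {
    "C15H19NO3": {"C": 15, "H": 19, "N": 1, "O": 3},
    "C12H23NO5": {"C": 12, "H": 23, "N": 1, "O": 5},
    "C16H21O3":  {"C": 16, "H": 21, "O": 3},
}

for name, comp in candidates.items():
    err = ppm_error(measured, mono_mass(comp))
    verdict = "keep" if abs(err) <= 5.0 else "reject"
    print(f"{name}: {err:+.1f} ppm -> {verdict}")
```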

In the current work, an HPLC-UV-MS method has been developed for studying the high-temperature degradation of indomethacin and mefenamic acid. Indomethacin mainly undergoes thermal degradation by cleavage of the amide bond and loss of the carboxyl group, whereas mefenamic acid primarily degrades via loss of water and the carboxyl group. The current talk considers optimization of both the HPLC separation and the API ionization conditions; the identification of major degradation products is based on exact molecular weight, isotopic ratios, and fragment ion information. The degradation samples were analyzed under different chromatographic conditions (e.g., eluent additives, binary composition, and pH), ionization methods (ESI and APCI), and detection modes (positive and negative ion spectra).

Comprehensive Selenium Speciation Analysis of Selenium-Enriched Samples through Liquid Chromatography-Particle Beam/Mass Spectrometry

Joaudimir Castro, R. Kenneth Marcus, Geovannie Ojeda, and M. V. Balarama Krishna

Chemistry Department, Clemson University, 219 Hunter Laboratories, Clemson, SC 29634, USA

Garlic has long been utilized in Asian and Western cultures as a prophylactic and therapeutic medicinal agent. Nowadays, it is known for its potential anticarcinogenic and antioxidative properties due to the presence of various selenium species. Onion is also considered to provide health benefits in the prevention of some diseases. Garlic and onions are known to be bioaccumulators of selenium. The most abundant selenospecies determined in garlic and onions are selenomethionine, γ-glutamyl-Se-methyl-seleno-cysteine, and seleno-methyl-Se-cysteine. Presented here is the comprehensive speciation of organic and inorganic selenium species via liquid chromatography-particle beam/mass spectrometry (LC-PB/MS), using electron impact (EI) and glow discharge (GD) ionization sources, toward the analysis and characterization of selenium-enriched green onion.

The separations of the selenium species from standard solutions (sodium selenate, sodium selenite, selenomethionine, selenocystine, and Se-methyl-selenocysteine) and from a selenium-enriched green onion extract were carried out by reversed-phase chromatography using a gradient elution system of water-methanol-trifluoroacetic acid with an analysis time of less than 20 minutes. Initially, the course of the separation was followed by measuring the absorbance at 210 and 254 nm. The EI source parameters (electron energy and source block temperature) and the GD source parameters (discharge pressure and current) were evaluated to determine the optimal operating conditions by monitoring the analyte intensities and fragmentation patterns. The analytes eluting from the chromatographic column undergo nebulization, followed by aerosol desolvation and ionization of the dry analyte particles reaching the ionization source. The GD and EI processes yield mass spectra that reflect the chemical species eluting from the column. The GD ion source provides EI-like molecular fragmentation of the eluting compounds, allowing spectral library comparison (when available). Analytical response characteristics were obtained for each of the selenium species, and detection limits at the single-nanogram level were achieved. The LC-PB/MS approach with the versatile, interchangeable EI/GD sources provides comprehensive speciation analysis of the selenium species and is therefore believed to be applicable to fundamental metabolic studies.

Assessment of a New Analytical Approach for Real-Time Measurement of Glutamate and Nitric Oxide Interactions In Vivo

Ian N. Acworth, John Waraska, Michael Weber,  Siripan Pharranarudee, and Timothy J. Maher

ESA Biosciences, Inc., 22 Alpha Road, Chelsmford, MA 01824, USA

Although microdialysis perfusion is a routine technique for studying many analytes in the extracellular fluid (ECF), it suffers from two important drawbacks. It has poor temporal resolution (typically several minutes), usually dictated by the analysis time, and it is limited to those compounds that are not actively metabolized in the extracellular space and remain fairly stable during the collection period, making it difficult to measure molecules such as nitric oxide.

Recently, a novel instrument (BioStat) using digital signal processing technology was coupled with the newest generation of analyte-specific implantable electrodes to provide an approach with high temporal resolution and specificity.

Presented here is an assessment of such a system. Electrodes specific for either NO (100 and 600 micron tip diameters, 2 mm active length, hydrophobic membrane type, held at +865 mV relative to Ag/AgCl) or glutamate (180 micron tip diameter, 1 mm active length, biosensor type, held at +600 mV relative to Ag/AgCl) were implanted in the hippocampus (from Bregma (mm): AP −5.0; LR 5.0: DV −7.0) of chloral hydrate anesthetized rats. Under in vitro conditions, sensors were shown to remain stable (2% drift over 3 hours and 12% over 4.5 hours) for more than 12 hours. Once implanted, sensor response remained stable for the duration of experiments (2-3 hours). Sensitivity in vitro was approximately 262 pAmps/nM for NO and 245 pAmps/microM for glutamate.

The response at the nitric oxide electrode was shown to decrease following IP injection of the nNOS-specific inhibitor 7-nitroindazole (200 mg/kg in corn oil) by 21% after 42 minutes and 77% after a second administration (100 mg/kg). No effect on the NO electrode response was seen from IP and IV administration of L-Arg or L-NAME. The glutamate sensor showed a low and less than expected change in response following IP administration of kainic acid (13 mg/kg). Interaction between hippocampal NO and glutamate was further examined pharmacologically.

Biomolecular Interaction Analysis by Using QCM with Separate Monitoring System of Mass Load and Viscous Load

Tomomitsu Ozeki, Yukiko Suzuki, and Atsushi Itoh

ULVAC, Inc., 2500 Hagisono Chigasaki, Kanagawa 253-8543, Japan

The piezoelectric quartz crystal microbalance (QCM) is a highly sensitive device for measuring mass on a surface, and has been used as a biosensor in aqueous solutions owing to its label-free, real-time monitoring capability. Applications of QCM instruments cover a wide range of research fields, such as protein-protein interactions, protein-DNA interactions, material-biomolecule interactions, and material evaluations. In some cases (e.g., protein samples dissolved in a highly viscous solution such as glycerol), viscous loading significantly affects the mass-load signal of protein adsorption on a QCM. Recently, we succeeded in developing a system that monitors mass load and viscous load separately by using the F2 frequency measurement of the QCM. In this study, we demonstrate several applications of this system, including adsorption of proteins dissolved in viscous solution, adsorption of molecules dissolved in ethanol, cation-dependent DNA conformation change, and temperature-dependent conformation change of polymer materials. Reactions that were difficult to analyze with a conventional QCM system could be analyzed with this approach; these results expand the possibilities of QCM instruments as biosensors (see Figure 7).

Flow Injection Chemiluminescence Determination of Hydroquinone

Mohammad Reza Baezzat

Fars Research Center, Boolvar Mirzaee Shirazi, Aryan Street, Shiraz, Fars 71877-54531, Iran

A new, sensitive, and selective flow injection chemiluminescence method for the determination of hydroquinone over the range of 1.1 × 10⁻⁷ to 5 × 10⁻⁶ M will be described. The method is based on the chemiluminescence emission produced during the oxidation of hydroquinone by potassium persulfate in alkaline medium. Interferences were considered; some cations increase or decrease the chemiluminescence intensity of the reaction. Method development included optimization of reagent concentrations and flow conditions. The detection limit is 4.5 × 10⁻⁸ M. The method is simple, fast, selective, and precise.

Mark E. Benvenuti, Aisling O'Connor, Alice Di Gioia, and Peter Lee

Waters Corporation, 34 Maple St., Milford, MA 01757, USA

The recent pet food contamination incident in North America highlights the need for conclusive and rapid analyses of melamine and its metabolites (ammeline, ammelide, and cyanuric acid) in pet food, animal feed, and tissue samples. As documented in the Washington Post of May 7, 2007, an unknown number of cats and dogs in the USA became ill or died from eating certain brands of pet food. This resulted in the recall of millions of pounds of pet food. The formation of sharp melamine-cyanuric acid crystals in the kidneys of animals that consumed the tainted pet food was found to be the probable cause of illness, in some cases leading to death. This outbreak has fueled the latest public outcry for increased, accurate, and rapid analytical food safety testing among manufacturers and government regulatory agencies. Confirming the widespread nature of this contamination, the U.S. Food and Drug Administration (FDA) reported that melamine was found in wheat gluten and rice protein concentrate in the USA, all imported from China and intended for use in pet food. Melamine contamination has also been found in animal feed, causing concern about migration of these products into the human food supply.

Here, we show the use of Ultra Performance LC with PDA and MS detection to quantitate melamine and the associated compounds ammeline, ammelide, and cyanuric acid, with a run time of less than 2 minutes. A simple sample preparation procedure applicable to many pet food matrices is illustrated.

Optimization of a Headspace-SPME-GC-ECD for the Determination of Chloroanisoles in Wine and Corks

Alfredo Lo Balbo, Mariano Gotelli, Luciano Signorini, and Carlos Gotelli

Centro de Investigaciones Toxicológicas, Av. Juan B. Alberdi 2986, Ciudad De Buenos Aires 1406, Argentina

“Cork taint”—a musty-mouldy off-odor—represents one of the most serious problems in the wine industry. 2,4,6-Trichloroanisole (TCA), along with other compounds, is known to be responsible for this effect. The wine industry loses about $100 million annually in the USA alone due to TCA problems, and the problem could amount to billions of dollars worldwide. The present work therefore reports an optimized method based on headspace solid phase microextraction (HS-SPME) followed by gas chromatographic (GC) separation and electron capture detection (ECD) for rapid TCA determination. The method was validated after a lawsuit was filed against a stopper manufacturing firm, claiming that agglomerate corks had ruined 80,000 liters of red wine with a retail value of more than $2,500,000.

The best analytical conditions were obtained using 15 minutes of incubation (60°C) with a 100 μm polydimethylsiloxane (PDMS) fiber and 10 minutes of extraction time (250°C), followed by GC separation on a DB-Wax column. The quantification limits were 4 ng/g for cork and 1 ng/L for wine. The optimized method showed good sample throughput, and TCA values ranging from 23 to 294 ng/L were obtained in the samples (the aroma detection threshold of TCA in wine has been determined to range from 1.4 to 4 ng/L, depending on wine type).

This method is a real solution for screening of oak chips/shavings prior to use, and may reduce the risk of wines becoming contaminated with chloroanisoles.

Vitamin C Analysis by Automated Discrete Technology

Melanie Geaslin, Dave Glutz, and Larry Anderson

EST Analytical, 503 Commercial Drive, Fairfield, OH 45014, USA

As a nutrient, the daily requirement for vitamin C (L-ascorbate or ascorbic acid) is a matter of ongoing debate. However, it is an important nutrient for higher primates, playing an essential role as a cofactor for some enzymatic metabolic reactions and also acting as an important antioxidant protecting against oxidative stress. Ascorbic acid is manufactured internally by almost all organisms except humans, so it is an essential part of the human diet and has become one of the most widely used additives in the food and beverage industries. Nutritional labeling requirements state that vitamin C content must be declared as the percentage of the daily requirement contained in the product per serving, which requires manufacturers to quantitatively analyze the level contained in or added to their products. Historically, this determination was performed by high-performance liquid chromatography or by a less accurate titration.

Ascorbic acid can now be quickly and accurately determined in a variety of juice products and food matrices utilizing discrete technology and an enzymatic process that selectively isolates L-ascorbic acid. In the first step, ascorbic acid and other reducing agents (X-H2) are assayed quantitatively through a rapid, five-minute reduction of the tetrazolium salt MTT to a formazan in the presence of an electron carrier. The formazan is recorded photometrically in the visible range at 578 nm. The second step isolates L-ascorbic acid by an enzymatic reaction that selectively removes the ascorbic acid from the sample, after which another photometric reading is taken and subtracted from the total reducing substances.
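
A small worked illustration of the two-reading subtraction described above is sketched below; the absorbance readings and the calibration factor are hypothetical values chosen only to show the arithmetic.

```python
# Hypothetical worked example of the two-reading subtraction described above.
A_total = 0.482          # absorbance at 578 nm: all reducing substances (formazan)
A_after_oxidase = 0.121  # absorbance after L-ascorbic acid is removed enzymatically
calib = 210.0            # mg/L of ascorbic acid per absorbance unit (from standards)

ascorbic_acid_mg_per_L = (A_total - A_after_oxidase) * calib
print(f"L-ascorbic acid: {ascorbic_acid_mg_per_L:.0f} mg/L")  # ~76 mg/L
```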

The completely automated analysis occurs in less than 17 minutes with little to no sample preparation. Six to twelve reactions occur simultaneously as the instrument utilizes micro 12-cell cuvettes and is a continuous feed system.

Enhanced Flavor Characterization of Food Samples through Highly Sensitive Automated Head Space Analysis

Daniela Cavagnino, Fausto Munari, and Andrea Cadoppi

Thermo Fisher Scientific, Strada Rivoltana, 20090 Rodano, Italy

Different analytical methods and instruments for the determination of volatile components in foods and beverages are available, and many of them are based on reconcentration of the headspace before injection into a capillary column. The aim is to increase the sensitivity for the minor components, usually the most interesting ones, for the characterization of the volatile fraction of food samples. The volatile compounds responsible for the typical aroma of foods often characterize the quality profile of the product, and their identification and quantitation allow possible adulteration or contamination to be determined.

To detect and recognize the volatile components present in the matrix, suitable instrumentation is required for reproducible headspace transfer to the analytical column and for appropriate detection. A multidetection system (MS and GC detectors) is advisable for selective identification of the targets.

A deeper insight into a complex flavor composition is also possible by coupling the headspace sampling technique with a comprehensive two-dimensional gas chromatographic system (GC×GC), taking advantage of its enhanced separation power.

The presentation will demonstrate the benefits of using a robotic autosampler both for the injection of large amounts of headspace vapor and for handling the solid phase microextraction (SPME) device for automated extraction and reconcentration of the volatile components.

To maximize the sensitivity and the chromatographic performance, both sampling techniques have been coupled with a Cold Trap placed inside the GC oven in order to reconcentrate and reinject the volatile components in a narrow band, permitting the use of highly efficient narrow bore capillary columns.

The presentation will describe the chromatographic systems and methods used for the quality characterization of edible oils and beverages, showing examples of head space analysis of drinking water and white wine.

In Vivo Testing of the Ionophore-Based Sliver Sensor for Real-Time Monitoring of Metabolic Status

Miklos Gratzl and Sumitha P. Nair

Department of Biomedical Engineering, Case Western Reserve University, Wickenden 425, 10900 Euclid Avenue, Cleveland, OH 44106, USA

The sliver sensor consists of a miniature array of sensing capsules that change color in response to concentrations of different analytes such as H+, Na+, K+, glucose, and potentially other metabolites. The sliver resides in the interstitial fluid in the dermis, where it remains visible from outside the skin. Extensive in vitro testing has shown that the device responds to sudden concentration changes within 5 minutes and that diffuse reflectance measured with a CCD camera can be used to determine the respective concentrations. Initial in vivo studies proved that the sliver sensor is biocompatible for at least a month. In this work, we have tested whether the color of the sensing capsules in the dermis can be reconstructed from CCD recordings made outside the skin. Results of in vivo tests on color reconstruction and concentration determination will be reported in this presentation.

Intelligent and Reliable Identification of Chromatographic Peaks

Ludovic Debusschere and Graham Shelver

Varian Data Systems, 1 Rue Hector Berlioz, Parc D'activités Des Plans, Fontaine, Isere 38600, France

Column aging as well as other instrumental and chemical effects can cause chromatographic retention times to vary from their original values. Various approaches have been used to compensate for these effects including changing instrument parameters and using reference peaks to adjust the target retention times. None of these conventional approaches can adjust both the retention time targets and timed peak processing events on the chromatogram where the deviations occur, and some involve modifications of experimental conditions that can cause unexpected deviations from optimum separation conditions.

Galaxie's SmartTimeUpdate is a novel approach, applicable to both gas chromatography (GC) and liquid chromatography (HPLC), which uses traditional reference peaks to automatically correct both retention times and peak-processing timed events without any need to alter instrument parameters. When reference peak times have shifted, both the timed events table and the peak detection table are automatically modified, resulting in peak-processing parameters that can accurately process the new chromatograms. In Figure 8, when the chromatogram retention times have shifted, the original event times are incorrect and the corrected times produce proper peak processing. Systematic changes in retention times can be tracked as systematic alterations in peak-processing events. If normal retention time variations are of the same order of magnitude as the systematic changes, SmartTimeUpdate can be scaled to shift peak times and timed events by a fraction of the measured change, thereby preventing overcompensation of the time shift.
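
For illustration, a minimal sketch of this kind of reference-peak-based scaling is given below; it is not the Galaxie SmartTimeUpdate implementation, and the retention times, timed events, and scaling factor are hypothetical.

```python
# Illustrative sketch (not the Galaxie implementation): shift expected retention
# times and timed events by a fraction of the measured reference-peak drift.
# All retention times, events, and the scaling factor are hypothetical.
def corrected_times(times, measured_ref, expected_ref, scale=0.5):
    """Shift each time by `scale` times the observed reference-peak drift."""
    drift = measured_ref - expected_ref
    return [round(t + scale * drift, 3) for t in times]

expected_ref = 6.40                        # min, reference peak in the method
measured_ref = 6.72                        # min, reference peak in the new run
peak_table   = [2.10, 4.35, 6.40, 9.80]    # expected retention times (min)
timed_events = [1.50, 5.90, 8.50]          # e.g., integration on/off events (min)

print(corrected_times(peak_table, measured_ref, expected_ref))
print(corrected_times(timed_events, measured_ref, expected_ref))
```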

Combined with OnPeak events, which start and stop peak-processing events when peaks are detected and thereby compensate for peak shape changes, Galaxie with SmartTimeUpdate can compensate for changes in chromatographic peak characteristics.

Advantages of Conducting a Laboratory Automation Needs Assessment

Christine Paszko

Accelerated Technology Laboratories, Inc., 496 Holly Grove School Road, West End, NC 27376, USA

Many laboratories could greatly benefit from a laboratory automation and laboratory information management system (LIMS) needs assessment. Subject matter experts trained in laboratory automation and LIMS can provide an unbiased view of a laboratory's current operations and make suggestions for process and automation improvements. In many laboratories, the in-house team members may be too close to the process or may not be familiar with the various tools and technologies that are available or with the best approach to deployment. Subject matter experts have first-hand experience with the automation tools and techniques and an understanding of the return on investment (ROI) for many of the new technologies, and can help laboratories decide on the best approach to remain competitive and to be leaders in their market segment.

Technologies include hardware and software advancements from virtual servers to wireless data transmission or scanning, automated label printing, instrument integration, accessing data securely in real time via the Internet, automated reporting, automated quality control alerts, automated scheduling, automatic e-mail alerts, SPC chart reports, pdf reports, electronic data deliverables (EDDs), and many additional automation tools and technologies.

This talk will focus on the key factors of performing a laboratory automation evaluation (which utilizes checklists, interviews, and a review of the current infrastructure) and on the final output from such an evaluation, which consists of a final laboratory assessment report. This final report will serve as the basis for the laboratory's evaluation of the recommendations of the subject matter expert. It can also serve as the automation blueprint for increasing laboratory productivity, efficiency, data quality, and, in some industries, regulatory compliance.

How Secure Is Your Data?

Robert Jackson

CSols, Inc., 131 Continental Drive, Suite 303, Newark, DE 19713, USA

How secure is the data on your PC, on your corporate network, on a laptop, on a USB drive, or burned to CD/DVD? When you receive data in electronic form by post, shipping service, e-mail, and so forth, who really sent it? Has it been read or modified in transit? How do you protect the data you send out so that it cannot be read or modified except by the intended recipient?

In this presentation, the author will discuss how encryption technology can be used to answer these questions. The presentation will discuss the fundamentals of selected encryption techniques, and will cover commercially available and open source software tools to implement data security in the laboratory.

Advanced Automated Reporting Techniques

Caroline Bright

National Instruments, 11500 Mopac, Austin, TX 78759, USA

In many laboratory systems, the reporting element can be an afterthought of the initial design. This can lead to poor reporting of the data or large amounts of time dedicated to the manual creation of reports. We make a large investment in the tools needed to collect data and results, and therefore we should also invest in clear reporting of those results.

This session will cover techniques and tools for reporting data collected in the lab. We will look at different approaches and tools that can be used to automate the reporting of results as well as covering how to make templates in commonly used formats for sharing results with other interested parties. By taking the time to implement an automated system for professional reporting, you can make the most of the investment in your data.

Identification Method for Potential Peptide Biomarkers from Sub-Micro-Liter Biological Samples Using MALDI-TOF-MS

Kyaw T. Myasein, Jose S. Pulido, and Scott A. Shippy

Department of Chemistry, University of Illinois, 845 W. Taylor St., Rochester, IL 60607, USA

The identification of disease biomarkers may be helpful in understanding pathological mechanisms or for diagnostics. One of the most powerful methods for peptide and protein biomarker detection is mass spectrometry. Recently, we presented a technique for enhancing peptide detection by MALDI-TOF-MS from biological fluids using a sub-micro-liter dialysis device and a microspotting technique. However, the discovery of possible biomarkers is a time-consuming task due to the complexity of the resulting mass spectra. To address this complexity, a MATLAB algorithm was developed for finding more or less abundant peptide peaks in mass spectra of vitreous collected from control or experimental patients. Mass spectra of 500 nL human vitreous samples from patients with a diabetes-related eye disorder (proliferative diabetic retinopathy, PDR) and from epiretinal membrane (control) patients were collected following protein removal with dialysis and microspot blotting. The spectra were aligned using SpectAlign (version 2.3). The MATLAB algorithm subtracts control spectra from disease spectra, giving m × n difference spectra, where m is the number of disease spectra and n is the number of control spectra. More abundant peptides appear as peaks, and less abundant peptides appear as dips, in the difference mass spectra. The input spectra were divided into mass windows of varying width, depending on the MS resolution, to give approximately one peak per window. Each window was counted as a peak or a dip across all possible difference spectra, and the counts were displayed as a frequency histogram. Criteria are applied to the frequency histogram to determine potential up- and downregulated peptides. After analyzing mass spectra from 9 PDR samples and 9 controls, 10 peaks were found to be more abundant relative to controls, and 4 peptide peaks were found to be less abundant, in more than 50% of the difference spectra. The MATLAB result was confirmed by individual analysis of the spectra. Overall, this algorithm demonstrates the ability to readily extract possible peptide biomarkers from complex MALDI mass spectra.
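
A minimal sketch of the difference-spectrum bookkeeping described above is given below. It is written in Python rather than the MATLAB used in the study, assumes the spectra have already been aligned onto a common axis, and uses hypothetical window and threshold settings.

```python
# Python sketch of the difference-spectrum bookkeeping (the study used MATLAB).
# Spectra are assumed to be pre-aligned on a common axis; the window width and
# intensity threshold below are hypothetical.
import numpy as np

def difference_histogram(disease, control, window=50, threshold=3.0):
    """disease: (m, p) array; control: (n, p) array; p points per aligned spectrum."""
    p = disease.shape[1]
    n_windows = p // window
    up = np.zeros(n_windows, dtype=int)    # windows flagged as more abundant
    down = np.zeros(n_windows, dtype=int)  # windows flagged as less abundant
    for d in disease:
        for c in control:
            diff = d - c                   # one of the m x n difference spectra
            for w in range(n_windows):
                seg = diff[w * window:(w + 1) * window]
                if seg.max() > threshold:
                    up[w] += 1
                if seg.min() < -threshold:
                    down[w] += 1
    return up, down                        # frequency counts over the m*n differences

# usage with random stand-in data: 9 "disease" and 9 "control" spectra, 5000 points each
rng = np.random.default_rng(0)
up, down = difference_histogram(rng.normal(size=(9, 5000)), rng.normal(size=(9, 5000)))
print(up[:10], down[:10])
```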

Deconvolving GC-MS Data Using Functional Data Methods

H. Dennis Tolley, James R. Oliphant, Chad B. Grant,  Edgar D. Lee, and Milton L. Lee

Department of Statistics, Brigham Young University, 206 Tmcb, Provo, UT 84602-5700, USA

The value of using tabulated GC retention data and mass spectral libraries for determining the presence of target compounds in a sample (i.e., detection) is well established. Isolation of spectra not present in the target mass spectral library, for nontarget compounds present in the sample, is much less well developed. This paper starts by viewing the problem of spectral deconvolution and isolation as a statistical hypothesis testing problem in a general function space with a defined inner product. Placed in this context, isolation of unspecified spectra resembles a singular value decomposition in the function space, similar to principal component analysis. In this framework, machine-to-machine variability, sample-to-sample variability, and scan-to-scan variability enter into the methods of detection and identification in a consistent probabilistic manner. This paper presents the basics of this approach and illustrates its application to chemical agent simulants and toxic industrial chemicals. The resulting methodology is compared with commonly used deconvolution methods. We emphasize the ability of the method to isolate unspecified mass spectra in the presence of known library mass spectra for further examination.
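
As a rough linear-algebra analogue of the idea described above (not the authors' implementation), the sketch below projects observed scans onto the span of known library spectra and applies an SVD to the residuals to expose a candidate nontarget spectrum; all matrices are random stand-ins.

```python
# Rough linear-algebra analogue (not the authors' implementation): project observed
# scans onto the span of known library spectra and examine the residuals with an
# SVD to expose a candidate nontarget spectrum. All matrices are random stand-ins.
import numpy as np

rng = np.random.default_rng(1)
L = rng.random((200, 5))      # 5 known library spectra, 200 m/z channels each
X = rng.random((200, 40))     # 40 observed scans across a chromatographic peak

P = L @ np.linalg.pinv(L)     # least-squares projector onto span(L)
residual = X - P @ X          # the part of the data the library cannot explain

U, s, Vt = np.linalg.svd(residual, full_matrices=False)
candidate_spectrum = U[:, 0]  # dominant residual component ~ candidate unknown spectrum
print("leading residual singular values:", np.round(s[:3], 3))
```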

Efficient Use of Peptide Mapping for Characterizing Native Protein Structure and Structural Variants

Beth Gillece-Castro, Jo-Ann M. Jablonski,  Thomas E. Wheat, and Diane M. Diehl

Waters Corporation, 34 Maple Street, Milford, MA 01757, USA

High-resolution chromatography, accurate-mass LC-MS, and software tools have been combined to more efficiently correlate peptide maps with protein structure. HPLC is the established first step in deducing structure from peptide fragments because differences in sequence yield chromatographically separate peaks. The resolution can be enhanced by applying UPLC separation principles to the peptide mixtures, and HPLC and UPLC peptide maps will be compared. The optimized chromatography still requires confirmation of peak identity and purity, so it is useful to couple the separation to exact mass measurements, which is possible with an oa-TOF mass spectrometer. Peptides can be identified based on molecular weight, and coelutions can be detected. This additional information links the chromatographic pattern to the structure of the protein. At the same time, the comparison of the HPLC and UPLC separations shows an improvement in the utility of the MS because the peptide spectra are simpler and easier to interpret when more completely resolved samples are introduced into the MS. Complete interpretation of complex LC/MS chromatograms with accurate mass measurement is time-consuming and labor-intensive. New specialized software has been developed for these large datasets. Peaks are detected by the Apex3D algorithm, which deconvolutes multiply charged ions and combines isotopes. These processed data are matched to the structural features of the proteins with rigorous comparison and search algorithms. The combination of UPLC, orthogonal acceleration time-of-flight mass spectrometry (oa-TOF MS), and advanced software acts synergistically to improve the interpretation of peptide maps.
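
For readers unfamiliar with charge-state deconvolution, the small sketch below shows the arithmetic of collapsing multiply charged peptide ions to a common neutral monoisotopic mass; the m/z values are hypothetical, and the Apex3D peak detection and isotope handling mentioned above are not reproduced here.

```python
# The arithmetic behind collapsing multiply charged peptide ions to a neutral
# monoisotopic mass; the m/z values are hypothetical. (Apex3D itself additionally
# performs peak detection and isotope clustering.)
PROTON = 1.007276

def neutral_mass(mz, z):
    """Neutral mass from an observed m/z of an [M + zH]z+ ion."""
    return z * mz - z * PROTON

# the same hypothetical peptide observed at two charge states
print(neutral_mass(785.84, 2))  # ~1569.67
print(neutral_mass(524.23, 3))  # ~1569.67
```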

Machine Learning Techniques in Mass Spectrometry: Automated Interpretation of Mass Spectra in Pharmaceutical Analysis

Csaba Peltz

Egis Pharmaceuticals Ltd., Spectroscopic Department, Kereszturi Ut 30-38, Budapest H-1106, Hungary

Automation of mass spectrum interpretation is essential to achieve high-throughput qualitative analysis by mass spectrometry-coupled chromatography. The depth of the spectrum interpretation is highly correlated with the accuracy of the classification. Nowadays, data-dependent acquisition techniques allow users to obtain tandem mass spectra automatically, in addition to molecular ion information, in HPLC-MS. Unfortunately, MS/MS spectral libraries have severe limitations and are essentially unusable when analyzing new compounds.

HPLC-MS/MS spectra—especially those obtained on ion trap instruments—show a rather simplified pattern compared to classical electron impact spectra, mainly due to much lower excitation energies and the dominance of closed-shell ions in the fragmentation processes. Characteristic fragment peaks within a given family of drug-like small molecules make human interpretation of the mass spectra easy and straightforward. The aim of the present work is to provide an algorithm and a software tool that perform a human-like interpretation of the mass spectral peaks of small molecules, so that they can be embedded in an automated qualitative analysis process in support of pharmaceutical preparative chemistry.

The input of the algorithm consists of pairs of structures and mass spectra, while the output is a series of numbers representing the likelihood of the matches. The human interpretation is imitated through the combined use of an automated probabilistic rule induction technique and small artificial neural networks. An abstract definition of the fragmentation rules is provided. The rule induction part extracts fragmentation rules from positive training samples within the studied compound families, while the neural network is trained to achieve a reasonable classification of the positive and negative “hits.” An example of the above algorithm is presented, along with its application in routine pharmaceutical qualitative analysis.
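
A minimal sketch of the scoring idea, under the assumption that binary rule-match features feed a small feed-forward network, is given below; the neutral-loss rules, the example spectrum, and the network weights are hypothetical stand-ins rather than the trained system described in the abstract.

```python
# Hypothetical sketch of the scoring idea: binary features record which neutral-loss
# rules a spectrum satisfies, and a small feed-forward network turns them into a
# match likelihood. Rules, spectrum, and weights are invented stand-ins, not the
# trained system described in the abstract.
import numpy as np

def rule_features(spectrum, rules):
    """1 if the rule's neutral loss from the precursor appears among the fragments,
    else 0. (The real system also evaluates rules against the candidate structure.)"""
    precursor = spectrum["precursor_mz"]
    peaks = spectrum["fragment_mz"]
    return np.array([
        1.0 if any(abs((precursor - loss) - mz) < 0.3 for mz in peaks) else 0.0
        for loss in rules
    ])

def match_likelihood(x, W1, b1, w2, b2):
    """One hidden layer of tanh units followed by a sigmoid output."""
    h = np.tanh(W1 @ x + b1)
    return 1.0 / (1.0 + np.exp(-(w2 @ h + b2)))

rules = [18.011, 17.027, 46.005]  # e.g., losses of H2O, NH3, HCOOH
spectrum = {"precursor_mz": 286.14, "fragment_mz": [268.13, 240.10, 145.06]}

x = rule_features(spectrum, rules)
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(4, 3)), np.zeros(4)   # hypothetical (untrained) weights
w2, b2 = rng.normal(size=4), 0.0
print(f"match likelihood: {match_likelihood(x, W1, b1, w2, b2):.2f}")
```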

Laboratory Informatics System Deployment Decisions: What Is Best for You?

Mark Parrish

CSols, Inc., 131 Continental Drive, Suite 303, Newark, DE 19713, USA

Many laboratory information systems (e.g., LIMS and chromatography data systems) are classically deployed using a client/server paradigm. Recently, vendors of many laboratory informatics software products have offered deployment choices ranging from this classical model to newer options, including terminal services (Citrix) and Web-based (Internet Explorer) services. These technologies provide different levels of connectivity and different advantages relative to one another.

In this presentation, the author will discuss the benefits and drawbacks of each deployment choice and will explore the feasibility of single deployment choices versus multiple choice deployments.

New Functionality in Software for Lab Managers

Richard Hall

TimeKeeper America, P.O. Box 6991, Hudson, FL 34674-6991, USA

Laboratories have huge investments in their assets: facility, staff, instruments, and equipment. These assets must be maintained in a fashion that makes them always ready, that is, ready to go online and support the process or to analyze it. In this vein, established proactive maintenance programs are imperative. Proactive maintenance programs also fill compliance requirements of licensing agencies.

TimeKeeper America (TKA) software is an asset management tool. TKA provides demonstrations of compliance to an auditor. Proactive maintenance monitoring programs are a NELAP requirement; they help ensure that, when samples are logged into a laboratory, the facility, the staff, and the instruments and equipment are fully prepared for analysis. This in turn helps ensure that the data generated will be highly reliable. TKA also contains a demonstration-of-capability section. This subprogram can also be used to demonstrate calibration of equipment such as balances, viscometers, and autopipetters. Another feature of TKA is a chemical inventory section.

Unique features of the program include a “look-ahead scheduler.” Color-coded pop-up messages give advance notice when maintenance actions are coming due. When a maintenance action is to be performed, clicking on the pop-up message links to the SOP or the manufacturer's maintenance manual. Completed maintenance actions are sent to a secure report generator. Reports are acceptable as legal exhibits.

LIMS-type software tracks samples from when they are acquired through to the final data stream and report generation. TKA completes the other half of the circle: it tracks the facility, staff, and instruments to ensure that samples will be given the highest-quality analysis available.

Impurity Identification: More Information and Fast Decision-Making with Increased Sampling and Control

Mike S. Lee, Kenneth C. Lewis, and Steven M. Fischer

Milestone Development Services, P.O. Box 178, 7 Snowdrop Place, Newtown, PA 18940-0178, USA

Industrial endeavors that involve the identification of impurities are increasingly challenged to shorten development timelines. New analytical tools are constantly investigated and evaluated to provide improvements in efficiency and productivity. The combination of high-performance liquid chromatography and mass spectrometry (LC/MS) continues to be a valuable tool for the identification of trace-level impurities. Recent improvements in resolution in chromatography and mass analysis formats have generated new specifications for performance and considerations for application. Sensitivity and selectivity remain key analytical figures of merit for LC/MS platforms dedicated to trace-mixture analysis. Faster cycle times (chromatographic and mass) provide the ability to generate higher-quality data and more information in less time. Shorter analysis times provide opportunities to improve existing method development strategies, and lead to significant cost savings. Furthermore, increased levels of automated software control provide capabilities for high-throughput analysis and facile report generation. Results that demonstrate advances in LC/MS instrumentation for impurity identification will be presented. Faster analysis times, improved method development strategies, and increased flexibility will be discussed.

Evaluation of the Importance of Accurate Mass, Mass Resolution, and Dynamic Range for Impurity Profiling Applications Using Multistage Mass Spectrometry

David A. Weil, Patrick Perkins, and Michael Zumwalt

Agilent Technologies, 10 North Martingale Road, Suite 550, Schaumburg, IL 60173, USA

The importance of mass accuracy, mass resolution, and wide in-scan dynamic range for MS and multistage MS/MS data is demonstrated by reviewing results obtained using a new quadrupole time-of-flight mass spectrometer (QTOF) with unique 4 GHz analog-to-digital acquisition electronics. Coelution of components and ion suppression can inhibit the detection of trace impurities present in complex mixtures. Maximizing chromatographic resolution using a rapid resolution high-throughput Eclipse Plus C18 column (1.8 micron particle size) with a rapid resolution LC produces narrow peak widths (seconds). In very complex mixtures, however, compounds with the same nominal mass may still be present at the same retention time. The new 4 GHz analog-to-digital electronics, along with new peak detection software, enable these isobaric species to be better resolved by increasing the resolving power to 10,000 (for m/z > 100). Low-level impurities that co-elute with high-level components can now be detected over a concentration range of more than 5 orders of magnitude using a new preamplifier design. Compounds present in the complex mixtures were automatically identified using an algorithm known as molecular feature extraction. Sample comparison was completed using another algorithm, known as mass profiler, that makes a statistical comparison of two samples. Results showing mass accuracy over a wide in-scan dynamic range, for both MS and MS/MS spectra, will be shown for several samples.

Reaction Monitoring and Impurity Analysis for Drug Substance Synthesis

Heewon Lee

Boehringer Ingelheim, 900 Ridgebury Road, Ridgefield, CT 06877, USA

Impurity profiling during drug substance development is critical for its safety, efficacy, purity, stability, and quality. When unknown process-related impurities arise, impurity identification plays an important role in deeper understanding of the process and consequently improving the quality of the drug substance. Genotoxic impurities pose significant analytical challenges. In this presentation, case studies of impurity profiling, identification, and control are reported using chromatographic separation (high-performance liquid chromatography and gas chromatography) interfaced with mass spectrometry for active pharmaceutical ingredient (API) development.

How to Justify the Purchase of Very Expensive Instruments

Phil Edwards

NOVA Chemicals, 3620-32Nd Street N.E., Calgary, Alberta, Canada T1Y 6G7

In these days of tight budgets and the need to provide more laboratory support with fewer resources, it has become a challenge for the lab manager to justify expensive laboratory instruments. This presentation will include a review of various types of expensive laboratory instruments, the advantages and disadvantages of purchasing them, alternatives to purchasing them, and the approach that must be taken to justify the purchase of expensive instruments.

The Identification, Confirmation, and Quantification of Allergens in Foods

John H. Callahan, Kevin J. Shefcheck, and Steven M. Musser

Center for Food Safety and Applied Nutrition (CFSAN), US Food and Drug Administration (US FDA), Hfs-707 Spectroscopy and Mass Spectrometry Branch, 5100 Paint Branch Parkway, College Park, MD 20740, USA

The confirmation, detection, and quantification of protein allergens in food are important issues for food safety. Measurements in complex food matrices necessitate approaches in which the target proteins are measured in the presence of complex matrices that also contain proteins. ELISA-based methods are typically used to measure allergens in food, and these can be sensitive and specific. However, the possibility of cross-reactivity with related proteins and resultant false positive identification requires that other confirmatory techniques be available for measuring allergens. Analytical methods that yield a direct molecular signature of the target allergen protein, such as mass spectrometry, are necessary for confirmation. Several mass spectrometric approaches have been developed to address the measurement of protein allergens in food, including methods that fractionate and then enzymatically digest the target protein into characteristic peptides, followed by liquid chromatography/tandem mass spectrometry to fragment and confirm the peptide sequence. This approach can be used to definitively identify the allergen protein. MALDI-MS/MS can also be used for confirmation. This approach is not universally applicable, however, due to the nature of some food matrices and their interference in the fractionation/digestion process. Alternative approaches under development include the use of immunoaffinity methods to extract the target analyte from the food matrix, as would be done in ELISA, followed by elution of the protein from the antibody and analysis of the protein by digestion and LC/MS/MS. The extension of all of these methods to true quantification is a difficult step, due to the need for internal standards. We have been investigating approaches for standardization, which include development of 18O stable-labelled standards from matrix spikes as well as the use of homologous proteins to standardize the entire method.

Development of an Inductively Coupled Plasma/Electrospray Ionization Dual-Source Time-of-Flight Mass Spectrometer for Rapid Speciation and Metallomic Analysis

Duane A. Rogers, Steven J. Ray, and Gary M. Hieftje

Indiana University, 800 E. Kirkwood, Bloomington, IN 47405, USA

In recent years, the importance of chemical speciation, that is, the determination of an element amongst its various oxidation states and molecular or complex forms, has been recognized as more relevant than total elemental concentration. Although the total elemental concentration is often easier to evaluate, the bioavailability and toxicological information it yields are inherently limited. It has been demonstrated that chromatographic separation alone cannot provide sufficient qualitative information for native biological samples, due to the unknown nature of the sample. For this reason, researchers have begun employing two separate instruments to obtain the elemental and molecular information from such samples. However, employing multiple instruments for the analysis of a given sample has several disadvantages. In the work presented here, a single time-of-flight mass spectrometer (TOFMS) will be described that utilizes two sources to obtain comprehensive atomic and molecular information simultaneously.

The dual-source TOFMS has been designed and constructed in our laboratory. The current arrangement for the instrument utilizes inductively coupled plasma to obtain elemental, isotopic, and quantitative information. Meanwhile, an electrospray source is operated in parallel to provide molecular information. Due to the wide mass range and high spectral generation rate of TOFMS, ions from both sources can be simultaneously sampled by a single mass analyzer to provide excellent temporal resolution of transient signals, while simultaneously simplifying peak assignment from a chromatographic separation. In addition, since only a single chromatographic separation is necessary for the atomic and molecular channels, sample requirements, preparation time, and analysis time can be significantly reduced. Results will be presented demonstrating the capability to operate both sources simultaneously.

Simultaneous Multiple Element Detection by LC-PB/HC-OES

Charles D. Quarles and R. Kenneth Marcus

Clemson University, Biosystems Research Complex, 51 New Cherry St., Clemson, SC 29634, USA

Very often, the quantification of proteins in biological systems is accomplished through the analysis of metals. However, by analyzing the nonmetal elements present in proteins, a more qualitative picture can be formed. Phosphorus, in the form of phosphorylated sugars, is one of the most abundant nonmetal elements in proteins, and its determination can give insightful information about the regulation of cellular activities by looking at the extent of phosphorylation and dephosphorylation [19].

Previous work in our laboratory has focused on the detection of phosphorus (219.9 nm) and carbon (193.0 nm) in proteins using liquid chromatography-particle beam/hollow cathode-optical emission spectroscopy (LC-PB/HC-OES). This technique has proven to be a viable approach for the detection of various proteins, giving qualitative and quantitative data. The LC-PB/HC-OES system used previously consisted of a monochromator with one photomultiplier tube (PMT), requiring repeated sample analyses to gain information for multiple elements.

Our laboratory has recently developed new instrumentation for detecting multiple elements in a single sample analysis by LC-PB/HC-OES, through the use of a high-resolution JY-5000 polychromator that has 27 PMT channels (Figure 9). This technique allows simultaneous detection of nonmetal and metal elements in proteins, thereby providing the quantitative and qualitative data needed to identify specific proteins in biological systems.

Performance Optimization in Electric Field Gradient Focusing

Xuefei Sun and Milton L. Lee

Department of Chemistry and Biochemistry, Brigham Young University, Provo, UT 84602, USA

Electric field gradient focusing (EFGF) is a unique separation and preconcentration technique that depends on an electric field gradient and a dynamic counterflow to focus charged analytes, such as peptides and proteins. Recently, we have successfully developed an EFGF device completely fabricated with PEG-containing materials, including the substrate, hydrogel, and incorporated monolith, which have good resistance to protein adsorption. In EFGF, the most important issue is the establishment of a stable electric field gradient. The shape of the PEG hydrogel can be designed to produce a linear or nonlinear gradient. The shape and stability of the electric field gradient can be indirectly determined by the focused band positions of standard proteins at different counterflow rates because when the focused band reaches its focusing position, the electric field strength is proportional to the counterforce (i.e., dynamic counterflow). The effects of gradient shape on bandwidth and resolution were studied. In a nonlinear electric field gradient, the bandwidth was narrower in the steep part of the gradient; however, the resolution was improved in a shallow gradient. Influences of hydrogel characteristics, such as conductivity, ion transport and dimensions, and buffer solution composition, on EFGF performance were investigated. EFGF devices have been demonstrated to preconcentrate protein samples more than 10000-fold (see Figure 10).
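
As a simple illustration of the focusing condition implied above, the sketch below locates the position where an analyte's electrophoretic velocity balances a fixed counterflow on a hypothetical linear field gradient; all values are illustrative and are not taken from the study.

```python
# Simple illustration of the focusing condition implied above: an analyte comes to
# rest where its electrophoretic velocity balances the counterflow, so on a known
# field gradient the focal position maps to mobility. The linear gradient and all
# numbers are hypothetical, not values from the study.
def focus_position(mobility, u_counterflow, E0, slope, length):
    """x where mobility * E(x) = u_counterflow, with E(x) = E0 + slope * x."""
    x = (u_counterflow / mobility - E0) / slope
    return x if 0.0 <= x <= length else None  # None: not focused within the channel

E0, slope, length = 50.0, 25.0, 4.0  # V/cm at the inlet, (V/cm)/cm, channel length (cm)
u_cf = 2.0e-2                        # counterflow velocity (cm/s)

for mu in (1.0e-4, 2.0e-4, 4.0e-4):  # electrophoretic mobilities in cm^2/(V s)
    print(mu, focus_position(mu, u_cf, E0, slope, length))
```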

The authors thank the National Institutes of Health (Contract no. RO1 GM064547-01A1) for financial support.

The Development of Analytical Methods to Facilitate the Identification and Structure Elucidation of Heparin-Derived Oligosaccharide Substructures

Stacie L. Eldridge, Albert K. Korir, John F. Limtiaco,  Sarah M. Gutierrez, and Cynthia K. Larive

University of California, Riverside, 501 Big Springs Road, Riverside, CA 92521, USA

Heparin oligosaccharides are highly sulfated linear polysaccharides that display a wide range of biological activities through interaction with proteins, including growth factors and chemokines. Elucidation of the relationship between biological function and the microstructure of these biopolymers remains a challenge due to the complex heterogeneity of their structure. In this study, the electrophoretic mobilities of 11 heparin disaccharide standards were measured by CE and compared with group pKas determined through 1H NMR pH titration experiments. This information was used to optimize the construction of a spectral database utilizing capillary isotachophoresis coupled to NMR detection (cITP-NMR). In addition, conformational transformations induced by changes in disaccharide protonation state were also investigated. Oligosaccharides obtained by depolymerization of porcine heparin using heparin lyases were separated and analyzed by CE-UV, LC-MS, and cITP-NMR. CE-UV was used to profile the enzymatic digests, determining their complexity and providing electrophoretic identification of individual components. Structural information was obtained by conducting cITP-NMR experiments on selected digests and comparing the online spectra to the cITP-NMR database. Complementary structural information was also acquired through the development of rapid LC-MS techniques. This work will enable future studies to examine the binding interactions of heparin and heparan sulfate with proteins.

Optimization of Steady State Recycle Parameters Utilizing Polarimetry in Chiral Separations

Mark E. Crawford, Joan M. Stevens, and Michael Halvorson

Gilson Inc., 3000 Parmenter St., Middleton, WI 53562, USA

Processing gram to kilogram quantities of target analytes has led to the exploration of several high-throughput separation techniques. Among those investigated is steady state recycle (SSR). Similar to simulated moving bed (SMB) chromatography, fractions are collected from the leading and trailing edges of the chromatographic profile, while sample material is injected into the interior. Purifying large amounts of analyte at the semipreparative scale is ideal for SSR and SMB. SSR facilitates the development of methods capable of separating 50 g to kilogram quantities of product efficiently. Using polarimetry to optimize the SSR method further improves the efficiency of method development, providing comprehensive data that supports well-informed development decisions for the purification of racemic compounds. After polarimetry optimization, accurate sample injection allowed continual 99% chiral separation. We have developed an efficient SSR optimization methodology that offers rapid development of chiral separations by SSR.

Lab on Chip for Proteomics

Daojing Wang

Lawrence Berkeley National Laboratory, 1 Cyclotron Road, Mail Stop 84171, Berkeley, CA 94720, USA

Proteomics is a major component of systems biology. Mass spectrometry is the enabling technology for proteomics. To fully realize the enormous potential of lab-on-a-chip in proteomics, a major advance in interfacing microfluidics with mass spectrometry is needed. We have recently demonstrated the first monolithic integration of a microfluidic channel with multinozzle electrospray emitters via a novel silicon microfabrication process. These microfabricated monolithic multinozzle emitters (M3 emitters) can be readily mass-produced from silicon wafers. Each emitter consists of a parallel silica nozzle array protruding out from a hollow silicon sliver stem with a conduit size of 100 × 10 μm. The dimension and number of free-standing nozzles can be systematically and precisely controlled during the fabrication process. The M3 emitters showed comparable performance to that of the commercial tips in terms of stability and sensitivity for standard peptides and high molecular weight proteins. Multinozzle emitters are expected to ease the back-pressure and clogging problem that single-nozzle emitters have to cope with, especially as the channel downsizes to submicron scale. Our current work aims to further integrate the M3 emitters with front-end sample processing on a chip. New developments in the monolithic integration of sample loading and on-chip separation will be discussed.

This work is supported by National Institutes of Health, Grant no. R21GM0778701080-2.

Developing Microfluidic Chips for Coupling Capillary Electrophoresis with Matrix-Assisted Laser Desorption Ionization-Mass Spectrometry

Yiqi Luo, Songyun Xu, and Richard N. Zare

Stanford University, 450 Serra Mall, Stanford, CA 94305, USA

As a widely used powerful tool for analyzing biochemical molecules, matrix-assisted laser desorption ionization mass spectrometry (MALDI-MS) shows excellent accuracy, high sensitivity, and good stability. However, for analyzing a multicomponent complex sample, MALDI-MS signals of peptides and proteins may be suppressed by salts, detergents, and competitive analytes existing in the same sample. To overcome this drawback, the use of chromatographic techniques to separate components in complex samples is essential. Capillary electrophoresis (CE), especially free-solution CE, is rarely reported to be coupled with MALDI because the motion of analytes is driven by an electric field applied in the capillary, which makes it difficult to interface CE separation with MALDI sample plates. Therefore, to couple CE and MALDI, analytes should be fractionated after CE separation and introduced to a MALDI sample plate. To achieve these tasks, we developed a microfluidic chip prototype with active actuators. In the microfluidic chip, a row of fractionation valves is positioned above the main channel, a horizontal channel used as the capillary for CE separation. After sample loading in the double T-junction located to the left, CE separation of analytes is carried out by applying voltage across the main channel. The fractionation valves close immediately after the CE separation. Then, the microfluidic pump located on top of the main channel is actuated to pump the fractions of separated analytes down into reservoirs through the side channels connecting the main channel to the reservoirs, and the fractions of the separated analytes are then taken out of the microfluidic chip for MALDI analysis (see Figure 11).

Productivity Enhancements for Sample Preparation for Environmental Labs Using Accelerated Solvent Extraction

Bruce Richter, Sheldon Henderson, Eric Francis,  Richard Carlson, Jennifer Peterson, Brett Murphy, and Brian Dorich

Dionex, SLCTC, 1515 W. 2200 S., Suite A, Salt Lake City, UT 84119, USA

Accelerated solvent extraction (ASE) is an innovative approach to liquid-solid extraction. It is accepted under EPA Method 3545A for the extraction of conventional environmental toxins such as PCBs, dioxins, PAHs, diesel range organics, and chlorinated pesticides. ASE is a well-established technique currently used in environmental labs across the world. Using ASE, samples can be automatically extracted over several hours without user intervention. Each individual sample extraction takes place in less than 15 minutes and with small volumes of solvent. The automation of the ASE apparatus allows laboratories to improve their productivity because samples can be extracted and prepared for analysis without any input or operation from the analyst.

New techniques have been developed that enhance the productivity of ASE even more. For example, extracts can be collected in vials or bottles that are directly compatible with an evaporator apparatus. This can eliminate or minimize sample handling and the transfer of samples from one container to another. Another example is the use of selective adsorbents in the extraction cell with the sample. This procedure can produce interference-free extracts, thus eliminating the need for postextraction cleanup procedures such as gel permeation chromatography (GPC) or multiple-column cleanup. By eliminating these additional sample preparation procedures, the overall productivity of an environmental lab can be greatly enhanced. Comparisons of time and labor involvement for traditional techniques and ASE techniques will be presented. We will also present a discussion of additional means to enhance the productivity of labs beyond what is currently possible.

Automated Solid Phase Extractions of Organochlorine Pesticides from Water

Naomi Reid, Greg Jeter, and Jay Rowden

Horizon Technology, Inc., 45 Northwestern Drive, Salem, NH 03079, USA

Pesticide pollution is a subject of global concern; although many countries have now banned the use of organochlorine pesticides, these compounds can linger in the environment and contaminate water sources. Organochlorine pesticides can be toxic to animals and humans, so methods which accurately and easily quantify them are essential. A recent study analyzed the organochlorine pesticides of a standard mix consisting of alpha-, beta-, gamma-, and delta-benzene hexachloride (BHC), 4,4′-DDD, 4,4′-DDE, 4,4′-DDT, aldrin, dieldrin, endrin, endrin aldehyde, endrin ketone, endosulfan (I, II, and sulfate), heptachlor, heptachlor epoxide, and methoxychlor.

The disk uses a divinylbenzene (DVB) adsorbent and is designed and certified for automation. In the study, the extractors pass a 1 L water sample spiked with the standard through the disk to collect the analytes, which are then eluted in a small volume of solvent into a VOA vial—all in one automated procedure. The eluents were then dried and concentrated. Automation allows increased levels of productivity and precision, making the SPE disk an efficient alternative to liquid-liquid extraction, as well as to other SPE methods.

Determination of Veterinary Drug Residues in Fish by an Automated SPE-HPLC System

Joan M. Stevens, Mark E. Crawford, and Michael Halvorson

Gilson, Inc., 3000 Parmenter Street, Middleton, WI 53562, USA

The FDA is responsible for ensuring the safety of seafood; however, efficient monitoring of aquacultured fish for residues is very limited. Optimizing the preparation of the sample for analysis not only lowers the LOD but also enhances the quality of the results by lowering RSDs. The system presented is a totally automated SPE with HPLC analysis capabilities. It allows unattended sample preparation and online analysis via HPLC without manual intervention. Method performance was evaluated over several days of replicated samples of control salmon, salmon fortified with a drug mixture, and salmon dosed with a representative from several drug classes (quinolones, fluoroquinolones, macrolides, malachite green, imidazoles, tetracyclines, penicillins, and beta-lactams). The complete automated system allows for ease of use and transfer of the method between sites, minimizing downtime and the time required for validation and calibration.

SIFT-MS: A New Tool for Real-Time, High-Sensitivity, Quantitative Analysis of Flavor Compounds in Chocolate and Cheese

Daniel B. Milligan, Vaughan S. Langford, Barry J. Prince, Christine J. Reed, and Murray J. McEwan

Syft Technologies, 3 Craft Place, Christchurch 8024, New Zealand

Recent developments in selected ion flow tube mass spectrometry (SIFT-MS) have enabled the routine detection and quantification of compounds at part-per-trillion (ppt) concentrations (by volume). SIFT-MS detects trace VOCs in real time from whole-air samples by utilizing well-characterized gas phase reactions of multiple soft chemical ionization agents. The reaction conditions are precisely controlled in the flow tube, enabling VOCs not only to be detected in real time but also to be quantified absolutely based on their known ion products and reaction rate coefficients. The use of different chemical ionization agents provides multiple independent quantitative measurements of each target compound, greatly enhancing the specificity of the technique compared to other real-time MS technologies.
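The absolute quantification described above rests on simple ion-molecule kinetics. The sketch below shows this first-order relation in code form; it is a simplified illustration only (the function and variable names are assumptions, and real instruments apply further corrections, e.g., for differential ion diffusion and mass discrimination):

```python
# Simplified SIFT-MS quantification: the analyte number density in the flow tube
# follows from the product/reagent ion signal ratio, the known ion-molecule rate
# coefficient k, and the reaction time t:  [A] ~ (Ip / Ir) / (k * t).

def analyte_number_density(product_counts_per_s, reagent_counts_per_s,
                           k_cm3_per_s, reaction_time_s):
    """Return the analyte number density in molecules cm^-3 (first-order estimate)."""
    return (product_counts_per_s / reagent_counts_per_s) / (k_cm3_per_s * reaction_time_s)

# Hypothetical example: k = 3e-9 cm^3 s^-1, t = 5 ms, product/reagent signal ratio = 0.002
n_A = analyte_number_density(2_000, 1_000_000, 3e-9, 5e-3)
print(f"{n_A:.2e} molecules cm^-3")
# Converting this to a mixing ratio (e.g., ppt by volume) additionally requires the
# carrier-gas number density and the known sample dilution into the flow tube.
```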

In this paper, several applications of SIFT-MS to the food industry will be demonstrated. VOCs are often the compounds responsible for imparting flavor and aroma to food and beverage. Hence, there is a widening body of research in areas such as aroma characterization, flavor release, and detection of off-flavors or off-aromas.

Chocolate and cheese are both challenging foods for analysis using existing real-time mass spectrometry techniques due to their very complex headspaces. In this paper, we will present results that demonstrate the capabilities of SIFT-MS for detection of important aroma compounds in both chocolate and cheese. SIFT-MS proves to be very successful in detecting and quantifying, for example, the characteristic pyrazines of chocolate and organosulfur compounds of cheese.

The technology's potential as a rapid sensitive quality assurance tool will also be emphasized.

Identification and Quantification of Ethoxyquin Degradation Product

Balaji Viswanathan, Rachadaporn Seemamahannop,  Ryan Schwiderski, Paul Nam, and Shubhen Kapila

University of Missouri - Rolla, Cest, Bom no. 01, 1300 N. Bishop, Rolla, MO 65401, USA

Ethoxyquin (6-ethoxy-1,2-dihydro-2,2,4-trimethylquinoline; EQ) is a commonly used antioxidant in food products and animal feed for preserving vitamins and lipids. EQ undergoes oxidation, reduction, and hydrolysis at ambient conditions resulting in degradation products. A number of such degradation products have been reported in the literature. Recently, a new EQ degradation product was discovered during reversed-phase liquid chromatography (RPLC) analysis of antioxidant formulations and animal feed samples. The antioxidant formulation analysis involved acetonitrile extraction of the samples followed by separation of extracted components on a C18 RPLC column. The separated components were then detected with a UV/Vis diode array detector (DAD). Water and acetonitrile were used as the mobile phase with a linear gradient elution program. The mobile phase flow rate was set at 1 mL min⁻¹. The degradation product was formed during passage of EQ through certain C-18 columns. The extent of the degradation product formation was dependent on source and age of the column. The degradation product was isolated and characterized with UV and LC-ESI MS analysis. Analysis showed that the degradation product results from catalytic hydrolysis of EQ on silica surfaces.

A Rapid Method for Determination of Alpha Hydroxy Acids in Seawater and Biological Fluids

Ryan Schwiderski, Balaji Viswanathan,  Rachadaporn Seemamahannop, Paul Nam, and Shubhen Kapila

University of Missouri - Rolla, Cest, Bom no. 01, 1300 N. Bishop, Rolla, MO 65401, USA

Alpha hydroxy acids are important precursors of amino acids in biological systems and are naturally present in biological species. These molecules are also used as feed supplements. The efficacy of α-hydroxy acid uptake is important for nutritional studies, and methods for the determination of these compounds have been reported. However, the methodologies are tedious, requiring a number of sample preparation steps. A simple approach has been developed for the quantitative determination of 2-hydroxy-4-(methylthio)butanoic acid (HMB) and pantothenic acid in blood serum and seawater samples. In this approach, the sample matrices (seawater or biological fluids) were freeze-dried, and the dried residues were extracted with solvents such as ethanol or acetone. Solvent was then removed from the extracts under a gentle nitrogen stream, and the sample was reconstituted in a suitable medium (water/methanol or water/acetonitrile mixtures). The extracts were then introduced into a liquid chromatography column for separation. The separated components were determined quantitatively with a UV absorption detector or ES-MS operating in the negative ion mode. Validation experiments with fortified simulated seawater and bovine blood serum samples showed that analyte recoveries were consistently within the 90–100% range. Analytes were readily monitored over the 0.1–50 mg L⁻¹ concentration range. Analytes at concentrations down to the μg L⁻¹ level could be determined through esterification of the extracted analytes followed by GC-EIMS analysis.

Unwanted Species Interconversions as a Challenge for Speciation Analysis of Mercury in Fish Tissue

Stefan Truempler, Wolfgang Frech, and Wolfgang Buscher

Institute of Inorganic and Analytical Chemistry, University of Muenster, Corrensstr. 30, 48149 Muenster, Germany

In speciation analysis, it is important that the original state of the sample at the time of sampling be preserved until measurement. However, in complex biological matrices, many factors can cause an alteration of the natural species distribution, several of which can be traced to the analytical method itself.

In the presented work, a sample preparation with alkaline digestion and aqueous phase derivatization has been applied. NaBEt4 was used to convert inorganic mercury (Hg²⁺) and methylmercury (MeHg⁺) into alkylated derivatives.

The volatile mercury compounds were separated on an HP-1 GC-column, and element-selective detection was carried out with inductively coupled plasma mass spectrometry (ICP-MS) and microwave-induced plasma atomic emission spectroscopy (MIP-AES).

The presented data reveal unwanted species transformation in aqueous standards and fish tissue samples. The reduction of MeHg⁺ and Hg²⁺ to elemental mercury (Hg⁰) during the derivatization was observed, as well as transalkylation reactions during storage periods.

The interconversions could be detected and traced by means of species-specific isotope dilution (IDMS) and single-standard spiking experiments. However, of these two methods, only IDMS is able to correct for the observed transformation reactions.

Strategies for Automated Chromatographic Method Development

Teresa Lints and Michael McBrien

Advanced Chemistry Development, Inc., 33235 Regal Dr., Toronto, MI 48026, USA

Automated chromatographic method development has recently become a topic of considerable interest to the analytical chemistry world. The design of effective chromatographic methods is a complex task involving the often conflicting goals of separation robustness, resolution of all components of interest, and fast runtime. The chromatographer has an unprecedented array of tools available for this difficult task, including ultrahigh performance liquid chromatography, mass spectrometry, unique new mobile phase selectivities, chemometric component detection and tracking, automated decision-making, and workflow and project management software.

While the method development toolset has advanced considerably over the last few years, method development strategies have remained fairly stagnant. This paper will present new strategies for chromatographic method development which leverage these modern method development tools, resulting in increased method development efficiency. The strategies shown are designed to increase the effectiveness and rigor of method development, while decreasing the amount of time during which users are required to manually review method development data. A combination of screening and optimization method development "waves" maximizes the selectivity space investigated without sacrificing the effectiveness of chemometric peak tracking. Instrument control is leveraged to minimize the time required to configure injection sequences. The approaches will be illustrated with several real-world examples.

Analytical Method Validation of HPLC-Charged Aerosol Detection

Brian J. Forsatz and Nicholas H. Snow

Seton Hall University, 24 Michael Drive, South Orange, NJ 07470, USA

Corona™ charged aerosol detection in high-performance liquid chromatography (HPLC) has recently become commercially available, although the detection mechanisms are not well understood. The goal of this research was to gain a better understanding of this new detection method for HPLC method development by determining the basic chromatographic parameters that affect the detector. These include the composition of the mobile phase and its relationship with the baseline signal, the determination of the amount of mobile phase that is associated with detector charging, the effect of changes in mobile phase composition on analyte response, the effects of volatile mobile phase additives, and the determination of the actual amount of analyte that is being detected by the electrometer. In addition to developing a better understanding of charged aerosol detection, applications were examined, compared, and contrasted to current detection methods for HPLC, such as ultraviolet (UV) detection. The application chosen was pharmaceutical cleaning validation, for which HPLC with charged aerosol detection proved to be an acceptable technique for trace-level analysis of drug substance on typical pharmaceutical manufacturing equipment surfaces. HPLC-charged aerosol detection was also examined for general pharmaceutical analysis and for the ability of this technique to be fully validated to the current International Conference on Harmonization of Technical Requirements for Registration of Pharmaceuticals for Human Use (ICH) and United States Pharmacopeia (USP) guidelines. Again, HPLC-charged aerosol detection proved itself an acceptable technique, and several pharmaceutical assays for in-process control, impurities, and drug substance purity were validated.

Automated Mass Spectral Identification Using the Template Approach

Daniel L. Sweeney

MathSpec, Inc., 1314 North Highland Avenue, Arlington Heights, IL 60004, USA

Unknown compounds are often closely related to a lead compound: degradation products, impurities, or metabolites. Traditionally, the mass spectral dataset of that lead compound is used to work out the fragmentation pathways, and unknown compounds are then identified based on the changes in the masses of various fragments. This approach works well, but it can be very time-consuming.

Systematic bond disconnection has been used to assign accurate mass fragments to known compounds. A similar approach can also be used to assign subfragments of modular structures to specific molecular substructures of a lead compound. The heavy atom distribution of modular structures, derived from the mass spectral data, is compared to the heavy atom distribution of the molecular structures to find matches. (Heavy atoms are atoms of elements other than hydrogen.) Only the modular structures that correlate with the molecular structures are saved, and a monochrome molecular structure can then be color-coded with the same color scheme as the modular structures. This makes the fragmentation easy to visualize.
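To make the heavy-atom comparison concrete, the short sketch below matches the non-hydrogen element counts of an observed fragment against candidate substructures of a lead compound. It is an illustration of the matching idea only; the formulas, names, and data structures are hypothetical and do not represent the MathSpec implementation:

```python
from collections import Counter

def heavy_atoms(formula):
    """Non-hydrogen element counts of a molecular or fragment formula (dict: element -> count)."""
    return Counter({el: n for el, n in formula.items() if el != "H"})

def matching_substructures(fragment_formula, candidate_substructures):
    """Return names of candidate substructures whose heavy-atom distribution equals the fragment's."""
    target = heavy_atoms(fragment_formula)
    return [name for name, f in candidate_substructures.items() if heavy_atoms(f) == target]

# Hypothetical example: a C7H7O fragment compared with two substructures of a lead compound.
fragment = {"C": 7, "H": 7, "O": 1}
candidates = {"substructure_A": {"C": 7, "H": 5, "O": 1},   # same heavy atoms, fewer H
              "substructure_B": {"C": 6, "H": 6, "O": 2}}   # different heavy-atom count
print(matching_substructures(fragment, candidates))  # ['substructure_A']
```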

By using the modular structures that match the lead compound as templates, related unknown compounds can now be identified by comparing modular structures to modular structures. The modular structures of the unknown compound that best match the templates are saved and linked to the template modular structure that they most closely match. For correlating related compounds to a lead compound of known structure (the template approach), this approach makes it very easy to visualize where changes have occurred.

Optimization of Polyethylene Glycol Mass Spectra Produced by AP-MALDI TOF-MS due to Sample Preparation Modifications

Sara M. Kallop and Stephanie J. Wetzel

Department of Chemistry and Biochemistry, Duquesne University, Mellon Hall 308, Pittsburgh, PA 15282, USA

Atmospheric pressure matrix-assisted laser desorption ionization time-of-flight mass spectrometry (AP-MALDI TOF-MS) is notoriously sensitive to sample preparation. Specific ratios of salt, analyte, and matrix are needed to obtain an optimized spectrum. In synthetic polymer AP-MALDI analysis, both the salt and the matrix are needed in order to obtain analyte signal. In synthetic polymer analysis, an optimized signal is necessary due to the distribution of molecular masses present. This investigation uses the synthetic polymer polyethylene glycol (PEG) to discover the optimal ratio for analysis by AP-MALDI. PEG samples of varying molecular mass were used to determine whether differences in molecular mass would impact the optimization. It has been determined that spectral intensity varies depending on the ratios of matrix, analyte, and salt which are utilized.

Online Bioaerosol Ion Mobility Mass Spectrometry for Biological Agent Detection

Juaneka M. Hayes, Kermit K. Murray, Michael V. Ugarov, and J. A. Schultz

Department of Chemistry, Louisiana State University, 232 Choppin Hall, Baton Rouge, LA 70803, USA

We are developing an online interface for MALDI of collected bioaerosols that is coupled to an ion mobility time of flight mass spectrometer. The detection of biological agents presents significant challenges for analytical instrument technology due to the need for rapid analysis coupled with the complex nature of potential sample contaminants. Biological warfare (BW) agents are typically dispersed as aerosol particles, and therefore detection instrumentation must be fast, sensitive, and selective for positive detection with minimal false alarms. Furthermore, portable instruments that can function in harsh environments with minimal operator intervention are required.

We are developing bioaerosol ion mobility mass spectrometry (BIMMS) for real-time detection and analysis of collected bioaerosols. Particles are collected on a target that mechanically transports the particles to a MALDI ion source. Ions are formed by 337 nm UV or 3 μm IR MALDI. Separation occurs in a 20 cm ion mobility cell containing 10 torr of helium buffer gas. Ions pass through a 0.5 mm orifice into a differentially pumped region before being extracted into a 20 cm orthogonal time-of-flight mass spectrometer for analysis. Initial studies are focused on the detection and identification of the biological warfare simulants Bacillus subtilis (BG), Bacillus thuringiensis (BT), and Erwinia herbicola (EH) against a background of dust, salt, and pollen. Mass spectra of collected bacteria show distinct trend lines associated with different components of bacteria and complex mixtures with little interference. These spectral fingerprints can be used for the detection of BW agents in real time.

Automation of Nonmechanical Valves for Fluid Steering in Microfluidic Chips

Robert Grammer, Theron J. Pappas, and Lisa A. Holland

West Virginia University, Clark Research Lab Rm 355, Morgantown, WV 26506, USA

Recent exploration into nonmechanical valves has demonstrated that bicelle liquid crystal solutions can be used as a means for directing fluids in microfluidic channels. However, a systematic examination of the optimum phospholipid solution required an advanced field delivery platform. Using Peltier modules, rather than circulating water, and motorized magnet control, rather than manual placement, has proven successful in automating these nonmechanical valves. This paper describes a simple, yet more effective and portable, platform.

Lab-on-Chip-Based Miniaturized Capillary Electrophoresis System

Werner Hoffmann, Holger Muehlberger,  Thomas Clemens, Horst Demattio, Claudia Gaertner,  Matthias Klotz, Rainer Koerber, and Gunther Krieg

Research Center Karlsruhe, P.O. Box 3640, 76021 Karlsruhe, Germany

The concept of capillary electrophoresis (CE) in chip format can be seen as one of the outstanding milestones in modern lab-on-chip development. Commercial devices are now available on the market. However, further efforts are essential to gain broad acceptance of these “high-tech” products in the analytical lab. Cost reduction seems to be a central challenge, extending the topic of “Electrochemical detection: small is beautiful” to the economic dimension. Here, we present a new device to meet this goal by introducing a chip-based CE system with electrical detection.

The device features the combination of two advantages: (i) substitution of conventional CE glass chips by tailor-made polymer material chips which can be mass-fabricated at low production costs, and (ii) electrical analyte detection by contactless conductivity measurement of high-frequency signals, which needs only a simple detector arrangement and low-cost measurement equipment. This avoids disadvantages of direct-contact electrochemical measurements as well as sophisticated optical detection.

Technical parameters of the portable, compact, cost-efficient device and results of test measurements will be given. Applications will be demonstrated for the determination of organic acids in wine.

Automated Intelligent Dilution

Stefanie Czyborra and Nadine Seifert

Metrohm AG, Oberdorfstrasse 68, 9101 Herisau, Switzerland

The combination of the 850 professional IC, the 858 professional sample processor, and the MagIC Net software offers a variety of sophisticated sample preparation techniques. One of these is automated intelligent dilution.

In order to perform automated dilutions, the above-mentioned instrument setup is additionally equipped with a magnetic stirrer and an 800 Dosino. The latter aspirates a defined volume of a concentrated sample and transfers it to the mixing vessel. Subsequently, the diluent, generally ultrapure water, is dosed into the mixing vessel. After intensive stirring, the peristaltic pump of the sample changer transfers the sample to the injection valve of the ion chromatograph, where the sample constituents are separated and detected. After quantitation of the sample's constituents, the software verifies whether the concentrations are within the calibration range. If this is the case, the next sample is analyzed. If not, the software reanalyzes the sample after calculating the appropriate dilution factor.
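A minimal sketch of the re-analysis decision described above is given below; the function and the "dilute to mid-range" rule are assumptions chosen for illustration and are not the MagIC Net implementation:

```python
def dilution_factor(measured_conc, cal_low, cal_high):
    """Return 1.0 if the result lies within the calibration range; otherwise the factor
    by which the concentrated sample should be diluted so that the re-measured value
    is expected to fall near the middle of the calibrated range."""
    if cal_low <= measured_conc <= cal_high:
        return 1.0
    target = (cal_low + cal_high) / 2.0
    return measured_conc / target

# Example: a 250 mg/L result against a 1-50 mg/L calibration -> dilute roughly 10-fold
print(round(dilution_factor(250.0, 1.0, 50.0), 1))  # 9.8
```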

This paper will give a description of the straightforward analytical setup, and will demonstrate the usefulness of the automated intelligent dilution in a number of typical analytical chemistry applications.

Automation of Sample Preparation Techniques Using a Robotic X-Y-Z Coordinate Autosampler with Software Control

Fredrick D. Foster, Edward Pfannkoch, and John Stuff

Gerstel, Inc., 1510 Caton Center Drive, Suite H, Baltimore, MD 21227, USA

Today's analytical chemist faces an increasing number of samples to be analyzed and a need to maximize throughput while still ensuring that the highest quality in the resulting data is achieved. There is therefore an increasing need for automation during sample preparation. Many instruments are available that have been created to automate specific sample preparation techniques. As a result, a number of different instruments are needed, and each of these systems has its own controlling software.

A single robotic X-Y-Z coordinate autosampler commonly used for sample introduction in GC or HPLC can be used to perform a wide variety of sample preparation techniques using a single instrument and controlling software. The new Maestro software allows the user to control an expanded list of sample preparation techniques such as derivatization, saponification, esterification, analytical weighing, filtration, and solid phase extraction. In addition to ease of use and intuitive Windows-based programming, the software includes tools to automate and optimize parameters, assuring efficient sequence creation and maximum sample throughput. The sampler can be configured as part of a GC or LC system, or can be configured as a benchtop workstation. In this paper, we discuss various sample preparation techniques available when using the robotic autosampler in conjunction with the new software. Examples of automating techniques such as solid phase extraction and Twister solvent back-extraction for HPLC and sample weighing are shown.

Fully Automated Solid Phase Extraction for Food and Environmental Samples Coupled Online to LC- and GC-MS

Oliver Lerch, Carlos Gil, and Norbert Helle

GERSTEL GmbH & Co. KG, Aktienstrasse 232-234, Muelheim, 45473 North-Rhine-Westphalia, Germany

A novel automated solid phase extraction (SPE) system with online injection into LC- or GC-MS was evaluated for different food and environmental samples. All components of the system were attached to a standard laboratory autosampler. Standard 1, 3, or 6 mL SPE cartridges, which were shortened and equipped with a transport adapter, could be used with the system. Solvent transfer was done via an autosampler syringe providing an accurately controlled flow, resulting in very good repeatability and recovery. Concentration and derivatization steps were successfully automated. Each step of the process could be programmed by software.

Some examples reveal the performance of the described automated SPE system. Malachite green and a metabolite could be analyzed in fish by LC/Ion trap MS. Fish samples were extracted with acetonitrile/water, and the extract was cleaned up automatically by SPE followed by online injection into LC-MS.

Aflatoxins were determined in food. The extracts were cleaned up via automated SPE with immunoaffinity cartridges. This was followed by automated derivatization of the analytes with bromine and injection of the resulting solution into LC/Ion trap MS.

A third application was the determination of 16 EPA PAH in environmental samples. Sample extracts (hexane) were automatically cleaned up over silica gel SPE cartridges followed by online large volume injection into GC/MS.

All described applications show good repeatability and recovery. Automation of the SPE provides time savings, while limiting exposure of lab personnel to solvents and reagents. This makes the automated SPE a valuable tool for sample preparation in LC and GC analyses.

Automated Sample Enrichment and High-Sensitivity Analysis of DMPK Samples Using HPLC-Chip QQQ MS

Georges L. Gauthier, Stephan Buckenmaier, and Martin Vollmer

Agilent Technologies, Hewlett-Packard-Str. 8, 76337 Waldbronn, Germany

The ability to detect and quantitate drug metabolites in the early drug discovery phase can be limited because of the small available sample size and the complex biological matrix background. To overcome this limitation, nanoflow LC-MS is one approach that can be used to increase sensitivity.

HPLC-Chip is a new nanoflow separation system where components such as the spray emitter and enrichment and separation columns are directly integrated into the chip. Workflows such as sample enrichment are fully automated and performed directly on the chip. The use of HPLC-Chip for routine nanoLC has been demonstrated for biomarker discovery [20] and glycan analysis [21]. It has also been applied to the routine high-sensitivity detection of metabolites in various biofluids [22].

We will review a new chip design with a high-capacity enrichment column specifically developed for the analysis of a wide polarity range of pharmaceutical small molecules. Retention time precision was better than 0.2% RSD on an Agilent 6410 QQQ HPLC chip system (see Figure 12). Typical pharmaceutical compounds were detected with sensitivities in the lower fg range absolute on column. Linearity for quantitation was achieved over a wide concentration range. Both hydrophilic and hydrophobic analytes in a log P range from 0 to 4 were reproducibly enriched on the new small molecule chip. To demonstrate chip performance for DMPK samples, imipramine was analyzed out of human serum. Despite the complex biological matrices, low fg levels were detected.

Online Method for Determining In Vitro Amino Acid Flux in Subminute Time Scales Using Fast Micellar Electrokinetic Chromatography

Jason Greene and John P. Wikswo

Vanderbilt University, 6301 Stevenson Center, Vu Station B 351807, Nashville, TN 37235, USA

The study of cellular metabolism on rapid time scales is at the forefront of metabolomics. The study of amino acid flux on subminute time scales would yield insight into rapid changes in cellular metabolism as a whole. To this end, we have developed a rapid, online amino acid flux analyzer (AAFA). The AAFA consists of a microbioreactor, a derivatization chamber, and a micellar electrokinetic chromatography (MEKC) instrument with laser-induced fluorescence (LIF) detection. The microbioreactor is continuously perifused with a volume of 3 μL containing a monolayer of 10⁵–10⁶ cells. The amino acids in the effluent enter the chamber where they are fluorescently tagged by derivatization with o-phthalaldehyde/CN⁻. The derivatized effluent is separated by an online, computer-controlled MEKC instrument with LIF detection in order to quantify the concentration of each amino acid. Immediately after each electropherogram is collected, another injection is automatically performed. By integrating each peak in successive electropherograms, a graph of amino acid concentration versus time can be constructed for each amino acid. We have demonstrated the separation of 18 amino acids in less than 30 seconds in cell media with separation efficiencies of up to 3 × 10⁶ plates per meter and a signal-to-noise ratio of >10⁴. Furthermore, we have shown that the AAFA can be run continuously for more than two hours with relative standard deviations as low as 5.4%. This method would allow the observation of dynamic changes in amino acid flux due to various agents which are known to alter cellular metabolism. This research was supported by Grant U01 AI061223 from the National Institute of Allergy and Infectious Diseases.
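The construction of concentration-versus-time traces from successive electropherograms, as described above, can be pictured with the short sketch below; the data layout and calibration factors are illustrative assumptions rather than the AAFA software itself:

```python
def flux_traces(runs, response_factors):
    """runs: list of (time_s, {amino_acid: peak_area}) for consecutive separations.
    response_factors: {amino_acid: concentration per unit peak area} from calibration.
    Returns {amino_acid: [(time_s, concentration), ...]} for plotting flux over time."""
    traces = {aa: [] for aa in response_factors}
    for t, areas in runs:
        for aa, rf in response_factors.items():
            if aa in areas:
                traces[aa].append((t, areas[aa] * rf))
    return traces

# Hypothetical example: two consecutive ~30 s separations with glycine and alanine peaks
runs = [(0, {"Gly": 1200.0, "Ala": 800.0}), (30, {"Gly": 1100.0, "Ala": 950.0})]
rf = {"Gly": 0.010, "Ala": 0.012}
print(flux_traces(runs, rf)["Ala"])  # approximately [(0, 9.6), (30, 11.4)]
```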

Monitoring of Propofol Concentrations in Breath and Blood by SPME-GC-MS and PTR-MS: Chance and Limitations of a New Clinical Tool

Wolfram Miekisch, Patricia Fuchs, Svend Kamysek,  Maren Mieth, and Jochen Schubert

Department of Anaesthesia and Intensive Care, University Hospital of Rostock, University of Rostock, Schillingallee 35, 18057 Rostock, Germany

Monitoring of drug effects on consciousness is an important issue in intensive care medicine and in anesthesia when sedation is to be tailored to patients' real needs. In contrast to volatile anesthetics, real-time monitoring of commonly used intravenous anesthetics like propofol is not yet available. Conventional determination of blood concentrations still requires sophisticated and time-consuming techniques.

Methods. We applied SPME GC-MS and real-time PTR-MS to assess blood and breath concentrations of propofol in lab animals (pigs) and in 14 mechanically ventilated patients. Propofol concentrations were assessed after bolus application and in steady state. Results. SPME GC/MS headspace methods for determination of propofol from human blood and breath showed good linearity over the physiological range (0.1–20 μg/mL in blood and 0.3–30 ppb/V in breath), yielding an R² > 0.99 and RSD < 10%. The correlation coefficient between arterial blood concentrations and breath concentrations was 0.88 for the 14 mechanically ventilated patients. In ventilated pigs, PTR-MS real-time measurements demonstrated pronounced changes of propofol breath concentrations during the first minutes after injection of the drug. Conclusion. Drug concentrations in human blood and breath can be measured fast and precisely by means of SPME GC-MS. PTR-MS real-time measurement of exhaled propofol concentrations provides information on distribution effects in the body.

Good correlations of propofol blood and breath concentrations indicate that monitoring of exhaled propofol in steady state conditions may act as a promising noninvasive tool for assessing blood concentration and depth of anesthesia.

Validation of Diffuse Reflectance Infrared Spectroscopy as a Means to Discriminate Blood from Various Forensically Relevant Substrates

Anthony R. Trimboli, Jessica M. McCutcheon, and Stephen L. Morgan

Department of Chemistry and Biochemistry, University of South Carolina, 631 Sumter St., Columbia, SC 29208, USA

The importance of blood stains has risen as forensic researchers have increased the amount of information that can be extracted from even small blood stains. Landsteiner's initial pioneering work on the A-B-O blood typing system was in 1901; later, other specific antigens (e.g., Rh factors) were discovered to discriminate among blood types. DNA analysis has proven to be the most conclusive piece of evidence used for forensic blood analysis. However, methods for locating and confirming the presence of blood at crime scenes have remained relatively unchanged.

Presently, catalytic methods of detection are the standard for crime scene examination. Chemical detection methods involve toxic chemicals, such as luminol, which also may compromise DNA integrity. Real-time detection of blood stains by infrared spectroscopy (IR) addresses many of the issues with chemical detection and reduces analysis time. Hemoglobin, the major protein in red blood cells, and albumin, the chief protein component of plasma, produce distinctive IR absorbance in the 1650–1540 cm⁻¹ spectral range of the amide I and amide II bands.

Multiple substrates (i.e., olefin, nylon, and polyester polymers coated with Scotchguard and other stain release treatments) were tested and doped with various concentrations of blood. The feasibility of using spectroscopic detection for rapid visualization of blood on fabric surfaces was evaluated using diffuse reflectance infrared spectroscopy (DRIFTS). Discrimination of blood-stained substrates from clean surfaces was achievable in the overwhelming majority of cases. Multivariate statistical methods were utilized to determine both the optimal spectral region for discrimination between sample and substrate as well as the limit of detection for the method.

Development and Optimization of Sorbent Polymer Coatings for Detection of Chemicals and Explosives

Bernadette A. Higgins, Duane L. Simonson,  R. Andrew McGill, Jennifer L. Stepnowski,  Kimberly P. Williams, Heather M. Summers, Rekha Pai,  Stanley V. Stepnowski, Michael Papantonakis, and Matthew T. Rake

Naval Research Laboratory, 4555 Overlook Avenue SW, Code 6365 B3/151, Washington, DC 20375, USA

Sorbent polymers have the ability to trap and concentrate a variety of hazardous analytes including chemical agents, toxic industrial chemicals, and explosives. These sorbent polymers are used for a variety of applications including sensor coatings, microfabricated preconcentrators, SPME fibers, and membranes. In this paper, we have developed and optimized a hyperbranched hydrogen bond acid sorbent polymer (poly(methyldi(1,1,1-trifluoro-2-trifluoromethyl-2-hydroxypent-4-enyl)silane); HCSA2). HCSA2 can form reversible but strong hydrogen bonds with hydrogen bond basic analytes. Physical properties of this polymer were tailored according to the application where HCSA2 was to be coated. Membranes and SPME fibers require the viscosity of the polymer to be higher to avoid flow of the polymer. The thermal stability of HCSA2 was also improved for all applications to prevent degradation of the polymer as the analyte was desorbed. The absolute and comparative sorptive properties of the HCSA2 polymer were determined by coating and testing on several different sensor types, including inverse gas chromatography (IGC) columns, SPME fibers, and a microfabricated preconcentrator device (cascade avalanche sorbent polymer array, CASPAR). The results presented will encompass HCSA2 physical characterization and the polymer coating response to hydrogen bond basic and other analytes.

Automated Method Scouting to Speed up the Development of Online SPE-LC-MS Analysis of Antibiotics

Frank Steiner, Frank Arnold, Verena Fraas,  Markus Martin, and Christian Huber

Dionex Corporation, Dornierstrasse 4, 82110 Germering, Germany

Method development is still considered the crucial bottleneck that impedes productivity in analytical laboratories. One major challenge with LC methods is the selection of the appropriate stationary phase material. The use of integrated sample preparation by online SPE multiplies the effort as it becomes necessary to discover the best combination of SPE column and analytical LC column. Optimization of eluting conditions from the SPE column, transfer to the analytical column, and the subsequent analytical separation can be excessively time-consuming in such a coupled analytical procedure.

We present a new integrated system that allows automatic and intuitive scouting of columns, eluents, and other important method parameters such as column temperature. Two six-position column selection valves are integrated in the column compartment to provide the highest flexibility for automation. Intelligent software makes parameter permutation easy, without requiring method changes. The application of this system to automatically screen SPE phases for loading capacity and optimal eluting conditions is described. It is used to optimize the enrichment of a wide range of antibiotics in aqueous matrices. In a second step, the antibiotics mixture is dissolved in the optimal SPE eluting solvent and injected into a selection of different analytical columns. A set of 6 separation columns is combined with the use of different organic eluent components, buffers, and temperatures for automatic screening. Peaks can be tracked via UV/Vis spectra or mass comparison. A spreadsheet-based reporting tool generates a peak resolution chart that gives an instant overview of the results and makes it easy to find the best method conditions.
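For reference, the peak resolution values summarized in such a chart are conventionally computed from the retention times and baseline widths of adjacent peaks (the standard chromatographic definition, restated here only for orientation):

```latex
R_s \;=\; \frac{2\,\bigl(t_{R,2}-t_{R,1}\bigr)}{w_1 + w_2}
```

where t_R,1 and t_R,2 are the retention times of the neighboring peaks and w_1, w_2 their baseline peak widths; column, eluent, and temperature combinations can then be ranked, for example, by the smallest R_s observed in each run.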

An Automated LC/MS/MS Protocol to Enhance Throughput of Physicochemical Property Profiling in Drug Discovery

Peter Alden, Paul Lefebvre, and Darcy Shave

Waters Corporation, 34 Maple St., Mailstop Gc, Milford, MA 01757, USA

The synthesis of large and focused chemical libraries allows pharmaceutical companies to rapidly screen large numbers of compounds against disease targets. Active compounds resulting from these screens are traditionally ranked based on their activity, binding, and/or specificity. Turning these hits into leads requires further analysis and optimization of the compounds based upon their physicochemical and ADME characteristics. The critical factor to consider in physicochemical profiling is throughput. The bottlenecks to throughput include MS method optimization for a large variety of compounds and data management for the large volume of data generated.

Currently, experiments including solubility, chemical and biological stability, water/octanol partitioning, PAMPA, Caco-2, and protein binding are used to generate physicochemical profiles of compounds in drug discovery. The measurement of physicochemical properties from these studies is easily enabled using chromatographic separation and LC/MS/MS/UV quantitation. While the sample analyses may be efficient, the processing of the data and the interpretation of the results often require tedious and time-consuming manual manipulation and calculation.

This paper describes an approach to solving these problems by the use of a novel software package that allows for the design of experiments, data acquisition, and the processing as well as report generation in a fully automated manner.

To demonstrate the usage of this software package, we have developed an automated UPLC/MS/MS protocol for data generation. The data acquired from multiple assays were processed by a single processing method, all in an automated fashion. As a result, the physicochemical profiling process was significantly simplified and throughput was increased.

Monitoring the Interactions of Quercetin with Nucleic Acids

Samuel N. Kikandi, Musah Samira, Joy Romulus,  Ailing Zhou, and Austin Aluoch

SUNY-Binghamton, 4400 Vestal Pkwe, Binghamton, NY 13905, USA

Quercetin, a well-known phytoestrogen, has been reported to exhibit anticancer, antitumor, and other therapeutic activities of significant potency. Despite these useful therapeutic activities, quercetin has been reported to have pro-oxidative and mutagenic activities on biomolecules as well. To date, few studies have reported quercetin interactions with double-stranded DNA, since such interactions induce DNA strand scission that can cause DNA damage and, consequently, mutation. In this work, we report the real-time monitoring of quercetin oxidation behavior and interactions with calf-thymus DNA in physiological medium as bulk electrolysis progressed. Electrochemical, UV-Vis, and GC-MS/LC-MS results revealed the quercetin degradation products. The UV-Vis and fluorescence spectroscopies showed the modes of interaction between quercetin and ctDNA in physiological medium.

High-Throughput Characterization of Contamination in Manufacturing Environments Using an Industrialized Electron Beam Analyzer

Timothy J. Drake

Aspex Corporation, 175 Sheffield Drive, Delmont, PA 15626, USA

This paper will focus on the investigatory and developmental studies pertaining to the chemical make-up, source identification, and process control of particulate contamination in a manufacturing environment. The method developed and utilized for this analysis is an automated electron beam scanner which determines the size, shape, and elemental composition of individual particulates. Unique to this technique are both a fully integrated electron beam and energy dispersive X-ray spectrometer and a database storage and data analysis software suite designed specifically for quality-control and quality-assurance scientists. A series of samples were analyzed from a subassembly which was found to be susceptible to failure due to particulate contamination. In particular, both the size of the particles and their identification and elemental composition were determined. The key assembly failures occurred due to the tight tolerances of the assembly and the hardness of the particles that were observed. In addition to an overview of the technological breakthrough developed for this instrumentation, this presentation will focus on the data and trend analysis software, as well as the impact of the particulates on the manufacturing process.

Automated Fractionation System That Increases Recovery and Optimizes Precision of Solid Phase Extraction Methodology

Joan M. Stevens, Michael Halvorson, and Mark E. Crawford

Gilson, Inc., 3000 Parmenter Street, Middleton, WI 53562, USA

Solid phase extraction (SPE) is often chosen as a sample preparation method to remove matrix effects from the sample prior to analysis. Removal of these matrix effects drastically increases the signal-to-noise ratio of the compounds of interest. SPE is commonly employed with biological samples, where the matrix interferes with detection of the compound or drug of interest, and in the environmental field, where trace analysis is required and therefore large amounts of sample (water) are processed. Optimization of the SPE procedure can greatly affect the recovery of the analyte of interest. There are four basic steps in every SPE procedure, that is, conditioning, loading, washing, and elution, and evaluating each of these steps can offer increased recovery and therefore enhanced performance. Manually performing the method development/optimization of an SPE method can be quite tedious and time-consuming, and therefore may never come to fruition. The automated system presented is capable of performing multiple solvent deliveries, “collection fractionation,” and analysis of each step in a solid phase extraction process that affects recovery and precision, all without manual intervention; the end result is an SPE method that is more robust with enhanced recovery.

The Identification of Forest Bioproduct Process Components through Near Infrared Spectroscopy

Abigail R. Hamilton, Amy L. St. Peter, and Darrell W. Donahue

University of Maine, 5737 Jenness Hall, Room 226B - Nir Lab, Orono, ME 04469-5737, USA

Near infrared spectroscopy (NIRS) has the potential to advance the productivity of the forest biorefinery process by rapid identification of process bioproducts, including the liquid extract and woody biomass. The potential exists for composition identification via NIRS to be performed as an inline process control operation. Before this technology is applied to the forest biorefinery process, an NIR spectral database of solid wood chips and liquid extract solutions must be collected, developed, and analyzed. Along with spectral data, physical differences of the woody biomass, including surface color, grain size, and thickness, were considered. The effects of viscosity, the amount of liquid scanned, and the type of solvent used in solution were also investigated.

After developing the database of spectra, partial least-squares (PLS) techniques were used in combination with selected pretreatments to develop calibration models. The calibration models were tested by scanning actual wood chips and liquid extracts removed during a laboratory-scale biorefining process.
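A minimal sketch of such a PLS calibration step is given below, assuming spectra and reference compositions stored in two CSV files; the file names, pretreatment, and number of latent variables are illustrative choices, not those of the authors:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

# X: NIR spectra (n_samples x n_wavelengths); y: reference component concentrations
X = np.loadtxt("wood_chip_spectra.csv", delimiter=",")
y = np.loadtxt("extract_composition.csv", delimiter=",")

# Simple pretreatment (a stand-in for SNV or derivative pretreatments): mean-center each spectrum
Xc = X - X.mean(axis=1, keepdims=True)

pls = PLSRegression(n_components=8)
y_cv = cross_val_predict(pls, Xc, y, cv=10).ravel()   # 10-fold cross-validated predictions
rmsecv = float(np.sqrt(np.mean((y_cv - y) ** 2)))
print(f"RMSECV: {rmsecv:.3f}")
```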

Wood chip spectra prior to pretreatment showed most variation within the 1000–1350 nm range. Variation in the liquid extract spectra (high water content) was evident in the same range. Significant differences were seen when a water spectrum was subtracted from the liquid extract spectra. Initial calibrations based on known woody biomass components indicate positive validation results. These outcomes confirm the possibility for identification of extract components via NIRS, leading to the development of techniques that can be further tested in industry.

Optimization of a Compartmentless Microchip Biofuel Cell for Sensing Applications

Michael Moehlenbrock and Shelley D. Minteer

Department of Chemistry, Saint Louis University, 3501 Laclede Ave., St. Louis, MO 63103, USA

Many limitations of current biofuel cell technologies lie in the geometry of the test cell used to ascertain their performance. Most work has utilized cell designs which have large distances between cathode and anode, which limits the performance of the cell due to its inherent resistivity and transport distances. Another concern is the practicality of the device and its use for sensing applications. Traditional test cells are bulky and incapable of being combined in such a way as to produce adequate power while maintaining practical size. Microfluidic biofuel cells have demonstrated an immense potential for developing a solution to many of the problems involved in the production of commercially viable biofuel cells. In this work, we describe a novel design for the incorporation of enzymatic biocathodes and bioanodes into a system that is capable of serial power generation in an extremely small area. This has been accomplished through the immobilization of bilirubin oxidase and lactate dehydrogenase on the cathode and anode, respectively. This is accomplished through the controlled modification of Nafion polymer with alkyl ammonium bromide salts, which creates an ideal hydrophobic environment of buffered pH and tunable pore size for the encapsulation of each enzyme. Through this immobilization on a microelectrode substrate, we are able to utilize photolithographic techniques to generate an electrode design and polydimethylsiloxane channel structures to develop a hydrodynamic flow cell design that is capable of running ten cells in series while allowing for simple introduction of fresh fuel/analyte through full automation.

A Complete Solution for Automated Oil Sample Prep

Lee Brady, David Hilligoss, Karl Heinze, Michael Kealy, and Gary Reznik

PerkinElmer, 2200 Warrenville Road, Oak Brook, IL 60515, USA

Oil testing presents significant challenges for automated liquid handling and sample preparation. Pipetting performance, cross-contamination, tip washing, and production of waste material vary widely with the range of viscosities and contaminations observed in used oil. Liquid level detection (a factor which greatly impacts pipetting performance) is further complicated by the nonionic nature of used oil, adding to the challenges of automating used oil liquid handling.

As a solution to these challenges, PerkinElmer has introduced a new high-throughput JANUS automated workstation with ultrasonic liquid level detection. The new workstation is based on the JANUS extended deck platform using an eight-tip Varispan arm. It can be integrated with many FTIRs, ICP-MS, MALDI-TOF, or viscometers for “walk-away” operation, by adding a second gripper arm. The innovative application of ultrasonic technology for liquid level detection provides significant advances when working with nonionic samples, and it is not impacted by individual sample viscosities, color, or external lighting (as observed with other liquid level sensing technologies).

Disposable tips used with the JANUS' VersaTip technology eliminate a potential source for cross-contamination between samples, as well as the need for repeated tip washing (as required when using fixed-tip probes). Sample diluents (such as kerosene) are maintained in a closed-loop system, recycling all but what is being used for sample dilutions. System fluids are fully recovered, resulting in a minimized waste stream and potential cost savings.

Data represented in this paper will include fully automated sample preparation of used oil samples. Accuracy and precision data (kerosene and several known oil types), throughput examples, as well as analytical comparisons of samples prepared using the JANUS automated workstation will be discussed.

Real-Time Monitoring of Wear Particles in Lubricating Oil

Lars Schomann, Sven Krause, and Gerhard Matz

Institute of Measurement Techniques, Hamburg University of Technology, Harburger Schlossstr. 20, 21079 Hamburg, Germany

The amount and size distribution of wear particles in the lubricating oil of machines are indicators of the current machine condition. Monitoring the wear particles during normal operation can help to identify the need for maintenance and, more importantly, to prevent sudden failure of the machine.

The real-time measurement system under development at the Institute of Measurement Techniques of Hamburg University of Technology is aimed at monitoring the lubricating oil of gears in windmills or the oil of combustion engines on engine test stands. The principle of operation is to image a thin layer of oil and to count and classify the wear particles it contains. To this end, a small continuous flow of oil is pumped through a bypass connected to the oil circuit and directed through a thin transparent flow cell. Images are taken using a bright light source, a magnification lens, and a small digital video camera. Algorithms have been developed to process and analyze the images. They are capable of compensating for variations in background brightness, differentiating solid particles from nonsolid particles (e.g., air bubbles), classifying particles by size, and generating a histogram of the particle size distribution.
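The image-processing chain described above (background compensation, particle segmentation, size classification, histogram) can be sketched roughly as follows; this is an illustrative OpenCV-based outline with assumed parameter values, not the institute's implementation, and the solid/bubble discrimination step is omitted:

```python
import cv2
import numpy as np

def particle_size_histogram(frame_gray, um_per_px, bins=(5, 15, 25, 50, 100, 200)):
    """Count wear particles per size class in one grayscale frame of the flow cell."""
    # Compensate slowly varying background brightness by subtracting a heavily blurred copy.
    background = cv2.GaussianBlur(frame_gray, (51, 51), 0)
    corrected = cv2.subtract(background, frame_gray)      # dark particles become bright
    # Segment particles with Otsu thresholding.
    _, mask = cv2.threshold(corrected, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    # Convert each particle area to an equivalent circular diameter in micrometers.
    diameters = []
    for c in contours:
        area_px = cv2.contourArea(c)
        if area_px > 2:                                   # suppress single-pixel noise
            diameters.append(2.0 * np.sqrt(area_px / np.pi) * um_per_px)
    counts, _ = np.histogram(diameters, bins=bins)
    return counts                                          # particles per size class
```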

Using the real-time measurement system, the detection of significant changes in wear particle size distribution during normal machine operation enables the operator to prevent sudden and fatal failure. In combination with a machine component analysis, the measurement results can yield important information about wear mechanisms.

Automatic Preparation of Standard Solutions for the Determination of Octane Numbers

Eric van der Heijden and Alfred Steinbach

Deutsche Metrohm GmbH & Co. KG, In Den Birken 3, 70794 Herisau, Germany

The octane rating of gasoline is a value that indicates how much a fuel can be compressed before it spontaneously ignites. This spontaneous ignition is also known as engine knocking, and it adversely affects engine performance. The higher the octane rating of a gasoline, the higher its resistance to knock. By definition, the octane rating of isooctane, a highly branched alkane that burns smoothly, is set to 100. The rating of n-heptane, an unbranched alkane that strongly tends toward premature ignition, is set to zero. An octane rating of 85 means that the fuel has the same knocking properties as a mixture of 85% isooctane and 15% n-heptane.
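
Because the scale is defined by the volume fraction of isooctane in the isooctane/n-heptane blend, preparing a primary reference fuel for a target octane number is simple volume arithmetic. The short Python sketch below illustrates only that arithmetic and does not describe any particular dosing device.

def reference_blend_volumes(octane_number: float, total_volume_ml: float):
    """Volumes of isooctane and n-heptane for a primary reference fuel.
    By definition, the octane number equals the volume percent of isooctane
    in the isooctane/n-heptane blend."""
    if not 0.0 <= octane_number <= 100.0:
        raise ValueError("primary reference fuels cover octane numbers 0-100 only")
    v_isooctane = total_volume_ml * octane_number / 100.0
    return v_isooctane, total_volume_ml - v_isooctane

# An 85-octane reference blend made up to 500 mL: 425 mL isooctane + 75 mL n-heptane.
print(reference_blend_volumes(85.0, 500.0))  # (425.0, 75.0)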

There are different octane ratings, such as the research octane number (RON) and the motor octane number (MON). Both are determined by running the fuel through a test engine (such as the CFR ASTM test engine) and comparing the results with those obtained for mixtures of isooctane and n-heptane. The different numbers result from the different operating conditions of the test engine. As the preparation of the solvent mixtures significantly affects the accuracy of the octane rating determination, precise preparation of these mixtures is of paramount importance.

This communication describes how PC-controlled automatic dosing devices allow straightforward, reproducible, and accurate preparation of the solvent mixtures. Additionally, manual handling of hazardous solvents is minimized, and operator safety is increased.

Multiresponse Optimization of a Method for Copper, Cadmium, and Lead Determination in Natural and Tap Waters Using Fast Sequential Flame Atomic Absorption Spectrometry after Cloud Point Extraction

Marcos A. Bezerra, Hadla S. Ferreira, and Lindomar A. Portugal

UESB, Rua José Moreira Sobrinho S/N, Salvador, Bahia 45206-191, Brazil

In general, the determination of trace metals demands a preconcentration step so that a simpler and less expensive analytical technique, such as flame atomic absorption spectrometry, can be used. Cloud point extraction is a preconcentration technique based on the phase separation that occurs when a solution of a nonionic or zwitterionic surfactant is heated to a characteristic temperature. Above this temperature, a surfactant-rich phase of small volume, which traps hydrophobic substances such as metal chelates, separates from the bulk solution.

This work presents a method for the determination of copper, cadmium, and lead in natural waters using fast sequential flame atomic absorption spectrometry (FS FAAS) after cloud point extraction of these metals. The proposed method is based on extraction of the metal ions from micellar media of octylphenoxypolyethoxyethanol (Triton X-114) after complexation with 2-(5-bromo-2-pyridylazo)-5-(diethylamino)phenol (5-Br-PADAP). The simultaneous extraction procedure was optimized by response surface methodology using a Box-Behnken design and a desirability function. Enrichment factors of 30.7, 22.2, and 26.6, along with detection limits (3σB) of 4.3, 0.7, and 8.3 μg L⁻¹, were found for Cu, Cd, and Pb, respectively. The precision, expressed as the relative standard deviation, was 2.1, 1.7, and 4.4% (5 μg L⁻¹, n = 8) and 1.8, 1.5, and 1.5% (10 μg L⁻¹, n = 8) for Cu, Cd, and Pb, respectively. Accuracy was evaluated through the analysis of the certified reference material NIST 1643d (trace metals in water). The developed method was applied to the determination of Cu, Cd, and Pb concentrations in natural waters.
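
The figures of merit above follow standard definitions. As a rough illustration only, with invented blank readings and calibration slopes rather than the authors' data, a 3σ-of-the-blank detection limit and a slope-ratio enrichment factor can be computed as follows.

import numpy as np

def detection_limit_3sigma(blank_signals, calibration_slope):
    """LOD = 3 x standard deviation of the blank / calibration slope."""
    return 3.0 * np.std(blank_signals, ddof=1) / calibration_slope

def enrichment_factor(slope_with_preconcentration, slope_direct):
    """Ratio of calibration slopes with and without the preconcentration step."""
    return slope_with_preconcentration / slope_direct

# Invented blank absorbances and slopes, purely for illustration.
blanks = [0.0021, 0.0025, 0.0019, 0.0023, 0.0020, 0.0024, 0.0022, 0.0026, 0.0021, 0.0023]
print(detection_limit_3sigma(blanks, calibration_slope=0.0008))  # LOD in concentration units
print(enrichment_factor(0.0246, 0.0008))                         # about 31-fold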

Recent Advances in Mercury Speciation

Davies Colin, Paul Danilchik, and Annie Carter

Brooks Rand Labs, 3958 6Th Ave NW, Seattle, WA 98107, USA

Various methods for speciation of mercury currently in existence are complex and time-consuming. EPA Method 1630 for the analysis of methyl mercury in water (with modifications for sediments and tissues) is a manual method that works well for very knowledgeable and experienced chemists, but it is very susceptible to a number of issues for the less experienced. This manual method is also very labor-intensive. We have developed a new method and automated instrumentation, based on the same chemistry as that of EPA Method 1630, but that vastly improves data quality and efficiency.

Our new method and instrumentation incorporate a number of novel approaches, which will be discussed in detail along with performance measures of quality and throughput. This system utilizes a gas-pressurized delivery system for fluid handling, thereby eliminating peristaltic pumps and the “sticky” tubing used in typical automated systems. This unique approach minimizes carryover contamination, and removes a common maintenance issue. The system also incorporates a multiple trap system—cycling traps between the collection, drying, and desorption stages to maximize efficiency. The trap desorption mechanism uses infrared radiation to ballistically heat the trap material and release the mercury species extremely rapidly. This rapid release allows for much easier GC separation with improved peak separation.

The system has been extensively tested on a variety of sample matrices, including a wide range of water, sediment, and tissue samples. Utilizing CVAFS, this system has significant improvements in detection limits, precision, linear range, and throughput.

Performing EPA Methods 245.7 and 1631 with PSA Millennium Merlin System

Bin Chen, Warren T. Corns, and Peter B. Stockwell

P S Analytical, Arthur House, Crayfields Industrial Park, Main Road, St. Pauls Cray, Orpington, Kent BR5 3HP, UK

Mercury is naturally present in aquatic systems at very low concentrations. Owing to the long-range atmospheric transport and deposition of anthropogenic mercury, elevated concentrations are found even in remote freshwater systems where no direct local contamination sources are present. It has long been recognized that mercury is one of the most hazardous toxicants to humans and the environment. To protect people and the environment from mercury, governments and regulatory agencies are introducing ever more stringent guidelines. As a result, analysts are challenged to achieve ever greater sensitivity. The USEPA approved methods 245.7 and 1631 for the determination of low-level total mercury in water; both are based on cold vapor atomic fluorescence spectrometry (CV-AFS). In this work, a comparison of these two EPA methods as performed on the PSA Millennium Merlin system is presented. The methodology, operating procedure, analytical sensitivity, and quality control criteria are compared in detail.

Achieving Greater Productivity Using EPA Method 245.7 for the Determination of Mercury in Wastewaters at Ultratrace Levels

David L. Pfeil and Bruce MacAllister

Teledyne Leeman Labs, 6 Wentworth Drive, Hudson, NH 03051, USA

As part of the methods update rule published on March 12 in the Federal Register, EPA method 245.7 was approved for the determination of mercury in wastewater. Method 245.7 (mercury in water by cold vapor atomic fluorescence spectrometry) provides better sensitivity and precision while requiring less sample preparation than the older cold vapor atomic absorption methods (245.1, 7470, etc.). Laboratories that measure mercury in wastewater will find that method 245.7 enables them to measure mercury at lower levels with less time and effort than ever before.

Method 245.7 was developed to satisfy the significantly lower detection limits required by recent amendments to the Clean Water Act. The method has a reporting limit of 5 parts per trillion (ppt), satisfying analytical requirements for most mercury water quality criteria. To put this in perspective, 12 ppt is the lowest water quality criterion for mercury outside the Great Lakes states.

Cold vapor atomic fluorescence and earlier cold vapor atomic absorption methodology will be compared in terms of sample collection and preparation, analysis time, detection limits, and quality control.

A New Solid Phase Extraction Instrument for the Fully Automated Analysis of EPA 1664A Oil and Grease Water Samples

Kurtis J. Montegna, Ken Kerwin, and Joseph Stefkovich

Aqueousblue Instruments, 3400 Shasta Gateway Dr., Shasta Lake, CA 96019, USA

EPA 1664A utilizes a liquid-liquid extraction (LLE) method for the analysis of oil and grease water samples. The EPA method also allows solid phase extraction (SPE) if the laboratory satisfies the method requirements. New instruments have been produced to accomplish SPE by 1664A. However, those instruments are not fully automated, require intensive analyst time, have hidden costs, and are expensive.

Laboratory demand for better SPE instrumentation is growing. In response, we have created a virtually labor-free instrument. The chemist needs only to screw a patented filter cartridge onto the sample bottle and attach it to the instrument, after which the chemist starts the analysis via a touch screen. Upon initialization, the instrument's balance will calibrate itself and store the data. After calibration is complete, the instrument will tare out the solute recovery container while methanol is applied to the filter. Once the computer-controlled filter activation cycle is complete, the methanol is purged and the sample is extracted. Hexane will then rinse the bottle and release the solute from the filter. The collected solute (extract) is robotically moved to an intrinsically safe heating area to evaporate the hexane. After evaporation, the extract moves to a desiccator and equilibrates. Again, the computer controls the equilibration cycle, after which the extract is weighed as per the method. Results are then calculated, stored internally, and can be exported to a printer, network, or LIMS. Analysis of 40 water samples a day is possible.
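
Whatever hardware performs the extraction, the reported oil and grease result reduces to a simple gravimetric calculation: the residue gained by the tared container per liter of sample. The sketch below is a generic illustration with made-up weights, not the instrument's firmware.

def oil_and_grease_mg_per_l(tare_g: float, final_g: float, sample_volume_l: float) -> float:
    """Hexane-extractable material (HEM): residue gained by the tared
    container, in mg, per liter of original sample."""
    return (final_g - tare_g) * 1000.0 / sample_volume_l

# Hypothetical run: the container gains 12.4 mg of residue from a 1.00 L sample.
print(round(oil_and_grease_mg_per_l(25.1180, 25.1304, 1.00), 1))  # 12.4 mg/L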

Characterization and Application of Novel Gemini Surfactants as Pseudostationary Phases in Capillary Electrophoresis

Cevdet Akbay, David M. Ahlstrom, Ernest E. Hooper,  Anam K. Lodhi, Asad A. Rizvi, and Lei Shi

Department of Natural Sciences, Fayetteville State University, 1200 Murchison Road, Fayetteville, NC 28301, USA

Understanding how drugs interact with cells is an ongoing area of research that attracts significant attention. However, the use of cells for this purpose is not always practical; thus, gemini surfactants can be used to help us understand drug-cell interactions at a deeper level, because above a certain concentration, called the critical aggregation concentration (CAC), these surfactants form aggregates (i.e., vesicles) that mimic cell membranes. Gemini surfactants are a new class of surfactants made up of two monomeric surfactants connected to each other by a spacer group. In the present research, two novel gemini surfactants (1,1′-didodecyl-1,1′-but-2-yne-1,4-diyl-bis-pyrrolidinium dibromide (I-12) and 1,1′-ditetradecyl-1,1′-but-2-yne-1,4-diyl-bis-pyrrolidinium dibromide (I-14)), as well as their counterpart conventional monomeric surfactants, were characterized and utilized as pseudostationary phases in capillary electrophoresis (CE) to test how compounds with different chemical properties, such as hydrophobicity and hydrogen bond forming capacity, interact with the vesicles. More than 30 benzene derivatives with varying hydrogen bond formation properties were used as standard analytes to test the separation power of the aforementioned novel surfactants. To optimize the CE separation, several parameters, such as surfactant and buffer concentrations and temperature, were systematically studied. The surfactant type and temperature were found to have a significant effect on the separation. In addition, solute-pseudostationary phase interactions were examined by determining Gibbs free energy, enthalpy, and entropy values.

Optimization of Lignin Degradation Components by Capillary Electrophoresis Mass Spectrometry

Roderquita K. Moore

Clark Atlanta University, 223 James P. Brawley Drive, Atlanta, GA 30317, USA

Lignin is the second most abundant organic substance on earth, after cellulose, and plays a central role in the natural ecological cycle of plants. The lignin macromolecule is composed of a number of ether- and carbon-linked methoxyphenols. Owing to the complexity of lignin samples, capillary electrophoresis mass spectrometry (CE-MS) appears to be an ideal technique for the analysis of the charged and uncharged components of lignin. Thirty-five lignin standards were used to investigate the optimization of CE-MS conditions. The optimized CZE-ESI-MS conditions were a mobile phase containing 20 mM ammonium carbonate, pH 9.5; a sheath liquid of 80/20 MeOH/H2O delivered at 5 μL/min; and a spray chamber set to a drying gas flow of 5 mL/min, a nebulizer pressure of 5 psi, and a drying gas temperature of 150°C. Further studies are underway to analyze lignin via micellar electrokinetic chromatography (MEKC) coupled to MS, using molecular micelles. After optimizing the peak capacity for the lignin standards, the method will be applied to the identification of lignin components in the sour orange tree.

Automated Microfluidic Sanger Sequencing Sample Preparation

Stevan Jovanovich, Richard Belcinski, Luliu Blaga,  Allen Boronkay, Helen Franklin, Corey Garrigues,  Roger McIntosh, Bill Nielsen, Mike Nguyen,  Phil Sison, and Jaclyn Taal

Microchip Biotechnologies, Inc., 6693 Sierra Lane, Suite F, Dublin, CA 94568, USA

Microfluidics and nanofluidics offer potential advantages in performance, reagent consumption, and size for many applications. However, their potential has seldom been realized, and the impact of miniaturization technologies remains largely a future vision. We will describe a new microfluidic and nanofluidic platform developed at MBI, Apollo 100, which automates fluorescent DNA sequencing sample preparation by integrating nanoliter scale cycle sequencing and bead-based cleanup on-chip.

The Apollo 100 is based upon our patented NanoBioProcessor technology that creates on-chip MOV minivalves, pumps, and routers with simple control of their operation using full-scale pneumatics. The Apollo 100 uses full-scale commercial robotics to load and unload NanoBioProcessor microchips. The user will input standard microtiter plates with DNA templates and receive back ready-to-inject microtiter plates with samples ready for capillary array electrophoresis analysis, saving reagent and personnel costs.

At the heart of the system, a microchip device performs multichannel nanoscale cycle sequencing reactions and cleanup. The microchip uses the on-chip MOV valves, pumps, and routers to mix samples and reagents, and to move beads, reagents, and wash solutions. The NanoBioProcessor microchip and the robot are controlled by MBI's DevLink software. Results of integrated nanoscale cycle sequencing and cleanup with read lengths over 1000 bases will be presented.

The future applications of the Apollo system in the development of genomic, biodefense, forensic, and other systems will be discussed, as well as how on-chip valves, pumps, and routers can enable many next-generation, modular, microfluidic devices for the analysis of liquids. The overall strategies, designs, and results will be presented to illustrate how to harness the power of microfluidics to create widely applicable, modular sample preparation platforms for both portable and laboratory analyses.

Automated Powder Pipetting with Random Access Sample Management and Weight Checking

James A. Lowe

Gilson, 1103 Lering Drive, Ballwin, MO 63011, USA

Automated sample management has long been a reality in the liquid handling sector. Only with recent improvements in powder aspiration and deposition technology has automated powder sample management become feasible.

The system under development will push the boundaries of dry powder automated sample handling by incorporating a multiparameter sampling automation process. A robotic XYZ platform with dual Z drives will select from 8 precalibrated powder pipettes, with the secondary Z drive moving samples about the internal robotic plane. The application requires the system to select a sample amid a library of 96-well sample boxes. The sorting will be accomplished via a carousel and bar code reading system. As some vials have press on caps and others have screw caps, the XYZ robot will move the selected samples to the correct zone within the robot plane to automatically decap and cap the vials. A balance incorporated into the same robotic plane will tare and weight-check each vial for the accuracy of the deposited dry powder.

The project presentation will prove the feasibility of dry powder automated sample handling for this and other prospective projects. A detailed accounting of the precision and accuracy of the sample handling data will be presented, along with project highlights and an operational overview.

Development of Online Rapid Sampling Microdialysis Monitoring for the Study of Human Intestinal Ischaemia

Emma P. Corcoles, Samer Deeba, George B. Hanna,  Martyn Boutelle, and Ara Darzi

Department of Bioengineering, Bagrit Centre, Imperial College London, London SW7 2AZ, UK

Intestinal ischaemia is a serious complication of many surgical procedures. We have used online rapid sampling microdialysis (rsMD) to detect ischemia in the human bowel. Our approach has been to place the probe in the bowel wall and analyze the dialysate online.

We have developed an online rsMD technique for monitoring metabolites in the human bowel during gastrointestinal surgery. This analyzes the dialysate glucose and lactate electrochemically at high time resolution (typically 30 seconds). The system was adapted from the previous biosensor system used to monitor neurochemicals in the human brain during surgery [23]. The method consists of a flow injection analysis (FIA) system coupled to an enzyme-based amperometric detector.

Dialysate levels in the human bowel stabilized within 12 minutes of probe implantation, for glucose (2.77 ± 0.57 mM, n = 7) and lactate (0.77 ± 0.17 mM, n = 7). After arterial resection, glucose showed a small decrease at 10 minutes (0.261 ± 0.085 mM, P = .0175, n = 7), recovered to preresection values by 17 minutes, and then decreased dramatically to 1.89 ± 0.39 mM (P = .0135, n = 6). Lactate levels rose steadily to reach 1.47 ± 0.53 mM (P = .0331, n = 7). Results from anastomosis microdialysis monitoring in animal models will be compared with the human data.

High-Throughput and Robust SEC Measurement Using Newly Developed All-in-One SEC Instrument for Polymers

Hiroshi Tomizawa, Toru Satoh, Hideo Suzuki,  Kuniyuki Tokunaga, Teruhiko Tsuda, Hiroyki Moriyama,  Yoichi Yasua, and Cara Tomasek

Tosoh Corporation, Shiba-Koen First Bldg. 3-8-2 Shiba, Montgomeryville, Tokyo 105-8623, Japan

EcoSEC, a new sophisticated GPC/SEC instrument, was developed for semimicro and high-throughput SEC measurements. The equipment consists of an online degasser, autosampler, temperature-controlled pumping system, column oven, refractive-index (RI) detector, and optional UV detector. All units mount in a chassis and are computer-controlled.

We evaluated the fundamental characteristics of this new GPC/SEC equipment, and the results will be reported. We investigated the precision and the reproducibility of flow rate, the precision of temperature control in the pumping unit, column oven and RI detector, the baseline stability against external temperature change, the sensitivity of the RI detector, and other system characteristics. Furthermore, the precision and the reproducibility of molecular mass and molecular mass distribution were evaluated using semimicro TSK-GEL SuperMultipore columns and PStQuick polystyrene calibration markers.
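
When molecular masses are obtained by conventional column calibration against narrow polystyrene markers, the averages are computed from a log M versus retention-volume fit and the chromatogram slice heights. The sketch below shows that textbook calculation with synthetic numbers; it is an illustration only, not Tosoh's EcoSEC data processing.

import numpy as np

def sec_averages(retention_ml, ri_signal, standards_ml, standards_mw):
    """Mn and Mw by conventional column calibration: fit log10(M) versus
    retention volume to narrow standards, convert each chromatogram slice
    to a molar mass, and form the usual moment averages."""
    coeffs = np.polyfit(standards_ml, np.log10(standards_mw), 1)
    mass = 10.0 ** np.polyval(coeffs, np.asarray(retention_ml, dtype=float))
    h = np.clip(np.asarray(ri_signal, dtype=float), 0.0, None)  # baseline-corrected slices
    mn = h.sum() / (h / mass).sum()
    mw = (h * mass).sum() / h.sum()
    return mn, mw

# Tiny synthetic example: three narrow PS markers and a five-slice trace.
mn, mw = sec_averages(
    retention_ml=[5.0, 5.2, 5.4, 5.6, 5.8],
    ri_signal=[0.1, 0.6, 1.0, 0.5, 0.1],
    standards_ml=[4.5, 5.5, 6.5],
    standards_mw=[1.0e6, 1.0e5, 1.0e4],
)
print(f"Mn = {mn:.3g}, Mw = {mw:.3g}, PDI = {mw / mn:.2f}")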

The results demonstrate that the new GPC/SEC instrument was suitable for high-throughput SEC as well as for semimicro SEC applications. Moreover, the instrument, in combination with TSK-GEL SuperMultiporeHZ series columns and PStQuick calibration standards, showed a high level of robustness in terms of site-to-site reproducibility of molecular mass measurement. The EcoSEC instrument, equipped with a TSKgel SuperMultiporeHZ-H column, was applied to the high-throughput analysis of several important industrial polymer samples.

Optimization of Capillary-Channelled Polymeric Fibers as Stationary Phases for the Separation of Water Soluble Polymers by RP-HPLC

Katie J. Hilbert and R. Kenneth Marcus

Clemson University, 219 Hunter Laboratories, Clemson, SC 29634, USA

Size-based separation methods are universally accepted for polymer mixtures, particularly when analytes differ greatly in size. However, this technique lacks selectivity when analytes have similar molecular weights. Therefore, it is advantageous to develop a chemically based separation method to allow improved selectivity for the analyte of interest by exploiting the chemical differences of the analytes.

Previous work in this laboratory has demonstrated the use of capillary-channeled polymer (C-CP) fiber columns for the reversed-phase (RP) separation of proteins. Previously, C-CP fiber columns have been used to separate a suite of 4 proteins in less than 4 minutes at a flow rate of 7 mL/min [24]. The C-CP fiber format facilitates separations based on chemical interactions without contributions from size exclusion phenomena; therefore it was hypothesized that this technique could be adapted to separations involving synthetic macromolecules (i.e., polymers). Due to the structural and synthetic differences between proteins and polymers, it is necessary to evaluate the chromatographic performance of the C-CP fiber columns in conjunction with polymers.

Our laboratory has separated two water-soluble polymers (poly(4-vinylpyridine hydrochloride) and glycolic acid ethoxylate 4-nonylphenyl ether) which could not be separated by SEC. The initial separation was done under gradient conditions at a flow rate of 0.5 mL/min. After an evaluation of varying separation conditions, an optimum separation was achieved using water and acetonitrile (ACN) both with 0.1% trifluoroacetic acid (TFA) at a 20% gradient and a flow rate of 1 mL/min.

Going Paperless: Converting from Paper Forms to an Electronic Laboratory Notebook

Dale Seabrooke

Labtronics Inc., 546 Governors Road, Guelph, ON, Canada N1K 1E3

As laboratories start to consider replacing their paper-based processes with an electronic laboratory notebook, they are often surprised at how many paper forms their organization actually uses. In many cases, they discover that there are dozens of forms with multiple pages and multiple variations that are widely distributed throughout the enterprise.

Converting every single variation of every paper form to an electronic document can appear to be a daunting task.

Our presentation will help organizations manage their process of moving away from paper by providing guidelines and insight into (1) determining which processes should be a priority for conversion, (2) identifying the levels of automation that can be incorporated into ELN forms, (3) identifying points of integration between the ELN and existing systems, and (4) different processes for converting from paper.

Organizations will learn how to turn the conversion process into an opportunity to assess the forms they are using, consolidate forms that share common characteristics and formats, and prioritize forms within the conversion process.

With this information, they will be better prepared to launch an ELN project that will introduce maximum benefits with minimal impact on the workplace both during the transition process and once the new system is in place.

Calculating the Benefits for the Introduction of Laboratory Informatics Solutions

George Gallatig and John Petrakis

Patni Life Science, Inc., 1170 Us Highway 22 East, Bridgewater, NJ 08807, USA

Introducing a standardized enterprise-wide solution can lower the total cost of ownership (TCO) for laboratory information management systems (LIMSs). Electronic notebooks (ELNs) can improve the personal productivity of analysts working in research and quality-control laboratories. Data management and data warehouse systems provide ready access to information, and can help reduce search time and duplication of effort by knowledge workers. Providing integrated analytical tool kits and training for scientists can reduce some of the chaos and support costs of research, and provide new opportunities for sharing and building on innovative ideas.

We intuitively know that there is value in implementing these laboratory informatics solutions. These solutions help companies in all industries to produce better products with shorter development cycle times and at lower cost. However, business decisions to acquire them have to be based on the review of a business case including a cost justification.

This presentation will offer ideas and a model for calculating the benefits of laboratory informatics solutions. The model will include rules of thumb, based on industry experience, for projecting productivity improvements resulting from the introduction of technology to replace functions typically performed manually, such as data entry, data access, or the compilation of reports. However, projections of productivity improvement are only part of the story. These need to be translated into actual benefits that are measurable by the corporation in terms of speed, productivity, or cost. The author will describe strategies for accomplishing this, supported by examples.
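
In practice, a benefit model of this kind usually reduces to multiplying task volume by the time saved per task and a loaded labor rate. The sketch below is a deliberately simple, hypothetical illustration of that calculation; the figures are invented and are not the author's rules of thumb.

def annual_benefit_usd(tasks_per_year: int, minutes_saved_per_task: float,
                       loaded_rate_usd_per_hour: float) -> float:
    """Labor value of the time saved by automating one manual step
    (data entry, data access, report compilation, and so on)."""
    hours_saved = tasks_per_year * minutes_saved_per_task / 60.0
    return hours_saved * loaded_rate_usd_per_hour

# Hypothetical example: 20,000 results per year, 3 minutes of transcription and
# checking saved per result, at a $60/h loaded labor rate.
print(annual_benefit_usd(20000, 3.0, 60.0))  # 60000.0 USD per year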

This will be a valuable presentation for anyone involved in looking at the impact of the introduction of laboratory informatics technology, whether developing a business case or looking at benefits being captured by systems that have been implemented.

Building an Electronic Laboratory Notebook for Routine Analyses

Dale Seabrooke

Labtronics Inc., 546 Governors Road, Guelph, ON, Canada N1K 1E3

Most electronic laboratory notebooks (ELNs) have been designed to accommodate R&D requirements, where variety and flexibility are the name of the game.

For laboratories that are performing routine analyses on a regular and ongoing basis, there is a need for an ELN that does much more than simply record results and observations. In this environment, there is an opportunity for the ELN to become a proactive component that controls the analytical process and ensures that every aspect of an SOP is being fulfilled.

Creating an ELN for this environment requires looking beyond the basics and considering all of the factors that are part of ensuring conformance to an SOP. These factors include confirming analysts' qualifications, completing instrument calibrations, confirming proper use of reagents, solutions and chemicals, automating calculations and limit checks, as well as ensuring step-by-step adherence to the SOP process.

Of course all of these factors have to be considered right at the bench level, as the analysis is taking place.

Our presentation will show how starting out with these requirements in mind results in an electronic laboratory notebook that addresses each of these factors, in real time, ensuring that every routine procedure fulfills all aspects of SOP requirements.

Achieving Great Science Bringing Together Physical Measurements and Chemical Analysis

Michael Boruta, Michel R. Hachey, and Keith Kociba

Advanced Chemistry Development, Inc., 102 Stalh Road, Toronto, PA 18966, USA

In today's laboratory, it is common to be involved in many aspects of the physical/chemical analysis of materials. Having the ability to keep track of all of these disparate data can be a major challenge whether we are generating all of the data ourselves or they are coming from several departments. In addition, with all of these different data, the possibility of misplacing something is much higher than we would like to admit. This talk will look at a tool that can be used to manage, analyze, annotate, and create reports from data sources as diverse as TGA, DSC, XRPD, optical spectroscopy, chromatography, Mass Spec, NMR, chemical structures, and metadata. Having all of the data available to review and share across an organization improves the decision-making process and allows great science to be achieved.

Segmented Flow Analysis Using Microfluidic Devices and High-Efficiency Capillary Electrophoresis

Gregory T. Roman, Meng Wang, and Robert Kennedy

University of Michigan, 930 N. University Dr., Ann Arbor, MI 48103, USA

A microfluidic device has been developed that combines continuous plug generation, reagent mixing, online fluorescent derivatization, and plug fusion to an aqueous stream for high-efficiency capillary electrophoresis (CE) and laser-induced fluorescence (LIF) detection of amino acids (AAs). Hydrophobic and hydrophilic patterned channels were employed to assist in aqueous phase breakup at a T-junction, interplug chemical isolation, and plug fusion at a virtual wall. Differential glass etching was also employed to generate high-fluidic-resistance electrophoresis channels with 8.5 μm depth and low-resistance droplet breakup and mixing channels with 80 μm depth. The former enabled high-efficiency electrophoresis, while the latter was useful for generating plugs with volumes and temporal intervals ranging between 7.8–28.2 nL and 0.6–10 seconds, respectively. Online fluorescent conjugation of amino acids was performed in plugs using multiple derivatization chemistries, including fluorescein isothiocyanate (FITC) and naphthalene-2,3-dicarboxaldehyde (NDA). Reagents were combined laminarly immediately prior to plug breakup at the T-junction. Turbulent mixing of NDA, CN⁻, and AAs was performed in plugs for 1-2 minutes prior to droplet fusion. Kinetics of the fluorogenic NDA reaction were recorded over the length of the reaction channel and were found to reach completion at ~65 seconds after initial mixing. Plug fusion into a continuous flow stream was demonstrated using a parallel current fusion technique. Following fusion, capillary electrophoresis of fluorescein, rhodamine 110, FITC-AA, and NDA-AA conjugates was performed with high efficiencies.

A Digital Microfluidic Approach to Homogeneous Enzyme Assays

Elizabeth M. Miller and Aaron R. Wheeler

University of Toronto, 160 College St., Rm. 440, Toronto, ON, Canada M5S 3E1

A digital microfluidic device was applied to a variety of enzymatic analyses. The digital approach to microfluidics manipulates samples and reagents in the form of discrete droplets, as opposed to the streams of fluid used in channel microfluidics. This approach is more easily reconfigured than a channel device, and the flexibility of these devices makes them suitable for a wide variety of applications. Alkaline phosphatase was chosen as a model enzyme and used to convert fluorescein diphosphate into fluorescein. Droplets of alkaline phosphatase and fluorescein diphosphate were merged and mixed on the device, resulting in a 140 nL stopped-flow reaction chamber in which the fluorescent product was detected by a fluorescence plate reader. Substrate quantitation was achieved with a linear range of 2 orders of magnitude and a detection limit of approximately 7.0 × 10⁻²⁰ mol. Addition of a small amount of a nonionic surfactant to the reaction buffer was shown to reduce the adsorption of enzyme to the device surface and extend the lifetime of the device without affecting the enzyme activity. Analyses of the enzyme kinetics and the effects of inhibition with inorganic phosphate were performed, and the Km and kcat values of 1.35 μM and 120 s⁻¹ agreed with those obtained in a conventional 384-well plate under the same conditions (1.85 μM and 155 s⁻¹). A multiplexed device was also developed to perform 6 simultaneous determinations in an effort to reduce analysis times and variation among replicates. It was concluded that the digital microfluidic format is able to perform detailed and reproducible assays of substrate concentrations and enzyme activity in much smaller reaction volumes and with higher sensitivity than conventional methods.
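
Km and kcat values of the kind quoted above come from a standard Michaelis-Menten analysis of initial rates. The minimal fitting sketch below uses invented rate data and an assumed total enzyme concentration; it illustrates the analysis, not the authors' data or code.

import numpy as np
from scipy.optimize import curve_fit

def michaelis_menten(s, vmax, km):
    return vmax * s / (km + s)

# Invented initial-rate data: substrate concentrations (uM) and rates (uM/s).
s = np.array([0.25, 0.5, 1.0, 2.0, 5.0, 10.0, 20.0])
v = np.array([0.019, 0.034, 0.054, 0.076, 0.098, 0.108, 0.113])

(vmax, km), _ = curve_fit(michaelis_menten, s, v, p0=[0.1, 1.0])
enzyme_conc_um = 0.001            # assumed total enzyme concentration (uM)
kcat = vmax / enzyme_conc_um      # turnover number in s^-1
print(f"Km = {km:.2f} uM, kcat = {kcat:.0f} s^-1")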

Improved Temporal Resolution with Segmented Flow

Meng Wang

University of Michigan, 930 North University Street, Ann Arbor, MI 48105, USA

Temporal resolution is a key figure of merit for in vivo experiments. With microdialysis sampling, temporal resolution is primarily limited by the band broadening of sample plugs as they flow from the sampling probe to the analysis system, owing to parabolic flow and diffusion. This problem is especially evident when the sampling flow rate is low (10–300 nL/min). To solve this problem, a segmented flow system was developed in which the aqueous sample stream emerging from a microdialysis probe was converted to separate droplets and transported in an immiscible oil phase, for example, perfluorodecalin. A T-junction on a PDMS-based chip was used for droplet generation. The relations between flow rate, droplet volume, and time interval were studied and compared with an existing model. To test the ability of this system to improve temporal resolution, a homemade microdialysis probe was coupled to the chip, and fluorescein was sampled at a flow rate of 200 nL/min. After generation, droplets were transferred to a capillary rendered hydrophobic by derivatization with octadecyltrichlorosilane. Droplets were detected by laser-induced fluorescence (LIF) both immediately after they were formed and approximately 40 cm downstream. As the fluorescein concentration was changed, the response time for both on-chip and on-capillary detection was recorded, and the resulting plots showed that they were exactly the same. In comparison, for continuous flow, deterioration in temporal resolution was observed downstream. Several optimizations of the experimental conditions were made to further shorten the response time. Finally, this system was applied to detect glucose concentration changes in rat brain. In this case, a triple-branch inlet channel was used to mix enzyme and dye with the glucose sample stream before droplets were formed. Reactions between glucose and the derivatizing reagents took place within the droplets as they flowed from the T-junction to the detection window on the collection capillary.
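
To first order, the relation among dialysate flow rate, droplet volume, and droplet spacing is mass balance: each droplet carries the aqueous volume delivered during one generation interval. The short sketch below is only a back-of-the-envelope illustration with assumed values, not the model referenced in the abstract.

def droplet_interval_s(aqueous_flow_nl_per_min: float, droplet_volume_nl: float) -> float:
    """Time between droplets if each droplet carries all of the aqueous flow
    delivered during one generation cycle (mass balance only, no breakup physics)."""
    return droplet_volume_nl / (aqueous_flow_nl_per_min / 60.0)

# At 200 nL/min, a 10 nL droplet would be produced roughly every 3 seconds; the oil
# between droplets then preserves this segmentation during transport instead of
# letting parabolic flow and diffusion smear it out.
print(droplet_interval_s(200.0, 10.0))  # 3.0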

Simultaneous Chemical and Morphological Analysis of Pharmaceutical Granules

Janie Dubois, Kenneth Haber, and E. Neil Lewis

Malvern Instruments, Inc., 7221 Lee Deforest Dr., Suite 300, Columbia, MD 21046, USA

Pharmaceutical granules pose new QA/QC challenges to the analytical laboratory because their formulation is often not as well understood as the average pressed tablet. Size and weight, in combination with dosage, often suffice to monitor quality, but are not very helpful for troubleshooting performance issues. In this presentation, we discuss two approaches to measure both chemistry and morphology in pharmaceutical granule formulations or pregranulation samples. First, we describe a method for the simultaneous measurement of size, shape, and chemical composition of hundreds of granules using near infrared chemical imaging (NIR-CI). Tens to hundreds of granules are spread in the field of view (FOV) for data acquisition, and the chemistry of each granule is acquired, with a spatial resolution depending on the selected FOV. Chemometrics analysis is performed on the dataset to generate contrast relevant to the chemistry of the samples, and the morphology is investigated with respect to the chemistry. The second approach described is primarily a morphological measurement performed with an automated microscope equipped with a single-point Raman spectrometer. With this instrument, the morphological features of hundreds of thousands of granules are measured and the chemical information (Raman spectrum) is automatically acquired only from discrete granules selected on the basis of their morphology. This presentation focuses on results showing how these techniques may be used for root-cause analysis of dissolution failure and blend heterogeneity. They indicate that the performance of the unit dose shows dependency on the different chemistries and morphologies of individual granules rather than on the average of all granules in a batch.
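
One common way to generate contrast relevant to the chemistry of such an image is to unfold the hyperspectral cube into a pixels-by-wavelengths matrix and apply a factor method such as principal component analysis. The sketch below is a generic illustration of that step with synthetic data and hypothetical dimensions, not Malvern's software.

import numpy as np
from sklearn.decomposition import PCA

def score_images(cube, n_components=3):
    """cube: (rows, cols, wavelengths) NIR image. Returns per-pixel PCA score
    maps, shape (n_components, rows, cols), which can be thresholded to pick
    out granules of different chemistry."""
    rows, cols, wavelengths = cube.shape
    unfolded = cube.reshape(rows * cols, wavelengths)
    scores = PCA(n_components=n_components).fit_transform(unfolded)
    return scores.T.reshape(n_components, rows, cols)

# Synthetic 64 x 64 pixel cube with 120 wavelength channels (hypothetical sizes).
cube = np.random.rand(64, 64, 120)
print(score_images(cube).shape)  # (3, 64, 64)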

Real-Time Particle Classification Using Pattern Recognition

Lew Brown

Fluid Imaging Technologies, Inc., 65 Forest Falls Drive, Yarmouth, ME 04096, USA

The use of pattern recognition algorithms in digital image processing is not a new phenomenon. Research and development of algorithms for pattern recognition have been going on for many years, driven initially by applications in the intelligence community for analysis of reconnaissance imagery. Later on, the techniques were successfully applied to other fields, the most prominent being medical imaging. However, in most cases the algorithms were developed for looking at static images, where an entire image is being searched to try to locate one particular object within it.

This paper will discuss applying the principles of pattern recognition in a new way, to characterize particles in a moving fluid in “real time,” with the objective being the determination of the concentration of multiple different particle types in a heterogeneous solution. This technique has the potential to greatly advance the sensitivity of monitoring systems in a process environment, by providing much more information on the state of the process in real time. The availability of this information will permit a much tighter “feedback loop” for quality monitoring, and therefore can yield significant cost savings by being able to proactively adjust process variables.

A brief overview of pattern recognition theory will be followed by a discussion of how these well-developed techniques can be applied to monitor particle concentrations for multiple particle types within a heterogeneous fluid. A specific application will be detailed showing results obtained, and suggestions will be made as to other areas where this technique may be applied.
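
At its core, the real-time concentration estimate described here amounts to reducing each imaged particle to a feature vector, classifying it, counting particles per class, and dividing by the analyzed fluid volume. The sketch below illustrates that logic with scikit-learn; the feature set, class names, and training data are hypothetical stand-ins, not the vendor's algorithms.

import numpy as np
from collections import Counter
from sklearn.ensemble import RandomForestClassifier

# Train on a labeled library of particle images reduced to features such as
# size, aspect ratio, circularity, and mean intensity (feature set hypothetical;
# random numbers stand in for real training data here).
X_train = np.random.rand(300, 4)
y_train = np.random.choice(["crystal", "bubble", "agglomerate"], size=300)
classifier = RandomForestClassifier(n_estimators=100).fit(X_train, y_train)

def class_concentrations(particle_features, analyzed_volume_ml):
    """Particles per mL for each class observed during one imaging interval."""
    labels = classifier.predict(particle_features)
    return {cls: n / analyzed_volume_ml for cls, n in Counter(labels).items()}

# 50 particles imaged in 0.25 mL of fluid (again, random stand-in features).
print(class_concentrations(np.random.rand(50, 4), analyzed_volume_ml=0.25))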

Automated Cross-Fractionation Instrument

Alberto Ortin, Benjamin Monrabal, Raquel Ubeda,  Juan Sancho-Tello, and Pilar Del Hierro

Polymer Characterization SA, C/ Nicolás Copérnico 10, Paterna, 46980 Valencia, Spain

Polymer ChAR has developed a fully automated cross-fractionation apparatus in a benchtop instrument, called CFC. It is intended to measure the interdependence between the molecular weight distribution and the chemical composition distribution in polyolefins in a single experiment, under full computer control. The infrared detector is used to obtain the MWD of each fraction, and the 3D surface plot on temperature and log Mw axes is generated from these results. Other 2D plots are also provided, such as the MWD profile with composition dependence and the composition profile with molar mass dependence for the whole sample.

The CFC is capable of analyzing up to 5 consecutive samples automatically and incorporates a subambient option down to −20°C. Two samples a day can be analyzed under standard conditions (see Figure 13).

The virtual instrumentation software provides instrument control and monitoring of the whole process and signals generated. The system includes remote control access for easy diagnosis and software upgrades.

Development of Online UV/Pyrolysis-GC/MS System for the Analysis of Photo/Thermal/Oxidative-Degradation Process of Polymeric Materials

Robert Freeman, Akihiko Hosaka, Chu Watanabe,  Tetsuro Yuzawa, and Shin Tsuge

Quantum Analytics, 363 Vintage Park Drive, Koriyama 94404-1185, Japan

The physical and chemical performance of most polymeric materials gradually degrades due to external effects such as heating, photoradiation, oxidative atmospheres, and mechanical stress. During the degradation process, not only the decomposed compounds formed from the sample but also the structural alteration of the sample itself is an important target of analysis. With this information, it would be possible to prepare advanced materials by modifying their molecular structures and/or selecting appropriate additives. For these analyses, a new analytical method has to be developed.

A new analytical instrument using an online ultraviolet (UV) radiator combined with a multifunctional microfurnace pyrolyzer (double-shot pyrolyzer, Frontier Lab) and capillary column GC/MS will be described. A UV beam was spotted onto a small amount of polymer sample set in the pyrolyzer through an optical fiber cable. The gas evolved from the irradiated polymer sample was analyzed online by GC, and the residual polymer sample was then pyrolyzed in the pyrolyzer to give a specific pyrogram. Based on both sets of information, the deterioration mechanism of the polymeric material during irradiation and the effect of additives such as photostabilizers and UV absorbers can be evaluated using submilligram amounts of polymer sample in a relatively short test period compared with conventional techniques such as a weather meter. Here, the basic performance of this system was examined using typical polymer materials such as polystyrene, polypropylene, and polycarbonate. These examples clearly demonstrate the effectiveness of the online UV/pyrolysis-GC/MS system for characterizing the photo/thermal/oxidative degradation of polymers.

New Automated Overhead Stirrer

Werner Zinsser

Zinsser Analytic, 19145 Parthenia Street, Suite C, Northridge, CA 91324, USA

Formulation studies of new products in the fine chemicals and specialty chemicals fields have motivated our company to design an overhead stirrer. This unique stirring module has 8 reactors, typically 20 or 30 mL glass vials, in which 8 rotating blades allow perfect mixing of the different products. It has several advantages compared to traditional vortexers or magnetic stirrers, especially when high or increasing viscosities are involved in a chemical reaction process, as the overhead stirrer increases turbulence and keeps the contents moving. Each reactor can be filled through a cross-slitted septum by our special probes or disposable tips, while the WinLISSY software digitally controls the mixing. The pipetting channel is integral to the mixer, ensuring quick and efficient mixing of reagents, which are delivered to the heart of the reaction. A temperature-controlled carrier is available for heating the reactions, and a reflux option fits neatly on top.

In Situ Simultaneous NOx and O2 Monitoring

Peter DeBarber

HORIBA Instruments, Inc., 17671 Armstrong Avenue, Irvine, CA 92614, USA

Horiba has developed and is selling an in situ monitor for combustion measurement applications. The INM-700 simultaneously measures NOx and O2 using a novel zirconium oxide detector. The in situ probe geometry requires little of the setup time, sample conditioning, and maintenance required for systems employing extractive sampling, thus offering reduced initial cost and cost of ownership. This paper outlines the successful testing of a new approach to in situ, simultaneous NOx and O2 analysis. The sensor technology described was demonstrated for a coal-fired power plant application.

Direct Mercury Analysis of Liquid Hydrocarbons

Jason Gray, Alvin Chua, and Koji Tanida

Nippon Instruments North America, 1511 Texas Avenue, College Station, TX 77840, USA

Accurately measuring the mercury content of liquid hydrocarbons, such as naphtha, gasoline, kerosene, light oils, crude oils, and petroleum condensates, can be a very daunting task. Often the mercury concentrations of hydrocarbon samples can be in the low to subpart per billion range, while the diversity and complexity of these samples have few rivals. Conventional analysis techniques involve a hazardous and tedious acidic digestion that consumes several hours of the analysts' laboratory time, and have been known to produce over half of the errors generated during any sample analysis. One known technique for eliminating the digestion process for mercury analysis is the use of high-temperature combustion to decompose the sample into a gaseous state. This thermal decomposition of the sample, accompanied by gold amalgamation and atomic spectroscopy, is commonly referred to as “direct mercury analysis.” In recent years, this technique has commonly been used for environmental samples; however Nippon Instruments Corporation has expertly adapted it to the analysis of liquid hydrocarbons in the Model PE-1000 Mercury Analyzer.

In this presentation, data will be provided to illustrate the high-quality results obtained through the use of Model PE-1000. The capabilities of this analyzer will be discussed in detail.

Analysis of Volatile and Semivolatile Compounds in Environmental Monitoring by Thermal Desorption

Ilaria Ferrante and Manuela Bergna

DANI Instruments S.p.A., Viale Brianza 87, 20093 Cologno Monzese, Italy

The thermal desorption technique is commonly known to be applied to the monitoring of volatile organic pollutants in air; the numerous advantages make it a successful alternative to the solvent extraction technique. The recent improvements in the TD instrumentation extend the use of this technique to the analysis of compounds with a wider boiling point range.

This paper shows the application of a thermal desorber to the environmental monitoring of both VOCs (volatile organic compounds) and SVOCs (semivolatile organic compounds). An automatic leak check ensures the complete integrity of each sample.

The typical target compounds are BTEX, chlorinated hydrocarbons, pesticides, PAHs, phenols, and phthalates. Thermal desorption is a good alternative to conventional extraction procedures and allows the reduction of analysis times without loss of precision.

Optimization of the Technological Advancements in a Purge and Trap System for the Determination of VOCs by US EPA Methodologies

Jeff Sheriff and Jim Monk

EST Analytical, 503 Commercial Dr., Fairfield, OH 45014, USA

There are several demands and requirements imposed on chemists performing volatile organic analysis in today's environmental laboratory. First and most important, the analysis must be performed in compliance with USEPA methodologies. Next, there is a continuing trend toward lower levels of detection. As a result of the required low detection levels, water management and carryover reduction have become more of a concern.

This paper will present the optimum purge and trap system parameters used to generate method 8260B data. The conditions utilized will provide the sensitivity, linearity, and accuracy required for compliance with the method. In particular, the revolutionary advancement used to virtually eliminate carryover below the maximum contaminant level will be highlighted. Other technological advancements in a purge and trap system will also be demonstrated. Analytical results, including calibration factors, method detection limits, and reproducibility data in both water and soil matrices, will be shown.

Automated SPME-GC-FID Determination of Parabens in Wastewater

Rosa Vatinno, Janusz Pawliszyn, Sanja Risticevic, and Stefano Pelagatti

Chemistry Department, University of Waterloo, 200 University Avenue West, Waterloo, ON, Canada N2L 3G1

Parabens are a group of chemicals used as preservatives in the cosmetic, pharmaceutical, and food industries for their bactericidal and fungicidal properties. Their efficacy as preservatives, in combination with their low cost and their long history of safe use, probably explains their widespread use.

However, considerable interest in the toxic effects of continuous exposure to low levels of these compounds has arisen following the recent discovery of parabens in tissue samples from human breast tumors. A weak estrogenic activity for some parabens has been demonstrated by in vivo and in vitro tests.

Urban wastewater constitutes one of the main environmental sources of these species in the biosphere; therefore, it must be the first compartment to be investigated in order to know the levels of parabens and understand their fate in the aquatic media.

Available methods for the determination of parabens in wastewater consist of a solid phase extraction (SPE) as the concentration technique, followed by a liquid (LC) or gas chromatographic (GC) analysis with mass spectrometry detection.

A new method based on solid phase microextraction-gas chromatography-flame ionization detection (SPME-GC-FID) is proposed as a screening tool to monitor the presence of parabens in the environment and/or food. The new Thermo TriPlus autosampler enabled completely automated development of the SPME method. A CW/DVB fiber was chosen as the best extraction coating, and a good chromatographic separation was achieved in only 23 minutes.

The proposed method is fast, economical, and suitable for the large numbers of samples encountered in routine analysis.

Continuous Flow Analysis of ppb-Level Total Phosphorus in Natural Waters following Manual Persulfate Digests as an Alternative to Kjeldahl Methodologies

Colin Evett and Craig Ranger

O.I. Analytical, P.O. Box 9010, 151 Graham Road, College Station, TX 77842-9010, USA

Eutrophication, a detrimental increase in an ecosystem's microbial population due to an influx of nutrients, is a serious concern in natural water environments. The homeostatic balance of these ecological systems depends upon the presence of organic phosphates in lakes, streams, and ponds. Controlling these compounds requires increasingly accurate and consistent detection at ever lower concentrations.

Widely used methodologies for phosphorus analysis, such as USEPA Method 365.4, require a digestion step using a mercury catalyst to convert inorganic phosphates and polyphosphates into measurable orthophosphate. These Kjeldahl-style methodologies of sample preparation demand strict postdigestion waste management to prevent exposure to dangerous mercury compounds. Moreover, these methods often limit the analyst to sample measurements of 10 ppb and higher. Increased environmental awareness of the eutrophic effects of phosphorus and the toxic effects of mercury exposure has driven state and local agencies to seek analytical methods that can detect phosphorus at low levels without incurring the risk and expense associated with hazardous mercury digestions.

An automated continuous flow analysis method for phosphorus detection at low ppb levels will be presented. This alternative methodology will utilize manual persulfate digestions for sample preparation and injected segmented flow analysis (iSFA). Calibration data, linear range measurements, method detection limits (MDLs), and representative sample analyses will be reported.
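
Method detection limits of the sort to be reported are conventionally derived from replicate low-level spikes using the Student's t approach of 40 CFR 136 Appendix B. The sketch below illustrates that calculation with invented replicate results, not the authors' data.

import numpy as np
from scipy import stats

def method_detection_limit(replicate_results):
    """MDL = t(n-1, 99%) * s for n replicate low-level spiked samples."""
    r = np.asarray(replicate_results, dtype=float)
    t_value = stats.t.ppf(0.99, df=len(r) - 1)
    return t_value * np.std(r, ddof=1)

# Seven invented replicate results for a low-level total phosphorus spike (ppb).
print(round(method_detection_limit([1.9, 2.1, 2.0, 2.2, 1.8, 2.1, 2.0]), 2))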

A Novel Turn-Key LC/MS/MS Replacement Strategy for Traditional LC/UV Drug Screening

Andre Schreiber, Houssain El Aribi, Tania Sasaki,  Sebastian Dresen, and Wolfgang Weinmann

Applied Biosystems, 71 Four Valley Drive, Concord, ON, Canada L4K4V8

With about 5% of the population between the ages of 15 and 64 (~200 million people) using illicit drugs and thousands of drug intoxications per year in the western world alone, fast screening methods for drugs and pharmaceuticals are necessary for the detection of xenobiotics in forensic intoxication cases. Identification of these drugs in biological fluids is currently performed by a variety of analytical techniques, including immunoassay tests, available only for a small number of substance classes, or chromatographic techniques such as GC/MS and LC/UV. Although these techniques are well established and widely used, they suffer from many limitations, including laborious and time-consuming derivatization steps for the analysis of nonvolatile and polar drugs by GC/MS. LC is ideally suited for polar compounds, but UV detection lacks the necessary specificity, and methods require long runtimes to minimize the potential for coelution. In addition, the sensitivity of LC/UV is limited, while newer drugs are used at lower therapeutic concentrations.

In recent years, LC/MS/MS has been increasingly used for toxicological screening applications with mass spectral library searching to confirm detected drugs.

A highly specific and sensitive screening and confirmatory LC/MS/MS method was developed for the simultaneous analysis of 300 compounds relevant in forensic toxicology, such as drugs of abuse, pharmaceuticals, and their metabolites using a QTRAP LC/MS/MS system. The method is built into Cliquid Drug Screen and Quant software for routine forensic toxicology. The new software allows screening for unknown drugs in urine and blood samples with preconfigured methods and automatic processing and reporting. Identification is provided using a comprehensive library of MS/MS spectra.

Simultaneous Determination of Methamphetamine and Dimethyl Sulfone in Crystalline Methamphetamine Seizures by Fast Gas Chromatography

Hiroyuki Inoue, Kenji Kuwayama, Yuko T. Iwata,  Tatsuyuki Kanamori, Kenji Tsujikawa, and Hajime Miyaguchi

National Research Institute of Police Science, 6-3-1 Kashiwanoha, Kashiwa, Chiba 277-0882, Japan

Methamphetamine is a powerfully addictive psychostimulant drug that stimulates the central nervous system. The drug illegally circulated in Japan is generally in the form of white crystals, so-called "ice," and its purity is usually very high. In recent years, however, we have encountered methamphetamine seizures mixed with dimethyl sulfone, a common cutting agent. In this study, we present a method for the simultaneous determination of methamphetamine and dimethyl sulfone in methamphetamine seizures by fast gas chromatography. A hundred milligrams of ground sample (seized crystals or powder) was weighed and dissolved in 50 mL of distilled water. A 0.5 mL portion of the solution was added to 0.2 mL of 40% potassium carbonate solution and extracted with dichloromethane containing diphenylmethane as an internal standard. After centrifugation, a portion of the extract was subjected to gas chromatography with a flame ionization detector on a DB-17 capillary column (0.1 mm i.d. × 10 m, film thickness 0.1 μm). The use of a narrow-bore column offered fast and complete separation of the 3 substances within 2 minutes. Methamphetamine was completely recovered from the sample solutions, although the recovery of dimethyl sulfone was 46%. The method was linear over the range investigated (0.02–2.4 mg/mL, equivalent to 1–120% in powdered samples). Limits of detection were estimated to be 0.003 mg/mL for methamphetamine and 0.01 mg/mL for dimethyl sulfone. The method was subsequently applied to the analysis of methamphetamine seizures, and the contents of dimethyl sulfone were found to range from 0% to 90%.
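
Quantitation against the diphenylmethane internal standard follows the usual response-ratio calibration. The sketch below illustrates that calculation with invented peak areas (the calibration levels simply span the linear range quoted above) and is not the authors' data processing.

import numpy as np

# Calibration levels (mg/mL) spanning the quoted linear range, with invented
# analyte/diphenylmethane peak-area ratios.
conc = np.array([0.02, 0.1, 0.5, 1.0, 2.4])
ratio = np.array([0.011, 0.054, 0.268, 0.541, 1.292])
slope, intercept = np.polyfit(conc, ratio, 1)

def concentration_mg_per_ml(area_analyte: float, area_internal_standard: float) -> float:
    """Back-calculate the concentration from the measured response ratio."""
    return (area_analyte / area_internal_standard - intercept) / slope

# A seizure extract giving peak areas of 15200 (analyte) and 20100 (internal standard).
print(round(concentration_mg_per_ml(15200, 20100), 2))  # about 1.4 mg/mL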

A Validated and Complete Automated SPE/LC/MS Method for the Analysis of Cocaine and Metabolites in Biological Fluids

Eshwar Jagerdeo, Martin Sibum, Madeline Montgomery, Marc A. LeBeau, Tania Sasaki, and John Crutchfield

Federal Bureau of Investigation (FBI), 2501 Investigation Parkway, 22135 Emmen, Germany

Worldwide, cocaine is the most widely abused recreational drug, and thus it is commonly encountered in many forensic and clinical toxicology cases. Cocaine (C) is extensively metabolized forming the metabolites benzoylecgonine (BZE), ecgonine methyl ester (EME), and ecgonine (E). Additionally, when cocaine is coingested with ethanol, cocaethylene (CE) is produced.

This automated method is unique in that it interfaces the Spark Holland Symbiosis to the Applied BioSystems 4000 Qtrap so that these two instruments function as a single unified system. A positive ion electrospray solid phase extraction-liquid chromatography-tandem mass spectrometry (SPE-LC-MS-MS) method was developed for the analysis of C, BZE, CE, EME, and E in urine and whole blood. The deuterated analogues of these analytes were used as internal standards.

In developing this method, ten different solid phase cartridges were evaluated to determine the optimal SPE material for the extraction of all analytes from urine and whole blood. The results demonstrated that the Hysphere MM anion cartridge extracted all five analytes, and it was used for the validation of the method in both urine and whole blood. The Gemini C6-Phenyl column was found to be optimal for all compounds studied. Likewise, several mobile phase combinations were evaluated; the optimal combination was a gradient from 100% of 0.1% formic acid in water to 90% of 0.1% formic acid in acetonitrile with a run time of 10 minutes. Two independent MS/MS transitions (precursor m/z to product m/z) were selected and monitored for each analyte. The validation demonstrated that C, BZE, and CE were linear within 4–1000 ng/mL, while EME and E were linear within 10–1000 ng/mL in urine. Likewise, C, BZE, CE, and EME were linear within 4–500 ng/mL, while E was linear within 10–500 ng/mL in blood. In this study, no known interferences were observed.

Identification and Determination of N,N- Diethyl-Meta-Toluamide and Its Metabolites in Human Samples by HPLC-MS/MS

Roberto Bravo, Jessica E. Norrgran, Paula A. Restrepo, Robert Walker, Larry L. Needham, and Dana B. Barr

CDC, 4770 Buford Hwy, Mailstop-F-17, Atlanta, GA 30341, USA

DEET is the active substance used in many insect repellents. It was developed by the US Army in 1946 for direct application to human skin to repel insects. It repels pests such as mosquitoes, sand flies, gnats, chiggers, ticks, deer flies, and fleas. In 1957, DEET was registered for use by the general public. An estimated one third of the US population uses products containing DEET each year.

We identified several major metabolites of DEET in human urine samples. Additionally, we have developed a sensitive, robust, and accurate analytical method for quantifying DEET metabolites in human urine. Our method uses an enzyme digestion and a simple solid phase extraction followed by a highly selective analysis with high-performance liquid chromatography high-resolution tandem mass spectrometry using stable isotope-labeled standards. The mass spectrometer was operated using atmospheric pressure chemical ionization in the multiple reaction monitoring mode.

The limits of detection (LOD) of our method were between 11 pg/mL and 228 pg/mL, and coefficients of variation were lower than 18%. The extraction efficiency of the SPE method ranged from 78 to 112% for the different analytes. Calibration plots exhibited linearity over the range from 0.25 to 250 ng/mL. The use of the BETASIL Phenyl-Hexyl and BETASIL C18 columns provided good chromatographic resolution, which increased the selectivity and sensitivity of the method. The use of stable isotope analogues as internal standards for most of these metabolites allows for the highest degree of accuracy and precision.

Optimizing Mobile Phase Solvent Purity for LCMS

Dannie Mak, Bryan Krastins, and Stephen Roemer

ThermoFisher Global Chemicals, One Reagent Lane, Fair Lawn, NJ 07410, USA

Increasing sensitivity in the field of liquid chromatography mass spectrometry (LCMS) is an ongoing pursuit. Indeed, specialized applications such as biomarker discovery or metabolomics studies involve complex biological matrices that require excellent sensitivity (femtomole range), enhanced mass accuracy, and superior resolution. Optimizing the quality of mobile phase solvents can contribute to improving the chromatographic and mass spectrometric properties of the analyte. ThermoFisher Global Chemicals has developed a new solvent grade that meets the stringent purity requirements of LCMS by addressing the need for minimal organic contamination, minimal metal mass adduct formation, and high ionization efficiency. Comparative chromatographic data for pure solvents and neat chemicals (TFA and FA) were collected using a single-quadrupole mass spectrometer and a ThermoFisher LTQ-FT hybrid linear ion trap MS system. Our results demonstrate that Optima LC/MS acetonitrile (A955), water (W6), and methanol (A456) exhibit low background noise in the total ion chromatogram in both positive and negative modes, exceptionally low levels of metal ions that can form MS adducts and interfere with MS interpretation, and very low LC-UV background using a diode array detector.

Fast and Robust Microflow HPLC Method for Open-Access Product Analysis in Discovery Medicinal Chemistry

Hung-Yuan Cheng

Eksigent Technologies, 5875 Arnold Road, Dublin, CA 94568, USA

The advent of high-throughput synthesis and purification has created a need for more efficient ways to do reaction scouting, route development, and product profiling. Microflow HPLC, defined as using chromatographic columns in the 0.3–0.5 mm i.d. range and flow rates in the 2–60 μL/min range, can significantly shorten HPLC analysis time owing to its characteristically rapid gradient mixing and fast column re-equilibration. Additionally, the substantial reduction in solvent usage and waste generation can contribute to the increasingly important “green chemistry” effort. Using a novel microflow HPLC system, we have developed a robust, short-cycle-time (<2 minutes) rapid gradient method for the analysis of crude synthesis products in discovery chemistry labs. Chromatographic columns of different phases, particle sizes, and dimensions are evaluated on a set of compounds covering a wide range of lipophilicity. The capability of the microflow HPLC method to deal with real-world “dirty” samples is critically investigated. The results compare favorably to those obtained using a conventional 4.6 mm column rapid gradient HPLC system. The benefits and issues of using microflow HPLC in an open-access medicinal chemistry environment will be discussed.

Advanced Optimization of a Microflow HPLC System

Phillip DeLand, Douglas Cyr, David Neyer, Phillip Paul, and Jason Rehm

Eksigent Technologies, 5875 Arnold Road, Dublin, CA 94568, USA

Current trends in HPLC system development have embraced higher pressures, higher temperatures, and lower flow rates to increase analytical throughput and separation efficiency. By combining recent advances in microscale fluid delivery, small-particle (<3 μm) stationary phases, high-temperature separations, and chip-based UV absorbance detection with a fully integrated microflow gradient HPLC system, the overall performance has been optimized for short cycle times, high resolution, and good detection sensitivity.

It is widely accepted that microflow HPLC offers a number of advantages over conventional analytical HPLC. At low flow rates (<50 μL/min), the amount of time required to thoroughly mix mobile phases and re-equilibrate the column is dramatically reduced. This in turn results in a reduction in injection-to-injection cycle times.

While sub-2 μm particle stationary phases offer high separation efficiency (>150,000 N/m), column pressure becomes a limiting factor at ambient temperature. To achieve high efficiency without having to design system components to withstand ultrahigh pressures, we use sub-3 μm particle columns at moderately high (>60°C) temperatures. In this way, separation efficiency is improved with only a modest increase in system pressure.
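The trade-off described above can be seen from the textbook packed-bed pressure-drop relation (a general approximation, not a result specific to this system):

\[
\Delta P \;\approx\; \frac{\phi\,\eta(T)\,u\,L}{d_p^{2}},
\]

where \(\Delta P\) is the column back-pressure, \(\phi\) a flow-resistance factor for the packed bed, \(\eta(T)\) the temperature-dependent mobile-phase viscosity, \(u\) the linear velocity, \(L\) the column length, and \(d_p\) the particle diameter. Because \(\eta\) falls sharply with temperature, operating sub-3 μm particles above 60°C keeps the back-pressure moderate while the faster mass transfer at elevated temperature helps recover the efficiency that smaller particles would otherwise provide.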

One critical aspect of system performance is the optical detection system. Clearly, reducing the flow cell's dimensions preserves chromatographic resolution. However, this change in the flow cell design is unavoidably accompanied by a reduction in light transmission which increases short-term noise. This reduction in signal-to-noise ratio can compromise the system's performance. To overcome this limitation, optical fibers and chip-based technology have been employed.

Results of our tests using a diverse set of pharmaceutical applications relevant to both drug discovery and drug development indicate that the system design optimization enables separation efficiencies approaching those of UHPLC systems without its requisite ultrahigh pressures. A selection of these results will be presented to illustrate overall system performance.

High-Throughput Concurrent Determination of Compound Solubility and Purity Employing a Low-Volume Parallel Liquid Chromatography System with MS Capability

Sergio Guazzotti and Shiela Bilbao

Nanostream, 580 Sierra Madre Villa, Pasadena, CA 91107, USA

Large numbers of compounds are generated by synthesis strategies based on combinatorial chemistry, with a consequent increase in the size of compound libraries. This has resulted in a critical demand for high-throughput analytical techniques, both qualitative and quantitative, that can aid in the management and characterization of these libraries. For example, compound purity and aqueous solubility constitute critical properties that need to be determined during drug lead optimization. Conventional HPLC systems can be employed for these determinations, but they suffer from their intrinsically low throughput when used in a serial mode. Micro parallel liquid chromatography systems allow for high-throughput chemical analysis with increased sample capacity and data analysis capabilities while reducing sample consumption, solvent usage, and waste generation. In this presentation, we will describe the use of a micro parallel liquid chromatography system that allows for the concurrent determination of compound purity and solubility for up to 23 samples simultaneously. The system employed in these determinations is equipped with 24 parallel columns for liquid chromatography, each with its own sample introduction port and exit port for connection to the detector(s) of choice. For the determinations presented here, UV detectors were employed for all columns, with the 24th column also being connected to a mass spectrometer for mass confirmation. The micro parallel liquid chromatography system employed in these determinations significantly reduced the overall time required for the individual determinations, with the added advantages of reduced sample consumption, solvent usage, and waste generation. The results discussed in this presentation demonstrate the advantages of employing a micro parallel liquid chromatography system with MS capability for high-throughput chemical analysis.

Some Considerations for Achieving High-Quality, Ultrahigh-Resolution Analysis Using an Ultrafast Liquid Chromatograph

Masami Tomita, Yoshihiro Hayakawa, Yoshiaki Aso,  Yoshiaki Maeda, Kenichi Yasunaga, and Yusuke Osaka

Shimadzu Corporation, 1 Kuwabaracho Nishinokyo, Nakagyo, Kyoto 6048511, Japan

It is well known that higher-resolution, higher-speed analysis can be achieved by using longer columns packed with smaller-particle media while maintaining a high flow rate. However, even with optimized particle size, such analysis imposes higher back-pressure on the system, and this higher pressure can compromise the quality of the analysis.

We studied these side effects of higher pressure and optimized the system to maintain data quality. Design considerations for the system are reported, along with data demonstrating its performance.

The Evaluation of Multipesticide Screening Methods by GC/MS

Sky Countryman and Kory Kelly

Phenomenex, 411 Madrid Avenue, Torrance, CA 90501, USA

Pesticides are widely used by farmers to control pests, weeds, and molds that would otherwise decrease crop production. While this has significantly increased worldwide food production, pesticides pose significant health and environmental risks. The restrictions on specific pesticides differ from one country to the next.

Since many different types of pesticides can be used on the same food product, multiresidue screening approaches are used to look simultaneously for multiple classes of pesticide compounds. Considering that there are more than 500 registered pesticides, no single analysis technique is capable of screening for all possible contaminants. However, gas chromatography (GC) is still the most commonly used method for the majority of the pesticide classes. While analyte-specific detectors such as ECD or NPD may be used for screening, mass spectrometric (MS) detection must be employed to provide positive confirmation.

The current work demonstrates the use of two new and unique phases that have been optimized for the analysis of all classes of pesticides. The phase chemistry improves separation and peak shape for the more polar pesticide compounds when compared to standard 5% phenyl columns. Selectivity data are compared between a standard 5ms-type phase and the two new columns.

Multipesticide residue screening is evaluated for over 250 different pesticides commonly analyzed in fruits and vegetables. The unique selectivity offered by the two phases improves resolution of multicomponent analytes, providing a distinctive elution pattern that can be used to identify closely eluting analytes.

Heart-Cutting 2D GC-MS with Microfluidic Deans Switch and Low Thermal Mass GC: Efficient Second-Dimensional Separation Using Independent Temperature Control

Nobuo Ochiai, Kikuo Sasamoto, and Hirooki Kanda

GERSTEL K.K., 2-13-18 Nakane, Meguro-Ku 152-0031, Japan

Real-world samples in various fields (e.g., petrochemical, food and flavor, and environmental) are often very complex. Resolution of all individual compounds by means of GC separation can be challenging. Heart-cutting two-dimensional GC (GC-GC) on dissimilar phases can significantly improve the resolution of complex samples. Modern GC-GC systems usually employ a deactivated miniature switching device based on the Deans principle, electronic pneumatic control (EPC), a cryogenic focusing device, and two GC ovens. However, the cryogenic focusing device and second oven add considerable cost (instrumentation, cryogen, space, and power consumption).

Recently, low thermal mass (LTM) GC based on the direct resistive heating principle has been developed. The small mass of the LTM-GC provides fast temperature programming rates combined with rapid cool-down for the shortest possible analytical cycle times. Power consumption is also only about 1% of that of a conventional GC. An LTM-GC can be directly integrated with conventional GC instruments to allow full use of conventional injectors, detectors, sampling systems, and software.

In this study, we used an LTM-GC for second-dimensional separation of GC-GC with independent temperature control. This configuration allows efficient second-dimensional separation without cryogenic focusing and an additional GC oven. The potential of the system was illustrated by examples of GC-GC-MS and sequential GC-GC-MS of complex samples, for example, diesel oil, essential oil, and food extracts.

Online and On-Site Analysis of VOC with a Thermal Desorber/Fast Gas Chromatograph/Mass Spectrometer Detector Coupling

Ronan Cozic, Ludovic Fine, Jean-Louis Gass,  Karim Medimagh, and Robert Merciari

SRA Instruments, 150 Rue Des Sources, 69280 Villeurbanne, France

Environmental air samples typically comprise a complex mixture of low-concentration compounds requiring enrichment procedures and high-resolution chromatography for accurate analysis. The wide volatility and polarity range of organic compounds present in the atmosphere requires adequate chromatographic resolution or selective detection.

A thermal desorber/fast gas chromatograph/mass spectrometer coupling has been developed as a transportable analyzer to perform sensitive online analysis in the field.

The objective with Fast-GC is to reduce analysis time without loss of separation efficiency. Provided that specific fragment ions are available for each coeluting compound, a mass spectrometric detector can cope with the poorer column separation of Fast-GC. The thermal desorber is a system for online speciated measurement of multiple trace-level volatile organic compounds in air. It combines automated, controlled-flow sampling with cryogen-free concentration technology. The thermal desorber connects to a Fast-GC/MSD and is designed for unattended operation in remote field locations.
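As a simple illustration of how characteristic fragment ions allow the MSD to compensate for reduced Fast-GC resolution, the sketch below extracts and integrates one ion chromatogram per coeluting compound from a full-scan data matrix (the m/z values, data layout, and compound names are hypothetical and are not taken from this work).

```python
import numpy as np

# Hypothetical full-scan data: intensities[scan, mz_index] on a common m/z axis.
mz_axis = np.arange(35, 301)                      # assumed scan range, m/z 35-300
intensities = np.random.rand(1200, mz_axis.size)  # placeholder for real scan data

# Assumed characteristic (non-shared) fragment ions for two coeluting compounds.
characteristic_ions = {"benzene": 78, "toluene": 91}

def integrate_eic(target_mz: int, first_scan: int, last_scan: int) -> float:
    """Integrate an extracted ion chromatogram over a retention-time window."""
    col = int(np.argmin(np.abs(mz_axis - target_mz)))
    return float(np.trapz(intensities[first_scan:last_scan, col]))

# Per-compound areas remain usable for quantification even if the total ion
# chromatogram shows a single unresolved peak in this window.
areas = {name: integrate_eic(mz, 400, 450) for name, mz in characteristic_ions.items()}
print(areas)
```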

Screening for Environmental Contaminants in Complex Matrices: Tobacco

Donald C. Hilton and Mark Libardoni

Leco Corporation, 14950 Technology Ct., Saint Joseph, FL 33908, USA

Contamination of materials intended for consumer consumption may be difficult to determine in complex matrices. Tobacco in the finished product is a particularly challenging example, as it contains plant metabolites from the tobacco leaf as well as degradation products of those metabolites resulting from aging and processing of the tobacco. While targeted analytical techniques can reliably identify many compounds, the reliability depends on sufficient separation of components in the mixture for the particular analytical technique, and the analyses are limited to the target list of compounds. With adequate separation and the capability to rapidly evaluate the separated compounds, screening is possible, but automation is required for complex matrices such as tobacco. The application of GC × GC-TOFMS and automated spectral searching for spectral features associated with contamination, combined with target analysis, takes advantage of the high peak capacity afforded by GC × GC techniques. In addition, it makes use of full-scan spectra from components at low concentration and automated searching of the peak table for suspect contaminants. This work shows the application of this technique to tobacco.

High-Throughput Analysis Using GC-TOFMS with Interchangeable EI and CI Sources

Megan McGuigan

LECO Corporation, 3000 Lakeview Avenue, Saint Joseph, MI 49085, USA

Gas chromatography (GC) is a commonly used method for the study of volatile and semivolatile compounds in complex mixtures. GC-TOFMS using an electron ionization (EI) source produces spectra with highly reproducible fragmentation patterns that can be automatically compared against standard libraries for the identification of sample components. Through the use of chemical ionization (CI), a relatively soft ionization technique, the molecular ion is preserved with minor fragmentation, making CI complementary to EI and allowing simple determination of the analyte molecular weight. The LECO GC-TruTOF-HT MS system features easily interchangeable ion sources, allowing analysis of a sample in either EI or CI mode. The ion source is isolated from the TOF mass analyzer, which allows for a fast turnaround when the ion source is changed.

In this presentation, we will give a brief introduction to the TruTOF HT system, including its performance characteristics. The power of this system will be demonstrated by showing a complete characterization of a series of gasoline-range hydrocarbons using GC-TOFMS in both EI and CI modes. Structural characteristics revealed by the EI spectra, combined with molecular ion measurements seen in the CI spectra, aid in the overall sample component identification. The automated peak-find and deconvolution features of the ChromaTOF software in both EI and CI modes, together with the use of high-speed GC techniques, further increase sample throughput. Sample analysis in both EI and CI modes is shown to be complete in a short time.

Kinetic Calibration Using Dominant Pre-Equilibrium Desorption for On-Site and In Vivo Sampling by Solid Phase Microextraction

Simon Ningsun Zhou, Wennan Zhao, and Janusz Pawliszyn

University of Waterloo, 200 University Avenue West, Waterloo, ON, Canada N2L 3G1

A new kinetic calibration was developed using dominant pre-equilibrium desorption for solid phase microextraction (SPME). The calibration is based on isotropism between absorption and desorption, which was proved theoretically and experimentally in an aqueous solution and a semisolid matrix. It therefore allows for the calibration of absorption using desorption to compensate for matrix effects. Moreover, concentration profiles are proposed for the first time to verify the isotropism between absorption and desorption, and they also provide a linear approach to obtaining time constants for the purpose of quantitative analysis. This linear approach is more convenient, robust, and accurate than the nonlinear approach based on the previously used time profiles. Furthermore, the method employs the target analytes as the internal standards, so radioactive or deuterated internal standards are not necessary, in contrast to the previously reported technique. In addition, dominant pre-equilibrium desorption utilizes the pre-equilibrium approach and offers a shorter sample preparation time, which is particularly suitable for in vivo sampling. This kinetic calibration method was successfully applied to the sample preparation of polycyclic aromatic hydrocarbons (PAHs) in a flow-through system and to in vivo pesticide sampling in a jade plant (Crassula ovata).
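For background, one commonly cited form of the isotropism relation underlying SPME kinetic calibration is shown below; it is included as a reader aid and is not necessarily the exact formulation developed in this work.

\[
\frac{n}{n_e} + \frac{Q}{q_0} = 1
\qquad\Longrightarrow\qquad
C_s = \frac{n\,q_0}{K_{fs}\,V_f\,(q_0 - Q)},
\]

where \(n\) is the amount of analyte absorbed by the fiber at time \(t\), \(n_e\) the equilibrium amount, \(q_0\) the amount of standard preloaded on the fiber, \(Q\) the amount of that standard remaining after the same exposure time, \(K_{fs}\) the fiber-sample distribution constant, \(V_f\) the fiber coating volume, and \(C_s\) the analyte concentration in the sample.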

Automatic Sampling System for Online Analysis of Chemical Pollutants in Water

Sam Li

Department of Chemistry, National University of Singapore, 3 Science Drive 3, Singapore 117543

The potential health hazards that arise from chemical pollutants in water have been an increasing concern to the water industry and to environmental research. Online analysis of pollutants, most frequently organic herbicides and trace metals, is a pathway to real-time monitoring of water quality. Because of the low concentrations of these pollutants in water, sample preparation demands the development of effective sampling tools. We have designed and fabricated an automatic water sampling system for the concentration of chemical pollutants in water using the solid phase extraction (SPE) technique. The setup and operation procedures are described and applications are discussed. The described water sampling system has various applications in the online preconcentration of chemical pollutants prior to separation and detection with chromatographic and spectroscopic analyzers, for example, CE, HPLC, and LC-MS, as well as biosensors and imaging techniques. By selecting an appropriate SPE cartridge, the system can be applied to organic herbicides and pesticides, trace heavy metal ions, and biochemical toxins in tap water, industrial water, and natural waters.

Automated Deconvolution of Composite Mass Spectra Obtained with an Open-Air Ionization Source Based on Exact Masses and Relative Isotopic Abundances

Andrew H. Grange and Wayne Sovocool

U.S. Environmental Protection Agency, P.O. Box 93478, Las Vegas, NV 89193, USA

This paper is being submitted for the symposium “Achievements and Challenges in Mass Spectrometry.” Chemicals dispersed by accidental, deliberate, or weather-related events must be rapidly identified to assess health risks. Mass spectra from high levels of analytes obtained using rapid, open-air ionization by a direct analysis in real-time (DART) ion source often contain precursor, oxide, adduct, and/or dimeric ions. Ion compositions must be determined and ion correlations must be made to understand composite mass spectra and enhance confidence in tentative identifications. These tasks are performed rapidly by an in-house ion correlation program (ICP) that considers exact masses and relative isotopic abundances measured by a JEOL AccuTOF time-of-flight mass spectrometer. ASCII files provided by the data system for spectra acquired at three CID voltages are imported into the ICP. Possible precursor ions and related oxidized ions, ammonium adducts, protonated dimers, and ammoniated dimers are found in the lowest CID voltage mass spectrum. At the intermediate CID voltage, dimeric ions are fragmented, while possible precursor ions remain and are confirmed as such. Product ions are gleaned from the highest CID voltage spectrum. Starting with the highest-mass precursor ion, all lower-mass ions that are not precursor ions are checked to see if they are precursor ion subunits. When multiple compounds are present with different collections of heteroatoms, some product ions will and some will not correlate with each precursor ion. Examples of mass spectral deconvolution will be demonstrated for data acquired with the DART/TOFMS.
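The precursor/product correlation step described above can be illustrated with a greatly simplified sketch (hypothetical compositions; the in-house ICP additionally compares exact masses and relative isotopic abundances): a lower-mass ion is accepted as a subunit of a precursor only if its elemental composition fits within that of the precursor.

```python
# Greatly simplified illustration of the "subunit" test described above.
# Ion compositions are represented as element-count dictionaries.
def is_subunit(product: dict, precursor: dict) -> bool:
    """True if every element count in the product ion fits within the precursor ion."""
    return all(precursor.get(element, 0) >= count for element, count in product.items())

precursor = {"C": 9, "H": 14, "N": 1, "O": 1}     # hypothetical protonated molecule
candidates = {
    "C8H12N+ (loss of CH2O)": {"C": 8, "H": 12, "N": 1},  # fits -> correlates
    "C5H9O3+":                {"C": 5, "H": 9, "O": 3},   # needs 3 O atoms -> rejected
}
for name, composition in candidates.items():
    print(name, is_subunit(composition, precursor))
```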

Notice
Although this work was reviewed by EPA and approved for presentation, it may not necessarily reflect official agency policy. Mention of trade names or commercial products does not constitute endorsement or recommendation for use.

Automated Extraction Procedure for Improved Recovery of Phenols and Phenoxy Herbicides

Bruce Richter, Richard Carlson, Eric Francis,  Sheldon Henderson, Brett Murphy, Brett Dorich, and Jennifer Peterson

Dionex, SLCTC, 1515 W. 2200 S., Suite A, Salt Lake City, UT 84119, USA

Accelerated solvent extraction (ASE) is a rapid sample preparation technique that uses common organic solvents to extract solid or semisolid samples. In ASE, pressurized solvents are heated at or above their boiling points. The use of hot pressurized solvents confers many favorable extraction properties compared to traditional extraction techniques such as Soxhlet or sonication. For example, as temperature increases, the solution viscosity is reduced, resulting in less resistance to mass transfer as analytes diffuse between solid and liquid phases. It is well known that diffusion coefficients and analyte solubility increase with temperature. The combined effect of reduced solution viscosity, higher analyte solubility, and increased diffusion accelerates the extraction process, resulting in rapid and efficient sample preparation. ASE has been applied to many different analytes and numerous matrices. In general, ASE methods are complete in 15–25 minutes and consume 20–40 mL of solvent per extraction. ASE is fully automated and can facilitate inline cleanup of some samples using resins and sorbents to retain coextractables. ASE can be used for environmental applications such as the extraction of pesticides, PAHs, PCBs, TPH, dioxins, phenols, and phenolic herbicides from environmental matrices.

Due to the polar nature of phenols and phenoxy herbicides, the extraction and the commensurate recovery of these compounds for analytical determination can be challenging. Often, acidic pretreatment of samples is required for efficient extraction of these compounds. A discussion of pretreatment techniques prior to ASE to improve the recoveries of phenols and phenoxy herbicides will be presented.

Optimization of Headspace SPME Conditions for Quantitative Analysis of Volatile Fatty Acids in Water Matrixes

Aitor Aizpuru

Universidad del Mar, Campus Puerto Angel, Puerto Angel Pochutla, Oaxaca 70902, Mexico

Volatile fatty acids (VFAs) represent one of the major components associated with unpleasant odors emanating from liquid sources such as wastewater treatment plants. As the odor can be detected even at very low concentrations in the source, a preconcentration step is required in order to achieve trace-level quantitative analysis of VFAs. The present work focuses on solid phase microextraction (SPME), a relatively recent preconcentration technique, and on the effect of several parameters on extraction efficiency (fiber type, exposure time, analyte concentration, salt content, pH, and temperature). Six short-chain VFAs were selected as model compounds (acetic, propionic, butyric, isobutyric, valeric, and isovaleric acids). After headspace extraction from different water matrixes, analysis of the VFAs was carried out by gas chromatography with a flame ionization detector. The preconcentration conditions were optimized for each individual compound. Moreover, different mixtures of the six VFAs were analyzed in order to establish the concentration conditions under which satisfactory linearity can be guaranteed without the competition phenomena known to occur during the SPME step, particularly for adsorptive fibers.

Use of Automated Solid Phase Extraction to Achieve EPA Compliance

Naomi Reid, Robert Johnson, and Tom Hall

Horizon Technology, Inc., 45 Northwestern Drive, Salem, NH 03079, USA

The performance-based requirements of the method make it a demanding analytical procedure and not a simple replacement for the Freon-based procedure. There are many quality control requirements that must be met in order to satisfy the method parameters. A fully automated alternative to expensive, labor-intensive liquid-liquid extraction methods for extracting oil and grease from aqueous samples is available.

One simple way to achieve compliance quickly is to use SPE and completely automate the process. Solutions are available that can handle up to three samples simultaneously; the water sample is kept in the original sample container and is loaded directly onto the extractor unit. The operator initiates the run, and approximately 20 minutes later (depending on the size of the SPE disk used and the cleanliness of the sample) the sample has been automatically processed and the extract is ready for the next step.

Johan Nortje

Milestone, Inc., 25 Control Dr., Shelton, CT 06484, USA

The US Environmental Protection Agency (EPA) is placing increasing pressure on coal-fired power plants to reduce mercury emissions by 70% by the year 2018. EPA Method 30B describes techniques for sample preparation and analysis of mercury as a reference method for Hg CEMs and sorbent trap monitoring systems used in coal-fired boilers. Traditional techniques such as wet chemistry digestion/reduction followed by spectroscopic analysis are time-consuming, costly, and operator-dependent. The alternative technique of thermal decomposition, amalgamation, and atomic absorption spectroscopy requires no acid digestion and delivers results in approximately 5 minutes. Direct analysis of mercury using this technique has also been specified in ASTM Method D6722-01 for the analysis of mercury (Hg) in coal and coal combustion residues. Current and proposed methodologies will be discussed, with sorbent trap and other coal-related sample data presented from analyses using the Milestone DMA-80 direct mercury analyzer.

Comprehensive Analysis of Drugs of Abuse in Urine with Automated Disposable Pipette Extraction

Sparkle T. Ellison, William E. Brewer, and Stephen L. Morgan

University of South Carolina, 631 Sumter St., Columbia, SC 29208, USA

The analysis of basic, acidic, and neutral drugs of abuse in urine is accomplished from a single small-volume sample using a mixed-mode disposable pipette extraction (DPX) product. The extraction of acidic and neutral drugs is accomplished using reversed-phase and hydrophobic mechanisms, and the extraction of basic drugs is performed using cation exchange. The recoveries of several basic drugs, such as tricyclic antidepressants, PCP, opiates, and meperidine, are between 80% and 100%. The recoveries of the acidic and neutral drugs, such as barbiturates, glutethimide, and COOH-THC, are greater than 80%. By combining the two modes, comprehensive screening is readily accomplished from single small-volume samples, which is relevant to toxicological analyses of saliva and sweat.

The main advantages of DPX technology are that the extractions are very rapid; negligible solvent waste is generated, and the extractions can be fully automated and coupled to chromatographic injections. Using a Gerstel MPS-2 instrument, the DPX extractions can be performed offline or “inline” with a GC or HPLC instrument. This study includes both applications of automated DPX extractions. It is found that the extractions can be performed during the time required for chromatographic analysis, and therefore throughput can be optimized by performing the analyses “inline” with chromatographic instrumentation. In this study, the comprehensive analysis of just 0.2 mL of urine samples is performed “inline” with GC/MS (Agilent Technologies 6890 with 5973 MSD) with runtimes of less than 10 minutes.

Laboratory Information Management Systems: The Secret Weapon to Regulatory Compliance

Ken Ochi and Christine Paszko

Accelerated Technology Laboratories, Inc., 496 Holly Grove School Road, West End, NC 27376, USA

Laboratories in a variety of industries are under increasing pressure to ensure compliance with an ever-growing number of national and international regulations. Some of the better-known compliance standards include NELAC, HACCP, EPA, USDA, FDA, 21 CFR Part 11, ISO, Sarbanes-Oxley, and GLP. To help manage the challenges of maintaining compliance, organizations are turning to laboratory information management systems (LIMS) as their “secret weapon.”

Understanding the regulations and putting in place a proven process to maintain compliance are critical because the implications of becoming noncompliant can be very costly. This is especially true for laboratories in highly regulated industries such as food and beverage, pharmaceutical, and environmental (water quality). Many organizations have found that a well-designed LIMS can actually provide the foundation for a successful regulatory compliance process.

This paper will focus on the many benefits that a laboratory information management system (LIMS) can provide to specifically help an organization maintain regulatory compliance. These include automating many laboratory activities to eliminate potential data transcription errors. The LIMS can also provide audit tracking capabilities that identify the source of noncompliance, while providing real-time alerts to warn the organization when results exceed prespecified limits. In this way, the LIMS provides a valuable monitoring service and helps to ensure regulatory compliance.

Enhancing Your Laboratory Information Management Systems with a Mobile Device

Kim Chew

Accelerated Technology Laboratories, Inc., 496 Holly Grove School Road, West End, NC 27376, USA

As a laboratory grows, becoming more efficient and productive is an integral part of its success and continued growth. Most laboratories have already taken the first steps in automating their laboratory processes by implementing a laboratory information management system (LIMS). Some laboratories have taken further steps to enhance their LIMS by including functions such as instrument integration, bar code solutions, access to data via the Internet, and automatic emailing and faxing of reports.

The introduction of a mobile device is ideal for industries requiring field data collection. The ability to prelog samples and access sample and client information is no longer confined to the laboratory alone. A mobile device provides laboratory personnel access to this information at their fingertips. This contributes to laboratory productivity by providing paperless processes and eliminating transcription errors.

This presentation will review how using a mobile device can further increase productivity and enhance laboratory processes.

Automated Analysis of Carbonyl for the Oleochemical Industry

Lindsay Pratt and Robert Menegotto

Man-Tech Associates, Inc., 2 Admiral Place, Guelph, ON, Canada N1G 4N4

In organic chemistry, carbonyl groups are functional groups consisting of a carbon atom double-bonded to an oxygen atom. Carbonyl groups found within aldehyde and ketone compounds are known by the oleochemical industry to pose many problems during the processing of alcohols, most notably because of their potential to contribute unwanted color. Alcohols can create aldehydes in the presence of heat, oxygen, and a catalyst, which can all be present during the alcohol distillation process. Numerous companies in the industry have methods to try to control the amount of aldehydes being produced, but the carbonyl content of the alcohol is still tested to ensure low levels.

The goal of this automated method was to meet the strict timing criteria for reagent addition, heating, and cooling, while at the same time maintaining good accuracy and precision. This method also helps to free up operator time as the automated system controls all sample preparation and analysis without user intervention.

The method is performed by adding an excess of 2,4-dinitrophenylhydrazine (DNPH) to hot alcohol samples in an acidic medium. The carbonyl present in the sample reacts with the DNPH to form a hydrazone and water. An excess of base is then added to decompose any remaining DNPH and form a quinoidal ion. After the solution is cooled, the absorbance of the sample is measured in a spectrophotometer at 530 nm and compared to a standard curve.
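The final quantification step, reading unknowns from the 530 nm standard curve, can be sketched as follows (the standard concentrations and absorbances shown are hypothetical placeholders; the automated system presumably handles this internally).

```python
import numpy as np

# Hypothetical calibration standards: carbonyl concentration versus absorbance at 530 nm.
std_conc = np.array([0.0, 2.0, 4.0, 8.0, 16.0])        # e.g., mg/L carbonyl (placeholder values)
std_abs  = np.array([0.002, 0.101, 0.205, 0.398, 0.805])

slope, intercept = np.polyfit(std_conc, std_abs, 1)     # linear standard curve

def carbonyl_from_absorbance(a530: float) -> float:
    """Read a sample concentration back from its measured 530 nm absorbance."""
    return (a530 - intercept) / slope

print(round(carbonyl_from_absorbance(0.300), 2))        # unknown sample read from the curve
```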

This method conforms to ASTM E 411—standard test method for trace quantities of carbonyl compounds with 2,4-dinitrophenylhydrazine. Further detail on the advantages of the automated system and the method procedure, along with statistical data for various alcohol samples, will be discussed.

Off- and Online Measurement of Material Density Variations with Time-Domain Terahertz Instrumentation

Jeffrey White and David Zimdars

Picometrix LLC, 2925 Boardwalk, Ann Arbor, MI 48104, USA

The density variations within a material can affect the material's performance. Mapping the density variations within a sample can be difficult and time-consuming.

This presentation will demonstrate the ability of time-domain terahertz (TD-THz) instrumentation to make such measurements on a range of materials (e.g., foam, paper, and wood products). The method employs essentially single-cycle pulses (1 ps width) of EM radiation to probe the sample. The EM pulse reflects off interfaces with differing refractive indices. Measuring the time of flight from the first surface of a sample and from the substrate on which the sample is sitting allows the thickness of the material to be precisely measured. The time-of-flight delay of the pulse through the material has been shown to be highly correlated with the material's mass for a wide variety of materials. Together, these two measurements allow the precise determination of density.
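A simplified sketch of how the two time-of-flight measurements combine is given below. It assumes a reference scan of the bare substrate and a linear relation between the effective refractive index and density (a reasonable approximation for low-density materials such as foams); the proportionality constant k must be calibrated per material, and the actual Picometrix analysis may differ.

```python
C_MM_PER_PS = 0.2998  # speed of light in air, mm per picosecond

def thickness_mm(t_front_ps: float, t_substrate_reference_ps: float) -> float:
    """Thickness from the round-trip air-path delay between the sample's front-surface
    echo and the substrate echo position recorded with no sample present."""
    return C_MM_PER_PS * (t_substrate_reference_ps - t_front_ps) / 2.0

def relative_density(excess_delay_ps: float, thickness: float, k: float = 1.0) -> float:
    """Excess time of flight through the material per unit thickness; assuming the
    effective refractive index grows linearly with density, this ratio is proportional
    to density (k is a per-material calibration constant, not given in the abstract)."""
    return k * excess_delay_ps / thickness

L = thickness_mm(t_front_ps=10.0, t_substrate_reference_ps=30.0)  # hypothetical echo times
print(L, relative_density(excess_delay_ps=4.0, thickness=L))
```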

The THz pulse can be focused to relatively small spots (1 mm or less), and thus mapping the density variations within a material is straightforward.

TD-THz instrumentation can make noncontact measurements at large standoff distances (reflection sensor has working distances adjustable from less than 1 cm up to 100 cm). Measurements are made in real time, with typical waveform acquisition rates of 100 Hz. Higher-speed operation has been demonstrated. The sensor heads are connected to the instrument with fiber optic and electrical umbilicals. Thus, the heads can be moved to map an object or can be placed in remote location(s) for online measurements. Multiplexing of 8 sensor heads from the instrument has been demonstrated.

Monitoring Organic Reactions in Real Time by HPLC

Steve Hobbs and Justin Kittell

Eksigent Technologies, 5875 Arnold Road, Suite 300, Dublin, CA 94568, USA

Real-time monitoring of organic reactions and other pharmaceutical processes can provide very valuable information, including reaction kinetics and progress, mechanisms, impurity formation, reaction optimization, and dissolution rates. Despite the need, there are presently few attractive options for monitoring pharmaceutical processes. FTIR probes have been available for some time, but the technique's limited specificity restricts its application to a small fraction of processes. HPLC is a much more universal technique than FTIR; however, it traditionally requires time-consuming human intervention for sample withdrawal and dilution, a manual process that is also prone to human error.

We have developed a system that automates the sampling and dilution process and allows unattended monitoring of pharmaceutical processes by HPLC. The system includes a sampling and dilution module, an HPLC, and system software, all contained on a compact cart for portability. Real-world processes can be monitored with sample volumes of tens of microliters. The system is capable of sampling from reactions with a moderate amount of particulates and up to 100 psi of pressure.

The authors will describe real-time monitoring of reaction progress from organic reactors using an inline autosampler and dilution device coupled to an HPLC. Analytical figures of merit for the system will be discussed.

Automated Analysis of Saponification Value for the Oleochemical Industry

Lindsay Pratt and Robert Menegotto

Man-Tech Associates, Inc., 2 Admiral Place, Guelph, ON, Canada N1G 4N4

Saponification is a common test procedure in the oleochemical industry. Oils, usually in the form of triglycerides, are broken down into methyl esters, which are then hydrogenated into alcohols. These alcohols are then typically sulfated in order to be converted to surfactants. The presence of esters in the alcohol affects the sulfation process, and is therefore undesirable. Saponification value is a measure of the number of milligrams of potassium hydroxide required to saponify one gram of fat, which is a good indication of the amount of esters present in the sample.

This automated method was developed to help free up operator time and to improve the accuracy and precision of the manual method. In order to improve the reproducibility of results, a new covering/uncovering system was developed for use in this method, including the automatic pickup and drop-off of lids by using an electromagnet. The new automated method also helps to eliminate operator error, particularly during the delivery of the extremely critical sodium hydroxide solution. In addition, accuracy is improved by using a potentiometric endpoint rather than the subjective color change endpoint in the manual method.

The method involves the addition of 50 mL of ethanolic sodium hydroxide to blanks and fat samples and heating for one hour; the remaining base is then back-titrated with hydrochloric acid. The difference in titrant consumption between the blank and the samples is directly proportional to the amount of ester in the samples.
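For reference, the back-titration reduces to the standard saponification-value expression (the generic definition, not a formula specific to this instrument):

\[
\mathrm{SV}\;(\mathrm{mg\ KOH/g}) \;=\; \frac{56.1 \times N_{\mathrm{HCl}} \times \left(V_{\mathrm{blank}} - V_{\mathrm{sample}}\right)}{m_{\mathrm{sample}}},
\]

where 56.1 g/mol is the molar mass of KOH, \(N_{\mathrm{HCl}}\) the normality of the hydrochloric acid titrant, \(V_{\mathrm{blank}}\) and \(V_{\mathrm{sample}}\) the titrant volumes (mL) consumed by the blank and the sample, and \(m_{\mathrm{sample}}\) the sample mass in grams.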

This method was originally developed for the oleochemical industry, but it may also be used by the petrochemical industry to test the quality of engine oils following ASTM D94: standard test methods for saponification number of petroleum products. Further detail on the advantages of the automated system and the method procedure, along with statistical data for various fat samples, will be discussed.

Continuous Analysis of Aerosols Using a 3D Lab-on-Chip Device

Scott D. Noblitt, Susanne V. Hering, Jeffrey L. Collett, and Charles S. Henry

Colorado State University, 1872 Campus Delivery, Fort Collins, CO 80523, USA

Atmospheric aerosols can negatively affect both the environment and health, and therefore aerosol composition needs to be routinely monitored. Water-soluble inorganic ions are currently monitored using the particle-into-liquid sampler coupled to ion chromatography (PILS-IC). PILS-IC gives a seven-minute temporal resolution with detection limits near 0.1 micrograms per cubic meter of air for nitrate and sulfate. However, a less expensive, faster, and more portable system is desirable. Here, we describe the coupling of microchip capillary electrophoresis (MCE) to a water-based condensation particle counter (WCPC) for rapid and continuous monitoring of anions in aerosols. To achieve a working system, several obstacles were overcome. A working interface between the MCE and the WCPC was developed. A membrane-containing sample reservoir was included to filter insoluble aerosols and to inhibit sampling-induced hydrodynamic flow. A flushing system was designed to clean and replenish the sample reservoir, and the electrophoresis separation chemistry was optimized to operate continuously for extended times. In-field performance of the integrated system was tested with atmospheric aerosols. Inorganic anions can be analyzed in less than a minute with detection limits similar to those of the PILS-IC, but with improved portability and lower cost. Continuous monitoring of organic acids was not previously possible, but it is feasible with WCPC-MCE. Coupling microfluidic devices to aerosol sampling technology has proved successful for analyzing water-soluble anions and can be extended to other aerosol constituents such as cations and carbohydrates. The reduced cost and size relative to current technology will allow greater deployment of monitoring stations and may make portable analyzers feasible (see Figure 14).

Automated Derivatization and Analysis of Malondialdehyde from Small Volume Tissue Samples Using Column-Switching Technology and High-Performance Liquid Chromatography

Heather L. Lord and Jack Rosenfeld

Department of Pathology and Molecular Medicine, McMaster University, Hsc 3N26C, 1200 Main St. W., Hamilton, Canada L8N 3Z5

Derivatization of an analyte prior to analysis can stabilize the molecule, improve extraction efficiency and separation characteristics, and enhance detection sensitivity. All of these advantages are important for the analysis of malondialdehyde (MDA) as a biomarker of oxidative stress in biological samples. Conventionally, however, the derivatization process is cumbersome, resulting in a significant lengthening of total sample preparation time, reduced analytical precision, and limited options for automation.

We have addressed these challenges for the solid phase analytical derivatization of MDA from small volume tissue homogenate samples, employing the fluorescent derivatization reagent dansyl hydrazine and chromatographic separation. This was achieved by an automated column-switching technique where the online derivatization was conducted in a sample preparation cartridge packed with Amberlite XAD-2 resin, followed by elution of the product to an analytical LC column and fluorescence detection. The limit of detection from tissue homogenate samples was 0.02 μg/mL. The method was linear (r² > 0.999) with precision <5% relative standard deviation from the limit of quantification (0.06 μg/mL) to at least 35 μg/mL. The sample preparation cartridge was stable for several hundred analyses with a pressure limit of 700 psi.

The method was applied to the analysis of small-volume (30 μL) mouse liver tissue homogenate samples. Endogenous levels of MDA in the tissues ranged between 20 and 40 nmol/g tissue (ca. 0.1–0.2 μg/mL homogenate). As there was no suitable reference method for MDA analysis against which to validate the developed method, the data were compared to those for isoprostanes, which provide an additional marker of oxidative stress, with favorable results. Compared to conventional MDA analyses, the current method has advantages in automation, selectivity, precision, and sensitivity for analysis from small sample volumes.

Analysis of the 2,6-Di-Tertiary-Butyl Cresol Dielectric Fluid Inhibitor with a Portable IR Spectrophotometer

Ronald Hontert

S.D. Myers, Inc., 180 South Avenue, Tallmadge, OH 44278, USA

The electrical transformer is a necessary link in the distribution of power to consumers. The performance of a transformer depends on the condition of its dielectric fluid (usually mineral oil), which serves both as insulation and as a cooling medium. To minimize the detrimental effects of oxygen during operation, the oxidation inhibitor 2,6-di-tertiary-butyl-p-cresol (DBPC) is added to the mineral oil at about 0.3 weight percent (wt%). Over time, the inhibitor is consumed, and it becomes necessary to analyze the oil for the remaining DBPC and adjust the level to the optimum value of 0.3 wt%. This can be accomplished quantitatively with infrared spectroscopy. The absorbance of a solution of the DBPC inhibitor in mineral oil adheres to the Beer-Lambert law, which states that absorbance is a function of concentration. Deviations from linearity can be determined by obtaining absorbance values from known amounts of inhibitor in mineral oil and creating a calibration table. Dielectric fluid is typically sampled in the field by a service technician and delivered to a remote laboratory for analysis.
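In equation form, the quantification rests on the Beer-Lambert law; with the single-point DBPC standard used in the field procedure described below, the unknown inhibitor level follows by simple ratio, assuming linearity over the 0–0.3 wt% working range:

\[
A = \varepsilon\, b\, c
\qquad\Rightarrow\qquad
c_{\mathrm{sample}} \approx c_{\mathrm{std}} \times \frac{A_{\mathrm{sample}} - A_{\mathrm{blank}}}{A_{\mathrm{std}} - A_{\mathrm{blank}}},
\]

where \(A\) is absorbance, \(\varepsilon\) the absorptivity, \(b\) the path length, and \(c\) the DBPC concentration; the blank term removes the contribution of the uninhibited mineral oil.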

A study was undertaken to determine whether a portable infrared instrument could be used to analyze the dielectric fluid at the field location. Several transformer dielectric fluid customer samples (used mineral oil) containing DBPC inhibitor, previously analyzed with a laboratory benchtop FT-IR by ASTM Method D-2668, were selected for this study. These samples were run on portable infrared spectrophotometers with a mineral oil blank and a one-point DBPC standard. A comparison of the results of the portable IR analyzers to those of the benchtop FT-IR spectrophotometer revealed very good agreement, within 0.02–0.05 wt%, between the instruments. All the reagents used in the procedure are common and readily available to test personnel.

Maintaining High Precision in Rapid, High-Throughput Precious Metals Analysis by ICP

Martin J. Nash, Andrew Clavering, and Karen Harper

ThermoFisher Scientific, Inc., Solaar House, Cambridge Cb5 8 Bz, UK

Precious metals analysis is of prime importance to manufacturers for the control of quality at all stages, from raw material through work in progress to life-expired articles or scrap. Its many applications include chemical catalysts, residues, bullion, and plating solutions, in addition to analysis of the purity of high-purity precious metals and alloys.

Traditionally, ICP has been used to maintain QA, but the QC routines are often cumbersome and rely on time-consuming digestion and analysis procedures. The use of microwave digestion greatly reduces sample preparation time, while the use of cutting-edge technology and method development tools now allows rapid analysis of precious metals with high precision.

This paper will examine the effectiveness of the new models for this application, considering stability, sensitivity, and precision using ICP.

Determination of Total Gaseous Mercury in Ambient Air Using Amalgamation Coupled to Atomic Fluorescence Spectrometry

Warren T. Corns, Peter B. Stockwell, Richard Brown, and Andrew Brown

P S Analytical, Arthur House, Crayfields Industrial Park, Main Road, St. Pauls Cray, Orpington, Kent Br5 3Hp, UK

Measurements of mercury in ambient air are assuming greater importance, because of increasing health concerns and legislative requirements. The general public and the environment can be exposed to mercury originating from natural, domestic, or industrial processes. Coal-burning power plants are the largest anthropogenic source of mercury emissions to the air. Burning hazardous wastes, the chlor-alkali industry, crematoria, breaking mercury products, and spilling mercury, as well as the improper treatment and disposal of products or wastes containing mercury, can also release it into the environment.

In this paper, we describe automated online instrumentation based on amalgamation with atomic fluorescence spectrometry. A known volume of ambient air at a controlled flow rate is passed over a gold-impregnated silica trap. Total gaseous mercury is preconcentrated on the gold substrate by amalgamation. After the collection period, the Hg is thermally desorbed and subsequently delivered to an atomic fluorescence spectrometer specifically designed to detect Hg. The analytical performance of the system will be presented, along with data from rural, coastal, and urban industrial sites in several European countries.

Field Experience of a Mercury Continuous Emission Monitoring System

Matthew A. Dexter, Warren T. Corns, and Peter B. Stockwell

P S Analytical, Arthur House, Crayfields Industrial Park, Main Road, St. Pauls Cray, Orpington, Kent Br5 3Hp, UK

The Clean Air Mercury Rule provides a regulatory regime for the control of mercury emissions from coal-fired utilities in the USA. The rule requires the installation of continuous emission monitoring systems at the majority of such utilities and provides a detailed regime of tests to validate data from the emissions monitoring system. Elemental and oxidized mercury calibration gases traceable to national standards are required for the validation tests.

The P S Analytical continuous emission monitoring system has been used to monitor gas phase mercury concentrations in coal-fired utility stack gas. The instrument consists of a sampling probe, heated sample transfer line, sample conditioning system, analyzer, and calibration modules.

The sampling probe extracts sample from the stack, separates the gaseous sample from fly ash, and delivers the diluted sample to the sample conditioner via the heated sample line. The sample is conditioned to convert all mercury in the sample to elemental mercury and is delivered to the analyzer without the need for water injection. Mercury in the sample is determined by amalgamation coupled with atomic fluorescence in the Sir Galahad analyzer, providing a method detection limit of less than 0.01 μg/m³.

A calibration gas generator and a delivery manifold system are incorporated to deliver known-concentration calibration gases to the sampling system for system integrity validation tests.

The various components of the PSA Hg CEM will be described, and results of recent field experience of the system and compliance with the regulatory tests will be presented.

The Use of Thermal Decomposition, Gold Amalgamation, and Atomic Absorption Spectroscopy for the Determination of Mercury in a Variety of Environmental Matrices

David L. Pfeil and Bruce MacAllister

Teledyne Leeman Labs, 6 Wentworth Drive, Hudson, NH 03051, USA

Mercury has been recognized as an environmental risk for many years. Environmental organizations, governments, researchers, and concerned citizens throughout the world continue striving to understand the role of mercury and its impact upon ecosystems. Mercury is mobile, persistent, and bioaccumulative, rendering it a global concern that will not go away. As such, there is a need to measure mercury concentrations in many types of samples including air, water, soil, fish, fowl, mammals, foods, and fuels, to name just a few. Until recently, the sample preparation and instrument calibration for mercury determinations varied with the sample matrix, resulting in dedicated methodology for each type of sample analyzed. Using the thermal decomposition technique, the analyst can examine a variety of samples without modifying the methodology.

Recent changes to the thermal decomposition technique and environmental monitoring regulations will be described. Analytical performance using thermal decomposition will be presented and compared with earlier cold vapor atomic absorption methods.

New Generation Automated Raman Lab Tool for Measurement of Stress and Strain in Semiconductor Substrates and Devices

Emmanuel Leroy, Fran Adar, and Nobuyuki Naka

HORIBA Jobin Yvon, Inc., 3880 Park Avenue, Edison, NJ 08820, USA

Raman spectroscopy has long been used in semiconductor research, first for silicon studies and later for alloyed materials. It has proved useful for measuring strain on patterned Si wafers and for examining the stoichiometry of alloys. Raman spectroscopy has evolved from a complex analysis technique, reserved for research and development only, into a user-friendly tool available to industry. HORIBA Jobin Yvon presents how Raman spectroscopy has become a very powerful tool now available for inline monitoring of strain in strained-silicon blanket wafers as well as in devices on patterned wafers. It can also help monitor the influence of FEOL processes on the stress and ensure the quality of the realized devices, and it can be used to verify crystalline quality and to detect and review defects and dislocations. The most recent papers published on strained silicon research are reviewed to identify the measurement challenges posed by this new generation of engineered wafers and to assess the technical and physical limitations. Finally, requirements for a manufacturing tool are reviewed, and the latest technological developments are shown to bring manufacturers' expectations, including the measurement of submicron features, to reality.

An Investigation of Mercury in Human Hair by Cold Vapor Atomic Absorption Spectrometry and Correlations between Mercury Concentrations and Use of Hair Dyes

Mark T. Stauffer, Jennifer M. Uhler, Dean E. Nelson, and Barbara J. Barnhart

University of Pittsburgh, Greensburg, 150 Finoli Drive, Smith Hall B-3, Greensburg, PA 15601, USA

This presentation focuses on undergraduate research involving the quantitation of mercury in hair from female adults of child-bearing age (20–40 years), the correlation of the results with mercury contamination from hair dyes, and the possible effects of this source of mercury contamination on the reproductive capabilities of females in this age group. Mercury is currently of intense interest because of its high toxicity at sub-microgram-per-gram concentrations and its abundance in Earth's crust, oceans, and atmosphere due to natural and human sources. Human hair yields a more permanent record of heavy metal contamination than samples such as blood and urine. In this study, the authors focused on possible mercury contamination from hair dyes. Hair samples were collected from at least 20 anonymous subjects. Samples were digested using microwave techniques and analyzed for mercury by cold vapor atomic absorption spectrometry. Statistical analysis of the data was performed using the SPSS statistical software package. Experimental aspects, results obtained, and future directions for this project will be discussed.

Online Raman Monitoring of the Composition of Etchant Solution through Existing Teflon Tubing

Hoeil Chung, Jaejin Kim, Yongdan Kim, and Changyong Oh

Department of Chemistry, Hanyang University, Haengdang-Dong, Seongdong-Gu, Seoul 133-791, South Korea

A near infrared (NIR) spectral collection scheme employing direct transmission through Teflon tubing has recently been studied for the measurement of etchant solutions. Alternatively, Raman spectroscopy can be used for the same purpose, since it can provide more selective and diverse spectral information than NIR spectroscopy, especially for the components of etchant solutions such as H₂O₂ and inorganic acids. For Raman measurement, we have developed a novel collection scheme that uses the Teflon (PFA: perfluoroalkoxy fluorocarbon) tubing itself as an effective and synchronous external standard. The resulting spectrum is the sum of spectral features from the sample and the Teflon tubing, which allows the nonoverlapping Teflon bands to be used to correct variations in the Raman intensity. We also built a pilot-scale chemical wet station that can simulate a real cleaning bench and generate continuous flow, and Raman spectra were then collected directly from online samples. We used a Metal Aluminum Etchant (MAE) solution (composed of H₃PO₄, HNO₃, CH₃COOH, and water), one of the most frequently used etchants, for this study. PLS calibration models were built for each component, and their online prediction selectivities were evaluated by spiking individual components into a sample mixture. Additionally, NIR spectra were collected simultaneously with the Raman measurements for comparison. NIR discrimination between the H₃PO₄ and HNO₃ features was somewhat difficult, whereas the Raman features of these compounds were distinct. Overall, Raman spectroscopy would also be a simple and selective tool for monitoring diverse etchant solutions.
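A minimal sketch of the internal-standard correction described above is given below; the band window and data are hypothetical, and the authors' actual implementation and PLS modeling are considerably more elaborate.

```python
import numpy as np

# Hypothetical Raman shift axis (cm^-1) and a raw spectrum collected through the tubing.
shift = np.linspace(200.0, 1800.0, 1601)
raw_spectrum = np.random.rand(shift.size)   # placeholder for a real measured spectrum

def normalize_to_teflon_band(spectrum: np.ndarray, lo: float = 720.0, hi: float = 745.0) -> np.ndarray:
    """Scale the whole spectrum by the integrated area of an assumed nonoverlapping
    Teflon (PFA) band so that drifts in laser power and collection efficiency cancel."""
    window = (shift >= lo) & (shift <= hi)
    band_area = np.trapz(spectrum[window], shift[window])
    return spectrum / band_area

corrected = normalize_to_teflon_band(raw_spectrum)   # input to subsequent PLS calibration
```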

Handheld Analytical Instrumentation with Mobility Spectrometry for On-Site Rapid Measurements of Chemical Vapors of Industrial and Medical Importance

Gary A. Eiceman

Department of Chemistry and Biochemistry, New Mexico State University, 1175 North Horseshoe Drive, Las Cruces, NM 88003, USA

Modern mobility spectrometers have arisen as handheld, rugged instruments through interest in military preparedness and commercial aviation security. The principles of ion formation and characterization in air at ambient pressure are compatible with the attractive features of mobility-based instruments: portability, comparatively low cost of ownership and operation, low detection limits for certain important chemicals, and fast response. Twenty-five years after the introduction of the first generation of in-field ion mobility analyzers, this technology is increasingly being accepted in industrial and medical applications. A survey of these uses illustrates the strengths of this measurement method and the fundamental or technical barriers that limit expanded civilian uses. Often both are found in the same facet, such as the ionization step, where gas-phase ion chemistry at ambient pressure provides advantages in sensitivity and, on occasion, disadvantages from matrix interferences. The growth of small and fast gas chromatographs improves analytical integrity. Other prefractionation methods can aid the analytical reliability of the first step of the analytical response, ion formation. Ion characterization with relatively low-resolution devices is being refined through tandem analyzers based on mobility or differential mobility, with little increase in size, weight, or power consumption. Current trends and future directions in both technology and applications will be described.

Sampling Solutions for Field-Portable GC-MS: Issues and Technology in Past, Present, and Future

Charles S. Sadowski, Phil Smith, and Jon Onstot

Smiths Detection, 21 Commerce Drive, Danbury, CT 06810, USA

A driving force in moving GC-MS from the laboratory to the field was the requirement for faster analysis and turnaround times in environmental analysis. Initial instruments were ruggedized versions of laboratory GC-MS systems. Because trained environmental chemists were typically the instrument operators, sample collection/preparation and analysis methods were transferred from the lab to the field. A lesson learned from this experience is that “what works in the lab does not always work in the field.”

Researchers and manufacturers have succeeded in designing a new generation of truly portable GC-MS systems. Miniature ion traps under development provide smaller/lighter systems, with faster analysis times. After 9/11, GC-MS found widespread use in homeland security applications. Operators are now often “hazmat technicians,” emergency responders, and/or 18-year-old marines. These operators are being tasked to understand the capabilities and limitations associated with using a GC/MS for a wide range of samples. Simplified sampling procedures and flexibility in sampling methods are required.

Getting the “right sample” into the instrument is as important as the technology in the instrument. Solid-phase microextraction (SPME) offers a solution for sampling a wide range of chemicals without sample preparation. Micro concentrators can be used for volatiles in air, with headspace sampling for extraction of VOCs from soil or water. Modular designed sample inlet systems that support SPME and air sampling are part of the solution. Innovation is still required to further lessen the skill requirements placed on operators for sampling.

Development of a New Automated Selectivity Testing Equipment for Monolithic Columns

Hans D. Mueller

MERCK KGaA, Frankfurter Str. 250, Darmstadt, 64293 Hessen, Germany

Monolithic silica-based columns (Chromolith) are becoming a well-accepted alternative to particulate-based traditional HPLC columns because of their high sample throughput with low back-pressure and their ruggedness to sample contaminants. The columns are manufactured in a proprietary and well-controlled environment.

As with all chromatography columns, the sorbent has to be characterized in order to obtain reproducible results at the end-user's site. With traditional sorbents the characterization is made batchwise.

With monolithic columns, another approach is necessary because of the manufacturing process. All columns have to be tested individually.

As no commercial system was available to fulfill all needs, test equipment was developed to evaluate up to 8 columns with up to 6 different eluent systems for selectivity testing (see Figure 15). The paper discusses the basic manufacturing process for monolithic columns, the construction limitations of the test equipment, and the software development, including the calculation of chromatographic parameters for fully independent testing. Results will be presented for reproducibility studies as well as statistical data.

Improve HPLC Throughput with Automation and Real-Time Diagnostics

Steven Kannengieszer

Brooks Instrument, 407 West Vine Street, Hatfield, PA 19440, USA

Are you achieving high throughput? How many samples can your lab analyze per hour or per day? Today's drug discovery labs are charged with moving thousands of samples through HPLC and MS screening processes. If your lab is operating 24 × 7 to keep up with the throughput demand, every second counts. Not only must the runtime of a single analysis be shortened, but the total cycle time of the injection sequence and run must also be optimized to achieve high throughput. Solvent management automation and real-time diagnostics allow you to increase sample throughput and maximize system uptime while meeting the need to produce reliable results and identify leads. This automation and diagnostics can be achieved by installing an inline real-time flow measurement device in the solvent management system.

By incorporating an inline flow metering device in the solvent management system, it is possible to improve the accuracy of your system and at the same time diagnose the health of the HPLC pump. This will help you manage the maintenance schedule for the system and, over time, should allow you to increase the interval between maintenance cycles. You will also see in real time whether the pump is causing any flow pulsations. If pulsations are a problem, you can modify the system to minimize them through changes to the pump and/or the installation of a pulse dampener. In next-generation systems, it is possible to use the output from the flow measurement device in a flow control feedback loop with the pump to completely eliminate the pulsations. Some flow metering devices also provide additional benefits such as real-time fluid density, concentration, or microbubble/two-phase flow indication.
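As a rough illustration of how an inline flow signal can reveal pump pulsation (this is a generic sketch, not the vendor's diagnostics), the following snippet looks for a dominant frequency in a simulated flow record; the sampling rate, nominal flow, and pulsation amplitude are invented values.

```python
import numpy as np

fs_hz, nominal_ml_min = 20.0, 1.0
t = np.arange(0, 60, 1 / fs_hz)                    # one minute of flow readings
rng = np.random.default_rng(2)
flow = (nominal_ml_min
        + 0.02 * np.sin(2 * np.pi * 1.5 * t)       # simulated 1.5 Hz pump pulsation
        + 0.002 * rng.normal(size=t.size))         # measurement noise

amplitude = np.abs(np.fft.rfft(flow - flow.mean()))
freqs = np.fft.rfftfreq(flow.size, d=1 / fs_hz)
peak_hz = freqs[amplitude.argmax()]
ripple_pct = 100 * (flow.max() - flow.min()) / flow.mean()
print(f"dominant pulsation ~{peak_hz:.2f} Hz, peak-to-peak ripple {ripple_pct:.1f}%")
```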

If your goal is reducing maintenance or increasing throughput, uptime, or quality, the installation of an inline flow measurement device into your system will help you achieve that goal.

Automated Wavelength Selection for Near Infrared Spectroscopy Based on Particle Swarm Optimization

Yuping Wu and Gary Small

Department of Chemistry, University of Iowa, Iowa City, IA 52242, USA

Wavelength selection has always been an important part of quantitative near infrared (NIR) analysis because of the overlapping nature of NIR spectra and the resulting need for multivariate calibration models. Even when full-spectrum factor-based techniques such as principal component analysis (PCA) or partial least-squares (PLS) are used, improved results are typically obtained when wavelength selection is performed prior to submitting the data to the PCA or PLS algorithms. Effective wavelength selection can result in models with fewer factors, which are easier to deploy and maintain and are less likely to be adversely affected by unmodeled interferences. In this work, a recently proposed global optimization method, particle swarm optimization (PSO), is used to implement an automated wavelength selection procedure for use in building multivariate calibration models based on PLS regression. Compared to some traditional optimization techniques, such as genetic algorithms (GAs), PSO is easy to implement and has few parameters to adjust. The measurement of glucose in a six-component biological matrix by NIR spectroscopy over an extended time period is employed to develop this protocol. PSO is used to optimize the NIR wavelengths supplied to the PLS calculation, the position and width of a preprocessing bandpass digital filter, and the number of latent variables employed in building the calibration model. The results obtained with the PSO method are compared to analogous results obtained with a GA-based procedure. The stability of glucose predictions over time will be a particular emphasis in evaluating the success of the methodology.
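The following sketch illustrates the general shape of a PSO-driven wavelength search wrapped around a PLS model; it is not the authors' implementation, and the synthetic spectra, swarm parameters, and cross-validated RMSE fitness function are assumptions made purely for demonstration (scikit-learn and NumPy are used here).

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)
X = rng.normal(size=(80, 200))                             # 80 spectra x 200 wavelengths (synthetic)
y = X[:, 40:60].sum(axis=1) + 0.1 * rng.normal(size=80)    # "glucose" signal confined to one region

def fitness(mask):
    """Cross-validated RMSE of a PLS model built on the selected wavelengths."""
    if mask.sum() < 5:
        return np.inf
    pls = PLSRegression(n_components=5)
    y_hat = cross_val_predict(pls, X[:, mask], y, cv=5)
    return float(np.sqrt(np.mean((y - y_hat.ravel()) ** 2)))

n_particles, n_iter = 20, 30
pos = rng.uniform(-1, 1, size=(n_particles, X.shape[1]))   # continuous particle positions
vel = np.zeros_like(pos)
pbest, pbest_fit = pos.copy(), np.full(n_particles, np.inf)
gbest, gbest_fit = pos[0].copy(), np.inf

for _ in range(n_iter):
    for i in range(n_particles):
        mask = 1 / (1 + np.exp(-pos[i])) > 0.5             # sigmoid maps position to on/off wavelengths
        f = fitness(mask)
        if f < pbest_fit[i]:
            pbest_fit[i], pbest[i] = f, pos[i].copy()
        if f < gbest_fit:
            gbest_fit, gbest = f, pos[i].copy()
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = pos + vel

n_selected = int((1 / (1 + np.exp(-gbest)) > 0.5).sum())
print(f"best cross-validated RMSE {gbest_fit:.3f} with {n_selected} wavelengths")
```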

Choosing Chemometrics Tools

Oswin Galtier, Scott Ramos, Jacques Artaud,  Yveline Le Dréau, Jacky Kister, and Nathalie Dupuy

Paul Cezanne University, St. Jerome, Case 451, 13397 Marseille Cedex 20, France

Multivariate algorithmic approaches seen in the chemometric literature are applied to solve various problems, including pattern recognition, classification, and quantification. In most cases, a single algorithm (or, at most, 2) is used to demonstrate success (or, sometimes, failure) for a particular application. Invariably, a single implementation of the algorithm is used, whether it is from a commercial software package or from a custom developed platform.

This study was developed to compare both algorithms and implementations commonly used in chemometric work. To serve this purpose, French virgin olive oil samples were collected. Fatty acids and triacylglycerols were determined by chromatography. To support the evaluation of pattern recognition methods, the registered designations of origin (RDOs) were recorded; discrimination of the different oils was then attempted by several methods: PCA, KNN, SIMCA, and PLS-DA.

Performance of the various chemometric algorithms was evaluated using three common software packages: PLS Toolbox (Eigenvector Research), Pirouette (Infometrix), and Unscrambler (CAMO).
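As a hedged illustration of how such a comparison can be prototyped with open-source tools alongside the commercial packages named above, the sketch below runs PCA, KNN, and PLS-DA in scikit-learn on placeholder composition data (SIMCA has no standard scikit-learn implementation and is omitted); the feature matrix and RDO labels are invented.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score
from sklearn.preprocessing import StandardScaler, LabelBinarizer
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(1)
X = rng.normal(size=(120, 30))              # 120 oils x 30 composition variables (placeholder)
labels = rng.integers(0, 4, size=120)       # 4 hypothetical RDO classes

# PCA for an unsupervised look at the groupings (scores could be plotted)
pca = PCA(n_components=2)
scores = pca.fit_transform(StandardScaler().fit_transform(X))
print("PCA explained variance:", pca.explained_variance_ratio_.round(3))

# KNN classification accuracy, 5-fold cross-validation
knn = make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=3))
print("KNN accuracy:", cross_val_score(knn, X, labels, cv=5).mean())

# PLS-DA: regress one-hot class membership on the composition data
Xs = StandardScaler().fit_transform(X)
Y = LabelBinarizer().fit_transform(labels)
plsda = PLSRegression(n_components=5).fit(Xs, Y)
pred = plsda.predict(Xs).argmax(axis=1)
print("PLS-DA training accuracy:", (pred == labels).mean())
```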

Monitoring Nitrogen Compounds in Surface Waters by Ion-Selective Electrode

John N. Driscoll, John Hamm, Gongmin Lei, Lacy Prior, and Patricia Hogan

PID analyzers, LLC, 780 Corporate Park Dr., Pembroke, MA 02359, USA

Nitrogen compounds are a serious issue for Cape Cod (Massachusetts, USA), in part because of the large number of septic systems and the runoff of fertilizers from lawns. As a result, denitrification is of great concern on the Cape. The water testing laboratories in Barnstable County run samples for all 15 towns on Cape Cod. Typically, the nitrogen samples are run by ion chromatography (IC).

It would be useful to have a rapid and portable method for measuring NH3, NO2, and NO3 in the field. The colorimetric methods for these nitrogen compounds are cumbersome and are not easily adaptable to field work.

Ion-selective electrodes (ISEs) for environmental analysis of ammonia in discharges have been approved by EPA for > 30 years. Recently, the EPA approved the nitrate and nitrite electrodes for the measurement of NO3 and NO2 in water. We will be using ISEs from PID Analyzers for this study. The potential will be determined with the PID Model 104 water quality analyzer following the addition of the appropriate buffer solution.

Water samples will be collected from selected sites on the Cape, analyzed by a standard method (IC) by the Barnstable County Water Labs, and then analyzed using PID Analyzers ISEs for NH3, NO2, and NO3. If high-nitrogen areas are located during the initial sampling, additional testing will occur in these areas. For high-nitrogen areas, the sites will be tested in the field using PID Analyzers ISEs, and then samples will be collected and returned to the lab for testing by IC.

We will compare the methods by IC and ISE, determine whether any interferences are found, and evaluate the ease of use of the ISE methods in the field.
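For orientation, ISE readings are typically converted to concentration through a Nernstian calibration, E = E0 + S·log10(C). The sketch below shows that conversion with invented standards and potentials; it is a generic illustration, not the PID Analyzers procedure.

```python
import numpy as np

std_conc_mg_L = np.array([1.0, 10.0, 100.0])        # nitrate calibration standards
std_potential_mV = np.array([140.0, 84.0, 28.0])    # measured electrode potentials (invented)

# Nernstian fit: E = intercept + slope * log10(C)
slope, intercept = np.polyfit(np.log10(std_conc_mg_L), std_potential_mV, 1)
print(f"electrode slope: {slope:.1f} mV/decade (ideal ~ -59.2 mV for a monovalent anion)")

def conc_from_potential(E_mV):
    """Invert the calibration to obtain concentration from a sample potential."""
    return 10 ** ((E_mV - intercept) / slope)

print(f"sample at 70 mV -> {conc_from_potential(70.0):.1f} mg/L nitrate")
```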

Analysis of Emerging-Haloacid Disinfection By-Products Using Automated Postcolumn Reaction Ion Chromatography with Nicotinamide Fluorescence

Patricia L. Ranaivo, Paul S. Simone, and Gary L. Emmert

Chemistry Department, The University of Memphis, Room 213, Smith Chemistry Building, Memphis, TN 38152, USA

Chlorination is commonly used to disinfect drinking water in the USA. However, chlorine reacts with natural organic matter (NOM) in the water, forming a range of disinfection by-products (DBPs) of public health concern. The halogenated by-products predominantly formed are the trihalomethanes (THMs) and haloacetic acids (HAAs). The US Environmental Protection Agency (USEPA) currently regulates five HAAs in drinking water: monochloroacetic acid (MCAA), dichloroacetic acid (DCAA), trichloroacetic acid (TCAA), monobromoacetic acid (MBAA), and dibromoacetic acid (DBAA). The current maximum contaminant level (MCL) for the total concentration of these five HAAs (HAA5) is 0.060 mg/L. Four unregulated HAAs can also be present in drinking water: bromochloroacetic acid (BCAA), bromodichloroacetic acid (BDCAA), dibromochloroacetic acid (DBCAA), and tribromoacetic acid (TBAA). The HAA5 species and the four unregulated HAAs are together called HAA9. Recently, iodinated HAAs have also been identified in drinking water.

An online automated method has previously been developed to analyze HAA9 species using postcolumn reaction ion chromatography with nicotinamide fluorescence (PCR-IC). This gives method detection limits (MDLs) in the single μg/L range, with good mean percent recoveries and percent relative standard deviations.

The goal of this work is to expand the PCR-IC method to include other haloacid analytes, such as iodoacetic acid and 2,2-dichloropropionic acid (Dalapon). Additionally, internal standardization will be explored, initially using 2-bromobutanoic acid as a possible internal standard for PCR-IC.
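As a generic illustration of the internal-standard quantitation being considered (not the authors' procedure), the sketch below ratios each analyte peak area to the internal standard area before calibration, which cancels injection and reaction-yield drift; all areas and concentrations are invented.

```python
import numpy as np

cal_conc_ug_L = np.array([1, 5, 10, 25, 50])             # calibration levels
analyte_area = np.array([210, 1020, 2080, 5150, 10300])  # analyte peak areas (invented)
is_area = np.array([5000, 5100, 4950, 5050, 4980])       # constant internal standard spike

# calibrate the area ratio against concentration
slope, intercept = np.polyfit(cal_conc_ug_L, analyte_area / is_area, 1)

def quantify(sample_area, sample_is_area):
    """Convert a sample's area ratio back to concentration (ug/L)."""
    return ((sample_area / sample_is_area) - intercept) / slope

print(f"sample -> {quantify(3300, 4900):.1f} ug/L")
```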

Optimization of a GC/MSD Method for Analyzing Polychlorinated Biphenyls in Human Serum

Buu Tran, Li Zhang, and Robert L. Jansing

Wadsworth Center, New York State Department of Health, P.O. Box 509, Albany, NY 12201, USA

Optimization of analytical instrumentation is essential when performing ultra-trace-level analyses of persistent organic pollutants in human serum. This research describes the analysis of polychlorinated biphenyls (PCBs) in human serum using gas chromatography with mass-selective detection (GC/MSD) in the selected ion monitoring (SIM) mode. In this study, human serum specimens were initially spiked with PCB congeners and extracted by an automated solid phase extraction (SPE) system using silica-based C18 cartridges. The extracts were analyzed at two ion source temperatures, with 230°C being the default temperature and 300°C being the maximum allowable one. The use of the high ion source temperature increased the abundance of high-mass ions and also increased response factors by an average of 10- to 20-fold. Ten replicate fortifications of serum at three different levels of 0.5, 1, and 10 ng/g gave mean recoveries of 110, 108, and 85% with relative standard deviations (RSDs) of 10.0, 5.4, and 7.0%, respectively. We will also demonstrate excellent linearity from 0.5 to 100 ng/mL at an ion source temperature of 300°C, as well as a calculated method detection limit of 0.2 ng/g serum. We will present data obtained using analytical standards, serum spiked with 21 PCB congeners, and human serum specimens extracted by an automated solid phase extraction system.

An Automatic Viscosity Control System for the Preparation of Pharmaceutical Tablet Coating Solution

Woo Sok Chang, Keith Kahmann, and Christos Monovoukas

Levitronix LLC, 45 First Ave., Waltham, MA 02451, USA

Pharmaceutical coatings on tablets play a significant role in a product's function. They increase the tablets' longevity and keep them intact until they are swallowed. To accomplish these vital functions, pharmaceutical coatings must be accurately prepared and precisely applied. Tablet coatings have traditionally been sprayed onto the product while the tablets are constantly moving in a tumbler. Maintaining consistent coating thickness during this process is an important quality-control parameter. Applying a coating that is thinner than specifications shortens the tablet shelf life. On the other hand, applying a coating that is thicker than specifications alters the tablets' release characteristics. To strike a balance between these two undesirable extremes, tablet manufacturers have implemented routine measurements of the coating's viscosity. The viscosity of the coating material directly affects the size of the droplets coming out of the spray nozzle and determines the spray pattern. Traditionally, viscosity measurements have been carried out offline: periodic samples are taken from the production line to an analytical laboratory where viscosity is measured. However, offline measurements interrupt production, engage valuable human resources, and fail to provide adequate process feedback. In this paper, a viscosity feedback control system is introduced to improve the quality and productivity of tablet coating solution preparation. The system is modeled, analyzed, and designed. Experimental results describe the performance of the viscosity feedback control system, including response speed, accuracy, and steady-state error. The reference tracking accuracy of the viscosity feedback control system is better than +/− 0.3% of the target viscosity using an inline viscometer with a resolution of 0.2% of reading and a repeatability of 1% of reading. The viscometer's unique construction does not use any rotating seals or bearings that could generate impurities and contaminate the fluid. All wetted surfaces are made of Teflon. A version of the inline viscometer made of titanium is also available for processes that use steam as the sterilizing agent. Pharmaceutical companies now have an effective and automated viscosity feedback control system that they can deploy for spray coating of tablets to improve productivity, repeatability, and overall product quality.
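A minimal sketch of the feedback idea, assuming a simple first-order process model and PI gains invented for illustration (this is not the Levitronix controller), is shown below: the inline viscosity reading is compared with the target and a dosing command is adjusted until the target is tracked.

```python
# PI control of a simulated coating-solution viscosity.
target_cP, dt_s, tau_s, gain = 100.0, 1.0, 60.0, 2.0   # process: dV/dt = (gain*u - V)/tau
kp, ki = 1.0, 0.05                                     # PI gains (assumed)
viscosity, integral = 80.0, 0.0                        # start below the target

for _ in range(600):                                   # 10 minutes of 1 s control steps
    error = target_cP - viscosity                      # compare target with inline reading
    integral += error * dt_s
    u = kp * error + ki * integral                     # dosing command
    viscosity += dt_s * (gain * u - viscosity) / tau_s # simulated viscometer response

print(f"viscosity after 10 min: {viscosity:.1f} cP (target {target_cP} cP)")
```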

Simultaneous Measurement of Transformation Energetics, Mass Changes, and Evolved Gas Analysis Using Advanced Thermal Analysis Instrumentation

David Shepard

Netzsch Instruments, Inc., 37 North Avenue, Burlington, MA 01803, USA

Simultaneous thermal analysis (STA) refers to the simultaneous application of two or more thermoanalytical methods to one sample at the same time (usually thermogravimetry and differential scanning calorimetry). The benefits of such a system are obvious. Frequently, the material available for testing is costly or difficult to produce. Using STA, it is possible to obtain information on the transformation energetics and the mass change of one sample in one run under identical conditions. Glass transitions, melting behavior, evaporation of plasticizers, and decomposition can, for example, be analyzed within one test run. Furthermore, such systems can easily be coupled to a quadrupole mass spectrometer (QMS) or a Fourier transform infrared spectrometer (FT-IR) to analyze the gases evolved during evaporation or decomposition reactions.

The technical details of the state-of-the-art simultaneous thermal analyzers and the interfaces to evolved gas analysis systems are presented. The application of STA-QMS systems on different organic materials (tobacco, cleaning aids) is shown. Furthermore, an FT-IR system connected to an STA was used to characterize the gas release during decomposition of cigarette filters. The examples clearly show the possibilities and benefits of combined systems in the fields of materials research and quality control.

Quality Control Using Standardized Thermal Analysis Techniques

David Shepard

Netzsch Instruments, Inc., 37 North Avenue, Burlington, MA 01803, USA

Over the past few decades, industry has increased its efforts to introduce quality-control systems in all areas of production to improve quality and efficiency of the entire process. Raw materials must be analyzed and characterized to make sure that the composition and structure of process materials remain unchanged. Thermal analysis methods such as thermogravimetry, differential scanning calorimetry, or thermomechanical analysis are well-established fast analytical techniques for materials characterization, especially in the polymer, pharmaceutical, metal, and ceramic industries.

In this case, however, one has to make sure that the instrument delivers reliable results that do not depend on the specific instrumentation used. Therefore, internationally accepted standards are introduced into the analytical techniques to ensure that the instrument fulfills specific requirements and that the tests are carried out under comparable conditions.

Some examples of standards for materials characterization based on thermal analysis techniques, frequently used in industrial research and quality-control labs, are presented. The key contents of ASTM, ISO, and DIN EN standards are mentioned, measurement examples are presented, and the link to the production process is given. Furthermore, thermal analysis techniques generally used in the quality-control labs of modern industrial production plants are shown.

Automatic Band-Target Entropy Minimization Curve Resolution

Wee Chew, Suat-Teng Tan, and Haohao Zhu

Institute of Chemical and Engineering Sciences, 1 Pesek Road, Jurong Island, Singapore 627833

A significant modification to the well-proven band-target entropy minimization (BTEM) curve resolution technique, coined automatic BTEM (AutoBTEM), was recently developed. It provides several enhancements to the original BTEM algorithm, notably, (i) automatic band-targeting of prominent/localized spectral features (band-targets) found in the abstract right orthonormal singular V^T vectors using a statistical function, (ii) automatic specification of the band-target range (upper and lower spectral limits) using a novel user-specified bandwidth and band-targets clustering algorithm, and (iii) a near-blind source separation (near-BSS) of multicomponent spectroscopic mixtures into constituent pure component spectra via large-scale BTEM optimization runs and subsequent application of an unsupervised leader-follower cluster analysis.

This AutoBTEM algorithm has been successfully tested on 30-mixture Raman spectral data comprising 10 common laboratory solvents. This 10-component mixture Raman dataset is a very challenging analytical problem: firstly, the 10 organic solvents possess highly overlapping bands in the region of ca. 800–1050 cm−1, and secondly, the number of underlying spectral components/factors is large. Furthermore, two serendipitous corollaries were found in this AutoBTEM approach; namely, (i) the statistical function employed for automatic band-targeting possesses predictive capability to estimate the number of pure components that exist in a spectral dataset, and (ii) spectral band (peak center position) shifts of each potential band-target can be quantitatively estimated from the band-targets clustering algorithm.

Monitoring of Heavy Metal Levels in Water from the River Cauca in the Urban Zone of the City of Cali in Colombia

Ramiro Sanchez, Alejandro Soto, Gloria A. Jimenez, and Fernando E. Larmat

Universidad del Valle, Calle 13 100-00, Cali, Valle AA25360, Colombia

The river Cauca is an important hydric resource for the Republic of Colombia and especially for the city of Cali, located in the southwestern region of the country. Unfortunately, the city draws the water for population consumption from the river, which has been the site of heavy-metal contamination from several sources. This has resulted in widespread contamination of the water and surrounding sediments by heavy metals such as mercury, lead, cadmium, and chromium. It is well known that these metals can be introduced into the food chain by several species of native fish. Consumption of contaminated fish by people living around the river has resulted in moderate contamination and health problems.

This work will present the determination of the heavy metals Hg, Pb, Cd, and Cr in samples of water from the river Cauca over a period of 6 months. For the study, five sampling stations were used, and sampling was done during the rainy and dry seasons. Pb, Cd, and Cr were adsorbed from the water samples using the resin XAD16, and the extracts were later analyzed by graphite furnace atomic absorption. Hg was analyzed by cold vapor atomic absorption.

Results indicate a moderate-to-high contamination, especially in mercury and lead with concentrations up to hundreds of ppm.

Hyphenated Techniques as Modern Detection Systems in Ion Chromatography

Andrea Wille, Stefanie Czyborra, and Jörg Kleimann

Metrohm AG, Oberdorfstrasse 68, 9101 Herisau, Switzerland

By coupling ion chromatography with mass spectrometry, new fields of applications can be developed. In this work, IC/MS and IC-ICP/MS are used for the qualitative as well as quantitative analysis of environmentally and industrially relevant compounds.

A great benefit of the hyphenated techniques is that they can reach the high sensitivities required for the analysis of widespread water contaminants. The US Environmental Protection Agency and the European Union currently prescribe a maximum bromate concentration of 10 ppb in drinking water. For mineral waters, the pertinent regulations stipulate a limit of 3 ppb. For perchlorate, a 6 ppb public health goal has been established in the USA. By means of IC/MS coupling, detection limits in the lower ng/L range are achieved. Additionally, the higher selectivity helps to avoid coelutions and to suppress matrix interferences to an absolute minimum. Using IC/MS, it is possible to quantify carboxylic acids reliably in the presence of a high-salt matrix containing approximately 100 g/L chloride.

The IC-ICP/MS technique is used for an improved speciation analysis of hazardous substances. Organic and inorganic compounds can be determined in a single run with this hyphenated technique. Typical applications of IC-ICP/MS are the analyses of chromium(III) and (VI) as well as the speciation of arsenic or selenium compounds.

Another application of the hyphenated techniques is unambiguous peak identification. Using IC/MS, parent compounds and corresponding metabolites can be identified according to their mass/charge ratio. IC-ICP/MS is also used for element-specific analysis.

Optimization of TOC Analysis Methods for Industrial and Environmental Testing Applications

Jeffrey R. Lane and William Lipps

OI Analytical, 151 Graham Road, College Station, TX 77845, USA

Total organic carbon (TOC) analysis is being used in an increasing number of industrial and environmental testing applications. The unique chemical and physical composition of samples such as metal plating solutions, wastewater, and drinking water precludes the use of a single, universal sample oxidation method for TOC analysis. Optimization of wet oxidation methods ensures that consistent and reliable data is obtained from the analysis of complex matrices.

This paper will describe the key variables and an efficient approach to establishing and validating effective TOC methods. Comparison data for other types of oxidative analyzers will also be provided.

Identification of the G-Protein Coupling Mechanism of GPCRs Using a Label-Free Live-Cell Assay

Qin Chen, Julia Michelotti, Roger Tang, and Ed Verdonk

MDS Analytical Technologies, 1311 New Orleans Drive, Sunnyvale, CA 94089, USA

G-protein-coupled receptors (GPCRs) represent the largest target class in drug discovery, and they are important in therapeutic areas such as heart disease, metabolism, and immune disorders. GPCR function is commonly assessed in cells artificially overexpressing the receptor of interest using fluorescent- or bioluminescent-based probes. With the CellKey™ system, functional activation of differently coupled endogenous GPCRs can be measured in the same assay format and in their natural setting. In this study, we first demonstrate the sensitivity and ease of use of the CellKey™ system by measuring activation of multiple differently coupled endogenous receptors on a single plate of U-2 OS cells. The CellKey™ software is then used to group the unique response profiles generated by activation of these receptors by a predicted G-protein coupling mechanism. We also demonstrate the use of the CellKey™ system to confirm the subtype of the histamine receptor indicated by the unique CDS kinetic response profile with the use of selective agonists and antagonists. Lastly, it is becoming increasingly apparent that crosstalk between signaling pathways occurs as a result of the normal activation of GPCRs. We show data to illustrate that the CellKey™ system can reveal a complex interplay among some GPCR-mediated signaling pathways, and that the data provide information to allow deconvolution of a complex signaling pathway. These studies detail powerful techniques that can be applied to selectivity screening of compounds and endogenous target validation.

Improved High-Throughput Soils’ Analysis for Nitrate/Nitrite, Ammonia, and Phosphate Using Flow Injection Analysis

Lynn R. Egan

Lachat Instruments, 5600 Lindbergh Drive, Loveland, CO 80538, USA

Lachat Instruments has improved the throughput of the flow injection methods for nitrate/nitrite (2 M KCl), ammonia (2 M KCl), and orthophosphate (Bray and Mehlich) in soils so that 120 samples per hour can be analyzed for each analyte. This is a significant improvement (33–100%) over prior methods using flow injection analysis. Data will be presented showing the throughput improvements for each analyte. In addition, configurations of the instrument will be shown, which will allow users to optimize the throughput for samples with different matrices.

Automated Analysis of Amylase Using Microvolumes

Melanie Geaslin, Dave Glutz, and Larry Anderson

EST Analytical, 503 Commercial Drive, Fairfield, OH 45014, USA

A 4-minute, single-reagent method utilizing kinetic readings over a 2-minute period has been developed for a completely automated discrete analyzer. A 20 μL sample is added to 230 μL of heat-stabilized reagent, and 5 readings are taken every 28 seconds at 405 nm. Results are quantified against a multipoint calibration. No sample preparation or filtration is necessary, and disposable cuvettes are used to eliminate any possibility of carryover. Cuvettes consist of 12 different reaction cells, allowing 6–12 reactions to occur simultaneously. Fixed timing controlled in the method setup ensures that readings occur at consistent intervals and rates. Side wavelength readings can be incorporated for additional correction of potential interferences. Models are available with test rates from 2 to 600 tests per hour.
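As a simple illustration of how a kinetic rate can be extracted from the five absorbance readings and converted to an activity, the sketch below fits a straight line to the readings; this is an assumed example, not the analyzer's firmware, and the readings and calibration factor are invented.

```python
import numpy as np

t_s = np.array([0, 28, 56, 84, 112])                 # reading times, seconds (every 28 s)
A405 = np.array([0.112, 0.158, 0.201, 0.247, 0.290]) # absorbance readings at 405 nm (invented)

rate_per_min, _ = np.polyfit(t_s / 60.0, A405, 1)    # kinetic rate, dA/dt in AU/min
cal_factor_U_L_per_AU_min = 320.0                    # hypothetical multipoint-calibration slope
activity_U_L = rate_per_min * cal_factor_U_L_per_AU_min
print(f"rate {rate_per_min:.3f} AU/min -> amylase activity {activity_U_L:.0f} U/L")
```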

Amylase enzymes are used extensively in bread making, high-fructose corn syrup production, and sometimes in the manufacture of biofuels. Amylase has also become an important ingredient in many cleaning products and detergents. Amylase breaks down complex sugars such as starch (found in flour and corn) into simple sugars. In the bread-making industry, yeast then feeds on these simple sugars and generates alcohol and CO2 as waste products. This process imparts flavor and causes the bread to rise. Amylase is an important enzyme in corn syrup manufacturing and grain alcohol production. Amylase from Bacillus bacteria is used in detergents to dissolve starches from fabrics. Amylase has also been used to control viscosity in the chocolate industry.

Rapid HPLC-UV Method for the Simultaneous Quantification of Zidovudine, Abacavir, and Nevirapine in Plasma Using a Monolithic Column

Koteshwara Mudigonda, Devender Ajjala,  Vishwottam Kandikere, and Ramakrishna Nirogi

Suven Life Sciences Limited, Serene Chambers, Road no. 5, Avenue 7, Banjara Hills, Hyderabad, Andhra Pradesh 500034, India

Purpose. The purpose of the present investigation is to develop and validate a rapid, sensitive, and simple high-performance liquid chromatography method for the quantification of zidovudine, abacavir, and nevirapine in human plasma with a short runtime and a quantification limit sufficiently low to support pharmacokinetic and bioequivalence studies. Methods. The method employed a liquid-liquid extraction from 0.5 mL plasma with a mixture of diethyl ether and dichloromethane, followed by extract evaporation and reconstitution. All three analytes and the IS were separated using an isocratic mobile phase on a monolithic column (100 mm × 4.6 mm) at a flow rate of 1.5 mL/min. Zonisamide was used as the internal standard to account for differences due to adsorption and extraction. Detection was set at a wavelength of 220 nm. Results. Zidovudine, abacavir, nevirapine, and the IS eluted at retention times of 2.0, 2.4, 2.9, and 6.0 minutes, respectively. The short retention times were achieved with the monolithic column, which withstands high flow rates with low back-pressure. The method allows quantification of all three analytes over the 40–5000 ng/mL range and is fully validated as per FDA guidelines. The validation data demonstrate good precision (<4% CV) and accuracy (<8% RE). Conclusions. A rapid and specific assay has been successfully developed and applied for the quantification of zidovudine, abacavir, and nevirapine in human plasma samples.

Automating Method Development with an HPLC System Optimized for Scouting of Columns, Eluents, and Other Method Parameters

Wulff Niedner, Marco Karsten, Frank Steiner, and Remco Swart

Dionex Corporation, Dornierstrasse 4, 82110 Germering, Germany

HPLC method development is still considered one of the crucial bottlenecks that impede productivity in analytical laboratories. Owing to the variety of available columns, the proper selection of the stationary phase is usually the greatest challenge. Despite all efforts in the fields of phase characterization and phase property indexing, an unambiguous selection approach is not available. With the introduction of alkaline-stable silica-based stationary phases, a large palette of mobile phase additives and pH values is applicable in reversed-phase chromatography. This results in a vast number of parameters that have to be characterized in method development.

We present a new integrated system that allows automatic and intuitive scouting of columns, eluents, and other important method parameters, for example, column temperature. The system includes quaternary gradient capabilities, an autosampler compatible with well plate and standard sample formats, and a diode array detector. Up to two column selection valves in the column compartment provide high application flexibility. Intelligent software makes parameter permutation easy without requiring method changes. Data mining tools help evaluate results and identify optimal conditions.

The application of this system to automatically screen possible combinations of 6 different columns, 3 different organic solvents, 3 different pH values, and 2 different temperatures for evaluating the separation problem is presented. Peaks are tracked either via UV/Vis spectra comparison, or, when the system is equipped with a single-quadrupole mass spectrometer, via mass comparison. A spreadsheet-based reporting tool provides a peak resolution chart. The most promising combinations of eluents, stationary phase, and temperature are automatically predicted.
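The combinatorics of such a scouting campaign are easy to enumerate; the sketch below (a generic illustration with placeholder factor levels, not the Dionex software) lists every column/solvent/pH/temperature combination so that each can be queued as a separate run.

```python
from itertools import product

columns = [f"column_{i}" for i in range(1, 7)]        # 6 stationary phases (placeholders)
solvents = ["acetonitrile", "methanol", "THF"]        # 3 organic modifiers (assumed)
pH_values = [2.5, 7.0, 10.0]                          # 3 buffer pH levels (assumed)
temperatures_C = [30, 50]                             # 2 column temperatures (assumed)

runs = list(product(columns, solvents, pH_values, temperatures_C))
print(f"{len(runs)} scouting runs")                   # 6 * 3 * 3 * 2 = 108 combinations
for col, solv, pH, temp in runs[:3]:                  # show the first few queued runs
    print(f"{col}, {solv}, pH {pH}, {temp} C")
```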

Rapid Resolution Liquid Chromatography with Charged Aerosol Detection

Ian N. Acworth, Darwin J. Asa, Eddie Goodall,  John Christensen, and Ryan McCarthy

ESA Biosciences, Inc., 22 Alpha Road, Chelmsford, MA 01824, USA

Rapid resolution liquid chromatography (RRLC) has become an increasingly useful approach to achieve higher throughput, improve sensitivity, and reduce costs. The Agilent 1200 “rapid resolution” LC system enables faster analysis (up to 20×) than with conventional HPLC, whilst maintaining resolution. This is achieved by using sub-2 micron column particle chemistry and high flow rates. Higher temperatures are used to minimize system back-pressure.

With the widespread adoption of RRLC it is necessary to question the compatibility of the HPLC detector. Presented here is the use of “charged aerosol detection” (Corona CAD) with conventional HPLC and RRLC. CAD is an evaporative technique and is based on the charging of aerosol-borne analyte particles by nitrogen gas and on corona discharge, with subsequent measurement of the charge, derived from the charged analyte particles by a high-sensitivity electrometer. The compatibility of the CAD with RRLC is evaluated using the analysis of five phenolic acids as an example.

Using conventional HPLC, the runtime was 7 minutes and the peak widths were 10.08, 12.18, 12.66, 17.82, and 16.86 seconds. With RRLC, the runtime was 1.5 minutes and the peak widths were 3, 3.18, 3.60, 4.02, and 4.20 seconds. Using RRLC thus reduced the runtime approximately five-fold, a saving of roughly 5.5 minutes per run. The CAD proved fully capable of measuring rapid peaks of just 3 seconds at base. Furthermore, the sensitivity was improved three-fold: the LODs for gallic acid and 4HPAC using HPLC were 13.1 ng and 23 ng, whereas the LODs using RRLC were 4.3 ng and 8.1 ng. The data showed that the CAD is fully compatible with RRLC, as long as the width of the analyte peaks is at least 3 seconds at base. Improved runtimes and increased sensitivity were also demonstrated.

Simultaneous Determination of Cations, Zwitterions, and Neutral Compounds Using Mixed-Mode Reversed-Phase and Strong Cation Exchange High-Performance Liquid Chromatography

Jingyi Li, Shan Shao, Markian S. Jaworsky, and Paul T. Kurtulik

Celgene Corporation, 86 Morris Avenue, Summit, NJ 07901, USA

A novel mixed-mode reversed-phase and cation exchange HPLC method is described to simultaneously determine four related impurities (cations, zwitterions, and neutral compounds) in developmental drug A. The commercial column is Primesep 200, containing hydrophobic alkyl chains with embedded acidic groups in the H+ form on a silica support. The mobile phase variables of acid additives, acetonitrile content, and potassium chloride concentration were thoroughly investigated to optimize the separation. The retention factors as a function of the concentration of potassium chloride and the percentage of acetonitrile in the mobile phase were investigated to gain insight into the retention and separation mechanisms of each related impurity and of drug A. The study found that the positively charged degradant 1, degradant 2, and drug A were retained by both ion-exchange and reversed-phase partitioning mechanisms. RI2, a small ionic compound, was primarily retained by ion exchange. RI4, a neutral compound, was retained through reversed-phase partitioning without ion exchange. Furthermore, an initial attempt to simultaneously retain and separate these related impurities and drug A on a conventional RP-HPLC column by adjusting the pH of the mobile phase is also discussed. Finally, the method performance characteristics of selectivity, sensitivity, and accuracy have been demonstrated to be suitable for determining the related impurities in capsules of drug A.

Rajendra B. Kakde and M. A. Khan

RTM Nagpur University, Campus Amravati Road Nagpur, Navi Mumbai, Maharashtra 440 033, India

An isocratic reversed-phase high-performance liquid chromatographic (RP-HPLC) method for the identification and quantification of clarithromycin and related substances in bulk drugs and pharmaceutical preparations has been developed using a Prodigy C18, 250 × 4.6 mm, 5 μm column with an acetonitrile and 0.067 M monobasic potassium phosphate (44:56, v/v) mobile phase adjusted to pH 4 with orthophosphoric acid. Detection was carried out using a photodiode array detector set at 210 nm, with a flow rate of 1 mL/min and the column temperature controlled at 50°C. The method is simple, rapid, selective, and capable of detecting related impurities at trace levels in the bulk drug and pharmaceutical preparations; the detection and quantification limits for the related substances (impurity A) were found to be 2 and 5 μg, respectively, on the basis of signal-to-noise ratio. The linearity range was found to be 2–100 μg/mL with a coefficient of correlation of 0.999. The percent recoveries from the pharmaceutical preparations ranged from 98 to 102%. The method has been validated with respect to accuracy, precision, linearity, ruggedness, and limits of detection and quantification.

Inline Sample Preparation: An Effective Tool for Ion Analysis in Pharmaceutical Products

N. Harihara Subramanian, S. Thyagarajan,  V. R. Sankar Babu, and M. R. Hariganesh

Micro Devices Metrohm Ltd., 1st Avenue, Indira Nagar, Adyar, Chennai 600020, India

The determination of traces of inorganic contaminants in concentrated ionic pharmaceuticals by ion chromatography (IC) requires inline matrix elimination from the bulk of the pharmaceutical to avoid overloading of the analytical column. Several methods in a new monograph of the US Pharmacopeia suggest the use of a rinsing solution consisting of the IC eluent and suitable organic solvents to remove the pharmaceutical from the analytical column. However, the procedure is tedious and time-consuming, and cannot be automated.

Inline sample preparation is an excellent method for the determination of anions and cations in complex pharmaceutical samples. In this paper, the inline sample preparation technique is applied to the determination of sodium azide in irbesartan samples. To prevent contamination of the analytical column, the filtered sample solution is passed through an anion preconcentration column that retains the azide and the pharmaceutical. After this, the preconcentration column is rinsed with a solution of 70% methanol and 30% water to remove the retained pharmaceutical. The rinse solution is passed through an anion trap column to remove any anionic impurities.

The method complies with the guidelines of the US Food and Drug Administration (FDA). Reproducibility and recovery are excellent, and the detection limits obtained with inline sample preparation are better than those achieved by direct-injection IC. Additionally, the new inline sample preparation technique can easily be automated.

Determination of the Water Content in Tablets by Automated Karl Fischer Titration

Regina Schlink and Heike Risse

Metrohm AG, Oberdorfstrasse 68, 9101 Herisau, Switzerland

The water content of medicaments influences the release of their active substances as well as their shelf life. Accordingly, the water content is an essential parameter that needs to be determined.

Normally, tablets prove hard to dissolve in a Karl Fischer working medium containing alcohol. This means that complicated sample preparation steps are necessary prior to a water content determination. These steps are time-consuming and can also lead to incorrect results: the surface area of the tablet after comminution is much larger, so moisture from the atmosphere can be absorbed more easily.

Metrohm has developed a method for determining the water content of tablets accurately and without any complicated sample preparation steps.

When using a high-frequency homogenizer, complicated sample preparation steps are no longer necessary. The high-frequency homogenizer reduces the tablets to small particles directly in the titration vessels containing the KF solution. As the titration vessels are completely sealed until the determination starts, atmospheric humidity has no effect on the sample water content.

The use of an automated system considerably reduces the workload of the user, and is therefore ideal for high sample throughputs. The sample changers can be adapted to meet all requirements and permit the fully automatic determination of the water content of tablets in a way that saves both time and money. The high-frequency homogenizer can be easily integrated into the system.

Fully Automated Sample Preparation of Tablets

Heike Risse

Metrohm AG, Oberdorfstrasse 68, 9101 Herisau, Switzerland

Producing tablets for the pharmaceutical industry includes proving that the active substance content printed on the package is valid for every single tablet of a batch. Different modern analytical techniques such as titration or ion chromatography are used for the highly accurate quantitation of the components. Nevertheless, these analytical techniques are only as good as the sample preparation preceding them. Depending on the shape, coating, filler components, and active substance concentration of the pharmaceutical, different sample preparation steps have to be carried out before the analysis proper.

The first step is always the thorough homogenization of the tablet in a suitable solvent mixture. Depending on the determination technique that follows, filtration, additional dilution, or pipetting into new sample vials is required. In most laboratories these steps are carried out manually, which can result in carryover or erratic results, as the manual operation is influenced by many different circumstances. These steps are very time-consuming, and the organic solvents used often present a health hazard.

An automated system that performs every sample preparation step in exactly the same way can avoid these problems. Only the weighing of the tablets has to be done manually. The content of the active substance in a single tablet as well as the product conformity of the whole batch can be determined with just one automation system. Automation not only improves reproducibility, accuracy, and throughput, but also increases laboratory safety.

Online Monitoring of the Drying of a Drug Product

Jeremy A. Linoski

ABB, Inc., 433 Northpark Central Dr., Suite 100, Houston, TX 77073, USA

Numerous pharmaceutical companies are moving towards more real-time online process control monitoring with the advent of the FDA PAT initiative. Powder analysis has always been a challenge for analytical chemists. Traditional techniques for monitoring drying and solvent operations, loss-on-drying (LOD) and Karl Fischer (KF) analyses, involve removing samples from the process and making measurements in a laboratory. LOD and KF are time-intensive techniques, and therefore dryers must be powered down to avoid the possibility of overshooting the dryness mark.

An online, real-time, and nondestructive technique is required to provide real-time feedback, enhance process knowledge and control, enable continuous quality monitoring, measure all “critical-to-quality” attributes, and reduce cycle time. Inline FT-NIR is a perfect technology for this type of application.

A case study of an ABB dryer monitor installed into a dryer to evaluate the feasibility of monitoring moisture during drying runs is presented.

Bringing the Laboratory at Line

Jennifer Camarda and Frank Portala

Brinkmann Instruments, Inc., 1 Cantiague Road, Westbury, NY 11590, USA

This paper describes a fully customized at-line automation system suitable for analyses within a process plant, where analytical determinations to monitor a production process are crucial in achieving constant product quality. Its wide scope of applications includes techniques such as direct measurement (pH, redox potential, conductivity, ISE), potentiometric titration (aqueous and nonaqueous), Karl Fischer titration, voltammetry, cyclic voltammetric stripping (CVS), and sample preparation. These can be applied in various industries, including the chemical, petrochemical, electroplating, semiconductor, automotive, steel, pharmaceutical, food/beverage, and pulp and paper industries. ProcessLab's compact and modular design, configuration flexibility, and robustness make it ideal for the severe conditions prevailing in process plants. The hermetically sealed housings of the wet-part modules and electronic components protect them against dust and splashes. ProcessLab offers close positioning to a process allowing for rapid and reproducible results, easy installation and operation, networking and process integration for easy incorporation into the process environment, and effective data monitoring on-site as well as off-site. It is also expandable for additional analyses at a later time. It is compatible with third-party equipment and is more economical than online systems. ProcessLab also features analog and digital input/output signals that can incorporate other process information or send out information, allowing it to react to different input signals, trigger an alarm, or transfer measured values as analog signals. In addition, ProcessLab can be easily integrated into a local network, export data to a LIMS, and be fully operated by remote control.

Method Optimization and Validation of Promethazine Enantiomers Using Micellar Electrokinetic Chromatography-Mass Spectrometry

Francisco Fedullo, William Bragg, Jun He, and Shahab Shamsi

Georgia State University, 1541 Park Creek Ln., Atlanta, GA 30319, USA

Recently, a special interest has grown within the pharmaceutical industry to develop single-enantiomer formulations, and consequently there is a need for analytical methods to determine the enantiomeric purity of drugs. Micellar electrokinetic chromatography (MEKC) coupled to mass spectrometry (MS) using a polymeric surfactant has great potential to serve as a valuable tool for the impurity profiling of chiral drugs, where impurities and degradation products have to be separated from the main compound. In this work, a chiral MEKC-MS method for a hydrophobic chiral drug (promethazine, log P = 4.782) was developed using a short eight-carbon-chain polymeric surfactant (polysodium N-octenoxycarbonyl leucinate, poly-L-SOcCL) as the chiral selector. The method was first optimized sequentially in terms of poly-L-SOcCL, acetonitrile, and ammonium acetate (NH4OAc) concentrations, buffer pH, temperature, and voltage, as well as nebulizer pressure. Next, multivariate optimization of the most critical parameters was also performed. Optimal separation of the promethazine enantiomers was obtained using an electrolyte with 7.5 mM NH4OAc (pH 5.50), 25 mM poly-L-SOcCL, and 55% ACN. Linearity was achieved using the internal standard method. The limits of detection (LOD) and quantitation (LOQ) were 25 μg/mL and 77 μg/mL, respectively. The validated method was applied to the quality control of promethazine enantiomers in commercially available pharmaceutical formulations. The assay of promethazine tablets showed an excellent recovery of 100 ± 0.45%. To ensure the method's suitability for the control of enantiomeric impurities in pharmaceutical quality control, its precision, repeatability, and robustness will be validated according to the requirements of the International Conference on Harmonization.

Daan Zhou

Abbott Laboratories, 1401 Sheridan Road, North Chicago, IL 60064-6289, USA

An ion chromatography method was developed to analyze choline hydroxide aqueous solution for choline potency. The method is also able to detect and quantify related impurities and degradation products, which include choline dimer hydroxide, choline trimer hydroxide, and trimethylamine. Optimization of the ion chromatographic method involved selecting the best combination of analytical column and mobile phase to ensure the resolution of choline hydroxide from the related impurities and degradation products, with selectivity and sensitivity being the key parameters. The developed IC method was validated; this included linearity, accuracy, precision, intermediate precision, practical detection limit, practical quantitation limit, lot analysis, standard solution stability, sample solution stability, robustness for flow rate, and robustness for mobile phase composition.

Validation results of the IC method using a Dionex CS17 column, Dionex CG17 guard column, and Dionex CSRS-Ultra II 4 mm cation self-regenerating suppressor with conductivity detection are discussed in this paper. The function of the suppressor in the ion chromatography method is to reduce the background conductivity while maximizing the response for the analyte cations. The reduction of background conductivity from the eluent translates into low noise and hence improved detectability for the analyte cations. This method has practical detection limits of 0.3 (%w/w), 0.01 (%w/w), and 0.01 (%w/w) for choline dimer hydroxide, choline trimer hydroxide, and trimethylamine, respectively. The validated IC method for choline hydroxide aqueous solution showed adequate linearity, sensitivity, repeatability, and accuracy for its intended use.

The Use of an Automated Compliance Engine with an Automated Business Process Manager to Ensure Instrument Compliance

Robert Giuffre

Agilent Technologies, 550 Clark Drive, Budd Lake, NJ 07828, USA

The qualification of instruments in a GMP/GLP laboratory is one of the most critical aspects of a compliant environment. The use of an instrument that is out of compliance invalidates all samples run on that instrument. In addition, many laboratories are confronted with an array of instruments from different manufacturers, which makes uniform procedures and specifications extremely difficult.

This paper will discuss two products which, when used in combination, can (i) schedule the qualification of an instrument, (ii) put the instrument into a “maintenance lock” so that it cannot be used while it is out of qualification, (iii) perform the qualification, (iv) assess if the procedure passed or failed, and (v) automatically route the appropriate documents to laboratory personnel.

The automated compliance engine (ACE) is a vendor-neutral procedure that can take any qualification specifications and adapt them for the instrument being tested. The business process manager (BPM) allows metadata from the ACE documents to be analyzed. The BPM can then assess whether the ACE procedure passed or failed and route documents to the appropriate personnel. In the case of a failure, the BPM can also take action to reschedule an ACE procedure or, for example, to alert a metrology group to take further remedial action. Examples of the ACE procedure and various workflows will be demonstrated.

Evaluation of Measurement Uncertainty in Analytical Chemistry: A Software Solution

Wolf-Dieter Beinert, Matthias Roesslein, and Bruno Wampfler

VWR International GmbH, Hilpertstr. 20A, 64295 Darmstadt, Germany

Analysis results are often used to check materials against specifications or statutory limits. To justify such decisions, it is important to have a reliability measure of the results, like measurement uncertainty. Therefore, the new ISO/IEC/EN 17025 requires the laboratories to set up and operate a system for estimating measurement uncertainty. This demands a great effort from the laboratories. The software UncertaintyManager is the ideal solution for this challenge.

UncertaintyManager implements the full evaluation process according to the Eurachem/CITAC guide Quantifying Uncertainty in Analytical Measurement (2nd edition) and fulfills all requirements of the ISO Guide to the Expression of Uncertainty in Measurement (GUM) and its supplements (Monte Carlo method). The new version 3.0 allows the measurement uncertainty to be calculated for procedures including any type of sample preparation. For this purpose, a database of various sample preparation steps was elaborated, taking into account the analyte characteristics and concentration range. Once the operating sequence of the analytical procedure has been set up in UncertaintyManager, the uncertainty sources in sample preparation are covered by a recovery correction.

UncertaintyManager is the first software to support the analyst throughout the entire evaluation of measurement uncertainty and helps to save a tremendous amount of time. The calculation of measurement uncertainty for chromatographic procedures with extensive sample preparation will be presented.
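For readers unfamiliar with the Monte Carlo method of the GUM supplement mentioned above, the sketch below propagates invented input uncertainties through a simple mass-fraction calculation with a recovery correction; it is a generic illustration, not the UncertaintyManager engine.

```python
import numpy as np

rng = np.random.default_rng(3)
N = 200_000
m_found = rng.normal(4.85, 0.05, N)        # analyte found, mg, standard uncertainty 0.05 (invented)
m_sample = rng.normal(0.5012, 0.0003, N)   # sample weight, g, standard uncertainty 0.0003 (invented)
recovery = rng.normal(0.98, 0.01, N)       # recovery correction factor (invented)

c = m_found / (m_sample * recovery)        # result in mg/g, recovery-corrected
lo, hi = np.percentile(c, [2.5, 97.5])
print(f"c = {c.mean():.3f} mg/g, u(c) = {c.std(ddof=1):.3f} mg/g")
print(f"95% coverage interval: {lo:.3f} to {hi:.3f} mg/g")
```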

Online Gas-Free Electrodialytic Eluent Generator for Capillary Ion Chromatography

Purnendu K. Dasgupta

Department of Chemistry, The University of Texas at Arlington, 700 Planetarium Place, Arlington, TX 76019-0065, USA

Both low- and high-pressure gas-free capillary-scale electrodialytic generators (EDGs) for eluents in ion chromatography are described. While the low-pressure devices rely on planar or tubular membranes, the high-pressure devices rely on ion exchange beads used as both one-way ionic gates and ball-on-seat valves to provide sealing. The high-pressure device is easily implemented in the form of a commercial cross-fitting and can withstand at least 1400 psi. By design, these devices do not produce gas in the eluent channel; hence it is not necessary to remove gas afterwards. With appropriate electrolytes and electrode polarities, such devices can produce either acid or base. The behavior of these devices fully corresponds to that of a semiconductor diode in regard to ionic transport. Reverse bias can be applied to prevent Donnan-forbidden leakage or ion exchange. Even with 4 M KOH in the electrode compartments and 4 μL/min of water flowing through the eluent channel, with sufficient reverse bias applied, the product KOH concentration can be maintained as low as 30 μM, whereas the concentration at zero applied voltage, called the open circuit penetration (OCP), is 1600 μM. It is suggested that this OCP occurs not just through Donnan-forbidden leakage but perhaps mainly via ion exchange. Chromatograms and reproducibility data are presented for both isocratic and gradient chromatography, using ion exchange latex-modified open tubular and packed monolithic columns.
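As background, the eluent concentration delivered by an electrodialytic generator is governed by Faraday's law: one mole of hydroxide is delivered per mole of electrons. The sketch below applies that general relationship at the 4 μL/min flow mentioned above; the current values are illustrative and are not taken from the abstract.

```python
FARADAY = 96485.0                                  # C/mol

def koh_concentration_mM(current_uA, flow_uL_min):
    """KOH concentration (mmol/L) generated at a given current and eluent flow rate."""
    mol_per_s = (current_uA * 1e-6) / FARADAY      # one electron per OH- delivered
    litres_per_s = (flow_uL_min * 1e-6) / 60.0
    return 1000.0 * mol_per_s / litres_per_s

# at a 4 uL/min water flow, the eluent strength scales linearly with current
for i_uA in (10, 50, 130):
    print(f"{i_uA:>4} uA -> {koh_concentration_mM(i_uA, 4.0):.1f} mM KOH")
```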

Miniaturized Sample Preparation Approaches for Microseparation Techniques

Hian Kee Lee

Department of Chemistry, National University of Singapore, 3 Science Drive 3, Singapore 117543

We believe it is ironic that conventional sample preparation procedures, such as liquid-liquid extraction and ultrasonication, which consume large volumes of solvent each time they are applied, remain in widespread use ahead of microseparation techniques. Over the past few years, our laboratory has focused on the development of miniaturized and solvent-minimized extraction methods combined with microseparation analysis. Some of these approaches, developed for applications in the environmental and biological fields, will be described. These include single-drop microextraction (also called solvent microextraction or microdrop liquid-phase microextraction (LPME)), hollow-fiber (HF)-protected LPME, polymer-coated HF microextraction, three-phase (liquid-liquid-liquid) HF LPME, solvent-bar microextraction, continuous-flow microextraction, polymer-fiber microextraction, micro-solid-phase extraction, and variants of solid-phase microextraction. All these methods are easy to implement, work well, and generate good analytical data. They are also environmentally friendly and chemically sustainable. They are described in the context of their combination with microseparation techniques such as gas chromatography, liquid chromatography, and capillary electrophoresis for the determination of a wide variety of important analytes.

Interfaces between Real Samples and Microseparation Techniques

Janusz Pawliszyn and In-Yong Eom

Department of Chemistry, University of Waterloo, 200 University Ave. W., Waterloo, ON, Canada N2K 1W7

Microseparation techniques are prone to contamination from the sample matrix when used with real samples. This occurs because the interfacial areas available in these techniques are small compared with those in larger-format separations. Therefore, it is important to design efficient sample preparation approaches that eliminate interferences and other unwanted matrix components. In the talk, techniques developed in our laboratory based on membranes, coated fibres, and tubes will be discussed, along with spot sampling, time-weighted average sampling, and continuous monitoring.

Integrating Automated Data Analysis with Optimized LC-DAD-ELSD-CLND-MS

Kenneth C. Lewis and Joseph D. Simpkins

OpAns, 4134 S. Alston Avenue, Suite 104, Durham, NC 27713, USA

In pharmaceutical R&D, three questions are consistently asked. (1) Is the expected compound present in the sample? (2) How pure is the sample? (3) How much is in the sample? These questions are particularly relevant early in discovery, when hundreds of thousands of samples are being assayed and little is known about each sample. This presentation will describe our approach of developing optimized instrumentation consisting of sub-3 μm particle HPLC, diode array UV detection, evaporative light scattering detection, chemiluminescent nitrogen detection, and mass spectrometric detection. To translate the data from this system into results efficiently, we developed a software platform called “Analytical Studio.” Combining the optimized instrumentation with automated data analysis has enabled our lab to efficiently analyze hundreds of thousands of samples.
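
As a rough illustration of how such multidetector data can be reduced automatically to the three answers (presence, purity, amount), the sketch below checks identity against the expected mass, reports purity as UV area percent, and quantifies from the CLND response, exploiting its approximately equimolar per-nitrogen response. It is a simplified stand-in, not the Analytical Studio implementation, and all field names and numbers are invented.

# Illustrative reduction of LC-DAD-ELSD-CLND-MS peak data to
# presence / purity / amount; all field names and numbers are invented.

def analyze_sample(peaks, expected_mz, n_nitrogens, clnd_resp_per_nmol_N):
    """peaks: list of dicts with 'mz', 'uv_area', and 'clnd_area' per peak."""
    target = [p for p in peaks if abs(p["mz"] - expected_mz) < 0.5]
    present = bool(target)                                   # identity from MS
    total_uv = sum(p["uv_area"] for p in peaks) or 1.0
    purity = 100.0 * sum(p["uv_area"] for p in target) / total_uv  # UV area %
    # CLND responds (nearly) per mole of nitrogen, so the amount follows from
    # the CLND area, the response factor, and the nitrogen count of the target.
    nmol = sum(p["clnd_area"] for p in target) / (clnd_resp_per_nmol_N * n_nitrogens)
    return present, purity, nmol

peaks = [{"mz": 310.2, "uv_area": 9200, "clnd_area": 4100},
         {"mz": 152.1, "uv_area": 480,  "clnd_area": 150}]
print(analyze_sample(peaks, expected_mz=310.2, n_nitrogens=3, clnd_resp_per_nmol_N=85.0))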

Developing Robust Linear and Nonlinear Chemometric Tools for Online Content Uniformity Determination Using Near Infrared Spectrometry

Yusuf Sulub, Joseph Berry, and Busolo Wabuyele

Novartis Pharmaceutical Corporation, One Health Plaza, 401/A221C, East Hanover, NJ 07936, USA

In the past few years, the Food and Drug Administration and the pharmaceutical industry have supported an effort to apply process analytical technology (PAT) tools to improve both the control of the manufacturing process and the product quality. Near infrared (NIR) spectrometry is one of the most widely used techniques for implementing this initiative, specifically for the content uniformity (CU) determination of solid dosage forms.

In this study, the performance of NIR multivariate calibration models generated using partial least-squares (PLS), artificial neural network (ANN), and support vector machine (SVM) regression is compared for online content uniformity (CU) determination of pharmaceutical solid dosage forms. PLS is a well-known quantitative approach that utilizes Beer's law to linearly correlate measured NIR absorbance spectra with reference sample concentrations. However, when dealing with solid samples, where scattering is prominent, the optical path length may vary within a single measurement. This can ultimately cause the relationship between concentration and spectra to become nonlinear, leading to deviations from Beer's law. In such instances, linear multivariate calibration techniques such as PLS are not optimal, and nonlinear methods such as ANNs and SVMs may model the relationship more effectively.
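
A minimal sketch of the linear versus nonlinear calibrations being compared is given below using scikit-learn; the spectra are crude synthetic stand-ins for NIR data and the model settings are illustrative, not those of the study.

# Compare a linear PLS calibration with a nonlinear SVM regression on
# synthetic "NIR" spectra; the data and settings are illustrative only.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.svm import SVR
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
y = rng.uniform(90.0, 110.0, 200)                    # content, % of label claim
wl = np.linspace(1100, 2500, 350)                    # wavelength axis, nm
band = np.exp(-((wl - 1680.0) / 40.0) ** 2)          # a single simulated analyte band
X = np.outer(y / 100.0, band) + 0.01 * rng.normal(size=(200, wl.size))

Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)
for name, model in [("PLS (linear)", PLSRegression(n_components=5)),
                    ("SVM (nonlinear)", SVR(kernel="rbf", C=100.0, gamma="scale"))]:
    model.fit(Xtr, ytr)
    rmsep = np.sqrt(mean_squared_error(yte, np.ravel(model.predict(Xte))))
    print(f"{name}: RMSEP = {rmsep:.2f} % of label claim")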

A Novel Screening Method for Enzyme Activity and Enantioselectivity Using SERRS

Karen Faulds, Duncan Graham, Andrew M. Ingram, and Barry D. Moore

Department of Pure and Applied Chemistry, University of Strathclyde, 295 Cathedral Street, Glasgow, Strathclyde G1 1XL, UK

Measurement of enzyme activity and selectivity at in vivo concentrations is highly desirable in a range of fields including diagnostics, functional proteomics, and directed evolution. Surface-enhanced resonance Raman scattering (SERRS) is one of the most sensitive spectroscopic techniques for molecular detection. The approach taken here was to design “masked” enzyme substrates that did not give an SERRS signal when added to silver nanoparticles. The masked substrates were designed to contain an SERRS-active chromophore that was unable to complex to the enhancing metal surface because a substrate was linked through the surface-complexing part of the dye. A range of substrates and masked dyes were synthesized for use with different enzyme classes. This allowed comparison of the masking ability of different substrates and of the abilities of different dyes to act as SERRS enzyme probes. The enhancing surface used was that of silver nanoparticles, which provide excellent surface enhancement of Raman scattering and are easy to use.

The next stage in the analytical development of this approach was to establish the use of the unmasking of the dyes in an assay format. To this end, ELISA assays using an enzyme-conjugated antibody were established for human C-reactive protein. Calibration curves were obtained for standards to establish the analytical robustness of the approach, and clinical samples from patients were also successfully analyzed. In a further advancement of this approach, a multiplexed ELISA using two different SERRS dyes and antigens was successfully developed to demonstrate the versatility and advantages of this technique over conventional ELISAs. This is a significant breakthrough in the use of SERRS for bioanalysis, and it demonstrates the advantages of the technique compared with those routinely used, such as fluorescence.

Automation of Solid Phase Microextraction in High-Throughput Format Using Multiwell Plates and Applications to Therapeutic Drug Monitoring and Drug-Protein Binding Studies

Dajana Vuckovic and Janusz Pawliszyn

University of Waterloo, 200 University Avenue West, Waterloo, ON, Canada N2L 3G1

The automation of solid phase microextraction (SPME) coupled to liquid chromatography-tandem mass spectrometry (LC-MS/MS) has been demonstrated using a 96-well plate format, an SPME multifiber device, and a three-arm robotic system. This automated configuration is capable of performing all steps of the SPME procedure in parallel for all samples, thus drastically increasing sample throughput compared to other SPME-LC approaches. Extensive optimization of the proposed setup was performed, including (1) selection of the best fiber coating, (2) optimization of the fiber coating procedure, (3) selection of the best stainless steel support, (4) examination of the need for fiber preconditioning and/or rinsing, (5) selection of the optimal calibration method, and (6) confirmation of uniform agitation in all wells. The performance of polydimethylsiloxane and bonded silica (C18 and C16-amide) coatings was compared. Both types of coating were shown to withstand more than 50 uses without loss of extraction efficiency. However, due to the slow kinetics of mass transfer for the polydimethylsiloxane coating, bonded silica fibers were selected for further use. The optimized automated SPME-LC-MS/MS platform was subsequently fully validated for the high-throughput analysis of diazepam, nordiazepam, lorazepam, and oxazepam in human whole blood. The proposed method allowed the automated analysis of more than 200 samples per day while achieving excellent accuracy and precision. This represents the highest throughput of any SPME technique proposed to date. Finally, an automated ligand-receptor binding study investigating the binding between diazepam and human serum albumin is reported.
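
For context on the calibration step, the amount an SPME coating extracts at equilibrium follows the standard relation n = Kfs Vf Vs C0 / (Kfs Vf + Vs); the short sketch below evaluates it for invented values. This is general SPME theory rather than a parameter set from the study.

# Standard SPME equilibrium relation: amount extracted by the coating
#   n = Kfs * Vf * Vs * C0 / (Kfs * Vf + Vs)
# The values below are illustrative only.
def spme_amount_extracted(Kfs, Vf_uL, Vs_uL, C0_ng_per_uL):
    return Kfs * Vf_uL * Vs_uL * C0_ng_per_uL / (Kfs * Vf_uL + Vs_uL)

# A small coating volume relative to the sample makes extraction non-exhaustive,
# which is what suits SPME to free-concentration and drug-protein binding studies.
print(f"{spme_amount_extracted(Kfs=500, Vf_uL=0.5, Vs_uL=1000, C0_ng_per_uL=0.1):.1f} ng extracted")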

Validation of a High-Performance Liquid Chromatography-Particle Beam Mass Spectrometry Method with Electron Impact and Glow Discharge Ionization Sources for Botanical Extracts

Joaudimir Castro, R. Kenneth Marcus, and M. V. Balarama Krishna

Chemistry Department, Clemson University, 219 Hunter Laboratories, Clemson, SC 29634, USA

Consumer interest in botanical products as dietary supplements has grown rapidly because of their medicinal properties. For example, Ephedrae herba (Ma-Huang) is used in traditional Chinese medicine to reduce fever and treat cough and asthma. The validation of a high-performance liquid chromatography-particle beam mass spectrometry (HPLC-PB/MS) method with interchangeable electron impact (EI) and glow discharge (GD) ionization sources is described for the analysis of botanical extracts. More specifically, the proposed method was validated for the analysis of the ephedrine alkaloids using the ephedra-containing dietary supplement standard reference materials (SRMs) 3241 Ephedra sinica Stapf Native Extract and 3242 Ephedra sinica Stapf Commercial Extract from NIST.

The ephedrine alkaloids were separated by reversed-phase chromatography using a phenyl column at room temperature. A linear gradient was employed, with the mobile phase composition varying from 5:95 to 20:80 methanol:0.1% trifluoroacetic acid (TFA) in water at a flow rate of 1 mL/min; analysis times were less than 20 minutes, and the separation was monitored by absorbance at 254 nm. A detailed evaluation of the primary controlling parameters for the EI source (electron energy and source block temperature) and the GD source (discharge pressure and current) was performed to determine the optimal operating conditions by monitoring the intensities and the fragmentation patterns of the ephedrine alkaloids. Ephedrine and N-methylephedrine were taken as representatives of the test alkaloids. The eluting analytes are introduced into the particle beam interface to undergo nebulization and desolvation. Subsequently, dry analyte particles reach the ion source, where they undergo vaporization and ionization. Previous studies have shown that the glow discharge ion source provides molecular fragmentation similar to that of electron impact for each eluting compound; therefore, a comparison between the acquired EI and GD mass spectra was feasible. The ephedrine alkaloids present in the SRMs were quantified by the standard addition method. Limits of detection at the single-nanogram level were achieved. It is believed that the HPLC-PB/MS method with EI and GD ion sources is a viable technique for the analysis and characterization of commercial extracts as well as for regulatory compliance and quality control.
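
Because quantification was by standard addition, a brief sketch of that calculation may be helpful: the spiked responses are fit against the added concentration and the line is extrapolated to zero response, the magnitude of the x-intercept giving the concentration in the unspiked sample. The responses below are invented for illustration.

# Standard-addition quantification: fit response vs. added concentration
# and take the magnitude of the x-intercept as the sample concentration.
# The responses below are invented for illustration.
import numpy as np

added = np.array([0.0, 5.0, 10.0, 20.0])         # ng/mL of ephedrine added
signal = np.array([1040., 1580., 2090., 3150.])  # detector response (arbitrary units)

slope, intercept = np.polyfit(added, signal, 1)
c_sample = intercept / slope                     # ng/mL in the unspiked sample
print(f"Sample concentration ~ {c_sample:.1f} ng/mL")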

The Development of an Automated Forced Degradation and Method Validation Instrument

Roy Helmy, Timothy Rhodes, Lina Liu, Wes Schafer, Elise Miller, Jia Zang, Margaret Figus, Bing Mao, Scott Donenfeld, Carlos Perez, and Jim Caverly

Merck & Co., Inc., 126 East Lincoln Avenue, Rahway, NJ 07065-0907, USA

Forced degradation is one of the critical activities during drug development within the pharmaceutical industry. According to the FDA/ICH guideline Q3B(R2), forced degradation studies are used for several purposes: (1) development and validation of stability-indicating methods, to demonstrate the specificity of separation methods as well as to gain insight into the degradation pathways; (2) discernment of degradation products in formulations that are related to the drug substance versus those related to nondrug substances; (3) support of formulation development, manufacturing processes, and packaging.

The FDA guidance on forced degradation is vague with respect to experimental conditions, and much of the detail for the investigations is left up to the pharmaceutical researcher. Even within Merck, forced degradation studies are performed using experimental conditions that vary from scientist to scientist and from site to site. In order to harmonize the procedures of forced degradation, a novel idea was proposed to automate forced degradation studies by the application of a laboratory automation tool.

The automated forced degradation approach has significantly reduced the amount of labor spent on manually performing tests, and it harmonizes the operational procedures of forced degradation throughout Merck Research Laboratories. Ultimately, this technology has become a general platform approach to the method validation and stability testing workflow that has been utilized at remote sites and in other departments throughout Merck. The automated system is user-friendly and is intended to be used as a “walk-up” system that is able to prepare forced degradation and linearity samples, perform online HPLC analysis, and generate reports automatically. The details of the system will be discussed along with a number of case studies demonstrating its use.

Pattern-Recognition Nanopore Sensor for Single-Molecule Detection of Biomolecules

Qitao Zhao, Deqiang Wang, Dilani A. Jayawardhana, and Xiyun Guan

Department of Chemistry and Biochemistry, The University of Texas at Arlington, 700 Planetarium Place, Arlington, TX 76019-0065, USA

Biomolecules play important roles in the physiological activities of most natural systems. In this study, we demonstrated that identification and quantitative determination of biomolecules, typically present at very low concentrations in complex mixtures, could be achieved by combining single-molecule nanopore stochastic sensing with a pattern-recognition technique. In stochastic sensing, individual binding events are detected as transient current blockades with submillisecond time resolution by recording the ionic current driven through a single nanopore at a fixed transmembrane potential. This approach reveals both the concentration and the identity of an analyte: the former from the frequency of occurrence of the binding events, and the latter from its characteristic current signature, typically the dwell time of the analyte coupled with the extent of current block (amplitude) it creates. With an array of protein channels engineered with different functional groups, a pattern-recognition mechanism can additionally be used to differentiate one compound from another. It can also achieve simultaneous detection of the individual components present in a complex mixture, based on the collective responses of the analytes towards the individual nanopores through reversible binding interactions. Our results show that a variety of peptides and DNA molecules, including peptides differing by a single amino acid, can be differentiated using the pattern-recognition nanopore technique. An automated nanopore sensing technique offers potential as a laboratory tool for routine sensing applications.
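
As a schematic of the data treatment described above (identity from the dwell-time/amplitude signature of each event, concentration from event frequency), the sketch below classifies simulated events against reference signatures. The signatures, events, and nearest-signature rule are invented simplifications of the pattern-recognition step.

# Schematic nanopore stochastic-sensing analysis: identity from the
# (dwell time, blockade amplitude) signature, concentration from event
# frequency. All numbers and signatures are invented for illustration.
import numpy as np

signatures = {"peptide A": (0.8, 0.35),    # (dwell time in ms, fractional block)
              "peptide B": (2.5, 0.60)}

events = [(0.7, 0.33), (2.7, 0.62), (0.9, 0.37), (2.4, 0.58)]  # observed events

def identify(event):
    dwell, block = event
    # Assign the event to the nearest reference signature in (log-dwell, block) space.
    return min(signatures, key=lambda k: (np.log(dwell) - np.log(signatures[k][0])) ** 2
                                         + (block - signatures[k][1]) ** 2)

counts = {}
for ev in events:
    name = identify(ev)
    counts[name] = counts.get(name, 0) + 1

recording_s = 120.0
for name, n in counts.items():
    # The event frequency (events per second) scales with analyte concentration.
    print(f"{name}: {n} events, frequency = {n / recording_s:.3f} per second")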