This is the third and final summary of ICAR '96/Lab Automation, authored by Peter Grandsard and Robin Felder.
Automation to Enable Biotechnology Research by Steve Hamilton
Steve Hamilton described automation projects in progress at the Thousand Oaks and Boulder facilities of Amgen Inc. His presentation gave the audience an inside look at some of the systems currently being developed by Amgen. They are constructing vision systems for cell colony recognition and counting to guide robotic devices for bacterial colony picking. They are also developing automated enzyme-linked immunosorbent assays (ELISA) for molecular screening.
Modular Error/Exception Management and Machine Vision applied to Automated Sample Preparation by Peter Grandsard
The Consortium on Automated Analytical Laboratory Systems (CAALS) is developing a software algorithm that systematically manages unexpected events that might occur in an automated analytical procedure. This novel system, called the error and exception management module (EMM), may be equipped with a variety of sensor technologies to detect exceptions, including machine vision. Peter Grandsard addressed the following subjects: the compliance of the EMM with the CAALS specifications, the management functions and commands, the EMM operation schemes, the detectable external events, the classification of all these events, and the manner in which the EMM chooses an appropriate restoration procedure for a classified event. No specific examples of fully automated process recovery mechanisms were given, since the program is still in early development.
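The classify-then-restore idea behind the EMM can be sketched in a few lines. The event classes, rules, and restoration names below are purely illustrative assumptions, not taken from the CAALS specifications:

```python
# Hypothetical sketch of an error/exception management module (EMM):
# a detected event is classified, and a restoration procedure is chosen
# from a lookup table. All names here are illustrative, not CAALS terms.

RESTORATIONS = {
    "recoverable": "retry_step",         # e.g. re-grip a dropped tube
    "degraded":    "flag_and_continue",  # e.g. log a low-volume warning
    "fatal":       "halt_and_alert",     # e.g. stop and page the operator
}

def classify(event):
    """Classify a detected event by severity (illustrative rules only)."""
    if event.get("severity", 0) >= 8:
        return "fatal"
    if event.get("retryable", False):
        return "recoverable"
    return "degraded"

def choose_restoration(event):
    """Map a classified event to a restoration procedure."""
    return RESTORATIONS[classify(event)]

print(choose_restoration({"severity": 2, "retryable": True}))  # retry_step
```

In a real EMM the classification rules would be driven by the sensor inputs (machine vision, load cells, and so on) rather than hand-set fields.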
Can you Trust your Electronic Data? by Teri Stokes
Teri Stokes discussed the essentials of software validation. Automated systems that generate or process data should perform in a credible, reliable, and auditable fashion. Do the data make sense? Does a system do what it is supposed to do? Are the data handling procedures well documented? Different guidelines and government regulations deal with the quality of electronic data, but one can also establish internal policies to start building trust in the electronic data. Companies should allocate resources and assign responsibility for computer/data validation activities. It is recommended to define a lifecycle for each data handling system, along with scripts to test it. Trust is documented evidence.
Modular Approach to Automated Organic Synthesis using DIVERSOMER Technology by Eleanora Hogan
Recent advances in high-throughput screening (HTS) have made the synthesis of new compounds the rate-limiting step. The DIVERSOMER technology involves the parallel synthesis of up to 40 structurally related compounds in one proprietary reactor module. Milligram quantities of drugs can be produced with each run. Essential peripherals to the DIVERSOMER equipment are a liquid handling robot and a Laboratory Information Management System (LIMS). The robotic system adds the reagents while the LIMS manages the large quantity of generated data. The LIMS also allows the data to link the results of the compound testing to initial reaction settings. Eleanora Hogan's presentation concluded with a video showing automated combinatorial chemistry with DIVERSOMER technology.
HTS: A Decentralized Approach by Paul Domanico
Paul Domanico talked about the management philosophy and strategies that exist at Glaxo-Wellcome USA. High throughput screening is a decentralized activity, but is supported by a core group that is responsible for assay development and automated HTS. One important management target is to increase productivity and efficiency of scientists (explorers or developers), instrumentation, and procedures. The implementation of liquid handling devices that can deliver many small volumes simultaneously should augment efficiency, as will sensitive and fast detection systems, and “cradle-to-grave” data processing. Another management goal is to validate instrumentation and novel technologies. To do so, the types of samples, liquids, detection systems, assays, schedulers, and data all have to be addressed.
Automated Compound Storage and Retrieval Systems: Providing a Backbone for HTS Programs by Tim Bruemmer, Vice-President R&D of Sagian Inc.
Faster HTS requires better data handling as well as the preparation and storage of more screenable compounds. Storage systems differ in physical space, temperature, air handling capability, or type of chemical exposure. The Sagian compound storage and retrieval system, called StoRetrieve™, consists of one articulating robotic arm to retrieve compound storage containers from storage cabinets, and a second robotic arm to pick compound samples from their containers and place them in microtiter plate wells or other appropriate vessels. The articulating arm is mounted on a rail to increase its working envelope. The software is Windows-based and controls the hardware, the library access, and the compound loading operation. Tim Bruemmer gave a short tour through the different software windows. The system can handle up to 180 requests and about 120 restores per hour. A “restore” means that the compound is returned to a cabinet drawer.
Selection Criteria and Implementation Strategies for Laboratory Robotics Applications by James Little
According to Jim Little, laboratory automation is no longer a scientific curiosity. It has become a business necessity. Lab automation is at the intersection of sample automation and data automation. Among the critical factors in the success of implementing automation is the commitment of the user organization to automation. The user organization should know the procedure to be automated well and should assign a champion to the project. Another critical factor is the state of automation technology. Sample automation extremes involve either standard workstations with little flexibility or custom systems with high flexibility. The degree of required flexibility should be guided by the application need. Drug discovery research is likely to use highly flexible custom systems, whereas quality control of established procedures probably requires standard workstations. Finally, automation complexity (and hence cost) should be linked to the required automation benefit. Since the value of any business operation is the ratio between benefit and cost, one should be careful not to overinvest in complexity.
The Use of Multivariate Analytical Methods in the Design of Informative Chemical Libraries by Torbjörn Lundstedt
In general, Principal Component Analysis (PCA) is a multivariate method used to identify the smallest number of factors that account for the largest amount of variance. Specifically, PCA can be used to design a combinatorial chemical library containing a small number of molecules with a high chemical diversity. Imagine the reaction of 100 similar A-components with 100 similar B-components to yield a theoretical 10,000 C-components.
Using PCA, one can downscale this complex reaction system to a combination of, say, 10 A-components with 10 B-components, which gives a library of 100 different C-components. This much smaller system can be considered as a good representation of the functionalities of the theoretical 10,000 components.
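As a rough illustration of the downscaling idea (not Lundstedt's actual procedure), one could describe each A-component by a few physicochemical descriptors, project the components onto the first principal components, and pick a well-spread subset in the score space. The descriptor matrix below is synthetic, and the greedy max-min selection is one possible diversity heuristic among many:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical descriptor matrix: 100 A-components, 5 physicochemical
# descriptors each (e.g. logP, molar mass, ...). Purely synthetic data.
X = rng.normal(size=(100, 5))

# PCA via SVD of the mean-centered data.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T  # project onto the first two principal components

# Greedy max-min selection: pick 10 components spread across score space.
chosen = [int(np.argmax(np.linalg.norm(scores, axis=1)))]
while len(chosen) < 10:
    d = np.min(np.linalg.norm(scores[:, None] - scores[chosen], axis=2),
               axis=1)
    chosen.append(int(np.argmax(d)))

print(sorted(chosen))  # indices of 10 diverse A-components
```

Repeating the same selection for the B-components would give the 10 × 10 = 100 C-component library described above.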
An Automated System for High-Throughput Screening of Large Genomic DNA Libraries by Mark Russo
Mark Russo reported on a robotic HTS system that is used by Recombinant BioCatalysis, Inc. to identify novel enzymes from large genomic DNA libraries created from organisms that flourish in extreme environments. There is definitely a commercial potential for enzymes with optimal activity at, for example, 100°C. Enzyme assays involve a wide range of operational parameters and operations. The system automatically generates its assay scripts. The scheduler was created in-house. Other distinguishing features of the library screening system include the incubation and storage of large numbers of microtiter plates at temperatures between −4°C and 95°C.
The Development and Implementation of a Low Cost Robot System for the Determination of Dry Matter in an Environmental Laboratory by Ad Snijders
The presentation began with information on the current Dutch environmental legislation and the TAUW Milieu Laboratory. The company has begun to automate some of their routine environmental analyses, such as the determination of dry matter. This particular procedure currently uses a single motor XYZ robot to handle the sample containers. Validation of this robotization was presented. A video on the automated procedure was shown. The company has recently begun automating another analysis, the determination of loss on ignition.
Various Workstation Approaches to Automating Organic Synthesis by James Harness
James Harness discussed two different combinatorial chemistry workstations. The primary components of the first reaction workstation are a liquid handling unit and a 40-position reaction block with heating and cooling fixtures. Many parts are made of chemically inert materials such as Kel-F. However, the 5 ml to 10 ml reaction containers are glass. A second type of reaction workstation uses microtiter plate wells as reactors. The plates are in contact with a heating system and are placed in an argon atmosphere. They can also be shaken.
Dynamic Scheduling for Random Feeding of Samples and Process Recovery in a Robotic Environment by Alain Donzel
The purpose of scheduling is to minimize operation time and use of resources. Unlike static schedulers, a dynamic scheduler can rearrange the current schedule to process new samples, new procedures, or to let the process manager act on unexpected events during an on-going automated procedure. The process manager communicates with the dynamic scheduler, the user, and the robotic system. Lab operators can even enter their schedules into Scitec's scheduler CLARA so that the process manager knows when or if human intervention is possible. Alain Donzel stated that most process recovery procedures end up being human interventions.
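The rescheduling idea can be illustrated with a toy priority-queue scheduler (this is a sketch, not Scitec's CLARA): pending steps are ordered by their earliest allowed start time, and newly arriving samples or exception-recovery steps can be inserted into the live schedule at any point:

```python
import heapq

# Toy dynamic scheduler sketch (not CLARA): pending steps live in a
# priority queue ordered by start time; new samples or recovery steps
# can be inserted while the schedule is running.

class DynamicScheduler:
    def __init__(self):
        self._queue = []  # (start_time, step name)

    def add_step(self, start_time, step):
        """Insert a step; it may come from a new sample or a recovery action."""
        heapq.heappush(self._queue, (start_time, step))

    def next_step(self):
        """Pop the next step to execute."""
        return heapq.heappop(self._queue)[1]

sched = DynamicScheduler()
sched.add_step(10, "incubate sample 1")
sched.add_step(5, "pipette sample 2")       # arrives later, runs first
sched.add_step(0, "operator intervention")  # recovery step jumps the queue
print(sched.next_step())  # operator intervention
```

A static scheduler, by contrast, would have to finish or abort the whole precomputed sequence before accepting the new work.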
Automation of Combinatorial Chemistry on Solid Phase by John Cargill of Ontogen Corp., Carlsbad, CA
John Cargill first presented a modular approach to solid phase combinatorial chemistry. The central piece of hardware in this approach is the OntoBlock, a unit consisting of 96 reaction vessels. The 2 ml vessels are made of a polymeric material. The chambers contain beads on which the synthesis takes place. Temperature and pressure are controlled: between −80°C and 100°C and up to 30 psi, respectively. Each block can interact with a series of universal docking stations, such as a synthesis station, an agitation station, a washing station, and a station where the reaction compounds are brought into microtiter plate wells. From there, the test compounds are stored, characterized, or submitted to HTS.
Characterization is done by high-speed electrospray mass spectrometry. This analysis takes only 15 seconds per sample, and the software automatically confirms the expected ions. The synthesis station uses reagent dispensers configured with coaxial needles to displace reagents by pressurization with inert gas. The modular software was written for library planning, reagent inventory, generation of chemical structures, compound registration, HTS, and data analysis.
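Automated confirmation of expected ions amounts to comparing observed m/z values against the mass predicted for the synthesis product. A minimal sketch, assuming singly protonated [M+H]+ ions and a 0.5 Da tolerance (these details are illustrative; the Ontogen software was not described at this level):

```python
# Sketch of automated expected-ion confirmation for electrospray MS.
# Assumes singly protonated [M+H]+ ions and a 0.5 Da tolerance; both
# are illustrative assumptions, not details from the presentation.

PROTON = 1.00728  # mass of a proton in Da

def confirm(expected_neutral_mass, observed_mz, tolerance=0.5):
    """Return True if any observed m/z matches the expected [M+H]+ ion."""
    expected_mz = expected_neutral_mass + PROTON
    return any(abs(mz - expected_mz) <= tolerance for mz in observed_mz)

spectrum = [150.2, 301.19, 455.8]  # observed m/z values (synthetic)
print(confirm(300.19, spectrum))   # True: 301.19 ≈ 300.19 + 1.007
```

Real ESI-MS software would also check multiply charged ions and common adducts, but the matching logic is the same per-ion comparison.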
A second aspect of automated solid-phase combinatorial synthesis is the encoding of the substrate. This allows easy recovery of particular synthesis products. Initially, Ontogen attached decodable chemical tags onto the beads, on the opposite side of the product/reagent molecules. However, this technique has several limitations. The attachment of chemical tags and the decoding are both inefficient processes. In addition, the tags may react with the other attached molecules. Ontogen currently uses a coding system based on radio-frequency (RF) encodable microchips. Each chip is pretuned to emit a unique binary code when pulsed with electromagnetic radiation. RF transponders are then coupled to synthesis sites on the substrate resin, giving each site a unique code. This talk won the TNO–Scitec award for best oral presentation.
Developing Performance Metrics for Automated Systems by Robert McDowall
Performance metrics for automated systems are a very new discipline. They are used to assess whether the expenditure of time, money, and other resources has any likelihood of payback for the organization. So far, only one paper has been published on the subject. According to Bob McDowall, the subject would perhaps receive more attention if it had the same impact as factory automation or occurred at the same scale. Performance metrics involve the following activities: the transformation of critical success factors into quantitative performance parameters, the definition of minimum acceptable performance levels before the start of the project, and the measurement of vendor deliverables.
On-Line Receptor Screening in Drug Discovery by Hubertus Irth
The presentation dealt with the integration of separation and detection technology for drug screening. A compound separating Liquid Chromatography (LC) system is directly coupled to a continuous-flow fluoro-receptor assay for the screening of estrogenic ligands. The steroid binding domain of the human estrogen receptor is continuously added to the effluent of the LC column.
A fluorescent estrogen is added next to monitor the binding of the separated estrogenic ligands to the receptor. A fluorescence detector is placed downstream. One has to make sure that the assay does not take too long, since long assays may result in ligand remixing due to extra band broadening inside the flow system.
COMMERCIAL DISCLAIMER
Certain commercial equipment, instruments, and materials are identified in this review to summarize adequately the presented papers. Such identification does not imply recommendation or endorsement by NIST or the ALA, nor does it imply that the equipment, instruments, or materials are necessarily the best available for the purpose.