Instrument integration

Published: 1-Aug-2004

Technological advances have led to a proliferation of analytical data that must be managed effectively. Colin Thurston, of Thermo Electron Corporation, discusses how Laboratory Information Management Systems (LIMS) integration can help


Advancing technology in laboratory instrumentation along with higher-throughput processes are generating massive volumes of analytical data - even terabytes per project. The challenge is to automate and integrate laboratory operations and procedures wherever possible in order to manage and process these growing data volumes more effectively. A LIMS (Laboratory Information Management System), like any other information system, becomes more responsive and productive when it is integrated with other systems and applications in the laboratory environment. Integrating LIMS with analytical instruments not only automates laboratory functions but also provides data-sharing benefits that add value to the original investment.

LIMS implementation tends to be a high profile undertaking, and effective instrument integration can provide a positive and visible early milestone. Achieving this integration, however, demands not just specific product functionality but a close partnership between the LIMS vendor and the customer.

Traditionally, instrument integration products were designed for single PC operation and struggle with today's networked operations, leaving a clear need for a more flexible, 'scalable' solution. Alternatives are emerging that employ advanced technology and architecture that provide considerably more efficient and effective integration of LIMS with laboratory instrumentation.

Instrument integration systems acquire data automatically, with the primary goal of eliminating transcription errors and reducing the time needed to manually write or type results. Most systems in the pharmaceutical industry were implemented to ease compliance with regulatory protocols, such as 21 CFR Part 11. Further development has meant that some systems offer instrument connectivity that requires very little coding to install and, more importantly, the fault tolerance necessary for 24-hour production environments.

Systems can usually connect to most instruments that either transmit ASCII output directly or can export data as an ASCII file. In addition, each PC that runs the instrument integration software can connect to multiple instruments, so the number of instruments that can be fed into the LIMS is effectively unlimited.
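Acquiring ASCII output typically means picking the reading out of a line of instrument text. The sketch below illustrates the idea; the record format shown is invented for illustration and is not any particular instrument's protocol.

```python
# Minimal sketch: parse one line of ASCII output from a hypothetical balance
# that emits records like "S S     12.3456 g" (format is illustrative only).
import re

def parse_ascii_result(line: str):
    """Extract a numeric reading and its unit from an ASCII instrument record."""
    match = re.search(r'([-+]?\d+\.\d+)\s*([A-Za-z/%]+)', line)
    if match is None:
        return None  # line did not contain a recognisable reading
    value, unit = match.groups()
    return {"value": float(value), "unit": unit}

print(parse_ascii_result("S S     12.3456 g"))  # {'value': 12.3456, 'unit': 'g'}
```

A real deployment would wrap this in per-instrument configuration rather than a single regular expression, but the principle of turning free-form ASCII into structured values is the same.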

costly specifications

But these types of solutions, often termed 'thick client' solutions by informatics vendors, tend to fall short in a number of areas, particularly in high-throughput laboratory environments. They were not originally designed to handle huge volumes of data from the latest versions of highly sophisticated instruments.

Scalability of hardware can also rapidly become a problem. Each PC dedicated as an instrument workstation needs a full system installation - capable of data acquisition, processing and interchange to the LIMS, while also providing the client interface for the user. This inevitably requires a costly, high specification machine at each workstation. Also, logistical problems may be encountered when integrating instruments 'down the wire', in which case a high specification PC would need to be within the radius of the maximum cable run from each instrument.

In a laboratory with 20-30 instruments spread across a number of locations, providing the administration and security for all workstation PCs can be a major undertaking. Configurations and upgrades of any individual element of the software can require a full installation of the application, including the necessary configuration and subsequent validation.

To cover for events such as hard disk failure, an administrator would require a back-up strategy for each workstation. Maintaining system security and eliminating risk of data loss becomes a significant management challenge - occupying resources that could be more productively allocated elsewhere.

new approaches

A more flexible and scalable instrument integration solution would incorporate an architecture that separates the collection and processing of instrument data into separate software components. In situations where high levels of data processing and modern, reliable networking are required, these software components could be deployed in a distributed configuration. For many global organisations, server-based data processing is the preferable option.

A newer approach to integrating LIMS with instruments places a PC hosting a series of instrument collectors alongside the workstation and instruments. The function of these collectors is configured at the central server, which instructs each to look for either a data file on a network share or hard drive, or a stream of data arriving via an interfacing port.

For example, the server can be configured to direct a collector to listen for a stream from a specific port, or to be ready for a particular stream of data and to stop at the termination of the stream. In addition, configuration of the server controls the scheduling of the transfer of the instrument data from the collector. No processing whatsoever need be carried out at the instrument workstation.
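The stream-collection behaviour described above can be sketched as a small routine that accumulates bytes from a port until a configured termination sequence appears. The port is modelled here as any file-like object, and the terminator (EOT, 0x04) and chunk size are assumptions for illustration.

```python
# Sketch of a stream collector: read bytes from an instrument port (modelled
# as a file-like object) until a terminator sequence arrives, as a central
# server's configuration might instruct.
import io

def collect_stream(port, terminator: bytes = b"\x04", chunk_size: int = 64) -> bytes:
    """Accumulate raw instrument data until the termination byte appears."""
    buffer = bytearray()
    while True:
        chunk = port.read(chunk_size)
        if not chunk:  # connection closed before the terminator arrived
            break
        buffer.extend(chunk)
        if terminator in buffer:
            return bytes(buffer[:buffer.index(terminator)])
    return bytes(buffer)

# Usage: a simulated port delivering one reading followed by EOT (0x04).
simulated_port = io.BytesIO(b"RESULT,pH,7.02\r\n\x04")
print(collect_stream(simulated_port))  # b'RESULT,pH,7.02\r\n'
```

Because the collector only gathers and forwards bytes, no processing need happen at the workstation; parsing can be scheduled and performed at the server.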

As analytical instruments evolve with new technology, collectors may be added to support different types of data output. For instance, access to data via XML (eXtensible Markup Language)-based files is becoming more prevalent within pharmaceutical QA/QC laboratories. Connecting these instruments to the LIMS could be as simple as installing a new collector.
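A collector for XML-based files reduces to walking the document for result elements. The element and attribute names below are invented for illustration; a real instrument's schema would differ.

```python
# Hedged sketch of a collector for XML-based instrument result files.
import xml.etree.ElementTree as ET

def collect_xml_results(xml_text: str):
    """Pull (analyte, value, unit) tuples out of an XML results document."""
    root = ET.fromstring(xml_text)
    return [
        (r.get("analyte"), float(r.get("value")), r.get("unit"))
        for r in root.iter("result")
    ]

# Usage with a hypothetical results document:
sample = """<run instrument="HPLC-01">
  <result analyte="caffeine" value="12.7" unit="mg/L"/>
  <result analyte="aspirin" value="3.4" unit="mg/L"/>
</run>"""
print(collect_xml_results(sample))
```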

An architecture utilising a central server within a dedicated, secure, IT department-controlled strong-room leads to management efficiencies in a number of areas. For example, it provides a single location for updates, for configuration of all parsing scripts (which extract the data from the instrument output) and all mapping scripts (which correlate the data from the instrument to the correct fields in the LIMS).
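The division of labour between parsing scripts and mapping scripts can be made concrete with a short sketch. The record format and LIMS field names here are assumptions for illustration, not any vendor's actual schema.

```python
# Illustrative split between a parsing script (extract values from instrument
# output) and a mapping script (correlate them to LIMS fields).

def parse_instrument_output(raw: str) -> dict:
    """Parsing script: turn a CSV-style instrument record into a dict."""
    sample_id, analyte, value = raw.strip().split(",")
    return {"sample_id": sample_id, "analyte": analyte, "value": float(value)}

# One mapping per instrument type: instrument field name -> LIMS field name
# (the LIMS field names below are hypothetical).
MAPPING = {
    "sample_id": "SAMPLE.ID_NUMERIC",
    "analyte": "TEST.ANALYSIS",
    "value": "RESULT.VALUE",
}

def map_to_lims(parsed: dict, mapping: dict) -> dict:
    """Mapping script: rename parsed fields to their LIMS counterparts."""
    return {mapping[k]: v for k, v in parsed.items()}

record = map_to_lims(parse_instrument_output("S-1042,pH,7.02"), MAPPING)
print(record)
```

Centralising both kinds of script on the server means a format change for one instrument type is edited, validated and deployed once, rather than at every workstation.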

The single central server also enables back-up routines and security to be centrally administered, and makes it possible to build in hardware fault tolerance, redundancy and back-ups at a single location. In a traditional LIMS/instrument integration architecture, all these critical issues would need to be addressed at each workstation individually.

Taking PC theft as an example, in a workstation-based application a full installation of the application would be required on a replacement PC, plus a full configuration specifying where the data comes from and in what format (instrument file, COM-based instrument, etc). Reconfiguration of all the parsing and mapping scripts would also be necessary.

This would either result in significant down time, or require laborious manual transcription of results while waiting for the instrument to be reconnected.

Contrast this scenario against a solution using scalable architecture, where a replacement preconfigured PC can be quickly brought online. Only the collector component needs to be installed, along with any local parsing and mapping configurations, for the system to be up and running again.

Once the server is directed to the new collector or series of collectors, instrument capture can be restored immediately. Furthermore, savings can be made since only low-grade PCs would be required in this new architecture.

Modern LIMS/instrument integration solutions have been designed to address the data management and security requirements of today's laboratory, such as compliance with 21 CFR Part 11. Organisations can configure different roles and groups for individuals inside and outside the lab, assign appropriate functionality and system menu options to groups or specific end-users, and prevent unauthorised users from accessing specific areas of the system.
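The role-and-group model described above can be sketched as a simple lookup from role to permitted menu options. The group names and options here are invented for illustration, not any product's actual configuration.

```python
# Minimal sketch of role-based access to system menu options
# (role and option names are hypothetical).
ROLE_MENUS = {
    "analyst": {"enter_results", "view_samples"},
    "supervisor": {"enter_results", "view_samples", "approve_results"},
    "guest": {"view_samples"},
}

def can_access(role: str, option: str) -> bool:
    """Return True only if the role's group grants this menu option."""
    return option in ROLE_MENUS.get(role, set())

print(can_access("analyst", "approve_results"))  # False
```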

Integrated security, password protection, audit trail and electronic signatures allow multi-user access, while ensuring each piece of information is accessible only by authorised personnel.

The trend for larger LIMS vendors is toward developing and implementing their own instrument integration solutions as alternatives to separately sourcing third-party products. This approach, taken by Thermo Electron Corporation with its flagship LIMS, SampleManager, provides additional value to existing and future installations.

A standard user interface is utilised for all types and brands of instruments, providing consistency throughout the lab and the organisation, reducing training requirements and accelerating implementation. Automating the transfer of results from analytical instrumentation to SampleManager eliminates manual typing of results and its associated transcription errors.

Now, even in very high throughput environments, analytical information from sophisticated instrumentation can be delivered accurately and quickly to laboratory management and other authorised decision-makers, increasing productivity in the laboratory and lowering the cost of ownership for LIMS.
