New FDA guidance on data integrity and cGMP compliance

Published: 26-Sep-2016

In light of 21 CFR Part 11 violations related to the reliability and accuracy of ‘dynamic’ electronic records, this article outlines the security and procedural controls needed to maintain unobscured and reliable analytical data throughout the data lifecycle


Earlier this year, the US Food and Drug Administration (FDA) issued draft guidance for comment, entitled Data Integrity and Compliance with cGMP, applicable to drug evaluation and research, including biologics and relevant veterinary medicines.

This guidance follows the issuance of the MHRA's GMP Data Integrity Definitions and Guidance for Industry in March 2015 and, similarly, is meant to define and clarify the role of data integrity under 21 CFR Parts 210, 211 and 212 (Current Good Manufacturing Practices, cGMPs). It gives careful consideration to the narrow scope and application of 21 CFR Part 11 (Electronic Records; Electronic Signatures), as interpreted by the FDA's Center for Drug Evaluation and Research (CDER), pending a re-examination of Part 11 as it applies to regulated products.

The guidance was issued as a result of ‘increasingly observed cGMP violations involving data integrity during cGMP inspections,’ stating further that ‘these data integrity related cGMP violations have led to numerous regulatory actions, including warning letters, import alerts and consent decrees.’ Once finalised, the guidance will reflect the agency’s current thinking in establishing a data lifecycle approach that is compliant with cGMP requirements.

The FDA became increasingly aware of these data integrity violations during inspections outside the United States, which revealed fraudulent data manipulation. Consequently, the agency is also increasingly looking for specific controls to be put in place to prevent the inadvertent modification or deletion of cGMP data owing to a general lack of control surrounding data generation. Such circumstances arise when generic user accounts are used, operating system environments are uncontrolled, administrative procedural controls are absent, or group privileges are not secured and specified. These situations may give the agency cause to view cGMP data as adulterated.

FDA: expanding global activity, increasing scrutiny

The FDA has taken notice of how the world has become increasingly dependent on India and China for API (active pharmaceutical ingredient) production and, as a consequence, has expanded its global presence and is inspecting foreign API and raw material manufacturers with increasing scrutiny. As part of this effort, the agency has issued data integrity-specific warning letters for non-compliance more frequently, and to more companies under cGMP governance. The agency has also banned imports from API and raw ingredient manufacturers directly linked to so-called ‘repeat data integrity offenders.’

The numbers show how this has become a significant issue. From 2010 to 2012, the FDA cited five drug manufacturers for data integrity violations; since 2013, however, more than 24 data integrity-specific violations have been issued to drug manufacturers, with ten warnings coming in 2015 alone. Most recently, data integrity violations have often occurred during quality control, as highly complex analytical instrumentation, often controlled by a standalone computer or enterprise system, carries an increased risk of data deletion and fraudulent manipulation.

Maintaining data integrity during quality control

Data integrity during the quality control process presents a broad range of problems, as the data must be managed from CMC (chemistry, manufacturing and controls) throughout the product lifecycle, covering the manufacturer’s quality control systems as well as those of contract product and development support companies, including raw material suppliers and contract testing partners. Most data integrity violations cited in the FDA’s more recent warning letters have related to the specific technical controls laid out in 21 CFR Part 11, which should be carefully noted when configuring computerised systems as part of the procurement and validation processes.

The FDA has highlighted the following requirements within the guidance, as they encompass a lifecycle approach to data integrity and ensure the accuracy of the data throughout their lifecycle:

  • 211.68: requiring that ‘backup data are exact and complete’ and ‘secure from alteration, inadvertent erasures or loss’
  • 212.110(b): requiring that data be ‘stored to prevent deterioration or loss’
  • 211.100 and 211.160: requiring that certain activities be ‘documented at the time of performance’ and that laboratory controls be ‘scientifically sound’
  • 211.180: requiring that records be retained as ‘original records, true copies or other accurate reproductions of the original records’
  • 211.188, 211.194 and 212.60(g): requiring ‘complete information, complete data derived from all tests, complete record of all data and complete records of all tests performed.’

Within the guidance, the FDA defines ‘data integrity’ and core principles surrounding compliant data in a question and answer format, and a number of recommendations have been made by the agency to avoid future violations. First, companies must establish technical, administrative and procedural controls to ensure that data are complete, consistent and accurate throughout their lifecycle in accordance with ALCOA requirements (data must be Attributable, Legible, Contemporaneously recorded, Original or a true copy and Accurate). This also applies to metadata, as these often include electronic record-specific audit trails, methods and sequences, as well as instrument configurations, date and time stamps, user IDs, and system administrative and security configurations.

Additionally, audit trails should capture all events associated with data creation, modification and deletion, be contemporaneously time-stamped with the associated auditable action and be attributable. System users should not have the ability to delete or modify audit trails. These audit trails must be reviewed on a periodic basis by those responsible for electronic record review and the items included in an audit trail should be those of relevance to permit reconstruction of the process or activity.
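As a rough illustration of these properties (this sketch is not from the guidance, and the field names are illustrative), an audit trail can be made tamper-evident by chaining each entry to the previous one with a hash, so that deletion or after-the-fact modification of any entry is detectable on review:

```python
import hashlib
import json
from datetime import datetime, timezone

class AuditTrail:
    """Append-only audit trail: each entry is hash-chained to the previous
    one, so deleting or altering an entry breaks the chain and is detectable."""

    def __init__(self):
        self._entries = []
        self._last_hash = "0" * 64  # starting value for the hash chain

    def record(self, user_id, action, record_id, detail=""):
        entry = {
            "timestamp": datetime.now(timezone.utc).isoformat(),  # contemporaneous time stamp
            "user_id": user_id,      # attributable to a unique user
            "action": action,        # e.g. "create", "modify", "delete"
            "record_id": record_id,
            "detail": detail,
            "prev_hash": self._last_hash,
        }
        self._last_hash = hashlib.sha256(
            json.dumps(entry, sort_keys=True).encode()
        ).hexdigest()
        entry["hash"] = self._last_hash
        self._entries.append(entry)
        return entry

    def verify(self):
        """Recompute the hash chain; returns False if any entry was altered."""
        prev = "0" * 64
        for e in self._entries:
            body = {k: v for k, v in e.items() if k != "hash"}
            if body["prev_hash"] != prev:
                return False
            prev = hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()
            ).hexdigest()
            if prev != e["hash"]:
                return False
        return True
```

A reviewer responsible for electronic record review can then call `verify()` during periodic audit trail review to confirm the trail has not been edited.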

Furthermore, data must be backed up (with temporary backup used in disaster recovery procedures) and archived as a true copy of the original data throughout their lifecycle. The data residing in backup and archive locations must be stored securely and should employ the same integrity controls as those existing for ‘raw data’. Data retrieval should be controlled so that data deletion or manipulation cannot be performed during backup and retrieval.
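One minimal way to demonstrate that a backup is a ‘true copy’ is to compare checksums of the source and destination after copying. The sketch below assumes file-based raw data and uses SHA-256 for the comparison; the function name and layout are illustrative, not prescribed by the guidance:

```python
import hashlib
import shutil
from pathlib import Path

def backup_as_true_copy(source: Path, backup_dir: Path) -> Path:
    """Copy a raw data file to the backup location and verify, by checksum
    comparison, that the backup is an exact and complete copy."""
    backup_dir.mkdir(parents=True, exist_ok=True)
    dest = backup_dir / source.name
    shutil.copy2(source, dest)  # copy2 also preserves file timestamps

    def sha256(path: Path) -> str:
        h = hashlib.sha256()
        with path.open("rb") as f:
            for chunk in iter(lambda: f.read(8192), b""):
                h.update(chunk)
        return h.hexdigest()

    if sha256(source) != sha256(dest):
        raise IOError(f"Backup of {source} is not an exact copy")
    return dest
```

Storing the checksum alongside the archive also allows later retrieval to be verified against the original value, supporting the requirement that data cannot be manipulated during backup and retrieval.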

Other requirements provide that cGMP computer systems must be validated for their intended use, including all workflows required by the system, and hierarchical access control must be implemented to restrict access to the system to only those individuals who regularly use it. Privilege roles for each specific system user must be defined and appropriately configured to prevent those who generate and ‘own’ the data from obscuring or deleting any data or metadata. Furthermore, users must provide their own unique login credentials to access the system and/or provide review or authorisation signatures electronically.
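A simple privilege model illustrates the separation the guidance calls for: those who generate data hold no delete rights, and no role can alter the audit trail. The role names and privilege sets below are hypothetical examples, not terms from the regulation:

```python
# Hypothetical privilege configuration: analysts create and read data but
# cannot delete it; only a system administrator (who does not generate
# data) may delete; no role may modify the audit trail.
ROLE_PRIVILEGES = {
    "analyst":  {"create_data", "read_data", "sign_results"},
    "reviewer": {"read_data", "review_audit_trail", "sign_results"},
    "sysadmin": {"read_data", "delete_data", "manage_accounts"},
}

def authorize(role: str, action: str) -> bool:
    """Permit an action only if the role's configured privileges include it."""
    return action in ROLE_PRIVILEGES.get(role, set())
```

In practice each unique user ID maps to exactly one role, so every permitted action remains attributable to an individual rather than a shared or generic account.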

How to comply with the new guidelines

To comply effectively with the FDA guidelines, companies should, at a minimum, assess the current state of their data integrity compliance and remediate computerised system control issues through clearly identified system roles, system validation, identifiable audit trails, and safe and secure backups. It is crucial that system roles are clearly identified, granting access only to those individuals who are authorised to use the system. This is as routine as defining unique user IDs and passwords, and specifying user group privileges.

Any system must be validated and challenged during the qualification process to verify that users cannot go beyond a specific sequence of operation, such as signing electronic data before it is saved. In addition, commercial off-the-shelf software often includes functionality to export data to common, non-compliant applications residing on the instrument computer, making it possible for an analyst to modify results. This possibility must be carefully monitored, as import events may not capture changes to the data file.

To reconstruct data and detect data integrity issues, audit trails are critical. However, not all application software provides a history of user creation, security setting changes and the like. In these cases, it is useful to draw on operating system event logs, which can be secured from modification or deletion and provide general information regarding data creation, modification and deletion. The FDA recommends that these audit trails be reviewed periodically. Although this can be overwhelming to implement, it is often useful to outline audit trail navigation and implement procedural controls for locating data creation, modification and deletion events within system-specific audit trails.

Data produced by standalone systems are often stored on the instrument computer hard drive prior to backup. It is imperative that the data directory be secured against unauthorised modification or deletion by employing operating system user groups with assigned permissions. Companies should have procedural controls governing how they handle data integrity deficiencies, which might include hiring third-party auditors or upgrading the system when applicable. Data integrity training and procedures are also good risk mitigation practices that companies can adopt to reduce the risk of data integrity violations. Specifically, procedural controls outlining how data are securely stored, backed up and handled in a disaster recovery event are recommended, and a naming convention for how data are saved, to prevent them being overwritten or obscured, is desirable.
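The overwrite-prevention aspect of such a naming convention can be as simple as the following sketch, which builds a timestamped file name and refuses to reuse a path that already exists. The naming scheme itself (sample ID plus UTC timestamp) is an assumed example, not one mandated by the guidance:

```python
from datetime import datetime, timezone
from pathlib import Path

def safe_result_path(data_dir: Path, sample_id: str, ext: str = "csv") -> Path:
    """Build a timestamped result file name, adding a counter suffix if the
    path already exists, so earlier results are never silently overwritten."""
    stamp = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%S")
    candidate = data_dir / f"{sample_id}_{stamp}.{ext}"
    counter = 1
    while candidate.exists():
        candidate = data_dir / f"{sample_id}_{stamp}_{counter}.{ext}"
        counter += 1
    return candidate
```

Combined with directory permissions that deny ordinary users delete rights, this ensures each acquisition produces a distinct, retained record.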

New practices: significant impact on data integrity

If implemented, the guidance could have a significant impact on how data integrity issues are handled within companies. Although the agency allows for ‘flexible and risk-based strategies’ to prevent and detect data integrity issues, such detection and management strategies can be costly. Companies should implement a risk-based approach to data integrity, including the classification of ‘static’ and ‘dynamic’ systems and the harmonisation of system application software (when available) to reduce procurement and validation costs.

Effective and frequent audit trail reviews are helpful in detecting data integrity issues; however, firms should have procedures in place to identify and implement corrective and preventive actions once data integrity issues have come to light. The FDA provides the following examples in remediating data integrity issues: ‘hiring a third-party auditor, determining the scope of the problem, implementing a corrective action plan (globally) and removing at all levels individuals responsible for problems from cGMP positions.’

FDA: enforcing its guidelines to ensure best practice

When finalised, the guidance will represent the FDA’s current thinking on regulatory and data integrity compliance. As always, the agency plans to enforce this guidance, along with existing regulatory requirements, via inspections to determine whether cGMP violations involving data integrity have occurred and have been appropriately remedied. Comments, concerns and questions specific to this draft guidance should be directed to the appropriate centre within the FDA — be it CDER, the Center for Biologics Evaluation and Research (CBER) or the Center for Veterinary Medicine (CVM).
