
Good Manufacturing Practice (GMP) data integrity: a new look at an old topic, part 2

Categories: Compliance matters, Good manufacturing practice

Data integrity is fundamental in a pharmaceutical quality system which ensures that medicines are of the required quality. A robust data governance approach will ensure that data is complete, consistent and accurate, irrespective of the format in which data is generated, used or retained.

This is the second in a series of 3 posts exploring the impact of organisational behaviour and procedures on reliable, consistent and accurate data in medicines manufacture. The first post in this series looked at the impact of organisational behaviour.

Designing systems to assure data quality and integrity

Data charts on a screen

A mature data governance system adopts a ‘quality risk management’ approach across all areas of the quality system. It requires continuous review, proportionate risk-reduction measures, and an understanding of residual risk across the organisation. Although recent high-profile regulatory cases have concerned falsification of analytical data, the collective experience of the MHRA Inspectorate is that data governance issues are not limited to laboratories or computerised systems. There are opportunities to strengthen both the paper and computerised elements of the data lifecycle.

A useful acronym when considering data integrity is ALCOA: data must be attributable, legible (permanent), contemporaneous, original and accurate. The expectations for designing systems which reduce opportunities for data integrity failure are described in more detail in guidance published by MHRA. Simple (and often low-cost) system design can have a significant impact on the success of data governance. Some examples are included below, organised by the ALCOA principles.


Attributable:

The identity of the person completing a record should be unambiguous. The use of aliases or abridged names should only be permitted where these are used consistently and are attributable to an individual. Shared aliases, or IT system log-ins which cannot differentiate between individuals, should not be used.

Legible (permanent):

It should not be possible to modify or recreate data without an audit trail which preserves the original record. It is important not to forget paper records in this context. Blank forms for manual recording of data should also be controlled in a manner which prevents unauthorised re-creation.
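The principle that a correction must never overwrite the original can be sketched in code. The following is a minimal illustration, not any real system's implementation: the class, field and parameter names are all hypothetical, and a production system would also need access controls and tamper-evident storage. It shows an append-only record where every change carries who, when and why, and the original value is always retained.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class AuditedValue:
    """A recorded value whose history is append-only: corrections never overwrite."""
    history: list = field(default_factory=list)

    def record(self, value, user, reason=None):
        # Every entry is appended with who/when/why; nothing is ever deleted.
        self.history.append({
            "value": value,
            "user": user,
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "reason": reason,
        })

    @property
    def current(self):
        # The current value is simply the most recent entry.
        return self.history[-1]["value"] if self.history else None

# A correction preserves the original entry and its metadata:
weight = AuditedValue()
weight.record(10.21, user="analyst1")
weight.record(10.12, user="analyst1", reason="transcription error")
assert weight.current == 10.12
assert weight.history[0]["value"] == 10.21  # original is retained
```

The same design intent applies to paper: a single-line strike-through with initials, date and reason plays the role of the appended history entry.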

A pile of papers and folders
It is important not to forget paper records

Exceptionally, there may be a valid reason to re-create a record, eg where it has been damaged beyond use, or where an error does not enable a GMP compliant correction of the original. This must be managed through the quality system, either by making a ‘true copy’ (verified as being a true replicate of the original), or by re-writing a new copy and retaining the original as evidence. In all cases, this must be approved through the quality system, with QA oversight and justification for the action.

It is generally accepted that correction fluid is not acceptable in GMP areas. However, companies may be unaware that their computerised systems often have ‘data annotation tools’ enabled. These permit changes to data which can alter the appearance of reports, and may not have a visible audit trail. From a practical perspective, this is ‘electronic correction fluid’, and should not be permitted.


Contemporaneous:

System design has a significant impact upon contemporaneous record keeping. The availability of records in the right place at the right time removes the need for staff to use loose scraps of paper, or their memory, to retain information for retrospective completion in the official record.

When inspecting packaging operations, I still find it a common approach for manufacturers to use a single batch packaging record (BPR) for blistering and cartoning of a solid dosage form. However, if the BPR is located in the secondary packing area, it is impossible for staff in the primary packing area to make contemporaneous records, and vice versa. The BPR may also require periodic checks, such as equipment performance. Specifying exact time intervals (eg ‘every 60 minutes’) may result in an incentive for staff to ‘back date’ the time of the check if they were occupied at the exact time the activity was required. The system is encouraging staff to falsify the record, particularly if there is concern that missing an exact time point might lead to disciplinary measures.

Various medicines in blister packs
Splitting the BPR into 2 parts (primary and secondary) encourages the correct behaviour

This can be addressed by 2 simple changes: specifying an acceptable window for completion of the activity (eg ‘every 60 ±5 minutes’), and splitting the BPR into 2 parts (primary and secondary). Together, these encourage the correct behaviour and remove both the opportunity and the incentive to falsify the record.
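The tolerance window is simple arithmetic, and worth making explicit: a check is acceptable if the elapsed time since the previous check falls within the interval plus or minus the tolerance. The function below is an illustrative sketch (the name and parameters are assumptions, not from any MHRA guidance); the key point is that a 63-minute gap passes a ‘60 ±5 minutes’ requirement, where a literal ‘every 60 minutes’ rule would have invited back-dating.

```python
from datetime import datetime, timedelta

def check_is_in_window(previous_check: datetime, this_check: datetime,
                       interval_min: int = 60, tolerance_min: int = 5) -> bool:
    """Return True if this check falls within interval ± tolerance of the last one."""
    elapsed = this_check - previous_check
    return (timedelta(minutes=interval_min - tolerance_min)
            <= elapsed
            <= timedelta(minutes=interval_min + tolerance_min))

last = datetime(2018, 3, 1, 9, 0)
assert check_is_in_window(last, datetime(2018, 3, 1, 10, 3))       # 63 min: acceptable
assert not check_is_in_window(last, datetime(2018, 3, 1, 10, 10))  # 70 min: out of window
```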



Original:

Original records must preserve data accuracy, completeness, content and meaning. Metadata (data about data) is vital to this aim, enabling reconstruction of an activity: who did what, where and when. Certain file formats may not maintain the full metadata record; so-called ‘flat files’ such as .pdf, .doc etc. We may know who created the file, and when, but there may be no information on how, when or by whom the data presented in that document was created, processed or amended. There is therefore an inherently greater data integrity risk with flat files, as they are easier to manipulate and can be deleted as a single record with limited opportunity for detection.


Accurate:

Automated data capture, with the required IT controls, provides greater control over the accuracy of a record. Where automation is not possible or feasible, real-time second-operator verification of quality-critical observed values may be necessary.

Data review must include a review of raw data in its original form. If access to electronic raw data is not possible remotely, this is a good opportunity for the reviewer to escape the confines of their office. Reviewing paper copies or flat file reports of electronic data, even from a validated secure system, is unlikely to enable detection of anomalies. This is because the preparation of reports still requires operator intervention, which can influence what data is reported, and how it is presented.

The final post in this series will look at the recurring problem of ‘trial analysis’, and ways in which organisations within the supply chain can take steps to build confidence and reliance on each other’s data.

Don’t miss the next post: sign up to be notified by email when a new post is published on the Inspectorate blog.

Check out our guidance on good practice for information on the inspection process and staying compliant.

Comments


  1. Comment by Jyoti posted on

How can personnel integrity be instilled and checked? The records may be complete, but whether the tests were actually performed is another question. I witnessed this in one of my clients’ labs.

• Replies to Jyoti

      Comment by David Churchward posted on

      Personnel integrity is a complex issue, influenced by various behavioural factors which may alter over time as a person’s circumstances change. It is therefore important that while organisations should be able to trust the integrity and expertise of their employees, this is no substitute for well-designed data governance systems, robust data checking and senior management oversight.

      Data integrity failure due to inadequate personnel understanding of the impact to product and patient should be addressed by initial and on-going training. This training should also foster the correct behavioural environment within the organisation, encouraging open reporting of errors and non-conformances without fear of retribution.

      In the event of a detected data integrity issue, the root cause(s) of failure may be identified by considering:
      - What was the person’s incentive for falsifying the record?
      - What was the person’s justification for doing so (why did they think that it was acceptable to do it)?
      - How did the opportunity arise which enabled the person to falsify the record?

The situation described, where ‘records are complete’ but testing may not have been performed, should be detectable by checking raw data (including metadata such as audit trails) and material reconciliation. Verification of acceptable data governance at contractor and supplier organisations is an important aspect of data integrity throughout the supply chain; this will be discussed in the third blog of this series. Contract givers should include data integrity verification as part of their external audit programme, and give a clear message about their expectations for data governance.

Personnel are critical to the success of the data governance system; however, personal integrity is very difficult to assess prospectively. Within a robust data governance system, organisations should consider an approach summarised by the proverb “Trust, but verify”.

  2. Comment by Obaid Ali posted on

    Very nicely written and responded. Thanks indeed

  3. Comment by HLongden posted on

    Some excellent observations in here. Thank you for sharing your insights.

In my experience, data annotation tools should be avoided if printed reports are used for review and approval with a pen signature; however, those I have come across cannot affect electronic records or electronic reports used for e-signature. It is important to have strong SOPs around data annotation use, and to include a risk assessment of how reports are leveraged in the quality process. If appropriate, the tools should certainly be disabled.

• Replies to HLongden

      Comment by David Churchward posted on

      Thanks for your comment. Data annotation tools can alter summary reports, whether printed or viewed on screen. It makes sense to also disable annotation tools for fully electronic reporting, as it reduces the opportunity to alter the appearance of data. Just to note: when wet approval signatures are used, data review should always include a check of electronic raw data, even if annotation tools are disabled. This will help to identify any data which has been amended or not reported.

  4. Comment by Subida subash posted on


  5. Comment by Nabin posted on

    very useful discussion sir

  6. Comment by David Phelpz posted on

This is something interesting. Data integrity issues are high risk and not always easily measurable. As electronic data recording and management systems are applied in place of paper systems, controls must be put in place to mitigate any risk of data being manipulated electronically.


  7. Comment by Adi posted on

    Hello Sir, A very good afternoon and a happy new year

Just a quick question. Is there a set of controls which the MHRA has prepared, such that applying all of them would automatically make us compliant with the MHRA data security standards?

    • Replies to Adi>

      Comment by David Churchward posted on

Thanks for your question. In common with most GMP requirements, there is no ‘standard’ set of rules which will ensure data integrity compliance. An organisation's business processes, management culture and procedural / IT control measures all affect data governance. Referring to the MHRA's data integrity guidance (referenced in the data integrity blog) will help you to implement a system which meets the requirements and also works within your business needs.

  8. Comment by aba posted on

    Nice information sir