Data integrity is fundamental to a pharmaceutical quality system, which ensures that medicines are of the required quality.
A robust data governance approach will ensure that data is complete, consistent and accurate, irrespective of the format in which data is generated, used or retained.
An increased focus on data integrity and governance systems has led to serious consequences for several companies. This is the first of a series of 3 posts which will explore elements of organisational behaviour and system design which can mean the difference between data integrity success and failure.
One of the top global issues reported in the pharmaceutical media over the past 2 years has been data integrity. Regulatory actions resulting from data integrity failures have led to the withdrawal of supply across multiple markets, product recall, and serious reputational damage for those companies concerned. However, this hot topic is not a new requirement, as basic data integrity principles are already described in international good manufacturing practice guidance.
There is a general misconception that data integrity failures only result from acts of deliberate fraud. Yet in the collective experience of my colleagues and me, the majority of issues relate to bad practice, poor organisational behaviour and weak systems, which create opportunities for data to be manipulated. However, there is a way for companies to navigate the troubled waters of data integrity deficiencies: taking some basic behavioural, procedural and technical steps can significantly improve their systems.
Impact of organisational culture: is your company behaving well?
The impact of organisational culture and senior management behaviour on data governance must not be underestimated. Indicators with relevance to data governance provide a measure of the workforce’s understanding and reporting behaviour, combined with the management’s receptiveness to ‘bad news’. Is error or system failure reported as an opportunity for improvement, or is there a mind-set around ‘not wanting to cause trouble’? To remove the incentive to manipulate, re-create or amend data, the managerial response to ‘bad news’ must be fair and consistent, and not based on a fear of consequences.
‘Led from the top; empowered from below’
Organisational culture is not just addressed by senior management putting the right words in a mission statement. I have seen that communicating expectations clearly to staff at all levels in the company, and then living by these principles, is the key to success. Leadership, engagement and empowerment of staff at all levels in the organisation can then combine to identify and deliver systematic data integrity improvements where good practice becomes automatic.
As the philosopher Aristotle observed:
“We are what we repeatedly do. Excellence, then, is not an act but a habit”.
The data lifecycle
With support from the correct organisational culture, the next important element of successful data governance is to understand the data lifecycle. This will enable the implementation of a system which is designed to assure the integrity of data throughout its life, beyond the limitations of data review.
The data lifecycle considers all phases in the life of the data, from initial generation and recording, through processing, use, archiving, retrieval, and (where appropriate) destruction. Failure to address just one element of the data lifecycle will weaken the effectiveness of the measures implemented elsewhere in the system.
Establishing data criticality and inherent integrity risk
In addition to staff training and implementation of data integrity policies, consideration should be given to the organisational (eg procedures) and technical (eg computer system access) controls applied to different areas of the quality system. The degree of effort and resource should be commensurate with data criticality (how it is used) and inherent risk (how it is generated).
Data which relates to critical process control, batch release decisions or long-term stability may have a significant impact on product quality. Other data, while of relevance to the operation of a GMP compliant facility, may be of lower criticality.
The way in which data is generated will influence the inherent data integrity risk. Data may be generated by a paper-based record of a manual observation or, in terms of equipment, a spectrum of simple machines (eg pH meters and balances) through to complex highly-configurable computerised systems (eg HPLC and ERP systems). The inherent risks to data integrity will differ depending upon the degree to which data generated by these systems can be configured, and therefore potentially manipulated.
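The risk-based approach described above can be pictured as a simple matrix: control effort scales with the product of data criticality (how the data is used) and inherent risk (how it is generated). The sketch below is purely illustrative and not an MHRA tool; the scoring values, system names and control examples are assumptions chosen to show the principle, and any real assessment would use the company's own documented risk methodology.

```python
# Illustrative sketch only: control effort commensurate with
# data criticality (how data is used) and inherent integrity
# risk (how data is generated). Scores are assumed values.

# Assumed criticality scores by data use (3 = high impact on product quality)
CRITICALITY = {
    "batch_release": 3,
    "process_control": 3,
    "long_term_stability": 3,
    "facility_log": 1,
}

def inherent_risk(system: str) -> int:
    """Assumed scale: simple calibrated instruments score low;
    user-configurable computerised systems score high."""
    configurable = {"hplc", "erp", "plc_equipment", "uv_spectrophotometer"}
    simple = {"ph_meter", "balance"}
    if system in configurable:
        return 3
    if system in simple:
        return 1
    return 2  # unknown systems: assess individually

def control_effort(use: str, system: str) -> str:
    """Combine the two axes into a notional level of control effort."""
    score = CRITICALITY.get(use, 2) * inherent_risk(system)
    if score >= 6:
        return "high"    # e.g. audit trails, restricted access, second-person review
    if score >= 3:
        return "medium"  # e.g. procedural controls plus periodic review
    return "low"         # e.g. calibration and basic procedural control

print(control_effort("batch_release", "hplc"))      # critical use, configurable system
print(control_effort("facility_log", "ph_meter"))   # low-criticality use, simple machine
```

The point of the sketch is only that a configurable HPLC system generating batch release data sits at the opposite end of the effort scale from a pH meter feeding a facility log, even though both generate GMP data.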
Our inspectorate finds that manufacturers typically focus data integrity and validation resources on large and complex computerised systems, while paying less attention to other systems with apparently lower complexity. Whereas simple machines may only require calibration, the data integrity risk associated with systems linked to user-configurable software (eg PLC-linked production equipment and infra-red / UV spectrophotometers) can be significant, especially where the output can be influenced (modified or discarded) by the user. Without well-designed controls it may be possible to manipulate data or repeat testing to achieve a desired outcome with limited opportunity for detection.
More detailed guidance on data integrity expectations, which builds on the behavioural issues, has been published by MHRA. My next post in this series will look at ways in which systems can be designed to assure data quality and integrity.
Don’t miss the next post: sign up to be notified by email when a new post comes out on the Inspectorate blog.
Check out our guidance on good practice for information on the inspection process and staying compliant.