
https://mhrainspectorate.blog.gov.uk/2017/07/04/glp-inspections-metrics-reflections/

GLP Inspections metrics reflections

Categories: Compliance matters, Good laboratory practice

The collation of the 2015 inspection metrics data has highlighted some common themes, which I am going to illustrate by sharing some examples with you. All of the situations below have been observed within the past 12 months and have resulted in deficiencies (some major) being raised. The aim of sharing these examples is to help organisations identify whether these issues could occur at their own facilities and to allow them to proactively improve compliance.

Documenting decisions

I will start with some thoughts about documenting decisions and follow with an explanation. The majority of people reading this post will be doing so on a computer at work, and you have probably sent a few emails before doing so. How many of those emails would you say captured the true meaning of what you intended, and did you have to rewrite part of the email or add extra detail? So why is this applicable to GLP studies?

I would ask you to imagine a scenario where you need to invalidate an assay for a justifiable scientific reason that is not covered by your acceptance criteria. Has this decision been documented clearly, and could someone independent determine why the assay was invalidated? I have been on a number of inspections this year where decisions to invalidate assays have either not been documented at all or documented simply as “assay invalidated”, and it has not been possible to reconstruct the basis on which the decision was made. This can be particularly problematic with large datasets which contain a lot of steps in assay conduct.

Examples that I have seen in the past year alone include:

  • Systemic poor documentation (sometimes combined with discarded raw data). This has been noted on analytical chemistry systems in which paper was defined as the raw data. On a number of occasions the paper data had been discarded because it was not felt necessary to retain it once the assays had been invalidated. This resulted in major findings being raised.
  • Decisions by analysts to invalidate analytical runs without the awareness of the Study Director. As the Study Director is the single point of control for the study, they need to be aware of all decisions made.
  • Decisions to accept data when acceptance criteria have not been met, with no explanation or justification. This process converts an invalid assay into a valid assay and could be deemed selective reporting of data without adequate explanation. This has resulted in significant findings being raised when identified.
  • Reprocessing of QC-checked data without explanation. In each situation, errors in the original processing and QC were identified, but the original study package did not contain an explanation that would allow this event to be reconstructed.

In the majority of cases the Study Director or Principal Investigator can explain these situations to an inspector, but if they leave the company the ability to reconstruct the study leaves with them.

So why am I raising this point? Because it’s easy to address. Spending a few minutes fully documenting a decision saves the time spent explaining the decision to a QA auditor or an inspector, and the time associated with writing and responding to any audit and inspection deficiencies. It is also worth considering the potential effect of poorly explained decisions; an isolated incident may have minimal effect, but consistently poor documentation could result in the inability to reconstruct a study, leading to the study being out of compliance, which could have a wider impact.

Study Director/Principal Investigator oversight


Moving on, let’s discuss Study Director/Principal Investigator oversight, with a particular focus on the oversight of multisite studies. As we know, the Study Director has responsibility for the conduct of the study, and the Principal Investigator acts on behalf of the Study Director for their delegated phase. So how does the Study Director demonstrate oversight?

If you as the Study Director are involved in performing procedures it is easy to demonstrate oversight, but what if the work is performed outside of your department? Do you go and oversee key procedures within the study (e.g. dosing/test item application), or do you review all the data generated? This needs to be considered, as inspectors will look for evidence of oversight during the conduct of the study. This is particularly important for multisite studies, to demonstrate appropriate communication between the Study Director and Principal Investigator. The primary form of communication is likely to be email; however, when teleconferences are conducted it is important to make notes of the topics discussed, who was involved and any decisions taken. This could be followed up with a summary email if felt necessary.

Email retention

This brings me nicely onto the subject of emails. There are varying levels of retention of emails within study packages, and I have been asked a number of times how much is enough. The study package needs to contain all emails pertinent to the conduct of the study, particularly those around the management of any issues on a study. A common example seen recently is the following:

  • An extension to the stability of an analytical reference standard communicated via email. The email was not filed in the study package, and therefore the dates in the report did not correspond to the archived information.

There have also been isolated incidents of extensions to test item stability (formulated and unformulated) which have resulted in the same situation due to inappropriate documentation.

Escalation processes and management intervention


Now I would like to talk about a couple of examples of deficiencies with a similar theme: a lack of escalation to test facility management for resolution. Over the past year I have identified a number of QA audit reports that have remained open for prolonged periods. The root cause of this has generally been an inability to agree on corrective actions. In each case test facility management were not aware of the situation, so had no oversight to ensure the issue was resolved, and as a result the audit reports had remained open for 6-12 months. In one particular example in the past year, an audit report had remained open for 3 years. I think we can all agree that this is not an appropriate timescale, and a major deficiency was raised as a result. With respect to QA, there should be a process and a timeline for escalation to management if audits are not satisfactorily resolved, to ensure that situations like this do not occur.

So where else might test facility management intervention be needed? Facility management are responsible for ensuring that technically valid SOPs exist within a facility. A number of deficiencies have been raised recently for SOPs which have exceeded their review date by over 12 months. It is inevitable that some SOPs will not be reviewed within the required timeline; management need to be aware of the overdue SOPs so that they can intervene as necessary and ensure that the number of overdue SOPs does not become excessive. A major deficiency was raised at a facility this year because approximately 60% of the required SOP updates had not been performed over a period of 12 months, resulting in a significant number of overdue SOPs and outdated procedures.

Risk-based approach

Lastly, I would just like to give an update on the risk-based QA approach. The GLPMA gave a presentation at the 2015 symposium on a risk-based QA system, and we have been encouraged by the number of facilities adopting this approach. As a result of inspecting a number of risk-based systems over the past 18 months, a number of deficiencies have been identified; however, these will be explored in an upcoming blog post on the risk-based QA system.

Finally

I hope you have found the above useful for your facility. Future posts on interesting inspection findings are planned, and if you would like the GLPMA to expand on any particular areas please contact the GLP mailbox.


