Data Integrity Challenges in the UK Pharmaceutical Sector

Published on 07/04/2022

In his latest article, Yokogawa’s Peter Cusworth discusses the problems with Big Data and the insider threat for data integrity.

Consumer safety is at the forefront of regulators’ minds when they audit a drug manufacturer’s facility. This is especially true in an industry where the lines between science and technology continue to blur, opening the door to several potential errors and oversights. 

Safety is without question the top priority, but checks are also made to ensure consistent and high-quality products. Process errors can be missed if values aren’t monitored across all phases of discovery and development, resulting in sub-par products that not only damage a company’s reputation but may also be unfit for human consumption. Any drug released must adhere to strict regulatory standards and have robustly recorded information to prove results are an accurate reflection of the development process. 

Data integrity is a critical feature of the pharmaceutical industry, yet many organisations continue to breach well-established standards. The US Food and Drug Administration (FDA), for example, sent 194 warning letters to businesses from 2008 to 2018, with the majority issued in the latter part of that period. There are several reasons why violations are on the rise – some are simply down to carelessness; others, however, hint at more serious challenges that have emerged in recent years.

Understanding Data Integrity

Data integrity is defined as the completeness, accuracy and consistency of information, free from unauthorised alteration. Auditors expect all records, including those on paper, to be collected and maintained securely, adhering to the ‘ALCOA’ principles: all data should be attributable, legible, contemporaneous, original and accurate. By extension, ALCOA also implies information should be accessible at all times.

These ideas are straightforward in principle but complicated in practice. With such a complex chain of events and multiple groups working together, data integrity management soon becomes a challenge for even smaller organisations. All employee actions must be noted, along with instrument calibration, compound analysis and quality control points. This creates a larger volume of information that now needs to be collected in a secure and transparent manner.  

The Problem with Big Data  

Thirty years ago, most drug manufacturing facilities would have used a paper-based recorder to monitor temperature and pressure, with a limited number of lines plotted on a chart. Colleagues would then measure this information against a template to ensure processes fell within set parameters. Today’s acquisition systems, however, can gather a vast range of different inputs and outputs, including flow, inlet or outlet conditions, packing and other critical environmental data. While these larger datasets are valuable for research purposes, monitoring more variables also means there is a higher chance of data falling out of specification. Consequently, manufacturers now have to be sure all this extra instrumentation is calibrated correctly; otherwise, large quantities of product may need to be disposed of.
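The out-of-specification risk described above can be illustrated with a minimal sketch. The tag names and limit values here are hypothetical, not from any particular acquisition system: each monitored variable is checked against its specification window, and any failing tags are flagged for review.

```python
# Minimal sketch (hypothetical tag names and limits): flag acquisition
# readings that fall outside their specification limits.

SPEC_LIMITS = {  # assumed per-tag (low, high) specification windows
    "temperature_c": (18.0, 25.0),
    "pressure_kpa": (95.0, 105.0),
    "flow_lpm": (1.0, 4.0),
}

def out_of_spec(readings: dict) -> list:
    """Return the tags whose values fall outside their spec limits."""
    failures = []
    for tag, value in readings.items():
        low, high = SPEC_LIMITS[tag]
        if not (low <= value <= high):
            failures.append(tag)
    return failures

batch = {"temperature_c": 26.3, "pressure_kpa": 101.2, "flow_lpm": 2.5}
print(out_of_spec(batch))  # → ['temperature_c']
```

The point of the sketch is scale: with three tags this check is trivial, but with hundreds of inputs per batch, every added tag is another limit to calibrate and another opportunity for data to drift out of specification.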

Long, unstructured data sets also create added pressures for the audit trail and those responsible for them. At one time, these records were designed around significant changes, such as a colleague signing in to begin a batch or someone changing an alarm. Today, however, every action must be recorded, no matter how inconsequential, including the transfer of files across an organisation’s internal network. These small additions mean it now takes much longer to check a batch, increasing labour costs and the chances of disruption should an error be found.
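The shape of such an audit trail can be sketched as follows. The field names are illustrative, not a specific vendor format: every action, however routine, is recorded with who did it and when, and each entry carries the hash of the previous one so that later tampering is detectable.

```python
# Hedged sketch of an append-only, hash-chained audit-trail entry.
# Field names and the chaining scheme are assumptions for illustration.

import hashlib
import json
from datetime import datetime, timezone

def audit_entry(user: str, action: str, prev_hash: str) -> dict:
    """Build one audit record linked to the previous entry's hash."""
    entry = {
        "user": user,                                      # attributable
        "action": action,
        "timestamp": datetime.now(timezone.utc).isoformat(),  # contemporaneous
        "prev_hash": prev_hash,  # chains entries so edits break the chain
    }
    payload = json.dumps(entry, sort_keys=True).encode()
    entry["hash"] = hashlib.sha256(payload).hexdigest()
    return entry

trail = [audit_entry("operator_1", "batch_start", prev_hash="0" * 64)]
trail.append(audit_entry("operator_2", "alarm_limit_changed",
                         prev_hash=trail[-1]["hash"]))
```

Even this toy version shows why review time grows: a checker cannot skip the "inconsequential" entries, because each one is a link in the chain.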

Insider Threat 

Staff are often a facility’s weakest point yet also crucial for managing data integrity. Human error is inevitable in most organised structures but is especially common in the pharmaceutical sector due to the pressure businesses are under to develop and scale products for the market. Automation can minimise the inconsistencies that result in an ineffective or unsafe drug, but these systems cannot guarantee compliance, not least because the information collected has to be validated by someone. On the other hand, staff also need to actively check data even when nothing has gone wrong, as there is a tendency to investigate only once a fault has been flagged.

Another common challenge is access permissions. Organisations that blindly adopt technology can find systems with default administrator accounts used for many different processes across production. One employee in one part of the business, for example, may be using the same login details as someone working in another. This makes it very difficult to determine where errors have occurred and who is responsible. Worse still, it offers no protection against disgruntled staff or others acting with malicious intent – a key issue identified by Deloitte in its report on data integrity in life sciences. The solution here is to split privileges between IT and OT networks so no single person can change processes beyond their remit, a strategy identified in 21 CFR Part 11 issued by the FDA. 
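The alternative to shared administrator logins can be sketched in a few lines. The role and action names below are assumptions for illustration: each employee has a unique account mapped to one role, and an action is permitted only if that role covers it, so every change is traceable to a person and no single account spans both networks.

```python
# Hedged sketch of per-user role separation (role and action names are
# hypothetical). Unique logins replace shared default admin accounts,
# so errors and changes can be attributed to an individual.

ROLE_PERMISSIONS = {
    "operator": {"batch_start", "batch_stop"},
    "qa": {"record_review", "batch_release"},
    "engineer": {"alarm_limit_change", "recipe_edit"},
}

# One account per employee, one role per account.
USERS = {"asmith": "operator", "bjones": "qa"}

def is_allowed(user: str, action: str) -> bool:
    """Permit an action only if the user's role covers it."""
    role = USERS.get(user)
    return role is not None and action in ROLE_PERMISSIONS.get(role, set())

print(is_allowed("asmith", "batch_start"))         # → True
print(is_allowed("asmith", "alarm_limit_change"))  # → False
```

Splitting the permission sets this way means a disgruntled operator cannot quietly edit alarm limits, and an auditor can tie any recorded action back to exactly one login.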

Technical Challenges 

Engineering systems and ensuring they’re compliant can often cost more than the specialist instruments used to create products. Manufacturers need to know their programmable logic controllers are working correctly, but they also need to be sure the information collected by a SCADA system is accurate and reliable. This issue only gets more complex when businesses opt for proprietary systems in a high GAMP software category, as these require extensive validation work that can take months. Worse still, the entire procedure will need repeating if problems are found or new functions are added later, making it harder to compete in an already cut-throat industry.

The cost of meeting data integrity requirements is considerable, and for some, the risk of penalties will be offset by not investing in new equipment. Yet this attitude has been shown to have far-reaching complications, hitting share prices and big-name reputations. Those more risk-averse may opt for cheaper systems, though these can prove a false economy when ongoing validation work drives up the overall cost. 

So-called ‘off-the-shelf’ packages seem to offer the best of both worlds, allowing a business to quickly scale up without losing sight of its data. Though no system is ever infallible, and data integrity has as much to do with culture as it does with technology, products of this type offer a cost-effective route to compliance. This is invaluable at a time when there is a marked rise in the number of health authority enforcement actions – some of which have the potential to close doors permanently.

 
