
ALCOA principles: ‘healthy’ guidelines for data integrity – 4 –

The fourth and penultimate blog in our masterclass on data integrity is about data being Original. In today's digital era, copying (and pasting) documents, pictures and even videos is easier than ever before. Many people have become indifferent to the origin or creator of a shared item, which leaves a lot of room for manipulation. In the pharmaceutical industry, however, establishing whether a research document or dataset is genuine or 'fake' is of vital importance.

Original data include the first or source capture of data or information and all subsequent data required to fully reconstruct the conduct of the lab activity. The requirements for original data include:

  • original data should be reviewed;
  • original data and/or true and verified copies that preserve the content and meaning of the original data should be retained;
  • as such, original records should be complete, enduring and readily retrievable and readable throughout the records retention period.

One implication of these requirements is that graphs and charts visualizing measurements are not, by themselves, accepted by authorities as original data. Before 'going digital', research instruments typically provided lab analysts with printed output by means of a plotter. Ideally, this printout included the original data points on which a particular graph was based, and that requirement still holds today. The same goes for the accompanying metadata: the identity of the analyst, the date and time of capture, the applicable parameters and so on. A plain PDF file, therefore, will not be accepted by supervising authorities.

Common pitfalls

Recording and archiving data present some common pitfalls as well. IT technology moves ahead rapidly, making systems such as MS-DOS and Lotus look like technological dinosaurs. However, these are the kinds of programs in which a lot of your data may originally have been recorded! Maintaining these ancient systems for decades just for this purpose is generally regarded as a strictly theoretical option. Running the old software in emulation mode within a virtualised environment may lend a helping hand here. Migrating the data to more contemporary software, for instance Excel, is a common alternative. To avoid losing data or the applicable algorithms, or rendering the data incomplete, such a migration should be done in a structured fashion, as sketched below.
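
Below is a minimal Python sketch of what a structured, verifiable migration could look like. The file names, CSV format and checks are illustrative assumptions only; a real migration follows a documented protocol that also covers column mapping, units, algorithms and metadata.

# Minimal sketch of a structured, verifiable data migration.
# Assumptions (hypothetical, for illustration): the legacy system was exported
# to "legacy_export.csv" and the target is "migrated_data.csv"; in practice the
# source format, column mapping and acceptance criteria come from the
# migration protocol.
import csv
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Return the SHA-256 checksum of a file, used as an integrity fingerprint."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def migrate(source: Path, target: Path) -> dict:
    # 1. Fingerprint the source before touching it.
    source_hash = sha256_of(source)

    # 2. Copy the records row by row (a real migration would also map columns,
    #    convert units, and carry algorithms and metadata along).
    with source.open(newline="") as src, target.open("w", newline="") as dst:
        writer = csv.writer(dst)
        rows_in = 0
        for row in csv.reader(src):
            writer.writerow(row)
            rows_in += 1

    # 3. Verify completeness: re-read the target and compare row counts.
    with target.open(newline="") as dst:
        rows_out = sum(1 for _ in csv.reader(dst))

    # 4. Return a small migration report that can be reviewed and archived.
    return {
        "source": str(source),
        "source_sha256": source_hash,
        "target": str(target),
        "rows_in": rows_in,
        "rows_out": rows_out,
        "complete": rows_in == rows_out,
    }

if __name__ == "__main__":
    print(migrate(Path("legacy_export.csv"), Path("migrated_data.csv")))

The point of the sketch is not the copying itself but the verification step and the report: every migration should leave evidence that nothing was lost along the way.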

Audit trail review

Losing data is also a risk when data are left on a lab instrument, even if only for a little while. Always transfer data to a central, protected storage location immediately, and don't forget frequent back-ups.
Finally, an authorized person must be given access to all relevant research data (usually via the audit trail) for verification. One of the reviewer's priorities is to check all adjustments to the original data, including the reason for each change. As with written records, every modification must be accompanied by an (electronic) signature. At the same time, the reviewer must have a clear understanding of both the lab processes and the IT system used. That is easier said than done, as it implies the ability to 'read' unappealing log files; a small automated first pass is sketched below.
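
As an illustration, the sketch below shows how a first, automated pass over an audit trail could flag modifications that lack a reason for change or an electronic signature. The log format (timestamp, user, action, record_id, reason, signature) is a hypothetical example; real systems export their own formats, and such a script supports, rather than replaces, review by an authorized person.

# Minimal sketch of an automated first pass over an audit trail.
# The CSV columns assumed here are hypothetical; adapt them to the export
# format of the actual lab system.
import csv
from pathlib import Path

def flag_suspect_entries(audit_trail: Path) -> list[dict]:
    """Return modification entries that lack a reason for change or a signature."""
    flagged = []
    with audit_trail.open(newline="") as fh:
        for entry in csv.DictReader(fh):
            if entry["action"].lower() in {"modify", "delete"}:
                if not entry.get("reason", "").strip() or not entry.get("signature", "").strip():
                    flagged.append(entry)
    return flagged

if __name__ == "__main__":
    for entry in flag_suspect_entries(Path("audit_trail.csv")):
        print(f"{entry['timestamp']} {entry['user']}: {entry['action']} on "
              f"{entry['record_id']} is missing a reason and/or signature")

Whatever flags such a script raises still have to be judged by the reviewer, who remains responsible for understanding both the lab process and the IT system behind the entries.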

There is hope, however: we expect reviewing audit trails to become a lot more intuitive and even partly automated in the years to come. Don't be surprised if Vivenics turns out to be at the origin of such initiatives…
