During the seminar "Digitization of processes and business flows", held on 4 April in Cavenago di Brianza, Pier Luigi Agazzi, Senior Consultant and partner at Adeodata, spoke about data governance, proposing practical measures for managing this delicate issue.
Why have a policy?
When we identify a Data Integrity problem, we tend to look for appropriate technical solutions, but this risks underestimating certain factors. The Anglo-Saxon model, by contrast, requires writing a policy that takes into account motivational, behavioral and organizational aspects before proceeding with corrective actions. The policy is, in fact, a general planning document in which all these factors are considered, responsibilities are defined and, only then, activities are planned. The data governance guidelines have Anglo-Saxon origins and follow this model.
The GAMP guidance on Data Integrity indicates the topics to be included in the policy, the general document on data governance.
To ensure data security it is necessary, in fact, to work on the culture and awareness of the staff involved. Here, training is fundamental for understanding how simple, seemingly innocuous everyday actions can actually threaten data security: a password written on a post-it in the office, or on a piece of tape stuck to an autoclave. Technological measures are important, but they are not enough.
Raw, Master & Control Data
Within our systems, if we want to manage data in a sustainable way, we can distinguish several types of data stored on process equipment or laboratory instruments.
First of all, there are the system data: settings related, for example, to the speed, set-up and tuning of the system used, which are not related to the product. Then there are the master data: the data (and instructions) specific to a certain product, for its production or analysis; they include, for example, the production recipe, the analytical methods or the sterilization cycles. Finally, there are the control data: the records relating to the production and analysis activities of a specific production lot.
For example, the master data are fixed and collected in master batch records, which indicate a series of targets to be achieved in the production of a certain batch. Once the batch has been produced, the master batch record is filled in by recording the actual value obtained for each of the master parameters; in this way it becomes a control batch record. So while the master batch record indicates how things should go, the control batch record records how they really went. In assessing the criticality of the data, the control data are the more critical because they indicate the real trend (not the desired one) of a production lot; indeed, they are the ones used for the release of the lot.
The system data, by contrast, are rather static and are set by the system supplier or installer at the start of operations. The master data are fairly stable too, being set once for each product. Both system data and master data are usually subject to Change Control and to periodic checks or internal QA audits, but they do not need to be reviewed before the release of each lot.
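As a minimal sketch (not part of the seminar material), the three data types and their different review regimes could be modeled as follows; the class and field names are illustrative assumptions, not a real GMP system:

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class SystemData:
    """Equipment settings (speed, set-up, tuning): static, product-independent.
    Changed only under Change Control; checked periodically, not per lot."""
    equipment_id: str
    settings: dict  # e.g. {"belt_speed_rpm": 120, "tuning_offset": 0.3}

@dataclass
class MasterData:
    """Product-specific instructions: recipe, analytical method, sterilization
    cycle. Also under Change Control; not reviewed at each lot release."""
    product_id: str
    parameters: dict  # target values, e.g. {"sterilization_temp_C": 121}

@dataclass
class ControlData:
    """Records of what actually happened for one lot: reviewed before release."""
    lot_id: str
    recorded_values: dict = field(default_factory=dict)
    recorded_at: datetime | None = None

def compile_control_record(master: MasterData, lot_id: str, actuals: dict) -> ControlData:
    """Turn the master batch record (how things should go) into a control
    batch record (how they really went) by recording the actual values."""
    missing = set(master.parameters) - set(actuals)
    if missing:
        raise ValueError(f"Actual values missing for parameters: {missing}")
    return ControlData(lot_id=lot_id, recorded_values=actuals, recorded_at=datetime.now())
```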
And the Audit Trail?
The Audit Trail (AT), required by regulation (both Annex 11 of the EU GMPs and 21 CFR Part 11), is essentially the electronic version of a GMP correction. When an electronic datum is corrected, without an Audit Trail there would be no record of the value that was previously written, or of who corrected it, when and why. The AT is therefore classified as metadata, i.e. information that contextualizes the data it refers to. Note that, as an electronic data protection mechanism, the AT does not include the recording of logins and logouts or of the sequence of production activities. Information about who did what (outside of data changes) is recorded in batch records or in lab notebooks. So be careful not to enrich the Audit Trail with extra data that does not belong in it.
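To make the idea of the AT as metadata concrete, here is a hypothetical sketch of what a single Audit Trail entry captures; the structure and names are assumptions for illustration, not a regulatory template:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass(frozen=True)
class AuditTrailEntry:
    """One correction to an electronic record: the metadata that would be
    lost without an Audit Trail. Logins/logouts and the sequence of
    production steps deliberately do NOT belong here."""
    record_id: str        # which data item was corrected
    old_value: str        # the value that was previously written
    new_value: str        # the corrected value
    changed_by: str       # who made the correction
    changed_at: datetime  # when
    reason: str           # why (the required justification)

# Example: an operator corrects a mistyped weight.
entry = AuditTrailEntry(
    record_id="lot-4711/net_weight_kg",
    old_value="12.8",
    new_value="12.3",
    changed_by="m.rossi",
    changed_at=datetime(2019, 4, 4, 10, 15),
    reason="Transcription error",
)
```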
Speaking of Audit Trail review, we are used to reviewing and evaluating GMP corrections on paper records, while the electronic AT is often left ignored in some folder of the system. On the contrary, the AT should be verified in much the same way as GMP corrections.
The previous classification of data therefore becomes important when determining the frequency of Audit Trail review. The AT must be reviewed together with the data it refers to; in other words, the data must be reviewed along with the relevant metadata, including the AT.
When reviewing the control data specific to a lot, the related Audit Trail must also be reviewed; it is not necessary to also review the raw data and the master data with their related AT.
A useful consideration in this regard concerns data acquired automatically. Very often the electronic data relating to a batch, i.e. the control data, are recorded automatically by the system (systems in which data are entered by the operator following field observations are rather few). Typically, automatically acquired data cannot be modified by the operator: think of the temperature during sterilization, the overpressure of a room during processing, or the raw data of an HPLC analysis. Since these data cannot be modified (which should nevertheless be verified during validation), they do not require an Audit Trail or an Audit Trail review. AT review therefore applies only to the parameters that can be modified by the operator during processing, such as the HPLC instrument parameters or the temperature setting of the blister pack plates.
Some systems have mechanisms that highlight (on printouts or on the display) whether changes have been made, and therefore whether the AT must be reviewed. This makes AT verification much easier, since these systems indicate when such verification is necessary or allow the reviewer to select only the Audit Trail records of interest.
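Continuing the AuditTrailEntry sketch above, the review logic described here could look something like the following; the parameter names and record-ID format are assumptions used only to illustrate the principle (review the AT of a lot's operator-modifiable data together with the data itself):

```python
# Hypothetical names throughout; reuses the AuditTrailEntry sketch above.
MODIFIABLE_PARAMETERS = {"hplc_gradient", "blister_plate_temp_C"}  # assumption

def entries_to_review(audit_trail, lot_id):
    """Select only the AT records relevant to releasing this lot: changes to
    operator-modifiable parameters of that lot's control data. Data acquired
    automatically and locked against edits produce no entries here."""
    return [
        e for e in audit_trail
        if e.record_id.startswith(f"{lot_id}/")
        and e.record_id.split("/", 1)[1] in MODIFIABLE_PARAMETERS
    ]
```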
Data security
The first aspect is the management of data security: from protecting systems against accidental data loss to restricting access to authorized users only. For the latter, several measures are available: logging into the system with personal credentials; using user profiles differentiated according to each person's function and degree of responsibility; limiting access to certain system folders. Finally, in the case of open systems (where data travels over the Internet), it is essential that the data are encrypted.
Any combination of these measures can be used to limit access to authorized users only, that is, to users who have been trained in the use of the system. Furthermore, these measures must make it impossible to change the system date and time or to delete data files.
In the case of data that can be modified by the operator, it must be possible to generate an Audit Trail; individual login credentials are therefore required.
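As a closing illustration, here is a minimal sketch of how these measures might combine in software; the profile names and permissions are assumptions for illustration, not taken from the seminar:

```python
# Illustrative sketch: differentiated user profiles with explicit permissions.
PROFILES = {
    "operator":   {"record_data"},
    "supervisor": {"record_data", "modify_data"},
    "admin":      {"record_data", "modify_data", "manage_users"},
    # Note: no profile may change the system clock or delete data files.
}

FORBIDDEN_FOR_EVERYONE = {"set_system_time", "delete_data_files"}

def is_allowed(profile: str, action: str) -> bool:
    """Allow an action only for users with the right profile, and never
    allow clock changes or data-file deletion, for anyone."""
    if action in FORBIDDEN_FOR_EVERYONE:
        return False
    return action in PROFILES.get(profile, set())

assert not is_allowed("admin", "set_system_time")  # nobody can reset the clock
assert is_allowed("supervisor", "modify_data")     # a change triggers an AT entry
```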