Category Archives: Computer Systems Validation

Computer System Lifecycle: Considerations

Hello good people of the world! Today’s post is an overview of the lifecycle of a computer system in regulated industries (pharmaceutical, biologics, and medical device manufacturing). It is based on my almost 20 years’ experience in the field, and provides a high-level view of the things you should be thinking about with respect to computer systems.

1. Requirements Elicitation
Before anything else, we must understand the requirements driving the need for a computer system. This may be the hardest step, and it is definitely the most important. Spending extra time and expertise here can prevent a lot of problems downstream. Know who your stakeholders are, and elicit their requirements better than they could state them themselves. You will often (always?) find that end-users do not know what their requirements are, or cannot communicate them effectively. Understand the business process you are trying to automate, and ensure you are talking to all stakeholders: end-users, administrators, managers, etc.

2. Supplier Selection
Once you know what your requirements are, you’re going to find someone to help you meet them. At this stage it is important to vet your suppliers to make sure they have the experience and expertise required to deliver a solution that meets your requirements. Supplier selection in regulated industries includes a Supplier Audit.

3. Planning
Now is the time to document your plan. Traditionally this plan is called a Validation Plan, or Validation Project Plan, or Project Validation Plan, but I like calling it a Quality Assurance Plan. Validation is one (big) part of ensuring the quality of the delivery of your new computer system, but your planning needs to encompass all quality considerations. Your plan should include:
– change management procedure to be followed in implementing the system
– updates to your system inventory
– business continuity impact assessment
– risk management considerations (project risk, compliance risk, etc.)
– data privacy considerations
– HR data access considerations
– GxP regulatory determination (e.g., System-Level Impact Assessment (SLIA))
– plan for leveraging supplier documentation
– test strategy for backup, archive/retrieval, restore
– system release strategy
– data migration strategy (if applicable)
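The plan contents above can be tracked as a simple deliverables checklist to be closed out with the plan report. A minimal sketch (the deliverable names and statuses are illustrative, not prescribed by any regulation):

```python
# A minimal sketch of tracking Quality Assurance Plan deliverables.
# Names follow the list above; statuses are illustrative.
QA_PLAN_DELIVERABLES = {
    "change_management_procedure": "approved",
    "system_inventory_update": "in_progress",
    "business_continuity_impact_assessment": "not_started",
    "risk_management_assessment": "approved",
    "data_privacy_assessment": "approved",
    "hr_data_access_assessment": "approved",
    "gxp_determination_slia": "approved",
    "supplier_documentation_leverage_plan": "in_progress",
    "backup_restore_test_strategy": "not_started",
    "system_release_strategy": "not_started",
    "data_migration_strategy": "n/a",  # only if applicable
}

def open_items(plan: dict) -> list:
    """Return deliverables that still need work before plan closure."""
    return [name for name, status in plan.items()
            if status not in ("approved", "n/a")]
```

At plan closure, anything returned by `open_items` is either completed or explained as a deviation in the closing report.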

4. Specifications
Once your plan is in place (and agreed upon by all stakeholders), it is time to specify how the system will meet requirements. Depending on the type of system, risk, GxP impact, etc., deliverables of this stage may include a Functional Specification (FS), Functional Risk Assessment (FRA), Design Specification (DS), Configuration Specification (CS), and Requirements Trace Matrix (RTM).
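The core job of an RTM is forward traceability: every requirement links to the specification section and test case that cover it, and any gap is visible. A minimal sketch (IDs and section numbers are illustrative):

```python
# A minimal sketch of a Requirements Trace Matrix (RTM): each requirement
# is traced forward to the spec section and test case that cover it.
rtm = [
    {"req": "URS-001", "spec": "FS 4.1", "test": "OQ-010"},
    {"req": "URS-002", "spec": "FS 4.2", "test": "OQ-011"},
    {"req": "URS-003", "spec": None,     "test": None},  # not yet traced
]

def untraced(matrix):
    """Requirements missing a spec or test link -- gaps the RTM should expose."""
    return [row["req"] for row in matrix
            if row["spec"] is None or row["test"] is None]
```

The RTM is updated again at verification and release, so a check like `untraced` can be re-run at each stage gate.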

5. Verification
Now that your system is specified, it’s time to test it. Again, depending on the project scope, this stage could include an overall Test Plan, draft SOPs (admin, operating, end-user, etc.), protocol generation, RTM update, business continuity and disaster recovery plans, functional testing, training curricula development, and user-acceptance testing (UAT).

6. Release
Everything has gone well and it’s time to put the system into production use. Be sure to consider: SOP approval/effectiveness, updating the RTM again, any testing in production (such as IQ), system release notification, creating an index of system documentation, closing the Quality Assurance Plan with a report (what actually happened versus what was planned), and closing the change control.

7. Operation
Once the system is in use and everyone is happy, the work is not done! Maintenance activities should include: change management, incident/problem management, periodic review of the validated state, periodic testing of backups and disaster recovery, and maintaining the system documentation index.

8. Retirement
Hopefully after many years of service, the time will come to retire your computer system. Be sure to consider data archiving and retrieval in this (final) stage of the computer system lifecycle.

Did I miss anything? Comment below.

Like this MWV (Mike Williamson Validation) blog post? Be sure to like, share, and subscribe!

Montrium Connect Compliance Concerns

Hello good people of the world! Today we’re talking about a specific Software-as-a-Service (SaaS) product: Montrium’s Connect. Montrium offers a number of modules in their Connect software (Document Management, Training, CAPA, Incidents, etc.). Today I’ll focus on the Document Management module.

What makes Montrium’s offering unique is that it is built on top of Microsoft SharePoint. I previously talked about SharePoint Online with respect to compliance concerns here (a little out-of-date, but still relevant).

The first point I’ll make is that building the application on SharePoint brings some significant advantages and disadvantages. The primary advantage I see, in comparison with other electronic Document Management Systems (eDMS), is that SharePoint uses Microsoft’s Office Online suite, including arguably the world’s best online word processor: Word. I am not aware of any online word processor as fully featured. I have used other eDMSs that ship their own word processors, and the missing features can be really frustrating.

That said, SharePoint also brings its clunky user interface and dated Active Server Pages (.aspx) architecture. The application won’t feel as snappy as modern websites, and you’ll see full page reloads for things that would be handled by a component re-render in more modern applications. Overall the application feels very slow; I sometimes found myself waiting minutes for items moving through a workflow to appear in my task list.

An example of Montrium / Sharepoint UI

The first thing that struck me about the compliance aspect of Montrium’s offering is that they have categorized their Connect SOP product (the brand name for the Document Management module) as GAMP category 3 software. GAMP category 3 is commercial off-the-shelf (COTS) non-configurable software. I don’t know how they consider this software non-configurable, because there are a lot of configuration options that change how it functions, including workflows. This categorization leads end-users to skip creating a configuration specification and to skip testing the configuration against their specific intended use. This could be a compliance risk.

Another thing I noticed is the audit trail functionality. There is no interface for the audit trail; instead, it is automatically exported to a protected Excel file every 28 days. I find it strange that the audit trail is not available in real time, and think this could introduce some compliance risk. It also falls into the trap of including at least some non-human-readable data. See the example below:

Audit Trail Example

Those are just a couple points of concern with Montrium’s Connect software in a regulated use case.

What has your experience with Montrium Connect been? What is your favorite eDMS? Comment below.

Regulations and Guidance for Assessing a Computer System Supplier

Hello good people of the world! Today’s post is continuing the series on compliance in the cloud. Today’s post is a simple list of regulations and guidance that you can provide to someone who asks the question: why do we have to assess suppliers of computer systems/software? These are the reasons why!

FDA 21 CFR Part 820 Quality System Regulation (link)

Section 820.50 Purchasing controls

Each manufacturer shall establish and maintain procedures to ensure that all purchased or otherwise received product and services conform to specified requirements.

(a) Evaluation of suppliers, contractors, and consultants. Each manufacturer shall establish and maintain the requirements, including quality requirements, that must be met by suppliers, contractors, and consultants. Each manufacturer shall:

(1) Evaluate and select potential suppliers, contractors, and consultants on the basis of their ability to meet specified requirements, including quality requirements. The evaluation shall be documented.

(2) Define the type and extent of control to be exercised over the product, services, suppliers, contractors, and consultants, based on the evaluation results.

(3) Establish and maintain records of acceptable suppliers, contractors, and consultants.

EudraLex Volume 4 Annex 11: Computerised Systems (PDF)

Section 3 – Suppliers and Service Providers

3.2 The competence and reliability of a supplier are key factors when selecting a product or service provider. The need for an audit should be based on a risk assessment.

3.3 Documentation supplied with commercial off-the-shelf products should be reviewed by regulated users to check that user requirements are fulfilled.

3.4 Quality system and audit information relating to suppliers or developers of software and implemented systems should be made available to inspectors on request.

Section 4 – Validation

4.5 The regulated user should take all reasonable steps, to ensure that the system has been developed in accordance with an appropriate quality management system. The supplier should be assessed appropriately.

ICH Guideline Q9 on Quality Risk Management (PDF)

II.4 Quality Risk Management for Facilities, Equipment and Utilities

Computer systems and computer controlled equipment

To select the design of computer hardware and software (e.g., modular, structured, fault tolerance); 

To determine the extent of validation, e.g., 

  • identification of critical performance parameters; 
  • selection of the requirements and design; 
  • code review; 
  • the extent of testing and test methods; 
  • reliability of electronic records and signatures.

II.5 Quality Risk Management as Part of Materials Management

Assessment and evaluation of suppliers and contract manufacturers

To provide a comprehensive evaluation of suppliers and contract manufacturers (e.g., auditing, supplier quality agreements).

ICH Guideline Q10 on Pharmaceutical Quality System (PDF)

Section 2.7 Management of Outsourced Activities and Purchased Materials

  • Assessing prior to outsourcing operations or selecting material suppliers, the suitability and competence of the other party to carry out the activity or provide the material using a defined supply chain (e.g., audits, material evaluations, qualification); 

ICH Guidance E6 on Good Clinical Practice (PDF)

Section 5.5 Trial Management, data handling, and record keeping

5.5.3 When using electronic trial data handling and/or remote electronic trial data systems, the sponsor should: 

(a) Ensure and document that the electronic data processing system(s) conforms to the sponsor’s established requirements for completeness, accuracy, reliability, and consistent intended performance (i.e., validation).

That’s it! Are there any I missed? Comment below!

How to Store an Electronic Signature

Electronic Signature

Hello good people of the world! Today’s post is a short one on the topic of electronic signatures. 21 CFR Part 11 includes requirements around the use of electronic signatures in place of traditional handwritten signatures, and these requirements are fairly well understood at this point. The question of this post is: how should your electronic system store an electronic signature?

It is important to understand that electronic signatures are meta-data. That is, they are data about data. So if you have an electronic batch record, for example, the electronic signature recorded during approval is not part of the batch record data, but meta-data of the batch record data. In this example, it is specifically the who, what, and when of the batch record approval.

Given that, electronic signature data should not be included in the same record (table row, document, etc.) as the record it signs. It is meta-data and should be handled as such, an important consideration in electronic system design. This will ensure electronic signature data is robust, auditable, and readily available.
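One way to realize this separation is a dedicated signature table linked to the signed record by a foreign key. A minimal sketch using SQLite (the schema, column names, and sample values are illustrative, not a prescribed design):

```python
import sqlite3

# A minimal sketch: e-signatures stored as meta-data in their own table,
# linked to the signed record by a foreign key. Schema is illustrative.
db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE batch_record (
    id INTEGER PRIMARY KEY,
    batch_number TEXT NOT NULL,
    data TEXT NOT NULL
);
CREATE TABLE electronic_signature (
    id INTEGER PRIMARY KEY,
    record_id INTEGER NOT NULL REFERENCES batch_record(id),
    signer TEXT NOT NULL,     -- who
    meaning TEXT NOT NULL,    -- what (e.g. 'approved')
    signed_at TEXT NOT NULL   -- when (UTC timestamp)
);
""")
db.execute("INSERT INTO batch_record VALUES (1, 'B-1001', '...')")
db.execute("INSERT INTO electronic_signature VALUES "
           "(1, 1, 'jdoe', 'approved', '2018-01-15T14:30:00Z')")

# The batch record row itself is untouched; its signatures are
# queried as meta-data about it.
sigs = db.execute("SELECT signer, meaning FROM electronic_signature "
                  "WHERE record_id = 1").fetchall()
```

Because the signature rows are separate, approving a record never rewrites the record data itself, which keeps both the record and its signature history auditable.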

How do you store electronic signatures? Any lessons learned? Comment below.

Warning Letter: Daito Kasei (Japan)

WarningLetterFDA

Hello good people of the world! Today’s post is about a recent (January 2018) warning letter issued by the FDA to a Japanese cosmetics and drug manufacturer called Daito Kasei. The warning letter, which can be found here, underscores the need to maintain focus on data integrity requirements in our modern age. The warning letter explicitly includes data integrity remediation steps.

Computerized System Qualification: AssetCentre

assetcentre

Hello good people of the world! This post covers qualification of Rockwell Automation’s change management software application, FactoryTalk AssetCentre. AssetCentre is an application that can be used to implement version control on GxP process PLC, HMI, SCADA, BMS, BAS, etc. software files.

AssetCentre may be considered GAMP category 3: non-configurable off-the-shelf software if you’re using the default configuration, or GAMP category 4: configurable off-the-shelf software if you’re doing some configuration related to your specific processes. There is no customization available in AssetCentre.

Some things that may be configured (and therefore in the scope of qualification) include:

  1. Database Limitations: the maximum database size, warning levels, maximum number of versions per asset, etc. may be configured to ensure application performance
  2. Disaster and Recovery Schedules: the frequency and number of backups of a specific asset or group may be configured
  3. Searches: searches may be configured and assigned specific security rights
  4. Security Groups: typically security is integrated with the site Active Directory, but additional groups may be configured to apply specific feature security (see below)
  5. Feature Security: security rights for each feature (e.g. ability to view specific folders, edit the Asset tree, etc.) may be configured
  6. Database Backups: regular backups of the SQL Server database may be configured

Data integrity issues:

  1. There is a function (called “Log Cleanup Wizard”) that effectively allows audit trail entries older than the current day to be purged from the system.

What details do you include in your AssetCentre qualification?

Data Integrity – What it Means for You

Hello good people of the world! Data integrity is an important topic in the information age and has come into focus for regulatory agencies as more and more parts of manufacturing processes become automated. Agencies know that data integrity can directly affect drug quality.

This post covers the MHRA (UK) guidance on data integrity version 2, released March, 2015, which can be found here.

The guidance document defines data integrity as “the extent to which all data are complete, consistent, and accurate throughout the data lifecycle.”

Of course, the concept of data integrity also applies to paper records, but it is the novelty and complexity of computerized systems that makes data integrity applied to electronic records a subject worthy of discussion and exploration. While we’ve had generations to get used to maintaining paper records, electronic records are relatively new, and the best practices for assuring data integrity may still be maturing.

Raw electronic data typically comes from one of four sources:

  1. Direct data capture via instrument/device output (e.g. temperature transmitter, valve actuator feedback, etc.)
  2. Capture of data stream from another computerized system (e.g. electronic chart recorder, electronic scale, etc.)
  3. Automated import of data from another computerized system (e.g. event or alarm log, recipe, etc.)
  4. Manual entry via HMI (Human-Machine Interface)/OIT (Operator Interface Terminal)

Each of these methods is subject to qualification/validation. Method #4 is a unique case in that it may require secondary verification by a separate operator or, in some cases, a supervisor, for critical data or any case where data is being transcribed from another location (electronic or paper-based).
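The secondary-verification requirement for manual entry can be expressed very simply: the entry is held in a pending state until a *different* user verifies it. A minimal sketch (field names, roles, and values are illustrative):

```python
# A minimal sketch of secondary verification for manually entered critical
# data (source #4 above). The entry stays pending until a second,
# different user verifies it.
def verify_entry(entry: dict, verifier: str) -> dict:
    if verifier == entry["entered_by"]:
        raise ValueError("verifier must be a second operator")
    return {**entry, "verified_by": verifier, "status": "verified"}

entry = {"field": "tank_temp_C", "value": 37.2,
         "entered_by": "operator1", "status": "pending"}
checked = verify_entry(entry, "operator2")
```

The key design point is the identity check: the system, not procedure alone, enforces that the verifier is a second person.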

The rules that apply to paper-based data also apply to electronic data. Data must be (ALCOA):

  • Attributable – it must be clear who made the entry
  • Legible – it must be clear what the entry is
  • Contemporaneous – the data must be recorded at the time of action/event
  • Original – the data must be raw
  • Accurate – the data must be correct, complete, and accurate

In order to maintain electronic data integrity, the following concepts are applied:

  1. Access Control
    • Each user shall be uniquely identified
    • Password controls shall be adequate
    • Users shall have only the permissions necessary to perform their job functions
    • A list of current and historic users shall be maintained
  2. Change Control
    • Changes to the system shall be controlled and only available to authorized users
  3. Training
    • All users shall have the training necessary to perform their job functions
  4. Record and Retain Data
    • Required data shall be recorded ALCOA and retained through the lifecycle
  5. Audit Trail
    • Modifications to raw data shall be recorded in an audit trail, with who made the change, the original data, the modified data, when the change was made, and why
    • The audit trail may also record system events, transactions, logins, etc.
  6. Review Data
    • Data shall be available for review
  7. Backup Data
    • Data shall be backed up to ensure redundancy and eliminate any single point of failure

Originally, audit trails only captured changes to raw data, the way a line-out would capture a correction on a paper record. Now much more may be expected of the audit trail, and audit trail functionality may consist of multiple system reports: for example, a record of logins (attempted and successful), application transactions, and any change to application data or metadata. In addition, the audit trail report is expected to:

  • Record the original and modified values of any data change with user and date/time stamp
  • Not be editable
  • Be viewable and understandable by end-users (that means no foreign key values or other coded/hex values please!)
  • Be reviewed as part of batch release
  • Be regularly backed up

Some more considerations around your audit trail:

  1. Do administrators have the ability to modify or disable the audit trail? If yes, how is the added risk controlled?
  2. Does the audit trail contain enough data to allow robust data review?
  3. Do the items in the audit trail include enough relevant items that will permit the reconstruction of the process or activity?
  4. What is the process for audit trail review?

Some more considerations around your user access procedures:

  1. Is there a procedure that describes how access is granted to a user, defines each user group, and their access levels?
  2. Is user access granted only after a documented training has been completed?
  3. Do users have only access rights appropriate for their job role (tied to SOP ideally)?
  4. Is it clear what rights a specific individual has (e.g. via a user rights report)?
  5. Is historical information regarding user access levels available?
  6. Are shared logins or generic user access accounts used? Should avoid these!
  7. Is administrator access restricted to the minimum number of people required? Don’t want excessive numbers of admins!
  8. Is the generic administrator account available for use? Don’t allow this!

How do you assure data integrity in your organization?

WHO’s Draft Guidelines on Validation May 2016

Hello good people of the world! On May 15, 2016, the World Health Organization released its draft Guidelines on Validation. It is available on the WHO website for download here.

This post covers my review of the guidance.