Data Integrity by Design is Your First Line of Defense!

If you’ve had your finger anywhere near the regulatory pulse, then you know data integrity is on every regulator’s radar. Some turn to statistics to assess data and identify anomalies. Others rely on validation to ensure a system performs as intended. Consider the following illustration, which identifies three (3) major aspects of the Data Integrity paradigm.

[Diagram: Design, Validation, and Assessment as the three major aspects of the Data Integrity paradigm]

DESIGN:

Focused on the system design, including both physical integrity and logical integrity.

VALIDATION:

Formal testing to ensure a system performs as intended.

ASSESSMENT:

Assessing data to verify accuracy or identify anomalies.

Each of the above is important. However, the critical point is that data integrity should be designed into the system right from the beginning. Data integrity should be at the forefront of system development (i.e., design).

Assessments

Assessments are great, but they’re a little too late. By the time data is assessed, it is already out there and may have been used to make decisions. It’s like proofreading an article after it has been published. Ideally, statistics are used to confirm that all preceding control measures are effective and reliable, thereby confirming the data has attained a high degree of integrity.
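As a purely hypothetical illustration, the sketch below shows one simple form such a statistical assessment could take: flagging readings that fall unusually far from the mean of a completed batch. The values and the two-standard-deviation threshold are invented for this example, not taken from any real process.

```python
import statistics

# Hypothetical assay results from a completed batch (illustrative values only).
readings = [98.2, 99.1, 97.8, 98.5, 110.4, 98.9, 99.3, 97.5]

mean = statistics.mean(readings)
stdev = statistics.stdev(readings)
THRESHOLD = 2.0  # flag anything more than 2 standard deviations from the mean

anomalies = [
    (position, value)
    for position, value in enumerate(readings)
    if abs(value - mean) / stdev > THRESHOLD
]

for position, value in anomalies:
    print(f"Possible anomaly at position {position}: {value}")
```

Useful, certainly, but notice that the suspect record is only caught after the batch is complete, which is exactly the point made above.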

Validation

Validation is a regulatory requirement for GxP systems, so it must be done. However, by the time you validate a system it has already been developed. Identifying problems, such as corrupt data, will require re-design, and at this point in the System Development Life Cycle (SDLC) it’s too late: the negative impact on cost and time can be exponential. Second, validation ensures a system performs as intended, which is essentially black-box testing. What about a system not performing as intended? This is where white-box and grey-box testing come into play.

The reality is you may not, in fact, be able to use white-box or grey-box methods because they require access to source code. Because source code may be the intellectual property of a software vendor, it may be confidential and proprietary. Furthermore, exposing source code can jeopardize security.
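To make the black-box idea concrete, here is a minimal, hypothetical sketch in Python: the test exercises an imaginary dose-calculation function only through its inputs and outputs, with no visibility into its internals. The function name, logic, and expected values are assumptions made purely for illustration.

```python
import unittest


def calculate_dose(weight_kg: float, mg_per_kg: float) -> float:
    """Hypothetical function under test: dose scaled by body weight."""
    if weight_kg <= 0 or mg_per_kg <= 0:
        raise ValueError("weight and dose rate must be positive")
    return round(weight_kg * mg_per_kg, 2)


class BlackBoxDoseTests(unittest.TestCase):
    # Black-box style: verify intended behavior through inputs and outputs only.
    def test_expected_dose(self):
        self.assertEqual(calculate_dose(70, 0.5), 35.0)

    def test_rejects_invalid_input(self):
        with self.assertRaises(ValueError):
            calculate_dose(-70, 0.5)


if __name__ == "__main__":
    unittest.main()
```

Tests like these confirm the intended behavior, but they say nothing about how the code reaches its answer; that is precisely the gap white-box and grey-box methods are meant to close, and why losing access to them matters.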

Data Integrity by Design

Designing the system to ensure data integrity from the outset is your first line of defense. But this isn’t an easy task, and it requires skilled individuals. Unless you’re in the business of developing software, you should avoid developing custom software. It’s better to procure systems from qualified vendors who are, in fact, in the software development business.

To qualify a vendor, it’s important to ensure their team is qualified by education, experience, and training. They should have a Quality Management System in place that covers the entire system and software development life cycle (SDLC). Their processes should include practices such as code reviews, white-box testing, and grey-box testing. They should follow standards, methodologies, and good practices common to software development. They also need domain expertise to effectively deliver solutions to highly regulated industries such as Life Sciences.

And to the point of this article, their solutions should have been designed in a manner that ensures data integrity.
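What might “designed for data integrity” look like in practice? As one hypothetical sketch, the example below uses Python’s built-in sqlite3 module to push logical-integrity rules (required fields, valid ranges, referential links) into the data layer itself, so that bad records are rejected at the point of entry rather than discovered during a later assessment. The schema and values are invented for illustration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # enforce referential integrity

# Logical-integrity rules live in the schema, not in downstream checks.
conn.executescript("""
CREATE TABLE batch (
    batch_id    TEXT PRIMARY KEY,
    product     TEXT NOT NULL
);
CREATE TABLE test_result (
    result_id   INTEGER PRIMARY KEY,
    batch_id    TEXT NOT NULL REFERENCES batch(batch_id),
    assay_pct   REAL NOT NULL CHECK (assay_pct BETWEEN 0 AND 110),
    recorded_at TEXT NOT NULL DEFAULT (datetime('now'))
);
""")

conn.execute("INSERT INTO batch VALUES ('B-001', 'Example Product')")
conn.execute(
    "INSERT INTO test_result (batch_id, assay_pct) VALUES (?, ?)",
    ("B-001", 99.2),
)

# These records are rejected by design: unknown batch, out-of-range result.
for bad_row in [("B-999", 99.2), ("B-001", 250.0)]:
    try:
        conn.execute(
            "INSERT INTO test_result (batch_id, assay_pct) VALUES (?, ?)",
            bad_row,
        )
    except sqlite3.IntegrityError as exc:
        print("Rejected:", bad_row, "->", exc)
```

The technology is beside the point; the principle is that the earlier in the SDLC an integrity rule is enforced, the cheaper it is to keep bad data out.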

About the Author

Steve Thompson is Senior Manager of Professional Services and is responsible for managing ValGenesis’s Implementation & Professional Services in the West Coast Region of the United States. Steve has over 20 years of GxP experience in Life Sciences (including Medical Device), is a Parenteral Drug Association (PDA) certified Auditor, has held managerial positions at various levels within Information Technology (IT) and Quality Assurance (QA) for major organizations, is a published author, and has presented at several conferences and industry associations. Steve has a B.S. in Computer Information Systems from DeVry University, City of Industry, California.
