Session chapters:
06:37 - Introduction to Critical Thinking
12:00 - Regulatory Landscape
32:01 - Fostering Critical Thinking
39:10 - Providers: Capitalizing on Efforts
45:38 - Conclusions and Opportunities
48:50 - Q&A Session
In this free session, Joseph Turton, QA Manager and CSV Specialist at The Knowlogy, delivers an insightful examination of the application of critical thinking throughout the CSV lifecycle. He explores how critical thinking enhances regulatory navigation, supports data integrity, and addresses challenges in CSV, and he offers practical techniques for integrating it into validation processes.
Do you suggest using FMEA before preparing URS for CSV?
I recommend integrating FMEA into your process while developing your specification. Start by building your specification, then consider how to implement the FMEA methodology. It's an iterative process, so begin with the information you have at the outset, however limited, and continually update your URS as you progress.
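Purely as an illustration of that iterative approach (this example is not from the session), here is a minimal sketch of scoring URS items with an FMEA-style Risk Priority Number. The requirement wording, the 1-10 scoring scale, and the UrsItem structure are all hypothetical.

```python
# Hypothetical FMEA-style sketch: each URS item gets Severity, Occurrence,
# and Detection scores (1-10 scale assumed); the Risk Priority Number
# (RPN = S x O x D) is recalculated whenever the specification is updated.
from dataclasses import dataclass


@dataclass
class UrsItem:
    requirement: str  # illustrative requirement wording
    severity: int     # impact if the failure mode occurs
    occurrence: int   # likelihood of the failure mode
    detection: int    # difficulty of detecting it (higher = harder)

    @property
    def rpn(self) -> int:
        return self.severity * self.occurrence * self.detection


urs = [
    UrsItem("Audit trail captures all record changes", 9, 3, 4),
    UrsItem("Electronic signatures require re-authentication", 8, 2, 3),
]

# Address the highest-priority risks first.
for item in sorted(urs, key=lambda i: i.rpn, reverse=True):
    print(f"RPN {item.rpn:3d}  {item.requirement}")
```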
What does critical thinking mean in GAMP5 2nd edition?
Critical thinking in GAMP5 2nd edition primarily refers to employing risk-based approaches and applying logical, rational thinking to decision-making processes. It involves identifying and prioritizing the highest risks, understanding why they pose significant threats, and determining the most effective strategies to address them.
If I am procuring a system I have not used or interacted with before, how best would I perform this risk assessment especially while applying the FMEA?
This is where mapping out your requirements as if you weren't purchasing a specific system comes in. You map out your process and say, "I need a system that can do X, Y, and Z", rather than "I have a system that can do X, Y, and Z", and then think about the risks that relate to those requirements.
For example, you might identify the need for a document management system to manage SOPs, with specific version control and workflow functionality. By outlining these requirements generically, you can procure a system that meets those needs and then assess the associated risks. As you progress, continue updating your approach: it's crucial to regularly review and reflect on your decisions, as circumstances may evolve during system selection. This iterative review keeps you aligned with evolving requirements and mitigates any new risks that arise.
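As a rough illustration (not from the session), a minimal sketch of checking a candidate system against process-driven requirements defined before any vendor is chosen; the requirement names and risk ratings are hypothetical.

```python
# Hypothetical sketch: compare generic, process-driven requirements against
# a candidate system's capabilities; unmet requirements become risks to
# revisit as the selection evolves.
process_requirements = {
    "Manage SOPs with controlled versions": "high",         # risk if unmet
    "Route documents through an approval workflow": "high",
    "Export an audit trail for QA review": "medium",
}

candidate_capabilities = {
    "Manage SOPs with controlled versions",
    "Route documents through an approval workflow",
}

for requirement, risk_if_unmet in process_requirements.items():
    if requirement in candidate_capabilities:
        print(f"met : {requirement}")
    else:
        print(f"GAP : {requirement} ({risk_if_unmet} risk)")
```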
Should we apply critical thinking to choosing suitable team members for CSV/other projects?
When selecting team members for projects like Computer System Validation (CSV), it's crucial to consider individuals with a mindset aligned with rational thinking, analytical skills, and a willingness to experiment. This requires a certain personality type that embraces risk management and problem-solving. During the interview process, evaluating candidates for these qualities can be beneficial. While I haven't personally explored this approach, it could prove insightful in assembling a capable team.
Could you share an example/case study of CSV collaboration with other departments?
One common challenge in implementing CSV is when IT professionals attempt to do it in isolation, without collaboration from other departments. This approach often proves ineffective or cumbersome.
To address this issue, fostering a culture of communication and collaboration is essential. In practical terms, this means involving the relevant stakeholders from various departments early in the process. This includes clinicians, data entry personnel, IT staff responsible for infrastructure, as well as QA professionals tasked with auditing.
Establishing a continuous learning culture supported by critical thinking is key for companies aiming to adapt to changing requirements. Collaboration ensures that compliance viewpoints are considered comprehensively, leading to more effective and efficient CSV processes. Attempting to tackle CSV within silos is simply not conducive to success. Effective collaboration across departments is vital for achieving meaningful results.
What should be the CSV methodology for SaaS?
The CSV methodology for SaaS should adhere to the fundamental principles of demonstrating fitness for purpose. This typically involves establishing a set of requirements, conducting thorough testing, and obtaining sign-off from relevant stakeholders.
There are a thousand ways to do CSV, even within SaaS. However, it's crucial to ensure that the requirements are comprehensive, reflecting the perspectives of both software developers and end-users. Testing should be a collaborative effort involving both the vendor and the user to validate how the SaaS solution meets these requirements.
Currently, there seems to be a gap where users are often not actively involved in testing and instead rely solely on documentation provided by the vendor. Ultimately, obtaining sign-off from the relevant organization is essential to validate the SaaS solution effectively. This provides a broad overview of the basic steps involved in validating SaaS.
How can one incorporate requirements for GDPR in validations?
Actually, I've been considering this recently. The way we've done it is to break down the requirements of GDPR within the requirements specification as you build it at the beginning.
There might not be regulatory requirements in the sense of GCP, GMP, or GLP regulations, but there are still requirements that the system has to meet. So I would break them down in the specification, highlight which parts of the data are personal and protected, and therefore higher risk, and then test them, and the controls around them, appropriately.
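For illustration only (not from the session), a minimal sketch of tagging GDPR-related items in a requirements specification so that personal data, and the controls around it, receive proportionate testing; the IDs, wording, and risk levels are hypothetical.

```python
# Hypothetical sketch: flag personal-data requirements in the specification
# so higher-risk items get targeted tests of the data and its controls.
requirements_spec = [
    {"id": "URS-014", "personal_data": True, "risk": "high",
     "text": "Subject names and dates of birth are stored encrypted"},
    {"id": "URS-015", "personal_data": True, "risk": "high",
     "text": "Access to personal data is role-restricted and logged"},
    {"id": "URS-020", "personal_data": False, "risk": "low",
     "text": "Reports can be exported as PDF"},
]

for req in requirements_spec:
    if req["personal_data"] and req["risk"] == "high":
        print(f"{req['id']}: plan data-protection tests - {req['text']}")
```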
Is it correct that "CSV+Critical thinking (including Risk assessment tools)=CSA (Computer Software Assurance)?"
Yes and no. In my opinion, CSA is just a "rebranding" of CSV done properly, with rationale and logical thinking applied. Historically, CSV has involved a lot of blanket or blinkered approaches in which documentation was completed without thought. I think this is what causes confusion and drags down the CSV approach. So I would say CSA = CSV with risk assessment (RA) and critical thinking (CT) included (lots of people have been doing what is now considered CSA for years without calling it that).
Do you recommend preparing URS for validating the existing system?
Absolutely, the requirements specification is the basis on which you can demonstrate that the system is 'fit for purpose'. If you don't have the requirements defined, how are you demonstrating that the system is appropriate?