CRP 112 Lecture 5 Flashcards
Measuring Quality
missing data entries per CRF page
Site Metrics
% monitoring visits done on time
% of evaluable participants with no PVs
% SAEs reported within 24 hrs to REB & sponsor
% properly executed ICFs
# queries per CRF page
Data Management Metrics
% database errors
% of queries manually generated
Time: last participant out to database lock
# times locked database is opened
Statistical Analysis
% of Tables, Listings, Graphs (TLGs) with numerical/formatting errors
% of SAS programs adequately validated
Time: database lock to final TLGs (a worked metric sketch follows below)
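Most of the metrics above reduce to simple ratios over raw counts. A minimal sketch, assuming the counts have already been pulled from study tracking systems; all numbers and names below are hypothetical.

```python
# Minimal sketch: computing a few of the quality metrics listed above.
# All counts and dates are hypothetical placeholders.
from datetime import date

def rate_per_page(events: int, crf_pages: int) -> float:
    """Events (e.g., queries or missing entries) per CRF page."""
    return events / crf_pages if crf_pages else 0.0

def percent(part: int, whole: int) -> float:
    """Express part of whole as a percentage."""
    return 100.0 * part / whole if whole else 0.0

queries_per_page = rate_per_page(events=340, crf_pages=1700)        # queries per CRF page
pct_sae_on_time = percent(part=18, whole=20)                        # % SAEs reported within 24 hrs
pct_manual_queries = percent(part=120, whole=340)                   # % queries manually generated
tlg_turnaround_days = (date(2024, 3, 15) - date(2024, 2, 20)).days  # time: database lock to final TLGs

print(f"Queries per CRF page: {queries_per_page:.2f}")
print(f"% SAEs reported within 24 hrs: {pct_sae_on_time:.1f}%")
print(f"% queries manually generated: {pct_manual_queries:.1f}%")
print(f"Database lock to final TLGs: {tlg_turnaround_days} days")
```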
Internal Process Audits
Employee training
SOP compliance
Regulatory compliance
QC compliance
Need documentation of all!
Site Audits – how are sites chosen?
High participant enrollment
High staff turnover
Either too many or too few AEs
Enrollment rate either too high or too low
Complaints from staff
High query rates
Risk-based
Quality Surveillance
This occurs after the trial ends
Ensures:
Tables, Listings, Graphs (TLGs) match database
Clinical Study Report (CSR) is based on TLGs
Data Management processes are compliant with SOPs and GCP
The BIG Question
Should Quality be an independent entity within an organization?
Quality by Design principle says an organization should have:
* An independent entity to measure/review quality standards
* An integrated system to continuously verify, analyze, correct and prevent issues from arising
Quality by Design Step 1
Identify Critical to Quality (CTQ) factors for each specific trial
-protocol design
-study conduct
-third-parties
-feasibility
-patient safety
-study reporting
Quality by Design Step 2
Discuss potential risks related to each CTQ identified that impact study quality (participant safety or credibility of results, etc.)
Quality by Design Step 3
Mitigate risks that could lead to errors that matter
Determine how to quickly ID & react when an issue exists
Your Quality Philosophy Should Encompass
Proactive analysis and identification of risk factors
Application of Lean Six Sigma techniques
Quality Risk Management (QRM) systems and analytical tools
Timely implementation of CAPA
Coach, motivate & develop winning staff through effective team building and communication
Build Quality in by
Implementing standard processes/procedures (SOPs)
Providing effective GCP training / refresher training
Defining clear roles & responsibilities
Ensuring effective management oversight and accountability
Continuously assessing risk factors
Providing adequate study-specific training:
Perform mock patient visits, study procedures walk-through, etc.
QA Audit Plan purpose
Specific guideline to be followed when conducting an audit
Independent and separate from monitoring and quality control
Purpose:
To evaluate trial conduct and compliance with protocol, SOPs, GCP and regulatory requirements
QA audit plan elements
# sites & vendors to be audited
Selection criteria for audits
Internal processes to be audited
Audit team members specified
Standards for audit to be conducted against (protocol, CRF completion guidelines, SOPs, ICH GCP, regulations)
-specify: documents required, location/dates/duration, timeline for report of audit
-prepare by reviewing essential docs
Traits of QA Auditors
Detective-like analytical skills
Ability to influence people, even without authority over them
Detail oriented
Can see the big picture
Process thinking:
Past – problem solving
Present – decision making
Future – planning & innovation
Goals of Audit based on
Importance of trial with respect to submissions to regulatory authorities
Type and complexity of trial
Level of risk of trial
Any previously identified problems
Compliance audit
Do activities, processes & systems meet requirements?
Usually a Pass or Fail
Performance audit
- Compliance to the rules
- Effectiveness of those rules in use
- Suitability of those rules to achieve the organization’s goals
Specific objectives of audit
Pre-qualification of organizations involved in clinical trial
Compliance with human participant protection
Confirm appropriate conduct of trial
Confirm credibility of data obtained
Confirm condition of record keeping at institutions and internally
Confirmation of monitoring conduct
Confirmation of clinical study report credibility
Early detection of problems with a system or process – ability to apply CAPA
Early detection of problems at an institution doing study
Who/What is Audited?
CRO
Vendors (e.g., labs, diagnostics, IVRS etc.)
Research Ethics Boards
Systems
Medical institution
Computerized system validation
Database
Clinical study report
Conducting an Audit
Must inform sponsor about conduct of an audit in advance
Examination & evaluation of information (e.g., Essential documents, SOPs)
Trial site investigation (facilities, equipment)
Interview with auditees
Evaluation of conformity and compliance with reference documents
Important to standardize the audit
Risk Assessment and Categorization Tool (RACT)
- Identifying risks which could affect patient safety, data integrity or regulatory compliance
- Categorizing risks which will be managed by and affect the Monitoring Plan (and data review)
- Determining baseline level of monitoring activities (a scoring sketch follows below)
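A minimal sketch of how RACT-style categorization might be scored, assuming a simple impact × probability × detectability scheme on 1–3 scales with arbitrary thresholds; the actual RACT template and its weightings come from the sponsor's QRM process and may differ.

```python
# Minimal sketch of RACT-style risk scoring (assumed scheme: impact x
# probability x detectability on 1-3 scales; thresholds are arbitrary
# placeholders, not the official RACT weightings).
from dataclasses import dataclass

@dataclass
class Risk:
    description: str
    impact: int        # 1 = low ... 3 = high effect on safety, data integrity, or compliance
    probability: int   # 1 = unlikely ... 3 = likely
    detectability: int # 1 = easily detected ... 3 = hard to detect

    @property
    def score(self) -> int:
        return self.impact * self.probability * self.detectability

    @property
    def category(self) -> str:
        if self.score >= 18:
            return "High"    # would drive more intensive monitoring
        if self.score >= 8:
            return "Medium"
        return "Low"

risks = [
    Risk("Complex IP dosing schedule", impact=3, probability=2, detectability=2),
    Risk("Paper diaries capture the primary endpoint", impact=3, probability=3, detectability=3),
]
for r in risks:
    print(f"{r.description}: score={r.score}, category={r.category}")
```

Higher-scoring risks would then feed the Monitoring Plan and data review, consistent with the categorization role described above.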
Audit Checklists
Checklist needs to be a decision tree (see the sketch after this card)
Not everything on a checklist is "equal"
Need to think critically
Keep checklists current
Use as a guide only
Always have plenty of follow-up questions for each point
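A minimal sketch of a checklist item treated as a decision tree rather than a flat yes/no list; the item and follow-up questions are illustrative only.

```python
# Minimal sketch: a checklist item as a decision tree rather than a flat
# yes/no list. The item and follow-up questions are illustrative only.
from dataclasses import dataclass, field
from typing import List

@dataclass
class ChecklistItem:
    question: str
    follow_ups_if_no: List[str] = field(default_factory=list)

items = [
    ChecklistItem(
        question="Are all ICFs signed and dated before any study procedures?",
        follow_ups_if_no=[
            "Which participants are affected?",
            "Was the deviation reported to the REB and sponsor?",
            "What CAPA was put in place?",
        ],
    ),
]

def run_checklist(items, answers):
    """Walk the checklist; branch into follow-up questions when an item fails."""
    for item, ok in zip(items, answers):
        print(f"{item.question} -> {'Yes' if ok else 'No'}")
        if not ok:
            for q in item.follow_ups_if_no:
                print(f"  follow-up: {q}")

run_checklist(items, answers=[False])
```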
TMF
Trial master file (trial, country and site level)
-sponsor controlled
-follows DIA reference model
-holds essential docs
-defined in ICH GCP
ISF
Investigator site file
-site controlled
-the part of the reference model held at each site
-holds essential docs
-part of TMF
DIA Reference Model TMF 11 zones
Hierarchical Structure (a minimal zone-mapping sketch follows the list below)
1. Trial Management
2. Central Trial Documents
3. Regulatory
4. IRB or IEC and Other Approvals
5. Site Management
6. IP and Trial Supplies
7. Safety Reporting
8. Central and Local Testing
9. Third Parties
10. Data Management
11. Statistics
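A minimal sketch of the 11 zones represented as a simple mapping, e.g. for laying out an eTMF folder structure; zone names follow the list above, and artifact-level detail would come from the full DIA Reference Model.

```python
# Minimal sketch: the 11 DIA Reference Model zones as a simple mapping,
# e.g. for laying out an eTMF folder structure. Zone names follow the
# list above; artifact-level detail would come from the full model.
TMF_ZONES = {
    1: "Trial Management",
    2: "Central Trial Documents",
    3: "Regulatory",
    4: "IRB or IEC and Other Approvals",
    5: "Site Management",
    6: "IP and Trial Supplies",
    7: "Safety Reporting",
    8: "Central and Local Testing",
    9: "Third Parties",
    10: "Data Management",
    11: "Statistics",
}

def zone_folder(zone_number: int) -> str:
    """Return a zero-padded folder name for a zone, e.g. '04_IRB_or_IEC_and_Other_Approvals'."""
    name = TMF_ZONES[zone_number].replace(" ", "_")
    return f"{zone_number:02d}_{name}"

print(zone_folder(4))
```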