3. Module 1 - Topics
• Impact of Improper Practices
• Purpose and Scope of Training
• Assessment Process Flow Chart
• Assessor Roles and Responsibilities
• Assessor Skills and Qualifications
• Responding to Improper Practices
• Definitions and Examples
4. Impact of Improper Practices
• Sound decision-making requires reliable data of
known and documented quality
• Laboratories are coming under increasing scrutiny for
improper practices
• Improper practices result in millions of dollars'
worth of unreliable data
• Impacts
Public Health
Regulatory Programs
Industry
Laboratories
Individuals
5. Purpose and Scope of Training
• This course presents tools for identifying improper
laboratory practices during routine assessments
• Presumes knowledge of laboratory assessments
such as:
National Environmental Laboratory Accreditation
Program (NELAP)
Good Laboratory Practices (GLP)
Drinking Water Certification program
Pre-qualification (e.g., project-specific assessments)
6. Purpose and Scope of Training
• This is supplemental training for experienced
assessors
• Determining intentional wrongdoing is a job
for the investigator
8. Assessor Roles and Responsibilities
• Verify adherence to specifications (quality
system documents and SOPs)
• Be alert to deficiencies (non-conformance to
a standard)
• Probe further when a deficiency is observed
• Have contacts from whom to seek advice
• Follow your assessment plan and observe
your assessment program’s policies
9. Assessor Roles and Responsibilities
• Be alert to patterns of behavior that could
indicate a systematic weakness
• Document findings and gather sufficient
objective evidence to support deficiencies
• Inform management of deficiencies
• Recognize good practices
10. Assessor Skills & Qualifications
• Assessment training and experience
• Verbal and written communication skills
Interviewing
Documenting deficiencies
• Expertise in areas being assessed
Laboratory instrumentation (including computer
skills)
Data collection, reduction, analysis, and reporting
Record-keeping
11. Assessor Skills & Qualifications
• Knowledge of applicable specifications
Relevant federal and state regulations
Laboratory assessment and quality systems
standards
Methods
Minimum QA and QC requirements
Flexibility is essential: Assessors must adapt as they go!
12. Responding to Improper Practices
• Act impartially and observe due process
(innocent until proven guilty)
• DO NOT take actions that could compromise
a future investigation
• DO NOT overstep your authority (e.g.,
interrogating laboratory analysts until they
“confess”)
13. Responding to Improper Practices
(cont’d)
Important Reminders
• There may be a number of explanations for
an observation
• When in doubt, gather the objective evidence
and records and seek advice
• Be candid with laboratory management
during the exit briefing and encourage open
dialog about your observations
14. Responding to Improper Practices
(cont’d)
Circumstances that require follow-up:
• You have been approached by a “whistle-
blower” or otherwise suspect deliberate
wrongdoing
• You observe practices that could result in a
potential threat to public health and safety
• You observe a pattern of improper practices
that raises concerns about data integrity
15. Responding to Improper Practices
(cont’d)
• Your organization should have specific
procedures for managing assessment
findings, including making referrals
• Before you go on-site, make sure you have
identified points of contact for both technical
assistance and referrals
17. Definitions
Red Flag or Warning Sign:
An observation that indicates the potential for, or
leads the assessor to suspect, system vulnerabilities
or improper practices
Examples:
Weaknesses in internal assessments/corrective
action
High staff turnover in key areas
Unclear roles and responsibilities
Too few QC records to support data output
18. Definitions
Deficiency:
An unauthorized deviation from acceptable
procedures or practices; a defect in an item; non-
conformance with a specification
Examples:
The laboratory has no arrangements for annual
internal audits (ISO 17025 4.3.1, NELAC 5.xx
2002)
Lack of written SOPs (GLP standards 40 CFR Part
160 Section 160.81)
19. Definitions
Improper Practice:
A scientifically unsound or technically unjustified
omission, manipulation, or alteration of procedures or
data that bypasses the required QC parameters,
making the results appear acceptable
Any alteration of data such that the data are
unauthentic or untrue representations of the
experiment or test performed
20. Definitions
Improper Practice - Examples:
Reporting failed QC results as acceptable
Reporting post-digested spikes as pre-digested
Changing instrument clock setting or altering
date/time information
Manipulating peak area without
approval/documentation
Selectively dropping calibration points
21. Definitions
Fraud:
A deliberate deception practiced so as to secure
unfair or unlawful gain
Reminder: An assessor cannot determine whether
an improper practice is fraud. A determination of
fraud is the conclusion of a legal process which must
evaluate the elements of both “intent” and “unlawful
gain”. Until that point, the practice in question is an
allegation of misconduct.
(Webster’s II)
22. Definitions
System Vulnerability:
Any aspect of a Quality System that allows improper
practices to occur and to go undetected
Examples:
Inadequate training
Ineffective internal assessments
Lack of independent QA reviews
Lack of management controls
23. Definitions
Objective Evidence:
Data supporting the existence or truth of something
Data Assessment:
An in-depth review and reconstruction of data from
raw/source data through final reporting
26. Topics
• Requesting Documents and Records
• Pre-Assessment Review
• Pre-Assessment Red Flags
• Developing the Assessment Plan
27. Requesting Documents and Records
• Qualifications statement
• Organizational chart
• Quality Systems manual
• Resumes for staff performing key functions
• Previous assessment reports
• Standard Operating Procedures (SOPs)
28. Requesting Documents and Records
• Proficiency Testing (PT) sample results
• Data Packages
Request specific data packages
Designate the type of data to be sent
Include cross-section of procedures
30. Pre-Assessment Review
1) Do initial signs point to a well-established
internal assessment and corrective action
program?
2) Does the pre-assessment submittal
document proficiency in requested services?
31. Pre-Assessment Review
3) Does the technical depth appear to be
adequate?
4) Have there been recent, significant
organizational changes?
32. Pre-Assessment Review
5) Was the laboratory’s initial response prompt,
complete, and organized?
6) Is management’s commitment to data
integrity clearly spelled out?
33. Pre-Assessment Red Flags
• Weaknesses in internal
assessments/corrective action
• Inadequate demonstration of proficiency
• Questionable technical depth
• Recent, significant organizational changes
• Lack of responsiveness
• Lack of management commitment to data
integrity
34. Developing the Assessment Plan
An assessment plan is a written document that
identifies the scope and objectives of an
assessment and includes:
Laboratory information including primary contacts
Names of assessment team members
Areas to be assessed
Documents reviewed during pre-assessment
Documents to review on-site
List of laboratory staff to interview
Schedule of assessment team activities while on-site
35. Developing the Assessment Plan
1) If you find weaknesses in internal
assessments/corrective action, then:
Examine corrective action reports
Interview QA staff
Interview analysts
36. Developing the Assessment Plan
2) If you question the laboratory’s proficiency,
then:
Interview analysts
Request process demonstrations
Review training records
Conduct data assessment
37. Developing the Assessment Plan
3) If you have concerns about technical depth,
then:
Review additional resumes
Check training records
Interview key staff
Interview selected analysts
38. Developing the Assessment Plan
4) If you find recent organizational or ownership
changes, then:
Verify that the quality systems manual has been
updated
Include a cross-section of staff for interviews
Confirm that SOPs are current
Confirm that training records are up to date
Confirm that internal assessment files and
corrective action records are up to date
39. Developing the Assessment Plan
5) If the laboratory has been unresponsive,
then:
Reissue your request
Report concerns during opening meeting
Conduct on-site interviews of technical
management, QA staff, and customer service staff
40. Developing the Assessment Plan
6) If a strong management commitment to data
integrity is not apparent, then:
Interview staff at all levels
Review ethics or data integrity policy
Review internal assessment records and
corrective action reports
Review training records
Perform data assessment
41. Summary
• Note areas of concern during pre-assessment
for follow-up
• Identify additional resources for technical
questions
• Determine what information you will need to
obtain on-site
44. Topics
• Effective Interviewing Skills
• Setting the Stage
• Using Core Questions
• Useful Interview Phrases
• Know How to Respond to Questions
• Interviewing Do’s and Don’ts
45. Effective Interviewing Skills
• Organizational skills
• Maintaining a non-threatening demeanor
• Ability to listen
• Ability to control situations and manage time
• Ability to read body language
46. Effective Interviewing Skills
• Ability to redirect
• Ability to follow a lead
• Accurately documenting interviews
• Ability to recognize when to refer to other
experts
• Knowing the right questions to ask
47. Setting the Stage
• Start slowly
• Put interviewees at ease
• Tell them
who you are
why you are there
what you want to talk about
what you are going to do with the information
48. Using Core Questions
• Asking several employees the same set of
core questions can provide a sense of how
the laboratory functions. Begin with:
What are your responsibilities?
How do you like working here?
How long have you been working here?
How do you handle overtime?
49. Using Core Questions (cont’d)
• Move into job function/responsibilities
What do you do in a typical day?
Who is your backup?
Explain the data review process
What is your repeat rate? Do you get a lot of rush
jobs?
Tell me about recent training you have received
Does the laboratory have a bonus program? How
does it work?
50. Using Core Questions (cont’d)
• Draw out more specific information
How are you assigned work?
How do you document which work you did?
How do you interact with your quality assurance
staff?
What do you do, and where do you go when you
have a problem?
51. Using Core Questions (cont’d)
• Draw out more specific information (cont’d)
What would you do if you came across an
improper practice?
Describe the laboratory process for corrective
action
Has anyone (such as a project manager) ever
asked you to change data?
Tell me about the laboratory’s data integrity
program
52. Useful Interview Phrases
• Let’s move on to another topic…
• Why do you do that…?
• Tell me about…
• Where do you document…?
• Can you show me…?
53. Useful Interview Phrases
• I’m confused; can you help me
understand…?
• This is interesting…
• I came across this in our data review…
• [Someone] mentioned this to me; how do you
handle that here?
• You seem extremely busy...
54. Be Prepared to Answer...
• Am I going to get into trouble for talking with
you?
• Should my supervisor be here?
• Can I talk to you about something off the
record?
• I saw something recently that concerns me.
What should I do?
55. Be Alert to...
• Managers translating or clarifying staff’s
responses to questions
• Emphasis on production
56. Interview Do’s
• Be prepared
• Be objective and impartial
• Adapt your interview style to the individual
• Ask who else to talk to
• Consider having a witness (scribe) with you
• Know when to move on or stop
57. Interview Don’ts
• Don’t overpower the conversation
• Don’t respond for the interviewee; silence is
OK
• Don’t take statements at face value; check
your information
• Don’t overtax your schedule; conduct fewer,
more thorough interviews
58. Summary
• Interviews are a critical part of the
assessment process
• Plan for effective interviews
• Practice effective interviewing skills
61. Topics
• Types of Improper Practices
• Causes/Factors
• Quality System Vulnerabilities
62. Improper Practice
Improper Practice:
A scientifically unsound or technically unjustified
omission, manipulation, or alteration of procedures or
data that bypasses the required QC parameters,
making the results appear acceptable
Any alteration of data such that the data are
unauthentic or untrue representations of the
experiment or test performed
63. Quality System Vulnerabilities
• Definition
Any aspect of a Quality System that allows improper
practices to occur and to go undetected
• Weaknesses in the following areas:
Management commitment to data integrity
Resources
Qualifications and training
Supervision and oversight
Preventative and corrective actions
Document control
64. Types of Improper Practices
1) Fabrication of data or other information
2) Misrepresentation of QC sample results
3) Improper date/time setting or recording
4) Improper peak integration
5) Improper GC/MS tuning
6) Improper calibration and verification
65. Types of Improper Practices (cont’d)
7) Data file substitution or modification
8) Unwarranted sample dilution
9) Deletion of non-compliant data
10) Improper alteration of analytical conditions
11) Unwarranted manipulation of computer
software
12) Concealment of a known problem
66. 1) Fabrication
• Creating information that is not true
• Creating data for an analysis that was not performed
• Creating information for a sample that was not collected
• Claiming ownership for work performed by external
analysts, equipment, facilities
• Cutting and pasting reports and support data
Examples:
Subcontracting PT samples
Creating CoCs without sample possession
Filling in logbooks for audits
“Reappearing” QC results
Recording autoclave conditions before starting the process
67. 2) Misrepresentation of QC Results
Improperly processing or reporting QC
samples
Examples:
Representing QC samples as digested or
extracted when they were not
Adding surrogates after sample extraction
Adding more than the prescribed amount of spikes
Reporting post-digested spikes as predigested
Failing to prepare blanks, spikes, PT samples or
standards in the same manner as samples
68. 3) Improper Date/Time Setting
Altering the recorded times that samples were
collected, extracted, or analyzed
Examples:
Resetting instrument clocks to make it appear that
a sample was analyzed within holding time
Altering dates/times on printouts and/or screen
printing to make analyses appear to meet 12-hour
windows
69. 4) Improper Peak Integration
Altering the area of a chromatographic peak to
avoid QC failures
Examples:
Adding or subtracting peak area to make QC
results appear to meet criteria
Artificially reducing the height of peak responses
Failing to manually subtract an interfering peak
because doing so would result in a QC failure
70. 5) Improper GC/MS Tuning
Manipulating the ion abundances of the MS tune
verification so that they appear to meet criteria
Examples:
Choosing non-representative scan(s) for evaluation
Performing incorrect background subtraction
Injecting incorrect amounts (BFB surrogate in CCV)
Copying and renaming files and screen printing
Adding spectra from two different files
71. 6) Improper Calibration/Verification
Any technically unsound deviation from proper
calibration techniques
Examples:
Recording results for pH meter calibrations that are not
performed
Performing multiple calibration runs or QC analyses
Representing previous initial calibration data as current
Inserting calibration levels or CCVs run well after initial
calibration
Discarding analyte responses from the center of the
calibration curve without technical justification
72. 7) Data File Substitution/Modification
Substituting previously generated analyses
for non-compliant calibration, QC or sample
runs
Examples:
Reusing historical initial calibration data and
representing it as current
Changing sample IDs in the data files
73. 8) Unwarranted Sample Dilution
Diluting samples or blanks without
explanation, often to the point of eliminating
target analyte responses
Example:
Diluting a sample to reduce laboratory
contamination below the method detection limit
(MDL)
74. 9) Deletion of Non-Compliant Data
• Deleting non-compliant analytical results for
calibrations, QC samples or blanks
• Failing to record non-compliant data for these
samples
Examples:
Deleting common laboratory contaminant results
from method blanks
Recording only those assays that work in the
laboratory notebook
75. 10) Improper Alteration of Analytical
Conditions
Changing analytical or instrument conditions
for standards or QC samples from those
specified in the method or SOP
Examples:
Adjusting EM voltage
Increasing gain
Failing to run samples and standards under the
same conditions
76. 11) Unwarranted Software
Manipulation
• Removing operational codes to eliminate or hide
manipulations
• Performing inappropriate background subtractions
• Adjusting baselines
Example:
Removing data qualifying flags (e.g., removing the “m”
flag to hide the fact that a manual integration was
performed)
77. 12) Concealment of a Known Problem
• Concealing a known sample problem or QC
failure
• Concealing an unethical or improper practice
Examples:
Failure to discuss surrogate or CCV failures in the
case narrative
Failure to report and resolve equipment malfunction
issues
78. Improper Practices
Causes, Factors
• Production pressure
• Conflicts of interest
• Lack of awareness
• Lack of communication
• Misinterpretation of method requirements
• Personality and attitude
• Financial stability
79. Improper Practices
Quality System Vulnerabilities
• Inadequate training
• Poor workload management
• Inadequate documentation and document
control
• Unclear roles/responsibilities
• Inadequate procedures for addressing ethical
dilemmas
• Inadequate oversight
83. Topics
• Opening Meeting and Lab Tour Red Flags
• Assessing the Quality System
• Quality System Red Flags
• Follow-Up on Quality System Red Flags
• Technical Area Assessment
• Technical Area Red Flags
• Case Studies
84. Flow Chart
On-Site Assessment: Opening Meeting & Tour → Quality System → Technical Areas → Data Assessment
85. Opening Meeting and Laboratory
Tour
• Make introductions
• Establish ground rules
• Set up separate space and times for team
conferences
• Conduct brief laboratory tour
86. Opening Meeting and Laboratory
Tour Red Flags
• Time wasters
Lengthy health and safety overview
Non-relevant briefings (history, sales)
• Selected staff not available for interviews
• Housekeeping practices
• Unexplained restricted access
87. Flow Chart
On-Site Assessment: Opening Meeting & Tour → Quality System → Technical Areas → Data Assessment
88. Importance of the Quality System
• An effective Quality System is critical to a
laboratory’s timely detection, correction, and
deterrence of improper practices
• An ineffective Quality System can allow
shortcuts and improper practices to occur and
to go unnoticed
89. Assessing the Quality System
• Documents and Records
Quality Manual
Resumes
Training records
Assessment and corrective action reports
• Interviews
General Manager, Technical Director(s), QA
Manager, QA staff, laboratory staff
90. Management Commitment
to Data Integrity
Red Flags
• Lack of a data integrity policy or statement
• Lack of a no-fault mechanism
• Emphasis on production over quality
91. Resources
Red Flags
• Inadequate backup (personnel and
equipment)
• Poor coordination for accepting work
• Excessive overtime
• “Bottleneck” departments
93. Supervision and Oversight
Red Flags
• QA staff lacks direct access to senior
management
• Unclear roles and responsibilities
• QA staff performing competing
responsibilities
• Inadequate internal assessments
• Lack of data assessments
94. Preventative and Corrective Action
Red Flags
• Incomplete assessment files
• Repeat assessment findings
• Inadequate procedures for handling
complaints
• Corrective action fails to address root cause
• Lack of documented follow-up to corrective
action
95. Document Control
Red Flags
• Lack of record verification in archiving
procedures
• Inadequate archiving procedures to ensure
data retrieval
96. Follow-Up on Quality System
Red Flags
• As appropriate, verify:
Adequate coverage for key roles
Clear lines of communication and reporting
Authority and accountability for data integrity
QA manager independence
Adequate procedures to avoid conflicts of interest
97. Flow Chart
On-Site Assessment: Opening Meeting & Tour → Quality System → Technical Areas → Data Assessment
99. Technical Assessment Tools
Document/Record Reviews
• Records subject to on-site review:
PT sample results
Method Detection Limit (MDL) studies
Control charts
Sample receipt and handling records
Standard materials preparation and use records
Standard Operating Procedures (SOPs)
Instrument run logs, maintenance logs
Initial and continuing demonstrations of
performance
100. Document/Record Review Red Flags
• Records not readily accessible
• Discrepancies between method and SOP
• Referenced methods out of date
• Presence of pencils or whiteout
• Extremely clean, neat logbooks
• Uncontrolled records
• Logbook entries all aligned
101. Technical Assessment Tools
Process Demonstration
• Select a process to observe and schedule
with the laboratory prior to going on site
• Review the SOP
• Ask the analyst to describe the steps in the
process as they are conducted
102. Process Demonstration Red Flags
• Discrepancies between SOP and practice
• Expired standards
• Disabled audit trails
• Shared log-on and password access
• Unlabeled containers or illegible labels
• Sample volume discrepancies
• Materials inventory doesn’t match throughput
• Sample throughput exceeds time required to
process samples
104. Interview Red Flags
• Managers interfering with analyst’s responses
• Emphasis on production
• Inadequate sample handling procedures for
evening/weekend deliveries
• No mechanism for reporting problems
• Analysts unable to describe data review
and/or oversight
109. CS # 2: Easy Does It
• What types of improper practices have
occurred?
• What was the specific red flag?
• What other red flags might have been
apparent?
• What quality system elements are key to
early detection/correction?
• What other steps could an assessor take?
110. CS # 3: Assessor Clousseau
• How did you uncover this debacle?
• Where did you start?
• Trace the trail
112. Flow Chart
On-Site Assessment: Opening Meeting & Tour → Quality System → Technical Areas → Data Assessment
113. Topics
• Application of Data Assessment
• Data Assessment Process
• Data Assessment Red Flags
114. Application of Data Assessment
• Data assessment may be performed
During the pre-assessment process
As a part of the on-site process
Following the on-site process
• It is especially important to conduct data
assessment when:
High visibility projects are involved
Data integrity red flags have surfaced during other
phases of laboratory assessment
115. Triggers for Data Assessment
• Incomplete or illegible records
• Improper changes, corrections, or method
deviations
• Inadequate technical proficiency
• Data review and reduction deficiencies
• Computer security deficiencies
• Inadequate internal oversight or surveillance
116. Data Assessment Process
• Step 1 - Select the data packages
Request initial data packages during pre-
assessment process
Request additional packages based on your on-
site observations, for example:
• Red flags related to technical proficiency
• High visibility projects
• High volume business, or “bottleneck” departments
117. Data Assessment Process (cont’d)
• Step 2 - Conduct a Paper Review
Purpose
• Links analyst to data
• Tracks samples to client report
• Answers key questions regarding systems and
operations
• Identifies red flags for further probing
Process
• Review PT data packages
• Look at data from complaints or visible projects
• Identify red flags and inconsistencies
118. Data Assessment Red Flags
Paper Review
• Incomplete case narratives
• Unexpected sample results
• Too-perfect QC results
• Reports missing review signatures or dates
• Discrepancies between CoC and reports
119. Data Assessment Process (cont’d)
• Step 3 - Evaluate electronic/source data
Purpose
• Allows comparisons between source data and final
report
• Allows comparisons between analysts
• Can detect system vulnerabilities and improper practices
• Requires understanding of instrument operations and
analytical procedures
Process
• Pay attention to self-written programs
• Look at audit trails
• Compare raw data to submitted data packages
120. Data Assessment Red Flags
Source Data
• Unexplained gaps/changes in records
• Discrepancies in QC performance between
analysts
• Calibration records missing review
signatures/dates
• “Reappearing” QC results
• Non-standard report formats
121. Data Assessment Process (cont’d)
• Step 4 - Reconstruct the data
Purpose
• Verifies data traceability from source data to final report
• Validates laboratory operations
• Requires the most time, effort, and access to expertise
Process
• Trace and verify dates, times, sample custody, analysts,
instruments
• Look at logbooks, preparation logs, analytical sequences
• Request supporting data or additional packages if
needed
122. Data Assessment Red Flags
Data Reconstruction
• Routine dead time on auto-injection run logs
• Samples or blanks diluted without apparent
justification
• Too few QC records to support data output
• Chronological chromatograms that do not
display prevalent background
123. Finding Data Assessment Red Flags
• Sample Receipt Records
Transcription errors, date/time discrepancies,
inappropriate containers, improper preservation or
storage
• Wet Chemistry Records
Discrepancies between original data and reports,
inappropriate sample sequencing, “reappearing”
QC results
124. Finding Data Assessment Red Flags
(cont’d)
• Sample Preparation/Extraction Records
Failure to meet holding times, inappropriate
sample volumes or dilutions, improper sample
processing cycles
• Calibration Records
Improper calibration procedures (e.g., dropping
points, re-use of historical calibration data)
125. Finding Data Assessment Red Flags
(cont’d)
• Logbooks, sequences, data sheets, run logs
Improper changes or corrections, data fabrication
• Chromatograms
Improper electronic peak integration, improper
manual integration, improper background
subtraction
126. Important Things to Remember
• Be flexible
• You don’t have to be an expert
• Take more time and get assistance when you
need it
129. CS # 4 - Dip and Swirl
• What other red flags might be revealed
through data assessment?
• What system vulnerabilities are indicated?
• What should the assessor do next?
• What should laboratory management do?
131. Topics
• Recording Findings
• Assessment Team Meetings
• Exit Briefing
• Follow-up and Closeout
• Adapting the Flowchart
132. Recording Assessment Findings
• Record all relevant information legibly in ink
• Record and compile information as you go
Assessment checklists
Assessor notebooks
Copies of relevant laboratory records
133. Recording Assessment Findings
(cont’d)
• When conducting interviews and process
demonstrations, record the following:
Names and titles of all persons present
Date, time, and place of interview/demonstration
Type of process observed or reviewed
Instrumentation, method and SOP used
Topics discussed
Observations
134. Recording Assessment Findings
(cont’d)
• When reviewing documents and records,
record the following information:
Type of record reviewed
Date, time, place and location of record
Observations
• As appropriate, make a copy of the record
supporting deficiencies/improper practices
Sign, date, and number the copy
135. Assessment Team Meetings
• Conduct regular assessment team working
sessions
Compare notes for completeness
• Observations, deficiencies, improper practices
Identify and collect any additional objective
evidence needed
• Laboratory records, interviews
136. Exit Briefing
• Follow your assessment program’s protocols!
Determine who should be present
Summarize assessment findings objectively
Provide laboratory with a schedule for completing
follow-up activities
137. Exit Briefing (cont’d)
Exception!
If you suspect or have discovered improper
practices:
• Describe the finding in general terms (as an observation
or deficiency)
• Complete evaluation off-site if necessary
If you suspect misconduct
• Gather your records
• Refer to your assessment program’s procedures
138. Follow-up and Closeout
• Prepare assessment report; include required
corrective action
• Review laboratory response with Corrective
Action Plan
• Prepare response to Corrective Action Plan
• Conduct follow-up assessment (if needed)
• Issue or deny accreditation or approval
• Ensure assessment file is complete
139. Contents of the Assessment File
• Purpose of the assessment
• Laboratory application or nomination
• List of pre-assessment records reviewed
• Letter announcing on-site assessment,
including agenda
• Opening meeting notes
• Checklists
• Interview records
140. Contents of the Assessment File
(cont’d)
• Exit briefing notes
• Assessment report
• Laboratory response with Corrective Action
Plan
• Assessor response to Corrective Action Plan
• Final decision
143. CS # 5 - The Oil Slick
• What are possible technical area red flags?
• What are possible data assessment red
flags?
• What are possible quality system red flags?
• What should the QA Manager do first?
• What should be done to determine impacts?
• Is retraining an effective corrective action?
145. Case Study # 6 - Name that Tune
• What should you do next?
• If the audit trail confirms the tune file has
been overwritten, what should you do?
• What would you expect to see in the
laboratory’s Corrective Action Plan?
146. Case Study # 7 - Time Warp
• What system vulnerabilities are indicated?
• What red flags might have been apparent
during a project compliance assessment?
• What assessment tools would have been
most effective in uncovering this problem?
#2:[Instructors should welcome trainees, introduce themselves]
Let’s take a few minutes for introductions. Please give your name and a few words about your background and interest in this training.
[Following introductions, the instructor should briefly review the course agenda.]
#3:These are the topics we will cover in this module. We’ll briefly review assessor roles, desirable skills, and conduct, and then go over some definitions used in the course.
#4:In an open letter to the “environmental analytical laboratory community”, dated September 5, 2001, the EPA Inspector General reported “an increase in the number of allegations that environmental laboratories inappropriately manipulate data” and urged laboratory management to take proactive measures to “better ensure proper conduct”.
A number of factors have been identified as contributing to “misconduct,” including poor training, ineffective ethics programs, shrinking markets for analytical services, economic incentives for implementing cost-cutting measures, and ineffective oversight.
[last bullet] The impacts of improper practices on programs, industry, public, laboratories and individuals include:
Public health is threatened (e.g., not cleaning up enough, not detecting permit violations, drinking water contaminants)
Confidence in decisions is undermined
Additional time and cost to repeat work
Loss of credibility with the public
Greater scrutiny of the laboratory industry by government
Fines and jail time for individuals
#5:The tools discussed in this course are not specific to any one assessment program. They can be used in any type of routine assessment, in any type of laboratory performing environmental work, whether on a production basis, or in a research mode. Assessors will need to determine which tools to use; and how best to incorporate them into their existing programs.
#6:This course has been designed as supplemental training for experienced assessors; however, laboratories also can use the training to improve the effectiveness of their internal assessments. It is designed to help you detect improper practices, whether they have occurred unintentionally (for example, as a result of mistakes, poorly implemented systems, or inadequate training) or intentionally.
As assessors, it is your job to report the facts surrounding your observations, but not to draw conclusions about intent. If you suspect deliberate wrongdoing, you should report your suspicions according to your assessment program’s procedures
Suspicions of misconduct also can be reported to the EPA Office of Inspector General (OIG). The EPA OIG Hotline web page (http://www.epa.gov/oigearth/hotline.htm) has information on what to report and how to contact the OIG. The EPA Office of Enforcement and Compliance Assurance (OECA) also accepts tips and complaints through their web site (http://www.epa.gov/compliance/complaints.html).
#7:Let’s look at the flow chart and review the assessment process and red flags. The assessment flowchart is the framework for the course. We’ll talk more about “red flags” in a few minutes. For now, we’ll briefly review the overall assessment process, and then return to it during each of the subsequent modules.
ACTIVITY (3-5 minutes): Using the flow chart, review the assessment process steps, and refer to the list of red flags for each major step.
#8:Now we’re going to briefly review some basics on assessor roles and responsibilities.
[first bullet] When we talk about “specifications” here, we’re referring to:
program requirements (e.g., Good Laboratory Practices (GLP) study protocols)
contract statement of work (e.g., EPA’s Contract Laboratory Program)
regulatory requirements (RCRA waste characterization mandatory methods, Safe Drinking Water Act mandatory methods, Clean Water Act permit requirements)
accreditation/certification standards (National Environmental Laboratory Accreditation Conference (NELAC), EPA Drinking Water certification program)
Observe your agency or program’s policy, and follow your assessment plan. Before your team sets foot in the laboratory, prepare an assessment plan that spells out how you will respond to observations, including any suspicions of wrongdoing. We’ll talk more about this in Module 2.
#9:[first bullet] Patterns of behavior indicate the potential for more far-reaching problems. For example, you’ve spoken to three analysts, and none of them can find a copy of the current SOP.
#9:[second bullet] Document your findings and report them objectively to laboratory management during the exit briefing. For example, make a copy of the outdated SOP.
[third bullet] When you find a problem, document the problem, and report your observations impartially. Problems could be reported to your (the assessor’s) management and/or the laboratory’s management. The procedure for reporting problems should be documented in your assessment program SOPs.
[fourth bullet] It is appropriate to recognize good performance. Note that some assessment programs may not permit this (e.g., EPA’s Good Laboratory Practices)
#10:Assessors typically work in teams, with a lead assessor and one or more technical specialists. There are a number of skills that are important and desirable for each individual, as well as skills that the team can provide collectively.
All assessors should have documented training and experience. They should be technically proficient in their area of assessment.
Good communication skills are essential. These include good reasoning, writing, and organization skills. When interviewing an individual, an assessor must be able to clearly ask questions and be an attentive listener. Assessors need to be able to assemble evidence or objective facts, and follow leads. Good writing skills are needed to document and present deficiencies in a clear and logical manner.
The assessment team collectively should have expertise, or access to expertise, in all areas of the laboratory being assessed. While an assessor doesn’t need to be able to operate each instrument observed, he/she should have expertise in the technology in general and have the necessary computer skills to be able to recognize whether the analyst is performing a task according to specifications. The team should be able to 1) evaluate and recognize proper practices (e.g. the proper handling of samples and laboratory reagents, operation of instruments, data-collection procedures, and record-keeping practices) and 2) recognize improper practices.
#11:
The team needs to understand all the regulations and standards that govern the scope of the assessment, the laboratory quality systems requirements, and actual work performed. Any formal deficiencies listed in the assessment report need to be linked to the relevant standard.
ACTIVITY (3-5 MINUTES): The instructor can ask the trainees to list additional skills or traits
A handout in your notebooks contains a comprehensive list of desirable assessor skills and traits. [Note to instructor: See supplemental course materials]
#12:A brief review of assessor conduct is needed at this point, to help answer the question, “If I uncover an improper practice, what should I do?”
Observe your agency or program’s policy, and follow your assessment plan. Before your team enters the laboratory, prepare an assessment plan that spells out how you will respond to types of observations, including any suspicions of wrongdoing. We’ll talk more about this during Module 2.
When you find a problem, document the problem, and report your observations impartially.
There are also a few important “Don’ts”.
Actions that could compromise a future investigation include inappropriately disclosing the contents of the assessment file, or recording a judgment in an assessment report that an analyst’s error was “deliberate”.
If you discover an inconsistency in an analyst’s statements, for example, or suspect they are lying, further objective questions may be acceptable; however, never be antagonistic.
Remember you are evaluating a function or performance, not the person. Don’t let it get personal. Always demonstrate respect for the staff.
#13:The guiding rule is “innocent until proven guilty,” and acquiring any proof of guilt is a job for the investigator. During the exit interview, report the facts objectively after consulting with your management (as appropriate). Laboratory management may be able to correct many deficiencies and observations immediately. They also may be able to provide a legitimate explanation for your observations or concerns.
The exception to this rule is if you suspect management complicity in improper practices. In this case, follow your assessment program procedures.
#14:This slide lists some situations that call for further action. You should report the incident or observation to your assessment program. Referrals for civil or criminal investigation are generally made by the assessment program rather than the laboratory assessor.
[Note to instructor: Discuss each bullet and note that specific examples are covered during Module 4.]
#15:You should specify the assessment reporting chain in your assessment plan. We’ll talk more about assessment planning in the next module.
#16:There are several terms used frequently throughout this toolbox. Let’s review a few definitions just to make sure we’re all on the same page when using these terms.
#17:Note that a red flag is not necessarily a deficiency; it is an indicator of a potential problem, not evidence. There could be a satisfactory explanation behind the observation. The red flag means you need to look further.
Alongside each major set of activities on the flow chart, we’ve listed some typical red flags that can help focus your assessment.
[Note to instructor: It might be worthwhile to write this definition on a flip chart and post it during the class, as a reminder that a red flag is only an indicator.]
#18:In the assessment report, the relevant requirement is cited along with the finding.
If the assessment finding cannot be described in terms of a relevant standard or specification, it is simply an “observation,” not a deficiency.
#19:Note that this definition doesn’t say anything about an individual’s intent. An improper practice is not necessarily a deliberate act. It could be the result of carelessness, or poor training. Whether or not it is deliberate, an improper practice compromises data integrity and it is wrong.
#20:Here are some examples of improper practices. We’ll be discussing many more examples in the following modules
#21:The legal process will evaluate whether there has been both intent and unfair or unlawful gain. As we discussed earlier, reaching those conclusions is not your responsibility as assessors.
#22:System vulnerabilities can be due to improper or ineffective implementation of the quality system.
In an assessment based on a quality system standard, such as the National Environmental Laboratory Accreditation Conference (NELAC) standards, system vulnerabilities may also be deficiencies. We’ll talk more about quality system assessments during Module 5.
#23:Objective Evidence (definition from ISO 9000:2000):
Any documented statement of fact, information, or record pertaining to the quality of an item or activity, based on observations, measurements, or tests which can be verified.
Understanding what we mean by objective evidence becomes very important when documenting potential improper practices. In the case of laboratory assessments, it is important to collect sufficient objective evidence to document the link between the observation and the improper practice
Data Assessment:
We’ve included a definition for data assessment, because the term means different things to different people. For the purpose of this course, data assessment is [refer to definition].
It is a particularly powerful tool for detecting improper practices, but it’s one that is not currently included in many routine assessments. We’ll spend some time discussing its use during Module 6.
#26:[refer to flow chart pre-assessment section]
In this module we’ll review the pre-assessment process, the red flags that can be detected during pre-assessment, and how to develop the assessment plan to follow up on those red flags during the on-site.
#27:This slide and the following slide list the records and documents you can request and review ahead of time, depending on the scope of your assessment. Program requirements will be the guide as to what can be requested in the pre-assessment phase. Doing some pre-assessment homework will help you focus specifically on potential problem areas and improve your efficiency on-site. Specify in writing what documents you want and when you want them.
Laboratories are routinely asked to provide these documents [refer to bullets] to potential clients and assessing organizations. Reluctance to provide requested documents is itself a red flag. If the laboratory has a web site, visit it.
If applicable, specifically request copies of SOPs for any procedures requiring project-specific modifications.
#28:Additional documents and records to include in the pre-assessment review. [refer to bullets]
You should request relevant PT sample results, and sample data packages. For an accreditation or certification assessment, this request should cover all fields of testing for which accreditation/certification is being sought.
For a compliance or pre-qualification assessment, the request should cover at least the major tests being performed.
ACTIVITY (2 minutes): Are there other items that might be requested?
#29:Once you have received the documents from the laboratory, you can start the review. Over the next few slides, we will cover typical questions that should be answered during the pre-assessment review. If you are not able to answer all of these questions during pre-assessment activities, then plan to get them answered during the opening meeting.
ACTIVITY (10 minutes): The following 3 slides can be used for interactive discussion. Ask the question and invite audience responses. The speaker notes contain the answers. Use flip charts to record class input. Give answer sheet handouts after the discussion (or module) is complete. See supplemental course information for a handout of answers.
#30:1) Which documents or records can answer this question? What should you look for? [Invite discussion].
[Answer] Check the quality systems manual and recent assessment reports.
Are internal assessments performed regularly?
Do they include data review?
Is corrective action timely, appropriate, and well-documented? (This last item may need to be checked on site)
2) Where will you find evidence of proficiency? What should you look for? [Invite discussion]
[Answer] Check the SOPs, PT sample results and data packages
Are SOPs up-to-date, and do they meet current method requirements? Are any project-specific method modifications appropriately noted?
Do PT sample data reports provide any warning signs of repeated patterns of weaknesses (for example, repeat findings in the warning range)?
Are data packages complete? Do both content and format meet the relevant specifications?
Check to make sure that SOPs meet method requirements and reference current versions of the methods. Make sure any project-specific modifications are appropriately noted.
#31:3) How will you evaluate technical depth? (Keep in mind the pre-assessment review only provides the view from 30,000 feet.) [Invite discussion]
[Answer] Check professional profiles for key staff
Is the employee’s education and experience appropriate for level of responsibility?
Is additional relevant training summarized?
4) What specific information should you look for? Where should you look? [Invite participation]
[Answer] Check the qualifications statement, website, and quality systems manual
Has the laboratory changed ownership in the last year?
Is the workforce stable, or does the staff turnover rate appear to be high?
Have there been recent changes in key positions?
If you are not able to answer all of these questions during pre-assessment activities, then plan to get them answered during the opening meeting.
#32:5) How will you rate your initial experience? [Invite participation]
[Answer] Describe your initial experience:
Were you readily able to make contact and schedule the assessment?
Did management willingly supply the requested information?
6) Where should management commitment be apparent? What should you look for? [Invite discussion]
[Answer] Check the organization chart and quality systems manual
Does the QA manager have direct access to senior management?
Does the QSM contain the necessary elements, and does it meet the specifications of your program?
Is a data integrity/ethics policy included?
Is there regular, formal, training for all staff in data integrity/ethics?
#33:This slide lists examples of red flags which may become apparent during the pre-assessment activities. We’ll talk about each in more detail later in this module.
[first bullet] As thorough as it might be, the assessment is only a snapshot. A key to the early detection, prompt correction, and ongoing prevention of improper practices is a program of internal assessments and corrective action. Early indicators of weaknesses in this area can sometimes be spotted during the pre-assessment activities.
[second bullet] The first indicators of potential problems with technical proficiency can surface during a pre-assessment review of SOPs and data packages (availability of demonstrations of capability). Take this opportunity to make some notes for follow-up.
[third bullet] As laboratories have struggled to compete in the marketplace, higher paid staff are being replaced by lower paid staff. It’s becoming more important to verify acceptable experience, training, and proficiency of laboratory staff.
[fourth bullet] Recent, significant organizational changes present laboratories with pressures that can affect data integrity, resulting from corresponding changes in workload, staffing, training, quality systems implementation, or oversight.
[fifth bullet] A laboratory about to be assessed should be putting its “best foot” forward, and unresponsive customer service can be a red flag. Note that the assessment team is the customer here.
[sixth bullet] Management commitment to data integrity is absolutely essential. Assessors should look for procedures/policies in place that demonstrate management’s commitment to data integrity.
#34:An assessment plan documents what, when, and how the assessment will be done. The assessment plan should be provided to the laboratory prior to going on site.
[Instructor Prompt for upcoming 6 slides]. The following slides contain some suggestions from the “toolbox” to help you develop your assessment plan based on information gathered during the pre-assessment process.
#35:The Assessment Plan is a critical component of the assessment process. The Assessment Plan is essentially a work plan that documents when and how the assessment will be done.
The following 6 slides provide suggestions for areas to include in assessment planning. Many of these items should be done on a routine basis; however, they are highlighted here to give special emphasis to those areas during the assessment.
The QA staff and technical staff should be able to explain in detail how internal assessment and corrective action work in the laboratory. If these processes really are working, they should be able to provide examples.
#36:As you interview staff and observe demonstrations, compare responses and actions to the records.
Augment the interviewee list as necessary if additional concerns about technical depth surface during the actual on-site process demonstration. You should feel free to do so, on the spot. (A process demonstration is observing staff running a particular method.)
When you’re assessing proficiency, YOU (not laboratory management) should select the analysts to be interviewed, and YOU should select the data to be assessed.
#37:Make a list of staff for review ahead of time (include this in the assessment plan), and let laboratory management know that you expect them to be available.
Include a cross section of organizational and technical responsibilities:
Departments and functions
Analysts, supervisors, managers
New hires as well as “veterans”
Always include key policy-setters and decision-makers
QA officer or staff
Laboratory director
Office manager
#38:Interviews should include staff at all levels, and in all departments, to ensure roles/responsibilities are understood and key functions are covered. Evaluate whether the new management culture has been adopted.
#39:This refers to the laboratory being unresponsive to the assessment team’s requests for information. For some types of assessments, an unresponsive pre-assessment submittal is grounds for terminating the assessment. In any case, follow your assessment program’s procedures for reporting your concerns (may require reporting concerns to your management first).
If all requested information has not been provided by the time you conduct the on-site assessment, point out your concerns during the opening meeting, request copies of any documents or records that were requested earlier, but not provided, and request an explanation.
Following the opening meeting, interview additional staff to verify that the customer service network works the way management claims it does. By customer service, we mean any staff, administrative or technical, who are responsible for supplying documents for the pre-assessment review.
#40:Lack of a clearly articulated commitment to data integrity is a red flag. It can open the door for improper practices and allow them to go undetected.
As you go through the assessment, as we discussed earlier, YOU select the staff to interview, and the records to assess.
[Invite discussion]
#41:Based on any red flags you identify during the pre-assessment, adjust the assessment plan prior to going on-site.
Identify any additional documents or records you need to review based on red flags.
Review list of staff interviews to make sure that you have adequate coverage.
Identify (your) resources that the assessment team can use in case of technical issues that may not be covered by team expertise.
Have a plan for unexpected questions/issues
#44:Now we’ll spend a little time reviewing interview techniques. [refer to bullets]
#45:Interviewing skills come naturally to some, but for most, developing good skills requires practice. We don’t intend to cover basic interviewing during this course, but we can provide some tips for conducting more effective interviews, as well as some opportunities to practice the use of interviews to follow up on red flags.
#46:Here are some effective interviewing skills:
[first bullet] If the analyst is having trouble understanding the question, or if the answer just doesn’t make sense, then ask the question a different way.
[second bullet] The interview has taken an unexpected turn, and you think you may be on to something. What do you do now? Adapt your script!
[third bullet] The analyst is describing the procedure, but it’s clearly different from the SOP. What do you do? Support your observations with effective documentation. In this case, record the response AND make a copy of the SOP. Highlight the discrepancy.
[fourth bullet] Something is seriously wrong here. You don’t believe what you’re hearing. You’ve redirected the question, but the analyst is sticking to his/her story. Do not resort to interrogation tactics. Prepare your documentation, and call the experts.
[fifth bullet] Know the right questions to ask, and how to ask the question. You can start with broad open-ended questions and move to more narrow closed-ended questions to confirm what you heard.
#47:This is basic interviewing, but it’s worth reviewing at this point. You can encourage your subjects to be cooperative in any type of interview setting, by setting the stage.
#48:The term “core questions” refers to a basic set of questions asked of each interviewee. The purpose of asking “core questions” is to get a sense of how the laboratory functions from different perspectives (managers, technical staff, QA staff) and to establish whether there is consistency in the laboratory culture and work ethic. For example,
Does management think QA is a staff responsibility and staff think it is management’s responsibility? Is there QA independence?
Is there a bonus system? How does it work?
Is training provided to employees?
Does management think overtime is needed or used? How does overtime work?
#49:These are examples of core questions that help you assess the analyst’s understanding of job function. These open-ended questions can turn up leads that might not otherwise become apparent during a routine assessment.
#50:The answers to these specific questions can provide the first signs of key system vulnerabilities, including poor workload management, inadequate documentation, ineffective oversight, or lack of effective corrective action implementation. Be alert to inconsistencies between employees’ answers to these core questions.
#51:If information you’ve gathered up to this point has raised any concerns about data integrity, or any concerns that the quality system is not functioning as it should, then here are some questions that can be used to follow those leads.
Talk to several employees, and include employees at different levels within the organization.
#52:Experienced assessors have suggested some of their favorite open-ended phrases for managing red flags that may arise during informal interviews. It helps to have relevant documents or records in front of you. Here are some examples:
If you’re getting bogged down by an uncooperative analyst, try [read first bullet]
If an analyst demonstrates a process step that doesn’t make sense, or that deviates from the SOP, try [second bullet]
If you’ve reviewed the Quality Manual, and found the section on internal assessment to be weak, then you might ask the analyst [third bullet – Tell me about internal assessments.]
If record-keeping at the bench appears haphazard, ask [fourth bullet – Where do you document calibration verification]
If you can’t find complete training documentation for all employees, then ask the QA Manager [fifth bullet]
#53:These open-ended phrases are great, innocuous ways to confront laboratory staff in impromptu interviews when you’ve uncovered an inconsistency; for example, a disconnect between written procedure and actual practice, or a discrepancy in the reconstruction from data package to source data. Their answers can help you determine if there’s a logical explanation for your observation or if you’ve uncovered a real problem.
Just be careful – ask the question once, follow the lead if one is provided, but be prepared to drop it if one is not. In this case, note the response and prepare the documentation. Future probing should be left to the investigator.
#54:If your interviews turn up an improper practice (or worse), you could be asked to respond to some tough statements or questions, such as: [refer to bullets]
Before you go on-site, know how your agency expects you to answer these questions. In general, though, if someone comes forward as a whistle-blower, talk to them as quickly as possible. Like a teenage boy calling a girl for a date, if no one answers he may not have the courage to call again.
ACTIVITY: As time allows, encourage group discussion to suggest possible responses to the above questions and record on flip charts.
#55:[first bullet] If managers will be present during the interview, establish ground rules during the opening meeting. Emphasize the importance of the interviews and remind them that you will ask when clarification is needed.
[second bullet] An emphasis on production can indicate ….
In addition to the items here [refer to bullets], be alert to staff modifying their answers based on a manager’s body language, or staff who look at management before answering.
#56:As a wrap-up, here are some Do’s and Don’ts. [Discuss each bullet and add examples where appropriate]
[first bullet] Know what information you want to get, and have questions ready.
[second bullet] Go into the interview with an open mind, not with pre-conceived notions of the outcome.
[third bullet] You may need to adjust your approach to the interview based on the staff person’s personality (e.g., overly shy, talkative, or aggressive).
[fourth bullet] Ask the person who else you should talk to for more information. They know their system better than you do.
[fifth bullet] Having someone scribe for you lets you focus on asking questions, watching body language, and keeping the interview on track.
[sixth bullet] Knowing when to move on comes with experience, but obvious times to move on are when the person becomes upset, overly angry, or aggressive.
#57:Here are some interviewing don’ts. [Discuss each bullet and add examples where appropriate]
#58:Interviews are a critical part of the assessment process. (In order to gather accurate information and objective evidence you will need to conduct interviews. Remember the role of the interview and use the tool effectively.)
Plan for effective interviews (Interviews are more than just questions. They should be designed to help you get the necessary information and they should support your assessment objectives.)
Practice effective interviewing skills. (Effective interviewing skills take practice. Focus on continuous improvement in this area.)
#59:Are there any questions or comments about interviewing?
#61:At this point, we’ll begin to discuss how the tools can be used to follow up on red flags, identify system vulnerabilities and detect improper practices.
#62:Let’s review the definition of improper practice that was presented earlier today (Module 1).
Experienced assessors and investigators have found there are many “creative” ways to bypass quality control and/or alter data. Some are intentional; others are unintentional, for example, the result of inadequate skills and/or training. Let’s take a look at some specific types of improper practices, so you’ll have a better idea of what to look for, and how to find it.
#63:We’ll also review our definition of system vulnerability. [refer to definition]
The Quality System and Management assessment is the place to focus on vulnerabilities that can allow improper practices to occur. The improper practices themselves occur in the technical areas.
The approach we’re taking is to look for potential weaknesses first, and then use these to focus the technical assessment.
Key quality system elements are listed on the next two slides. Weaknesses in any of these elements of a quality system can open the door for improper practices.
These system vulnerabilities have their own set of red flags, as shown on the flow chart [refer to flow chart]
#64:This and the following slide list twelve general types of improper practices. The subsequent slides discuss each practice in more detail. Understanding the types of improper practices that occur, as well as the factors and system vulnerabilities that “open the door”, will increase your ability to detect them.
[Read the list, which is continued on the next slide]
#65:In the following slides, we’ll look at some specific examples. With each example, I’d like you to consider three things:
What factors could be responsible for the improper practice? In other words, why is this happening?
What system vulnerabilities would allow the practice to occur and go undetected?
What tools could you use (as an assessor) to identify the vulnerability and detect the practice?
#66:(Fabrication is also known as Drylabbing, Drysink, Graphite Chemistry (pencil chemistry))
[Read slide, then ask the following three questions]
Why would laboratory personnel be tempted to do these things?
[Record responses on a flipchart, labeled “temptations (causes or factors)”]
What system vulnerabilities would allow these practices to go on, unchecked?
[Record responses on a second flipchart, labeled “system vulnerabilities”]
During the on-site assessment, how would you detect these practices?
[Record responses on a third flipchart, labeled “detection tools”. Ask participants to be specific, for example, interview whom? Request what document or record? Request a demonstration of what process?]
ACTIVITY: [After completing the following 12 slides, you will have three flip charts with participants’ feedback on the three questions above. Reviewing and discussing the examples will build awareness of the types and extent of possible improper practices. Participants will also realize that there is a common list of causes (temptations) and system vulnerabilities, and that effective detection requires interviewing the analysts, looking at the records, requesting process demonstrations, and reconstructing the data.] [Suggestion for flip charts – after each example given by the class, indicate which item (1-12) it relates to.]
#67:(also known as Fortification, Juicing, Running “Raw”)
[Continue to read slides, discuss the examples, and continue to build the lists on the flip charts. You may also ask participants to volunteer additional examples.]
#68:(also known as Time Travel)
[Continue to read slides, discuss the examples, and continue to build the lists on the flip charts. You may also ask participants to volunteer additional examples.]
#69:(also known as Peak Shaving, Enhancing, Snow-Capping, Boat Anchoring)
[Continue to read slides, discuss the examples, and continue to build the lists on the flip charts. You may also ask participants to volunteer additional examples.]
#70:(also known as Skewing the Tune)
[Continue to read slides, discuss the examples, and continue to build the lists on the flip charts. You may also ask participants to volunteer additional examples.]
#71:(also known as Run 10, Pick 5; Insurance; Dropping Points, Building a Curve; Centering Curves)
Another example is varying the amount of standard from the method requirements (e.g., injecting more of the low standard than required and less of the high standard). A short numeric sketch of the related “dropping points” practice follows these notes.
[Continue to read slides, discuss the examples, and continue to build the lists on the flip charts. You may also ask participants to volunteer additional examples.]
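The sketch below is not part of the course materials; it is a minimal, hypothetical illustration (in Python, with invented numbers) of why “dropping points” is tempting: quietly removing the calibration standard that fits worst can turn a marginal correlation coefficient into one that looks excellent.

```python
# Hypothetical illustration only -- the concentrations, responses, and the
# idea of an r^2 acceptance limit are invented for this sketch.
import numpy as np

concentrations = np.array([1.0, 2.0, 5.0, 10.0, 20.0])   # standard levels
responses = np.array([10.0, 22.0, 48.0, 118.0, 195.0])   # instrument responses

def r_squared(x, y):
    """Coefficient of determination for a simple linear calibration fit."""
    slope, intercept = np.polyfit(x, y, 1)
    predicted = slope * x + intercept
    ss_res = np.sum((y - predicted) ** 2)
    ss_tot = np.sum((y - np.mean(y)) ** 2)
    return 1.0 - ss_res / ss_tot

# Full curve, with every standard that was analyzed
print("all five standards: r^2 =", round(r_squared(concentrations, responses), 4))

# "Dropping points": discard the standard that fits worst and refit
keep = [0, 1, 2, 4]
print("one point dropped:  r^2 =", round(r_squared(concentrations[keep], responses[keep]), 4))
```

During data reconstruction, one way to spot this practice is to compare the number of standards actually analyzed (from the run log or raw files) against the number used in the reported curve.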
#72:[Continue to read slides, discuss the examples, and continue to build the lists on the flip charts. You may also ask participants to volunteer additional examples.]
#73:[Continue to read slides, discuss the examples, and continue to build the lists on the flip charts. You may also ask participants to volunteer additional examples.]
#74:[Continue to read slides, discuss the examples, and continue to build the lists on the flip charts. You may also ask participants to volunteer additional examples.]
#75:[Continue to read slides, discuss the examples, and continue to build the lists on the flip charts. You may also ask participants to volunteer additional examples.]
EM voltage = electron multiplier voltage on a GC/MS
#76:[Continue to read slides, discuss the examples, and continue to build the lists on the flip charts. You may also ask participants to volunteer additional examples.]
#77:[Continue to read slides, discuss the examples, and continue to build the lists on the flip charts. You may also ask participants to volunteer additional examples. This is the last “type” of improper practice. Ask participants if there are any other types of practices they wish to discuss]
#78:This is a list of the most common causes of improper practices. It comes down to this: The improper practices are shortcuts – they save time and effort (at least initially). In some cases, however, the after-the-fact data manipulations end up taking a lot more time than it would have taken to do the job right in the first place.
In some cases, the analyst may not know any better; but in others, a shortcut can mean the difference between working 8 hours or 24 hours, OR getting paid or not. These are some pretty influential motivators, and assessors need to be alert to the existence of these kinds of pressures.
#79:These system vulnerabilities open the door for improper practices, and allow them to go undetected. During assessments, assessors need to be aware of whether these vulnerabilities exist.
#80:These tools can be used in the on-site assessment, which is our next module – we’ll talk more about them in a few minutes.
#83:Here we begin to talk about the on-site portion of the assessment. At this point you have your team together, have completed the pre-assessment review of documents, and have prepared your assessment plan. In this module we will be discussing [refer to bullets].
#84:Let’s spend a few minutes discussing the opening meeting and laboratory tour. These first steps of the on-site assessment provide the opportunity to spot a few early red flags, and make adjustments to the assessment plan if necessary. [refer to flow chart]
#85:During the opening meeting, the laboratory management will probably provide a brief presentation discussing the laboratory’s qualifications and providing an overview of operations. When it’s your turn, there are a few steps you can take to focus the assessment process
[first bullet] Introduce the assessment team members, and explain briefly what each person will be doing. At this time, laboratory management will also introduce staff members who will be escorting your team.
[second bullet] This is the time to review the assessment agenda and schedule. Make sure selected staff are available for interviews. Let laboratory management know you will need to speak directly and informally with analysts, without management interference
[third bullet] It is very important for your team to have the opportunity to confer regularly and compare notes. Request separate space to do this.
[fourth bullet] The opening tour is the view from 30,000 feet. You won’t get into specifics here, but make notes about instrumentation, specific staff members you’d like to interview, and any areas of restricted access.
#86:You’re unlikely to uncover improper practices during either the opening meeting or the laboratory tour, but there are a few warning signs that can point to problems, especially if they occur in combination with one another.
[first bullet] An inordinate amount of time spent discussing health and safety or other information (history, sales) that is not really relevant to your assessment could be completely innocuous. On the other hand, it could be an early sign of attempts to limit your time reviewing the laboratory or talking to staff.
[second bullet] If key staff are not available to be interviewed, it could mean that management doesn’t want you speaking to them. Request an explanation.
[third bullet] A laboratory that is excessively cluttered or chaotic can indicate the potential for problems. On the other hand, an excessively neat laboratory can indicate that they’ve made special preparations for the assessment and you’re not seeing typical operations.
[fourth bullet] There are legitimate reasons for restricting access in laboratories, especially in laboratories handling radioactive materials or conducting classified testing. If you’re not allowed to see a specific area, just make sure you understand the reason.
[If time allows, solicit anecdotes from the class about opening meetings]
#87:Now on to assessing the laboratory’s Quality System
#88:If employees are determined to perform improper practices, they’re going to be able to do it, at least for a while; however, an effective quality system will minimize the scope and impact through early detection and correction.
On the other hand, an ineffective Quality System can allow improper practices to occur and go unnoticed.
Specific system vulnerabilities can be used to focus the assessment on detecting improper practices.
Keep in mind that system vulnerabilities are indicators, not evidence, of improper practices.
In an assessment based on a quality system standard such as ISO/IEC 17025, however, a system vulnerability may also constitute a deficiency.
#89:Records to be reviewed during the Quality System assessment include:
Quality Manual
Resumes
Training records
Assessment and corrective action report
Inconsistencies, gaps, and missing records are the first indications of problems.
Interviews should include the following:
General Manager
Technical director(s) - the person who is day-to-day supervisor over data generation and reporting activities
QA Manager and QA staff
Laboratory Staff – those involved in work related to the scope of the assessment
The following slides are red flags that can occur during the quality system assessment.
#90:If the laboratory management’s commitment to data integrity isn’t readily apparent to you, as an assessor, then it’s not going to be apparent to the staff, either. Some red flags are:
[first bullet] The data integrity policy may be a stand-alone document, or included in the Quality Manual. In either case it should be readily available to employees.
[second bullet] Ask management to explain the process by which employees can report problems and seek advice. When you speak with analysts, later during the technical assessment, ask them if it works the way management says it should.
[third bullet] An emphasis on production can create serious pressures for analysts. Signs of a production focus include bonuses based on production.
#91:These are all signs of inadequate resources and poor workload management, leading to an overworked staff. Pressures resulting from too much work, too little time, and inadequate resources are a breeding ground for shortcuts.
[first bullet] Some of the questions you should ask:
Do resources match throughput?
Is there adequate backup, especially for high volume tests?
Do they subcontract work during times of high workload?
How do they ensure the quality of subcontracted work?
[second bullet] Evaluate procedures for accepting new work. How does customer service coordinate with technical and QA staff to verify the staff has the time and resources to complete the work, on time, according to specifications?
[third and fourth bullets] Excessive overtime and bottleneck departments are signs of stressed resources; these pave the way for shortcuts to be taken.
#92:Evaluate resumes, especially those for key positions, for technical depth and staff turnover.
Review training records. The records should answer the following:
Is training provided as part of new employee orientation?
Is training repeated at least annually?
Does the training program clearly describe both expected behavior and improper practices?
Is training documentation complete, or are there gaps?
#93:[first bullet] QA staff should have direct access to senior management, as well as stop-work authority.
[second bullet] Roles and responsibilities of key staff should be clear. There should be no ambiguities between the Quality Manual, organizational chart, and staff interview responses.
[third bullet] In many labs, especially small labs, QA staff may wear multiple hats. They may assist in fixing instruments, writing SOPs, or managing client correspondence. What you need to verify in this case is that the staff have enough time to carry out their QA responsibilities, and that they are free from conflicts that could compromise their objectivity when doing so. QA staff, for example, should be free from production pressures.
[fourth bullet] Look at the internal assessment process and records. Is it a haphazard box-checking exercise, or a well-documented, in-depth review, performed on a regular basis?
[fifth bullet] Internal assessment programs should include regular, “deep-level” data assessments performed on randomly selected data packages.
#94:Red flags in the area of preventive and corrective actions include the following: [discuss each bullet and add examples where appropriate]
Questions to ask yourself:
Are the files complete?
Do corrective action reports name the same kinds of problems over and over, or is management actively looking to see where vulnerabilities are, working to correct them, and seeking to prevent future problems?
#95:Red flags in the area of document control include: [discuss each bullet and add examples where appropriate]
Archiving procedures need to include a step to verify that records are complete and that they remain secure and retrievable for the specified retention period (agencies, programs or regulations may specify retention times for records).
#96:At the end of the quality system assessment, gather the rest of the assessment team and compare notes so that observations can be incorporated into the technical assessment.
If you have found vulnerabilities related to supervision, oversight, and management commitment to data integrity, then conduct impromptu interviews with additional managers and senior staff as appropriate.
#97:Now we’ll move on to the technical area assessment.
#98:For the purpose of a quick review, the technical assessment toolbox includes: [refer to slide] These are the tools we identified in the previous module. Here we begin to bring it all together. We will discuss the first 3 tools, and then introduce data assessment, the topic of the next module.
#99:Records subject to on-site review include: [refer to slide]
Review technical documents and records to:
Verify internal consistency
Verify compliance with program and/or project specific requirements
Follow-up on red flags and deficiencies identified during the pre-assessment review
Review record-keeping practices. Note where the records are kept, and note their condition. Logbooks and SOPs, in particular, should be kept where they are used.
Do staff have access to them?
Do they appear to be new and rarely used?
Were they generated just prior to the assessment?
Are they current?
Some other records that you may have access to are the laboratory’s health and safety plan and the waste disposal plan. Although these documents are not directly related to the quality system or technical proficiency, they are important indicators of management culture (concern for employees and the environment).
#100:This group of red flags in records indicates poor document control and inadequate oversight – system vulnerabilities that can open the door to quality problems as well as improper practices.
#101:Having the staff demonstrate the process while you watch, with the SOP in hand, is one of the best ways to evaluate proficiency.
[first bullet] When selecting a process to observe, keep in mind that:
your observation of the process shouldn’t unduly impact laboratory productivity
the process should be able to be observed in a reasonable amount of time; otherwise, select only a portion of the process to observe
the process should be representative of the work within the scope of the assessment
Make arrangements with the laboratory before going on site and include the process demonstration in your assessment plan agenda. Note that some circumstances, like unexpected findings once on-site, may provide the opportunity for an unscheduled process demonstration.
[second bullet] Prepare for the process demonstration by reviewing the SOP and comparing it to the referenced method. As an aid to evaluating the process, develop a checklist based on the SOP.
[third bullet] During the process demonstration, there is a lot going on. Although we are discussing process demonstration separately from interviewing, you will be doing both at the same time. Take your time.
Be aware that the laboratory QA manager or supervisor may want to be present during the process demonstration.
#102:
The first two flags [refer to first two bullets] can simply indicate quality problems, but these problems can compromise data integrity.
The next two, [refer to third and fourth bullets] open the door for serious mischief, such as file substitution, improper baseline manipulations, improper integrations, and QC misrepresentations
Besides the potential for serious errors and QC misrepresentation, these last four process demonstration red flags [refer to last four bullets], can also indicate fabrication.
[sixth bullet] A sample volume discrepancy is evident when there is too much sample left in the container to have allowed for the reported analysis, or when the volumes used in the analysis would render the results defective.
[Note to instructor: As time allows, you can solicit additional process demonstration red flags from the class or ask for specific examples of the above bullets.]
#103:As we discussed in Module 3, here are some basic interview techniques that also apply during the technical assessment. If you have found system vulnerabilities related to technical qualifications, proficiency, or training, then focus interviews on staff whose proficiency is in question. Include new hires as well as seasoned employees
[first bullet] First set the stage by introducing yourself and telling the interviewee the purpose of the interview.
[second bullet] Ask core questions to target any specific system vulnerabilities you have found.
“How long have you been conducting this analysis?”
“What are your other responsibilities in the laboratory?”
“What types of training have you taken?”
[third bullet] Be prepared with specific follow up questions and listen for answers that may point to other vulnerabilities.
#104:[first bullet] The assessor should interview the person actually doing the work. The presence of a manager to clarify or translate when the analyst is inarticulate or speaks little or no English may be helpful. However, it is a red flag if the supervisor detrimentally interferes with employees’ responses. The analyst or technician should be free to answer questions without any interference or coaching from the escort or manager. Depending on the nature of concerns you’ve uncovered so far, you may request to speak to the analyst without other laboratory representatives present. In this case, it’s a good idea to have another member of the assessment team present.
[second and third bullets] Use the core questions that we talked about in Module 3 to evaluate production pressures and the ability to process samples after hours.
Uncovering these red flags during interviews indicates a dysfunctional quality system.
[fourth bullet] Experienced laboratory managers and assessors report that most improper practices are reported by employees. Detecting and correcting problems in a timely manner requires a well-functioning mechanism for reporting problems without fear of retribution.
[fifth bullet] Analysts should know how data review and oversight work, and they should have confidence in these systems. Anything less is a serious weakness.
#106:This is a good time to begin putting the tools together. Let’s take a quick look at a case study.
[Note to instructor: The first case study is short (it should take about 10 minutes total), and it is a good one to conduct in the large group]
Take 5 minutes, read the case study, and jot down your answers.
[At the end of 5 minutes, advance to the next slide, ask for responses, and record these on flipcharts]
#107:[Show this slide while compiling feedback. Compare audience responses to instructor’s notes.]
[See additional course material for Case Study #1 and answers. Answers should be given as a handout after group discussion]
#108:Let’s break into groups and work on two more case studies.
[See additional course material for hand out case studies #2 and #3]
Let’s take 30 minutes, and then we’ll ask for a volunteer from each group to report back
[Review the case studies one at a time – questions are also on the following slides]
[See additional course material for Case Study answers. Answers should be given as a handout after group discussion]
#109:[Compile feedback, and compare to answer sheet]
#110:[Compile feedback, and compare to answer sheet]
#112:On to data assessment. As mentioned in the last module, data assessment is part of the technical assessment and a valuable tool for detecting improper practices.
#113:In this module we’ll talk about three types of data assessment tools as part of the data assessment process. Each of the tools provides an increasing level of scrutiny of the data. We’ll also talk about the types of red flags that can be uncovered during data assessment.
#114:[discuss each bullet and add examples where appropriate]
#115:Examples of assessment observations that would trigger the need to perform data assessment include the following: [refer to bullets]
#116:You pick the data packages, not the lab. You should select packages for ongoing or recently completed projects relevant to the scope of the assessment.
Although there are legitimate claims of client confidentiality, you should be allowed to see data packages that have been generated for your organization. One way to manage client confidentiality claims is to treat the data packages as Confidential Business Information (CBI), or ask for a “sanitized” data package. If you can’t get past claims of client confidentiality, then request the data packages for PT samples, recognizing that these show the laboratory on its best behavior.
Some states and federal assessors reserve the right to review any data package produced for their state or department/agency.
#117:The paper review can be done off-site, which will give you more time for a thorough review, and allow more ready access to other experts on your staff.
The paper review is generally limited to a review of the final laboratory reports, including quality control results, and the chains of custody; however, it can include anything generated on paper as part of the analytical and reporting process, including bench records, intermediate records, and correspondence. If the paper review is performed on-site, it can also include a review of log books.
The paper review provides a general indicator of data integrity. It: [refer to “Purpose” sub-bullets]
[refer to “Process” bullets] Look at the case narratives – are all QC exceptions explained?
Do reports and calibration records contain secondary review signatures and dates?
Look for significant discrepancies:
Are there differences in QC results between analysts?
Are there discrepancies between the CoC and reports?
Were samples or blanks diluted without legitimate explanations?
#118:Paper reviews can uncover a number of red flags, including: [refer to bullets]
[first bullet] An incomplete case narrative can be an attempt to hide QC problems.
[second bullet] This is the red-face test. Do the results make sense, given the source of the samples?
[third bullet] Is the lab reporting 100% recovery for polar constituents in aqueous matrices?
The paper review can also uncover [last two bullets]
#119:The source data review involves looking at the instruments, software, instrument printouts and, in the case of manual procedures, the log books and bench sheets. Source data review is especially effective when reviewing multiple, randomly selected data packages. It: [refer to “Purpose” sub-bullets]
In one example, a laboratory’s internal review of source data showed that analyst “A” was consistently able to produce quality control results for matrix spikes that met the acceptance criteria for the time period under review. Analyst “A” trained Analyst “B”, who consistently generated QC failures.
Subsequent investigation focusing on Analyst “B’s” proficiency showed that Analyst “B” was performing the procedures correctly, but the laboratory was dealing with difficult matrices at the time. Analyst “A”, however, was adding extra spike when no one was looking.
[first Process sub-bullet] Assessors need to look at the electronic data systems, especially self-written programs, to check for problems with rounding, truncating, etc. Look at the electronic data WITH the analyst present. (A simple illustration of how truncation can bias results follows these notes.)
[second sub-bullet] Have the analyst retrieve the data while you observe. Ask them to explain irregularities. BUT, if you see 15 manual integrations on the same sample, then copy the data and take it back to the office to work on.
[third sub-bullet] In microbiology, for example, look at the logbooks and bench sheets to check hold times, incubation times, media preparation, and standards preparation.
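As a purely hypothetical illustration (not from the course materials, and with invented readings and precision), the short Python sketch below shows how truncating intermediate values in a self-written reporting program, instead of rounding them, introduces a systematic low bias that may only be caught by examining the electronic data system itself.

```python
# Hypothetical illustration only -- the readings and reporting precision are
# invented; the point is the systematic low bias introduced by truncation.
import math

def truncate(value: float, decimals: int) -> float:
    """Drop digits beyond the requested precision instead of rounding."""
    factor = 10 ** decimals
    return math.trunc(value * factor) / factor

raw_results = [1.2345, 1.2367, 1.2399, 1.2351]  # made-up instrument readings

rounded = [round(x, 2) for x in raw_results]
truncated = [truncate(x, 2) for x in raw_results]

# Truncation always shifts values toward zero, so the reported mean drifts low;
# the same silent bias can accumulate in a home-grown data reduction script.
print("mean of rounded results:  ", sum(rounded) / len(rounded))
print("mean of truncated results:", sum(truncated) / len(truncated))
```

The same kind of silent error can hide in spreadsheets and self-written programs, which is one reason to trace the calculations with the analyst present rather than relying on the printed report alone.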
#120:A review of source data can reveal additional red flags such as ---[refer to bullets]
[fourth bullet] Note that to detect “reappearing QC results” you may have to look at a lot of data, which you may not have time to do.
[last bullet] Laboratory instrument and LIM systems have standard report formats. Any report that does not match the specified format, or that differs from other reports in the data package, requires additional scrutiny. An example is screen-printed reports (reports printed by screen capture), which can be detected by the absence of headers or footers that are part of the normal report format.
#121:Step 4 involves reconstruction of data from sample receipt to report publication. Data reconstruction: [refer to “Purpose” sub-bullets]
Data reconstruction is the most rigorous data assessment tool, but according to experienced assessors, if you’re going to uncover an improper practice during an assessment, this is the tool most likely to do it. It’s the best chance to independently verify data integrity. Data reconstruction requires access to original raw data.
[refer to “Process” sub-bullets] Some assessors prefer to “deconstruct” the data; that is, to begin with the data package and work your way back to the sample and chain of custody. This way you can use data package red flags to focus the “deconstruction”.
#122:Reconstructing the data can show problems that might not be uncovered any other way. Data reconstruction is often the final step that uncovers the improper practice.
Examples of red flags that can be found during data reconstruction include: [refer to bullets]
[first bullet] Routine dead time refers to gaps in time between analytical batches or between samples. (A small illustrative timing check follows these notes.)
[last bullet] For example, if all of the samples in a batch have the same complex background and the matrix spike has a background that looks very different.
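As a hypothetical sketch only (the log format, run times, and two-hour threshold are assumptions made for this illustration, not course material), the Python fragment below shows the kind of simple timing check an assessor might run against a copied run log during data reconstruction: flagging runs that overlap the previous run (physically impossible timing) and long stretches of unexplained dead time.

```python
# Hypothetical sketch only -- the log format, run times, and the two-hour
# "dead time" threshold are assumptions made for this illustration.
from datetime import datetime, timedelta

# (injection start time, run time in minutes) from an invented run log
run_log = [
    ("2023-01-10 08:00", 30),
    ("2023-01-10 08:25", 30),   # starts before the previous run could finish
    ("2023-01-10 13:00", 30),   # long unexplained gap between batches
]

DEAD_TIME_LIMIT = timedelta(hours=2)

previous_end = None
for start_text, run_minutes in run_log:
    start = datetime.strptime(start_text, "%Y-%m-%d %H:%M")
    end = start + timedelta(minutes=run_minutes)
    if previous_end is not None:
        if start < previous_end:
            print(f"{start_text}: overlaps the previous run -- physically impossible timing")
        elif start - previous_end > DEAD_TIME_LIMIT:
            print(f"{start_text}: {start - previous_end} of unexplained dead time")
    previous_end = end
```

Either finding is an indicator to pursue with the analyst and the records, not evidence of an improper practice by itself.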
#123:[refer to slide] What can the records tell us? [discuss each bullet and add examples where appropriate]
#124:[discuss each bullet and add examples where appropriate]
#125:[discuss each bullet and add examples where appropriate]
#126:There are a few important things to keep in mind:
[first bullet] Be flexible – your decision to use data assessment, as well as your choice of tools, may change during the assessment
[second bullet] Bring in an expert for that portion of the on-site assessment, or ask the analyst to access the system and retrieve the data while you observe.
[third bullet] If you think you’re on to something, if you want to spend more time on the process, or if you’re outside your comfort zone, then copy the records, get some help, and complete the data assessment off-site.
#128:Let’s take a quick look at a case study. Take a few minutes, read the case study, and jot down your answers. [Case study questions are on the following slide.]
[Note to instructor: The following case studies can be done in small groups or as a large group discussion]
[Case study text and answer sheets are available in the additional course materials. Answer hand out should be provided after discussion].
#131:While this may seem to be Assessor 101 material, inadequate documentation and follow-up have allowed many cases of improper practices to go unchecked for long periods of time.
#132:When collecting objective evidence, follow a “graded approach”. The level of documentation needed depends on the purpose and scope of the assessment. If in doubt, write it down, or make the copy. There’s rarely a problem due to “too much information.”
Ensure collection of all the objective evidence necessary to document assessment findings and prepare the assessment report. Have a contingency plan in case the laboratory’s “copier is out of order”. (Some assessors take along a small portable copier, hand-held scanner, or digital camera for back-up.) Sometimes just letting the laboratory know what resources you have can magically fix a broken copier!
In some cases it may be necessary or advisable to complete the review of laboratory records off-site. An example is the case where electronic records show potential improper practices (e.g. consecutive chromatograms that show baseline irregularities, or improper peak integrations), and the evaluation is going to require more time or expertise than your team has on-site.
#133:When recording observations, particularly those that indicate a vulnerability or suggest the potential for an improper practice, record statements made as accurately as possible, and make notations to indicate what objective evidence supports the observation.
Document observations objectively, and avoid recording suspicions or conclusions (even positive ones; e.g. “everything is under control”), as your assessment notebook could become “discoverable” in the event of a subsequent investigation.
#134:Some experienced assessors recommend that nothing should be written on the face of the copy. In this case, make one-sided copies and make any notations on the back of the copy. Some assessors use a rubber stamp with spaces for the appropriate notation.
In some programs, assessors make a second copy of objective evidence for the laboratory, so the laboratory has a record of the information in your possession. Some laboratories will allow the assessor to sign and date the original to show what was copied.
Your assessment program should have procedures for handling exhibits and record-keeping requirements and you should be thoroughly familiar with them. This gets into legal issues, which are program-specific.
[As time allows, solicit from the class examples of how their programs handle taking copies of records or data from the lab]
#135:Daily team meetings during the assessment are advisable; they are especially important if you suspect improper practices.
A final team meeting should be held to review plans for the exit brief.
During each meeting, compare notes for completeness. Group and discuss your findings in general categories of observations, deficiencies, and improper practices.
Remember that deficiencies need to be described in terms of the relevant requirement. In most, if not all, cases, improper practices will also be deficiencies. In your notebook, describe the deficiency, cite the requirement, and make a note of any objective evidence or further interviews needed to support the deficiency.
#136:[first sub-bullet] Determine who should be present. This generally includes all members of the assessment team, the laboratory’s technical director and QA manager. Department or section managers may also attend.
[second sub-bullet] Summarize observations and deficiencies. Briefly discuss the objective evidence, but not your suspicions or judgments of fault, or whether actions were deliberate. Give management a chance to respond. Some apparent deficiencies can be cleared up immediately, but in any case, it gives management a chance to get the corrective action process rolling. This can save you both a lot of time.
[third sub-bullet] The schedule is usually dictated by program requirements – there are no hard and fast rules here.
If you have any suspicions about misconduct, consult your management prior to the exit briefing.
#137:There are a couple of important exceptions. In the first, [refer to first bullet]
You might not have the time or expertise to review all the records necessary to expose an improper practice or support a deficiency finding. In any case, the exit debrief is the place to discuss objective facts, not suspicions. Describe your findings in general terms. You could say, for example: “We have observed some calibration irregularities, and we need more time to evaluate the records.”
If you suspect misconduct: then [refer to second set of sub-bullets]
You could say, for example: “We’re concluding the on-site assessment and making copies of a few records for further review. We need more time to evaluate them before we can discuss our findings.”
The EPA Office of Inspector General encourages anyone who suspects fraud to contact the hotline (http://www.epa.gov/oigearth)
#138:In general, these are the steps that can be involved for follow-up and closeout, depending on whether a follow-up assessment is needed.
#139:By making sure the assessment file is complete, you can check to make sure all appropriate follow-up is complete. The final assessment file should contain the following information [refer to bullets].
Note that the purpose of the assessment should be documented (e.g., accreditation/certification, for cause, project-specific, contract compliance, etc.).
#142:Let’s apply some of the information we have learned. Take a few minutes, read the case study, and jot down your answers. [Case study questions are on the following slide.]
[Note to instructor: The following case studies can be done in small groups or as a large group discussion]
[Case study text and answer sheets are available in the additional course materials. Answer hand out should be provided after discussion].
#144:[Length – 75 minutes (10:15 am – 11:30 am, including a 15-minute break)]
[Format – Individual or small group case studies]
[Ask participants to spend 30 - 45 minutes answering questions from the case studies. Reconvene the large group, and review questions and answers against the instructor’s notes]