U.S. Environmental Protection Agency
U.S. EPA Office of Research and Development

ETV Quality & Management Plan

Environmental Technology Verification Program


EPA Report No: EPA/600/R-98/064

May 1998

ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM
QUALITY AND MANAGEMENT PLAN FOR THE PILOT PERIOD (1995-2000)  
National Risk Management Research Laboratory
National Exposure Research Laboratory
Office of Research and Development
U.S. Environmental Protection Agency
Cincinnati, Ohio 45268

APPROVED BY

E. Timothy Oppelt, Director, National Risk Management Research Laboratory
Robert M. Clark, Ph.D., Director, NRMRL Water Supply and Water Resources Division
Gary J. Foley, Ph.D., Director, National Exposure Research Laboratory
Subhas K. Sikdar, Ph.D., Director, NRMRL Sustainable Technology Division
Penelope Hansen, Coordinator, Environmental Technology Verification Program
Larry T. Cupitt, Ph.D., Director, NERL Human Exposure and Atmospheric Sciences Division
Frank T. Princiotta, Director, NRMRL Air Pollution Prevention and Control Division
Jay J. Messer, Ph.D., Acting Director, NERL Environmental Sciences Division

ACKNOWLEDGMENTS

The first draft of this document was developed by a team of writers consisting of the following quality assurance staff members from the U.S. Environmental Protection Agency Office of Research and Development National Risk Management Research Laboratory and National Exposure Research Laboratory: Sam Hayes, Lora Johnson, Ann Kern, and Jeff Worthington.

Subsequent revisions included significant input in the form of comments from members of the Environmental Technology Verification Team. Verification partners also provided comments. The team of writers was joined by the following EPA staff in developing this final document: Nancy Adams, Penelope Hansen, Linda Porter, and Shirley Wasson.

The HTML version of this document was prepared in July 1998 by Jeremiah McBurrows, an apprentice in the Minority Research Apprenticeship Program of the U.S. Environmental Protection Agency and the University of Cincinnati.


TABLE OF CONTENTS

DOCUMENTS AND GENERAL TERMS

ABBREVIATIONS AND ACRONYMS

INTRODUCTION

PART A

MANAGEMENT SYSTEMS

1.0 MANAGEMENT AND ORGANIZATION

1.1 ETV quality policy

1.2 Organization structure

1.3 ETV customer identification and ETV customer needs/expectations/work objectives

1.4 Potential verification partners

1.5 Management resolution for verification partner quality constraints

1.6 Resources

1.7 Authority to stop work for safety and quality considerations

2.0 QUALITY SYSTEM AND DESCRIPTION

2.1 Authorities and conformance to E4 quality standard

2.2 Quality system documents

2.3 Quality system scope

2.4 Quality expectation for products and services

2.5 Quality procedures documentation

2.6 Quality controls

2.7 Management system reviews (MSRs)

3.0 PERSONNEL QUALIFICATION AND TRAINING

3.1 Personnel training and qualification procedures

3.2 Formal qualifications and certifications

3.3 Technical management and training

3.4 Retraining

3.5 Personnel job proficiency

4.0 ETV VERIFICATION PARTNER SELECTION

4.1 Planning and control of selection process

4.2 Technical and quality requirements

4.3 Quality specification/conformance

4.4 Peer review of assistance agreements

4.5 Conformance of verification testing efforts

5.0 RECORDS

5.1 Scope

5.2 Preparation, review, approval, and distribution

5.3 Records storage and obsolete records

6.0 COMPUTER HARDWARE AND SOFTWARE

6.1 General procedures

6.2 Scope of ETV computer hardware/software procedures

6.3 Configuration testing

6.4 Measurement and testing equipment configurations

6.5 Change assessments - configurations, components, and requirements

6.6 ETV website roles and responsibilities

7.0 PLANNING

7.1 Systematic planning process

7.2 Planning document review

8.0 IMPLEMENTATION OF WORK PROCESSES

8.1 Implementation

8.2 Procedures

8.3 Oversight

9.0 ASSESSMENT AND RESPONSE

9.1 Numbers and types of assessments

9.2 Procedures

9.3 Personnel qualifications, responsibility, and authority

9.4 Response

10.0 QUALITY IMPROVEMENT

10.1 Annual review for quality improvement

10.2 Detecting and correcting quality system problems

10.3 Cause and effect relationship

10.4 Root cause

10.5 Quality improvement action

PART B

COLLECTION AND EVALUATION OF ENVIRONMENTAL DATA

1.0 PLANNING AND SCOPING

1.1 Systematic planning of the verification test

1.2 Systematic planning for verification testing

2.0 DESIGN OF TECHNOLOGY VERIFICATION TESTS

2.1 Design process

3.0 IMPLEMENTATION OF PLANNED OPERATIONS

3.1 Implementation of planning

3.2 Services and items

3.3 Field and laboratory samples

3.4 Data and information management

4.0 ASSESSMENT AND RESPONSE

4.1 Assessment types

4.2 Assessment frequency

4.3 Response to assessment

5.0 ASSESSMENT AND VERIFICATION OF DATA USABILITY

5.1 Data validation

5.2 Existing data

5.3 Reports reviewed

REFERENCES

APPENDIX A

APPENDIX B

APPENDIX C


DOCUMENTS AND GENERAL TERMS

Annual progress report
The report developed annually to document implementation of the ETV program.

Directors of Quality Assurance
Quality assurance directors for the EPA ORD laboratories, the National Exposure Research Laboratory and the National Risk Management Research Laboratory.

E4
The ANSI/ASQC national consensus standard, adopted as the Agency standard applicable to assistance agreements. The standard is entitled E4-1994, Specifications and Guidance for Quality Systems for Environmental Data Collection and Environmental Technology Programs.

ETV assistance agreement
The contractual record developed by the EPA and signed by the verification partner.

ETV coordinator
The EPA person designated by EPA ORD to lead the ETV team.

ETV verification statement
A summary statement, developed by the verification partner and approved by the EPA pilot manager, which reports individual technology performance.

ETV verification report
The report of the result of an individual technology test.

EPA line management
The management structure to whom each EPA pilot manager reports; i.e. branch chief, division director, laboratory director.

EPA pilot manager
The EPA person designated by EPA line management to serve as the lead for an individual ETV pilot.

EPA pilot quality managers
The EPA quality assurance person designated by EPA line management to manage quality assurance efforts on behalf of the pilot manager.

EPA review/audit reports
The “quality records” developed by EPA as a result of conducting assessments during ETV implementation.

ETV team
EPA employees actively working on the ETV program; the ETV coordinator, pilot managers, and the directors of quality assurance are core members.

ETV test objective
The stated objective(s) of each technology test. Verification partners use the data quality objectives (DQO) process to establish test objectives and test measurement quality criteria.

ETV webmaster
The person designated by EPA line management with responsibility for establishing and maintaining the ETV website.

Evaluation contractor
The contractor selected to collect information on pilot performance.

Generic verification protocol
Those protocols developed, modified, or selected to promote uniform testing for a single pilot operated by the verification partner. Adequate documentation of a robust protocol may allow the development of abbreviated individual test/QA plans which incorporate the generic verification protocol by reference.

Laboratory director
The directors of EPA ORD laboratories, the National Exposure Research Laboratory and the National Risk Management Research Laboratory.

Management system review
The qualitative assessment of a data collection operation and/or organization(s) to evaluate whether the prevailing quality management structure, policies, practices, and procedures are adequate for ensuring that the type and quality of data needed are obtained.

Office of Research and Development Assistant Administrator
The administrative lead person directing the EPA’s Office of Research & Development.

Quality and Management Plan for the Pilot Period
The specific policies and procedures for managing quality related activities in the ETV program.

Raw data
All data and information recorded in support of analytical and process measurements made during planning, testing, and assessing environmental technology, including support records such as computer printouts, instrument run charts, standards preparation records, field log records, technology operation logs, and monitoring records. Raw data also encompasses the ETV test files (all records, including raw data) and the technical data and associated quality control data that support the data summarized and the conclusions made in each ETV verification report.

Records
All books, papers, maps, photographs, machine readable materials, or other documentary materials, regardless of physical form or characteristics, made or received by the EPA or a verification partner or their designated representative for the ETV program.

Stakeholder groups
Groups set up for each pilot area consisting of representatives of any or all of the following verification customer groups: buyers and users of technology, developers and vendors, the consulting engineering community, the finance and export communities, and government permitters and regulators.

Standard operating procedures
Procedures describing routine verification activities, including sample collection, analytical testing, and associated verification processes.

Test/QA plan
The plan developed by a verification partner for each individual test of a technology or technology class. Therefore, the test/QA plan may include more than one technology. The test/QA plan provides the experimental approach with clearly stated test objectives and associated quality objectives for the related measurements. The test/QA plan may incorporate or reference generic verification protocols.

Test measurement
Those critical measurements that must be made during the course of a technology test to evaluate achievement of the ETV test objective.

Verification
Establishing or proving the truth of the performance of a technology under specific, predetermined criteria or protocols and adequate data quality assurance procedures.

Verification partners
The public and private sector organizations holding cooperative or interagency agreements to assist EPA in implementing the ETV program.

Verification partner manager
The person designated by the verification partner to manage the pilot and serve as the chief point of contact with the EPA.

Verification partner quality manager
The person designated by the verification partner to manage quality assurance for the pilot on behalf of the verification partner manager.

Verification partner quality management plan
The procedures for quality related activities developed and implemented by the verification partner to assure quality in the work processes and services developed for ETV. If the verification partner has a current quality system that accommodates ETV’s needs, additional quality system elements do not need to be developed.

VP review/audit reports
The “quality records” developed by the verification partner as a result of conducting assessments during ETV implementation.

Verification Strategy
The ETV Program Verification Strategy, first published in February 1997, outlines the pilot period goals, programmatic operating principles, pilot selection criteria, key definitions, budgets, and implementation activities that are molding the ETV program, as well as the challenges that are emerging and the decisions that need to be addressed in the future.


ABBREVIATIONS AND ACRONYMS


ADQ audits of data quality
ANSI American National Standards Institute
ASQ or ASQC American Society for Quality (formerly American Society for Quality Control)
AWBERC Andrew W. Breidenbach Environmental Research Center Building
CBD Commerce Business Daily
COAG cooperative agreements
ETV environmental technology verification
ETV QMP ETV quality and management plan
FTE full time equivalent
GAD Grants Administration Division
IAG interagency agreement
ISO International Organization for Standardization
MSR management systems review
NRMRL National Risk Management Research Laboratory
NERL National Exposure Research Laboratory 
ORD EPA's Office of Research and Development
OSHA Occupational Safety and Health Administration
PEA performance evaluation audits
QA quality assurance
QC quality control
RFA request for application
SOP standard operating procedure
SOW statement of work
TSA technical systems audit
VP verification partner
VP QMP verification partner quality management plan

INTRODUCTION

Background

The Environmental Technology Verification Program (ETV) has been established by the Environmental Protection Agency (EPA) to evaluate the performance characteristics of innovative environmental technologies across all media and to report this objective information to the permitters, buyers, and users of environmental technology. ETV has evolved in response to the following mandates:
 

  • A directive to EPA by the President in his 1995 environmental technology strategy, Bridge to a Sustainable Future, to "work with the private sector to establish a market-based verification process... which will be available nationally for all environmental technologies within three years."
  • Goals articulated in the Vice President's Reinventing Government: A Performance Review, which directed EPA to begin a comprehensive environmental technology verification program no later than October 1995.
  • Congressional appropriations language contained in the FY96 and FY97 budgets directing that the Agency fund technology verification activities at the $10 million level in each year.

 

To comply with these directives, EPA's Office of Research and Development (ORD) has established a five-year pilot program to evaluate alternative operating parameters and determine the overall feasibility of a technology verification program. ETV began in October 1995 and will be evaluated through October 2000, at which time the Agency will prepare a report to Congress containing the results of the pilot program and recommendations for its future operation.

Program Description
Developers of innovative environmental technology report numerous impediments to commercialization. Among those most frequently mentioned is the lack of acceptance of vendor performance claims. It is believed that objective, independently acquired, high quality performance data and operational information on new technologies will significantly facilitate the use, permitting, financing, export, purchase, and general marketplace acceptance of such technologies. The purpose of ETV is to provide such data and information to the customer groups that require them in order to accelerate the real world implementation of improved technology. Better technology will more thoroughly, rapidly, and efficiently protect the environment. It is important to stress that the product of ETV is high quality data and information, not technology approval or endorsement.

The thesis that independent performance verification more rapidly moves new technology into use will be tested by EPA's five-year pilot program. ETV funds and operates twelve pilot projects, each operated by a third-party organization under the auspices of EPA. These "partner organizations" include private sector testing, evaluation and research companies, state technology evaluation programs, federal laboratories, and industry associations. For the most part, each pilot is focused on a different environmental, industry, or technology sector (e.g., air pollution control technology, drinking water systems, field monitoring devices, industrial coatings products). By design, all pilots are operated in a somewhat different manner in order to test various methods for both technical and operational efficiency and effectiveness in verifying performance. Management techniques are in place to assure that constant evaluation of alternative methods occurs and results in continuous improvement of processes throughout the pilot period.

Because credible information is the ultimate product of ETV, the highest appropriate quality assurance procedures will be used throughout the program. The EPA's Office of Research and Development implements an Agency-wide quality system to assure that activities conducted in EPA research laboratories, other EPA research facilities or locations, or at facilities being operated on behalf of or in cooperation with the EPA are supported by data of known and acceptable quality for their intended use. Individual research laboratories develop laboratory-specific quality management plans. The National Risk Management Research Laboratory (NRMRL) and the National Exposure Research Laboratory (NERL) are implementing ETV in conformance with such plans.

Program and Quality Management Documents
Several documents define the overall operation of the program during its pilot stage. The first to be published (February 1997) was the Environmental Technology Verification Strategy. This document lays out goals, customer and key word definitions, basic operating principles, pilot selection criteria, and the programmatic and budgetary vision of the pilot program. The Strategy is intended to evolve over time and is now (May 1998) undergoing review to evaluate the need for modification and amplification.

The second major program management document being used by ETV to guide its operation is the ETV Quality and Management Plan (QMP), which follows this introduction. Under development for over a year, the ETV QMP uses the structure, policies, and standards of the American National Standard ANSI/ASQC E4-1994, "Specifications and Guidance for Quality Systems for Environmental Data Collection and Environmental Technology Programs." This document,

...describes a basic set of mandatory specifications and non-mandatory guidelines by which a quality system for programs involving environmental data collection and environmental technology can be planned, implemented and assessed*.

As of February 1996, all cooperative agreements entered into by EPA concerning environmental technology must be in conformance with the provisions of E4. This requirement is expected to be extended to EPA contracts in 1998#.

The ETV Quality and Management Plan
Based upon the structure and standards of E4, the ETV QMP lays out the definitions, procedures, processes, inter-organizational relationships, and outputs that will assure the quality of both the data and the programmatic elements of ETV during the pilot period. Part A of the ETV Quality and Management Plan contains the specifications and guidelines that are applicable to common or routine quality management functions and activities necessary to support the ETV program. Part B of the ETV Quality and Management Plan contains the specifications and guidelines that apply to test-specific environmental activities involving the generation, collection, analysis, evaluation, and reporting of test data.

EPA's verification program is organizationally complex, involving numerous outside organizations through its extensive stakeholder process, partner organizations who bear most of the quality assurance responsibilities, and testing and consulting companies hired by partner organizations to conduct field and laboratory work. Within EPA, the program is coordinated through ORD's ETV Team consisting of staff from ten Branches located in six Divisions of two Laboratories, NRMRL and NERL, along with quality assurance staff in each of the laboratories' physical locations. Finally, EPA program offices and regions are increasingly involved in outreach activities, as are states and other Federal agencies through the White House Environment and Technology Working Group. The ETV QMP is designed to play a major role in clearly delineating the roles and responsibilities of all of these diverse and important players.

All partner organizations will use this document and its parent, the E4 standard, to create quality management plans that assure appropriate levels of data collection, quality outputs, and customer responsiveness. These plans will be submitted to EPA for review and approval by pilot managers and quality assurance staff. It is not the purpose of this document to require that partner organizations create wholly new operating procedures solely for use under ETV. Most organizations selected by EPA as cooperators already have many of the procedural and process elements required by E4 incorporated into their existing management systems. Other requirements found in this document will be new or different. Cooperators should address all appropriate elements of the ETV QMP either specifically in their ETV plan or include appropriate and adequately detailed references to existing documents.

The ETV QMP will be reviewed on an annual basis throughout the pilot period (and beyond if the program is extended) to incorporate lessons learned from the experiences of the pilots and feedback from customer groups. The addition of new policies and elimination or modification of ineffective procedures will be discussed with all participants, and modifications to partner QMPs may be required.

The ETV QMP follows the general outline of the ANSI/ASQC E4-1994 document.

____________________

*American Society for Quality; American National Standard: Specifications and Guidance for Quality Systems for Environmental Data Collection and Environmental Technology Programs (ANSI/ASQC E4-1994); Milwaukee, Wisconsin; 1994.

#All ETV pilots are executed under cooperative agreements, with the exception of the Site Characterization and Monitoring pilot and the Industrial Coatings and Coatings Equipment pilot, which utilize interagency agreements with the Departments of Energy and Defense.

____________________


PART A
MANAGEMENT SYSTEMS

Part A of the ETV Quality and Management Plan contains the specifications and guidelines that are applicable to common or routine quality management functions and activities necessary to support the ETV program.

Part B of the ETV Quality and Management Plan contains the specifications and guidelines that apply to test-specific environmental activities involving the generation, collection, analysis, evaluation, and reporting of test data.

1.0 MANAGEMENT AND ORGANIZATION

1.1 ETV quality policy

The Office of Research and Development shall establish and implement a quality policy to ensure that the Environmental Technology Verification (ETV) program produces the type and quality of program outputs needed and expected by ETV clients.

The EPA Office of Research and Development's (ORD) quality policy for the Environmental Technology Verification (ETV) program is established as follows:

    The quality system for the overall ETV program seeks to be consistent with industry consensus standards. Each verification partner shall implement a valid and approved quality system. As of February 1996, the Agency's required quality system for cooperative agreements is ANSI/ASQC E4. Each verification test will be performed according to planned, documented, and pre-approved test/QA plans. All technical statements in ETV verification reports shall be supported by the appropriate data.

 

1.2 Organization structure

The relevant organizations, functional responsibilities, levels of accountability and authority, and lines of communication shall be formally defined in the quality system and approved by the EPA laboratory directors responsible for the quality of work performed by or on behalf of each EPA laboratory.

The overall organizational structure of the ETV program graphically presents lines of accountability, authority, and communication. The general functional responsibilities for the major organizational units are specified in the structure.

[Figure: Organizational Structure of the ETV Program (organization chart)]

1.2.1 Assistant Administrator for ORD and the USEPA Administrator responsibilities:

  • provide overall program direction
  • serve in a program leadership role with Congress, other agencies of the Executive Branch, and the general public

1.2.2 ORD laboratory directors responsibilities:

  • approve and implement annual program budgets and resource allocations
  • allocate laboratory personnel and other resources to accomplish ETV’s goals, including appointment of the ETV coordinator
  • approve all ETV verification statements
  • review and approve ETV Verification Strategy and ETV Quality and Management Plan (QMP)
  • ensure that appropriate assessments (see Table 9.1) are implemented

1.2.3 Division directors and branch chiefs responsibilities:

  • allocate appropriate division and branch personnel and other necessary resources to support pilots located in the division/branch
  • provide oversight of pilot outputs and products prior to public release

1.2.4 ETV coordinator responsibilities:

  • leads the ETV team by providing communication opportunities, e.g., periodic conference calls, meetings, and training
  • coordinates the overall ETV program, including design of multi-year strategies, operating principles, implementation activities, and annual budgets
  • communicates ETV team and program activities, progress, outputs, and recommendations to EPA, Congress, agencies in the Executive Branch, customer groups, and the general public
  • maintains an up-to-date ETV website containing materials relevant to the program and to each pilot
  • manages overall ETV program outreach activities to ensure that customer groups are knowledgeable about the existence and use of ETV generated data
  • collects data on operational parameters and program outputs to continuously evaluate the ETV program and make recommendations to management and the Congress on its present and future operation.
  • reviews, approves, and assists in revision of ETV QMP
  • ensures ETV QMP is implemented in the ETV program

1.2.5 ETV team responsibilities:

  • establish mutually acceptable program-level strategies and protocols
  • participate in development of an overarching ETV outreach strategy
  • communicate pilot-specific progress, issues, difficulties, and lessons learned
  • meet to discuss program objectives, seek collegial guidance, and evaluate success
  • review ETV QMP

1.2.6 EPA pilot managers responsibilities:

  • select and oversee verification partners
  • communicate requirements for and oversee the verification partner’s quality system
  • arrange for peer review of verification partner proposals and ETV verification reports
  • attend and/or conduct regular meetings with stakeholders
  • oversee production and approval process of ETV verification reports and ETV verification statements
  • assist with ETV outreach activities for the assigned ETV pilot
  • participate in ETV team activities

1.2.7 Verification partners responsibilities:

  • establish, attend, and/or conduct meetings of stakeholder groups
  • maintain communication with EPA to assure mutual understanding and conformance with EPA quality procedures and expectations and ETV policies and procedures
  • manage the oversight and conduct of verification activities
  • assure that quality procedures are incorporated into all aspects of each ETV pilot
  • develop, conduct, and/or oversee test/QA plans in cooperation with technology vendors
  • solicit technology vendor proposals or vendor products
  • operate ETV activities within their documented and approved quality management plan
  • prepare ETV verification reports on technology tests
  • prepare a three-to-five page ETV verification statement at the completion of each technology verification
  • appoint a quality manager, responsible for ensuring that the VP’s organization and suppliers/contractors to the VP organization have quality systems in compliance with this QMP, and that the VP complies with their documented quality system.

1.2.8 Stakeholder group responsibilities may include the following:

  • assist in development of generic verification protocols
  • assist in prioritizing the types of technologies to be verified
  • review pilot-specific procedures and selected ETV verification reports emerging from the ETV pilot
  • assist in the definition and conduct of outreach activities appropriate to the technology area and customer groups
  • serve as information conduits to the particular constituencies that each member represents

1.2.9 ETV directors of quality assurance responsibilities:

  • develop and implement the ETV quality system at the direction of the ETV coordinator and in coordination with the ETV team
  • document the ETV quality system in the ETV QMP
  • review, and update, if necessary, the ETV QMP annually in cooperation with the ETV coordinator
  • work with ETV pilot quality managers to ensure implementation of ETV QMP
  • provide current copies of the ETV QMP to the appropriate participants in the ETV program
  • communicate quality issues and information to the ETV team in a timely manner
  • conduct internal management system reviews (MSR)

1.2.10 EPA pilot quality managers responsibilities:

  • communicate quality system requirements, quality procedures, and quality issues to the assigned EPA pilot manager and verification partner
  • review and comment on the verification partner's quality system description to verify conformance to the quality provisions of this document
  • perform MSRs of each pilot's quality system to verify conformance to the quality provisions of this document
  • perform technical systems audits (TSAs) and performance evaluation audits (PEAs) of pilots, as appropriate
  • provide assistance to pilot personnel in resolving quality assurance issues

Tables available on the ETV website present a current listing of the pilots that are either underway or soon to be awarded. The tables identify the EPA pilot managers, EPA pilot quality managers, verification partner managers, and verification partner quality managers, including their names, geographic locations, ORD laboratory or company affiliations, and phone numbers.

1.3 ETV customer identification and ETV customer needs/expectations/work objectives

The ETV coordinator, pilot managers, and partners are responsible for coordinating the identification of customers and communicating the needs of the internal and external customers to ensure that ETV work products satisfy their needs.

1.3.1 As identified in the ETV Verification Strategy, external customers (i.e., outside EPA) include, but are not limited to:

  • public and private sector buyers and users of technology
  • developers and vendors of technology
  • the consulting engineering community who recommend technologies to buyers
  • federal, state and local government permitting/regulatory agencies
  • international marketers and the financial community
  • Congress

In a general sense, needs and expectations of external customers include:

  • ETV verification reports and ETV verification statements supported by objective and reliable data, provided in a timely manner
  • a justifiable approach to selecting technologies for testing
  • a practical approach in testing which provides efficient, timely, and cost-effective technology tests
  • user-friendly documents (e.g., easy to read and to implement)
  • technology operation consistent with statements in the ETV verification report

For each pilot, needs and expectations of external customers are defined and documented in the minutes of stakeholders meetings. The process to define these pilot-specific needs and expectations includes:

  • discussions between the EPA pilot managers, verification partners and stakeholders
  • development by EPA pilot managers, verification partners and stakeholders of pilot objectives for use prior to testing

NOTE: Not all pilots are structured the same. For example, the independent pilot uses an advisory committee, an expert review, and an alliance group in determining pilot-specific needs and expectations.

1.3.2  Internal customers of the ETV program are those EPA staff responsible for execution of the ETV program in accordance with the expectations of Congress and the Administration. These customers include EPA and ORD senior managers who expect conformance with management and quality policies of the Agency.

Other EPA staff, such as EPA technical experts in the regions and headquarters, will benefit incidentally from the program in the following areas:

  • data of known and useful quality
  • expedited use of improved environmental technologies
  • ETV testing accomplished on a wide variety of technologies
  • user-friendly documents (e.g., easy to read and to implement)
  • development of appropriate verification testing protocols

1.4 Potential verification partners

EPA line management shall oversee the selection of ETV verification partners.

The ETV program seeks to evaluate a wide variety of verification partnerships, using both interagency and cooperative agreements. When cooperative agreement holders conduct the verification testing, they are competitively solicited using the Request for Application (RFA) process, whereby notice of EPA's intent to issue an RFA is published, typically in the Commerce Business Daily (CBD).

1.5 Management resolution for verification partner quality constraints

When necessary, appropriate EPA management shall negotiate acceptable measures of quality and success when constraints of time, costs, or other problems affect the verification partner's capability to fully satisfy customers' needs and expectations.

When constraints of time, costs, or other problems significantly affect the verification partner's capability to fully satisfy the ETV's quality system needs and expectations, EPA pilot managers negotiate with the verification partner by the following procedure to establish acceptable measures of quality and success:

  • If any one of the above problems occurs, the verification partner notifies the EPA pilot manager.
  • The EPA pilot manager negotiates and documents an acceptable agreement with the verification partner.
  • If agreement cannot be reached between the EPA pilot manager and the verification partner, the negotiation discussion will be elevated to the appropriate EPA branch chief, division director, and/or ETV coordinator.
  • If agreement cannot be reached between EPA management and the verification partner, appropriate actions will be performed (e.g., increased funding for the benefit of the program, non-funding of extensions, or non-support for verification statements).
  • If negotiation of issues reaches a stalemate, the Grants Administration Division (GAD) will be notified for possible legal action.

1.6 Resources

The laboratory directors shall provide adequate resources to the ETV directors of quality assurance, EPA pilot managers and EPA pilot quality managers to enable them to plan, implement, assess, and improve the overall ETV program and quality system effectively.

Laboratory directors take the following actions to achieve the above policy:

  • provide FTE allotment of EPA pilot managers
  • provide FTE allotment of QA and other support personnel at each laboratory’s geographical location
  • provide sufficient travel funds for each pilot for an appropriate level of oversight and external assessments
  • provide for maintenance of communication lines between ORD laboratory directors, the ETV team, and the ETV coordinator

1.7 Authority to stop work for safety and quality considerations

The verification partner shall stop unsafe work and work of inadequate quality, or shall delegate the authority to do so to others.

The following procedures are necessary to stop unsafe work and work of inadequate quality:

  • The verification partners shall ensure compliance with all federal, state, and local health and safety policies during the performance of the pilot tests. This includes obtaining appropriate permits.
  • The verification partner's quality system shall identify one or more individuals who may issue a stop work order in the event that unsafe work or work of inadequate quality is identified.
  • EPA pilot managers and EPA pilot quality managers shall contact the authorized individual(s) in the event that work of inadequate quality is discovered.
  • In extreme circumstances, the EPA pilot managers may ask GAD to intervene if the verification partner does not implement their approved quality management plan.

2.0 QUALITY SYSTEM AND DESCRIPTION

A quality system shall be planned, established, documented, implemented, and assessed as an integral part of an ETV management system for environmental technology verification programs defined by ETV quality policy.

Development and subsequent endorsement of this plan by the ETV coordinator and EPA line management are evidence that the ETV quality system is planned, established, documented, implemented, and assessed as an integral part of an EPA ETV management system.

 

2.1 Authorities and conformance to E4 quality standard

The ETV quality system shall address applicable parts of E4 and shall include the organizational structure, policies and procedures, responsibilities, authorities, resources, and guidance documents.

The authority for developing appropriate quality systems for ETV is USEPA Order 5360.1. The requirement for assistance agreement holders is found in the Federal Register, CFR Parts 30 & 33, February 15, 1996.

This plan complies with ANSI/ASQC E4-1994, Specifications and Guidance for Quality Systems for Environmental Data Collection and Environmental Technology Programs, the Agency standard applicable to assistance agreements. E4 is comparable to the International Organization for Standardization (ISO) 9000 standards series, as shown in the comparison table provided in Annex B-5 to E4.

The ETV quality system addresses each applicable individual “specification” provided in the published quality standard, ANSI/ASQC E4-1994, using the policies and procedures in this plan, as appropriate.

Verification partners develop quality system descriptions consistent with ANSI/ASQC E4-1994 (and/or ISO 9001) and with this document.

 

2.2 Quality system documents

The ETV quality system shall be described in a QMP that is reviewed and approved by the ETV coordinator and EPA line management.

The ETV quality system is described in this quality and management plan.

  • The ETV team develops and implements the quality system.
  • The ETV coordinator, ORD laboratory directors, and appropriate ORD division directors review and approve the ETV QMP and subsequent revisions to the plan, as policy for the ETV program.
  • Verification partners' quality systems (which are consistent with ANSI/ASQC E4-1994) are described in a written ETV pilot quality management plan, and are reviewed and approved by verification partner management, the EPA pilot manager, and EPA pilot quality manager. Subsequent revisions are reviewed in a similar manner.

2.3 Quality system scope

The ETV quality system description shall identify in general terms those items, programs, or activities to which it applies.

This quality system description applies to the following:

  • the EPA ETV program
  • selection and oversight of verification partners
  • review and approval of verification partners’ quality management plans
  • ETV products (e.g., test/QA plans, reports, ETV verification statements)
  • planning, implementation, and assessment activities supporting ETV verification activities

2.4 Quality expectation for products and services

The ETV quality system shall include provisions to ensure that products or results of the environmental programs defined by the ETV program are of the type and quality needed and expected by ETV clients.

The preeminent products of the ETV program are the environmental technology verification reports and statements issued by EPA and the verification partner. Provisions to ensure that these products and other results of the ETV program are of the quality expected include:

  • Products are reviewed as described in part A, section 5.0.
  • MSRs and technical assessments are conducted as described in part A, section 9.0. Technical assessments may include field and laboratory audits, performance evaluation audits, and audits of data quality.

2.5 Quality procedures documentation

Following approval of the ETV QMP, management elements of the quality system shall be implemented as described.

Verification partners must operate the ETV pilots under a written and EPA-approved quality management plan that is based on E4 and/or the provisions of this plan.

  • Environmental technology pilots initiated prior to approval of the ETV QMP will be requested to provide quality system elements consistent with requirements of their assistance agreements with EPA. The ETV coordinator requests that existing pilots comply with the requirements set forth in this plan within 6 months of the date of this document.
  • For pilots initiated after the approval of the ETV QMP, verification partners provide evidence of compliance before verification activities are started, as required by the assistance agreement signed by the verification partner.

The EPA pilot manager is responsible for obtaining a copy of the verification partner's quality management plan, as specified in the RFA, for their own review and forwarding the document to the EPA pilot quality manager for review and approval prior to planning technology tests.

2.6 Quality controls

The ETV quality system description shall define when and how controls are to be applied to specific technical or technology testing efforts and shall outline how these efforts are planned, implemented, and assessed.

2.6.1 ETV program controls include:

  • existing EPA policies and procedures for selection and administration of verification partner efforts in ETV
  • an approved ETV QMP
  • quality management, assurance, and control procedures as part of the assistance agreement

2.6.2 Pilot-specific controls include:

  • test/QA plans, developed and approved prior to testing
  • oversight by the EPA pilot quality managers of the implementation process and follow-up to any finding of nonconformance
  • technical operations assessment
  • specified quality control requirements in the assistance agreement

Pilot-specific procedures for planning, implementation, and assessment are described in the verification partner's quality system. Procedures for planning, implementing, and assessing the overall ETV quality system are detailed in part A sections 7.0, 8.0, and 9.0 and in part B.

2.7 Management system reviews (MSRs)

At regular intervals (at least annually) the ETV quality system shall be reviewed and its description updated, if appropriate, to reflect changes in the organization as well as changes in ETV quality policy.

The ETV directors of quality assurance perform an internal MSR of the program in accordance with the process as outlined in part A section 9.0.

3.0 PERSONNEL QUALIFICATION AND TRAINING

3.1 Personnel training and qualification procedures

Personnel performing work shall be trained and qualified based on appropriate requirements prior to the start of the work or activity.

3.1.1 EPA pilot managers are selected based on:

  • educational background and/or a degree that is directly relevant to the pilot technology area
  • work experience specific to the pilot technology area
  • experience in program management
  • participation in required training for project officer responsibilities on assistance agreements, as documented in training records

3.1.2 EPA pilot quality managers are selected based on:

  • educational background and/or a degree relevant to the technology tests and programs
  • work experience specific to QA of technology tests and programs
  • experience in quality management

3.1.3 Key participants working directly for or on behalf of the verification partner in support of the pilot and/or individual test operations are selected by the verification partner and evaluated by the EPA during the RFA process. RFA evaluation criteria for key personnel will vary, but typically include a consideration of the following:

  • educational background and/or a degree(s) relevant to technical areas represented in the pilot
  • work experience related to the technology areas represented in the pilot
  • experience in quality management

The verification partner's documented quality management plan will address training and qualification procedures for pilot personnel.

3.2 Formal qualifications and certifications

The need to require formal qualification or certification of personnel performing certain specialized activities shall be evaluated and implemented where necessary.

ETV program management, quality management, and pilot management require no formal qualification or certification other than, where applicable:

  • EPA Project Officer Training and Assistance Training (or Work Assignment Manager training as appropriate)
  • Appropriate OSHA courses

Formal qualification or certification of personnel performing specialized activities for each pilot or for specific test/QA plans is addressed on a pilot-specific or test/QA plan-specific basis. Verification partners maintain records of the qualification or certification of such personnel.

NOTE: Requirements for formal qualifications or certification may be based on applicable federal, state, or local requirements associated with a particular test. Examples of possible certifications include but are not limited to drinking water plant operators certification, professional engineering registration, and certification of industrial hygienists.

3.3 Technical management and training

Appropriate technical and management training, which may include classroom and on-the-job training, shall be performed and documented.

EPA line management is responsible for appropriate technical and management training for staff working on the ETV program. Such training will be documented in each individual's training file.

Verification partners are responsible for personnel training and qualification procedures for each pilot or for specific test/QA plans. Verification partners maintain the training records (available for review by EPA).

The ETV team will be trained at meetings, occurring at least twice a year, to develop policy and to share information and lessons learned. The directors of quality assurance provide training on the requirements of the ETV QMP during the periodic workshops organized by the ETV coordinator.

3.4 Retraining

When job requirements change, the need for retraining to ensure continued satisfactory job proficiency shall be evaluated.

The need for retraining EPA staff is evaluated on an annual basis by the appropriate line management.

Verification partners are responsible for retraining for each pilot or for specific test/QA plans. 

3.5 Personnel job proficiency

Evidence of personnel job proficiency shall be documented and maintained for the duration of the technology test or activity affected, or longer if required.

3.5.1 EPA pilot managers - The existing performance standards of the EPA pilot managers may already include tasks consistent with the following items. These items should be considered for specific identification in the performance standards:

  • active participation in the ETV team; communicating pilot issues, lessons learned, required reports, and appropriate assistance to members of the ETV team and management
  • developing RFAs and/or management of cooperative agreements/IAGs
  • facilitating stakeholders group activities
  • ensuring development of and contributing to generic verification protocols and test/QA plans
  • providing a leadership role to ensure technologies are selected consistent with the ETV Verification Strategy
  • serving as a communication link between EPA and the verification partner, in particular, providing information and documents to support the ETV website
  • reviewing draft and final ETV verification reports and other pilot documents
  • reporting to program management the completeness and validity of the ETV verification statement prior to report issuance
  • ensuring the timely delivery of complete and consistent ETV products and services

NOTE: Evaluations are the responsibility of the appropriate supervisor and are not a record of the ETV program.

3.5.2 EPA pilot quality managers -The existing performance standards of the EPA pilot quality managers may already include tasks consistent with the following items. These items should be considered for specific identification in the performance standards:

  • verification partner pilot quality management plan review
  • management system reviews
  • technical system audits, performance evaluation audits, and audits of data quality
  • verification reports and statements reviews
  • providing complete and timely pilot audit reports

NOTE: Evaluations are the responsibility of the appropriate supervisor and are not a record of the ETV program.

3.5.3 Verification partner staff - Verification partners document and maintain records (such as annual performance reviews) of personnel job proficiency for work performed directly in support of the verification partner’s ETV activities.

NOTE: Evaluations are the responsibility of the verification partner and are not a record of the ETV program.

4.0 ETV VERIFICATION PARTNER SELECTION 

4.1 Planning and control of selection process

Funding of assistance associated with the ETV program shall be planned and controlled to ensure that the quality of verification tests is known, documented, and meets technical requirements and acceptance criteria of the clients.

The ETV program is designed to investigate ways to facilitate the verification and use of environmental technology and exists solely for the benefit of industry and the user community. Two pilots operate under interagency agreements; however, the ETV program primarily operates by securing the cooperation of verification partners through open competition utilizing the agency’s assistance agreement program. Agency procedures for advertising, reviewing, and awarding assistance agreements are followed during the selection process. The procedures governing this process are available from GAD and are not discussed in this section. The procedures used to plan and control the selection of verification partners for ETV are listed below.

Planning to select verification partners requires:

  • assessing and prioritizing environmental technology categories for use in pilot projects (i.e., defining the scope of each pilot project in terms of technology areas to be tested)
  • establishing ANSI/ASQC E4-1994 as an applicable quality standard
  • issuing RFA's
  • selecting the appropriate verification partner based on their experience and proficiency

Controlling the selection process to ensure the quality of verification tests includes:
  • implementing controls stipulated in EPA policies and procedures for assistance agreements
  • establishing specific language in each RFA requiring development and implementation of a quality system consistent with the ETV quality system and ANSI/ASQC E4-1994
  • reviewing the applicant’s proposed quality system to verify it meets the RFA requirement and provides for quality of verification tests which will be known, documented, and meet technical requirements

4.2 Technical and quality requirements

Assistance solicitation documents shall contain information clearly describing the technical and quality requirements associated with the verification testing.

Technical and quality requirements expressed in the RFA include technical evaluation criteria for the technical skills and experience of staff members and demonstrated experience in the development of quality systems relevant to ETV. Cooperative agreements require that a verification partner, once selected, develop and submit to EPA for approval a pilot quality management plan, consistent with E4, prior to conducting technical activities.

4.3 Quality specification/conformance

Assistance solicitation documents shall specify the ETV quality requirements for which the verification partner is responsible and how the verification partner’s conformance to client's requirements shall be verified.

ETV quality requirements for which the verification partner is responsible are specified in the RFA and in this ETV QMP. During verification partner selection, the applicants’ proposals and written responses to the requirements are reviewed for conformance to the RFA specifications. After a verification partner is selected, the EPA pilot quality manager reviews and the EPA pilot manager approves written quality system documents (e.g., pilot QMP) for conformance to the EPA and ETV quality policies and procedures.

4.4 Peer review of assistance agreements

Assistance award documents shall be reviewed for accuracy and completeness by qualified personnel prior to award.

Peer review is an integral part of EPA’s project planning, implementation, and assessment process. RFA packages are internally peer reviewed prior to their issuance. Responses to the RFA undergo a peer review process which supports the award of the assistance agreement.

4.5 Conformance of verification testing efforts

Appropriate measures shall be established to ensure that the verification testing efforts satisfy all terms and conditions of the assistance agreement. Verification partners shall have a demonstrated capability to meet all terms and conditions.

Once a verification partner has been selected, measures to ensure continued conformance to terms and conditions in the assistance agreement are implemented as described in part A, sections 8, 9, and 10.

5.0 RECORDS

5.1 Scope

Procedures shall be established, controlled, and maintained for identifying, preparing, reviewing, approving, revising, collecting, indexing, filing, storing, maintaining, retrieving, distributing, and disposing of pertinent quality documents and records. Such procedures shall be applicable to all forms of documents and records, including printed and electronic media. Measures shall be taken to ensure that users understand the documents to be used. Records requiring control shall be identified.

Records to which this policy applies include:

- ETV Verification Strategy

- ETV QMP (this document)

- cooperative agreement and interagency agreement records

- verification partners’ quality management plans

- minutes of stakeholder meetings

- generic verification protocols (how a given type of technology is verified)

- Test/QA plans (procedures for an individual test, including SOPs)

- raw data (all written and electronic data generated when tests are conducted)

- ETV verification reports (comprehensive reports on a technology verification project)

- ETV verification statements (summary statement for an individual technology test)

- annual pilot progress reports

- EPA review/audit reports

- verification partner review/audit reports

__________________________

Information in this section applies to both electronic and printed records, as well as original records developed on behalf of the ETV program that are required to demonstrate the quality of information and data provided in ETV verification reports.

TABLE 5.1 Records Management Scheme

ETV Verification Strategy
  Preparation/updating: ETV coordinator
  Review: ETV team; VP managers; laboratory directors
  Approval: ORD Deputy Assistant Administrator
  Finals distributed to: ETV Webmaster

ETV quality and management plan
  Preparation/updating: ETV directors of quality assurance
  Review: ETV team; VP managers; EPA pilot quality managers
  Approval: ETV coordinator; laboratory directors; division directors
  Finals distributed to: ETV Webmaster

COAG/IAG records
  Preparation/updating: EPA pilot manager; VP manager
  Review: N/A
  Approval: N/A
  Finals distributed to: N/A

VP quality management plan
  Preparation/updating: VP manager; VP quality manager
  Review: EPA pilot quality manager
  Approval: EPA pilot manager
  Finals distributed to: N/A

minutes of stakeholder meetings
  Preparation/updating: VP manager
  Review: EPA pilot manager; stakeholders
  Approval: N/A
  Finals distributed to: ETV Webmaster

generic verification protocol
  Preparation/updating: VP manager
  Review: EPA pilot quality manager; stakeholders
  Approval: EPA pilot manager
  Finals distributed to: ETV Webmaster (draft and final versions)

test/QA plan (including SOPs)
  Preparation/updating: VP manager
  Review: VP quality manager; EPA pilot quality manager
  Approval: EPA pilot manager; vendors
  Finals distributed to: stakeholders; ETV Webmaster; vendors

raw data
  Preparation/updating: VP manager
  Review: N/A
  Approval: N/A
  Finals distributed to: EPA can request copies

ETV verification report
  Preparation/updating: VP manager
  Review: EPA pilot quality manager; vendor
  Approval: EPA pilot manager
  Finals distributed to: ETV coordinator

ETV verification statement
  Preparation/updating: VP manager
  Review: EPA pilot manager; EPA pilot quality manager; vendor; ETV coordinator
  Approval: laboratory directors
  Finals distributed to: ETV Webmaster

annual ETV progress report
  Preparation/updating: evaluation contractor
  Review: ETV team; VP manager; stakeholders
  Approval: ETV coordinator
  Finals distributed to: laboratory directors

EPA review/audit reports
  Preparation/updating: EPA pilot quality manager; directors of quality assurance
  Review: EPA pilot manager; ETV coordinator
  Approval: N/A
  Finals distributed to: laboratory directors; VP manager; VP quality manager

VP review/audit reports
  Preparation/updating: VP quality manager
  Review: VP manager
  Approval: N/A
  Finals distributed to: EPA pilot manager; EPA pilot quality manager

5.2 Preparation, review, approval, and distribution

Sufficient records shall be specified, prepared, reviewed, authenticated, and maintained to reflect the achievement of the required quality for completed work and/or to fulfill any statutory requirements. Documents used to perform work shall be identified and kept current for use by personnel performing the work. Documents, including revisions, shall be reviewed by qualified personnel for conformance with technical requirements and quality system requirements and approved for release by authorized personnel.

Table 5.1 lists the pertinent quality records for ETV, the person(s) responsible for preparing and updating these records, the reviewers, those given approval authority for each record type, and the distribution plan. Where a procedure is not applicable (e.g., a document is not subject to approval), N/A is entered in Table 5.1. All reviewers and approving officials receive copies of the records they review/approve; the Distribution column in Table 5.1 lists only those individuals who receive final copies, in addition to the reviewers and approving officials. For revised documents, these same review, approval, and distribution pathways are followed. Unless otherwise noted, material placed on the ETV website is available for public inspection, comment, and use.

5.3 Records storage and obsolete records

Obsolete or superseded documents shall be identified and measures shall be taken to prevent their use, including removal from the work place and from the possession of users when practical. Maintenance of records shall include provisions for retention, protection, preservation, traceability, and retrievability. While in storage, records shall be protected from damage, loss, and deterioration. Retention times for records shall be determined based on contractual and statutory requirements or, if none are stated, as specified by the EPA coordinator and EPA line management.

Obsolete records should be clearly marked as such. These records may be retained in the workplace for historical reference, or they may be removed to archival storage. ETV will follow ORD's Records Management Policy, Part 003 (see Appendix A), which addresses requirements for indexing, filing, maintaining, retrieving, and disposing of documents and records from all extramural financial agreements. The current minimum requirement is that all records be kept for seven years after the final payment on a cooperative agreement or interagency agreement.

6.0 COMPUTER HARDWARE AND SOFTWARE

6.1 General procedures

Computer software and computer hardware configurations used in the ETV program shall be installed, tested, used, maintained, controlled, and documented to meet users' requirements and shall conform to this quality policy and applicable consensus standards and/or data management criteria.

At the program level, ETV does not expect to develop software. At the pilot level, if verification partners intend to develop software to support their ETV process (or an individual test/QA plan), the partner should have procedures in place as specified here. If the verification partner uses only commercial software for office operations (e.g., word processing software, spreadsheet software), it is unlikely that the partner would need specific procedures for assessing software quality. Part A, sections 6.2 through 6.6, apply only to software and software/hardware configurations developed specifically for the ETV program.

The following ETV program procedures ensure that each pilot controls the quality of all computer hardware/software configurations used in the program.

  • The EPA pilot manager and the verification partner discuss and agree upon the computer hardware and software requirements of the pilot and/or specific test/QA plan.
  • Once decisions are finalized, the verification partner supplies evidence of meeting all requirements before data collection, reduction, or validation procedures begin.
  • For software developed for ETV programs, the verification partner tests all applications and configurations using a test data set or by running a shakedown test of the system to ensure all applications/configurations are operating to specifications (a minimal sketch of such a shakedown test follows this list). The verification partner must show evidence of a system to maintain, control, and document such software and hardware configurations. This system includes, but is not limited to: resources to correct any hardware/software failure with minimal downtime to the program; tracking of upgrades, revisions, and configuration changes; documentation of software names, versions, and copyright dates; and complete documentation of the code. Complete documentation of code includes the written code, with comments, structured in modular form.
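To make the shakedown-test requirement concrete, the sketch below shows one way a pilot might exercise developed software against a test data set with hand-calculated expected results. It is a minimal illustration only: the data-reduction function, test values, and tolerance are hypothetical stand-ins for a pilot's actual application and its agreed-upon test data set.

```python
# Minimal sketch of a shakedown test for pilot-developed software.
# The reduction routine, test data, and expected value below are
# hypothetical; a real pilot would substitute its own application
# and the test data set agreed upon with the EPA pilot manager.
import unittest

def reduce_concentrations(raw_readings, blank):
    """Hypothetical data-reduction routine: blank-correct and average."""
    corrected = [reading - blank for reading in raw_readings]
    return sum(corrected) / len(corrected)

class ShakedownTest(unittest.TestCase):
    def test_known_data_set(self):
        # Test data set with a hand-calculated expected result (9.5).
        self.assertAlmostEqual(
            reduce_concentrations([10.2, 9.8, 10.0], blank=0.5), 9.5, places=6)

if __name__ == "__main__":
    unittest.main()  # the output forms part of the documented test record
```

Keeping such a test runnable under version control alongside the documented code supports the tracking of upgrades, revisions, and configuration changes required above.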

6.2 Scope of ETV computer hardware/software procedures

Computer software and computer hardware/software configurations covered by ETV’s quality policy include, but are not limited to:

  • operation or process control of environmental technology systems (including automated data acquisition and laboratory instrumentation)
  • data bases containing environmental data

Computer software and computer hardware/software configurations covered by this quality and management plan include all agreed-upon, pilot-specific applications or configurations. These include, but are not limited to:

  • evaluating and reducing environmental data
  • reporting environmental data
  • data bases containing environmental data

6.3 Configuration testing

Computer hardware/software configurations shall be tested prior to actual use and the results shall be documented and maintained.

On a pilot level, the verification partner conducts tests of the computer hardware/software configuration using a standard set of testing conditions.

NOTE: The verification partner is required to have a system to document all testing of computer hardware/software configurations, as required by part A section 6.1. A test data set or a standard set of testing conditions should be developed on a pilot- or test/QA plan-specific basis. Maintenance testing records should be easily tracked and retrievable.

6.4 Measurement and testing equipment configurations

Computer hardware/software configurations integral to measurement and testing equipment that are calibrated for a specific purpose do not require further testing unless:

  • the scope of the software usage changes OR
  • modifications are made to the hardware/software configuration

On a pilot level, verification partners perform the following procedures (as provided in the verification partner's quality system).

Whenever computer hardware/software configurations integral to measurement and testing equipment are calibrated for a specific purpose, further testing is not normally performed unless the scope of the software usage changes or modifications are made to the hardware/software configuration.

In the event that either of the above-mentioned changes occurs, the verification partner retests the configuration as described in part A sections 6.1 and 6.3. Retesting is documented to the same extent as the original application/configuration.
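The retest rule above reduces to two triggers. The sketch below simply encodes that rule; the flag names are hypothetical conveniences, not terms from the ETV program.

```python
# Sketch of the retest rule in part A section 6.4.
def retest_required(scope_of_use_changed: bool, configuration_modified: bool) -> bool:
    """A calibrated hardware/software configuration integral to measurement
    and testing equipment needs retesting only if the scope of software
    usage changes or the hardware/software configuration is modified."""
    return scope_of_use_changed or configuration_modified
```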

6.5 Change assessments - configurations, components, and requirements

Changes to hardware/software configurations, components, or program requirements shall be assessed to determine the impact of the change on the technical and quality objectives of the ETV program supported.

The verification partner is responsible for assessing the changes, determining the need for testing, and reporting the assessments to the pilot manager.

6.6 ETV website roles and responsibilities

The ETV website shall be operated in such a way that it serves all ETV participants and customers through prompt and accurate posting of ETV information and documents.

The pilot managers, or alternate(s) designated in writing by the pilot manager, are responsible for sending the following information to the ETV webmaster:

  • general fact sheets and brochures
  • stakeholders lists (and updates)
  • meeting announcements and summaries
  • generic testing protocols (indicating draft or final)
  • test/QA plans (indicating draft or final)
  • CBD announcements
  • ETV verification statements
  • upcoming meetings/speeches/announcements

7.0 PLANNING

7.1 Systematic planning process

A systematic planning process shall be established, implemented, controlled, and documented to:

  • identify the customer(s), and their needs and expectations
  • identify the technical and quality goals that meet the needs and expectations of the customer
  • translate the technical and quality goals into specifications that shall produce the desired result
  • consider any cost and schedule constraints within which technology test activities are required to be performed
  • identify acceptance criteria for the results or measures of performance by which the results shall be evaluated and customer satisfaction shall be determined

7.1.1 The systematic planning process established for ETV is conducted as follows:

  • EPA establishes the number and type of ETV pilots necessary to comply with the Presidential mandate to cover all environmental technologies within three years.
  • EPA lays out basic program operation parameters in two documents, the ETV Verification Strategy and ETV QMP.
  • Based upon the ETV Verification Strategy, the ETV coordinator, in consultation with the Agency's Innovative Technology Council, the ETV team, and EPA line management, designs an annual budget.
  • Appropriate personnel are appointed from within the ORD laboratories to fill ETV positions. Selection and qualification information is presented in part A sections 3.1 and 3.2.
  • The division and branch management provides resources and planning to support EPA staff assigned to the pilot, such as training and travel. Technical training is discussed in part A sections 3.3 and 3.4.
  • At the beginning of each pilot, verification partners are selected in conformance with part A section 4.0. EPA’s requirements for the appropriate extramural agreement (e.g., cooperative agreement, interagency agreement) must also be met.
  • After selection, the verification partner, in consultation with the EPA pilot manager, establishes a stakeholders group that contains representatives of customer groups of concern to that pilot area. [Note: in order to evaluate the efficacy of stakeholders groups, at least one pilot will not establish such a group.]
  • The EPA pilot manager and the verification partner develop plans for pilot verification tests and present them to the stakeholders for review, comment, and advice.
  • The EPA pilot manager, the EPA pilot quality manager, the verification partner, and the stakeholders group hold at least one joint meeting annually to:

- identify, revise, and/or clarify the technical and quality goals of the work to be accomplished

- translate the technical and quality goals into written specifications that will be used to produce the desired result

- consider any cost and schedule constraints within which test activities are required to be performed

- develop qualitative measures of performance by which the results will be accepted

- determine testing priorities and evaluate customer satisfaction

  • These four parties appoint a recorder to document the minutes of each meeting.
  • Minutes of each meeting are taken and distributed to participants for comment. Minutes of stakeholders meetings are incorporated within the records management scheme described in part A section 5.0.

7.1.2 Implementation of the systematic planning process

Planning is accomplished through frequent meetings among participants and through posting initial planning documents and stakeholders meeting minutes on the ETV website. Procedures for planning at the pilot and test level are addressed in Part B. Procedures for implementing the planning process are detailed below:

  • Customer Identification - ETV customers are identified in part A section 1.
  • Technical and quality goals identification - These are identified during planning meetings with senior management, conference calls with ETV participants, and meetings with EPA quality professionals and technical staff.
  • Technical and quality goal specifications - The ETV coordinator works with the EPA pilot managers, EPA directors of quality assurance, and other quality professionals to translate technical and quality goals of the overall program into the ETV QMP.
  • Cost and schedule constraints - These are discussed during planning meetings with senior management and considered yearly for allocation to each pilot.
  • Measures of performance - The ETV coordinator develops measures of performance for the program, which are evaluated by the ETV team and verification partners throughout the course of the program. Appendix B contains the most current measures of performance.
  • Customer satisfaction evaluation - The program evaluates all pilots on an annual basis to collect data for a final report on the pilot phase. This evaluation includes customer satisfaction measures as appropriate.

7.1.3 Systematic planning process controls include:

  • development and implementation of written procedures (verification test protocols and test/QA plans)
  • requirement of minutes of stakeholders group meetings
  • review of verification partner work efforts by the EPA pilot manager and EPA pilot quality manager

7.1.4 Systematic planning process documentation includes the ETV Verification Strategy, the ETV QMP, the verification partners' QMPs, and test/QA plans.

 

7.2 Planning document review

All planning documentation shall be reviewed and approved for implementation by authorized personnel before the specific work commences. Such documentation includes but is not limited to test/QA plans and generic verification test protocols.

Planning document review is discussed in part A section 5.2.

8.0 IMPLEMENTATION OF WORK PROCESSES 

8.1 Implementation

Work shall be performed according to approved planning and technical documents.

The planning for the implementation of the EPA management and quality work processes is contained in part A section 7.0. The individual ETV pilot work is performed according to planning documents written by the pilot. All technology verification work shall occur according to protocols and test/QA plans developed and agreed upon by EPA, the verification partner, and the vendor. The authors, reviewers, and approvers of these documents are specified in part A section 5.0, Table 5.1.

The approved protocols and test/QA plans shall be present on the site of testing, and the work shall be implemented in accordance with them. During the work phase, modifications to plans and procedures shall be documented, and the modifications shall be incorporated into the final protocols and test/QA plans. The authors, reviewers, and approvers of changes to these documents are the same as for the original documents and are specified in part A section 5.0, Table 5.1.

Verification partners are responsible for implementing their work processes in accordance with their quality systems. 

8.2 Procedures

Procedures shall be developed, documented, and implemented for appropriate routine, standardized, special, or critical operations. Operations needing procedures shall be identified. The form, content, and applicability shall be addressed, and the reviewers and approvers shall be specified.

Procedures for the overall operation of the ETV program are contained in the ETV Verification Strategy, the ETV QMP, and other appropriate EPA policies (e.g., contractual, records management). The individual ETV pilots shall identify and document those operations requiring procedures, as discussed in Part B. Procedures shall be written in a format that can be readily comprehended by the user and shall contain sufficient detail and clarity to ensure that results are achieved effectively. Appropriate operations documents, authors, reviewers, and approvers are specified in part A section 5.0, Table 5.1.

8.3 Oversight

Implementation of work shall be accomplished with a level of management oversight and inspection commensurate with the importance of the program and the intended use of the results, and shall include the routine measurement of performance against established technical and quality specifications.

EPA line management has responsibility for oversight of verification work processes as discussed in part A section 1.0. Verification partner oversight and responsibilities for the verification work processes are given in the individual pilot QMPs.

9.0 ASSESSMENT AND RESPONSE

9.1 Numbers and types of assessments

Assessments shall be planned, scheduled, and conducted to measure the effectiveness of the implemented management and quality systems. Several types of assessments are available for this purpose. Management shall determine during the planning stage the appropriate types of assessment activities. Assessments shall include an evaluation to determine and verify whether technical requirements, not just procedural compliance, are being implemented effectively.

The assessments shown in Table 9.1 and the minimum frequency are commensurate with the importance of the ETV program and the intended use of the verification results. Management assessments shall be used to measure the effectiveness of the implemented management and technical systems. Performance assessments shall be used to evaluate the performance of the pilot technical operations. Data assessments shall be used to evaluate the quality of reported data. Verification partners perform self-assessments in accordance with the individual pilot management plans, and EPA performs independent assessments of verification partners.

Table 9.1 Assessments

| Level | Assessment Tool | Assessors | Assessees/Responders | Basis for Assessment | Minimum Frequency | Reason for Assessment | Report Reviewed by |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Program | management systems review | directors of quality assurance | ETV program management | ETV QMP | once; thereafter, as requested | assess management practices for the ETV program | laboratory directors; ETV coordinator |
| Pilot | management systems review | EPA pilot quality manager | VPs | pilot QMP | once; thereafter, as requested | assess quality management practices of the verification partner for the ETV pilots | EPA directors of quality assurance; EPA pilot managers; VP managers; ETV coordinator |
| Pilot | technical systems audits | self: VP quality managers; independent: EPA pilot quality managers | VPs; field testing organizations | test/QA plans | self: see pilot QMP; independent: twice per pilot | assess quality of technical verification tests | EPA pilot managers; EPA pilot quality manager; VP managers |
| Pilot | performance evaluation audits | self: VP quality managers; independent: EPA pilot quality managers | VPs; field testing organizations | test/QA plans | self: see pilot QMP; independent: for each pilot, as applicable | assess measurement performance | EPA pilot managers; EPA pilot quality manager; VP managers |
| Pilot | audits of data quality | self: VP quality managers; independent: EPA pilot quality managers | VPs; field testing organizations | raw data and summary data | self: at least 10% of all verification data; independent: for each pilot, as applicable | assess data calculations and reporting | EPA pilot managers; EPA pilot quality manager; VP managers |

 

(See also part B section 4.2 for information on assessment frequency.)

9.2 Procedures

Assessments shall be performed according to written and approved procedures, based on careful planning of the scope of the assessment and the information needed. Assessment results shall be documented and reported to management. Management shall review the assessments.

Assessments shall be planned according to the scope of the assessment and the information needed. Suitable written procedures for planning and conducting audits shall be contained in the operating manuals of EPA quality teams, the operating and quality manuals of the verification partners, and EPA guidance documents. Assessments are based on interviews, on the physical examination of objective evidence, and on the examination of the documentation of past performance. Results are documented in audit reports, and reviewed by appropriate management.
 

9.3 Personnel qualifications, responsibility, and authority

Personnel conducting assessments shall have the appropriate technical or management skills to perform the assigned assessment. Management shall determine and document the level of competence, experience, and training necessary to ensure the capability of personnel conducting assessments. The responsibilities and authorities of personnel conducting assessments shall be clearly defined and documented, particularly in regard to authority to suspend or stop work in progress upon detection and identification of an immediate adverse condition affecting the quality of results or the health and safety of personnel.

EPA and verification partner management determines and documents the level of competence, experience, and training of their respective audit personnel during hiring and periodic performance reviews. Qualified audit personnel, as listed in Table 9.1, have access to the appropriate management personnel and documents required to perform their audit duties. They are organizationally independent of the program or pilot they are auditing. They have the responsibility and authority to:

  • identify and document problems that affect quality of verification results
  • propose recommendations for resolving problems that affect quality of verification work processes or results
  • independently confirm implementation and effectiveness of solutions

If auditors identify a severe problem affecting verification quality, EPA pilot managers have the authority to request that the verification partner manager stop work until the problem is addressed. If auditors identify a problem in which the health and safety of personnel are in danger, they have the responsibility to bring it to the immediate attention of appropriate EPA management, verification partner management, and onsite testing personnel.

9.4 Response

Responses to adverse conclusions from the findings and recommendations of assessments shall be made in a timely manner. Conditions needing corrective action shall be identified and the appropriate response made promptly. Follow-up action shall be taken and documented to confirm the implementation and effectiveness of the response action.

When the findings and recommendations of an assessment are adverse, a response from the auditee detailing the corrective action shall be expected within 10 working days of receiving the audit report. Auditors shall follow up with appropriate documentation to confirm the implementation and effectiveness of the response.
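As a small illustration of the timeliness requirement, the sketch below computes the response deadline as 10 working days after receipt of an audit report. It assumes working days are Monday through Friday; federal holidays, which would also be excluded in practice, are omitted for brevity.

```python
# Sketch of the corrective-action response deadline in part A section 9.4:
# a response is expected within 10 working days of receiving the audit report.
from datetime import date, timedelta

def response_due(received: date, working_days: int = 10) -> date:
    due = received
    remaining = working_days
    while remaining > 0:
        due += timedelta(days=1)
        if due.weekday() < 5:  # Monday=0 ... Friday=4; weekends do not count
            remaining -= 1
    return due

print(response_due(date(1998, 5, 1)))  # 1998-05-15
```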

10.0 QUALITY IMPROVEMENT

10.1 Annual review for quality improvement

A quality improvement process shall be established and implemented to continuously develop and improve the ETV Quality System.

The ETV coordinator and EPA directors of quality assurance review the quality and management plan annually and recommend improvements to the plan.

The EPA directors of quality assurance recommend and negotiate improvements with the ETV team during the annual meeting and through the ETV website.

10.2 Detecting and correcting quality system problems

Procedures shall be established and implemented to prevent as well as detect and correct problems that adversely affect quality during all phases of technical and management activities.

EPA pilot managers and EPA pilot quality managers report problems in any of the following areas to EPA line management and the ETV directors of quality assurance:

  • adequacy of the ETV quality system
  • consistency of the quality system
  • implementation of the quality system
  • correction of quality system procedures
  • completeness of documented information
  • quality of data
  • quality of planning documents
  • implementation of the work process

EPA line managers respond promptly to address correction of the quality problem.

10.3 Cause and effect relationship

When problems are found to be significant, the relationship between cause and effect and the root cause shall be determined.

The following are general procedures. Specific procedures are found in the individual verification partners' written quality systems. When problems are significant, the quality manager determines and documents the cause-and-effect relationship and, when possible, determines and documents the root cause of the problem. The quality manager provides this information to the appropriate project managers so that corrective action can be authorized and implemented.

A significant problem is any problem requiring:

  • a testing protocol change OR
  • a management system change OR
  • a quality system change (either internal or external to EPA, but still within the ETV program)

NOTE: The verification partner quality managers, in accordance with their quality systems, continually review and assess their projects for conformance with their quality documents. At the program level, assessment reports from the individual projects are monitored and evaluated by the ETV directors of quality assurance for trends or recurring problems that indicate significant problems affecting the ETV program as a whole. Any such situation is immediately communicated to the ETV coordinator. The ETV coordinator shares the information and any corrective actions with the EPA pilot managers.

10.4 Root cause

The root cause should be determined before permanent preventative measures are planned and implemented.

To guard against implementing ineffective changes, EPA personnel ensure when possible that root causes are determined before preventative measures are planned and implemented.

10.5 Quality improvement action

Appropriate actions shall be planned, documented, and implemented in response to findings in a timely manner.

In the event that a significant problem is identified that requires a structural change to the ETV program, the ETV coordinator will initiate discussions with the appropriate EPA line management to correct the deficiency.

PART B

COLLECTION AND EVALUATION OF ENVIRONMENTAL DATA

Part A of the ETV Quality and Management Plan contains the specifications and guidelines that are applicable to common or routine quality management functions and activities necessary to support the ETV program.

Part B of the ETV Quality and Management Plan contains the specifications and guidelines that apply to test-specific environmental activities involving the generation, collection, analysis, evaluation, and reporting of test data.

1.0 PLANNING AND SCOPING

The work of the ETV program at the pilot level is to verify the performance of commercial-ready technologies. As discussed in part A section 4.0, the planning process begins with the Statement of Work (SOW) contained in the Request for Applications (RFA). The successful applicant becomes the verification partner for the pilot.

1.1 Systematic planning of the verification test

All work involving the generation, acquisition, and use of environmental data shall be planned and documented. The type and quality of environmental data needed for their intended use shall be identified and documented using a systematic planning process. The test-specific planning must involve the key users and customers of the data. EPA pilot managers should guide planning activities and ensure that participants are informed of and understand completely the requirements of each test.

The programmatic planning for verification of commercial-ready technologies is discussed in part A section 7.1.1. This section continues the discussion of systematic planning at the pilot level.

Verification partners, working with the EPA pilot managers, begin a systematic process to plan the individual pilot tests. Systematic planning may be accomplished through any demonstrated technique including the data quality objectives process (EPA QA/G-4) and the observational method. The planners perform the following actions:
 

  • refine the scope of their respective technology areas
  • determine interest in verification from the manufacturers of commercial-ready technologies within the defined scope of the technology areas
  • convene stakeholder groups, containing representatives of verification customer groups, which advise during the planning process
  • mediate and facilitate the selection of focus areas
  • prepare generic verification protocols which are developed to promote uniform testing for a given type of technology
  • coordinate the review and revision of the protocols (see the review and approval scheme in part A section 5.0), keeping in mind both the customers' and EPA's objectives for verification as defined in the ETV Strategy
  • solicit vendor agreements to participate in verification of their products based on the generic protocol (some iteration of the two previous points frequently occurs here as the vendors review and request revision of portions of the generic protocols)
  • prepare test/QA plans for the acquisition of data to verify the performance of the vendors' technologies

The protocols and test/QA plans describe the experimental approach, with clearly stated test objectives and associated quality objectives for the related measurements.

1.2 Systematic planning for verification testing

  • Organizations that participate in the test shall participate in the planning.
  • The scope and objectives of the verification testing and the desired action or result from the work shall be defined.
  • The data to be collected to achieve verification shall be identified, and the QA and QC requirements to establish the quality of the data shall be defined.
  • Verification tests shall undergo a design process.
  • Verification tests shall be documented.
  • Equipment, operators, and skill levels required for the verifications shall be identified.
  • Any constraints (e.g., time and budget) shall be identified.
  • Conditions that will suspend work shall be identified.
  • Assessment tools shall be determined.
  • Methods and procedures for storing, retrieving, analyzing, and reporting the data shall be identified.
  • Methods and procedures for minimizing, characterizing, and disposing of hazardous waste generated during the test shall be identified.

1.2.1 Planning personnel

The verification partner shall coordinate test planning among the participating organizations including EPA, the stakeholders, the vendors, and any testing organizations and laboratories participating in the test. The verification partner, with the concurrence and oversight of the EPA pilot manager, shall identify the planning roles of the various players, and shall conduct planning activities by shared communication via teleconference, video conference, and in-person meetings, as appropriate, and within the constraints of the budget.

1.2.2 Purpose, scope and objectives

The purpose of this testing is to verify the performance of commercial-ready technologies. Another objective is to develop an efficient method for testing commercial-ready technologies. Many of the pilot tests accomplish this objective by preparing generic verification protocols whereby the performance of similar technologies can be verified in the future using the same protocol. The characteristics of individual technologies and the specifics of individual tests are covered in the test/QA plan that incorporates the generic verification protocol by reference.

1.2.3 Data to be collected and design of experiment

During planning of the technology verification test, the process, environmental, laboratory, response, and QA data to be collected are identified. Also identified are testing organizations, test personnel, skill levels, methods, procedures, and equipment unique to each verification test. Planning is integrated into design as discussed in part B section 2.0.

1.2.4 Documentation and reporting

Records generated during the pilot tests are listed in part A section 5.0. Records consist of both paper and electronic records. Electronic methods for storing, retrieving, analyzing, and reporting the data are generally commercially available programs for word processing, spreadsheet, or database processing, or commercial software developed especially for data collection and processing on a specific instrument or piece of equipment. Pilots may also develop software/hardware configurations, as appropriate, in their technology verification tests. The use of computer hardware and software is discussed in part A section 6.0. Paper records such as field notebooks, bench sheets, field data sheets, custody sheets, and instrument printouts are part of the raw data test record and kept with the study records.

1.2.5 Assessments

The assessment tools and minimum frequencies of assessments for the verification tests are identified in part A section 9.0. The definitions of the assessment tools and suggested frequencies are given in part B section 4.0.

1.2.6 Constraints, suspension of work, waste minimization and disposal

Verification partners work under the constraints of time and resources communicated to them by the EPA ETV Coordinator and the EPA pilot manager. When constraints are determined by the verification partner to affect quality, the resolution of the problem proceeds as described in part A section 1.5. Circumstances under which work can be suspended are discussed in part A section 1.7. If waste is generated as part of the verification testing, the verification partner seeks to minimize the amount, and disposes of it in accordance with applicable local, state, and federal laws.

2.0 DESIGN OF TECHNOLOGY VERIFICATION TESTS 

2.1 Design process

The design shall incorporate those activities pertaining to verification of performance identified during the planning process, establish test specifications, and identify appropriate controls. The design shall include

  • Selection of field sampling or testing equipment, and its operational parameters, as appropriate
  • Selection of field sampling or testing methods, as appropriate
  • Sample types, numbers, quantities, handling, packaging, shipping, and custody, if applicable
  • Sampling locations, storage, and holding times, if applicable
  • Selection of analytical methods, quality measures of performance, and analysis providers, if applicable
  • Requirements for calibration standards, and performance evaluation samples, as appropriate
  • Requirements for field and/or laboratory QA/QC activities
  • Requirements for qualifications of testing, sampling and/or analysis personnel
  • Protection of health and safety of test personnel and the public
  • Readiness reviews prior to data collection
  • Assessments required including technical and performance audits, audits of data quality, and assessments of data use limitations
  • Data reporting requirements
  • Methods for validating and verifying the data
  • Requirements for data security, archival, and retention
  • Integration of time and schedule constraints
  • Procedures for minimization or disposal of wastes generated during verification activities
    2.1.1 Design technique

    In designing technology performance verification operations, designers use verification testing design techniques including statistical methods, as appropriate. The design takes into account constraints of time, scheduling, and resources.
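As one illustration of a statistical design technique, the sketch below sizes the number of replicate tests so that the mean of a measured performance parameter is estimated to a desired confidence-interval half-width. The formula n = (z·s/E)² and the example numbers (95% confidence, an assumed standard deviation, and a target half-width) are a textbook convention offered for illustration, not a prescribed ETV method.

```python
# Sketch of sizing replicate tests: n = (z * s / E)^2, rounded up.
# z is the standard normal quantile for the desired confidence level,
# s an assumed standard deviation, and E the target half-width.
import math

def replicates_needed(std_dev: float, half_width: float, z: float = 1.96) -> int:
    return math.ceil((z * std_dev / half_width) ** 2)

print(replicates_needed(std_dev=2.0, half_width=1.0))  # 16 replicate tests
```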

    2.1.2 Field and laboratory equipment and methods

    During the design process, the appropriate field and laboratory equipment that was identified during planning for the testing of the technology verification performance is incorporated. Appropriate test methods and operating parameters are specified.

    2.1.3 Sampling and analysis

    The design process produces a testing plan based upon the data quality objectives for the verification of the technology performance.

    • The plan specifies the tests to be conducted, the baseline parameters, the number of replicate tests, and the controls.
    • If the testing involves samples, the plan specifies sampling methods, sample types, numbers, quantities, handling, packaging, shipping, and custody. Also specified are sample locations, storage conditions, and holding times.
    • Analysis methods, quantitative measures of performance, calibration standards, calibration check standards, and performance evaluation samples, as appropriate, and as identified in the planning process, are incorporated into the design.
    • Methods and procedures are included to ensure the test produces data of known and acceptable quality.
    • The design incorporates any other field or laboratory QA/QC activities identified by planners.
    • The design specifies the requirements for qualifications of technical staff responsible for obtaining, analyzing, and evaluating the data. Protection of the health and safety of testing personnel and the public is incorporated into the design.
    • Procedures for the minimization and disposal of wastes generated are designed into the verification activities.

    2.1.4 Assessments

    Assessments incorporated into the design include self-assessments (internal audits) by the verification partner and independent assessments by EPA. The assessments identified in the planning process are incorporated into the design. The type and minimum number of assessments are identified in part A section 9.0. A suggested schedule of assessments is given in part B section 4.0.

    2.1.5 Validating, reporting, securing, and archiving data

    Data are validated as indicated under Audits of Data Quality in part A section 9.0. Data are reported in ETV verification reports and ETV verification statements. Data records are stored as discussed in part A, section 5.0 and in Appendix A.

    2.2 Generic verification protocols and test/QA plans: planning documents from the design process

    Planning documents from the design process include generic verification protocols and test/QA plans.

    Writing planning documents is generally a lengthy process involving iterations of review and revision. Authors should be knowledgeable about the activity and the equipment described in the planning documents. Two types of planning documents have been identified as the core documentation needed for operation of an ETV pilot: the generic verification protocol and the test/QA plan. The generic verification protocol is meant to promote uniform testing for a single pilot and, therefore, is considered a more general document. The test/QA plan contains the specific information needed to conduct a verification test.

    2.2.1  Generic verification protocols provide the necessary framework for development of the more detailed test/QA plan. The specific content and level of detail given in generic verification protocols will vary greatly between pilots. For some pilots, the generic verification protocol may be so detailed that the test/QA plan may require very little additional information. Conversely, other pilots may use the generic verification protocol to describe the general procedures that guide the pilot. Given the highly variable nature of the generic verification protocol, no specific format has been proposed.

    The issues that may be addressed in the generic verification protocol are the following:

    • General description of the pilot
    • Responsibilities of all involved organizations
    • Experimental design
    • Equipment capabilities and description
    • Description and use of field test sites
    • Description and use of laboratory test sites
    • QA/QC
    • Data handling
    • Requirements for other documents
    • Health and safety
    • References

    The QA/QC section of the generic verification protocol typically describes the activities that verify the quality and consistency of the work. Preparation and use of appropriate QA procedures, such as QC samples, blanks, split and spiked samples, and performance evaluation (PE) samples, to verify performance of the technology being tested can be described. Criteria for success can be included. The frequency of calibrations and QC checks, and the rationale for them, can be described. Procedures for reporting QC data and results can be given. The person or organization responsible for each QA activity, and the person responsible for identifying and taking corrective action, can be specified. However, if these items vary between tests within a given pilot, the more appropriate document in which to describe them may be the test/QA plan.

    The protocol may cite documents or procedures that explain, extend, and/or enhance the protocol, such as related procedures, the published literature, or methods manuals. The specific location of any reference not readily available from a full citation in the reference section (e.g., a facility-specific standard operating procedure) should be given, or the reference should be attached to the protocol.

    2.2.2  Test/QA plans contain the following required elements. Not all elements listed are appropriate to every test. The author of the test/QA plan will note and explain those elements that are not applicable.

    • Title and approval sheet
    • Table of contents, distribution list
    • Test description, test objectives
    • Identification of the critical measurements, data quality objectives, data quality indicator goals, schedule, milestones
    • Test (including QA) organization and responsibilities
    • Documentation and records
    • Experimental design
    • Sampling procedures
    • Sample handling and custody
    • Analytical procedures
    • Test-specific procedures for assessing data quality indicators
    • Instrument calibration and frequency
    • Data acquisition and data management procedures
    • Internal systems audits
    • Internal performance audits (where applicable)
    • Corrective action procedures (response actions to audit findings)
    • Assessment reports to EPA
    • Data reduction, data review, data validation, data reporting
    • Reporting of data quality indicators for critical measurements
    • Limitations of the data

    The generic verification protocol is incorporated by reference. One reference document available for writing test/QA plans is EPA QA/G-5, Guidance for Quality Assurance Project Plans.

    If another level of detail is required for describing test activities (for example, operation of an instrument), a standard operating procedure may be written and attached to the test/QA plan. The following topics, from EPA QA/G-6, Guidance for Development of Standard Operating Procedures (SOPs), may be included (or a reference provided) in the standard operating procedure:

    • Scope and applicability
    • Summary of procedures
    • Definitions (acronyms, abbreviations, etc.)
    • Personnel qualifications
    • Health and safety warnings (Warn of activities which could result in possible personal injury.)
    • Cautions (Warn of activities which could damage equipment, degrade samples, or invalidate results.)
    • Apparatus and materials
    • Calibration
    • Sample collection, sample labeling, sample tracking
    • Handling and preservation of samples
    • Interferences
    • Sample preparation and analysis
    • Data acquisition, calculations and data reduction
    • Requirements for computer hardware and software used in data reduction and reporting
    • Data management and records management

    3.0 IMPLEMENTATION OF PLANNED OPERATIONS

    3.1 Implementation of planning

    Environmental data operations shall be implemented according to the approved planning documents. Deviations shall be documented and reported to and evaluated by management. Approved changes shall be made and distributed to test personnel to replace previous versions of the documents.

    Technology performance verifications are implemented according to the generic verification protocols and test/QA plans prepared during planning. During implementation, changes are incorporated, reviewed and approved according to the scheme discussed in part A section 5.0. Test personnel have access to the approved planning documents, approved changes to planning documents, and all referenced documents. The final protocols are posted on the ETV web page for future use for similar technology verifications.

    All implementation activities are documented. Suitable documents are bound notebooks, field and laboratory data sheets, spreadsheets, computer records, and output from instruments (both electronic and hardcopy). All documentation is developed as described in the planning documents. All implementation activities are traceable to the planning documents and to test personnel.

    3.2 Services and items

    Only qualified and accepted services and items shall be used in the performance verification operations. Acceptance shall be identified on the items themselves and/or in documents traceable to the items. Tools, gauges, instruments, and other sampling, measuring, and testing equipment used for activities affecting quality shall be controlled as required and, at specified intervals, calibrated to maintain accuracy within specified limits. Documentation of calibration shall be maintained and shall be traceable to the equipment. Periodic preventative and corrective maintenance of equipment shall be performed, and equipment shall be recalibrated prior to use.

    ETV program services are delivered by the verification partners. The verification partners are accepted via the request for application, proposal, and assistance agreement process as discussed in part A section 4.0.

    Qualified and accepted services and items used in testing are provided for in the verification partners' quality systems. The pilot quality management plan contains provisions for acceptance of services and items, and documentation of acceptance. Control of equipment, calibration to maintain accuracy within specified limits, maintenance, and documentation is the responsibility of the verification partner. The verification partner verifies that the tools, gauges, instruments, and any other sampling, measuring, and testing equipment used for activities affecting quality are controlled as required by the planning documents, and calibrated at specified intervals to maintain accuracy within specified limits. Equipment found to be out-of-specification is not used without documented repair and reassessment of performance. All maintained and repaired equipment is recalibrated as necessary before it is used for measurement work.
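The sketch below illustrates the kind of calibration-interval check such a system implies: equipment past its specified interval is flagged for recalibration before use. The 90-day interval and the dates are hypothetical; actual intervals come from the planning documents.

```python
# Sketch of a calibration-interval check supporting part B section 3.2.
from datetime import date, timedelta

def calibration_current(last_calibrated: date, interval_days: int, today: date) -> bool:
    """True if the equipment is still within its specified calibration interval."""
    return today <= last_calibrated + timedelta(days=interval_days)

# Hypothetical example: calibrated 15 Jan 1998 on a 90-day interval.
print(calibration_current(date(1998, 1, 15), 90, date(1998, 5, 1)))  # False: recalibrate before use
```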

    Oversight is the responsibility of EPA during the pilot period, and is conducted through review and acceptance of the verification partners quality system documents, the pilot quality management plan, and through independent audits.

    3.3 Field and laboratory samples

    Handling, storage, cleaning, packaging, shipping, and preservation of field and laboratory samples shall be performed according to required specifications, protocols, or procedures to prevent damage, loss, deterioration, artifacts, or interference. Sample chain of custody shall be tracked and documented.

    If samples for analysis are taken in the field, they are to be handled according to procedures in the verification partners' quality systems and the pilot quality management plan. The oversight responsibility of EPA during the pilot phase is to determine that the approved systems and plans contain adequate procedures for handling, storage, cleaning, packaging, shipping, and preservation of field and laboratory samples to prevent damage, loss, deterioration, artifacts, or interference. The verification partner provides adequate chain-of-custody procedures, if they are required.

    3.4 Data and information management

    Data or information management, including transmittal, storage, validation, assessment, processing, and retrieval, shall be performed in accordance with the approved instructions, methods, and procedures.

    ETV program records and the procedures for handling them are listed in part A section 5.0.

    4.0 ASSESSMENT AND RESPONSE

    4.1 Assessment types

    Management system review - Audit of a quality system for conformance to a quality management plan

    Technical systems audit - Qualitative onsite audit of the physical setup of the test. The auditors determine the compliance of testing personnel with the test/QA plan.

    Performance evaluation audit - Quantitative audit in which measurement data are independently obtained and compared with routinely obtained data to evaluate the accuracy (bias and precision) of a measurement system.

    Audit of data quality - Qualitative and quantitative audit in which data and data handling are reviewed and data quality and data usability are assessed.

    4.2 Assessment frequency

    Activities performed during technology verification performance operations that affect the quality of the data shall be assessed regularly, and the findings reported to management to ensure that the requirements stated in the generic verification protocols and the test/QA plans are being implemented as prescribed.

    The types and minimum frequencies of assessments for the ETV program are listed in part A section 9.0. The pilot tests will have at minimum the following types and numbers of assessments:

    • management systems review - one independent assessment by EPA, as provided in the pilot quality management plan
    • technical systems audits - self-assessments for each test as provided for in the test/QA plan and independent assessments by EPA, twice per pilot
    • performance evaluation audits - self-assessments, as applicable, for each test as provided in the test/QA plan, and independent assessments, as applicable, for each pilot
    • audits of data quality - self-assessments of at least 10% of all the verification data (a sketch of drawing such a sample follows below), and independent assessments, as applicable, for each pilot

    Additional assessments may be provided for in individual test/QA plans. Assessments by the verification partner will remain steady throughout the pilot period, but independent assessment by EPA will decrease in keeping with the policy of preparing the pilots to eventually operate independently.
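The sketch below illustrates one way to draw the self-assessment sample for an audit of data quality, meeting the "at least 10% of all verification data" minimum above. The record identifiers and the fixed seed are hypothetical; a real audit would document however the sample was actually selected.

```python
# Sketch of drawing an audit-of-data-quality sample (at least 10%).
import math
import random

def audit_sample(record_ids, fraction=0.10, seed=1998):
    n = math.ceil(len(record_ids) * fraction)  # "at least 10%" of the records
    rng = random.Random(seed)                  # fixed seed: reproducible, documentable draw
    return sorted(rng.sample(record_ids, n))

runs = [f"RUN-{i:03d}" for i in range(1, 121)]  # 120 hypothetical data records
print(audit_sample(runs))                       # 12 records selected for audit
```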

    4.3 Response to assessment

    Appropriate corrective actions shall be taken and their adequacy verified and documented in response to the findings of the assessments. Data found to have been taken from non-conforming equipment shall be evaluated to determine its impact on the quality of the data. The impact and the action taken shall be documented.

    Assessments are conducted according to procedures contained in the verification partners' quality systems or the quality procedures available to EPA personnel, as discussed in part A section 9.0. Findings are provided in audit reports. Responses to adverse findings are required within 10 working days of receiving the audit report. Follow-up by the auditors and documentation of the response are required.

    5.0 ASSESSMENT AND VERIFICATION OF DATA USABILITY

    5.1 Data validation

    Data obtained during verification tests shall be assessed, verified, and qualified according to their intended use (as verification performance data). Any limitations on this intended use shall be expressed (quantitatively to the extent practicable) and shall be documented in the ETV verification report.

    Audits of data quality are used to validate data at the frequency cited in Table 9.1 and are documented in the data audit report. The goal of an audit of data quality is to determine the usability of test results for reporting technology performance, as defined during the design process. Validated data are reported in the ETV verification reports and ETV verification statements, along with any limitations on the data and recommendations for limitations on data usability.

    5.2 Existing data 

    Any data obtained from sources that did not use a quality system equivalent to the E4 Standard shall be assessed according to approved and documented procedures.

    Existing data may be used for planning, subject to the individual rules set up by each pilot. Data collected outside the ETV test and used for verification are subject to rigorous scrutiny according to the procedure in Appendix C.

    5.3 Reports reviewed

    ETV verification reports containing data and reporting the results of technology verification performance shall be reviewed independently (i.e., by others than those who produced the data or the reports) to confirm that the data or results are presented correctly. These reports shall be approved by management prior to release, publication, or distribution.

    The procedure for ETV verification report and ETV verification statement review and approval is given in part A section 5.0. ETV verification reports are peer-reviewed, and during the pilot phase ETV verification statements are signed by the EPA laboratory directors.

    REFERENCES

    Guidance for the Preparation of Standard Operating Procedures (SOPs) for Quality-Related Documents, EPA/600/R-96/027. Washington DC: U.S. Environmental Protection Agency, 1995.

    Guidance for Quality Assurance Project Plans, EPA QA/G-5. Washington DC: U.S. Environmental Protection Agency, 1998.

    Simes, G. F., Preparation Aids for the Development of Category II Quality Assurance Project Plans, EPA/600/8-91/004. Cincinnati OH: U.S. Environmental Protection Agency, 1991.

    Guidance for the Data Quality Objectives Process, EPA QA/G-4, EPA/600/R-96/055. Washington DC: U.S. Environmental Protection Agency, 1994.

    Guidance for Data Quality Assessment, EPA QA/G-9, EPA/600/R-96/084. Washington DC: U.S. Environmental Protection Agency, 1996.

    USEPA Office of Research and Development. Environmental Technology Verification Program Verification Strategy. EPA/600/K-93/003. US Government Printing Office; 1997.

    American Society for Quality Control, Energy and Environmental Quality Division, Environmental Issues Group. AMERICAN NATIONAL STANDARD Specifications and Guidelines for Quality Systems for Environmental Data Collection and Environmental Technology Programs. ANSI/ASQC E4-1994. American Society for Quality, 1994.

    ENTICE White Paper - 1994

    Science Advisory Board Review Report- 1995

    ETV Team and Duties Memorandum - 1995

    Budget Memoranda - Fiscal Year 1995

    Budget Memoranda - Fiscal Year 1996

    Budget Memoranda - Fiscal Year 1997
     
     

    APPENDIX A

    EPA SERIES NO. 003A

    U.S. EPA RECORDS CONTROL SCHEDULE

    SERIES TITLE: Grants and Other Program Support Agreements

    PROGRAM: All Programs, except Superfund Site Specific and Wastewater Construction and State Revolving Fund Grants

    EPA SERIES NO: 003A

    NARA SCHEDULE NO. N1-412-94-2/1

    (Use this number to retire records to the FRC)

    APPLICABILITY: Agency-wide

    IDENTIFYING INFORMATION:

    DESCRIPTION: Includes records that document all types of agreements with other Federal, State, or local government agencies, universities and other institutions to which EPA is a party, and which support EPA's environmental programs (other than Superfund site specific and wastewater construction grants). Specific types of agreements include assistance agreements, grants, cooperative agreements, Interagency Agreements, and other types of program support agreements administered by Headquarters or EPA regions and which provide for research, demonstration projects, training, fellowships, investigation, surveys, studies, or other types of program support activities.

    Includes:

    Supporting documentation - Specific types of records include documentation of significant actions and decisions, justifications, cost estimates, scopes of work, correspondence, applications, pre-award reviews, funding decisions, award documentation, commitment notices, transmittal correspondence, agreements, agreement oversight activities, non-compliance/dispute documentation, audit records, closeout documentation for completed agreements, and reports and evaluations resulting from agreements.

    Excludes: Final products and deliverables, Superfund site specific grants and agreements, and wastewater construction grants, which are scheduled separately.

    ARRANGEMENT: Arranged by agreement.
     

    TYPE OF RECORDS: Case file

    SPECIFIC RESTRICTIONS: None

    MEDIUM: Microfilm, paper, forms, electronic

    VITAL RECORD: No

    FUNCTIONS SUPPORTED: Program operations 

    SPECIFIC LEGAL REQUIREMENTS: Varies according to program

    40 CFR 30, 31, 35, Subparts A, H, P, 40, 45-47

    DISPOSITION INFORMATION:

    FINAL DISPOSITION: Disposable

    TRANSFER TO FRC PERMITTED: Yes

    FILE BREAK INSTRUCTIONS: Break files immediately after closeout of the agreement.

    DISPOSITION INSTRUCTIONS: Keep inactive materials in office at least 1 year after file break, then retire to FRC. Destroy 7 years after file break.

    If record copy is in microform, break file upon completion of microform quality assurance check. Retire one silver and one diazo copy to the FRC along with finding aids and indexes. Destroy 7 years after file break. Retain one or more sets for office use. Destroy any Agency microform copies when superseded or no longer needed.

    APPLICATION GUIDANCE:

    REASONS FOR DISPOSITION: The retention period for supporting documentation has been extended because the records are needed in the event of a claim against the Agency. The statute of limitations on such claims is 7 years. Final products and deliverables are covered in EPA 258A.

    AGENCY-WIDE GUIDANCE: Final products and deliverables are permanent records and are scheduled as EPA 258A.

    Agreement closeout is when the Agency determines that all administrative actions and required work are completed (submission of the final expenditure report, SF 269 - Financial Status Report, by the recipient) or when the agreement is terminated or annulled and any disputes are settled. Final closeout documentation may consist only of an internal Agency memo.

    The Grants Administrator (also called the Grants Management Officer), Grant Project Officer, and Financial Management Officer are responsible for the record copies of grant agreement records and implementing the disposition. Records can include unique program files maintained by the grant project officer or client or technical representative. All other copies may be destroyed when no longer needed.

    The following offices and managers are responsible for maintaining a complete record set and dispositioning documents as designated below:

    Grants Management Officer (Grants Specialist) - Record copy of applications; reviews and amendments related to the application; administrative review checklist; certifications; agreements and any amendments; award documentation; requests for deviations; stop work orders; documentation relating to termination actions, disputes and appeals, annulments, and audits; legal opinions; financial status reports; increases and decreases; and correspondence and other related documents.

    Program Office (Project Officer) - Record copy of documents used for day-to-day technical direction of the grant or interagency agreement such as draft and final products and deliverables; work plans and progress reports; draft documents and comments provided or other records of technical direction. Copies of applications, awards, amendments and other administrative and financial documents.

    Financial Management Officer - Record copy of reimbursement requests, payment vouchers, payment files, federal cash transaction reports; copies of financial status report and other related documents.

    See EPA 274A for Unsuccessful Grant Application Files. This item does not include Superfund site specific grants which are scheduled as EPA 001A or Waste Water Construction and State Revolving Fund Grants which are covered in EPA 232A. Contracts are covered under EPA 020A, EPA 055A, EPA 202A, and EPA 258A. The Grants Information and Control System is scheduled as EPA 575A.

    PROGRAM OFFICE GUIDANCE/ DESCRIPTIVE INFORMATION: Previous schedule items combined into this schedule were for the following programs: Federal Activities, Water, Solid Waste, Emergency and Remedial Response, Toxic Substances, Mobile Source, Air and Hazardous Waste, Regional Administrator, Research and Development, Pesticides, Radiation, and Information and Resources Management. Specific item numbers are cited below.

    CUSTODIAL INFORMATION:
     
     

    CONTROLLING UNIT: Multiple units
    Name:
    Location:
    Inclusive Dates:
    Volume on Hand (Feet):
    Annual Accumulation (feet or inches):

    CONTACT POINT:
    Name:
    Mail Code:
    Telephone:
    Office:
    Room:

    CONTROL INFORMATION:

    RELATED ITEMS: EPA 001A, EPA 020A, EPA 055A, EPA 202A, EPA 232A, EPA 258A, EPA 274A, EPA 575A

    PREVIOUSLY APPROVED BY

    NARA SCHEDULE NOS: NC1-412-75-6/1, NC1-412-76-1/III/14 and 20,

    NC1-412-76-9/25, NC1-412-77-1/8 and 9, NC1-412-77-4/1, NC1-412-77-5/11,

    NC1-412-78-10/6b, NC1-412-82-12/11, NC1-412-85-6/8 and 15,

    NC1-412-85-7/8, NC1-412-85-12/6, NC1-412-85-14/7, NC1-412-85-17/2,

    NC1-412-85-18/2, NC1-412-85-19/4, NC1-412-85-23/4a, NC1-412-85-25/5a and b, NC1-412-85-26/I/4, N1-412-86-1/8, N1-412-86-3/7
     

    Approval Date EPA: 11/27/96
    Approval Date NARA: 3/20/91
    Entry Date: 12/27/96
    Last Modified:

    APPENDIX B

    The following measures of success are excerpted from the November 1997 ETV meeting:

    What Constitutes Success for ETV?

    Timing

    • No more than one year for partner selection.
    • No more than one year after partner selection for organizational phase (i.e., stakeholder selection, technology prioritization, initial protocol development, stakeholder approval of protocols).
    • For each technology test event, no more than one year between vendor meeting and draft final report.
    • No more than two months for EPA approval and one month for publication.

    Cost of Operation, Testing, Participation

    Customer Satisfaction

    • Significant number of States accept ETV data for permitting.
    • Significant number of consulting engineers use ETV data for making technology recommendations.

    Effects

    • Vendor sales data; technology use data.

    APPENDIX C

    ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EXISTING DATA: POLICY AND PROCESS

    Background

    The Environmental Technology Verification program was established by the U.S. EPA to verify the performance of commercial-ready technologies for their ability to monitor, prevent, control, or clean up pollution. Verification is accomplished by the evaluation of objectively collected, quality-assured data which are provided to potential purchasers and permitters as an independent and credible assessment of the performance of a technology. Data are collected and evaluated in partnership with independent third-party verification partners chosen from the public sector (such as states), the private sector (such as non-profit research institutions), federal laboratories, and others. During the pilot phase (1995-2000), EPA provides oversight of the verification partner to assure the credibility of the process and data, and retains authority for the verification process and decision (except in the case of an independent pilot). After the pilot phase, responsibility and authority pass to the verification partner.

    The ETV program seeks to identify optimal methods to verify environmental technologies without compromising quality. Stakeholder groups, consisting of representatives of major verification customer groups, advise and assist EPA and the verification partners in this effort. One consistent and urgent request has been that existing data, i.e., data collected prior to the ETV program, be used for ETV verification. This suggestion is reinforced by the programs of individual states, as well as those of other countries, that routinely consider previously-collected data in the verification of vendor claims for a technology. The purpose of this document is to establish a guideline whereby the ETV program may use these “historical,” “existing,” or “secondary” data to increase and enhance the scope of individual pilot projects.

    POLICY

    Currently, under the U.S. ETV program, the verification partner and the technology developers typically plan and execute tests which provide the objective and quality-assured data by which the environmental technologies are evaluated. Existing data are used to support test plan development. Measurements and data are collected in a demonstration of the technology by the developer, under the direction of the verification partner, and overseen by EPA. Reports are peer-reviewed and verification statements are issued. In this closely-monitored scenario, the origin and quality of the data upon which the verification statement rests are generally known and documented, and therefore the possibility for verification decision error is minimized. The consequences of a serious verification decision error can include verification of fraudulent claims, litigation, and loss of credibility for the ETV program, the verification partners, and EPA.

    Compelling arguments exist for considering the use of certain qualified existing data to replace some or all of the verification testing for a given technology. Some technologies are time-consuming and expensive to evaluate. Due to resource constraints, demonstrations can, at best, show the performance of the technology under only limited conditions. A test may provide only one small performance snapshot in time, as opposed to several years of performance data collected by the developer or its customers under a full range of conditions. Limited resources may require that testing focus on only one component of a technology rather than its full range of capability. Before reaching the commercially viable stage of development, these technologies may have been tested numerous times with acceptably reproducible results.

    Judicial precedent provides argument for the defensible use of existing data. In Daubert v. Merrell Dow Pharmaceuticals, Inc., the Supreme Court in 1993 adopted a new standard for the admissibility of scientific evidence. The Court held that Federal Rule of Evidence 702 requires that, when presented with proposed scientific testimony, the district court must make a preliminary assessment of whether the reasoning or methodology underlying the testimony is scientifically valid, and therefore reliable. The Court declined to adopt a definitive checklist or test, but noted several factors a court should consider: (1) does the theory or technique involve testable hypotheses; (2) has the theory or technique been subject to peer review and publication; (3) are there known or potential error rates, and are there standards controlling the technique’s operation; and (4) is the method or technique generally accepted in the scientific community? The court must also consider the relevance or fit of the proposed testimony by determining whether the reasoning and methodology can properly be applied to the facts at issue.

    The Clean Air Act Credible Evidence Revisions (see Federal Register, Vol. 62, No. 36, February 24, 1997) provide precedent within the Agency for defensible consideration of existing data for verification use. These revisions clarify that data from methods which are not EPA Standard Reference Methods can be used in enforcement actions and for compliance certification. Conversely, emission sources are able to use any credible evidence (ACE) in contesting allegations of noncompliance in enforcement actions. As the rule states, it “exemplifies EPA’s common sense approach to environmental protection, which encourages smarter, cheaper and more flexible means of achieving environmental goals without compromising the fundamental health and environmental protections provided by federal environmental laws.” It follows that if EPA can use ACE for enforcement actions, it can be considered for verification.

    Other precedent within the Agency exists at the Office of Air Quality Planning and Standards (OAQPS). OAQPS uses secondary data, defined as data that are utilized for a purpose other than that for which they were initially collected, in its regulatory efforts. In order to effectively focus its quality assurance (QA) efforts within the constraints of available resources, OAQPS concentrates its consideration of secondary data according to category of project. The QA activities associated with evaluating secondary data are conducted to assure that the data will be adequate and sufficient for their planned secondary use.

    Recognizing therefore that it is neither prudent nor cost-effective to ignore existing data, the ETV program establishes by this document a consistent process to evaluate these data for the extent of their credibility and usability in the verification decision. Data to be considered for use to replace verification testing undergo a rigorous process of evaluation using stringent criteria. The following guidelines are used to qualify existing data for verification purposes (detailed procedures follow in the “process” section of this document):

    1. Data are evaluated by qualified reviewers following the data evaluation process established in the “process” section of this document.
    2. The documentation of the candidate data is sufficient to allow the reviewers to assess the quality of the data and their usability for verification.
    3. The data are evaluated to determine that they meet the same minimum quality acceptance criteria as data collected in a comparable ETV pilot demonstration.
    4. All of the data used for a verification must have been collected objectively and independently of the vendor.
    5. Only data collected under a well-defined, documented quality system will be considered. Such data sets should contain all the elements required to withstand peer review, and thus be useful for verification.
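
    As an illustration only, the five guidelines above can be captured as a simple screening checklist. In the Python sketch below, the class and field names are inventions of this document, not part of the ETV program; a data set qualifies for a verification role only when every guideline is satisfied.

        from dataclasses import dataclass

        @dataclass
        class ExistingDataScreening:
            # One flag per guideline above; field names are illustrative.
            evaluated_by_qualified_reviewers: bool   # guideline 1
            documentation_sufficient: bool           # guideline 2
            meets_pilot_acceptance_criteria: bool    # guideline 3
            collected_independently_of_vendor: bool  # guideline 4
            documented_quality_system: bool          # guideline 5

            def qualifies_for_verification(self) -> bool:
                # Every guideline must be met before existing data can
                # replace verification testing.
                return all(vars(self).values())

        # Example: data failing guideline 5 do not qualify.
        screening = ExistingDataScreening(True, True, True, True, False)
        print(screening.qualifies_for_verification())  # False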

    Recognizing that useful data exist which will not qualify for verification under these guidelines, and responding to customer needs, individual pilots may establish individual evaluation criteria by which existing data may be considered. These data may not be used directly for verification, but may be used, for example, to support planning or to augment verification testing. No ETV program-wide guidelines are necessary for the use of existing data for purposes other than for verification.

    PROCESS

    Identifying and Qualifying the Data

    The vendor proposes the data to be evaluated. EPA and the verification partner shall (with input from the stakeholder group, as applicable) identify for the vendor the procedures and acceptance criteria used in the pilot demonstrations to evaluate technology performance. These procedures and criteria are the same as those used for other technologies evaluated by the verification partner. The data requirements are developed by EPA, the verification partner, and interested stakeholders for the pilot, and are not specific to the existing data. The vendor and verification partner perform the initial evaluation.

    The vendor shall provide the verification partner with the detailed protocols and test plans used to develop the existing data. The vendor shall identify those data that it believes will meet the acceptance criteria, qualify those data, and submit the data along with detailed evidence that the data meet the requirements of the pilot project. The evidence shall be submitted to EPA and the verification partner in a detailed report. The report shall show how the data verify the performance of the technology, identify data that were excluded, explain how and why they were excluded, and address other requirements specific to the pilot project. The vendor shall be prepared to provide all of the raw data.

    The verification partner shall review the planning documents to determine whether they meet the requirements of those used by the verification partner for evaluation tests of other technologies. At a minimum, the existing data protocols and test plans shall require the same level of QA/QC, replicate tests, data treatment, and reporting as that required by the verification partner in its technology demonstrations. The verification partner shall conduct a detailed review of the vendor’s data report to determine whether the data adequately evaluate the performance of the technology. The verification partner has access to the raw data and works through a reasonable random sample (a sample of roughly 10% of the data is suggested). A recommended method for evaluating the data is tracing a random selection of data points from the raw data set to the final report.
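
    The tracing method lends itself to a simple audit script. The following Python sketch is illustrative only: the data structures (dictionaries keyed by a record identifier) and the function name are assumptions of this document, and real verification records would carry additional structure such as units, qualifiers, and timestamps.

        import random

        def trace_random_sample(raw_data, reported_data, fraction=0.10, seed=None):
            # Draw a random sample of roughly `fraction` of the raw records
            # and confirm each traced value matches the final report.
            rng = random.Random(seed)
            sample_size = max(1, round(len(raw_data) * fraction))
            sample_ids = rng.sample(sorted(raw_data), sample_size)
            discrepancies = []
            for record_id in sample_ids:
                raw_value = raw_data[record_id]
                reported_value = reported_data.get(record_id)
                if reported_value != raw_value:
                    discrepancies.append((record_id, raw_value, reported_value))
            return sample_ids, discrepancies

        # Example: tracing a faithful report yields no discrepancies.
        raw = {"pt-%03d" % i: float(i) for i in range(100)}
        ids, problems = trace_random_sample(raw, dict(raw), seed=1)
        print(len(ids), "records traced;", len(problems), "discrepancies")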

    Minimum General Acceptance Criteria

    • The technology is based on sound scientific and engineering principles.
    • The conditions under which the data were collected are clearly defined and were appropriate for the demonstration of the capabilities of the technology.
    • The data are quality assured. For example, where appropriate, the documentation provides a measure of the bias and precision of the measurements (see the sketch following this list). Where needed, minimum detection limits have been determined and reported. Where applicable, the measurement range of the technology is given. A narrative statement will include a discussion of how well the data represent the capabilities of the technology in its intended environmental application.
    • Sufficient data are supplied to allow the technology to be verified. Sufficiency of the data will be determined by the reviewers.
    • Vendor-generated data may be reviewed as part of the evaluation process because they are a rich source of knowledge about the technology. Only data collected objectively and independently of the vendor, however, may be used to replace verification testing.
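
    As an illustration of the bias and precision measures named above, the Python sketch below computes bias as the difference between the mean of replicate measurements and a known reference value, and precision as the percent relative standard deviation. These are common quality assurance conventions assumed for this sketch, not formulas prescribed by this plan; each pilot's test plan governs the actual calculations.

        import statistics

        def bias_and_precision(measurements, reference_value):
            # Bias: mean measurement minus the known reference value.
            # Precision: percent relative standard deviation (%RSD).
            mean = statistics.mean(measurements)
            bias = mean - reference_value
            rsd_percent = 100.0 * statistics.stdev(measurements) / mean
            return bias, rsd_percent

        # Example: five replicate measurements of a 10.0-unit reference sample.
        bias, rsd = bias_and_precision([9.8, 10.1, 10.0, 9.9, 10.2], 10.0)
        print("bias = %+.2f, precision = %.1f%% RSD" % (bias, rsd))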

    Specific Acceptance Criteria

    In addition to the general acceptance criteria, the stakeholders for a specific pilot project may impose additional acceptance criteria, which must be at least as stringent as the acceptance criteria for the data collected during verification testing.

    Convening the Data Evaluation Panel

    If the verification partner determines that the report does not adequately evaluate the performance of the technology, the vendor is notified and no further action is required. If the verification partner determines that the vendor’s report does adequately evaluate the performance of the technology, then a data evaluation panel (DEP) is appointed. The verification partner enlists the services of three qualified reviewers to serve on the DEP. During the pilot phase of the ETV program, the DEP will generally consist of one person from EPA, one person from the verification partner, and one outside expert in the technology being evaluated. The DEP must contain members who are credible, experienced, knowledgeable, and qualified in the technical areas critical to the technology being evaluated. The members of the DEP must be objective and have no real or perceived conflict of interest with the commercial developer of the technology they are evaluating. DEP members must be independent; they cannot have been involved in the collection of the data being evaluated.

    Evaluation of the Data by the DEP

    The DEP reviews and agrees on the acceptance criteria and determines their applicability to the data to be evaluated. The evaluation shall follow the procedures and criteria developed by the verification partner and EPA for other technology verifications conducted in the pilot project.

    The verification partner provides a written summary of its review to the DEP. When the verification partner submits the data to the DEP, the data cease to be proprietary. The DEP reviews and evaluates the data using the applicable acceptance criteria. The DEP determines that the data were gathered following appropriate test protocols similar to the protocol used for verification testing. It ensures that the data were gathered following written test plans developed using a similar protocol. Planning must have included specific test objectives, experimental design, criteria for data quality, QA/QC procedures followed and reported, number of samples or frequency of sampling, and sampling and analytical procedures. The DEP must determine that the data quality meets or exceeds the minimum data quality requirements of the verification testing conducted during the pilot.

    The quality and usability of the existing data shall be evaluated against clearly defined data quality requirements based on the data quality requirements of the ETV pilot project. The data shall be sufficient to evaluate the performance of the technology.

    Recommendations for Acceptance of Data for Verification Role

    The DEP shall prepare a report on its findings. At a minimum the report must address the following:

    • Were the data collected by following the protocol and test plan provided by the vendor?
    • Do the data meet the minimum QA/QC requirements of the ETV pilot project demonstrations?
    • Do the data adequately evaluate the performance of the technology? Are there enough data, and are the data of sufficient quality for the verification partner, the ETV program, and EPA to place their reputations on the line?

    The DEP provides a written statement of the performance of the technology as provided by the data, a statement of how well the data meet the acceptance criteria, and a data acceptance recommendation.

    Review and Acceptance of Recommendation by Verification Partner and EPA

    The EPA reviews the report, determines whether to accept the data acceptance recommendation, and signs the verification statement.

    ____________________

    @ Testing entities having a quality system modeled after the American National Standards Institute/American Society for Quality Control (ANSI/ASQC) Standard E4-1994, Specifications and Guidelines for Quality Systems for Environmental Data Collection and Environmental Technology Programs, or the International Organization for Standardization (ISO) Standard 9000, Quality Management and Quality Assurance Standards: Guidelines for Selection and Use, may be presumed to have appropriate quality systems. Other similar quality systems may be accepted at the discretion of the reviewers.

    ____________________

     

