Information Quality Act
44 U.S.C. § 3516 note (2012); enacted December 21, 2000, by Pub. L. No. 106-554 § 515, 114 Stat. 2763, 2763A-153.
- 1 Overview
- 1.1 Information Quality Guidelines
- 1.2 Administrative Mechanisms for Information Quality Challenges
- 1.3 Agency Reporting Requirements
- 1.4 Judicial Review
- 1.5 Peer Review Bulletin
- 1.6 Peer Review of Influential Scientific Information—Section II
- 1.7 Peer Review of Highly Influential Scientific Assessments—Section III
- 1.8 Additional Peer Review Guidelines
- 1.9 Risk Assessment
Overview
The Information Quality Act (IQA), also frequently termed the Data Quality Act, mandates the establishment of guidelines “ensuring and maximizing the quality, objectivity, utility, and integrity of information (including statistical information) disseminated” by agencies. The IQA consists of a two-sentence appropriations rider to the Treasury and General Government Appropriations Act for Fiscal Year 2001 (Pub. L. No. 106-554) and amends the Paperwork Reduction Act of 1995 (PRA), 44 U.S.C. §§ 3501–3520. The amendment directs the Office of Management and Budget (OMB) to issue policy and procedural guidance to federal agencies that are subject to the PRA. The IQA requires OMB to guide agencies (1) in issuing their own data quality guidelines to regulate agency use and dissemination of information; (2) in developing administrative mechanisms so that affected parties may seek correction of information that does not comply with information quality guidelines; and (3) in making periodic reports to OMB on the number, nature, and resolution of any complaints the agencies receive concerning their failure to comply with either OMB or agency-specific information quality guidelines.
OMB published proposed guidelines pursuant to the IQA in the Federal Register on June 28, 2001 (66 Fed. Reg. 34,489). Following public comment, OMB published interim final guidelines on September 28, 2001 (66 Fed. Reg. 49,718), and then finalized guidelines, after additional public comment, on February 22, 2002 (67 Fed. Reg. 8452). On October 1, 2002, the OMB guidelines went into effect, thus requiring agencies to issue, by the same date, their own guidelines detailing their agency-specific information quality standards and outlining administrative mechanisms for affected parties to challenge the quality of agency-disseminated information. Any information disseminated by agencies on or after October 1, 2002, is subject to the OMB and agency-specific information quality guidelines. Agency-specific guidelines may be found on the agency websites, in the Federal Register, or on the OMB website.
Information Quality Guidelines
The IQA aims to regulate “the sharing by Federal agencies of, and access to, information disseminated by Federal agencies.” The OMB guidelines define “information” as “any communication or representation of knowledge such as facts or data, in any medium or form, including textual, numerical, graphic, cartographic, narrative, or audiovisual forms” (§ V.5). This definition includes information posted on an agency web page, but excludes hyperlinks to information disseminated by sources outside the agency. Similarly, if an agency presents information in a manner clearly indicating that the information reflects an opinion, not a fact or an agency view, the information is not subject to the same information quality standards. The guidelines define “dissemination” as “agency initiated or sponsored distribution of information to the public” (§ V.8). The guidelines exclude from the definition of dissemination any distribution that is limited to “government employees or agency contractors or grantees; intra- or inter-agency use or sharing of government information; and responses to requests for agency records under the Freedom of Information Act, the Privacy Act, the Federal Advisory Committee Act or other similar law.” Further excluded are “correspondence with individuals or persons, press releases, archival records, public filings, [and] subpoenas or adjudicative processes” (§ V.8).
The IQA addresses the “quality, objectivity, utility, and integrity” of agency information. The OMB guidelines provide definitions for these terms. “Quality” encompasses utility, objectivity, and integrity (§ V.1). “Utility” contemplates the usefulness of information for both the agency and the public such that “when transparency of information is relevant for assessing the information’s usefulness from the public’s perspective, the agency must take care to ensure that transparency has been addressed in its review of the information” (§ V.2). “Integrity” refers to the safeguarding of information from unauthorized access or revision (§ V.4). “Objectivity” focuses on whether information is not only substantively accurate, reliable, and unbiased, but also presented in an accurate, clear, complete, and unbiased manner (§§ V.3.a-b).
To satisfy the presentation prong of the objectivity requirement, agencies must present information in context, which may, depending on the type of information, require providing additional information such as sources, supporting data, models, error sources, and documentation. The substantive objectivity prong requires that scientific, financial, or statistical information be based on original and supporting data and analytic results produced using “sound statistical and research methods” (§ V.3.b). The guidelines provide that information presumptively satisfies the substantive objectivity requirement if the relevant data and analytic results have undergone formal, independent, external peer review (§ V.3.b.i). The presumption is rebuttable: a petitioner may overcome it by persuasively demonstrating a lack of objectivity. If peer review is agency-sponsored, certain transparency requirements are triggered to ensure that the review is adequate. Agency-sponsored peer reviews must be open and rigorous, and peer reviewers must be selected primarily on the basis of necessary technical expertise, must disclose to agencies any prior positions they have taken on pertinent issues, and must reveal their sources of personal and institutional funding.
The objectivity standard is more demanding for information that is not simply ordinary but “influential”: to ensure transparency, such information must be capable of being reproduced (§ V.3.b.ii.B). Information is influential if “the agency can reasonably determine that dissemination of the information will have or does have a clear and substantial impact on important public policies or important private sector decisions” (§ V.9). Each agency may separately define “influential” to accord with “the nature and multiplicity of issues for which the agency is responsible” (§ V.9). Agencies should consult with relevant scientific and technical communities on the feasibility of making certain categories of data subject to the reproducibility standard (§ V.3.b.ii.A). Reproducibility in the analytic context demands that “independent analysis of the original or supporting data using identical methods would generate similar analytic results, subject to an acceptable degree of imprecision or error” (§ V.10). The acceptable degree of imprecision changes to reflect the impacts the information may have. The reproducibility standard for other categories of data is set forth by individual agencies in their guidelines whenever they identify specific categories of information as subject to the reproducibility standard (§ V.10).
Compelling privacy interests, trade secrets, intellectual property, and other confidentiality protections take precedence over reproducibility and transparency requirements, in which case agencies must alternatively apply “especially rigorous robustness checks to analytic results and document what checks were undertaken” (§ V.3.b.ii.B.i). Each agency determines the type and detail of robustness checks it conducts. Privacy interests notwithstanding, “[a]gency guidelines shall . . . in all cases, require a disclosure of the specific data sources that have been used and the specific quantitative methods and assumptions that have been employed” (§ V.3.b.ii.B.ii).
Some categories of information are automatically subject to special information quality requirements under the OMB guidelines. Agencies disseminating vital health and medical information “shall interpret the reproducibility and peer-review standards in a manner appropriate to assuring the timely flow of vital information from agencies to medical providers, patients, health agencies, and the public” (§ V.3.b.ii.C). Similarly, when agencies maintain and disseminate information with regard to analysis of risks to human health, safety, and the environment, they must “either adopt or adapt” the quality principles that apply to such information used and disseminated pursuant to the Safe Drinking Water Act Amendments of 1996 (42 U.S.C. § 300g-1(b)(3)(A)–(B)). The option to “adapt” rather than “adopt” these principles provides agencies with flexibility in applying them (§ V.3.b.ii.C).
In urgent situations involving imminent threats to public health or homeland security, for example, an agency may temporarily waive information quality standards according to standards set forth in the agency’s own guidelines (§ V.3.b.ii.C).
Administrative Mechanisms for Information Quality Challenges
The IQA and subsequent OMB guidelines require that in their guidelines agencies set forth an administrative process allowing affected parties to challenge agency-disseminated information that allegedly fails to meet the standards set forth in either the OMB guidelines or the disseminating agency’s guidelines (§ III.3). In establishing these processes, agencies must specify limits on the time an agency may take to consider a challenge and provide for notification to the affected party regarding the agency’s decision (§ III.3.i). Agency guidelines must also provide for an administrative appeal process to allow an affected party to seek reconsideration of an agency’s initial decision (§ III.3.ii). The appeals mechanism must similarly incorporate specific limits on the time allowed for reconsideration.
Agency Reporting Requirements
Pursuant to the OMB guidelines, each agency was obligated to submit to OMB a report—following a draft report, public comment, and revisions—outlining its agency-specific information quality guidelines and how the guidelines ensure the quality of information, and also explaining the agency’s administrative mechanisms for allowing affected persons to seek correction of agency information that allegedly does not meet information quality standards (§§ IV.3-5). Following OMB review of the reports, agencies then had to post the reports on their respective websites and publish notice of the availability of the final reports in the Federal Register by October 1, 2002 (§ IV.5).
Since January 1, 2004, agencies must report annually to the director of OMB the number and nature of complaints received by the agency regarding agency compliance with information quality guidelines (§ IV.6). The first reports were included in OMB’s report to Congress. OMB apparently does not publish subsequent reports, but the information for individual agencies is available through links to their websites from the OMB site.
Judicial Review
Attempts to obtain judicial review of agency compliance with the IQA, and in particular of agency refusals to make requested corrections to disseminated information, have been uniformly unsuccessful. An early case, Salt Institute v. Leavitt, 440 F.3d 156 (4th Cir. 2006), held that the plaintiff lacked standing because it was not injured by any failure of the agency to disseminate correct information, nor did the IQA create any individual right to correct information. See also Mississippi Comm’n on Envtl. Quality v. EPA, 790 F.3d 138 (D.C. Cir. 2015). Other cases have denied relief on different grounds. Because the IQA does not itself provide a cause of action, challenges to agency actions have been brought under the APA. However, courts have been persuaded either that review of a denial of a correction request is precluded because it is committed to agency discretion by law, see, e.g., Styrene Info. & Research Center, Inc. v. Sebelius, 944 F. Supp. 2d 71 (D.D.C. 2013), or that the denial is not final agency action, because it does not affect the rights or obligations of the requester, see, e.g., Single Stick, Inc. v. Johanns, 601 F. Supp. 2d 307 (D.D.C. 2009), aff’d on other grounds, Prime Time Int’l Co. v. Vilsack, 599 F.3d 678 (D.C. Cir. 2010).
Peer Review Bulletin
On January 14, 2005, OMB published notice in the Federal Register of its Final Information Quality Bulletin for Peer Review (Bulletin) (70 Fed. Reg. 2664) after having released and solicited public comment on two previous drafts dated September 15, 2003 (68 Fed. Reg. 54,023) and April 28, 2004 (69 Fed. Reg. 23,230). OMB issued the Bulletin under the IQA, Executive Order 12,866 (58 Fed. Reg. 51,735, Oct. 4, 1993), and OMB’s authority to manage agencies under the President’s constitutional authority to supervise the executive branch. OMB’s Office of Information and Regulatory Affairs (OIRA) oversees implementation of the Bulletin in consultation with the Office of Science and Technology Policy (OSTP).
The Bulletin establishes governmentwide guidance for peer review of agency scientific information that will be disseminated. “Scientific information” includes “factual inputs, data, models, analyses, technical information, or scientific assessments based on the behavioral and social sciences, public health and medical sciences, life and earth sciences, engineering, or physical sciences” (§ I.5). Under the Bulletin, the definition of “dissemination” encompasses the IQA definition, but excludes research produced by government-funded scientists that does not represent agency views and displays a disclaimer to that effect. The definition also excludes information distributed for peer review that bears a disclaimer stating that its distribution is solely for the purpose of pre-dissemination peer review (§ I.5).
Peer Review of Influential Scientific Information—Section II
Section II of the Bulletin mandates that agencies conduct peer reviews of influential scientific information intended for dissemination. The definition of “influential” is the same as that under the IQA. Section II grants agencies discretion to choose the reviewer selection process and the peer review mechanism. Reviewer selection must take into account reviewer expertise, the need for a balanced representation of perspectives, conflicts of interest, and the reviewers’ independence from the work product under review (§ II.3). In choosing a peer review mechanism, the agency must consider the information’s novelty and complexity, importance to decision making, and prior peer review, as well as any benefits and costs of review (§ II.4). Certain transparency requirements also apply to the peer review mechanism. The reviewers must have advance notice of how their comments will be conveyed. Reviewers must prepare a report describing their review, findings, and conclusions. The report must also provide the names and affiliations of the reviewers and “shall either (a) include a verbatim copy of each reviewer’s comments (either with or without specific attributions) or (b) represent the views of the group as a whole, including any disparate and dissenting views” (§ II.5). The agency is required to post the report on its website along with all related materials, discuss the report in the preamble to any related rulemaking, and include the report in the administrative record behind any related agency action (§ II.5).
Agencies may commission independent entities to carry out reviewer selection and peer review (§ II.6). Influential scientific information that has already undergone adequate peer review is exempt from the requirements of section II. Principal findings, recommendations, and conclusions in official reports of the National Academy of Sciences are presumed to have undergone adequate peer review (§ II.2).
Peer Review of Highly Influential Scientific Assessments—Section III
The Bulletin imposes requirements, in addition to the section II requirements, for the peer review of highly influential scientific assessments. A “scientific assessment” is “an evaluation of a body of scientific or technical knowledge that typically synthesizes multiple factual inputs, data, models, assumptions, and/or applies best professional judgment to bridge uncertainties in the available information” (§ I.7). An assessment is “highly influential” if the agency or the OIRA Administrator determines that its dissemination could have an impact of more than $500 million in any one year or that it is “novel, controversial, or precedent-setting, or has significant interagency interest” (§ III.2). The Bulletin provides that peer review of highly influential scientific assessments must adhere to the following additional requirements:
- Scientists employed by the sponsoring agency (outside of the peer review context) may not serve as reviewers. An exception may be made if the agency determines that a scientist employed by a different agency within the same Cabinet-level department possesses expertise essential to the review and does not hold a position of management or policy responsibility. The agency must obtain prior written approval on a nondelegable basis from the Secretary or Deputy Secretary to make such an exception (§ III.3.c).
- Absent an essential need, the same reviewer should not participate in multiple peer reviews (§ III.3.d).
- Reviewers must be provided information sufficient to enable them to understand the data, analytic procedures, and assumptions used to support the key findings or conclusions they are reviewing (§ III.4).
- If feasible and appropriate, the agency must make the draft scientific assessment available for public comment and sponsor a public meeting attended by the reviewers. The reviewers should then have access to the public comments pertaining to significant scientific or technical issues (§ III.5).
- “The peer review report shall include the charge to the reviewers and a short paragraph on both the credentials and relevant experiences of each peer reviewer” (§ III.6). The agency must also write a response to the report to be posted on the agency’s website explaining the agency’s agreement or disagreement with the report, the actions the agency has undertaken or will undertake in response to the report, and, if applicable, the reasoning behind the choice of those actions (§ III.6).
Additional Peer Review Guidelines
Given a compelling rationale, the agency head may waive or defer the peer review requirements of sections II and III (§ VIII.3). Additionally, instead of adhering to the requirements of sections II or III, an agency may ensure the quality of scientific information by relying on information from the National Academy of Sciences, commissioning the National Academy of Sciences to conduct the peer review, or undertaking alternative procedures approved by the OIRA Administrator in consultation with OSTP (§ IV).
“Peer review shall be conducted in a manner that respects (i) confidential business information and (ii) intellectual property” (§ VIII.2). In disclosing information about a reviewer, the agency must comply with the requirements of the Privacy Act, 5 U.S.C. § 552a, as amended, and OMB Circular A-130 (Appendix I, 61 Fed. Reg. 6428 (Feb. 20, 1996)) (§ VIII.1). Information relating to certain national security, foreign affairs, or negotiations involving international trade or treaties is exempt from the peer review requirements if adhering to the requirements would interfere with the need for secrecy or promptness. Similarly exempt is information disseminated in the course of an individual agency adjudication or permit proceeding (unless the agency determines that peer review is practical and appropriate and that the influential dissemination is scientifically or technically novel); time-sensitive health or safety information; regulatory impact analysis or regulatory flexibility analysis subject to interagency review under Executive Order 12,866; routine statistical information released by federal statistical agencies; accounting, budget, actuarial, and financial information; and information disseminated in connection with routine rules that materially alter entitlements, grants, user fees, or loan programs, or the rights and obligations of recipients thereof (§ IX).
If an agency supports a regulatory action using information subject to the Bulletin, it must include in the administrative record, along with relevant materials, a certification explaining how the agency complied with the Bulletin’s requirements (§ VII).
Agencies must post on their websites, and update at least twice a year, an agenda providing descriptions of all planned and ongoing influential scientific information subject to the Bulletin, links to documents made public pursuant to the Bulletin, and a peer review plan for each entry (§ V.2). Agencies must also establish a mechanism for allowing the public to comment on the adequacy of their peer review plans. Further, each agency is required to provide to OIRA, by December 15 of each year, a summary of the peer reviews conducted by the agency during the fiscal year (§ VI).
The Bulletin states that it “does not create any right or benefit, substantive or procedural, enforceable at law or in equity, against the United States, its agencies or other entities, its officers or employees, or any other person” (§ XII), thereby seeking to avoid judicial review of agency compliance with the Bulletin’s requirements.
Risk Assessment
In 1995, an interagency working group co-chaired by OMB and OSTP developed a set of principles to guide policymakers in assessing, managing, and communicating policies to address environmental, health, and safety risks (the 1995 Principles), available at https://www.whitehouse.gov/sites/whitehouse.gov/files/omb/assets/regulatory_matters_pdf/jan1995_risk_analysis_principles.pdf. The 1995 Principles were divided into five parts: general principles, principles for risk assessment, principles for risk management, principles for risk communication, and priority setting. In 2006, OMB proposed to adopt uniform quality standards for agencies to adhere to in conducting risk assessments. However, in light of comments from agencies, the public, and the National Academy of Sciences, OMB decided not to adopt that proposal. Instead, in 2007 it issued Updated Principles for Risk Analysis, available at https://www.whitehouse.gov/sites/whitehouse.gov/files/omb/memoranda/2007/m07-24.pdf. The 2007 Update reiterates the 1995 Principles and further develops them in light of scientific advances in the intervening 12 years.