Uncle Sam’s Evidence-Based Policy Panel Looking for Input

September 29, 2016


Perhaps all government policies should come with this sticker affixed. (Photo: ORC Forensics)

The U.S. government’s Commission on Evidence-Based Policymaking will hold its first-ever public hearing on October 21. The commission is currently looking for people to testify at that hearing or provide comment on 19 questions that the panel has set out to address.

Created through a law passed in March, the commission is “charged with examining strategies to increase the availability and use of government data, in order to build evidence related to government programs and policies, while protecting the privacy and confidentiality of the data.” In the coming year its 15 members will examine how data, research and evaluation are currently being used in policy and program design, and how they could be. The commission will present its findings to the president and Congress.

According to a release from Senator Patty Murray, a Democrat from Washington, the impetus for the commission came on election night in 2014, when she and Speaker of the House Paul Ryan, a Republican, were texting each other. Having already worked together on the 2013 budget act, they “decided to do something to show that Democrats and Republicans could still work together. From this conversation came the decision to work together on the Evidence-Based Policymaking Commission Act – a bill to create a bipartisan commission to make recommendations for how the federal government could better use data to improve programs and the tax code.”

“The commission will determine whether the federal government should establish a clearinghouse for program and survey data, what data should be included in the clearinghouse, and which qualified researchers from both the private and public sector could access the data to perform program evaluations and policy-relevant research. By coordinating data across federal programs and tax expenditures, and giving qualified researchers and officials greater access to that data, with appropriate controls on the use of that data, federal agencies will gain a better grasp of how effective they are, and lawmakers will gain a better grasp of how to improve them.”


The 19 questions are sorted into two broad categories: data infrastructure and access, and data use. The questions also include two “overarching” inquiries:

  • Are there successful frameworks, policies, practices, and methods to overcome challenges related to evidence-building from state, local, and/or international governments the Commission should consider when developing findings and recommendations regarding Federal evidence-based policymaking? If so, please describe.
  • Based on identified best practices and existing examples, what factors should be considered in reasonably ensuring the security and privacy of administrative and survey data?

The balance of the questions is as follows:

Data Infrastructure and Access

  • Based on identified best practices and existing examples, how should existing government data infrastructure be modified to best facilitate use of and access to administrative and survey data?
  • What data-sharing infrastructure should be used to facilitate data merging, linking, and access for research, evaluation, and analysis purposes?
  • What challenges currently exist in linking state and local data to federal data? Are there successful instances where these challenges have been addressed?
  • Should a single or multiple clearinghouse(s) for administrative and survey data be established to improve evidence-based policymaking? What benefits or limitations are likely to be encountered in either approach?
  • What data should be included in a potential U.S. government data clearinghouse(s)? What are the current legal or administrative barriers to including such data in a clearinghouse or linking the data?
  • What factors or strategies should the Commission consider for how a clearinghouse(s) could be self-funded? What successful examples exist for self-financing related to similar purposes?
  • What specific administrative or legal barriers currently exist for accessing survey and administrative data?
  • How should the commission define “qualified researchers and institutions?” To what extent should administrative and survey data held by government agencies be made available to “qualified researchers and institutions?”
  • How might integration of administrative and survey data in a clearinghouse affect the risk of unintentional or unauthorized access or release of personally-identifiable information, confidential business information, or other identifiable records? How can identifiable information be best protected to ensure the privacy and confidentiality of individual or business data in a clearinghouse?
  • If a clearinghouse were created, what types of restrictions should be placed on the uses of data in the clearinghouse by “qualified researchers and institutions?”
  • What technological solutions from government or the private sector are relevant for facilitating data sharing and management?
  • What incentives may best facilitate interagency sharing of information to improve programmatic effectiveness and enhance data accuracy and comprehensiveness?

Data Use in Program Design, Management, Research, Evaluation, and Analysis

  • What barriers currently exist for using survey and administrative data to support program management and/or evaluation activities?
  • How can data, statistics, results of research, and findings from evaluation, be best used to improve policies and programs?
  • To what extent can or should program and policy evaluation be addressed in program designs?
  • How can or should program evaluation be incorporated into program designs? What specific examples demonstrate where evaluation has been successfully incorporated in program designs?
  • To what extent should evaluations specifically with either experimental (sometimes referred to as “randomized control trials”) or quasi-experimental designs be institutionalized in programs? What specific examples demonstrate where such institutionalization has been successful and what best practices exist for doing so?

To request a hearing slot to make an in-person statement, send your name, affiliation, a two- to three-sentence summary, and your written statement by email to input@cep.gov by October 16.

Those interested can also submit written comments through November 14 via instructions available HERE. The commission asks that each respondent include the name and address of his or her institution or affiliation, and the name, title, mailing and email addresses, and telephone number of a contact person for his or her institution or affiliation, if any.

(H/T to the Council of Professional Associations on Federal Statistics)

