FY 16-17: Agency Priority Goal
Enable evidence-based decision making
GOAL OVERVIEW
The Department of Education (ED) is committed to using its resources strategically to increase the amount of rigorous evidence about what works in education. Education leaders need this evidence to decide whether a potential program, policy, or practice is likely to produce improved student or other educational outcomes. While ED supports research on what works through its research programs, it also is committed to taking every opportunity to build and use research evidence in a range of competitive grant programs.
ED takes a two-pronged approach to evidence-based grant making with its competitive grants. First, ED directs its competitive grant dollars to approaches that are supported by at least some evidence. Second, ED generates new evidence by asking grantees to conduct rigorous evaluations of their interventions.
ED’s first competitive grant program to use an evidence-based grant making strategy (other than research programs at the Institute of Education Sciences (IES)) was the Investing in Innovation program (i3), which made its first grants in 2010 and currently manages a portfolio of over 100 projects. Since then, ED has expanded its evidence-based grant making to other competitive grant programs when appropriate, given resources and program purpose. The First in the World program (FITW) is an example of a postsecondary program that invests in approaches with at least some research support and asks grantees to conduct rigorous studies of impact on student outcomes. FITW made new evidence-based awards in FYs 14 and 15, but was not funded in FY16.
Doing evidence-based grant making well involves substantial, coordinated effort from many ED offices. ED must assess whether there is existing evidence on which applicants can build, help applicants to understand ED’s evidence standards, support applicants’ search for and application of evidence, coordinate timelines and logistics of internal study reviews (to ensure that applicants who cite evidence in support of their proposed projects are citing appropriately rigorous studies), and provide support on grantees’ rigorous evaluations. ED learned from the i3 experience that technical support for grantees and their evaluators is crucial to producing the highest-caliber evidence. By way of example, ED’s application of high-quality support for grantees’ evaluations has ensured that, to date, 22 i3 projects have met rigorous What Works Clearinghouse (WWC) Evidence Standards, 107 i3 projects are currently expected to meet those standards, and 40 of 42 FITW projects are currently expected to meet them. However, providing support for rigorous studies in a consistent, cost-effective way is a new challenge that ED faces as it rolls out rigorous study expectations to additional grant programs.
Evidence-based grant making has been a high-priority initiative at ED for the past several years, and ED has underscored its importance more recently with the FY14 – FY15 APG focused on increasing the percentage of competitive dollars that support evidence-based strategies. Prior to FY15, we focused intently on scaling the successful practices of i3 to other competitive grant programs. In FY15, ED shifted its focus to (1) investing our resources more carefully to ensure that the quality of implementing these strategies remains strong, and (2) picking only those programs where the evidence-based grant making approaches described above are likely to be successful.
Given ED’s current competitive grant programs, the availability of evidence in the field, the current funding and statutory landscape, and early projections for program funding over the next few years, increasing the new funding that supports evidence-based practices to 18% by the end of FY16 and 20% by the end of FY17 is an ambitious, yet achievable, goal. As background, 9% of ED’s new competitive grant funding supported evidence-based practices in FY12. As of FY14, almost 16% of ED’s new competitive grant funding supported evidence-based practices, and the FY15 percentage, which serves as the baseline, was significantly higher still – over 29%. Some of this jump is attributable to the size of the competitions using evidence in FY15 and the number of applicants in those competitions. We do not assume this percentage will grow at a linear rate over time; given ED’s ongoing planning process for using evidence in its competitive programs and ED’s appropriations for FY16, pushing the percentage above 29% may be neither possible nor wise. We believe that ED has scaled this work to almost all of the competitive programs for which it makes sense to do so, and we are focusing the next two years on quality of implementation so that the use of evidence in competitions is supported and implemented in a meaningful way.
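The first metric is simply the share of new competitive grant dollars awarded to projects supported by evidence. A minimal sketch of the calculation, using invented program names and dollar amounts (only the 18% and 20% targets come from the text):

```python
# Sketch of the APG's first metric: the percentage of new competitive grant
# dollars that support evidence-based strategies. All program names and
# amounts below are hypothetical illustrations.
new_awards = {
    # program: (total new dollars awarded, dollars awarded based on evidence)
    "Program A": (120_000_000, 120_000_000),  # evidence required for eligibility
    "Program B": (80_000_000, 30_000_000),    # evidence rewarded via competitive priority
    "Program C": (50_000_000, 0),             # no evidence component
}

total_new = sum(total for total, _ in new_awards.values())
evidence_dollars = sum(ev for _, ev in new_awards.values())
share = evidence_dollars / total_new

print(f"{share:.0%} of new competitive dollars support evidence-based strategies")
print("Meets FY16 target (18%):", share >= 0.18)
print("Meets FY17 target (20%):", share >= 0.20)
```

Note that a single large, evidence-based competition can move the percentage substantially, which is why the FY15 baseline jumped and why ED cautions against assuming linear growth.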
The second metric of the APG focuses on the number of rigorous studies ED expects to add to the education research base. ED expects that i3, in particular, will add a large number of studies on what works for elementary and secondary students. Over the next few years, we expect that 42 i3 grantees will release studies that meet WWC Evidence Standards. Because the timing of when those studies will be published and receive an official WWC rating varies, and because some i3 grantees have received extensions to collect and analyze more data, we estimated that 20 of these studies will be added to the WWC database and will be determined to meet WWC Evidence Standards by the end of FY17. As of the end of FY16, we have surpassed this target and 22 studies from i3 grantees were found to meet WWC Evidence Standards. Since the WWC began releasing evidence reviews in 2006, more than 2,350 eligible studies (effectiveness studies with an outcome identified in a review protocol) have been reviewed. Of those, 878 studies (approximately 37%) have findings that meet WWC standards with or without reservations. i3’s contribution, over the next two years, of 20 rigorous effectiveness studies is critical as educators, families, and policymakers continue to seek clear and credible information on what works in education.
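The WWC review figures cited above can be verified with quick arithmetic; this sketch uses only the numbers given in the text:

```python
# Check of the WWC figures cited above (both counts are from the text).
eligible_studies = 2350   # eligible effectiveness studies reviewed since 2006 (approximate)
meets_standards = 878     # findings meet WWC standards, with or without reservations

share = meets_standards / eligible_studies
print(f"{share:.0%} of eligible studies meet WWC Evidence Standards")  # roughly 37%
```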
In addition to i3, ED supports a focus on rigorous evaluation in other competitive grant programs, such as Supporting Effective Educator Development (SEED) and FITW. However, because i3 will be the only program to contribute studies by the end of FY17, only i3 project evaluations are included in the formal reporting on this APG. Even so, ED intends to provide future-focused updates that address ED’s work to support rigorous research endeavors in other programs.
As noted above, it is important that grantees tasked with rigorously evaluating their projects receive high-quality technical assistance in order to produce credible data. In addition to counting the rigorous studies that ED competitive grant programs support, this APG speaks to ED’s ability to support, through funding and technical support, high-quality evaluation.
KEY BARRIERS AND CHALLENGES
Using evidence to inform competitive grant funding decisions entails a shift in culture and capacity across ED, yet ED has limited resources to support its program offices in doing this work well. Final appropriations and other funding decisions and trade-offs also influence the amounts available to competitive grant programs. For example, the FITW program was not funded in FY16. If this and other programs that ED currently considers to be evidence-based are not funded in FY17, it may be more challenging to meet the established targets. In addition, in FY17 ED will begin implementing the Every Student Succeeds Act (ESSA), which reauthorizes the Elementary and Secondary Education Act of 1965 and includes many changes, large and small, for grant programs. ESSA’s emphasis on the use of “evidence-based” activities creates an opportunity to increase funding that supports evidence-based activities in the future, but the challenges of implementing the new law may affect this APG in the short term while ED works to help the field understand the new definition and to support grantees in meeting new evidence requirements.
In Q4, ED announced the winners of all but two grant competitions that contribute to this APG; those two programs have funds available beyond September 30, 2016. In general, however, we note that ED’s focus on transitioning to ESSA has left limited staff capacity across many of its competitive grant programs. While we spent FY16 competitive dollars within the timeframes mandated by Congress, schedule delays in previous quarters forced us to truncate crucial tasks, such as reviewing the evidence applicants submit in support of their projects. ED continues to learn from past evidence-based competitions to improve our processes.
In addition, supporting grantees as they rigorously evaluate their grant-funded projects is resource-intensive and difficult to do well. Technical assistance is costly, and many small programs, or programs funded under different authorities, have limited ability to provide the assistance that grantees need to stay on track. All rigorous studies of effectiveness require careful stewardship, and even in the hands of skilled evaluators, many things can go wrong in the implementation of a study that threaten its credibility. Further, unlike research grant programs, competitive grant programs consider the qualifications of evaluators as one factor among many, with the result that not all evaluators of funded projects have significant experience in conducting rigorous studies of effectiveness. ED continues to problem solve with specific program offices in order to apply the i3 model in an accessible way to get more credible and informative data from grantees. ED’s Evidence Planning Group (EPG) continues to keep this issue on its radar, but focused in Q4 on closing out FY16 competitions and preparing for select FY17 competitions.
In particular, as elementary and secondary programs transition to the ESEA as amended by the ESSA, new statutory provisions that require or incentivize evidence-based practices have disrupted FY17 planning. Specifically, the ESEA’s definition of “evidence-based” aligns with, but does not exactly match, the definitions ED created through regulations in 2013, and certain differences have created substantial process complications. EPG devoted substantial time in Q4 to developing a short-term solution for FY17, and a longer-term solution for future years, that ensures ED continues to hold its applicants to rigorous standards while accounting for the internal capacity of staff to conduct evidence-based grant competitions with integrity.
In addition, there are several programs for which ED’s model of evidence-based grant making is problematic or not possible. Some programs provide funding to comprehensive support centers that respond to the needs of an education community; it is difficult to require that all grantees under such a program provide only evidence-based support when applicants may not be able to predict the challenges that will arise in the community or anticipate the amount of research available on such challenges.
Finally, despite general support for evidence-based grant making endeavors, many programs’ stakeholders, who are accustomed to the status quo, have pushed back on integrating evidence into competitive grant programs. Long-standing funding levels and competition designs can make applicants less willing to move toward evidence-based strategies that require them to evaluate their work’s overall impact.
EXTERNAL STAKEHOLDERS
There is an increasing emphasis among stakeholders on the importance of using evidence to support government program funding decisions, and ED regularly engages the field on this topic. A number of outside organizations have convened experts to discuss how to encourage such decisions. In addition, philanthropic and congressional actors prioritized using evidence to support decision-making and have encouraged the field to do the same. Finally, ED has worked with the National Science Foundation to develop a common evidence framework around which to organize research investments and grants.
Strategies
IMPLEMENTATION STRATEGY
ED’s push towards evidence-based grant making includes a focus on (1) ensuring that ED and the public learn as much as possible from grant-funded projects through rigorous impact evaluations, and (2) ensuring that the investments in ED’s competitive grant programs are strategic. In 2013, ED published several amendments to the Education Department General Administrative Regulations (EDGAR), which allowed more competitive grant programs to consider evidence-based strategies modeled after i3. Since 2013, ED has launched the FITW program, the postsecondary counterpart to i3, and has taken steps to reward evidence-based interventions and encourage rigorous evaluations in existing programs, most notably in the Office of Innovation and Improvement (OII) and the Office of Postsecondary Education (OPE). The ongoing emphasis on evidence-based grant making has pushed program offices to think differently about their work and what they expect of their grantees. In addition, competitive grant programs increasingly work closely with IES (the research arm of ED) and the Office of Planning, Evaluation, and Policy Development (OPEPD) to plan and execute high-quality and fair evidence-based competitions.
First, to increase the percentage of competitive programs that support evidence-based strategies, ED intends to continue its current approach to implementation. Specifically, ED’s Evidence Planning Group (EPG) continues to actively engage with select competitive grant program offices through the creation of small, program-specific working groups. These groups review spending plans for each fiscal year and determine whether a program is designed appropriately to ask applicants to demonstrate the evidence supporting their proposed projects. In Q4, EPG pivoted to focus more carefully on programs with optimal conditions for making evidence-based grants – e.g., programs that are well funded, have a statutory purpose that allows for this work, or carry a statutory requirement or incentive for evidence-based practices. This more selective strategy is part of EPG’s ongoing work to ensure ED can maintain high-quality competitions as it transitions to new programs authorized under the ESEA, as amended by the ESSA.
Second, in documenting ED’s progress toward increasing the number of ED-funded project evaluations that provide credible evidence by FY17, ED will count only i3-funded evaluations. However, ED is employing new strategies, similar to those discussed above, in order to expand the universe of competitive grant programs that can support rigorous evaluations beyond FY17. The EPG will engage competitive grant program offices to review plans for each fiscal year and determine if a program is designed appropriately for asking applicants to design evaluation plans that, if well-implemented, could meet WWC Evidence Standards (as defined in 34 CFR 77.1).
The intent of this metric of the APG is to: (1) inform ED of its progress in getting grantees to produce credible data; (2) develop and measure a systematic approach for WWC review of studies funded by ED competitive grant competitions; and (3), ultimately, add credible, grant-funded studies to the education research base to facilitate wider use of effective interventions. In designing this APG, IES agreed to report quarterly on the number of i3-funded studies that are in the WWC and meet WWC Evidence Standards. Creation of this APG has also spurred internal discussion about how to ensure that all grant-funded, rigorous studies are reviewed by the WWC in a timely and organized manner. This APG is a mechanism to push ED forward in how it communicates with the field about the credibility and usefulness of evidence generated by competitive grant-funded projects.
EXTERNAL FACTORS THAT ED CAN AND CANNOT INFLUENCE
External factors that ED can influence: Some ED programs are specifically designed to produce evidence, and more evidence is appearing on a regular basis. ED can and will use its grant making authority to require or encourage applicants to consider evidence in the projects they propose and to encourage applicants to propose evaluations designed to produce evidence of what works. As discussed above, ED is also able to support the ongoing implementation of such evaluations under certain conditions.
External factors that ED cannot influence: The amount of evidence available in specific fields is highly variable. While ED hopes that this APG will influence the creation of new evidence, this work happens slowly and is dependent on many variables outside of ED’s influence. Such factors include technical capacity in the field, appetite for rigorous studies that may include random assignment, and funding. Accordingly, some grant competitions will enjoy a more robust evidence base supporting proposed interventions than others, especially in the short and medium term. To some extent, ED can influence the evidence base through its continued work to support the second metric of this APG. However, rigorous research done well can take time, and it may be several years before ED-funded studies that are determined to meet WWC Evidence Standards are available to push the field in new directions. In addition, ED is dependent on Congress to appropriate funding for competitive grant programs and to continue to allow ED to use evidence as a competitive priority in those programs.
Progress Update
In Q4, the Department spent virtually all of its FY16 discretionary grant money. Two programs that impact this metric have funds available through part of FY17 (Investing in Innovation) or all of FY17 (Striving Readers), so the final calculations for this metric are not yet available. However, when considering just the programs that have already made their FY16 awards, the Department has surpassed its target for this metric.
Despite this success, the Evidence Planning Group (EPG) continued to grapple with complex issues, such as the sustainability of the Department's evidence review process, the transition to a new elementary and secondary education law, and, above all, whether the efforts we have made over the past several years to use evidence in more grant programs have actually changed program staff or grantee behavior. EPG is currently meeting at least once a week to delve deeper into these cross-cutting issues and has, so far, developed a draft process for how evidence reviews could work (in FY18 and later) in a way that incorporates ESSA's evidence requirements and reduces the burden on IES while still upholding the integrity of the review. In addition, planning for FY17 is underway. Program offices have submitted their spending plans for ED review assuming level funding next year, and EPG has already made a list of possible programs that could run evidence-based competitions in FY17. EPG continues to work with program offices to flesh out evidence plans.
Regarding the second metric, the WWC continued to review studies for the i3-funded evaluations that have produced publicly available reports. To date, the WWC has reviewed the majority of i3-funded evaluations that are publicly available and reviews are underway for the remaining studies. As of Q4, we have exceeded our FY17 target for this metric.
NEXT STEPS
EPG continues to work with program offices managing evidence-based competitions to provide resources in such areas as the existing research on particular topics and guidance on competition design. ED uses two indicators to determine whether we are on track to meet or surpass our target for the year: (1) documented intentions from programs that are running a new competition and planning to use evidence in those competitions and (2) published notices inviting applications (NIAs) in the Federal Register.

For the first indicator, we finalized spending and policy decisions for each program in the early months of Q2. These decisions are captured in an internal database, which can then be used to determine whether it is likely that ED will meet its goal. As of Q4, all but two programs contributing to the APG have finalized their FY16 competitions. Based on preliminary data, we have exceeded our goal for this year but are unable to report final numbers at this time.

For the second indicator, all programs running new competitions must publish an NIA in the Federal Register, generally by the end of Q2. As noted above, due to delays earlier in the year not all NIAs were published by the end of Q2; however, as of the end of Q3 all evidence-based competitions (save one, whose funding availability extends into FY17) were announced. Each NIA is publicly available on the Federal Register website. When a program running an evidence-based competition publishes its NIA, we know we can count at least a portion of its funds available for new awards towards this APG. After the NIAs are published, the next main milestone that we track is when awards are made. Most ED programs make their awards by the end of Q4, with a few exceptions that will make awards in the following fiscal year.
Because many of ED’s competitive grant programs involve offering professional development, IES offered a five-part webinar series on designing strong studies of professional development through the Regional Educational Laboratory Southeast. The series was offered live in January 2016, just as some applicants were designing their proposed evaluation activities. The archived webinars, agendas, and handouts are available on the IES website. In September 2016, IES launched a webpage for the evaluation-related technical assistance materials that were used to support i3 and FITW grantees (available here: http://ies.ed.gov/ncee/projects/evaluationTA.asp). The purpose of this webpage is to make these tools and materials widely available, including to applicants and grantees of other ED programs that require evaluations.
In addition, IES has implemented a robust strategy for managing WWC study reviews at scale and on an accelerated timeline. IES currently has five contracts that comprise the WWC investment, including a task order contract that allows for quick-turnaround reviews of studies that are submitted as part of grant applications. The WWC also is developing an online reviewer certification course, which may generate a greater number of individuals who are familiar with the WWC standards and capable of conducting study reviews. IES is preparing enhancements to the WWC’s reviewed studies database and the Find What Works interface, two key features on the WWC website that allow practitioners and policymakers to search for evidence from effectiveness studies. The new Find What Works tool and the reviewed studies database portal will be released as part of a major WWC website overhaul, which is scheduled for launch in October 2016. In addition, in early June 2016 IES launched a new online tool, RCT-YES, intended to help ED grantees design and conduct evaluations that meet WWC Evidence Standards.
The RCT-YES software package (Version 1.0) was released on June 14, 2016, and an updated version (Version 1.1) was released on July 25, 2016 that fixed a few bugs identified by users. Through October 26, 2016, the RCT-YES website had 12,060 visitors (from all 50 states and DC, as well as international users), and 672 downloaded the software. In the year leading up to the release, demonstrations of RCT-YES were conducted at six conferences, two foundations, and three agencies beyond ED (the Office of Management and Budget, the Department of Health and Human Services, and the Department of Labor).
To ensure that ED stays on track to meet our goal for the second metric, IES and the i3 program maintain internal databases with projected schedules for when grantees will release their evaluation findings. Although i3 grantees are required to make their evaluation findings publicly available, they are not required to make them available in a way that most easily facilitates WWC review. Despite this, the i3 program staff continue to encourage their grantees to submit final reports to ERIC (where they remain public in perpetuity) and indicate that they are i3-funded studies. The i3 program and IES staff are also in regular communication about when evaluation findings have been made publicly available and are ready for WWC review.
Performance Indicators
By September 30, 2017, ED will increase to 20% the percentage of new competitive grant dollars that support evidence-based strategies.
By September 30, 2017, ED will increase by 20 the number of ED-funded project evaluations that provide credible evidence about what works in education.
Contributing Programs & Other Factors
CONTRIBUTING PROGRAMS
ED is currently planning for the following programs to contribute to one or both metrics of this APG:
Striving Readers
TRIO – Talent Search
Charter Schools Program
Investing in Innovation Fund (i3)
Various Higher Education Act Title III Programs
Various Office of Special Education Programs
National Professional Development
Innovative Approaches to Literacy
For additional programs see Appendix D of the Department’s FY2015 Annual Performance Report and FY2017 Annual Performance Plan, available here: http://www2.ed.gov/about/reports/annual/2017plan/2015-2017-apr-app-plan-...
Strategic Goals
Strategic Goal:
Continuous Improvement of the U.S. Education System
Statement:
Enhance the education system’s ability to continuously improve through better and more widespread use of data, research and evaluation, evidence, transparency, innovation, and technology.
Strategic Objectives
Statement:
Facilitate the development of interoperable longitudinal data systems for early learning through employment to enable data-driven, transparent decision-making by increasing access to timely, reliable, and high-value data.
Description:
Statement:
Provide all education stakeholders, from early childhood to adult learning, with technical assistance and guidance to help them protect student privacy while effectively managing and using student information.
Description:
Statement:
Invest in research and evaluation that builds evidence for education improvement; communicate findings effectively; and drive the use of evidence in decision-making by internal and external stakeholders.
Description:
Statement:
Accelerate the development and broad adoption of new, effective programs, processes, and strategies, including education technology.
Description:
To achieve the president’s 2020 college attainment goal, the nation’s education system will need to graduate many more college-ready students from high school, ensure they have access to postsecondary education, and support them as they complete their degrees—all while facing resource constraints. When other sectors of the economy need to become better, faster, or more productive, they innovate, often relying on technology for help. The education sector is no different, and the need for innovation—and its benefits—spans grade levels, curricular areas, and student needs.
A 21st-century infrastructure that harnesses modern technological advances and provides easy access to high-speed Internet can serve as a platform for greater innovation in education. Accordingly, the Department will continue to focus on ways to improve schools’ technology infrastructure and effective use of technology. It will also continue to work with Congress to establish a new advanced research projects agency for education that will use directed research and development activities to pursue breakthrough technological innovations in teaching and learning.
Technology holds the potential to expand all students’ opportunities to learn, including by supporting personalized learning experiences, providing dynamic digital content, and delivering more meaningful assessments. Technology can also help districts and schools support teachers in becoming more effective and better connected to the tools, resources, and expertise students need, and help them meet more rigorous college- and career-ready standards. It can likewise give students and school library media specialists increased access to academic tools and other resource-sharing networks. Technology-enabled instructional and assessment systems will be pivotal to improving student learning and generating data that can be used to continuously improve the education system at all levels. Innovative technology must be matched by innovative educational practices to maximize its potential to improve learning and instruction for all students, and it must be accessible to all students, including students with disabilities. Leadership is essential to ensure that innovative applications are disseminated and brought to scale.
Agency Priority Goals
Statement:
By September 30, 2015, the percentage of select new[1] (non-continuation) competitive grant dollars that reward evidence will increase by 70%.
[1] “New competitive grant dollars that reward evidence” includes all dollars awarded based on the existence of at least “evidence of promise” in support of a project, per the framework in the Education Department General Administrative Regulations (34 CFR Part 75). Consideration of such evidence appears through: eligibility threshold (e.g., in the Investing in Innovation program); absolute priority; competitive priority (earning at least one point for it); or selection criteria (earning at least one point for it). The percentage is calculated compared to the total new grant dollars awarded.
Description:
GOAL OVERVIEW
Through its mix of grants, contracts, and internal analytic work, the Department of Education (ED) will support the use of research methods and rigorous study designs that provide evidence that is as robust as possible and fit for the purpose. This goal will track whether ED is increasing its internal capacity to make competitive grant awards based on the existence of (and amount of) evidence in support of projects, where appropriate.
KEY BARRIERS AND CHALLENGES
The process to collect data and track progress against the goal is still under development, and using evidence to award competitive grants entails a shift in culture and capacity building across ED to do it well. Additionally, goal targets are based on reasonable projections about which competitive grant programs may make new awards in this fiscal year, but the actual dollar amount awarded will depend on final appropriations amounts and other funding decisions and trade-offs. Grantees vary in their comfort with and understanding of evaluation and use of evidence, yet ED has limited resources to support grantees in conducting rigorous evaluations that would produce evidence of effectiveness.
EXTERNAL STAKEHOLDERS
There is an increasing emphasis among stakeholders on the importance of using evidence to support government program funding decisions, and ED regularly engages the field on this topic. A number of outside organizations have convened experts to discuss how to encourage such decisions. In addition, philanthropic and congressional actors prioritized using evidence to support decision-making and have encouraged the field to do the same. Finally, ED has worked with the National Science Foundation to develop a common evidence framework around which to organize research investments and grants. Indeed, ED is considered a leader on the issue among federal agencies, and external groups are eager for ED to deepen and broaden its efforts.