The tools used to create EdReports’ mathematics and English language arts reports have been in use for more than five years. As a learning organization, we regularly examine our review process, evidence guides, and review criteria to determine what, if any, updates need to be made.

Beginning in the summer of 2019, EdReports began formally collecting feedback on our review tools (review criteria, corresponding evidence guides, and the overall review process) as part of our ongoing effort to learn and improve our reports. We have synthesized feedback from the field and applied strategic updates to our tools. Reviews using the revised tools will begin in spring 2020, with completed reviews launching early next year.

We have put together a list of FAQs that provide insight into our revision process and address the potential impact on the education field and publishers.


Q: What are considered to be EdReports’ “tools”?

A: EdReports “tools” consist of our review criteria, corresponding evidence guides, our training, and the overall review process. These tools are intended to be applied by trained educator reviewers to determine the degree to which instructional materials meet college and career-ready standards and usability indicators. The goal of our reports is to empower schools and districts with evidence to make critical instructional materials selection decisions.

Q: Why is EdReports revising its review tools?

A: EdReports’ theory of action is that by providing credible information (free, public reports on our website), we will help drive demand for quality instructional materials. As more states, districts, and schools demand quality materials, publishers will be influenced to improve the quality of curriculum, which will ultimately improve student learning. Consistent with our mission, we are examining how we can make these reports even better and more helpful to the field.

We published our first reviews more than five years ago. Since then, we have received constructive feedback from the field and our reviewers about how to improve our reports. We continually track potential improvements to our review processes and products, and we are always exploring, listening, and reflecting to determine what type of maintenance may be necessary. 

In particular, we have made significant changes to indicators focused on usability (gateway 3) to respond to how much materials have improved since our organizational launch in 2015. Then, only 1 of the 19 programs reviewed met expectations for alignment. Now, educators have more aligned programs to choose from. We realized that we could provide more information about product usability to support a district’s selection and implementation process.

We have revised the tools—we have not recreated them. 

Q: Is this revision of your review tools in response to the COVID-19 health crisis?

A: Work on these revisions began in 2019 and pre-dates the current health crisis. To meet emerging needs from districts, EdReports will begin providing more information in summer 2020 about the components of curriculum that support critical needs, such as remote learning and digital components, for more than 200 existing grade-level materials that meet expectations for alignment. This is in response to the growing need from school districts for more precise information and guidance about specific characteristics of materials (e.g., ease of implementation, student- and parent-facing components, availability of digital components) so that districts can learn more about these components during their curriculum adoptions.

Q: Are all review tools being updated?

A: Revisions to indicators focusing on questions of usability (gateway 3) have been implemented across all content areas (mathematics, ELA, science). Revisions focusing on questions of standards alignment (gateways 1 and 2) have been made in our K-12 English language arts and K-8 mathematics tools. 

Our science and ELA foundational skills review tools are relatively new and therefore do not require revisions at this time.

Q: What exactly is EdReports revising or updating?

A: The goal of this tool revision is to minimize disruption for the field and continue to provide high-quality evidence of the greatest value. We have incorporated our learnings and reflections from conducting several hundred reviews, as well as feedback from the field about what information is the most valuable.

The revisions are based primarily on feedback from educators and researchers and include:

  • Improving the review process by making it more efficient;
  • Modifying the report structure so content is easier to navigate and use; 
  • Updating a small number of indicators focused on alignment (gateways 1 and 2) in ELA and mathematics review criteria; and
  • Updating usability indicators (gateway 3) to better align to what teachers and students need to successfully utilize instructional materials.

Q: Why are so many changes being made to usability indicators?

A: Gateway 3 focuses on the question of usability: Are the instructional materials user-friendly for students and educators? Materials must be well designed to facilitate student learning and enhance a teacher’s ability to differentiate and build knowledge within the classroom. In order to be reviewed and attain a rating for gateway 3, the instructional materials must first meet expectations for alignment to college and career-ready standards (gateways 1 and 2).

When we released our first mathematics reports in 2015, only one program met expectations for alignment, which meant gateway 3 was not applied to the majority of our early reviews. A similar pattern emerged in early ELA materials reviews. In the past five years, more materials have met criteria for standards alignment. As a result, gateway 3 is becoming a much more important aspect of our reviews. In analyzing research and listening to feedback, we realized we could provide more valuable information to the field around these criteria.

Q: What are the changes you made to indicators around usability and design (gateway 3)?

A: Our revision of gateway 3 addresses details about program format, student populations that require support for language acquisition, learner variance, and teacher supports for understanding the standards and utilizing the materials. The tool will include a ‘Technology Criteria Form’ that allows for the collection of a greater level of detail on features such as data privacy and interoperability with learning management systems.

In addition, gateway 3 changes increase clarity and consistency across all content areas. Primary changes include: 

  • Streamlining the review tool from five rating sheets to four
    • Teacher Supports – identifies opportunities and guidance for teachers to effectively plan and utilize materials with integrity and to further develop their own understanding of the content
    • Assessment – identifies what assessments are present, which standards they are intended to assess, and how the assessments across the program provide information on what students are learning or have learned
    • Student Supports – identifies information related to learner variance, English learners, and what students need to engage with content
    • Intentional Design – identifies how the program is designed for use in the classroom 
  • The inclusion of a separate, unscored Technology Criteria Form
     

The revisions also clarify student populations and rename the “Differentiated Instruction” criterion to “Student Supports.” Under “Student Supports” we:

Created specific indicators for Learner Variance that look for:

  • Supports and/or accommodations
  • Grade-level engagement
  • Opportunities to meet or exceed grade-level expectations
  • Support for below grade-level/unfinished learning
  • Varied approaches to learning tasks/opportunities to demonstrate learning

Created specific indicators for English Learners (EL) that look for:

  • EL content and lesson objectives that are grade-level/age appropriate and of equal rigor to non-EL instruction to help ELs meet grade-level standards. 
  • A curriculum that reflects the experiences and perspectives of a variety of cultural and linguistic groups.

Created a Technology Criteria Form* that will be given to a publisher at the time of materials purchase and posted alongside the report with the publisher’s response.

* Due to emerging needs from school districts in response to the COVID-19 health crisis we are revising our Technology Criteria Form to include more information about the components of curriculum that support distance learning, including digital components. The revised form will be available early summer 2020.

Q: What sort of research or stakeholder engagement took place as part of the revisions?

A: The standard first step for all EdReports content review expansions is a listening and learning tour. EdReports must first learn about the market and answer key questions about the status of standards, what the need is, and whom to engage for feedback. The tour includes researchers, nonprofits, publishers, states, districts, and classroom educators to ensure we gather feedback and that people feel heard and are clear on next steps. The same learning process applied to the tool revision.

In addition: 

  • EdReports executive leadership and board members utilized a reflective process designed by the Center for Public Research and Leadership to guide decision making and planning.
  • We worked closely with states and districts to customize our tool for an internal curriculum vetting process, including rethinking our approach to equity, learner variance, and English Language Learners.
  • We conducted an internal audit of our tools and coordinated with experts and organizations with deep experience in working with students with learning differences and English learners.

Q: How will these revisions affect programs that have already been reviewed by EdReports? Will they be re-reviewed against the new criteria? 

A: EdReports considered this question at length, as we have completed 700 grade/course reviews in the past five years, some with copyrights dating back to 2008. We consider this tool revision to be scheduled maintenance rather than a radical new approach. We stand by all of our reviews and believe that they provide a host of important evidence. This set of revisions to the tools provides more fine-grained information to help districts make choices.

For these reasons, EdReports will not be conducting retroactive reviews as part of the rollout of the revised tools.* However, EdReports’ established policy is that it stands ready to re-review materials when they have been substantively updated.

Reviews utilizing the new tools begin this spring with the first round of reports published early in 2021. 

* Due to emerging school district needs arising from the COVID-19 health crisis, EdReports will provide additional information this summer about critical components of curriculum, such as remote learning and digital resources, for more than 200 existing grade-level materials that meet expectations for alignment.

Q: Can publishers request a re-review of their program?

A: We stand ready to re-review materials when they have been substantively updated. Our desire is to make these decisions collaboratively with publishers. We ask publishers to indicate where the changes are and discuss whether they would have an impact on select indicators or the review overall. Depending on the answer, we may bring together a new review team to re-review the materials in their entirety or to review for just a few indicators.

Q: Can I request to be reviewed on the old tool?

A: No. In order to ensure an equitable process for all publishers, programs scheduled for review after May 2020 will be reviewed using the revised tools. The exception to this policy is materials that are part of a K-8 series in which a portion of the series was reviewed on the initial tool. The remaining grades in the series will be reviewed using the same tool to ensure consistency throughout the program.

Q: Did EdReports get rid of indicators around differentiation?

A: Differentiation indicators are still present, but they are now labeled Student Supports and include more expansive coverage. For example, we include indicators that are more specific to student needs, such as those learning English.

Q: For programs that have been reviewed, is it possible that some of them would receive a higher score with the revised tools?

A: We have made minor changes to our criteria focusing on standards alignment (gateways 1 and 2) and feel confident that overall alignment ratings will remain consistent. The bulk of our changes are focused on usability (gateway 3), and we will look at new versions (copyrights) of programs with our revised tool as they become available.

Q: How frequently does EdReports plan on updating its tools?

A: We are always cautious about making changes because it’s important for school districts and other stakeholders to know that our reviews are accurate and stable. We understand that consistency is important, but we will not use consistency as an excuse not to innovate. We are a learning organization that continues to evolve and listen. Our intention is to revise our tools frequently enough to reflect research consensus and ensure our reviews remain relevant to the field, but to do so without causing confusion.

Q: Why didn’t EdReports include more indicators around supporting diverse student populations such as English Learners?

A: Regarding English Learners, it is important to note that the research and evidence on student needs in curriculum is still developing. We are proud of the first steps we have taken with new indicators that provide information about English Learner supports in our review tools; however, we are mindful not to force a consensus on the research and to respect the emerging evidence-based approaches currently being discussed. At EdReports, our role is not to impose conclusions that have yet to be established. Rather, once consensus is reached, we aim to partner with educators who know materials to determine the appropriate review criteria for assessing curriculum.

Content Area-Specific Changes

Mathematics K-12

Q: What changes are happening with the mathematics review tools?

A: Our mathematics tools were the first to be released when we launched EdReports in March 2015. In the past five years, we have been listening to and collecting feedback from the field while simultaneously monitoring changes in mathematics programs. The changes to gateways 1 and 2 in mathematics address coherence in gateway 1 and the mathematical practices in gateway 2.

Gateway 1 revisions focus on providing more detailed information on coherence. We achieve this through these key changes:

  • Report the number of instructional days in a program as narrative evidence only.
  • Delineate coherence indicators by vertical (between grades) and horizontal (within grades) coherence. 

Gateway 2 revisions look at all mathematical practices in more depth. We achieve this with these key changes:

  • Provide guidance for engaging students with each practice and opportunities for students to meet the full intent of each mathematical practice.
  • Emphasize mathematical practice 3: Construct viable arguments and critique the reasoning of others.
  • Improve reporting on mathematical practice 6: Attend to precision with the specialized language of mathematics. 

Gateways 1 and 2 for our high school mathematics review tool are unchanged.

ELA K-12

Q: What changes are happening with the ELA review tools?

A: Our revisions to K-12 ELA review tools focus on bundling indicators to make the resulting reports more streamlined. 

In gateways 1 and 2, we are making these key changes: 

  • Identify “bloat”—when a program is difficult to use because there is more content than can be feasibly taught in a single school year.
  • Align the foundational skills indicators that evaluate core comprehensive ELA materials to match our supplemental foundational skills tools. The purpose of this change is to give districts more opportunity to dig deeper into essential elements, such as phonics and phonemic awareness, that are connected to how kids learn to read in the early grades.

To compare the v1.5 and v1.0 tools for K-5 ELA specifically, see our crosswalk document.

Q: What changes are happening to the ELA foundational skills tools?

A: Our ELA foundational skills review tools were launched in 2019, and we feel confident in their current status. As with our other content areas, we want our foundational skills reviews to be out in the field long enough for users to provide feedback on what’s working and what can be improved.

Science K-8

Q: What changes are happening with the science review tools?

A: Middle school science reviews were launched in February 2019 and K-5 reviews will be released in the second quarter of 2020. No revisions will be made to gateways 1 and 2 of the review tools as we feel confident about their current status. As with our other content areas, we want our science reviews to be out in the field long enough for users to provide feedback on what’s working and what can be improved. At this point in time, the current feedback reveals that the science review tools are producing quality evidence that is helpful to the field.

However, we will be updating usability criteria (gateway 3) for our science tools to align with changes being made to our ELA and mathematics tools.

Using New Reports

Q: Does the introduction of EdReports’ updated tools mean reviews conducted on the old tools are irrelevant?

A: No. As a learning organization, we see this revision as part of a regular cycle of learning and reflection; it does not mean our previous tool was wrong. We stand by our reviews and believe that the indicators provide high-quality information to the field. However, we also knew it was time to reexamine the market and revise our indicators to provide more fine-grained information to help districts make choices.

Q: Can I use reports reviewed on the old tool alongside reports reviewed on the new tool during an adoption? Is the information equivalent?  

A: Yes, the foundation of our tools remains the same. Information about standards alignment is concentrated in the first two gateways, and usability information is found in gateway 3. Although there are score-point shifts within some criteria, overall score totals have not changed in gateways 1 or 2. We always recommend examining the details of a report after first seeing how a program aligned to the standards.

For additional questions please contact communications@edreports.org.