• Ohio’s ESSA plan thoughtfully continues the state’s recent education reform work, with a special emphasis on the performance of all students—from the lowest-achieving students to its gifted student population.


  • It builds on a strong foundation of high standards, a commitment to transparent reporting on a wide range of innovative indicators, and tailored supports for districts to improve their low-performing schools.


  • Ohio has built multiple incentives into its system: students are encouraged to take advanced courses, and its achievement index rewards schools both for moving students from below proficient to grade-level expectations and for supporting students to reach advanced levels beyond mere proficiency.


  • While work remains to emphasize individual subgroup performance, Ohio is also helping to ensure that more students are included in its accountability system—where student performance is monitored more closely—by lowering its subgroup size from 30 to 15 students.




  • While Ohio includes many innovative measures of school quality, the sheer number of measures creates a complicated system and dilutes the value of many individual measures.


  • Ohio produces overall school grades, as well as grades across individual components—helping to cut through the noise—but even some of the individual components include dozens of measures.


  • In addition, the state doesn’t clearly define how schools with low-performing groups will be identified for targeted support, or how it will ensure more rigorous interventions in schools that fail to improve over time.


  • Ohio’s plan does not clearly tie its long-term goals and annual targets to a strategic vision; its long-term goals expect significant gap closure between subgroups, but even if those goals are achieved, gaps will persist.


  • Finally, Ohio could strengthen its plan by articulating an overarching vision, aligned to state-level policy initiatives, that is driving the system forward.





Ohio’s ESSA plan includes a clear set of long-term goals and annual targets that will reduce achievement gaps in the state over 10 years, with the aim of ensuring all students are prepared for success after high school.


Even though gaps will not completely close over this time period, Ohio provides data as evidence that the goals are ambitious for its schools and districts. For example, only about 5 percent of schools currently meet the state’s long-term achievement goal for all students of 80 percent overall proficiency.


The state’s goal-setting methodology expects subgroups to either cut the gap between current performance and 100 percent proficiency in half or to meet the 80 percent proficiency long-term goal within 10 years. As a result, however, some subgroups’ targets may be more attainable—but less ambitious—than others, and large gaps will remain even if the goals are achieved.


Ohio also wisely aligns its goals with its objectives for A-F school grades.


The state sets a 93 percent four-year graduation rate as its long-term goal, which also reflects the rate required for a high school to earn an A on that component of the grading formula. Other aspects of the A-F school grades system are similarly aligned, including its target proficiency rates, where school grades take into account whether a school has achieved the 80 percent proficiency rate reflected in the goals. Ohio also sets goals on additional measures, including the five-year graduation rate, its Performance Index in English language arts and math, and two of its school quality indicators.


For English learners, Ohio will consider a student’s initial level of English language proficiency (ELP) and grade level in setting progress targets, which expect either a one- or two-point increase in a student’s composite score on the ELP exam each year. While Ohio states the average student is reclassified as a non-English learner within seven years, its plan does not clearly indicate a maximum period of time that will be used in setting goals; this should be added. The state’s long-term goal is for 75 percent of English learners to be making sufficient progress toward ELP in 10 years, a significant increase from the baseline of 45 percent and a standard less than 5 percent of districts currently meet.



Ohio has built its accountability system on high-quality standards and assessments that are aligned to college and career readiness, including multiple end-of-course assessments in high schools.


As a result, Ohio aims to continue using flexibility to enable all middle school students to take high school end-of-course exams early, instead of their middle school tests (beyond the flexibility in ESSA for just eighth-grade math). These students would take more advanced end-of-course tests in high school, although a subset would take the ACT as their high school assessment because they will have already taken all of the end-of-course exams.


Ohio provides math, science, and social studies assessments in Spanish, the only language spoken to a significant extent among its English learners.


Ohio also provides a list of all other languages spoken by its English learners, and notes that districts are required to offer language translation services for all state-mandated tests. While English language arts tests are only offered in English, the state notes that English learners are permitted extra time and other accommodations.


Finally, Ohio could strengthen its plan by providing more information about its alternate achievement standards and aligned assessments for students with the most significant cognitive disabilities, including the steps it will take to ensure that participation in the alternate assessment does not exceed the 1 percent cap.



Ohio proposes a large number of indicators that incorporate stakeholder feedback.


These indicators include many innovative approaches that go beyond ESSA’s requirements to focus on K-3 literacy, college and career readiness, and students’ performance at all levels, including the lowest-performing and gifted students. The state clearly heard from stakeholders about including multiple dimensions of school quality on report cards, such as allowing districts to provide narrative information on their accomplishments and priorities beyond the reported measures.


Ohio organizes its indicators into six main “components”: achievement, progress, gap closing, graduation, “prepared for success” (i.e., postsecondary readiness), and K-3 literacy improvement. While each component is potentially promising on its own, the state should be cautious about incorporating too many indicators; doing so makes the system complex without adding much differentiation among schools. With so many indicators in the system, and multiple measures within each, the summative ratings may not provide clear information to parents and the public.


To measure achievement, Ohio combines two different measures.


One is a Performance Index, where students scoring above proficient levels receive additional weight and students scoring below proficient are weighted less. The other is an “Indicators Met” calculation, which measures the extent to which schools meet benchmarks on a set of indicators covering both achievement and other factors, like its chronic absenteeism indicator. Indicators Met assesses whether students met the state’s targets on every required test, including science and social studies, which may help address concerns of curriculum narrowing. The measure also supports alignment between the state’s goals and its A-F system. However, Indicators Met is only one part of one component in the formula, and also includes factors that are quite different from achievement, like chronic absenteeism and a measure of gifted students’ performance, progress, and access. As a result, an individual metric within Indicators Met may not receive significant attention.


While the Achievement component only measures all students, subgroup performance is captured in a second component: Gap Closing. The Gap Closing component also has sub-components, including subgroup-specific performance indices, graduation rates, and progress toward English language proficiency (ELP). While it is worthwhile for Ohio to explicitly include subgroups in one of its six components, it is unclear whether the progress of English learners toward ELP will be overshadowed, given that it is not a stand-alone indicator.


Ohio deserves recognition for emphasizing on-time graduation and postsecondary readiness.


The state weights the four-year graduation rate more heavily than the five-year rate in its Graduation Rate component, emphasizing on-time completion while recognizing that some students need additional time. However, it will be important to ensure both rates are reported separately to provide transparency and clarity for stakeholders. High schools are also evaluated on an innovative Prepared for Success component, which includes multiple measures of college and career readiness. Students are counted as “ready” if they earn a college-ready score on the ACT or SAT, earn an industry-recognized credential, or graduate with an honors diploma, and schools receive a bonus for students earning college credits via dual enrollment or Advanced Placement or International Baccalaureate exams.


Finally, the K-3 Literacy measure calculates the percentage of students in grades K-3 who were off track in reading and either get back on track to 3rd-grade literacy or score at the proficient level in 3rd grade on the ELA test. This is a novel approach other states could consider emulating, especially as most elementary school measures do not include the early grades.



Ohio’s system is complex—but it aims to balance the incentives for students to meet grade-level targets in its Achievement and Gap Closing components with incentives to encourage all students to make progress year-over-year.


Its Achievement and Academic Progress components each receive equal weight in the A-F formula, with additional weight provided for Gap Closing.


Because the Achievement component is primarily based on Ohio’s Performance Index, there are incentives to move students not merely to the proficient level but to higher levels of achievement. But the index does not allow students scoring at advanced levels to fully compensate for lower-performing students. This nuanced approach is echoed and reinforced by the Gap Closing component, which includes the Performance Index for each individual subgroup in a school as half of the overall component.


The Academic Progress component includes a value-added growth metric in English language arts, math, and science.


The inclusion of science, as well as the state’s ability to measure growth in high school, sets Ohio apart. So does Ohio’s inclusion within Academic Progress of three other value-added measures: growth of the lowest-performing 20 percent of students in the school, students with disabilities, and gifted students. This recognition for student growth across the performance spectrum is unique.



Ohio attempts to ensure that more students are included in its accountability system and their performance is monitored more closely.


The state does this by including a specific weight for subgroup performance in its A-F system (via the Gap Closing component) and lowering its minimum group size for accountability to 15 students over three years. Ohio’s plan should be commended for its thorough analysis of the effect of choosing different minimum group sizes and its process for consulting with stakeholders on the best approach. Because the state will lower its group size from 30, the percentage of schools held accountable for students with disabilities will increase from 58 percent to 86 percent.


The Gap Closing component accounts for 15 percent of a district’s grade and from 17 to 27 percent of a school grade (depending on grade level).


While this weight is less than the weight for either the Achievement or Progress components, it will send a signal that subgroup performance matters. It is also notable that a school cannot receive an A for Gap Closing if a subgroup scores below an established minimum threshold on the Performance Index in either subject, or has a four-year graduation rate below 70 percent.


However, the state’s approach to identifying targeted support schools could be improved and clarified to ensure schools with struggling subgroups receive the supports and attention they need. In addition, Ohio’s long-term achievement goals for subgroups emphasize gap closing, but would not completely close gaps between all students and individual subgroups.


Ohio should be commended for examining subgroup participation, as well as that of all students.


A school’s Gap Closing grade will be docked if any subgroup does not meet the 95 percent participation rate requirement. Similarly, any nonparticipating student is counted as a zero in the Performance Index in both the Gap Closing and Achievement components of the A-F grade. This could have real consequences and will help preserve the integrity of the A-F results and ensure all students are included.


Ohio’s plan stands apart for measuring the performance of gifted students as a separate subgroup and its emphasis on moving students to advanced levels.


First, the scores of students taking tests above grade level are given extra weight in Ohio’s Performance Index. Second, Ohio’s A-F grades examine opportunities for gifted students, including the prevalence of traditionally underrepresented students, in its Indicators Met measure and in the Academic Progress component.



Ohio’s accountability system produces a single summative A-F grade overall and individual grades for each of six components, which can help educators, parents, and the public understand school performance in context and be used to identify schools for support and interventions.


Ohio should also be recognized for its efforts to improve its school and district report cards, adding new features to help communicate school performance along a number of dimensions.


With six components and multiple subcomponents, the sheer number of measures the state includes is cause for concern.


The six components in the A-F system appear simple, but most also have multiple sub-components, adding significant complexity and making it challenging to discern the emphasis any one metric receives. For example, Achievement is weighted at 20 percent of a district’s grade and includes two sub-components, the Performance Index and Indicators Met. The Indicators Met measure is worth a quarter of the total weight afforded to Achievement, but Indicators Met itself includes 29 different indicators for districts. As a result, individual metrics account for very little of a school’s summative rating. For example, chronic absenteeism—one of the Indicators Met—accounts for less than 1 percent of a school’s overall grade.


Overall grades make identification for comprehensive support straightforward.


Overall grades will be used to name the lowest-performing 5 percent of Title I schools, which Ohio will call “Priority” schools to maintain consistency with its previous system. Ohio will identify these schools, along with any high school with a four-year graduation rate below 67 percent, once every three years, beginning in 2018-19. Three years later, in 2021-22, Ohio will also identify for comprehensive support any schools that were previously in targeted support but did not improve.


Ohio could clarify how it will identify targeted support schools, or “Focus” schools.


The plan defines four different ways to identify them: one based on the Performance Index for subgroups, one based on Gap Closing, one based on subgroup performance benchmarks over multiple years, and one based on “locally determined” goals. While identifying struggling subgroups in multiple ways could be beneficial, the plan provides insufficient detail, which raises questions. For example, how many years qualify as “multiple,” and how can “locally determined” goals yield a consistent statewide approach?


Ohio’s approach to identifying targeted support schools needs clarification before it is possible to assess whether schools needing support will actually receive it. The state should be recognized for going beyond ESSA to flag additional schools in a “Watch” category each year if their students with disabilities, English learners, low-income students, or gifted students are not reaching satisfactory levels of achievement and growth.



Ohio’s plan for supporting low-performing schools tends to focus on, and work through, its low-performing districts.


The plan features a detailed continuum of district supports, with rigorous interventions for those that fail to improve, including a comprehensive district and state diagnostic review, required participation in a peer-to-peer network, and the possibility of appointing an Academic Distress Commission to oversee the district. The state also provides technical assistance, such as tools for data analysis, resource allocation, and comprehensive planning, with clear goals and objectives. To support districts in selecting evidence-based interventions, Ohio also plans to create an online clearinghouse with evidence-based strategies and programs. Schools would not have to select strategies from the clearinghouse, but it could be a valuable resource to build their capacity and knowledge.


There are places where Ohio could strengthen its plan by making clearer commitments about what it will do, and when, to support low-performing schools.


The plan does not include more rigorous, state-determined interventions for comprehensive support schools that fail to meet the exit criteria, stating only that these schools may be subject to additional monitoring, on-site reviews, and scrutiny of their interventions. It also appears that Ohio may require these districts to apply for funding to provide Direct Student Services to their students, but the plan is ambiguous about whether the state is using this flexibility—as well as, more generally, about how it will allocate the required 7 percent set-aside of Title I funds to districts.


To support equitable allocation of resources, Ohio collects and reports school-level expenditure data, and will develop a review process for resource allocation to help identify inequities between districts and schools and determine ranges of acceptable allocations. This data will be used to inform improvement planning, funding models, and expenditures. Ohio’s plan to improve equitable access to quality teachers is also commendable. That said, Ohio’s plan could be strengthened by discussing how the state plans to distribute federal school improvement resources (i.e., the 7 percent Title I set-aside) to districts on behalf of identified schools.



Schools will exit improvement status if they are no longer in the bottom 5 percent of Title I schools for two consecutive years.


While it is helpful to look for gains over multiple years, a school could exit improvement simply because other schools got worse, not because the school itself improved. Ohio’s exit criteria for schools in comprehensive support due to low subgroup performance suffer from the same flaw: the only expectation is that subgroup performance improve enough that no individual group performs similarly to subgroups in the bottom 5 percent. Ohio should consider adding other expectations for progress—such as improving the school’s grade or performance on key components—to ensure that exiting schools have made real gains over time.


Ohio’s exit criteria for targeted support schools are not based on relative rankings and expect schools to exit status within four years.


Four years is a substantial window. To exit, “Focus” schools must earn a C grade overall on the Gap Closing component and meet the state’s subgroup performance goals. These criteria may be rigorous, but Ohio could better align its identification and exit criteria to ensure that exiting “Focus” schools demonstrate sustained gains for the subgroup that was previously struggling.



Ohio has articulated a number of places where it plans to monitor implementation over time and make changes accordingly.


For example, it may add new indicators, such as access to advanced coursework, 9th-grade persistence, and the ASVAB as an indicator of military readiness. This suggests a commitment to continuous improvement, as does its plan to monitor specific indicators already included in the system and make updates to ensure continued alignment with state priorities and goals. For example, Ohio may incorporate a “discipline check” of a school’s suspension and expulsion data into its chronic absenteeism measure to ensure that high suspension and expulsion rates are not being overlooked.


The state appears committed to continuing stakeholder engagement, especially at the local level, but the plan lacks implementation timelines for many of the tools and resources it plans to develop.


For instance, it is developing a toolkit to help schools and districts collaborate with partners to determine priorities for federal funds and set continuous improvement goals. It will also be important for the state to continually engage stakeholders; it will be especially instructive to measure how stakeholders and parents interact with its report cards. Ohio could also consider developing an overarching stakeholder group to provide feedback and help the state make decisions.
