• The most promising portion of the state's plan is Maryland's commitment to high standards and assessments and, just as important, its sustained adherence to the same standards and assessments over time. This continuity provides the state's accountability system with a strong foundation that all players can leverage to ensure students graduate prepared to succeed after high school.


  • Maryland’s exit criteria for comprehensive support schools are also commendable for requiring schools to reach and maintain objective targets before exiting.




  • Maryland’s system of indicators is unwieldy and includes measures unlikely to differentiate among schools; this will likely dilute the impact of stronger, more meaningful indicators in the overall system. This challenge is exacerbated by the state’s proposed weighting system, which underweights the stronger academic elements.


  • The system fails to emphasize the performance of subgroups in the calculation of summative accountability scores, potentially obscuring uneven outcomes for some groups of students. While the plan refers to the potential for including some measure of the “Equity Gap” in the future, the measure is vaguely defined.


  • While the supports for low-performing schools described by the state include some promising language, the plan lacks a coherent system of support. It fails to instill urgency around prescribing rigorous, transformative strategies likely to turn around schools and districts or to provide high-quality options to students as quickly as possible. Most importantly, the plan fails to articulate serious consequences for persistent low performance.





Maryland’s academic achievement goals are all based on a single performance target: reduce the share of students not meeting the standard by half, both statewide and for each student subgroup.


This overarching goal is applied to proficiency rates, graduation rates, and English language acquisition across relevant grades, subjects, and student groups, though the timelines laid out in the plan vary. As a result, the goals set higher expectations for underperforming subgroups, requiring them to make greater progress than higher-achieving groups to close gaps in proficiency and graduation. If achieved, the goals would narrow achievement gaps between groups over time, but substantial gaps would remain.


Maryland’s proficiency goals are certainly ambitious, expecting gains of nearly 40 percentage points by 2030 in the percentage of students demonstrating proficiency across several subgroups relative to the 2016 baseline. For example, the state will expect that 58 percent of African-American students will be proficient in math by 2030, that 54.18 percent of English learners will be proficient, and that 74.36 percent of white students will reach proficiency. However, the state provides no data documenting trends in performance that would allow an assessment of whether the goals are attainable.
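The halving arithmetic behind these targets can be sketched in a few lines. This is a minimal illustration; the baseline figures below are inferred by working backward from the plan's stated 2030 targets, not quoted from the plan itself.

```python
def halving_target(baseline_pct: float) -> float:
    """2030 target if the non-proficient share (100 - baseline) is cut in half."""
    return baseline_pct + (100.0 - baseline_pct) / 2.0

# Inferred 2016 baselines (hypothetical, reverse-engineered from the targets):
print(halving_target(16.0))   # 58.0  -> African-American students, math
print(halving_target(8.36))   # 54.18 -> English learners
print(halving_target(48.72))  # 74.36 -> white students
```

Under this formula, the further a subgroup starts from universal proficiency, the more percentage points of growth its target demands, which is why the goals are more ambitious for underperforming groups.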


Graduation goals aim for a 95 percent four-year cohort graduation rate for all students and among all subgroups by 2020.


The plan refers to a five-year graduation rate goal, but does not define that goal explicitly. Including a five-year graduation goal acknowledges that some students may require more time to meet graduation standards, but it should be clearly defined. Further, the projected graduation rates outlined in the plan show no subgroup attaining the 95 percent goal based on the four-year rate and only one subgroup reaching 95 percent based on the five-year rate—either contradicting the goal itself or suggesting that it is not attainable within the projected timeline. The plan states Maryland's intention to reset goals using 2019-20 data as the baseline.


Maryland bases its goals for English language acquisition on a six-year timeline for all English learners to achieve English language proficiency, including the baseline year. The state sets a 2030 target of 73 percent of English learners meeting this standard.



Maryland uses a rigorous set of standards and assessments aligned to college and career readiness.


The state continues to rely on the Common Core State Standards since first adopting them in 2010 and uses the PARCC assessment program.


PARCC provides translations of English language arts and mathematics assessments in the top 10 languages present in each state, and the state is currently field testing a new science assessment and will begin developing a Spanish-language version once that process is completed.


Maryland uses the WIDA ACCESS 2.0 to assess English language proficiency. This is a widely accepted, evidence-based, high-quality assessment.


Maryland offers an alternate assessment, aligned to alternate academic achievement standards, for students with the most significant cognitive disabilities. The plan would be stronger if the state described the steps it will take to ensure that it does not exceed the 1 percent cap on participation in the alternate assessment.



Maryland’s plan outlines a menu of indicators that mixes robust, objective measures alongside comparatively weak, subjective, and duplicative measures of performance—resulting in an overly complex system.


Among the stronger measures is the Academic Achievement indicator, which combines proficiency rates and average schoolwide performance. The state should be commended for planning to include proficiency on science and social studies assessments among its indicators, once it completes field testing for newly adopted instruments. And the state indicates plans to develop and add measures of readiness applicable to grades K-3 in the future.


However, the non-academic measures include some elements that are easy for schools to satisfy and seem unlikely to differentiate among schools, weakening the overall system’s ability to identify schools most in need of intervention.


For example, as part of its “Completion of Well-Rounded Curriculum” measure for elementary and middle schools, the state counts the percentage of 5th-grade students who pass social studies, arts, physical education, and health courses. It seems unlikely that many 5th-graders fail these courses, so these measures are unlikely to meaningfully differentiate among schools. Other indicators include chronic absenteeism, school climate (based on surveys), and access to well-rounded curricula (a measure of enrollment in various courses at each level). While chronic absenteeism and school climate measures could potentially add meaningful nuances to the system, the state should monitor data to ensure the measures truly add valuable information, and in the case of school climate survey data, produce valid and reliable data across schools and districts.


The access indicators are more problematic—overlapping with other, more robust indicators.


A finer measure of equity of access could focus on instructional opportunities where inequities are identified by data, potentially including gifted and talented services and enrollment across student subgroups in special programs and advanced courses.


At the high-school level, measures of access and completion of a well-rounded curriculum are stronger, focusing on indicators that translate to increased opportunity after high school, such as earned college credit, advanced career and technical education, and industry certification. These measures are not duplicated elsewhere in the system and provide markers of college and career readiness in alignment with state goals. However, the state should ensure that all options for satisfying this indicator, especially those like dual credit and apprenticeship completion that are not externally validated, are an adequate signal of postsecondary readiness.


Given the relative weakness of several of Maryland’s indicators, the state would be better served by a shorter list of stronger indicators.


A simpler system would be easier to explain to parents and the public and would prevent weak measures from diluting the impact on accountability scores of measures that contribute discrete information and truly differentiate among schools.



Maryland is planning to incorporate both student proficiency and growth.


By slightly overweighting growth relative to proficiency, the state’s proposed system should create incentives for schools to focus both on improvement and on standards. However, the overall weight placed on proficiency and growth combined is quite low, and the state’s approach to measuring proficiency is overly complex.


The overall weighting of the most rigorous academic achievement indicators in Maryland’s system—proficiency rates and growth—amounts to just 45 percent of the total score for elementary and middle schools and just 20 percent for high schools, where no growth measure is included. In addition to testing ESSA’s requirement that academic measures be given “much greater” weight in state systems, this low weighting may not send a strong enough signal about the importance of academics.


Maryland could also strengthen its plan by narrowing the focus of its proficiency measure.


The proficiency measure combines the percentage of students achieving proficiency and the average PARCC performance level of students in the school. Combining these two pieces adds unnecessary complexity to Maryland’s system, dilutes the incentive to move students to and above the important bar of proficiency, and provides educators, parents, and other stakeholders with minimal actionable information.


The state’s chosen methodology for assessing growth, the student growth percentile, will give credit for student growth at all levels. The student growth percentile assesses growth relative to that of other students and not against a standard. Maryland should monitor its data to ensure that the growth model is incentivizing schools to make progress toward proficiency. The plan does indicate that Maryland intends to study the addition of a growth-to-standard indicator, but is noncommittal.



Maryland indicates that scores on indicators will be calculated overall and for each subgroup.


However, only the school-wide average scores will contribute to the summative accountability score that translates to Maryland’s 5-star rating system. As a result, a school could receive a high summative score and star rating, and still have underperforming subgroups of students. Subgroup scores will be used to identify schools with underperforming student subgroups for targeted support, but the state does not provide estimates for how many schools might be identified.
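The masking concern is easy to illustrate with a weighted average. The figures below are entirely hypothetical, and Maryland's actual scoring methodology is more involved; the point is only that a large, higher-performing majority can carry the schoolwide number.

```python
# Hypothetical school: a 450-student majority group scoring well and a
# 50-student subgroup scoring far below it; all numbers are invented.
enrollment = {"majority": 450, "subgroup": 50}
scores = {"majority": 85.0, "subgroup": 40.0}  # accountability points

total = sum(enrollment.values())
schoolwide = sum(scores[g] * enrollment[g] for g in enrollment) / total
print(schoolwide)  # 80.5 -- a strong-looking score despite the 40-point subgroup
```

Because only the schoolwide figure feeds the star rating, a school like this one could earn a high rating while its subgroup lags by 45 points.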


An “Equity Gap” calculation based on subgroup performance will be included as part of a school’s final summative rating in the future.


But the plan does not provide a clear explanation of the method of this calculation, nor does it indicate how the Equity Gap measure may factor into the methodology for assigning the summative star rating. The discussion of a discrete measure of equity or achievement gaps goes a step further than many state plans, but it remains to be seen how the measure may be operationalized within the accountability system.


Maryland’s use of a minimum group size of 10 for both reporting and accountability should promote transparency in how well schools are meeting the needs of all students.


Maryland indicates that it will include former English learners in the English-learner subgroup. However, the state does not indicate for how many years and does not provide any data to demonstrate whether or not this will mask the performance of current English learners.


Maryland includes some indicators of equity of access to instructional resources that may illuminate gaps in resources across schools and among student subgroups, but those measures could be improved to target challenge areas identified by data.


For schools with participation rates below 95 percent, Maryland will count non-tested students as non-proficient, but the plan would be stronger if it also included consequences for schools that miss the participation threshold, overall or for particular subgroups.
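As a sketch of how that adjustment works, under the assumption that "counting non-tested students as non-proficient" means the denominator grows to total enrollment (the plan does not spell out the exact formula):

```python
def participation_adjusted_rate(enrolled: int, tested: int, proficient: int) -> float:
    # Illustrative reading of the plan's rule: below 95 percent participation,
    # non-tested students count against the school as non-proficient.
    if tested / enrolled < 0.95:
        return proficient / enrolled  # denominator expands to full enrollment
    return proficient / tested

# Hypothetical school: 100 enrolled, 80 tested (80% participation), 60 proficient.
print(participation_adjusted_rate(100, 80, 60))  # 0.6 rather than 60/80 = 0.75
```

The adjustment lowers the reported rate, but absent further consequences, a school with already-low proficiency has little additional incentive to test all students.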



Maryland’s approach to identifying schools for support relies on a relative approach.


The state says it will identify for comprehensive support the lowest-performing 5 percent of Title I schools according to rank order, based on all indicators in the accountability system using two years of data, as well as high schools that fail to graduate one-third or more of their students. This approach meets the letter of the law, but sets an arbitrary cut point for schools identified for support and may leave out schools that could benefit from additional support.


Additionally, the plan’s emphasis on including “all indicators” in the ranking, its long list of relatively weak indicators, and the low weight placed on academic indicators will dilute the impact of higher-quality measures in school ratings and may prevent the state from homing in on the schools struggling the most.


Maryland does deserve credit, however, for its plan to identify the lowest-performing 5 percent of all schools, in addition to the lowest-performing Title I schools, extending support beyond the requirements of the law.


The state’s method for identifying schools for targeted support is stronger.


The state will identify for targeted support any school with one or more subgroups that fail to meet annual performance targets for two consecutive years. By relying on a standard of performance rather than the relative performance of other schools, this method should better pinpoint schools that are failing to progress toward the state’s goals. However, the plan does not include data on how many schools may be identified.



Maryland’s plan identifies promising support components, and the language indicates that several interventions, such as staffing, scheduling, and programmatic changes, will occur, rather than providing a menu of possible interventions.


However, the plan lacks specificity in describing the process that will be used for enacting these interventions, monitoring their effectiveness, and taking additional action as needed.


Furthermore, Maryland’s plan fails to articulate what happens when schools persistently fail to improve following initial intervention efforts. This creates a concern that nothing meaningful will happen to improve learning in struggling schools.


The plan could be strengthened by documenting how the state plans to use the 7 percent of Title I funds it sets aside—presumably in support of its identified interventions. Additionally, the state should indicate if and how it intends to provide direct student services using the optional 3 percent set-aside.



Maryland’s plan includes strong exit criteria for schools identified for support, requiring demonstrated improvement linked to the state’s goals.


Schools exit comprehensive support when they no longer meet the identification criteria and have met performance targets for two consecutive years. These objective, rigorous criteria will help ensure schools are on the right track before they exit that status.


Maryland is less specific in defining performance requirements as part of the exit criteria for schools identified for targeted support to address underperforming subgroups. The plan states that such schools exit when they no longer meet identification criteria. It goes on to explain that schools must demonstrate “significant progress” toward meeting annual targets for two consecutive years. This language suggests a lower standard than actually meeting annual performance targets, and the plan does not establish a standard for what would qualify as “significant.” A low bar could allow schools to bounce in and out of targeted support status year after year without making real improvements. The state could clarify the exit criteria and establish stronger standards for demonstrating sustained progress for schools receiving targeted support.



Maryland’s plan describes a comprehensive outreach and engagement process around the development of the plan, and the plan includes references to additional areas for study and future consideration throughout.


For example, the plan indicates that the state will consider additional indicators, specifically referencing the state’s proposed Equity Gap measure and revising graduation goals.


Additionally, Maryland plans future consultation and coordination, including bimonthly meetings of the External ESSA Stakeholder Group and continued meetings of the Internal ESSA Workgroup. The state says that meetings are currently scheduled through December 2018.
