If you Google "Cross-Grade Comparisons Among Statewide Assessments and NAEP" by Schaefer, Liu, and Wang (2007), you'll find on page 7 a pair of charts showing that, just as in Vermont, the seeming 70 percent reading-and-math-proficient result is actually 30 percent NAEP-defined proficient.
Adding insult to injury, the charts show a downward proficiency line across the grades: the longer students stay in school, the worse they perform. That's what I term punting from proficiency.
Just as in Vermont, educators in Maryland don't want to make it too easy for you to see this 1/3-to-2/3 ratio in test score results. As a result, they simply choose, while publishing the MSA results, not to publish the comparable NAEPs.
Maryland's Montgomery County page in the state report is typical: on page 3, after reporting 70-to-80 percent MSA proficiency results for all grades, it recommends that "for information on the NAEP, go to [the website]."
Unlike Vermont, Maryland makes it slightly easier to pursue the NAEP scores if you insist, furnishing a link where you can punch them up yourself.
For example, in 2009, Maryland's fourth graders came in at 35 percent NAEP-proficient in math, with whites at 45 percent and blacks at 20 percent (mostly white Vermont came in at 51 percent).
And just as Vermont purchases the NECAP test from private-sector publisher/vendor Measured Progress Inc., Maryland purchases the MSA from private-sector publisher/vendor Pearson Education, whose business ancestry goes back to Scott Foresman.
And you thought public educators were predictably anti-corporate in outlook? Maybe they are: neither state's publications make the link between public ed and its test vendors easy to find. Maryland is even more opaque than Vermont on this apparently uncomfortable subject; the Pearson connection shows up only deep into the Maryland SED website's page on its MSA test protocol.