Advancing New Hampshire Public Education


Detailed report rates Smarter Balanced best for Common Core related testing


Summary

The Michigan Legislature mandated that its department of education prepare a report comparing all available Common Core testing options.  Twenty department staff prepared a 23-page report, Report on Options for Assessments Aligned with the Common Core State Standards, with detailed tables comparing the 12 potential vendors, including Smarter Balanced, the consortium New Hampshire has joined and helps manage.

The report looked at testing vendors’ ability to provide not only the “summative” (annual) test but also the “interim” (periodic) and “formative” (short-term) tests.  Right now, many New Hampshire school districts buy interim and other types of tests from a variety of vendors.  The report points out:

It will be much more cost-effective for the state to provide interim assessments and formative assessment resources online…, freeing up local resources and helping to ensure comparability across the state.

This is a benefit not discussed much yet in New Hampshire.  The report goes on to conclude:

Smarter Balanced …remains the only viable option that can satisfy all of the multiple needs for test security, student data privacy, a Michigan governance role, Michigan educator involvement, minimizing local burdens, cost effectiveness, Michigan access to all data to allow for verification, and so on. Because Smarter Balanced was designed primarily by state assessment directors who understand these needs, this should not be a surprising result. 


The Details

How completely does each of the 12 testing vendors do the basic job required?

First, the report looked at how well each of the 12 tests would fit with the Common Core standards.  Smarter Balanced (and PARCC, the other multi-state testing consortium) scored highest for covering all of the kinds of content and all of the types of test questions needed to assess the standards.

How much can the state change the test?

The report also looked at whether Michigan would be able to customize the test for its needs vs. just accept an off-the-shelf solution.  The report asked how much opportunity there would be to involve educators in developing test questions.  In addition, how much control would the state have over student data?  Only Smarter Balanced and one other vendor (Houghton Mifflin Harcourt/Riverside) fully met all of Michigan’s stringent requirements for customization and data control.

Computer-adaptive Testing

Like New Hampshire, Michigan puts a priority on offering online assessments, while providing paper-and-pencil tests for schools without the technology. Michigan also agrees with New Hampshire that computer-adaptive assessment, where each student receives a customized test based on performance, is the best way to measure student achievement and growth.  The study concluded that Smarter Balanced was the only vendor able to provide both computer-adaptive testing and paper-and-pencil testing.

How good are the reports?

Will test results get back to the classroom almost immediately to provide the needed feedback to teachers?  Will there be detailed reports on the depth of knowledge students demonstrate in the test?  Will there be flexibility to create customized reports to meet whatever special needs students, teachers, parents and administrators might have?  Only Smarter Balanced and one other vendor (CBT/McGraw Hill) fully met this requirement.

Cost

Only two vendors – Smarter Balanced and PARCC – provided the demanding “performance assessment” type questions in all the grades that need to be tested.  And Smarter Balanced provides its test at a significantly lower cost, $22.50 per student for the computer-adaptive test (which PARCC does not provide) and $15.62 for the paper and pencil test.

What about “interim” assessments?

Schools often want to give “interim assessments” at any time during the school year to get an idea of how students are progressing.  Smarter Balanced got the highest scores across all the criteria and offered among the lowest costs for its interim assessments.

Accessibility

How well does the test meet the needs of students with disabilities and English Language Learners?  Only Smarter Balanced and one other vendor (PARCC) met Michigan’s criteria for accessibility.

Technical Requirements

Michigan wanted to know whether the tests would work on a wide variety of computers and use the least possible bandwidth. Smarter Balanced worked on all seven of the computer systems Michigan asked about – from old Windows desktops to Apple and Google devices – and required very little communications bandwidth.

Are “formative” assessments available?

Formative assessments are the day-to-day feedback students and teachers get about how well each student is doing.  Smarter Balanced was one of only two groups (the other was Discovery Education Assessment) to fully meet all 7 of Michigan’s criteria for formative assessments.


3 Comments

  1. Jack Blodgett says:

    Michigan’s report appears to be comprehensive and thoughtful, carefully narrowing down the field of assessment options to the Smarter Balanced test. And it seems almost impertinent to ask whether the study group considered each test’s impact on teaching and learning – potentially positive or negative – as an additional basis for making recommendations. And though I understand that a number of assessment and scoring issues are still somewhat in flux vis-a-vis the Smarter Balanced process to-date, I think that one issue in particular is worth serious attention: namely, the use of machine-scoring to assess students’ writing.

    The National Council of Teachers of English has issued a strong caution against its likely use by Smarter Balanced, and I have heard similar sentiments expressed by one teacher in a recent guest editorial in the Union Leader. There are better alternatives to machine-scoring which should also be at the table for discussion before we get so far down the road that it will be difficult or impossible to turn back.

    Please see:
    http://www.ncte.org/positions/statements/machine_scoring

    • Bill Duncan says:

      NCTE is off on a side trip about machine scoring here, John. As the Michigan report shows, there is a lot of hand scoring in the SBAC test. And there can be as much more as states want to pay for.

  2. […] adjusts the student’s questions to the students abilities) is far better than the NECAP and was judged in a major study by the Michigan Department of Education to be the best in the country.  And, instead of testing our students in the fall and getting the […]
