Technology and Alternative Assessments for Common Core State Standards


Of key importance to administrators and superintendents today is how to meet the technology requirements for assessing students once the Common Core State Standards are implemented. This might at first seem an unfounded concern, but a close look at what the two companies producing the assessments are asking of schools shows that implementation could cost hundreds of thousands of dollars, if not millions, more than some states are willing to spend. On top of the cost of training teachers to implement the Common Core Standards, public school districts and private schools that adopt the standards will find that purchasing desktop computers, laptops, or tablets takes a significant chunk out of their operating budgets.

The Partnership for Assessment of Readiness for College and Careers (PARCC) and the Smarter Balanced Assessment Consortium are both designing assessments for computers with at least 1 GB of memory and a 9.5-inch (10-inch class) display at a resolution of 1,024 x 768 or greater (Norris & Soloway, 2013). Smarter Balanced recommends at least an 80 GB hard drive, or at least 1 GB of free hard drive space (Smarter Balanced, 2013). This eliminates netbooks, the iPad mini, and any newer devices with displays smaller than the required 9.5 inches; Smarter Balanced recommends the iPad 3 or later running iOS 6 (Smarter Balanced, 2013). School systems are being asked to count all of the devices in their schools capable of delivering the assessment and compare that number to the number of students who will be testing. The recommended device-to-student ratio is 1:6-7 (Kantrowitz, 2013), and many schools are coming up short (Davis, 2012). Even where there is adequate technology for Internet access, the supporting systems may be outdated or incompatible with the new assessment software. The testing software cannot run on Windows XP, for instance, so otherwise well-functioning XP machines are ineligible for this assessment. Some states may initially continue to use paper and pencil for their annual high-stakes tests, but this will not remain an option once a school chooses to assess for mastery of the Common Core Standards.
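The minimum-spec rules above lend themselves to a simple inventory check. The sketch below (a hypothetical illustration in Python, not any consortium-provided tool; the device entries and field names are invented) flags which devices in a school's inventory meet the cited requirements:

```python
# Minimum specs cited above: >= 1 GB of memory, a 9.5-inch (10-inch class)
# display, at least 1,024 x 768 resolution, and Windows XP excluded.
MIN_RAM_GB = 1
MIN_SCREEN_INCHES = 9.5
MIN_RESOLUTION = (1024, 768)
EXCLUDED_OS = {"Windows XP"}

def is_eligible(device):
    """Return True if a device record meets the minimum assessment specs."""
    width, height = device["resolution"]
    return (device["ram_gb"] >= MIN_RAM_GB
            and device["screen_inches"] >= MIN_SCREEN_INCHES
            and width >= MIN_RESOLUTION[0]
            and height >= MIN_RESOLUTION[1]
            and device["os"] not in EXCLUDED_OS)

# A hypothetical school inventory.
inventory = [
    {"name": "iPad mini", "ram_gb": 1, "screen_inches": 7.9,
     "resolution": (1024, 768), "os": "iOS 6"},
    {"name": "Lab desktop", "ram_gb": 2, "screen_inches": 19.0,
     "resolution": (1280, 1024), "os": "Windows XP"},
    {"name": "iPad 3", "ram_gb": 1, "screen_inches": 9.7,
     "resolution": (2048, 1536), "os": "iOS 6"},
]

eligible = [d["name"] for d in inventory if is_eligible(d)]
# The iPad mini fails on screen size, and the XP desktop on operating system.
print(eligible)  # ['iPad 3']
```

A count of the eligible list against enrollment would then show whether a school meets the recommended 1:6-7 device-to-student ratio.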

This degree of technology dependence is necessary because four different tests are being developed: diagnostic, mid-year, end-of-year, and final. Two additional performance tests are planned for speaking and listening. A quick turnaround is desired so that school administrators and teachers can analyze results and implement instructional changes and interventions, and these key assessments will still be tied to teacher evaluations, whose deadlines vary throughout the year.

Inadequate bandwidth is another concern that will rule out otherwise eligible computers, and additional bandwidth costs money. In some schools and districts, technology purchases must be approved by the voters or the school board. What happens when not enough money is available? Will there be a divide between school districts with more money and those with less? Who then gets the new tests? How will this affect issues of adequacy and equity? Will the lack of technology create a bigger achievement gap?

In addition to the cost of purchasing and installing new computers, funds will need to be allocated to train teachers, and then students, on the technology and software so that these variables are minimized when testing begins. Even kindergarten students will be required to respond via keyboard, and fourth and fifth graders will have to type a minimum of one- to two-page responses, respectively (Carr & Dreilinger, 2013).

The State of Florida recently pulled out of the PARCC assessments because the per-pupil implementation cost of between $22.50 and $29.50 doubled the cost of its traditional annual assessment (Chieppo & Gass, 2013; Kantrowitz, 2013). Georgia also dropped out of the PARCC assessments, citing skyrocketing costs and loss of local control; it will develop its own in-state tests aligned to the Common Core Standards and is looking to other states to form partnerships in that development (Shearer, 2013).

Testing Possibilities

Alabama dropped out of the Smarter Balanced consortium in May of 2013, citing a desire for local control over student learning (Senate Joint Resolution 49). Gloria Turner, director of assessment for Alabama's Department of Education, stated that Alabama will use the new Aspire test currently being field-tested by ACT (Stacey, 2013) and scheduled for release in April of 2014 (ACT, personal communication, October 17, 2013). This test eliminates the immediate worry about purchasing computer hardware and software, as it offers a choice between pencil and paper, with mail-in scoring and a turnaround time of 4-6 weeks, and immediate online submission. Aspire is being developed only for grades 3-8 and early high school (9th/10th grade). It is aligned to the Common Core Standards, and its reports will be coded to the pre-existing 1-36 college-readiness scale already used for the ACT exam. Per-pupil cost has yet to be released, but one can sign up on the ACT Website to receive updates about the test's release. Subject-area exemplars are given on the site, and Common Core skills are evident, especially in the math exemplars, where students are asked how they arrived at their answers. At this time, ACT has not said how, or by whom, these student-response items will be graded, or whether there is a local grading option.

Currently, the Iowa Test of Basic Skills, an annual achievement test given at many private schools, has a turnaround time of about two weeks. When tests are given in the fall (generally late September or early October) under a prescriptive approach to learning, it is not until November, after the administration reviews the results, that teachers are able to make instructional changes; by then, 40% of the school year has passed. Internet access can enable a quick turnaround of test results, allowing for immediate decision-making and adjustments before the next round of scheduled (or optional) testing in January. With the new Iowa Form E assessment, a quick turnaround is now possible through DataManager, a robust online reporting system that provides administrators and teachers with almost immediate online reports. While learning the system and how to request reports may be somewhat time-consuming and is designated for the test administrator, anyone who has been granted access can generate a student report. The Iowa Form E has also been aligned to the Common Core, with a report option to print either a traditional report or one aligned to the CCSS. Students may still be tested in a traditional pencil-and-paper format, or the school can use the online testing option. Per-pupil cost ranges from $7.43 to $10.08 for full reporting services, with an additional $3.00 for the Cognitive Abilities Test (Michele Baker, personal communication, September 17, 2013).
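Putting the per-pupil figures cited in this article side by side, a quick back-of-the-envelope comparison (a sketch only, using the ranges quoted above) shows the gap between the two options:

```python
# Per-pupil cost ranges cited in this article, in US dollars (low, high).
parcc_range = (22.50, 29.50)        # PARCC implementation cost
iowa_form_e_range = (7.43, 10.08)   # Iowa Form E with full reporting
cogat_addon = 3.00                  # optional Cognitive Abilities Test

# Iowa Form E plus the Cognitive Abilities add-on.
iowa_total = tuple(round(c + cogat_addon, 2) for c in iowa_form_e_range)
print(iowa_total)  # (10.43, 13.08)

# Even the high end of the Iowa option sits below the low end of PARCC.
savings = round(parcc_range[0] - iowa_total[1], 2)
print(savings)  # 9.42
```

On these figures, even the most expensive Iowa configuration costs roughly a third of the cheapest PARCC estimate per pupil.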

It would seem that the implementation of a new curriculum and its accompanying assessments was not completely thought through with regard to the technology demands and the developmental readiness of students completing online assessments. Districts and states are now backtracking, and all this after teachers have worked over the past three years to make major curricular and instructional shifts. It is a shame to have to spend time placing "Band-Aids" on rashly implemented educational policies. At least a few less expensive testing alternatives exist.

References

Carr, S., & Dreilinger, D. (September 29, 2013). A core dilemma: Will the littlest learners be able to type? Retrieved from http://hechingerreport.org/content/a-core-dilemma-will-the-littlest-learners-be-able-to-type_13198

Chieppo, C., & Gass, J. (August 15, 2013). Why states are backing out on common core standards and tests. Retrieved from http://hechingerreport.org/content/why-states-are-backing-out-on-common-standards-and-tests_12895

Davis, M. (2012). Are you tech-ready for the Common Core? Education Week. Retrieved from http://www.edweek.org/dd/articles/2012/10/17/01readiness.h06.html

Kantrowitz, B. (October 15, 2013). Testing the common core in Tennessee. Retrieved from http://hechingerreport.org/content/testing-the-common-core-in-tennessee_13468

Norris, C., & Soloway, E. (2013). Common core technological standards: They are the tail, not the dog. The Journal. Retrieved from http://thejournal.com/Articles/2013/01/14/Common-Core-Technological-Standards.aspx?p=1

Shearer, L. (July 22, 2013). Georgia will drop out of common core-aligned testing consortium. Retrieved from http://onlineathens.com/local-news/2013-07-22/georgia-will-drop-out-common-core-aligned-testing-consortium

Smarter Balanced Assessment Consortium. (February, 2013). Hardware and software requirements overview. Retrieved from http://www.smarterbalanced.org/wordpress/wp-content/uploads/2011/12/Technology-Strategy-Framework-Executive-Summary_2-6-13.pdf

Stacey, E. (February 13, 2013). Alabama exits national common core tests. Retrieved from http://news.heartland.org/newspaper-article/2013/02/13/Alabama-exits-national-common-core-tests
