Research
Here you will find a curated collection of reports and white papers produced by Smarter Balanced and our partners.
Simulations
CRESST, 2017
This report describes a CRESST study simulating the adaptive administration of the Smarter Balanced summative assessments using the general and accommodated item pools from the 2016-2017 school year. The study examined properties of the simulated tests (such as blueprint fulfillment and item exposure) and the quality of the resulting examinee score estimates (including bias and precision).
AIR, 2017
This report summarizes AIR’s simulation results for the 2016-2017 adaptive administrations in grades 3–8 and 11, covering ELA/L in English and Braille, and mathematics in English, Braille, and Spanish.
Read: Smarter Balanced Summative Assessments Simulation Results (2016-17)
CRESST, 2015
This report describes a study simulating the adaptive administration of the Smarter Balanced summative assessments. The study was conducted to examine properties of the simulated tests (such as blueprint fulfillment and item exposure) and the quality of the examinee score estimates (including bias and precision). The simulated tests included both the computerized adaptive test (CAT) and performance task (PT) components, thus mimicking the operational summative tests.
Read: Simulation-based Evaluation of the Smarter Balanced Summative Assessments
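For illustration only, here is a minimal sketch of how simulation-based evaluation metrics like those above (bias, precision, item exposure) might be computed from simulation output. The pool size, test length, and simulee data below are invented stand-ins, not values from the CRESST studies.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical simulation output: true abilities, estimated abilities, and
# the items each simulee saw (indices into an invented 200-item pool).
n_simulees, pool_size, test_length = 1000, 200, 40
theta_true = rng.normal(0.0, 1.0, n_simulees)
theta_hat = theta_true + rng.normal(0.0, 0.3, n_simulees)  # stand-in estimates
administered = np.array([rng.choice(pool_size, size=test_length, replace=False)
                         for _ in range(n_simulees)])

# Bias and precision (RMSE) of the examinee score estimates.
error = theta_hat - theta_true
bias = error.mean()
rmse = np.sqrt((error ** 2).mean())

# Item exposure rate: the share of simulees who saw each item.
exposure = np.bincount(administered.ravel(), minlength=pool_size) / n_simulees

print(f"bias = {bias:+.3f}, RMSE = {rmse:.3f}")
print(f"max exposure = {exposure.max():.2%}, unused items = {(exposure == 0).sum()}")
```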
CRESST, 2016
This report describes a study simulating the adaptive administration of the Smarter Balanced summative assessments using the accommodated item pools utilized during the 2014–15 school year.
AIR, 2014
Prior to test administration in 2014-2015, the American Institutes for Research (AIR) conducted simulations to evaluate and ensure the implementation and quality of the adaptive item-selection algorithm and the scoring algorithm.
Read: Testing Procedures for Adaptive Item-Selection Algorithm
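As a rough sketch of what an adaptive item-selection rule can look like, the example below picks the unadministered item with maximum Fisher information under a 2PL IRT model. This is a generic textbook approach with an invented item pool, not AIR’s actual algorithm.

```python
import numpy as np

def item_information_2pl(theta, a, b):
    # Fisher information of a 2PL item at ability theta: I = a^2 * p * (1 - p)
    p = 1.0 / (1.0 + np.exp(-a * (theta - b)))
    return a ** 2 * p * (1.0 - p)

def select_next_item(theta_hat, a, b, administered):
    # Choose the unadministered item with maximum information at the current
    # ability estimate (no blueprint or exposure control in this sketch).
    info = item_information_2pl(theta_hat, a, b)
    info[list(administered)] = -np.inf  # mask items already given
    return int(np.argmax(info))

# Invented pool: discriminations (a) and difficulties (b) for 50 items.
rng = np.random.default_rng(0)
a = rng.uniform(0.8, 2.0, 50)
b = rng.normal(0.0, 1.0, 50)
print(select_next_item(theta_hat=0.5, a=a, b=b, administered={3, 17}))
```

Operational CATs layer blueprint constraints and exposure control on top of a rule like this, which is part of what the simulations above were built to verify.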
Research
2017
Smarter Balanced contracted with Measurement Incorporated (MI) to validate cut scores for grades 9 and 10 as part of a larger contract to create item maps and playlists for 96 interim assessment blocks (IABs). The grades 9 and 10 cut scores validated by MI had previously been interpolated from an analysis, performed by the American Institutes for Research (AIR), of the cut scores set in 2014.
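As a toy example of the interpolation idea only (the scale values below are invented, not the actual 2014 cut scores), grade 9 and 10 cuts can be placed at equal steps along the vertical scale between the grade 8 and grade 11 cuts:

```python
# Hypothetical Level 3 cut scores on the vertical scale; illustrative only.
cut_g8, cut_g11 = 2567, 2628

# Linear interpolation at equal one-grade steps between grades 8 and 11.
step = (cut_g11 - cut_g8) / 3
cut_g9 = round(cut_g8 + 1 * step)   # -> 2587
cut_g10 = round(cut_g8 + 2 * step)  # -> 2608
print(cut_g9, cut_g10)
```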
HumRRO, 2016
The goal of this study was to gather comprehensive evidence about the alignment of the Smarter Balanced summative assessments to the Common Core State Standards (CCSS). The alignment analysis examined three facets: range of content, balance of content, and cognitive complexity. To determine these facets of alignment, HumRRO conducted a series of workshops during which participants reviewed the alignment among the Smarter Balanced content and item specifications, the Smarter Balanced blueprint, and the Common Core State Standards.
Read: Smarter Balanced Assessment Consortium Alignment Study Report
CRESST, 2015
This report describes the procedures used in obtaining parameter estimates for items appearing on the 2014-2015 Smarter Balanced Assessment Consortium summative paper-pencil forms.
Read: Initial Report on the Calibration of Paper and Pencil Forms
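For readers unfamiliar with item calibration, here is a minimal joint-maximum-likelihood sketch that estimates Rasch item difficulties from simulated response data. It is a generic illustration only, with invented data, and does not reproduce the report’s actual estimation procedures.

```python
import numpy as np

def rasch_jml(X, n_iter=50):
    """Estimate Rasch item difficulties by joint maximum likelihood (JML).

    X is a 0/1 response matrix (persons x items). Persons with all-zero or
    all-one scores have no finite ability estimate and are dropped.
    """
    keep = (X.sum(axis=1) > 0) & (X.sum(axis=1) < X.shape[1])
    X = X[keep]
    theta = np.zeros(X.shape[0])  # person abilities
    b = np.zeros(X.shape[1])      # item difficulties
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(b - theta[:, None]))  # P(correct)
        w = p * (1.0 - p)
        theta += (X.sum(axis=1) - p.sum(axis=1)) / w.sum(axis=1)  # Newton step
        p = 1.0 / (1.0 + np.exp(b - theta[:, None]))
        w = p * (1.0 - p)
        b -= (X.sum(axis=0) - p.sum(axis=0)) / w.sum(axis=0)      # Newton step
        b -= b.mean()  # anchor the scale: mean difficulty = 0
    return b

# Toy data: 500 simulated examinees answering 20 items.
rng = np.random.default_rng(1)
true_b = rng.normal(0.0, 1.0, 20)
theta_true = rng.normal(0.0, 1.0, 500)
prob = 1.0 / (1.0 + np.exp(true_b - theta_true[:, None]))
X = (rng.random((500, 20)) < prob).astype(int)
print(np.round(rasch_jml(X) - (true_b - true_b.mean()), 2))  # recovery error
```

In practice, operational calibration also involves linking new estimates to the established reporting scale, which a self-contained sketch like this omits.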
Sireci, 2012
This report outlines the research Smarter Balanced should conduct to evaluate the degree to which the Consortium is accomplishing its goals and to demonstrate that the assessment system adheres to professional and federal guidelines for fair, high-quality assessment.
2010
The Smarter Balanced Assessment Consortium’s application for the Race to the Top Assessment program. The U.S. Department of Education awarded a grant to the Consortium on September 28, 2010.
Technical Documentation and Achievement Levels
2015
Achievement level setting for Smarter Balanced assessments in mathematics and English language arts/literacy occurred in three phases: online panel, in-person workshop, and cross-grade review. This report documents each of the three phases and provides results and recommended cut scores for Smarter Balanced assessments to be given in the spring of 2015.
Read: Achievement Level Setting Final Report
AIR, 2013
Smarter Balanced cognitive labs, conducted in collaboration with the American Institutes for Research (AIR) in fall 2013, consisted of 14 small think-aloud studies that addressed topics pertaining to an automated test delivery system. The studies were designed to determine how effective various technology-enhanced item types are at assessing the cognitive processes indicated in the Smarter Balanced Content Specifications.
2013
The Smarter Balanced Assessment Consortium held an Achievement Level Descriptor (ALD) Writing Workshop in October 2012 to draft an initial set of ALDs and to review and comment on a College Content-Readiness policy. The panel developed a set of Policy ALDs, Range ALDs, and Threshold ALDs for English language arts and mathematics and provided valuable feedback on the Smarter Balanced College Content-Readiness Policy.
Read: Technical Report: Initial Achievement Level Descriptors
Smarter Balanced Findings
2022
In 2021, National PTA and Smarter Balanced collaborated on a project to better understand how to help close the assessment system literacy gap with parents and meet parents’ needs around assessment reporting. National PTA, working in partnership with Edge Research, sought to examine how parents prefer to receive information about their child’s academic progress and whether the Smarter Balanced system, score reports, and standardized year-end assessments are accessible and useful for families. To learn more about the focus groups and their findings, view the full report.
2022
Assessing the Options: Considerations for provision of choice in assessment discusses types of choice and related goals and challenges relevant to assessment systems.
Recommendations: The brief includes a range of potential ways state education agencies and local education agencies can partner with professional development (PD) providers and assessment developers, including:
- Help teachers promote a culture of autonomy in their classrooms wherein students’ choices converge upon common, recognized learning goals and targets.
- Provide teachers with high-quality professional development on personalizing assessment by allowing students to make choices in assessment.
- Ensure innovations introduced into assessment systems and assessment procedures concerning choice are sufficiently supported by empirical evidence on their effectiveness.
Read: Assessing the Options: Considerations for provision of choice in assessment
2022
Leveraging Student Perspective: Considerations for connecting assessment systems and multiple ways of knowing seeks to support state education leaders and assessment developers in including students’ multiple ways of knowing during the design, development, and implementation of a balanced assessment system.
Recommendations: The brief includes a range of suggested ways state agencies can partner with assessment developers and local education agencies, including:
- Develop training, classroom materials, and resources for teachers (e.g., rubrics, observation tools, surveys) that help integrate assessment of and for learning and promote assessment system literacy.
- Create spaces that help educators make connections across school-home and school-community environments.
- Find ways to help students integrate their knowledge and experiences into assessment tasks and promote the development of assessment systems that incorporate considerations for students.
2022
Content Progressions & Clustering Across Instructional Materials: Viability for Supporting the Design of a Through-Year Assessment Model
Prepared for Smarter Balanced by Shelbi K. Cole & Carey Swanson
This paper examines existing through-year assessment models and analyzes the most widely used curricula in terms of how well they would work with through-year assessment. It makes strong recommendations for strengthening the interim and formative resources available to educators, piloting innovative and alternative models, developing pathways for authentic integration with the interim system, and partnering with key curriculum providers and district users.
2022
Through-Year Assessments: Practical Considerations For LEA Implementation
Prepared for Smarter Balanced by the New Teacher Center
As State Educational Agencies (SEAs) weigh the advantages and disadvantages of embracing this emergent assessment system, it is important to consider the impact such a decision would have on Local Educational Agencies (LEAs). This paper explores the practical considerations for LEAs balancing the implementation of a statewide through-year model with instructional priorities. Key implications include defining a theory of action for assessment; ensuring alignment and coherence among curriculum, instruction, and assessment; developing a plan to support teachers and students; sharing information about assessments with families and other key stakeholders; and managing the administrative systems of testing and assessment.
Read: Through-Year Assessments: Practical Considerations for LEA Implementation
2023
An Introduction to Considerations for Through-Year Assessment Programs: Purposes, Design, Development, Evaluation
A Paper Prepared for the Smarter Balanced Assessment Consortium
The paper, written for Smarter Balanced by Nathan Dadey and Brian Gong from the Center for Assessment, is one of the most comprehensive pieces to date on the purposes of through-year assessments, their designs, and reflections on current models seen across the country.
This document is written primarily for policy makers and state department of education staff who are considering through-year assessments, as well as the consultants and contractors state departments rely on. The document identifies essential things to consider when designing or evaluating a through-year assessment program. It is not, however, a comprehensive encyclopedia of all possible through-year assessment designs and topics. It is not a handbook for how to design and construct a through-year assessment. It is not a comprehensive review of the literature and practical work relevant to through-year assessments. It offers no “thumbs-up/thumbs-down” verdict regarding particular through-year assessment efforts by specific states or vendors. And it certainly is not a crystal ball regarding what the future might bring to federal Peer Review or state laws. It does, however, provide a firm foundation for considering the “whether,” “why,” and “what” of through-year assessment design.
2023
Increasing the Relevance of Performance Tasks for Educators and Students
As foundational research regarding through-year assessments, Smarter Balanced released three papers that examine elements of through-year assessments, including alignment with instruction (Cole, 2022), the practical impact of through-year assessment for local education agencies (LEAs) (New Teacher Center, 2022), and purposes, design, development, evaluation (Dadey & Gong, 2023). These papers described the range of options and challenges that states must consider when they design a through-year system.
Based on these papers and on discussions with the California State Board of Education (SBE) and the California Department of Education (CDE), Smarter Balanced began to explore changes to the system that might mitigate some of the issues identified. These include end-of-year test length and the desire for information that supports student learning during the school year, as well as teachers’ learning and reflection about students’ progress, so that educators can accelerate student learning throughout the school year.
This brief summarizes the information gathered from the investigation. In addition, it describes planned future research to inform decisions regarding the implementation of performance tasks as part of a through-year approach to summative assessment.
Read: Increasing the Relevance of Performance Tasks for Educators and Students