Maryland Assessment Research Center (MARC)

Linking Old Assessments with New State Assessment Programs

  • Comparison between Propensity Score Matching and Weighting for Linking Two State Assessment Programs

     

  • Comparison between the Post-Smoothing (LEGS) and Pre-Smoothing (EQUATE) in Equipercentile Linking

     

  • Investigating the Concordance Relationship between the HSA Cut Scores and the PARCC Cut Scores Using the 2017 PARCC Test Data
    The purpose of this study is to replicate our 2016 study, applying the same linking methods to the 2017 PARCC test data to obtain the PARCC equivalents of the HSA cut scores and the HSA equivalents of the PARCC cut scores. The two options we explored are as follows.

    1. Option I: Using the PSAT as an external common test to link the HSA and PARCC tests via two-step linking (a rough sketch of this chained linking follows this item).
    2. Option II: Using the propensity score matching method to form matched equivalent groups so that the equivalent groups equipercentile linking method can be used to map the HSA cut scores onto the PARCC scales directly, and vice versa.

    Comparisons between the PARCC cut scores obtained in the 2016 study and in this study were also performed.
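    As a rough illustration of the two-step linking idea in Option I, the sketch below chains two equipercentile linkings through the external common test (the PSAT): an HSA score is first mapped onto the PSAT scale, and the resulting PSAT score is then mapped onto the PARCC scale. The percentile-rank interpolation, the simulated score distributions, and all names in the code are assumptions made for illustration, not the procedure or data used in the study.

        # Illustrative two-step (chained) equipercentile linking through an
        # external common test. All score distributions below are simulated.
        import numpy as np

        def equipercentile_link(x_scores, y_scores, x_value):
            """Map x_value onto the scale of y by matching percentile ranks."""
            p = 100.0 * np.mean(np.asarray(x_scores) <= x_value)  # percentile rank in x
            return float(np.percentile(y_scores, p))               # score with same rank in y

        def two_step_link(hsa, psat_group1, psat_group2, parcc, hsa_cut):
            """Chain HSA -> PSAT (group 1) and PSAT -> PARCC (group 2)."""
            psat_equivalent = equipercentile_link(hsa, psat_group1, hsa_cut)
            return equipercentile_link(psat_group2, parcc, psat_equivalent)

        if __name__ == "__main__":
            rng = np.random.default_rng(0)
            hsa = rng.normal(400, 40, 5000)        # group 1 took the HSA and the PSAT
            psat_g1 = rng.normal(480, 100, 5000)
            psat_g2 = rng.normal(500, 100, 5000)   # group 2 took the PSAT and PARCC
            parcc = rng.normal(745, 30, 5000)
            print("PARCC equivalent of an HSA cut of 400:",
                  round(two_step_link(hsa, psat_g1, psat_g2, parcc, 400.0), 1))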

     

  • Investigating the Concordance Relationship between the HSA Cut Scores and the PARCC Cut Scores Using the 2016 PARCC Test Data
    The purpose of this study is to replicate our 2015 study, applying the same linking methods to the 2016 PARCC test data to obtain the PARCC equivalents of the HSA cut scores and the HSA equivalents of the PARCC cut scores. The two options we explored are as follows.

    1. Option I: Using the PSAT as an external common test to link the HSA and PARCC tests via two-step linking.
    2. Option II: Using the propensity score matching method to form matched equivalent groups so that the equivalent groups equipercentile linking method can be used to map the HSA cut scores onto the PARCC scales directly, and vice versa (a sketch of the matching step follows this item).

    Comparisons between the PARCC cut scores obtained in the 2015 study and in this study were also performed.

Technical Report
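    The sketch below illustrates the propensity score matching step in Option II under simple assumptions: a logistic regression propensity model fit to a few hypothetical covariates, followed by greedy 1:1 nearest-neighbour matching without replacement. The covariates, the model, and the matching rule are assumptions for illustration; the studies' actual matching specifications may differ. Once matched equivalent groups are formed, their score distributions can be linked with the equipercentile mapping sketched above.

        # Minimal propensity score matching sketch: estimate propensity scores with
        # logistic regression, then greedily match each HSA examinee to the nearest
        # unmatched PARCC examinee. Covariates and data are hypothetical.
        import numpy as np
        from sklearn.linear_model import LogisticRegression

        def propensity_match(covariates_hsa, covariates_parcc):
            """Return indices of PARCC examinees matched 1:1 to the HSA examinees."""
            X = np.vstack([covariates_hsa, covariates_parcc])
            group = np.concatenate([np.ones(len(covariates_hsa)),     # 1 = took HSA
                                    np.zeros(len(covariates_parcc))]) # 0 = took PARCC
            model = LogisticRegression(max_iter=1000).fit(X, group)
            p = model.predict_proba(X)[:, 1]                 # estimated propensity scores
            p_hsa, p_parcc = p[:len(covariates_hsa)], p[len(covariates_hsa):]
            available = np.ones(len(p_parcc), dtype=bool)
            matches = []
            for score in p_hsa:                              # greedy nearest neighbour,
                dist = np.abs(p_parcc - score)               # without replacement
                dist[~available] = np.inf
                j = int(np.argmin(dist))
                matches.append(j)
                available[j] = False
            return np.array(matches)

        if __name__ == "__main__":
            rng = np.random.default_rng(1)
            # Hypothetical covariates: a prior test score and an attendance rate.
            hsa_cov = np.column_stack([rng.normal(500, 90, 2000), rng.uniform(0.8, 1.0, 2000)])
            parcc_cov = np.column_stack([rng.normal(520, 95, 4000), rng.uniform(0.8, 1.0, 4000)])
            print("Matched", len(propensity_match(hsa_cov, parcc_cov)), "PARCC examinees")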

 

  • Investigating the Concordance Relationship between the MSA and PARCC Test Scores Using Propensity Score Matching and Extrapolation Methods: Using 2016 PARCC Test Data
    The purpose of this study is to obtain the PARCC equivalents of the MSA cut scores using 2016 assessment data. The MARC team previously conducted a linking study that investigated the concordance relationship between the PARCC and MSA tests in both math and English/reading at each grade level using 2015 data; the current study investigates the same concordance relationships using 2016 assessment data. Two options were explored in this study.

    1. Option I: Using the propensity score matching (PSM) method to form matched equivalent groups and performing equipercentile linking based on these equivalent groups.
    2. Option II: Using the percentages of MSA test takers classified into each performance level, from the 2004 through 2012 test administrations at each grade, to extrapolate the trend and predict the 2015 and 2016 results (a sketch of this extrapolation follows this item).

Technical Report
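    As a rough illustration of the extrapolation idea in Option II above, the sketch below fits a simple linear trend to the yearly percentage of examinees classified as proficient or above over the 2004-2012 administrations and projects it to 2015 and 2016. The percentages are made up and the linear model is an assumption; the report's actual extrapolation model may differ. The projected percentage can then be located in the observed PARCC score distribution to find the corresponding score point.

        # Illustrative trend extrapolation: fit percentage ~ year by least squares
        # over 2004-2012 and project the trend to 2015 and 2016. Values are made up.
        import numpy as np

        years = np.arange(2004, 2013)                        # 2004-2012 administrations
        pct_proficient = np.array([61.0, 63.5, 66.0, 68.2, 70.1,
                                   71.5, 73.0, 74.2, 75.0])  # hypothetical percentages

        slope, intercept = np.polyfit(years, pct_proficient, deg=1)

        for target_year in (2015, 2016):
            projected = slope * target_year + intercept
            print(f"Projected percent proficient or above in {target_year}: {projected:.1f}")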

 

  • Investigating the Concordance Relationship between the MSA and PARCC Scores Using Propensity Score Matching and Extrapolation Methods
    This study obtains the PARCC equivalents of the MSA cut scores. More specifically, the MSA Math cut scores for the proficient and advanced performance levels need to be mapped onto the PARCC Math scales, and the MSA Reading cut scores for the proficient and advanced performance levels need to be mapped onto the PARCC ELA scales, at each grade level from grades 3 to 8. To map the MSA cut scores to the PARCC scales, two options were explored.

    1. Option I: Using the propensity score matching (PSM) method to form matched equivalent groups and performing equipercentile linking based on these equivalent groups.
    2. Option II: Using the percentages of MSA test takers classified into each performance level, from the 2004 through 2012 test administrations at each grade, to extrapolate the trend and predict the 2015 results.

Technical Report

 

  • Investigating the Concordance Relationship between the HSA Cut Scores and the PARCC Cut Scores
    The purpose of this study is to obtain the PARCC equivalent of the HSA cut score, and the HSA equivalent of the PARCC cut score, that separates performance level 2 from performance level 3. Two options were explored to create the concordance tables; a sketch of how such a table can be assembled from equivalent groups follows this item.

    1. Option I: Using the PSAT as an external common test to link the HSA and PARCC tests via two-step linking.
    2. Option II: Using the propensity score matching method to form matched equivalent groups so that the equivalent groups equipercentile linking method can be used to map the HSA cut scores onto the PARCC scales directly, and vice versa.

Technical Report
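    The sketch below shows how a concordance table might be assembled once matched equivalent groups are available: each HSA score point is mapped to the PARCC score with the same percentile rank in the matched groups. The score ranges, the simulated distributions, and the percentile-rank interpolation are illustrative assumptions, not the study's actual scales or procedure.

        # Building an illustrative HSA-to-PARCC concordance table from matched
        # equivalent groups by percentile-rank matching. All data are simulated.
        import numpy as np

        def concordance_table(hsa_scores, parcc_scores, hsa_points):
            """Return (HSA point, PARCC equivalent) pairs via percentile-rank matching."""
            hsa_scores = np.asarray(hsa_scores)
            table = []
            for point in hsa_points:
                p = 100.0 * np.mean(hsa_scores <= point)      # percentile rank in HSA group
                table.append((point, float(np.percentile(parcc_scores, p))))
            return table

        if __name__ == "__main__":
            rng = np.random.default_rng(2)
            hsa_matched = rng.normal(410, 35, 3000)      # simulated matched group scores
            parcc_matched = rng.normal(745, 30, 3000)
            for hsa_point, parcc_eq in concordance_table(hsa_matched, parcc_matched,
                                                         range(350, 481, 10)):
                print(f"HSA {hsa_point:3d} -> PARCC {parcc_eq:6.1f}")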

 

  • Issues and Considerations Regarding Linking between Old and New Assessment
    Beginning in the fall of 2014, in accordance with the new standards set forth by the Common Core State Standards Initiative (CCSS), the state of Maryland, as a member of the Partnership for Assessment of Readiness for College and Careers (PARCC), will replace the Maryland School Assessment (MSA) with the PARCC assessment, which differs in content coverage, scope and sequence, and psychometric properties, among other aspects. Such discrepancies between the new and old assessments require a careful linking study in order to compare the two assessments and correctly measure progress. This outline was created to identify these problems and to suggest different linking designs.

     

  • Issues and Technical Considerations Related to Transitioning to a New Test According to Common Core State Standards (CCSS)
    This executive summary identifies the issues, and the corresponding technical suggestions, related to adopting the new national testing effort in Maryland. The issues we identified fall into four broad categories: psychometric-related issues (e.g., scaling, linking, and DIF), technology-related issues (e.g., readiness and security), implementation-related issues (e.g., test delivery, scoring, and reporting), and policy-related issues (e.g., student growth and evaluation of teacher effectiveness). For each issue identified, potential technical considerations are provided.