Hong Jiao

I am currently a Professor at the University of Maryland (UMD), College Park, specializing in educational measurement and psychometrics in large-scale assessment, and Director of the Maryland Assessment Research Center (MARC). I received my Ph.D. in Measurement, Statistics, and Evaluation from Florida State University. Prior to joining the Measurement, Statistics, and Evaluation faculty at UMD, I worked as a psychometrician at Harcourt Assessment on a range of state assessment programs.
The overarching goal of my methodological research is to improve practice in educational and psychological assessment and to develop solutions to emerging psychometric challenges, many of which arise from the use of more complex, innovative assessment formats. I believe that ultimately this work will promote the use of assessment data for cognitive diagnosis to facilitate learning and instruction. I have established a coherent program of research that integrates item response theory (IRT) models, cognitive diagnostic models (CDMs), and machine learning methods with multilevel and finite mixture modeling approaches, with the aim of exerting a positive and lasting effect on measurement practice in large-scale assessments and in cognitive diagnosis.
My research agenda falls into five general categories: methodological research on local dependence due to the use of testlets, modeling and methods for classification and cognitive diagnosis, Bayesian model parameter estimation, research on computer-based testing, and practical psychometric issues in large-scale tests. This work has been recognized through a national award; academic output including edited books, book chapters, refereed journal papers, and invited and refereed national and international presentations; and over $4 million in research grants and contracts on which I serve as PI or Co-PI. I proposed a multilevel testlet model for mixed-format tests that won the 2014 Bradley Hanson Award for Contributions to Educational Measurement from the National Council on Measurement in Education.
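As an illustration of the modeling involved in the testlet line of work, the following is a minimal, textbook-style sketch of a Rasch testlet model with an added person-clustering level. It is offered only as a generic illustration under standard assumptions, not as the exact specification of the award-winning mixed-format model.

% Illustrative sketch only: a Rasch testlet model with person clustering.
% Y_{pi} is person p's score on item i; \theta_p is ability; b_i is item
% difficulty; \gamma_{p d(i)} is the testlet effect for person p on the
% testlet d(i) containing item i, capturing local dependence among items
% within the same testlet; u_{g(p)} is a random effect for person p's
% cluster (e.g., school), giving the model its multilevel structure.
\begin{align}
  \Pr\bigl(Y_{pi} = 1\bigr)
    &= \frac{\exp\bigl(\theta_p - b_i + \gamma_{p d(i)}\bigr)}
            {1 + \exp\bigl(\theta_p - b_i + \gamma_{p d(i)}\bigr)}, \\
  \theta_p &= u_{g(p)} + \varepsilon_p,
    \quad \varepsilon_p \sim N\bigl(0, \sigma^2_{\theta}\bigr),
    \quad u_{g(p)} \sim N\bigl(0, \sigma^2_{u}\bigr),
    \quad \gamma_{p d(i)} \sim N\bigl(0, \sigma^2_{d(i)}\bigr).
\end{align}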
I have participated in a variety of professional service activities. I currently chair the Technical Advisory Committee (TAC) for the Maryland state testing programs, working with TAC members to help the state maintain rigorous testing programs. As Director of MARC, I work with the team to provide psychometric research and services that meet the state's assessment needs. I have also served on the Research and Psychometrics Committee of the PARCC consortium testing program, representing Maryland, and currently serve on the Research Advisory Committee for Prince George's County Public Schools in Maryland. Further, my research expertise has been recognized through my election as Chair-Elect of the AERA Rasch Measurement Special Interest Group, service as Co-Chair and Chair of the AERA Division D2 program, service as Co-Chair and Chair of the AERA Division D Significant Contribution to Educational Measurement and Research Methodology Award Committee, and membership on the Early Career Award Committee. As co-chair, I am working with a team to host the 2023 International Meeting of the Psychometric Society (IMPS) on the UMD campus.
I have served on the editorial boards of the American Educational Research Journal (Section on Teaching, Learning, and Human Development), Educational Measurement: Issues and Practice, Measurement: Interdisciplinary Research and Perspectives, and Methods in Psychology. I am also on the editorial board of the Springer book series Methodology of Educational Measurement and Assessment. In addition, I have served as a reviewer for national conferences, journals, and books on large-scale assessment, and on review panels for the National Science Foundation. Recently, I served as guest editor for two special topics in Frontiers in Psychology, “Process Data in Educational and Psychological Measurement” and “Cognitive Diagnostic Assessment for Learning.” I have co-organized several MARC annual conferences and co-edited books on cutting-edge topics in assessment, including technology-enhanced innovative assessment, the applications of artificial intelligence in assessment, and enhancing effective instruction and learning using assessment data: theory and practice.
Fellowships
2003 The Spaan Fellowship (funded research in second or foreign language testing), University of Michigan, Ann Arbor, MI.
2002 The Lenke Psychometric Fellowship, Harcourt Educational Measurement, San Antonio, TX.
1998-2001 University Fellowship, Florida State University, Tallahassee, FL.
Honors and Awards
2014 The Bradley Hanson Award for Contributions to Educational Measurement, National Council on Measurement in Education.
2011 The International Initiative Fellows Program at the College of Education, University of Maryland, College Park, MD.
2010 The American Educational Research Association Research Grant sponsored by the National Science Foundation.
2009 The GATE Fellows Program (Teaching Innovation Award) at the College of Education, University of Maryland, College Park, MD.
2008 The SPARC: Support Program for Advancing Research and Collaboration Award, College of Education, University of Maryland, College Park, MD.
2008 The General Research Board (GRB) Summer Award, University of Maryland, College Park, MD.
2005 The Revere Award for Customer Focus, Harcourt Assessment, Inc., San Antonio, TX.
1996 Liu Yonglin Excellent Teaching Prize, Shanghai Jiao Tong University, Shanghai, China.
1996 Star Teacher, Shanghai Jiao Tong University, Shanghai, China.
1993 Teaching Excellence Award, Shanghai Jiao Tong University, Shanghai, China.
Books Edited (Selected)
Chapters in Books (Selected)
Journal Articles (Selected)
Qiao*, X., & Jiao, H. (2022). Explanatory cognitive diagnostic modeling incorporating response times. Journal of Educational Measurement. https://doi.org/10.1111/jedm.12306
Jiao, H., & Liao, M. (2021). Testlet response theory. Educational Measurement: Issues and Practice.
Liao*, D., He, Q., & Jiao, H. (2019). Mapping background variables with sequential patterns in problem solving environments: An investigation on US adults’ employment status in PIAAC. Frontiers in Psychology. https://doi.org/10.3389/fpsyg.2019.00646.
Zhan*, P., Jiao, H., Liao*, D., & Li, F. (2019). A longitudinal higher-order diagnostic classification model. Journal of Educational and Behavioral Statistics. Advance online publication. https://doi.org/10.3102/1076998619827593
Zhan*, P., Ma, W., Jiao, H., & Ding, S. (2019). A sequential higher-order latent structural model for hierarchical attributes in cognitive diagnostic assessments. Applied Psychological Measurement. Advance online publication. https://doi.org/10.1177/0146621619832935
Zhan*, P., Jiao, H., Man, K., & Wang, L. (2019). Using JAGS for Bayesian cognitive diagnosis modeling: A tutorial. Journal of Educational and Behavioral Statistics. Advance online publication. https://doi.org/10.3102/1076998619826040
Zhan*, P., Wang, W.-C., Jiao, H., & Bian, Y. (2018). The probabilistic-inputs, noisy conjunctive models for cognitive diagnosis. Frontiers in Psychology. https://doi.org/10.3389/fpsyg.2018.00997
Zhan*, P., Jiao, H., Liao*, M., & Bian, Y. (2018). Bayesian DINA modeling incorporating within-item characteristics dependency. Applied Psychological Measurement, 43, 143-158. https://doi.org/10.1177/0146621618781594
Qiao*, X., & Jiao, H. (2018). Comparing data mining techniques in analyzing process data: A case study on PISA 2012 problem-solving items. Frontiers in Psychology.
Zhan*, P., Jiao, H., & Liao*, D. (2017). Cognitive diagnosis modeling incorporating item response times. British Journal of Mathematical and Statistical Psychology. https://doi.org/10.1111/bmsp.12114
Luo, Y., & Jiao, H. (2017). Using the Stan program for Bayesian item response theory. Educational and Psychological Measurement. https://doi.org/10.1177/0013164417693666
Li*, T., Xie*, C., & Jiao, H. (2016). Assessing fit of alternative unidimensional polytomous item response models using posterior predictive model checking. Psychological Methods.
Li*, T., Jiao, H., & Macready, G. (2015). Different approaches to covariate inclusion in the mixture Rasch model. Educational and Psychological Measurement. https://doi.org/10.1177/0013164415610380
Jiao, H., & Zhang*, Y. (2015). Polytomous multilevel testlet models for testlet-based assessments with complex sampling designs. British Journal of Mathematical and Statistical Psychology, 68(1), 65-83. https://doi.org/10.1111/bmsp.12035
Wolfe, E. W., Song, T., & Jiao, H. (2015). Features of difficult-to-score essays. Assessing Writing, 27, 1-10.
Wolfe, E. W., Jiao, H., & Song, T. (2015). A family of rater accuracy models. Journal of Applied Measurement, 16.
Chen*, Y.-F., & Jiao, H. (2014). Exploring the utility of background and cognitive variables in explaining latent differential item functioning: An example of the PISA 2009 reading assessment. Educational Assessment, 19, 77-96.
Jiao, H., Wang, S., & He, W. (2013). Estimation methods for one-parameter testlet models. Journal of Educational Measurement, 50, 186-203.
Li*, Y., Jiao, H., & Lissitz, R. W. (2012). Applying multidimensional IRT models in validating test dimensionality: An example of K-12 large-scale science assessment. Journal of Applied Testing Technology, 13(2).
Jiao, H., Macready, G., Liu*, J., & Cho*, Y. (2012). A mixture Rasch model based computerized adaptive test for latent class identification. Applied Psychological Measurement, 36, 469-493.
Jiao, H., Kamata, A., Wang, S., & Jin, Y. (2012). A multilevel testlet model for dual local dependence. Journal of Educational Measurement, 49, 82-100.
Jiao, H., Liu*, J., Haynie, K., Woo, A., & Gorham, J. (2012). Comparison between dichotomous and polytomous scoring of innovative items in a large-scale computerized adaptive test. Educational and Psychological Measurement, 72, 493-509.
Jiao, H., Lissitz, R. W., Macready, G., Wang, S., & Liang*, S. (2011). Exploring levels of performance using the mixture Rasch model for standard setting. Psychological Testing and Assessment Modeling, 53, 499-522.
Jiao, H., & Wang, S. (2010). A multifaceted approach to investigating the equivalence between computer-based and paper-and-pencil assessments: An example of Reading Diagnostics. International Journal of Learning Technology, 5, 264-288.
Wang, S., & Jiao, H. (2009). Construct equivalence across grades in a vertical scale for a K-12 large-scale reading assessment. Educational and Psychological Measurement, 69, 760-777.
Wang, S., Jiao, H., Young, M. J., Brooks, T., & Olson, J. (2008). Comparability of computer-based and paper-and-pencil testing in K-12 reading assessments: A meta-analysis of testing mode effects. Educational and Psychological Measurement, 68(1), 5-24.
Wang, S., Jiao, H., Young, M. J., Brooks, T., & Olson, J. (2007). A meta-analysis of testing mode effects in grade K-12 mathematics tests. Educational and Psychological Measurement, 67(2), 219-238.
Jiao, H., Wang, S., & Kamata, A. (2005). Modeling local item dependence with the hierarchical generalized linear model. Journal of Applied Measurement, 6(3), 311-321.
External Funding
Funding agency: Maryland State Department of Education.
Funding agency: NBOME. Title: Differential item functioning of the Levels 1 and 2 examinations.
Funding agency: Classic Learning Test. Title: Norming of CLT.
Funding agency: Management Systems International. Title: Linking assessments to a global standard with social moderation.
Funding agency: Classic Learning Test. Title: Psychometric properties of CLT.
Funding agency: The Council of Chief State School Officers. Title: Applying sampling weights in Kindergarten Readiness Assessment.
Funding agency: The Partnership for Assessment of Readiness for College and Careers, Inc. Title: Investigating New York City students’ performance on and experience with the 2015 PARCC pilot tests.
Funding agency: Management Systems International. Title: Alignment study for University Readiness Test in Egypt.
Funding agency: Educational Records Bureau. Title: Psychometric analysis for Comprehensive Testing Program 4.
Funding agency: National Council on Measurement in Education. Title: A multilevel testlet model for mixed-format tests.
Funding agency: American Educational Research Association/National Science Foundation #DRL-0941014. Title: Latent differential item functioning analysis for testlet-based assessments.
Funding agency: National Council of State Boards of Nursing, Joint Research Committee (Co-PI with PI Kathleen C. Haynie). Title: A partial credit modeling study of NCLEX innovative items.
Funding agency: University of Michigan, Ann Arbor. Title: Evaluating the dimensionality of the Michigan English Language Assessment Battery.
COURSES TAUGHT
Graduate Courses
Instrumentation
Applied Measurement: Issues and Practices
Classification and Cognitive Diagnosis
Computerized Adaptive Testing
Psychometrics in Large-Scale Assessment
Quantitative Methods I
Modern Measurement Theory