FAST FACTS:

  • Our program offers Ph.D. and M.A. degrees, as well as a graduate certificate, focused solely on quantitative methods.
  • Funded students get valuable experience as departmental research or teaching assistants, as graduate assistants in affiliated areas on campus, or on contract/assistantship work in partnership with research/testing agencies in the Washington, DC area.
  • Graduate students are heavily involved in research projects, with over 80% presenting at national and international conferences, and many publishing in leading academic journals. Within the last two years alone our students have had first-authored methodological papers appear in, or accepted for publication in, such journals as Multivariate Behavioral Research, Psychological Methods, Journal of Educational Measurement, Journal of Education and Behavioral Statistics, Applied Psychological Measurement, International Journal of Testing, Educational and Psychological Measurement, Frontiers in Psychology (Quantitative Psychology and Measurement section), and Journal of Experimental Education.
  • Over half of our Ph.D. students get summer internships (e.g., College Board, Educational Testing Service, American College Testing, Measured Progress) and/or dissertation fellowships (American Educational Research Association, Spencer Foundation, Society of Multivariate Experimental Psychology, Educational Testing Service).
  • Graduates of our program get academic positions all over the country (e.g., Arizona State University, University of Georgia, University of Texas, University of Minnesota, Georgetown University, The George Washington University, University of North Carolina, University of Hawaii, University of Denver), as well as jobs at leading research/testing agencies (e.g., Educational Testing Service, American College Testing, Westat, American Institutes for Research, Stanford Research International, Mathematica, Association of American Medical Colleges).
  • Our faculty include award-winning mentors, Fellows in leading national associations, and editors/editorial board members on such leading journals as Journal of Educational Measurement, Journal of Educational and Behavioral Statistics, Structural Equation Modeling: A Multidisciplinary Journal, Multivariate Behavioral Research, Psychometrika, Psychological Methods, Journal of Research on Educational Effectiveness, and AERA Open.

Gregory R. Hancock
Professor and Program Director
UM Distinguished Scholar-Teacher

1230D Benjamin Building
curriculum vitae

ghancock@umd.edu

Jeffrey R. Harring
Professor

1230E Benjamin Building

harring@umd.edu

Hong Jiao 
Associate Professor

1230C Benjamin Building

hjiao@umd.edu

Yang Liu 
Assistant Professor

1230B Benjamin Building

yliu87@umd.edu

Laura Stapleton 
Professor

1230A Benjamin Building

lstaplet@umd.edu

Peter Steiner 
Associate Professor

1233 Benjamin Building

psteiner@umd.edu

Tracy Sweet 
Associate Professor

1229 Benjamin Building

tsweet@umd.edu

Ji Seung Yang 
Associate Professor

1225 Benjamin Building

jsyang@umd.edu

   
EMERITUS, ADJUNCT, and AFFILIATE FACULTY  

C. Mitchell Dayton 
Professor Emeritus

cdayton@umd.edu

Robert W. Lissitz 
Professor Emeritus

rlissitz@umd.edu

George Macready 
Professor Emeritus

macready@umd.edu

Robert Mislevy 
Professor Emeritus

rmislevy@umd.edu

William Schafer 
Associate Professor Emeritus

wschafer@umd.edu

Kathryn Alvestad 
Adjunct Associate Professor

alvestad@umd.edu

 

Daniel Bonnéry 
Affiliate

dbonnery@umd.edu

Ji An
Ph.D. student 
jian12@umd.edu

I graduated from Nanjing University in China with a Bachelor's degree in Teaching Chinese as a Second Language. Before joining EDMS in 2013, I received an M.A. in Teaching & Curriculum from Michigan State University and gained classroom teaching experience in a public school in Michigan, where I became interested in educational measurement and evaluation.

 

 

Kristina Cassiday
Ph.D. student 
kcass@umd.edu

I received my M.A. degree in TESOL from the Monterey Institute of International Studies and spent two years in Guizhou province, China as a University English teacher through the Peace Corps. I joined the EDMS program in Fall 2013 and also work at an education non-profit as the Assistant Director of Research and Evaluation. I feel fortunate to be a part of such a wonderful community of graduate students and outstanding faculty.

 

Yishan Ding
Ph.D. student 
ysding@terpmail.umd.edu

I graduated with a master's degree in International Education and a certificate in Survey Design and Data Analysis from the George Washington University in 2016. I then joined the EDMS program in 2017 as a master's student to pursue my interest in educational statistics research. My previous research experience includes working as a research fellow at the UNESCO office in Jakarta, where I worked on the water education project.

 

Yi Feng
Ph.D. student 
yifeng94@umd.edu

I earned my Master's degree in Applied Psychology at New York University, after which I spent two years conducting empirical research on adolescents' socio-emotional development at Purdue University. The experience of working with data and applying different analytic methods to answer various research questions contributed to my growing interest in quantitative methodology. It led me to EDMS, and I am now pursuing a Master's degree in this program.

 

 

Charlie Fisk
Ph.D. student 
fisk@umd.edu

I earned a B.S. in physics and mathematics from Penn State. I started my research career in the Children's Hospital of Philadelphia's radiology lab studying auditory steady-state responses in children with autism and adults with schizophrenia. I returned to school to study research methodology, graduating from Vanderbilt's quantitative methods master's program in 2019. Along the way I forged connections with the Vanderbilt School of Medicine, where I worked on scale development and latent growth modeling. I am very excited to be here in the EDMS program, where I can work and grow as a research methodologist.

 

Ari Houser
Ph.D. student 
ahouser@umd.edu

Ph.D. student, ABD.  My current EDMS research focus is generalized models for bivariate growth processes. I also work full time as a senior methods advisor for the AARP Public Policy Institute, where my areas of expertise include demographics, disability, Medicaid, long-term services and supports, and quality measurement.  BA in Physics and Engineering, Swarthmore College.  MA in EDMS. 

 

Bernadette Jerome
M.A. student 
bernadette.jerome@gmail.com

I am excited to be a part of the EDMS program. Before coming to Maryland, I worked in northern Uganda coordinating a randomized controlled trial of an early literacy program with the University of Michigan. My background is in education, and I gained my experience in Peace Corps Uganda, where I taught mathematics and computers to pre-service teachers and facilitated training for in-service teachers on continuous assessment. I graduated with a BS in Mathematics and Economics from the University of California, Los Angeles.

 

Tessa Johnson
Ph.D. student 
johnsont@umd.edu

I am a PhD student in the EDMS program with a methods background in latent variable models and an applied background in prevention science. I got my bachelor's degree in psychology from the University of Georgia in Athens and my master's in counseling and educational research from Georgia State University. My statistical research is focused on the development of methodologies used in the social, behavioral, and health sciences, especially in the areas of longitudinal processes and unobserved population heterogeneity.

 

Jung-Jung Lee
Ph.D. student 
jlee1256@terpmail.umd.edu

I received my Bachelor's in Psychology and my Master's in Industrial and Organizational Psychology from California State University, San Bernardino, in 2014 and 2016, respectively. After I obtained my Master's degree, I worked as a Workforce Research Analyst, with the primary duty of reporting labor market data for the local Southern California region surrounding a community college. My strong interest in Item Response Theory pushed me to continue my education, and I joined EDMS as a doctoral student in 2017.

 

 

Mancy Liao
Ph.D. student 
mancyliao@gmail.com

I earned my B.S. degree in Applied Psychology at Sun Yat-sen University, China. I have always been fascinated by the quantitative side of psychology. I came across educational measurement in the junior year of my undergraduate study and became interested in it while participating in a research project to develop large-scale educational assessments. Though I have taken some courses related to psychometric theory and item response theory, I believe that this area is well worth delving into further. I was glad to join the EDMS program as a master's student in the fall of 2015.

 

Kaiwen Man
Ph.D. student 
mankaiwen@hotmail.com

Before joining EDMS as a Ph.D. student in fall 2014, I earned master's degrees in Statistics and Economics from the University of Illinois at Urbana-Champaign (UIUC) and bachelor's degrees in Economics and Psychology from Lanzhou University. I am now interested in computerized adaptive testing (CAT), item response times, and the detection of cheating behavior. I hope to make a great difference in these wonderful areas!

 

Christian Meyer
Ph.D. student 
meyerct@umd.edu

Before starting the EDMS doctoral program, I studied physics at Creighton University in Omaha, Nebraska. After graduating, I received an Intramural Research Training Award (IRTA) from the National Institute of Mental Health (NIMH) to obtain additional research and methodological training. I then worked as a faculty research assistant in the Department of Psychology at UMD researching the neurological underpinnings of cognitive and emotional processes. Through these experiences, I developed an interest in understanding the theoretical basis for statistical methodology.

 

Monica Morell
Ph.D. student 
mmorell@umd.edu

I am a third-year Ph.D. student and the 2017-2018 UPASS (the EDMS student group) president. I earned a Master's degree in Research, Measurement, and Evaluation from the University of Miami in 2015 while working as a Clinical Research Coordinator at the Sylvester Comprehensive Cancer Center in Miami, FL. Through my work I became interested in the role of questionnaires and surveys in research, which led me to pursue a Ph.D. in the EDMS program in 2015. My research interests include Item Response Theory, item factor analysis models, and the effects of measurement error on causal inferences.

 

 

Erin Murphy
M.A. student 
emurphy@terpmail.umd.edu

I am a recent graduate of the University of Maryland, where I earned a Bachelor of Science in Public Health Science and a minor in Statistics. I learned of the field of measurement, statistics, and evaluation through an internship with the Centers for Disease Control and Prevention as well as my behavioral health coursework. My research interests include the methodological bases for the evaluation of behavioral health education interventions (e.g., using structural equation modeling, path analysis, and factor analysis).

 

Linet Ogoti
M.A. student 
linetogoti@gmail.com

I was born in Kenya and spent most of my life in Nairobi. I received my B.A. in Economics with a minor in Applied Mathematics from Smith College in 2013. I currently work at the QED Group as a part-time Human Resources Assistant. Prior to QED, I was a Survey Staff specialist at Westat. I believe that an EDMS degree will enable me to make a significant contribution in the area of statistics and research. When I am not studying or working, I enjoy reading, listening to music, hiking, and meeting new people.

 

 

Mary Petras
M.A. student 
mpetras@terpmail.umd.edu

Prior to joining the EDMS community at UMD, I attended Duquesne University in Pittsburgh, PA, graduating in 2013 with a BA in Math and a BS in Math Education. I want to use evaluative methods that make children's experience of learning math the best it can be. I hope to form a researcher's perspective, in addition to an educator's perspective, on the quantitative methods that inform many decisions in schools. While enrolled part time in graduate courses, I am currently teaching middle school math at an independent school in Washington, DC.

 

 

Jordan Yee Prendez
Ph.D. student 
jprendez@umd.edu

Jordan Yee Prendez joined EDMS in the fall of 2014. He is interested in statistical model evaluation, causal inference, and statistical computing. Jordan is a past president of the EDMS student club UPASS and has experience in institutional research and statistical consulting.

 

 

Xin (Joye) Qiao
Ph.D. student 
xin.qiao56@gmail.com

I was a test prep teacher helping Chinese students with the TOEFL, SAT, GRE, and GMAT after I received my bachelor's degree in Electrical Engineering from the Nanjing University of Posts and Telecommunications. It was from this work experience that I developed a strong interest in educational measurement. I then earned a Master's degree in Statistics and Measurement from the University of Pennsylvania. After that, I joined the EDMS doctoral program with the aim of learning more in the field of psychometrics. My research interests now lie mainly in computerized adaptive testing and item response theory.

 

Kathleen Robens
M.A. student 
kcrobens@gmail.com

 

I am a National Board Certified mathematics teacher and have been teaching in Montgomery County Public Schools, Maryland for over 20 years. I have carried out projects to strengthen cross-curricular knowledge in STEM and was a teacher intern at the National Institute on Drug Abuse. My bachelor's and master's degrees are from the University of New Mexico and Teachers College, Columbia University.

 

Patrick Sheehan
M.A. student 
psheehan@umd.edu

 

In 2016, I graduated from The George Washington University with a bachelor's degree in psychology with a minor in statistics. For the next two years, I worked as a research analyst where I assisted in data collection and analysis for several program evaluations of federal employment programs. In 2018, I joined EDMS as a master's student, where I am excited to learn more about statistical modeling and its applications to the development of large-scale assessments.

 

Marian Strazzeri
Ph.D. student 
mstrazz@terpmail.umd.edu

I am currently pursuing a Ph.D. in Measurement, Statistics, and Evaluation (EDMS) at UMD to advance my ability to conduct and ultimately lead applied psychometric research.  I am particularly interested in the use of clinical outcome assessments (COAs) in clinical research (such as drug development) and health care settings. I earned my BAs in Mathematical Statistics and Sociology from the University of Virginia (UVA) in 2008 and then went on to also earn my M.S. in Statistics from UVA in 2009.  After graduating with my master's, I worked as a research analyst in the Office of Surveillance and Epidemiology (OSE) at the US Food and Drug Administration’s Center for Drug Evaluation and Research (FDA/CDER).  In OSE, among other things, I helped conduct observational, epidemiologic post-market safety studies to investigate potential safety signals generated from adverse event reports.  Subsequently, I worked as a Mathematical Statistician at the National Center for Health Statistics (NCHS) from 2013 to 2015.  At NCHS, I gained experience with survey design and sampling methods, small area estimation, multiple imputation techniques (such as hot deck), and methods for evaluating the impact of imputed data on design effects and variance estimates.  In July 2015, I transferred to the Office of Biostatistics (OB) at FDA/CDER (where I still currently work), where my work has been focused primarily on the use of COAs (particularly patient-reported outcome measures, or PROs) to inform medical product development and regulatory review.  It is through my work in FDA/CDER’s Office of Biostatistics that I became interested in the fields of psychometrics and measurement theory—particularly the application of these methodologies to the development and use of PROs and other types of COAs in clinical research.

 

Chen Tian
Ph.D. student 
ctian1@terpmail.umd.edu

I earned my bachelor’s degree in Psychology from Sun Yat-Sen University, China in 2016. After that, I studied at the University of Illinois at Urbana-Champaign and got an M.S. degree in Educational Psychology. I became interested in educational measurement in an introductory course when I was a junior, and learned more about item response theory and computerized adaptive testing at UIUC. Those experiences brought me to this field, and I am excited to begin my Ph.D. at EDMS.

 

Anna Van Wie
Ph.D. student 
avanwie@hotmail.com

I received my master's degree in EDMS in 1998. I returned to complete my Ph.D. in 2009. I work full time for UMUC assisting instructors with learning outcomes assessment. In addition to school and work, I have 4 little active boys to keep me very busy.

 

Weimeng Wang
Ph.D. student 
weimengbonnie@gmail.com

I got my bachelor's degree in Arabic language and literature from Beijing Foreign Studies University in 2014 and received my master's degree in school counseling from the University of Maryland in 2016. Now I am a first-year master's student in EDMS. Before I joined the program, I interned at local public schools as a school counselor. Meanwhile, I conducted educational research focused on bullying and bullying prevention with one of my professors.

 

 

Reni Xu
Ph.D. student 
renixu@umd.edu

I received my bachelor's degree in Digital Media Technology from Beijing University of Technology, and I finished my master's study at EDMS in 2015. After graduation, I went back to China and worked for a year as a virtual reality (VR) engineer focusing on 3D motion tracking. I rejoined the EDMS program as a doctoral student in 2017. I am currently working with Dr. Yang Liu, and my research interests include (but are not limited to) item response theory and diagnostic models.

 

 

Qiwen Zheng
Ph.D. student 
qzheng12@umd.edu

I joined the EDMS doctoral program in 2013 after earning my master's degree in statistics at the University of Illinois at Urbana-Champaign. I decided to go further in educational measurement after taking some related courses and volunteering in the Cognitive Development Lab in the Department of Psychology at UIUC. My previous research experience includes dimensionality detection for latent variables and nonparametric regression programming. I'm really excited to join this big EDMS family, and I'm eager to devote myself to educational measurement, statistics, and evaluation during my doctoral study at the University of Maryland.

 

Yating Zheng
Ph.D. student 
yzheng12@umd.edu

I graduated from Fudan University with a bachelor's degree in history. Motivated by an intense interest in psychometrics and quantitative methodology, I went to the Educational Psychology Quantitative Methods program at the University of Texas at Austin where I learned some psychometric and statistical models, developed data analysis skills and got a master's degree. To go further in this fascinating field, I joined the EDMS program as a doctoral student in 2015. My research interests include multilevel modeling, classification, and large-scale assessment in educational settings.

 

Jinwang Zou
Ph.D. student 
jwzou@umd.edu

I graduated from Nanjing University in 2012 with a bachelor's degree in Economics. Before joining EDMS in 2014, I received a master's degree in Economics of Education from Renmin University of China. I found that EDMS is a good match for my research interest and experience so I joined this wonderful program.

Ph.D. Program

The Ph.D. program is designed to qualify individuals to teach courses at the university level in applied measurement, statistics, and evaluation, to provide leadership in the conduct of research studies, and to serve as applied statistics, measurement, or evaluation specialists in school systems, industry, and government.

Courses within the program are selected from the offerings of the Measurement, Statistics and Evaluation program and other departments of the University. Each student's program of study is structured to take into account the individual's background and future aims. There is a common core of courses consisting of:

EDMS 623 Applied Measurement: Issues and Practices (3)

EDMS 626 Instrumentation (3)

EDMS 646 General Linear Models I (3)

EDMS 647 Causal Inference and Evaluation Methods (3)

EDMS 651 General Linear Models II (3)

EDMS 655 Introduction to Multilevel Modeling (3)

EDMS 657 Exploratory Latent and Composite Variable Methods (3)

EDMS 722 Structural Modeling (3)

EDMS 724 Modern Measurement Theory (3)

EDMS 779 Mathematical Foundations and Simulation Techniques (3)

EDMS 787 Bayesian Inference and Analysis (3)

EDMS 899 Doctoral Dissertation Research (12)

In addition, at least 21 credits of elective courses must be selected in consultation with the student's advisor. A minimum of 30 credit hours (including EDMS 899) must be taken following admission. In addition to courses, students must complete a doctoral preliminary examination and a doctoral comprehensive examination and are expected to participate in research and publication. A faculty advisor may require courses beyond those specified in this document. The EDMS Program does not offer the Doctor of Education (Ed.D.) degree.

The many and varied professional interests of faculty members provide opportunities for presentations of research reports by students as well as faculty. Recent presentations by students have dealt with a variety of original research topics, in such areas as applied statistics, modeling of traits, test construction, test evaluation, and survey research.

 

M.A. Program

The M.A. program is designed to provide balanced intermediate level graduate training in quantitative methods for students preparing for a variety of positions in government, educational institutions, and private industry. Proximity to Washington, D.C., provides opportunities for students to engage in a variety of academic and professional experiences.

The program requires a minimum of 30 credit hours in courses acceptable for credit toward a graduate degree. At least 18 credits must be selected from courses numbered 600 or above, combining content from statistics, measurement, evaluation, and related fields outside of the Department (a maximum of 6 credit hours from outside the Department).

Courses within the program are selected from the offerings of the Measurement, Statistics and Evaluation program and other departments of the University. Each student's program of study is structured to take into account the individual's background and future aims. There is a common core of courses consisting of:

EDMS 623 Applied Measurement: Issues and Practices (3) 

EDMS 646 General Linear Models I (3) 

EDMS 647 Causal Inference and Evaluation Methods (3)

EDMS 651 General Linear Models II (3)

EDMS 655 Introduction to Multilevel Modeling (3)

EDMS 657 Exploratory Latent and Composite Variable Methods (3) 

EDMS 724 Modern Measurement Theory (3)

Additional elective coursework completes the 30 credit hours. A written comprehensive examination based on the first four courses of the core (EDMS 646, 651, 623, and 647) is required. The Graduate School allows transfer of up to six credits of appropriate prior graduate work. Both thesis and non-thesis options are available in the Master of Arts program. All coursework applied toward completion of the degree must be completed within a five-year period. The program does not offer the Master of Education degree.

 

Combined B.A. and M.A. program

The Measurement, Statistics and Evaluation program in the College of Education offers a 5th Year M.A. program for undergraduates interested in quantitative methods. This allows highly motivated undergraduates the chance to develop their skills in quantitative methods and complete both the bachelor's and master's degrees in just 5 years. This degree prepares one for careers that include statistical data analysis, developing survey and testing instruments, designing research studies, performing quantitative analysis, evaluating educational programs, psychometric analysis of assessment results, conducting marketing studies, and conducting surveys.

Almost any undergraduate major would be appropriate for this program, including psychology, sociology, mathematics, statistics, computer science, communications, business, and economics, and even English and history. The critical quality that a student needs to bring to the department is high quantitative ability. This is a profession that trains at the graduate level, and the specific undergraduate major is not a critical factor. Fields involving the application of modern quantitative research techniques provide an outlet for those trained in this program.

While still undergraduates, students can take three of the following four courses: EDMS 646, EDMS 651, EDMS 623, and EDMS 647. These 9 credits count toward both the undergraduate and graduate degree programs. Students must also formally apply to the master's program during their senior year. Please contact the EDMS Director of Graduate Studies for more information on the enrollment process.

 

Certificate

Students who are enrolled in another graduate program at the University of Maryland may apply to our certificate program. This program allows students to obtain the skills and knowledge for a quantitative specialization that they can use within their content area of interest. The certificate requires completion of 21 credits, including the following courses:

EDMS 623 Applied Measurement: Issues and Practices (3) 

EDMS 646 General Linear Models I (3) 

EDMS 647 Causal Inference and Evaluation Methods (3)

EDMS 651 General Linear Models II (3)

The remaining nine credits of coursework should be chosen in consultation with the student's EDMS advisor. For more information about the enrollment process for the certificate, please contact the Director of Graduate Studies for the EDMS program.

EDMS 451 Introduction to Educational Statistics (3 credits). Restriction: Sophomore standing or higher. Credit only granted for: BIOM301, BMGT230, CCJS200, ECON230, ECON321, EDMS451, GEOG306, GEOL351, GVPT422, PSYC200, or SOCY201. Introduction to statistical reasoning; location and dispersion measures; computer applications; regression and correlation; formation of hypotheses tests; t-test; one-way analysis of variance; analysis of contingency tables.

EDMS 489 Field Experiences in Measurement and Statistics (1-4 credits). Restriction: Permission of EDUC-Human Development and Quantitative Methodology department. Repeatable to 4 credits. Planned field experience in education-related activities. Credit not to be granted for experiences accrued prior to registration.

EDMS 498 Special Problems in Measurement and Statistics (1-3 credits). Prerequisite: Available only to education majors who have formal plans for individual study of approved problems. Restriction: Permission of EDUC-Human Development and Quantitative Methodology department. Repeatable to 6 credits. Available only to education majors who have formal plans for individual study of approved problems.

EDMS 610 Classroom Assessment and Evaluation (3 credits). Develop the understandings and skills needed to validly, reliably, and accurately assess student learning and to provide focused leadership in the area of classroom assessment.

EDMS 622 Theory and Practice of Standardized Testing (3 credits). Prerequisite: EDMS451; or EDMS645. Principles of interpretation and evaluation of aptitude, achievement, and personal-social instruments; theory of reliability and validity; prediction and classification; norm- and criterion-referenced testing concepts.

EDMS 623 Applied Measurement: Issues and Practices (3 credits). Prerequisite: EDMS410. And EDMS645; or students who have taken courses with comparable content may contact the department. Measurement theory and its application at an intermediate level; test development, validation and interpretation; issues and recent developments in measurement.

EDMS 626 Instrumentation (3 credits). Prerequisite: EDMS623. Theory, development, and applications of various affective, cognitive, or behavioral measurement instruments and procedures, including questionnaire and test items, observational protocols, and cutting-edge innovative game and scenario-based assessments.

EDMS 645 Quantitative Research Methods I (3 credits). Research design and statistical applications in educational research: data representation; descriptive statistics; estimation and hypothesis testing. Application of statistical computer packages is emphasized.

EDMS 646 General Linear Models I (3 credits). Prerequisite: EDMS645; or an equivalent introductory statistics course. A first post-introductory inferential statistics course, with emphasis on analysis of variance procedures and designs from within the general linear modeling framework. Assignments include student analysis of education and related data; application of statistical software packages is emphasized.

EDMS 647 Causal Inference and Evaluation Methods (3 credits). Prerequisite: Must have completed or be concurrently enrolled in EDMS651. Counterfactual (potential outcomes) framework for causal inference, design/analysis strategies for confounder control, and specific best-practice applications to the evaluation of programs.

EDMS 651 General Linear Models II (3 credits). Prerequisite: EDMS646; or students who have taken courses with comparable content may contact the department. Multiple regression and correlation analysis; trend analysis; hierarchical and stepwise procedures; logistic regression; software for regression analysis.

EDMS 655 Introduction to Multilevel Modeling (3 credits). Prerequisite: EDMS651; or students who have taken courses with comparable content may contact the department. Introduction to multilevel models and methodology as strategies for modeling change and organizational effects.

EDMS 657 Exploratory Latent and Composite Variable Methods (3 credits). Prerequisite: EDMS651. Development of models for exploratory factor analysis and their practical applications. Additional topics will draw from latent class analysis, cluster analysis, mixture models, canonical correlation, multidimensional scaling, and configural frequency analysis.

EDMS 665 Data Analysis and Statistical Consulting (3 credits). Prerequisite: EDMS651; or students who have taken courses with comparable content may contact the department. Advanced data analysis procedures applied to real-world clients' problems arising in a wide variety of substantive research settings within and beyond education.

EDMS 722 Structural Modeling (3 credits). Prerequisite: EDMS657. Statistical theory and methods of estimation used in structural modeling; computer program applications; multisample models; mean structure models; structural models with multilevel data (e.g., sampling weights, growth models, multilevel latent variable models).

EDMS 724 Modern Measurement Theory (3 credits). Prerequisite: EDMS623 and EDMS651. Theoretical formulations of measurement from a latent trait theory perspective.

EDMS 738 Seminar in Special Problems in Measurement (1-3 credits). Restriction: Permission of EDUC-Human Development and Quantitative Methodology department. Repeatable to 3 credits. An opportunity for students with special interests to focus in depth on contemporary topics in measurement. Topics to be announced, but will typically be related to applied and theoretical measurement. Recent topics include: Large Scale Assessment, Advanced Item Response Theory, Computer Adaptive Testing

EDMS 747 Design of Program Evaluations (3 credits). Prerequisite: EDMS626, EDMS651, and EDMS647. Or permission of instructor; and permission of EDUC-Human Development and Quantitative Methodology department. Analysis of measurement and design problems in program evaluations.

EDMS 769 Special Topics in Applied Statistics in Education (1-4 credits). Restriction: Permission of EDUC-Human Development and Quantitative Methodology department. Designed primarily for students majoring or minoring in measurement, statistics or evaluation. Recent topics include: Mixture Models, Longitudinal Data Analysis, Advanced Structural Equation Modeling, Nonparametric Structural Equation Modeling

EDMS 771 Multivariate Data Analysis (3 credits). Prerequisite: EDMS651. Principal components, canonical correlation, discriminant functions, multivariate analysis of variance/covariance and other multivariate techniques.

EDMS 779 Seminar in Applied Statistics: Mathematical Foundations and Simulation Techniques (3 credits). Restriction: Permission of EDUC-Human Development and Quantitative Methodology department. And must be in Measurement, Statistics and Evaluation (Master's) program; or must be in Measurement, Statistics and Evaluation (Doctoral) program. One part of the course will be dedicated to learning about simulation design, implementation, execution, and dissemination of results. For this material, SAS and R will be the primary computing platforms for both course delivery and student work. The second part of the course will include an introduction into statistical computing. This course material will be delivered in R and will involve using and writing code in R and/or SAS.

EDMS 787 Bayesian Inference and Analysis (3 credits). Prerequisite: EDMS651. Credit only granted for: EDMS769B or EDMS787. Formerly: EDMS769B. Models and model fitting methods commonly used in Bayesian Inference, such as Markov Chain Monte Carlo methods (e.g., Gibbs, Metropolis Sampling), with applications within and beyond the social and behavioral sciences. Analytical and philosophical differences between Frequentist and Bayesian statistics will also be highlighted.

EDMS 798 Special Problems in Education (1-6 credits). Restriction: Must be in Measurement, Statistics and Evaluation (Master's) program; or must be in Measurement, Statistics and Evaluation (Doctoral) program. Master's, EDMS majors, or doctoral candidates who desire to pursue special research problems under the direction of their advisors may register for credit under this number.

EDMS 799 Master's Thesis Research (1-6 credits). Restriction: Must be in a major within EDUC-Human Development and Quantitative Methodology department. Registration required to the extent of 6 credits.

EDMS 879 Doctoral Seminar (1-3 credits). Restriction: Permission of EDUC-Human Development and Quantitative Methodology department. Seminar that supports analysis of doctoral projects and theses, and of other on-going research projects.

EDMS 889 Internship in Measurement and Statistics (3-12 credits). Prerequisite: Open only to students advanced to candidacy for doctoral degree. Restriction: Permission of EDUC-Human Development and Quantitative Methodology department. Provides internship experiences at a professional level of competence in a particular role with appropriate supervision. Credit not to be granted for experience accrued prior to registration. Open only to students advanced to candidacy for doctoral degree.

EDMS 898 Pre-Candidacy Research (1-8 credits)

EDMS 899 Doctoral Dissertation Research (1-8 credits). Registration required to the extent of 12 credits.

EDMS Psychometric Computation and Simulation Lab

The EDMS Psychometric Computation and Simulation (PCS) lab provides EDMS students with state-of-the-art software resources for conducting cutting-edge simulation and computational research, and gives students and faculty a means to conduct highly specialized training sessions for students and researchers from across our campus and from other institutions who are seeking high-level methodological training.

 

Maryland Assessment Research Center (MARC)

The Maryland Assessment Research Center (MARC) provides support to the range of assessment activities in the State, the region and nation by conducting basic and applied research to enhance the quality of assessment practice and knowledge. MARC is a project of the Measurement, Statistics, and Evaluation (EDMS) program in the Department of Human Development and Quantitative Methodology in the College of Education at the University of Maryland.

To accomplish its purposes, MARC houses expertise in assessment design, development, implementation, analysis, reporting, and policy issues, as well as the technical aspects of the quantitative theories that form the foundations of measurement. The nationally recognized EDMS program in the Department of Human Development and Quantitative Methodology, College of Education, at the College Park campus of the University of Maryland (UMCP) augments the capabilities of the MARC staff.

 

Center for Integrated Latent Variable Research (CILVR)

CILVR is a center whose goal is to serve as a national and international focal point for innovative collaboration, state-of-the-art training, and scholarly dissemination as they relate to the full spectrum of latent variable statistical methods.

CILVR is housed within the Measurement, Statistics and Evaluation (EDMS) program at the University of Maryland. EDMS faculty are recognized scholars in various facets of latent variable statistical models, whether it be item response theory, latent class analysis, mixture models, or structural equation modeling. EDMS is also unique in its geographic location, situated along the East Coast of the United States near the nation's capital, centrally located for scholars from the US and Europe, and a short distance from some of the world's top latent variable scholars working in academia, government, and applied research settings. Thus, EDMS is well positioned in its composition and its location to serve as a focal point for integrated latent variable research.

DATES | EVENT/TOPIC | INSTRUCTOR | MORE INFO

Nov 7-8, 2019
19th annual MARC Conference: Enhancing Effective Instruction and Learning Using Assessment Data: Theory and Practice
Dr. Hong Jiao & Dr. Robert Lissitz, Hosts
For more information, please visit: https://education.umd.edu/research/centers/marc/workshops-and-conferences/2019-marc-conference

Jan 6-10, 2020
Short Course: Structural Equation Modeling: A First Course (Jan 6-8)
Short Course: Structural Equation Modeling: A Second Course (Jan 9-10)
Dr. Gregory R. Hancock, University of Maryland
For more information, please visit: https://education.umd.edu/SEM-2020

Feb 20-21, 2020
Short Course: Introduction to Meta-Analysis
Dr. Joshua Polanin, American Institutes for Research
For more information, please visit: https://education.umd.edu/META-2020

Apr 13-15, 2020
Short Course: Introduction to Graphical Models
Dr. Peter Steiner, University of Maryland
More information coming soon...

June 8-12, 2020
Short Course: Bayesian Statistical Modeling: A First Course (June 8-10)
Short Course: Bayesian Statistical Modeling: A Second Course (June 11-12)
Dr. Roy Levy, Arizona State University
More information coming soon...

The Measurement, Statistics and Evaluation (EDMS) program at the University of Maryland is continually seeking qualified students for its master's and doctoral level programs.

We offer:

  • One of the nation's best programs in quantitative methods.
  • A large number of Research and Teaching Assistantships.
  • Training in a field with great career potential.
  • Opportunities for field work on campus, in the Washington area, and nationally.
  • A capable and active group of graduate students.
  • Opportunities to work on projects through grants, contracts, and research centers.

 

What are Measurement, Statistics, and Evaluation?

Measurement: At its most basic level, the field of Measurement is concerned with the assignment of numbers to objects in some systematic, meaningful way. In education and the social sciences, measurement typically refers to a rational assignment of numbers so that they quantitatively describe some unobservable (i.e., latent) construct like ability, personality, attitude, satisfaction, etc. The assignment of these numbers must proceed in a carefully prescribed, reproducible fashion. In most cases, the process is based on a mathematical model that defines how the numbers should behave as the underlying construct changes. Measurement professionals develop and utilize a variety of mathematical tools to determine the reliability, validity and meaning of the numbers assigned during the measurement process. They are employed in both research and applied settings that involve psychological testing, educational testing, or the measurement of attitudes and preferences. These include large scale testing programs like the SAT and ACT.
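
To make the idea of such a measurement model concrete, here is one standard example (offered only as an illustration, not as a description of any particular EDMS course or project): the two-parameter logistic item response model, in which the probability that examinee i answers item j correctly depends on the examinee's latent ability \(\theta_i\), the item's difficulty \(b_j\), and the item's discrimination \(a_j\):

\[ P(X_{ij} = 1 \mid \theta_i) = \frac{1}{1 + \exp\left[-a_j(\theta_i - b_j)\right]} \]

Fitting a model of this kind to response data places examinees and items on a common latent scale and provides the basis for evaluating how reliably and validly the resulting scores reflect the underlying construct.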

Statistics: The field of Applied Statistics is primarily concerned with the development of testable research hypotheses, the application of powerful statistical tests to determine the plausibility of a given research hypothesis, and the design of experiments to control extraneous sources of variation. Other facets of this discipline include the design of schemes to collect sample data that are representative of a given population, the description of a population using alternative characteristics of sample data, and the development of alternative models to explain relationships between observable variables. Additionally, the assessment of model fit and the estimation of model parameters also fall under the rubric of Applied Statistics. Professionals in Applied Statistics develop and utilize quantitative techniques to test hypotheses, develop models for observed data, and assess the adequacy of those models. They are employed in a variety of settings where practical decisions must be made on the basis of observed data (e.g., business, industry, government, and education). These include agencies like the American Institutes for Research, the U.S. Census Bureau, the National Center for Education Statistics, and a large number of private organizations that conduct surveys or polls.
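
As a minimal illustration of this workflow, the following R sketch uses simulated data and hypothetical variable names (weekly study hours and test scores); it estimates a simple regression model, tests the hypothesis that the predictor has no effect, and inspects basic summaries of model fit:

    # Minimal sketch with simulated (hypothetical) data
    set.seed(1)
    hours <- rnorm(200, mean = 10, sd = 3)         # hypothetical predictor: weekly study hours
    score <- 50 + 2 * hours + rnorm(200, sd = 8)   # hypothetical outcome: test score

    fit <- lm(score ~ hours)   # estimate the regression parameters by least squares
    summary(fit)               # t-test of H0: slope = 0, plus R-squared as a fit summary
    confint(fit)               # interval estimates for the intercept and slope

The same logic (specify a model, estimate its parameters, test hypotheses, assess fit) underlies the more elaborate methods taught in the program, from multilevel models to structural equation models.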

Evaluation: The field of Evaluation is concerned with the application of measurement and statistical principles to objectively evaluate institutional and organizational programs. Programs are generally evaluated with regard to the way in which they are planned and implemented, and the degree to which they accomplish their mission. Examples of such evaluation efforts include the Tennessee model for evaluating school effectiveness, the extensive work on the effects of teacher training upon student success, and the evaluations of the National Head Start Program. Evaluation professionals might work for the Federal Government as contract monitors, for an industry interested in determining their level of success in marketing, sales, or production, for the State Superintendent of Schools examining the results of large expenditures of time and money, and of course, for a University interested in teaching and research in the systematic application of measurement and statistics to the determination of program value.

 

The Programs

Measurement, Statistics and Evaluation offers programs of study leading to both the Master of Arts (M.A.) and Doctor of Philosophy (Ph.D.) degrees. M.A. students generally take introductory coursework in measurement, applied statistics, and evaluation. Ph.D. students typically complete an analogous program of study, after which they may continue with a multidisciplinary focus or concentrate primarily on a single discipline.

Most of the educational and social science research that takes place today relies on the expertise of those who develop data collection instruments (such as assessments, questionnaires, and interview protocols), plan research and evaluation studies, develop new models and methods, design sampling frameworks, and collect and analyze data. The University of Maryland programs in Measurement, Statistics and Evaluation provide students with advanced skills in these areas. The master's program gives individuals the broad range of skills necessary to serve as research associates in academic, government, and business settings. The doctoral program qualifies individuals to provide leadership in the conduct of research studies, to serve as specialists in applied statistics, measurement, or evaluation in school systems, industry, and government, and to teach quantitative courses at the university level. We are widely recognized as one of the best applied quantitative programs in the country. Our students are exceptional. Our faculty members are respected leaders within their specialties.

 

Assistantships

We can usually offer research or teaching assistantships to most of our full-time students. Assistantships are full-time appointments (20 hours per week) that carry both a stipend and a tuition waiver. During a research assistantship, students pursue both individual and collaborative research projects with a faculty member advisor. These appointments are renewable. The duties of teaching assistants range from instructional support (e.g., grading) to taking full responsibility for teaching a section of an undergraduate course (with appropriate supervision and support).

 

Career Opportunities

There is a tremendous demand for individuals with the quantitative skills provided by our program. Recent surveys of degree programs and employment in measurement show that there will continue to be a shortfall in the number of measurement professionals relative to the number of available employment opportunities. Similarly, the National Science Foundation predicts a strong job market for statisticians over the next ten years.

The need for applied quantitative professionals is strikingly evident in educational research. As education in the United States pushes toward greater accountability and better-informed decision making, there has been an ever-increasing demand for professionals who can interpret the abundance of collected data. Each of the nation's 16,000 school districts and each of the 50 state education departments collects large quantities of data to assess the impact of schooling in its community. There are many state and federal data-collection programs, ranging from a regular biennial survey of U.S. schools to specialized cross-sectional and longitudinal surveys. Many of these survey and testing programs result from legislative mandates, and they all rely heavily on instrumentation, research design, sampling schemes, and data analysis. Almost all educational research today depends on hard data, rigorous quantitative approaches, and individuals with research expertise. This is a field that is expanding; people recognize that quantitative methods are critical to developing educational theory and are needed to provide a sound foundation for educational progress. As summarized by the American Statistical Association, "[People with quantitative skills] concerned with educational issues readily find employment in universities, state and federal government agencies, and private research agencies. The many needs of the education sector for statistical expertise promise exciting careers to all whose interests lie in this field."

Quantitative professionals are not only in high demand, but they also have jobs that are both professionally and personally desirable. The problems that these professionals help solve often have a substantial impact on the lives of many other individuals. Additionally, many sources rank the job of "Statistician" among the most attractive professions with regard to working conditions.

 

Field Work and Postgraduate Opportunities

Our reputation and location provide tremendous opportunities for our students to conduct special projects and be involved with real-world, often ground-breaking, applications in government, research firms, associations, and private industry. Current and recent students have conducted special projects with or have been employed by American Institutes for Research, The Substance Abuse and Mental Health Services Administration, The Census Bureau, Westat, State Departments of Education, Local Education Agencies, National Education Association, U.S. Department of Education, Educational Testing Service, Maryland Assessment Research Center, Center for Applied Linguistics, and the Center for Biologics Evaluation and Research of the Food and Drug Administration. They also have taken professorial positions at such institutions as Arizona State University, University of Nebraska, University of Georgia, University of Texas - Austin, The George Washington University, University of Houston, Southern Illinois University, University of Hawaii, and University of Maryland Baltimore County.

 

Student Body

We are committed to maintaining high standards for admission to both our master's and doctoral programs, and we have been able to attract top students from across the United States and the world. Our students come from undergraduate institutions, government, professional associations, consulting firms, research units of companies, and the public schools. Our doctoral students are among the best on any campus. The real-world experiences, skills, and aptitude of our students help make our program intellectually rigorous while providing exceptional peer-to-peer support. Approximately one-half of our students attend full-time, and there is a very active departmental student organization.

 

Frequently Asked Questions

How many students are in the program?  We currently have approximately 45 students enrolled in our M.A. and Ph.D. programs.

Do I need to have experience in education to be accepted?  No. Our students come from a variety of disciplines both within and outside of education. The dominant characteristic of successful candidates is that they are very interested in the application of quantitative techniques to solve practical problems. Many of these problems arise in educational settings, but others emerge in the social sciences, government, business, and industry.

I have an undergraduate degree (B.A./B.S.) - may I enter directly into the Ph.D. program?  This is a possibility, but very rare. Typically, Ph.D. applicants first complete a relevant master's program. If one completes the EDMS master's program first, then one may apply for the Ph.D. program subsequently. Note that all courses taken as part of an EDMS master's degree apply toward the Ph.D., and the doctoral preliminary examination is waived for those who successfully complete the EDMS master's degree.

Do I need to have a background in mathematical statistics to enter your program? No, but strong quantitative skills and interests are necessary. Undergraduate courses in calculus and linear algebra provide necessary skills for our program.

Do you have many foreign students in your programs? Yes. Approximately 50% of our students are from countries other than the United States. For example, we have strong contingents of students from China and Korea.

What are the admissions criteria? The EDMS Admissions Committee meets regularly during the academic year. Decisions are based on several sources of information including scores on the Graduate Record Examination (GRE), undergraduate transcripts, letters of recommendation, and a statement of interest.

What are the deadlines for admission? Please see the Graduate School web site for current information about admissions for applicants including those holding foreign visas. Our selection committee meets throughout the academic year. However, note that the Graduate School clearance procedures for foreign students can take several months and we do not see your application until these clearances are complete. All applicants should keep track of their applications by visiting the Graduate School web site rather than by e-mailing EDMS.

How much is tuition? Please see the Bursar's Office web site for current information about student tuition. Note that students with full-time teaching or research assistantships, or fellowships, within the Department receive a tuition waiver in the amount of 10 credit hours per semester.

What types of financial aid are available? In addition to Department teaching and research assistantships, the University also offers financial assistance in the form of graduate fellowships, tuition scholarships, support grants, and a variety of need-based financial aid programs. However, since financial aid is limited, applicants who need financial aid should apply as early as possible.

When are classes held? Most classes are held once per week during the evening hours (e.g., 4:15-7:00 PM) so that part-time students with employment/family commitments can attend along with full-time students.

How long does it take a student to finish the program? The degree programs are structured so that a full-time student could finish requirements for the M.A. degree in approximately 2 years whereas approximately 4 years would be required to finish the Ph.D. requirements.

How well does our program accommodate part-time students? Most courses in the department are scheduled during late afternoon or evening hours to enable part-time students to attend classes. Faculty are generally available for consultation during evening hours as well. This provides part-time students with opportunities for both traditional instruction and individual study/research mentored by department faculty. However, some required courses for Ph.D. students require attendance during daytime hours.

More Information?  Contact Dr. Gregory R. Hancock, EDMS Program Director

 

The University of Maryland is an equal opportunity institution with respect to both education and employment. The university's policies, programs, and activities are in conformance with state and federal laws and regulations on non-discrimination regarding race, color, religion, age, national origin, political affiliation, gender, sexual orientation, or disability.

 

The College of Education has long provided instruction in quantitative research methods. Prior to 1964, there was one full-time professor in the area serving a College faculty of approximately 50 members. Although there were recognized areas of specialization, there was, at that time, no departmental structure within the College. Rather, the College operated as a single administrative unit headed by a Dean. During the next ten years there was rapid growth in the size of the College, which reached about 200 faculty by the mid-1970s. In this period, several new departments were formed, including EDMS around 1972 (we're a little fuzzy on the exact year). Throughout its history as a recognized area within the College, as a department, and now as a program within the HDQM Department, EDMS has engaged in the dual roles of professional training and service to the College.

In its first role, EDMS has provided training at the master's and doctoral levels for students planning to pursue careers in quantitative areas related to applied statistics, measurement, and evaluation. An important impetus for the development of the major program was a relatively large fellowship program funded in 1966 by Title IV of the Elementary and Secondary Education Act (ESEA). This grant provided support for 12 doctoral students per year, with about half of the positions going to EDMS majors and the remainder to minors with majors in other departments within the College. Graduates of the original ESEA fellowship program pursued a variety of important career paths, including professorships in higher education, research positions with Maryland boards of education, and research posts in the federal government. In fact, one very early graduate went on to become superintendent of the Baltimore City school system. The major program in EDMS had, and still has, a total of about 40-50 full- and part-time students. The number of full-time faculty in EDMS has fluctuated from six to eight, with a budgeted complement of seven at present.

In its second role, EDMS has provided service courses in applied statistics, measurement, and evaluation for graduate students majoring in other departments and programs in the College. In addition, faculty members have always been in high demand as members of doctoral research committees and as consultants to various externally funded projects within the College. This latter activity dates to the very earliest days of EDMS when one current faculty member was partially funded by a grant in the area of pupil personnel services and counselor education. The College has always required some course work provided by EDMS, with courses now in high demand across the campus. The professional training and service roles of EDMS serve complementary purposes. Virtually all full-time graduate students in EDMS are on some form of financial support and many of these students play key roles in service courses. In particular, several EDMS students each semester are employed as teaching assistants (tutors) for students from other units who are enrolled in courses such as EDMS 451, EDMS 645, EDMS 646, and EDMS 651. Also, advanced graduate students serve as primary instructors for the undergraduate EDMS 451 course. In effect, a strong cadre of EDMS graduate students is central to the service teaching role of the department.

Historically, EDMS did not vigorously seek external funding for its own research efforts. While, as noted, faculty members have participated in projects in other units, the procurement of major funding was rarely pursued. Within the last 10-15 years, however, EDMS has focused major effort, with great success, on procuring external funding through grants and contracts dedicated specifically to measurement, statistics, and evaluation.

Blozis, S. A., & Harring, J. R. (2018). Fitting nonlinear mixed-effects models with alternative residual covariance structures. Sociological Methods & Research.

Hancock, G. R., & An, J. (2018). Framing and improving scale reliability assessment using structural equation models. Educational Measurement: Issues and Practice.

Liu, M., Harbaugh, A. G., Harring, J. R., & Hancock, G. R. (in press). The effect of extreme response and non-extreme response styles on testing measurement invariance. Frontiers in Psychology (Quantitative Psychology and Measurement section).

Hancock, G. R., Harring, J. R., & Macready, G. B. (Eds.). (in press). Advances in latent class analysis: A Festschrift in honor of C. Mitchell Dayton. Charlotte, NC: Information Age Publishing.

Hancock, G. R., Stapleton, L. M., & Mueller, R. O. (Eds.). (2019). The reviewer's guide to quantitative methods in the social sciences (2nd ed.). New York: Routledge.

Harring, J. R., & Blozis, S. A. (in press). Modeling nonlinear longitudinal change with mixed effects models. In A. A. O’Connell & D. B. McCoach (Eds.), Multilevel modeling with introductory and advanced applications.

Harring, J. R., & Johnson, T. (2018). Two-way analysis of variance. In B. Frey (Ed.), The SAGE encyclopedia of educational research, measurement and evaluation. Thousand Oaks, CA: SAGE Publications.

Jiao, H., & Li, C. (2018). Progress in International Reading Literacy Study (PIRLS) data. In B. Frey (Ed.), The SAGE Encyclopedia of Educational Research, Measurement, and Evaluation. Thousand Oaks, CA: Sage.

Jiao, H., & Liao, D. (2018). Testlet response theory. In B. Frey (Ed.), The SAGE Encyclopedia of Educational Research, Measurement, and Evaluation. Thousand Oaks, CA: Sage.

Jiao, H., & Lissitz, R. W. (Eds.). (2018). Data analytics and psychometrics: Informing assessment practices. Charlotte, NC: Information Age Publishing.

Jiao, H., & Lissitz, R. W. (in press). Applications of artificial intelligence to assessment. Charlotte, NC: Information Age Publishing.

Jiao, H., Liao, D., & Zhan, P. (in press). Utilizing process data for cognitive diagnosis. In M. von Davier & Y. Lee (Eds.), Handbook of diagnostic classification models.

Lee, D., Harring, J. R., & Stapleton, L. M. (in press). Comparing methods for addressing missingness in longitudinal modeling of panel data. Journal of Experimental Education.

Leite, W. L., Stapleton, L. M., & Bettini, E. F. (2018). Propensity score analysis of complex survey data with structural equation modeling: A tutorial with Mplus. Structural Equation Modeling: A Multidisciplinary Journal.

Little, T. D., Widaman, K. F., Levy, R., Rodgers, J. L., & Hancock, G. R. (2018). Error, error in my model, who’s the fairest of them all? Research in Human Development, 14, 271-286.

Liu, J., & Harring, J. R. (in press). A systematic investigation of within-subject and between-subject covariance structures in growth mixture models. In G. R. Hancock, J. R. Harring, & G. B. Macready (Eds.). Advances in latent class analysis: A Festschrift in honor of C. Mitchell Dayton. Charlotte, NC: Information Age Publishing.

Man, K., Harring, J. R., Jiao, H., & Zhan, P. (in press). Joint modeling of compensatory multidimensional item responses and response times. Applied Psychological Measurement.

Man, K., Harring, J. R., Ouyang, U., & Thomas, S. L. (2018). Response time based nonparametric Kullback-Leibler divergence measure for detecting aberrant test-taking behavior. International Journal of Testing, 18, 155-177.

Man, K., Sinharay, S., & Harring, J. R. (in press). Use of data mining methods to detect test fraud. Journal of Educational Measurement.

McNeish, D., & Hancock, G. R. (2018). The effect of measurement quality on targeted structural model fit indices: A comment on Lance, Beck, Fan, and Carter (2016). Psychological Methods, 23, 184-190.

Mueller, R. O., & Hancock, G. R. (2019). Structural equation modeling. In G. R. Hancock, L. M. Stapleton, & R. O. Mueller (Eds.), The reviewer's guide to quantitative methods in the social sciences (2nd ed.) (pp. 445-456). New York: Routledge.

Qiao, X., & Jiao, H. (2018). Comparing data mining techniques in analyzing process data: A case study on PISA 2012 problem-solving items. Frontiers in Psychology.

Stapleton, L. M. (2019). Survey sampling, administration, and analysis. In G. R. Hancock, L. M. Stapleton & R. O. Mueller (Eds.), The reviewer’s guide to quantitative methods in the social sciences (2nd ed.) (pp. 467-481). New York, NY: Routledge.

Stapleton, L. M., & Kang, Y. (2018). Design effects of multilevel estimates from national probability samples. Sociological Methods & Research, 47, 430-457.

Stapleton, L. M., & Thomas, S. L. (in press). Using national and international datasets in multilevel modeling. In A. O’Connell, B. McCoach, & B. Bell (Eds.), Multilevel modeling methods with introductory and advanced applications. Charlotte, NC: Information Age Publishing.

Stegmann, G., Jacobucci, R., Harring, J. R., & Grimm, K. J. (2018). Nonlinear mixed-effects modeling programs in R. Structural Equation Modeling: A Multidisciplinary Journal, 25, 160-165.

Sweet, T. M. (2019). Social network analysis. In G. R. Hancock, L. M. Stapleton, & R. O. Mueller (Eds.), The reviewer's guide to quantitative methods in the social sciences (2nd ed.) (pp. 434-444). New York: Routledge.

Sweet, T. M. (in press). Modeling social networks as mediators: A mixed membership stochastic blockmodel for mediation. Journal of Educational and Behavioral Statistics.

Yang, J. S., Morell, M., & Liu, Y. (2018). Constructed response items. In B. Frey (Ed.), The SAGE Encyclopedia of Educational Research, Measurement, and Evaluation. Thousand Oaks, CA: Sage.

Zhan, P., & Jiao, H. (2018). Using JAGS for Bayesian cognitive diagnosis modeling: A tutorial. Journal of Educational and Behavioral Statistics.

Zhan, P., Jiao, H., Liao, D., & Li, F. (in press). A longitudinal higher-order diagnostic classification model. Journal of Educational and Behavioral Statistics.

Zhan, P., Wang, W.-C., Jiao, H., & Bian, Y. (2018). The probabilistic-inputs, noisy conjunctive models for cognitive diagnosis. Frontiers in Psychology.

Zhan, P., Jiao, H., Liao, M., & Bian, Y. (2018). Bayesian DINA modeling incorporating within-item characteristics dependency. Applied Psychological Measurement.

Zhang, S., Chen, Y., & Liu, Y. (2018). An improved stochastic EM algorithm for large‐scale full‐information item factor analysis. British Journal of Mathematical and Statistical Psychology.

Zheng, X., & Yang, J. S. (2018). Latent growth curve analysis with categorical data: Model specification, estimation, and panel attrition. Multivariate Behavioral Research, 53, 134-135.