Gregory R. Hancock
1230D Benjamin Building
1230E Benjamin Building
1230C Benjamin Building
1230B Benjamin Building
1229 Benjamin Building
1230A Benjamin Building
Ji Seung Yang
1225 Benjamin Building
EMERITUS AND ADJUNCT FACULTY
C. Mitchell Dayton
Robert W. Lissitz
I graduated from Nanjing University in China with a bachelor's degree in Teaching Chinese as a Second Language. Before joining EDMS in 2013, I received an MA in Teaching & Curriculum from Michigan State University and gained classroom teaching experience in a public school in Michigan, where I became interested in educational measurement and evaluation.
Before coming to UMD, I put both my B.S. and M.S. in mathematics to work, teaching for close to 15 years in both secondary and higher education settings. I also co-wrote and directed grant projects focused primarily on course redesign and the evaluation of learning intervention effectiveness. I am currently interested in the way that IRT is reshaping assessment and evaluation, especially through its use in Computerized Classification Testing. As a mature student with a number of priorities to balance and a prior career to expand upon, I've found EDMS course structures quite accommodating and the skills taught to be far-reaching and applicable. There are great opportunities for professional growth here under the guidance of a truly brilliant faculty and in the company of many exceptional graduate students.
Alyson Burnett
Over the past six years, I have worked in education research, first as an employee at American Institutes for Research (AIR) and currently at Mathematica Policy Research. My primary research topics include educator quality, school and district improvement strategies, preschool and early childhood, after school programming, and college and career readiness. I began the EDMS program in 2012 as a master's student in order to improve my quantitative research skills. As a current Ph.D. student, I hope to further advance my quantitative skills, develop a deep understanding of various methods, and learn how to assess them for use in education research. I am particularly interested in exploring research designs that can be used for causal inference when random assignment is not possible, such as regression discontinuity and propensity score matching.
I received my bachelor's and master's degrees in Portuguese Second Language Teaching from the University of Lisbon, Portugal, in 2002 and 2009. After I graduated in 2002, I worked for the University of Lisbon as a Portuguese Second Language instructor and as a research assistant in Second Language Testing. I then decided to deepen my training in Assessment, Evaluation and Quantitative Methods. With the help of the Fulbright Program, I joined EDMS in 2010 and am looking forward to learning as much as I can about psychometrics and advanced statistical modeling.
Kristina Cassiday
I received my M.A. degree in TESOL from the Monterey Institute of International Studies and spent two years in Guizhou province, China, as a university English teacher through the Peace Corps. I joined the EDMS program in Fall 2013 and also work as a Research Analyst on campus. I feel fortunate to be a part of such a wonderful community of graduate students and outstanding faculty.
Ryan Eu Jyn Chow
I am a Doctor of Musical Arts candidate at the University of Maryland. Since 2015, I have worked in the Development office of the Clarice Smith Performing Arts Center and the College of Arts and Humanities at the University of Maryland. I am particularly interested in prospect research and identification through quantitative methods, and in how such methodologies can translate into building new audiences. With a Bachelor of Accountancy from Singapore Management University, I believe EDMS is an innovative and fulfilling area that bridges my interests across a variety of sectors, including non-profit, business, education, and the arts.
Yishan Ding
I graduated with a master's degree in International Education and a certificate in Survey Design and Data Analysis from the George Washington University in 2016. I then joined the EDMS program in 2017 as a master's student to pursue my interest in educational statistics research. My previous research experience includes working as a research fellow at the UNESCO office in Jakarta, where I worked on the water education project.
I earned my Master's degree in Applied Psychology at New York University, after which I spent two years conducting empirical research on adolescents' socio-emotional development at Purdue University. The experience of dealing with data and applying different analytic methods to answer various research questions has contributed to my growing interest in quantitative methodology. It led me to EDMS, and now I am pursuing a Master's degree in this program.
I received my bachelor's degree in Mathematics and Sociology from Boston University, and I earned my master's degree here at EDMS in 2010. I loved my EDMS experience, so I have returned to earn my Ph.D. Since moving to DC, I have worked in program evaluation, credentialing, and education assessment. I currently work on the Analytics Team for a large assessment software company.
I graduated from the University of Maryland in 2013 with a degree in psychology and a minor in human development. Now in my fifth year as a doctoral student at Maryland, I study relational reasoning and its manifestations in both team and individual discourse. As a first-year master’s student in the EDMS department, I hope to find new and interesting ways of modeling team interactions as problem-solving occurs. When I’m not reading and writing, you can often find me playing roller hockey or biking the trails around the area.
Bernadette Jerome
I am excited to be a part of the EDMS program. Before coming to Maryland, I worked in northern Uganda coordinating a randomized controlled trial of an early literacy program with the University of Michigan. My background is in education, and I gained my experience in Peace Corps Uganda, where I taught mathematics and computers to pre-service teachers and facilitated training for in-service teachers regarding continuous assessments. I graduated with a BS in Mathematics and Economics from the University of California, Los Angeles.
Tessa Johnson
I am a PhD student in the EDMS program with a methods background in latent variable models and an applied background in prevention science. I got my bachelor's degree in psychology from the University of Georgia in Athens and my master's in counseling and educational research from Georgia State University. My statistical research is focused on the development of methodologies used in the social, behavioral, and health sciences, especially in the areas of longitudinal processes and unobserved population heterogeneity.
I work for Uniformed Services University's Consortium for Health and Military Performance, and have contributed to projects related to general military health, exertional heat stroke, injury risk, and health disparities. I have an MS in psychology from Loyola University in Baltimore, and am currently pursuing an MA at EDMS.
Daniel Yangsup Lee
Previously, I was a high school math teacher in New York City under the Teaching Fellows program. Around 2010, I took an interest in statistics and measurement after learning of their importance in the field of education, and started a Master's in Applied Statistics at Teachers College. After completing it, I joined the EDMS program as a full-time Ph.D. student, where I am now learning and working under the guidance of top-notch professors. My current research interests are broad but revolve mainly around topics prevalent in this area, such as SEM, IRT, large-scale assessment, and missing data.
Jung-Jung Lee
I received my Bachelor’s in psychology and my Master’s in Industrial and Organizational Psychology from California State University, San Bernardino, in 2014 and 2016. After I obtained my Master’s degree, I worked as a Workforce Research Analyst, with the primary duty of reporting labor market data for the local Southern California region surrounding a community college. My strong interest in Item Response Theory pushed me to take my education to the next level, and I joined EDMS as a doctoral student in 2017.
I graduated from Guangdong University of Foreign Studies with my bachelor's degree in Interpreting and Translation Studies. In 2013, I received my master's degree in TESOL from the University of Maryland, College Park. My research and work experiences in standardized language testing led me to the EDMS program, which I joined as a doctoral student in 2014.
I am from Xiamen, China. In 2013, I earned a bachelor's degree in statistics from Beijing Normal University. After four years of study, I am well versed in data analysis and statistical modeling. I was introduced to psychometrics while auditing a psychology class, and became interested in it. With a strong mathematical background, I am now keen to conduct interdisciplinary research in psychometrics to solve problems related to psychology and education.
I earned my B.S. degree in Applied Psychology at Sun Yat-sen University, China. I have always been fascinated by the quantitative side of psychology. I came across educational measurement during my junior year of undergraduate study and became interested in it while participating in a research project to develop large-scale educational assessments. Though I have taken some courses related to psychometric theory and item response theory, I believe that this area is well worth delving into. I was glad to join the EDMS program as a master's student in the fall of 2015.
I got my bachelor's degree at Xi'an Jiaotong University in China as an English major. Fresh out of college, I joined the EDMS family at UMCP as a master's student. Together with so many extremely knowledgeable faculty members and wonderful students, I started a new life!
Before joining EDMS as a Ph.D. student in Fall 2014, I earned dual master's degrees in Statistics and Economics from the University of Illinois at Urbana-Champaign (UIUC), and dual bachelor's degrees in Economics and Psychology from Lanzhou University. I am now interested in CAT, item response time, and cheating behavior detection. I hope to make a great difference in these wonderful areas!
Xiulin Mao
I joined the EDMS program after teaching at a Chinese university for more than 10 years. Coming back to school as a full-time student was a huge transition and challenge for me, especially psychologically. Now I'm swimming in an ocean of knowledge with the help of fellow students and fantastic professors, and I look forward to gaining as many achievements and experiences in this field as possible.
I am a third-year Ph.D. student and the 2017-2018 UPASS (the EDMS student group) president. I earned a Master’s degree in Research, Measurement, and Evaluation from the University of Miami in 2015 while working as a Clinical Research Coordinator at the Sylvester Comprehensive Cancer Center in Miami, FL. Through my work I became interested in the role of questionnaires and surveys in research, which led me to pursue a Ph.D. in the EDMS program in 2015. My research interests include Item Response Theory, Item Factor Analysis models, and the effects of measurement error on causal inferences.
I was born in Kenya and spent most of my life in Nairobi. I received my B.A. in Economics with a minor in Applied Mathematics from Smith College in 2013. I currently work at the QED Group as a part-time Human Resources Assistant. Prior to QED, I was a Survey Staff specialist at Westat. I believe that an EDMS degree will enable me to make a significant contribution in the area of statistics and research. When I am not studying or working, I enjoy reading, listening to music, hiking, and meeting new people.
Jordan Prendez is a fourth-year Ph.D. student in EDMS and the 2016-2017 UPASS (the EDMS student group) president. He graduated from San Jose State University with a bachelor's degree in psychology. He is interested in improving statistical models' treatment of real-world data and in statistical computing.
I received a bachelor's degree in Psychology from Juniata College in Huntingdon, PA in 2009. I went on to complete a Master's in Education from McDaniel College in Westminster, MD in 2011. I was a high school special education and mathematics teacher in Loudoun County, VA for 3 years before deciding to pursue a Ph.D. My interests in quantitative research, standardized testing, and education led me to the EDMS program in fall 2014.
After receiving my bachelor's degree in Electrical Engineering from the Nanjing University of Posts and Telecommunications, I worked as a test prep teacher helping Chinese students with the TOEFL, SAT, GRE, and GMAT. It was through this work experience that I developed a strong interest in educational measurement. I then earned a Master's degree in Statistics and Measurement from the University of Pennsylvania. After that, I joined the EDMS doctoral program with the aim of learning more in the field of psychometrics. My research interest now lies mainly in Computer Adaptive Testing and Item Response Theory.
Kathleen Robens
I am a National Board Certified mathematics teacher and have been teaching in Montgomery County Public Schools, Maryland for over 20 years. I have carried out projects to strengthen cross-curricular knowledge in STEM and was a teacher intern at the National Institute on Drug Abuse. My bachelor's and master's degrees are from the University of New Mexico and Teachers College, Columbia University.
I am currently pursuing a Ph.D. in Measurement, Statistics, and Evaluation (EDMS) at UMD to advance my ability to conduct and ultimately lead applied psychometric research. I am particularly interested in the use of clinical outcome assessments (COAs) in clinical research (such as drug development) and health care settings. I earned my BAs in Mathematical Statistics and Sociology from the University of Virginia (UVA) in 2008 and then went on to also earn my M.S. in Statistics from UVA in 2009. After graduating with my master's, I worked as a research analyst in the Office of Surveillance and Epidemiology (OSE) at the US Food and Drug Administration’s Center for Drug Evaluation and Research (FDA/CDER). In OSE, among other things, I helped conduct observational, epidemiologic post-market safety studies to investigate potential safety signals generated from adverse event reports. Subsequently, I worked as a Mathematical Statistician at the National Center for Health Statistics (NCHS) from 2013 to 2015. At NCHS, I gained experience with survey design and sampling methods, small area estimation, multiple imputation techniques (such as hot deck), and methods for evaluating the impact of imputed data on design effects and variance estimates. In July 2015, I transferred to the Office of Biostatistics (OB) at FDA/CDER (where I still currently work), where my work has been focused primarily on the use of COAs (particularly patient-reported outcome measures, or PROs) to inform medical product development and regulatory review. It is through my work in FDA/CDER’s Office of Biostatistics that I became interested in the fields of psychometrics and measurement theory, particularly the application of these methodologies to the development and use of PROs and other types of COAs in clinical research.
Jie Sun
I received my master's degree in Applied Statistics from Bowling Green State University in 2014. After graduation, I started working in the National and International Statistics program at American Institutes for Research. In the last two years, I have enjoyed conducting educational research, enhancing my quantitative skills, and expanding my research areas. My research interests include large-scale assessment, higher education research, and quantitative methods.
Frances Turner
I am an Assistant Professor in Biology at Howard Community College. I am very interested in finding relevant ways to measure learning and in designing useful assessments to evaluate the courses I coordinate and teach. I have a bachelor's degree in Biology from the University of Maryland, Baltimore County and a master's degree in Molecular Science from Ryerson University.
Anna Van Wie
I received my master's degree in EDMS in 1998. I returned to complete my Ph.D. in 2009. I work full time for UMUC assisting instructors with learning outcomes assessment. In addition to school and work, I have 4 little active boys to keep me very busy.
Weimeng Wang
I got my bachelor's degree in Arabic language and literature from Beijing Foreign Studies University in 2014 and received my master's degree in school counseling from the University of Maryland in 2016. Now, I am a first-year master's student in EDMS. Before I joined the program, I interned at local public schools as a school counselor. Meanwhile, I conducted educational research focused on bullying and bullying prevention with one of my professors.
I received my bachelor’s degree in Digital Media Technology from Beijing University of Technology, and I finished my master’s study at EDMS in 2015. After my graduation, I went back to China and worked for a year as a virtual reality (VR) engineer focusing on 3D motion tracking. I rejoined the EDMS program as a doctoral student in 2017. I am currently working with Dr. Yang Liu, and my research interests include (but are not limited to) item response theory and diagnostic models.
I joined the EDMS doctoral program in 2013 after earning my master's degree in statistics at the University of Illinois at Urbana-Champaign. I decided to go further in educational measurement after taking some related courses and volunteering in the Cognitive Development Lab in the Department of Psychology at UIUC. My previous research experience includes dimension detection for latent variables and non-parametric regression programming. I'm really excited to join this big EDMS family, and I'm willing to devote myself to educational measurement, statistics, and evaluation during my doctoral study at the University of Maryland.
Xiaying James Zheng
I received my master's degree at EDMS in 2014. I am extremely impressed by the faculty's dedication to students' success in life and research. I am now continuing to earn my Ph.D. at EDMS. My research interest is in measurement. Previously I worked as a program manager in a cross-cultural training organization. I also have experience working in institutional research for the University System of Maryland.
Yating Zheng
I graduated from Fudan University with a bachelor's degree in history. Motivated by an intense interest in psychometrics and quantitative methodology, I went to the Educational Psychology Quantitative Methods program at the University of Texas at Austin, where I studied psychometric and statistical models, developed data analysis skills, and earned a master's degree. To go further in this fascinating field, I joined the EDMS program as a doctoral student in 2015. My research interests include multilevel modeling, classification, and large-scale assessment in educational settings.
Jinwang Zou
I graduated from Nanjing University in 2012 with a bachelor's degree in Economics. Before joining EDMS in 2014, I received a master's degree in Economics of Education from Renmin University of China. I found that EDMS is a good match for my research interest and experience so I joined this wonderful program.
EDMS 451 Introduction to Educational Statistics (3 credits). Restriction: Sophomore standing or higher. Credit only granted for: BIOM301, BMGT230, CCJS200, ECON230, ECON321, EDMS451, GEOG306, GEOL351, GVPT422, PSYC200, or SOCY201. Introduction to statistical reasoning; location and dispersion measures; computer applications; regression and correlation; formulation of hypothesis tests; t-test; one-way analysis of variance; analysis of contingency tables.
EDMS 489 Field Experiences in Measurement and Statistics (1-4 credits). Restriction: Permission of EDUC-Human Development and Quantitative Methodology department. Repeatable to 4 credits. Planned field experience in education-related activities. Credit not to be granted for experiences accrued prior to registration.
EDMS 498 Special Problems in Measurement and Statistics (1-3 credits). Prerequisite: Available only to education majors who have formal plans for individual study of approved problems. Restriction: Permission of EDUC-Human Development and Quantitative Methodology department. Repeatable to 6 credits.
EDMS 610 Classroom Assessment and Evaluation (3 credits). Develop the understandings and skills needed to validly, reliably, and accurately assess student learning and to provide focused leadership in the area of classroom assessment.
EDMS 622 Theory and Practice of Standardized Testing (3 credits). Prerequisite: EDMS451; or EDMS645. Principles of interpretation and evaluation of aptitude, achievement, and personal-social instruments; theory of reliability and validity; prediction and classification; norm- and criterion-referenced testing concepts.
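The reliability theory named in this description can be made concrete with a small computation. As an illustrative sketch (not course material; the scores below are hypothetical), Cronbach's alpha estimates internal-consistency reliability from item-level scores:

```python
def cronbach_alpha(items):
    """Cronbach's alpha: internal-consistency reliability estimate.
    `items` is a list of columns, one per item; each column holds
    every examinee's score on that item."""
    k = len(items)       # number of items
    n = len(items[0])    # number of examinees

    def var(xs):         # unbiased sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_var_sum = sum(var(col) for col in items)
    totals = [sum(col[i] for col in items) for i in range(n)]
    return k / (k - 1) * (1 - item_var_sum / var(totals))

# Three items answered by four examinees (hypothetical data)
items = [[2, 4, 3, 5], [3, 5, 4, 5], [2, 5, 3, 4]]
print(round(cronbach_alpha(items), 2))  # → 0.95
```

Alpha rises when item scores covary strongly relative to their individual variances, which is why it is read as evidence that the items measure a common construct.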
EDMS 623 Applied Measurement: Issues and Practices (3 credits). Prerequisite: EDMS410. And EDMS645; or students who have taken courses with comparable content may contact the department. Measurement theory and its application at an intermediate level; test development, validation and interpretation; issues and recent developments in measurement.
EDMS 626 Instrumentation (3 credits). Prerequisite: EDMS623. Theory, development, and applications of various affective, cognitive, or behavioral measurement instruments and procedures, including questionnaire and test items, observational protocols, and cutting-edge innovative game and scenario-based assessments.
EDMS 645 Quantitative Research Methods I (3 credits). Research design and statistical applications in educational research: data representation; descriptive statistics; estimation and hypothesis testing. Application of statistical computer packages is emphasized.
EDMS 646 General Linear Models I (3 credits). Prerequisite: EDMS645; or an equivalent introductory statistics course. A first post-introductory inferential statistics course, with emphasis on analysis of variance procedures and designs from within the general linear modeling framework. Assignments include student analysis of education and related data; application of statistical software packages is emphasized.
EDMS 647 Causal Inference and Evaluation Methods (3 credits). Prerequisite: Must have completed or be concurrently enrolled in EDMS651. Counterfactual (potential outcomes) framework for causal inference, design/analysis strategies for confounder control, and specific best-practice applications to the evaluation of programs.
EDMS 651 General Linear Models II (3 credits). Prerequisite: EDMS646; or students who have taken courses with comparable content may contact the department. Multiple regression and correlation analysis; trend analysis; hierarchical and stepwise procedures; logistic regression; software for regression analysis.
EDMS 655 Introduction to Multilevel Modeling (3 credits). Prerequisite: EDMS651; or students who have taken courses with comparable content may contact the department. Introduction to multilevel models and methodology as strategies for modeling change and organizational effects.
EDMS 657 Exploratory Latent and Composite Variable Methods (3 credits). Prerequisite: EDMS651. Development of models for exploratory factor analysis and their practical applications. Additional topics will draw from latent class analysis, cluster analysis, mixture models, canonical correlation, multidimensional scaling, and configural frequency analysis.
EDMS 665 Data Analysis and Statistical Consulting (3 credits). Prerequisite: EDMS651; or students who have taken courses with comparable content may contact the department. Advanced data analysis procedures applied to real-world clients' problems arising in a wide variety of substantive research settings within and beyond education.
EDMS 722 Structural Modeling (3 credits). Prerequisite: EDMS657. Statistical theory and methods of estimation used in structural modeling; computer program applications; multisample models; mean structure models; structural models with multilevel data (e.g., sampling weights, growth models, multilevel latent variable models).
EDMS 724 Modern Measurement Theory (3 credits). Prerequisite: EDMS623 and EDMS651. Theoretical formulations of measurement from a latent trait theory perspective.
EDMS 738 Seminar in Special Problems in Measurement (1-3 credits). Restriction: Permission of EDUC-Human Development and Quantitative Methodology department. Repeatable to 3 credits. An opportunity for students with special interests to focus in depth on contemporary topics in measurement. Topics to be announced, but will typically be related to applied and theoretical measurement. Recent topics include: Large Scale Assessment, Advanced Item Response Theory, and Computer Adaptive Testing.
EDMS 747 Design of Program Evaluations (3 credits). Prerequisite: EDMS626, EDMS651, and EDMS647. Or permission of instructor; and permission of EDUC-Human Development and Quantitative Methodology department. Analysis of measurement and design problems in program evaluations.
EDMS 769 Special Topics in Applied Statistics in Education (1-4 credits). Restriction: Permission of EDUC-Human Development and Quantitative Methodology department. Designed primarily for students majoring or minoring in measurement, statistics or evaluation. Recent topics include: Mixture Models, Longitudinal Data Analysis, Advanced Structural Equation Modeling, and Nonparametric Structural Equation Modeling.
EDMS 771 Multivariate Data Analysis (3 credits). Prerequisite: EDMS651. Principal components, canonical correlation, discriminant functions, multivariate analysis of variance/covariance and other multivariate techniques.
EDMS 779 Seminar in Applied Statistics: Mathematical Foundations and Simulation Techniques (3 credits). Restriction: Permission of EDUC-Human Development and Quantitative Methodology department. And must be in Measurement, Statistics and Evaluation (Master's) program; or must be in Measurement, Statistics and Evaluation (Doctoral) program. One part of the course will be dedicated to learning about simulation design, implementation, execution, and dissemination of results. For this material, SAS and R will be the primary computing platforms for both course delivery and student work. The second part of the course will include an introduction to statistical computing. This course material will be delivered in R and will involve using and writing code in R and/or SAS.
EDMS 787 Bayesian Inference and Analysis (3 credits). Prerequisite: EDMS651. Credit only granted for: EDMS769B or EDMS787. Formerly: EDMS769B. Models and model fitting methods commonly used in Bayesian Inference, such as Markov Chain Monte Carlo methods (e.g., Gibbs, Metropolis Sampling), with applications within and beyond the social and behavioral sciences. Analytical and philosophical differences between Frequentist and Bayesian statistics will also be highlighted.
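The Metropolis sampling this description mentions can be illustrated in a few lines. The sketch below is not course material; the target distribution and tuning values are arbitrary choices for illustration. It draws from a standard normal distribution with a random-walk Metropolis sampler:

```python
import math
import random

def metropolis(log_target, n_samples, start=0.0, step=1.0, seed=0):
    """Random-walk Metropolis sampler: propose a Gaussian step and
    accept with probability min(1, target(proposal) / target(current))."""
    rng = random.Random(seed)
    x = start
    samples = []
    for _ in range(n_samples):
        proposal = x + rng.gauss(0.0, step)
        # Compare on the log scale to avoid underflow
        if math.log(rng.random()) < log_target(proposal) - log_target(x):
            x = proposal  # accept; otherwise keep the current state
        samples.append(x)
    return samples

# Target: standard normal, log density up to an additive constant
draws = metropolis(lambda x: -0.5 * x * x, n_samples=20000)
mean = sum(draws) / len(draws)  # sample mean should be near 0
```

Because only a log-density up to a constant is needed, the same loop applies to posteriors whose normalizing constants are intractable, which is the point of MCMC in Bayesian inference.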
EDMS 798 Special Problems in Education (1-6 credits). Restriction: Must be in Measurement, Statistics and Evaluation (Master's) program; or must be in Measurement, Statistics and Evaluation (Doctoral) program. Master's, EDMS majors, or doctoral candidates who desire to pursue special research problems under the direction of their advisors may register for credit under this number.
EDMS 799 Master's Thesis Research (1-6 credits). Restriction: Must be in a major within EDUC-Human Development and Quantitative Methodology department. Registration required to the extent of 6 credits.
EDMS 879 Doctoral Seminar (1-3 credits). Restriction: Permission of EDUC-Human Development and Quantitative Methodology department. Seminar that supports analysis of doctoral projects and theses, and of other on-going research projects.
EDMS 889 Internship in Measurement and Statistics (3-12 credits). Prerequisite: Open only to students advanced to candidacy for doctoral degree. Restriction: Permission of EDUC-Human Development and Quantitative Methodology department. Provides internship experiences at a professional level of competence in a particular role with appropriate supervision. Credit not to be granted for experience accrued prior to registration.
EDMS 898 Pre-Candidacy Research (1-8 credits)
EDMS 899 Doctoral Dissertation Research (1-8 credits). Registration required to the extent of 12 credits.
EDMS Psychometric Computation and Simulation Lab
The EDMS Psychometric Computation and Simulation (PCS) lab provides EDMS students with state-of-the-art software resources for conducting cutting-edge simulation and computational research, and provides students and faculty with a means to conduct highly specialized training sessions for students and researchers from across our campus and from other institutions who are seeking high-level methodological training.
Maryland Assessment Research Center (MARC)
The Maryland Assessment Research Center (MARC) provides support to the range of assessment activities in the State, the region and nation by conducting basic and applied research to enhance the quality of assessment practice and knowledge. MARC is a project of the Measurement, Statistics, and Evaluation (EDMS) program in the Department of Human Development and Quantitative Methodology in the College of Education at the University of Maryland.
To accomplish its purposes, MARC houses expertise in assessment design, development, implementation, analysis, reporting, and policy issues, as well as the technical aspects of the quantitative theories that form the foundations of measurement. The nationally recognized EDMS program in the Department of Human Development and Quantitative Methodology, College of Education, on the College Park campus of the University of Maryland (UMCP), augments the capabilities of the MARC staff.
Center for Integrated Latent Variable Research (CILVR)
CILVR is a center whose goal is to serve as a national and international focal point for innovative collaboration, state-of-the-art training, and scholarly dissemination as they relate to the full spectrum of latent variable statistical methods.
CILVR is housed within the Measurement, Statistics and Evaluation (EDMS) program at the University of Maryland. EDMS faculty are recognized scholars in various facets of latent variable statistical models, whether it be item response theory, latent class analysis, mixture models, or structural equation modeling. EDMS is also unique in its geographic location, situated along the East Coast of the United States near the nation's capital, centrally located for scholars from the US and Europe, and a short distance from some of the world's top latent variable scholars working in academia, government, and applied research settings. Thus, EDMS is well positioned in its composition and its location to serve as a focal point for integrated latent variable research.
October 2-3, 2017
Introduction to Meta-Analysis
Dr. Joshua Polanin
June 7-9, 2017
Introduction to Bayesian Statistical Modeling
Dr. Roy Levy, Arizona State University
March 6-8, 2017
Introduction to Multilevel Analysis Methods: Hierarchical Linear Models
Dr. Robert Croninger, University of Maryland
February 23-24, 2017
Next Generation Item and Test Development: A Practical Introduction to Automatic Item Generation
Dr. Jaehwa Choi, The George Washington University
January 3-6, 2017
Structural Equation Modeling Winter Institute: Introduction and Advanced Topics
Dr. Gregory R. Hancock, University of Maryland
November 3-4, 2016
2016 MARC Conference, "Data analytics and psychometrics: Informing assessment practice"
Dr. Robert W. Lissitz & Dr. Hong Jiao, Hosts
Click here for info.
September 14-16, 2016
Longitudinal Data Analysis: A Latent Variable Perspective workshop
Dr. Jeffrey Harring
Click here for info.
June 8-10, 2016
Introduction to Bayesian Statistical Modeling workshop
Dr. Roy Levy
Click here for info.
March 9-11, 2016
Hierarchical Linear Modeling workshop
Dr. Robert Croninger
Click here for info.
January 4-8, 2016
Structural Equation Modeling Winter Institute
Dr. Gregory R. Hancock
Click here for info.
October 29-30, 2015
2015 MARC Conference, "Test fairness in the new generation of large-scale assessment"
Dr. Robert W. Lissitz & Dr. Hong Jiao, Hosts
Click here for info.
The Measurement, Statistics and Evaluation (EDMS) program at the University of Maryland is continually seeking qualified students for its master's and doctoral level programs.
- One of the nation's best programs in quantitative methods.
- A large number of Research and Teaching Assistantships.
- Training in a field with great career potential.
- Opportunities for field work on campus, in the Washington area, and nationally.
- A capable and active group of graduate students.
- Opportunities to work on projects through grants, contracts, and research centers.
What are Measurement, Statistics, and Evaluation?
Measurement: At its most basic level, the field of Measurement is concerned with the assignment of numbers to objects in some systematic, meaningful way. In education and the social sciences, measurement typically refers to a rational assignment of numbers so that they quantitatively describe some unobservable (i.e., latent) construct like ability, personality, attitude, satisfaction, etc. The assignment of these numbers must proceed in a carefully prescribed, reproducible fashion. In most cases, the process is based on a mathematical model that defines how the numbers should behave as the underlying construct changes. Measurement professionals develop and utilize a variety of mathematical tools to determine the reliability, validity and meaning of the numbers assigned during the measurement process. They are employed in both research and applied settings that involve psychological testing, educational testing, or the measurement of attitudes and preferences. These include large scale testing programs like the SAT and ACT.
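One widely used example of such a mathematical model is the two-parameter logistic (2PL) item response model, which specifies how the probability of a correct response should change as the underlying construct changes. A minimal sketch in Python, using hypothetical item parameters chosen purely for illustration:

```python
import math

def item_response_2pl(theta, a, b):
    """Probability that an examinee with latent ability `theta`
    answers an item correctly, given item discrimination `a` and
    item difficulty `b` (the 2PL item response theory model)."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# A hypothetical item with discrimination 1.5 and difficulty 0.0:
# higher latent ability yields a higher probability of success.
low = item_response_2pl(-1.0, a=1.5, b=0.0)
mid = item_response_2pl(0.0, a=1.5, b=0.0)   # exactly 0.5 at theta == b
high = item_response_2pl(1.0, a=1.5, b=0.0)
```

The model's key property, visible in the sketch, is monotonicity: the response probability rises as the latent construct rises, which is what licenses interpreting the assigned numbers as a measure of that construct.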
Statistics: The field of Applied Statistics is primarily concerned with the development of testable research hypotheses, the application of powerful statistical tests to determine the plausibility of a given research hypothesis, and the design of experiments to control extraneous sources of variation. Other facets of this discipline include the design of schemes to collect sample data that are representative of a given population, the description of a population using alternative characteristics of sample data, and the development of alternative models to explain relationships between observable variables. Additionally, the assessment of model fit and the estimation of model parameters also fall under the rubric of Applied Statistics. Professionals in Applied Statistics develop and utilize quantitative techniques to test hypotheses, develop models for observed data, and assess the adequacy of those models. They are employed in a variety of settings where practical decisions must be made on the basis of observed data (e.g., business, industry, government, and education). These include agencies like the American Institutes for Research, the U.S. Census Bureau, the National Center for Education Statistics, and a large number of private organizations that conduct surveys or polls.
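To make the hypothesis-testing role concrete, here is a minimal sketch in Python of Welch's two-sample t statistic, a standard test of whether two groups plausibly share a population mean. The scores and condition labels are invented for illustration:

```python
import statistics

def two_sample_t(x, y):
    """Welch's t statistic for testing whether two independent
    samples plausibly come from populations with the same mean."""
    nx, ny = len(x), len(y)
    vx, vy = statistics.variance(x), statistics.variance(y)
    se = (vx / nx + vy / ny) ** 0.5          # standard error of the mean difference
    return (statistics.mean(x) - statistics.mean(y)) / se

# Hypothetical scores from two instructional conditions:
control = [70, 74, 68, 72, 71, 69]
treatment = [75, 78, 74, 80, 76, 77]
t = two_sample_t(treatment, control)
# A large |t| casts doubt on the hypothesis of equal population means.
```

In practice such a statistic would be referred to a t distribution to obtain a p-value; the sketch shows only the core computation of comparing an observed mean difference against its sampling variability.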
Evaluation: The field of Evaluation is concerned with the application of measurement and statistical principles to objectively evaluate institutional and organizational programs. Programs are generally evaluated with regard to the way in which they are planned and implemented, and the degree to which they accomplish their mission. Examples of such evaluation efforts include the Tennessee model for evaluating school effectiveness, the extensive work on the effects of teacher training upon student success, and the evaluations of the National Head Start Program. Evaluation professionals might work for the Federal Government as contract monitors, for an industry interested in determining its level of success in marketing, sales, or production, for the State Superintendent of Schools examining the results of large expenditures of time and money, and of course, for a University interested in teaching and research in the systematic application of measurement and statistics to the determination of program value.
Measurement, Statistics and Evaluation offers programs of study leading to both the Master of Arts (M.A.) and Doctor of Philosophy (Ph.D.) degrees. M.A. students generally take introductory coursework in measurement, applied statistics, and evaluation. Ph.D. students typically complete an analogous program of study, after which they may continue with a multidisciplinary focus or concentrate primarily on a single discipline.
Most of the educational and social science research that takes place today relies on the expertise of those who develop data collection instruments (such as assessments, questionnaires, and interview protocols), plan research and evaluation studies, develop new models and methods, design sampling frameworks, and collect and analyze data. The University of Maryland programs in Measurement, Statistics and Evaluation provide students with advanced skills in these areas. The master's program gives individuals the broad range of skills necessary to serve as research associates in academic, government, and business settings. The doctoral program qualifies individuals to provide leadership in the conduct of research studies, to serve as applied statisticians or as measurement or evaluation specialists in school systems, industry, and government, and to teach quantitative courses at the university level. We are widely recognized as one of the best applied quantitative programs in the country. Our students are exceptional. Our faculty members are respected leaders within their specialties.
We can usually offer research or teaching assistantships to most of our full-time students. Assistantships are full-time appointments (20 hours per week) that carry both a stipend and a tuition waiver. During a research assistantship, students pursue both individual and collaborative research projects with a faculty member advisor. These appointments are renewable. The duties of teaching assistants range from instructional support (e.g., grading) to taking full responsibility for teaching a section of an undergraduate course (with appropriate supervision and support).
There is a tremendous demand for individuals with the quantitative skills provided by our program. Recent surveys of degree programs and employment in measurement show that there will continue to be a shortfall in the number of measurement professionals relative to the number of available employment opportunities. Similarly, the National Science Foundation predicts a strong job market for statisticians over the next ten years.
The need for applied quantitative professionals is strikingly evident in educational research. As education in the United States pushes toward greater accountability and better-informed decision making, there has been an ever-increasing demand for professionals who can interpret the abundance of collected data. Each of the nation's 16,000 school districts and each of the 50 state education departments collects large quantities of data to assess the impact of schooling in its community. There are many state and federal data-collection programs, ranging from a regular biennial survey of U.S. schools to specialized cross-sectional and longitudinal surveys. Many of these survey and testing programs result from legislative mandates, and they all rely heavily on instrumentation, research design, sampling schemes, and data analysis. Almost all educational research today depends on hard data, rigorous quantitative approaches, and individuals with research expertise. This is a field that is expanding; people recognize that quantitative methods are critical to developing educational theory and are needed to provide a sound foundation for educational progress. As summarized by the American Statistical Association, "[People with quantitative skills] concerned with educational issues readily find employment in universities, state and federal government agencies, and private research agencies. The many needs of the education sector for statistical expertise promise exciting careers to all whose interests lie in this field."
Quantitative professionals are not only in high demand, but they also have jobs that are both professionally and personally desirable. The problems that these professionals help solve often have substantial impact on the lives of many other individuals. Additionally, sources often rank the job of "Statistician" as the most attractive profession with regard to working conditions.
Field Work and Postgraduate Opportunities
Our reputation and location provide tremendous opportunities for our students to conduct special projects and be involved with real-world, often ground-breaking, applications in government, research firms, associations, and private industry. Current and recent students have conducted special projects with or have been employed by American Institutes for Research, The Substance Abuse and Mental Health Services Administration, The Census Bureau, Westat, State Departments of Education, Local Education Agencies, National Education Association, U.S. Department of Education, Educational Testing Service, Maryland Assessment Research Center, Center for Applied Linguistics, and the Center for Biologics Evaluation and Research of the Food and Drug Administration. They also have taken professorial positions at such institutions as Arizona State University, University of Nebraska, University of Georgia, University of Texas - Austin, The George Washington University, University of Houston, Southern Illinois University, University of Hawaii, and University of Maryland Baltimore County.
We are committed to maintaining high standards for admission to both our master's and doctoral programs, and we have been able to attract top students from across the United States and the world. Our students come from undergraduate institutions, government, professional associations, consulting firms, research units of companies, and the public schools. Our doctoral students are among the best on any campus. The real-world experiences, skills, and aptitude of our students help make our program intellectually rigorous while providing exceptional peer-to-peer support. Approximately one-half of our students attend full-time, and there is a very active departmental student organization.
Frequently Asked Questions
How many students are in the program? We currently have approximately 45 students enrolled in our M.A. and Ph.D. programs.
Do I need to have experience in education to be accepted? No. Our students come from a variety of disciplines both within and outside of education. The dominant characteristic of successful candidates is that they are very interested in the application of quantitative techniques to solve practical problems. Many of these problems arise in educational settings, but others emerge in the social sciences, government, business, and industry.
I have an undergraduate degree (B.A./B.S.) - may I enter directly into the Ph.D. program? This is a possibility, but very rare. Typically, Ph.D. applicants first complete a relevant master's program. If one completes the EDMS master's program first, then one may apply for the Ph.D. program subsequently. Note that all courses taken as part of an EDMS master's degree apply toward the Ph.D., and the doctoral preliminary examination is waived for those who successfully complete the EDMS master's degree.
Do I need to have a background in mathematical statistics to enter your program? No, but strong quantitative skills and interests are necessary. Undergraduate courses in calculus and linear algebra provide necessary skills for our program.
Do you have many foreign students in your programs? Yes. Approximately 50% of our students are from countries other than the United States. For example, we have strong contingents of students from China and Korea.
What are the admissions criteria? The EDMS Admissions Committee meets regularly during the academic year. Decisions are based on several sources of information including scores on the Graduate Record Examination (GRE), undergraduate transcripts, letters of recommendation, and a statement of interest.
What are the deadlines for admission? Please see the Graduate School web site for current information about admissions for applicants including those holding foreign visas. Our selection committee meets throughout the academic year. However, note that the Graduate School clearance procedures for foreign students can take several months and we do not see your application until these clearances are complete. All applicants should keep track of their applications by visiting the Graduate School web site rather than by e-mailing EDMS.
How much is tuition? Please see the Bursar's Office web site for current information about student tuition. Note that students with full-time teaching or research assistantships, or fellowships, within the Department receive a tuition waiver in the amount of 10 credit hours per semester.
What types of financial aid are available? In addition to Department teaching and research assistantships, the University also offers financial assistance in the form of graduate fellowships, tuition scholarships, support grants, and a variety of need-based financial aid programs. However, since financial aid is limited, applicants who need financial aid should apply as early as possible.
When are classes held? Most classes are held once per week during the evening hours (e.g., 4:15-7:00 PM) so that part-time students with employment/family commitments can attend along with full-time students.
How long does it take a student to finish the program? The degree programs are structured so that a full-time student could finish requirements for the M.A. degree in approximately 2 years whereas approximately 4 years would be required to finish the Ph.D. requirements.
How well does our program accommodate part-time students? Most courses in the department are scheduled during late afternoon or evening hours to enable part-time students to attend classes. Faculty are generally available for consultation during evening hours as well. This provides part-time students with opportunities for both traditional instruction and individual study/research mentored by department faculty. However, some required courses for Ph.D. students require attendance during daytime hours.
More Information? Contact Dr. Gregory R. Hancock, EDMS Program Director
The University of Maryland is an equal opportunity institution with respect to both education and employment. The university's policies, programs, and activities are in conformance with state and federal laws and regulations on non-discrimination regarding race, color, religion, age, national origin, political affiliation, gender, sexual orientation, or disability.
The College of Education has long provided instruction in quantitative research methods. Prior to 1964, there was one full-time professor in the area serving a College faculty of approximately 50 members. Although there were recognized areas of specialization, there was, at that time, no departmental structure within the College. Rather, the College operated as a single administrative unit headed by a Dean. During the next ten years there was rapid growth in the size of the College, which reached about 200 faculty by the mid-1970s. In this period, several new departments were formed, including EDMS around 1972 (we’re a little fuzzy on the exact year). Throughout its history as a recognized area within the College, as a department, and now as a program within the HDQM Department, EDMS has engaged in the dual roles of professional training and service to the College.
In its first role, EDMS has provided training at the master's and doctoral levels for students planning to pursue careers in quantitative areas related to applied statistics, measurement, and evaluation. An important impetus for the development of the major program was a relatively large fellowship program funded in 1966 by Title IV of the Elementary and Secondary Education Act (ESEA). This grant provided support for 12 doctoral students per year, with about half of the positions going to EDMS majors and the remainder to minors with majors in other departments within the College. Graduates from the original ESEA fellowship program pursued a variety of important career paths, including professorships in higher education, research positions with Maryland boards of education, and research posts in the federal government. In fact, one very early graduate went on to become superintendent of the Baltimore City school system. The major program in EDMS had, and still has, a total of about 40-50 full- and part-time students. The number of full-time faculty in EDMS has fluctuated from six to eight, with a budgeted complement of seven at present.
In its second role, EDMS has provided service courses in applied statistics, measurement, and evaluation for graduate students majoring in other departments and programs in the College. In addition, faculty members have always been in high demand as members of doctoral research committees and as consultants to various externally funded projects within the College. This latter activity dates to the very earliest days of EDMS when one current faculty member was partially funded by a grant in the area of pupil personnel services and counselor education. The College has always required some course work provided by EDMS, with courses now in high demand across the campus. The professional training and service roles of EDMS serve complementary purposes. Virtually all full-time graduate students in EDMS are on some form of financial support and many of these students play key roles in service courses. In particular, several EDMS students each semester are employed as teaching assistants (tutors) for students from other units who are enrolled in courses such as EDMS 451, EDMS 645, EDMS 646, and EDMS 651. Also, advanced graduate students serve as primary instructors for the undergraduate EDMS 451 course. In effect, a strong cadre of EDMS graduate students is central to the service teaching role of the department.
Historically, EDMS had not vigorously sought external funding for its own research efforts. While, as noted, faculty members have participated in projects in other units, the procurement of major funding was rarely pursued. However, within the last 10-15 years EDMS has, with great success, focused major effort on the procurement of external funding through both grants and contracts dedicated to efforts specifically in measurement, statistics, and evaluation.
Sweet, T. M., & Zheng, Q. (in press). Incorporating network-level covariates into hierarchical mixed membership stochastic blockmodels. Social Networks.
Spillane, J., Shirrell, M., & Sweet, T. M. (2017). Social geography and work-related intra-organizational ties: Exploring physical proximity and social interactions in the schoolhouse. Sociology of Education, 90, 149-171.
Sweet, T. M., & Zheng, Q. (2017). A model-based measure for subgroup integration in social networks. Social Networks, 48, 169-180.
Reeve, B. B., Thissen, D., DeWalt, D. A., Huang, I.-C., Liu, Y., Magnus, B., Quinn, H., Gross, H. E., Kisala, P. A., Ni, P., Haley, S., Mulcahey, M., Charlifue, S., Hanks, R. A., Slavin, M., Jette, A., & Tulsky, D. S. (2016). Linkage between the PROMIS pediatric and adult emotional distress measures. Quality of Life Research, 25, 823-833.
Liu, Y., & Yang, J. S. (in press). Bootstrap-calibrated interval estimates for latent variable scores in item response theory. Psychometrika.
Liu, Y., & Yang, J. S. (in press). Interval estimation of scale scores in item response theory. Journal of Educational and Behavioral Statistics.
Depaoli, S., & Liu, Y. (in press). Review: Bayesian Psychometric Modeling. Psychometrika.
Chalmers, R. P., Pek, J., & Liu, Y. (2017). Profile-likelihood confidence intervals in item response theory models. Multivariate Behavioral Research.
Wang, X., Liu, Y., & Hambleton, R. K. (2017). Detecting item preknowledge using a predictive checking method. Applied Psychological Measurement.
Liu, Y., & Hannig, J. (2017). Generalized fiducial inference for logistic graded response models. Psychometrika.
Liu, Y., & Hannig, J. (2016). Generalized fiducial inference for binary logistic item response models. Psychometrika, 81, 290-324.
Liu, Y., Magnus, B. E., & Thissen, D. (2016). Modeling and testing differential item functioning in unidimensional binary item response models with a single continuous covariate: A functional data analysis approach. Psychometrika, 81, 371-398.
Magnus, B. E., Liu, Y., He, J., Quinn, H., Thissen, D., Gross, H. E., DeWalt, D. A., & Reeve, B. B. (2016). Mode effects between computer self-administration and telephone interviewer-administration of the PROMIS pediatrics measures, self-and proxy report. Quality of Life Research, 25, 1655-1665.
Amaya, A., & Harring, J. R. (in press). Assessing the effect of social integration on unit nonresponse in household surveys. Journal of Survey Statistics and Methodology.
Harring, J. R., McNeish, D., & Hancock, G. R. (in press). Using phantom variables in structural equation modeling to assess model sensitivity to external misspecification. Psychological Methods.
Harring, J. R., & Johnson, T. (in press). Two-way analysis of variance. In B. Frey (Ed.), The SAGE encyclopedia of educational research, measurement and evaluation. Thousand Oaks, CA: SAGE Publications.
Panlilio, C., Harring, J. R., & Jones Harden, B. (in press). Fear regulation patterns of young children in foster care: An exploratory process-centered approach. Child Abuse & Neglect: Special Issue on Measurement of Child Maltreatment.
Park, J., & Harring, J. R. (in press). Exploring symptom clusters in people with heart failure. Journal of Advanced Nursing.
McNeish, D., & Harring, J. R. (2017). Class enumeration in growth mixture models when assumptions are violated: A synthesis and simulation. Journal of Classification. DOI: 10.1007/s00357
Leech, K., Wei, R., Harring, J. R., & Rowe, M. L. (in press). A brief parent-focused intervention to improve preschooler's conversational skills and school readiness. Journal of Developmental Psychology.
Stapleton, L. M., Yang, J. S., & Hancock, G. R. (2016). Construct meaning in multilevel settings. Journal of Educational and Behavioral Statistics, 41, 481-520.
Hancock, G. R. (in press). Convergence. In B. Frey (Ed.), The SAGE encyclopedia of educational research, measurement and evaluation. Thousand Oaks, CA: SAGE Publications.
McNeish, D., & Hancock, G. R. (in press). The effect of measurement quality on targeted structural model fit indices: A comment on Lance, Beck, Fan, and Carter (2016). Psychological Methods.
Hodis, F. A., & Hancock, G. R. (Eds.). (2016). Special issue: Advances in quantitative methods to further research in education and educational psychology. Educational Psychologist, 51(3-4).
Myers, N. D., Celimli, S., Martin, J. J., & Hancock, G. R. (2016). Sample size determination and power estimation in structural equation modeling. In N. Ntoumanis & N. D. Myers (Eds.), An introduction to intermediate and advanced statistical analyses for sport and exercise scientists. Chichester, UK: Wiley.
Liu, M., Harbaugh, A. G., Harring, J. R., & Hancock, G. R. (in press). The effect of extreme response and non-extreme response styles on testing measurement invariance. Frontiers in Psychology (Quantitative Psychology and Measurement section).
Kang, Y., & Hancock, G. R. (in press). The effect of scale referent on tests of mean structure parameters. Journal of Experimental Education.
Kang, Y., McNeish, D. M., & Hancock, G. R. (2016). The role of measurement quality on practical guidelines for assessing measurement and structural invariance. Educational and Psychological Measurement, 76, 533-561.
McNeish, D., An, J., & Hancock, G. R. (in press). Illustrating the problematic relation between measurement quality and fit index cut-offs. Journal of Personality Assessment.
Gartstein, M. A., Hancock, G. R., & Iverson, S. L. (in press). Positive affectivity and fear trajectories in infancy: contributions of mother-child interaction factors. Child Development.
Freimuth, V., Jamison, A., Hancock, G. R., Musa, D., Hilyard, K., & Quinn, S. (in press). The role of risk perception in flu vaccine behavior among African American and White Adults in the US. Risk Analysis.
Quinn, S., Jamison, A., Freimuth, V., An, J., Hancock, G. R., & Musa, D. (in press). Exploring racial influences on flu vaccine attitudes and behavior: Results of a national survey of African American and white adults. Vaccine.
Hancock, G. R., & McNeish, D. M. (2017). More powerful tests of simple interaction contrasts in the two-way factorial design. Journal of Experimental Education, 85, 24-35.
Rhemtulla, M., & Hancock, G. R. (2016). Planned missing data designs in educational psychology research. Educational Psychologist, 51, 305-316.
Holten, A-L., Hancock, G. R., Gemzøe, E. G., Persson, R., Hansen, Å. M., & Hogh, A. (2016). The longitudinal effects of organizational change on experienced and enacted bullying behaviour. Journal of Change Management, 16, 1-23.
Holten, A-L., Hancock, G. R., Persson, R., Hansen, Å. M., & Høgh, A. (2016). Knowledge hoarding: Antecedent or consequent of negative acts? The mediating role of trust and justice. Journal of Knowledge Management, 20, 215-229.
Hodis, F. A., & Hancock, G. R. (2016). Introduction to the Special Issue: Advances in quantitative methods to further research in education and educational psychology. Educational Psychologist, 51, 301-304.
Bright, C. L., Sacco, P., Kolivoski, K. M., Stapleton, L. M., Jun, H.-J., & Morris-Compton, D. (2017). Gender differences in patterns of substance use and delinquency: A latent transition analysis. Journal of Child & Adolescent Substance Abuse, 26, 162-173.
Stapleton, L. M., McNeish, D. M., & Yang, J.-S. (2016). Multi-level and single-level models for measured and latent variables when data are clustered. Educational Psychologist, 51, 317-330.
McNeish, D. M., & Stapleton, L. M. (2016). Modeling clustered data with very few clusters. Multivariate Behavioral Research, 51, 495-518.
McNeish, D. M., Stapleton, L. M., & Silverman, R. (2016). On the unnecessary ubiquity of hierarchical linear modeling. Psychological Methods.
Stapleton, L. M., & Kang, Y. (2016). Design effects of multilevel estimates from national probability samples. Sociological Methods & Research.
McNeish, D., & Stapleton, L. M. (2016). The effect of small sample size on two-level model estimates: A review and illustration. Educational Psychology Review, 28, 295-314.
Rowe, M. L., Denmark, N., Jones Harden, B. P., & Stapleton, L. M. (2016). The role of parent education and parenting knowledge in children’s language and literacy skills among White, Black, and Hispanic American families. Infant and Child Development, 25, 198-220.
Jiao, H., Lissitz, R. W., & Zhan, P. (2017). Calibrating innovative items embedded in multiple contexts. In H. Jiao & R. W. Lissitz (Eds.), Technology-enhanced innovative assessment: Development, modeling, scoring from an interdisciplinary perspective. Charlotte, NC: Information Age Publishing.
Jiao, H., & Li, C. (in press). Progress in International Reading Literacy Study (PIRLS) data. In The SAGE Encyclopedia of Educational Research, Measurement, and Evaluation. Thousand Oaks, CA: Sage.
Jiao, H., & Liao, D. (in press). Testlet response theory. In The SAGE Encyclopedia of Educational Research, Measurement, and Evaluation. Thousand Oaks, CA: Sage.
Zhan, P., Jiao, H., & Liao, D. (2017). Cognitive diagnosis modeling incorporating item response times. British Journal of Mathematical and Statistical Psychology.
Luo, Y., & Jiao, H. (2017). Using the Stan program for Bayesian item response theory. Educational and Psychological Measurement.
Li, T., Xie, C., & Jiao, H. (2016). Assessing fit of alternative unidimensional polytomous item response models using posterior predictive model checking. Psychological Methods, 22, 397-408.