
Peabody Journal of Education, Volume 90, Issue 4, 2015

Competency Research in Higher Education

Introduction

Competency Research in Higher Education: Conceptual and Methodological Challenges and Perspectives for Future Interdisciplinary Research

Olga Zlatkin-Troitschanskaia, Johannes Gutenberg University Mainz

Sigrid Blömeke, University of Oslo

Hans Anand Pant, Humboldt University of Berlin

This issue of the Peabody Journal of Education focuses on “Competency Research in Higher Education: Conceptual and Methodological Challenges and Perspectives for Future Interdisciplinary Research” and expands upon the limited research on competency in higher education. While competency research in K–12 and vocational schools has received much attention, scholars have devoted far less study to the processes of competency evaluation in higher education. Such studies are now more important than ever, as rapid globalization has created a need for higher education programs to be understood and compared internationally. Providing quality competency assessment is difficult, however: the different aspects of higher education vary greatly, so developing ways to assess them is challenging.

The authors point to the international Assessment of Higher Education Learning Outcomes (AHELO) as evidence that, despite the challenges, evaluating competency in higher education is possible. They note that additional longitudinal and multilevel modeling would strengthen the body of research. As for efforts thus far, some countries use generalizable tests to measure student achievement in a particular program, and scholars have investigated cognitive outcomes but have not fully linked them to non-cognitive outcomes.

Valid assessment of competency requires several elements: criteria for proper student cognition, a clear definition of which observations reflect student competency, and ways to make sense of the observed data. Assessments must be applicable in the observed environment, so they need to be adaptable to different circumstances.

CONTENT OF THIS ISSUE

In this issue, the authors explore current German research conducted through the Modeling and Measuring Competencies in Higher Education (KoKoHs) program, an effort to highlight the types of research in competency assessment in higher education. In addition to describing current practices, the articles offer ideas for further study and, most notably, provide a vision for the future of this burgeoning field of research. The authors state that competency tests will successfully illuminate the results of higher education and will prompt students to apply their knowledge in their fields.

The articles reflect the work of an international team of experts from dozens of higher education institutions. They draw on the efforts of the KoKoHs project, which provides a framework for assessing generic and domain-specific competencies in five fields of higher education: economics, social sciences, engineering, educational sciences, and teacher training in STEM subjects. The issue examines competency assessment in four sections focused on four areas: engineering, business and economics, teacher training, and higher education graduates.

The engineering section contains two articles: one by Neumann, Rösken-Winter, Lehmann, Duchhardt, Heinze, and Nickolaus, and the second by Taskinen, Steimel, Gräfe, Engell, and Frey.

The business and economics section also has two articles. The first is written by Bouley, Wuttke, Schnick-Vollmer, Schmitz, Berger, Fritsch, and Seifried, and the second by Brückner, Förster, Zlatkin-Troitschanskaia, Happ, Walstad, Yamaoka, and Asano.

The third section, on teacher training, has three articles. Bender, Hubwieser, Schaper, Margaritis, Berges, Ohrndorf, Magenheim, and Schubert wrote the first. The second is by Tiede, Gräfe, and Hobbs. Wäschle, Lehmann, Brauch, and Nückles offer the third article.

The fourth and final section has two articles: the first written by Groß Ophoff, Schladitz, J. Leuders, T. Leuders, and Wirtz, and the second written by Braun and Brachem.

Together, the articles show why research must examine both the domain-specific and generic abilities acquired in higher education institutions, and why valid assessments of these institutions and of the competency of the students they graduate are needed. The authors provide theoretical and methodological approaches that set the standard for such competency evaluation and prompt further work in competency assessment in higher education and related fields.


CONTENTS


Competency Research in Higher Education: Conceptual and Methodological Challenges and Perspectives for Future Interdisciplinary Research
Olga Zlatkin-Troitschanskaia, Sigrid Blömeke & Hans Anand Pant
Peabody Journal of Education: Issues of Leadership, Policy, and Organizations, Vol. 90, No. 4: pages 459-464.

Measuring Mathematical Competences of Engineering Students at the Beginning of Their Studies
Irene Neumann, Bettina Rösken-Winter, Malte Lehmann, Christoph Duchhardt, Aiso Heinze & Reinhold Nickolaus
Peabody Journal of Education: Issues of Leadership, Policy, and Organizations,
Vol. 90, No. 4: pages 465-476.

A Competency Model for Process Dynamics and Control and Its Use for Test Construction at University Level
Päivi H. Taskinen, Jochen Steimel, Linda Gräfe, Sebastian Engell & Andreas Frey
Peabody Journal of Education: Issues of Leadership, Policy, and Organizations, Vol. 90, No. 4: pages 477-490.

Professional Competence of Prospective Teachers in Business and Economics Education: Evaluation of a Competence Model Using Structural Equation Modeling
Franziska Bouley, Eveline Wuttke, Kathleen Schnick-Vollmer, Bernhard Schmitz, Stefanie Berger, Sabine Fritsch & Jürgen Seifried
Peabody Journal of Education: Issues of Leadership, Policy, and Organizations, Vol. 90, No. 4: pages 491-502.

Gender Effects in Assessment of Economic Knowledge and Understanding: Differences Among Undergraduate Business and Economics Students in Germany, Japan, and the United States
Sebastian Brückner, Manuel Förster, Olga Zlatkin-Troitschanskaia, Roland Happ, William B. Walstad, Michio Yamaoka & Tadayoshi Asano
Peabody Journal of Education: Issues of Leadership, Policy, and Organizations, Vol. 90, No. 4: pages 503-518.

Towards a Competency Model for Teaching Computer Science
Elena Bender, Peter Hubwieser, Niclas Schaper, Melanie Margaritis, Marc Berges, Laura Ohrndorf, Johannes Magenheim & Sigrid Schubert
Peabody Journal of Education: Issues of Leadership, Policy, and Organizations, Vol. 90, No. 4: pages 519-532.

Pedagogical Media Competencies of Preservice Teachers in Germany and the United States: A Comparative Analysis of Theory and Practice
Jennifer Tiede, Silke Gräfe & Renee Hobbs
Peabody Journal of Education: Issues of Leadership, Policy, and Organizations, Vol. 90, No. 4: pages 533-545.

Prompted Journal Writing Supports Preservice History Teachers in Drawing on Multiple Knowledge Domains for Designing Learning Tasks
Kristin Wäschle, Thomas Lehmann, Nicola Brauch & Matthias Nückles
Peabody Journal of Education: Issues of Leadership, Policy, and Organizations, Vol. 90, No. 4: pages 546-559.

Assessing the Development of Educational Research Literacy: The Effect of Courses on Research Methods in Studies of Educational Science
Jana Groß Ophoff, Sandra Schladitz, Juliane Leuders, Timo Leuders & Markus A. Wirtz
Peabody Journal of Education: Issues of Leadership, Policy, and Organizations, Vol. 90, No. 4: pages 560-573.

Requirements Higher Education Graduates Meet on the Labor Market
Edith M. P. Braun & Julia-Carolin Brachem
Peabody Journal of Education: Issues of Leadership, Policy, and Organizations, Vol. 90, No. 4: pages 574-595.

Corrigendum
Peabody Journal of Education: Issues of Leadership, Policy, and Organizations, Vol. 90, No. 4: page 596.
