COMPARATIVE ASSESSMENT OF PROFESSIONAL COMPETENCIES OF EDUCATIONAL STANDARDS OF INFORMATION AND MEASUREMENT TECHNOLOGIES SPECIALTY

2024; pp. 36–41

1 State Enterprise “Ukrmetrteststandard”, Ukraine
2 State Enterprise “Ukrmetrteststandard”, Ukraine

National education standards are documents that define expected educational achievements, knowledge, skills, and competencies. These standards may vary between countries and regions, but they are essential for ensuring the quality of education and the comparability of different education systems. An important element of every education standard is the definition of the professional competencies to be acquired, and such competencies should be formulated with the involvement of the main stakeholders. To evaluate the professional competencies defined in the bachelor’s and master’s education standards in information and measurement technologies, a group expert evaluation was carried out by stakeholder specialists using both the averaging of indicators and the Rasch model. The assessment results obtained by the two methods are presented and compared, and they show good agreement. The analysis based on the multivariate Rasch model showed that measurement data processed with this model allow established statistics to be calculated both for the competencies under consideration and for the involved experts. The Rasch model can therefore be a useful tool for assessing the importance of the professional competencies established for different levels of higher education in different specialties. The experts had the most doubts about the bachelor’s competencies in developing a regulatory and methodological framework for quality assurance and technical regulation and in developing a scientific and technical basis for quality management systems and certification tests, as well as about the master’s competency in managing projects and startups and evaluating their results. These competencies therefore require special attention during the next revision of the education standards in order to better balance the corresponding competency set.
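As a minimal illustration of the two evaluation approaches mentioned above, the sketch below compares simple averaging of expert ratings with a dichotomous Rasch calibration estimated by joint maximum likelihood. The ratings, the number of experts and competencies, and the dichotomization threshold are hypothetical and only convey the principle; they do not reproduce the study’s actual data or calibration procedure.

```python
# A minimal, purely illustrative sketch (hypothetical data, not the study's
# ratings or procedure): comparing simple averaging of expert ratings with a
# dichotomous Rasch calibration estimated by joint maximum likelihood (JMLE).

import numpy as np


def rasch_jmle(data, n_iter=200, tol=1e-6):
    """Estimate expert measures (theta) and competency difficulties (delta) for
    the dichotomous Rasch model P(x=1) = exp(theta - delta) / (1 + exp(theta - delta))
    by alternating Newton-Raphson updates; assumes no extreme rows or columns."""
    n_experts, n_items = data.shape
    theta = np.zeros(n_experts)   # expert leniency/severity, in logits
    delta = np.zeros(n_items)     # "difficulty" of endorsing each competency
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-(theta[:, None] - delta[None, :])))
        theta_step = (data - p).sum(axis=1) / (p * (1.0 - p)).sum(axis=1)
        delta_step = -(data - p).sum(axis=0) / (p * (1.0 - p)).sum(axis=0)
        # Limit step size for numerical stability on small data sets.
        theta += np.clip(theta_step, -1.0, 1.0)
        delta += np.clip(delta_step, -1.0, 1.0)
        delta -= delta.mean()     # anchor the scale: mean competency difficulty = 0
        if max(np.abs(theta_step).max(), np.abs(delta_step).max()) < tol:
            break
    return theta, delta


# Hypothetical ratings (6 experts x 4 competencies) on a 5-point importance scale.
ratings = np.array([
    [5, 4, 3, 2],
    [4, 3, 4, 2],
    [3, 5, 4, 4],
    [5, 4, 2, 4],
    [4, 3, 5, 3],
    [2, 4, 3, 5],
])

# Method 1: averaging of indicators (mean rating per competency).
mean_scores = ratings.mean(axis=0)

# Method 2: Rasch calibration of dichotomized ratings (>= 4 treated as "important").
theta, delta = rasch_jmle((ratings >= 4).astype(float))

for i, (m, d) in enumerate(zip(mean_scores, delta), start=1):
    print(f"Competency {i}: mean rating = {m:.2f}, Rasch difficulty = {d:+.2f} logits")
```

In the study itself the polytomous expert ratings were processed with the multivariate Rasch model, so this dichotomous sketch only conveys the logic of comparing the two methods, not the actual calibration.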
