A Comprehensive Framework for Assessing Scientific Research
Effectiveness Among Academic and Research Staff
Svitlana M. Ivanova 1,a, Oleg M. Spirin 1,b, Oleksandr M. Shymon 1,c,
Tetiana A. Vakaliuk 2,1,3,4,d, Iryna S. Mintii 1,3,5,2,4,e and Serhiy O. Semerikov 3,1,2,6,4,f
1 Institute for Digitalisation of Education of the NAES of Ukraine, 9 M. Berlynskoho Str., Kyiv, 04060, Ukraine
2 Zhytomyr Polytechnic State University, 103 Chudnivsyka Str., Zhytomyr, 10005, Ukraine
3 Kryvyi Rih State Pedagogical University, 54 Gagarin Ave., Kryvyi Rih, 50086, Ukraine
4 Academy of Cognitive and Natural Sciences, 54 Gagarin Ave., Kryvyi Rih, 50086, Ukraine
5 Lviv Polytechnic National University, 12 Stepana Bandery Str., Lviv, 79000, Ukraine
6 Kryvyi Rih National University, 11 Vitalii Matusevych Str., Kryvyi Rih, 50027, Ukraine
a https://orcid.org/0000-0002-3613-9202
b https://orcid.org/0000-0002-9594-6602
c https://orcid.org/0000-0001-7009-2682
d https://orcid.org/0000-0001-6825-4697
e https://orcid.org/0000-0003-3586-4311
f https://orcid.org/0000-0003-0789-0272
{iv69svetlana, oleg.spirin, o.m.shymon}@gmail.com, {tetianavakaliuk, mintii, semerikov}@acnsci.org
Keywords: Research, Assessment Criteria, Academic Staff, Research Staff, Higher Education Institutions, Effectiveness, Indicators, Criteria.
Abstract:
This paper addresses the crucial task of devising comprehensive criteria and indicators for evaluating the
effectiveness of pedagogical research conducted by academic and research staff in higher education institu-
tions (HEIs). Four major assessment criteria are identified: publication and dissemination, utilization, impact
on the academic community, and representation-scientific. Each criterion is further broken down into spe-
cific indicators, including involvement in project competitions, scientific publications in reputable journals
and conference proceedings, indices and citations in various databases, altmetric indicators such as electronic
repositories and social media engagement, expert involvement in academic and research activities, and attain-
ment of academic titles and honors. The classification of these criteria provides a systematic framework for
assessing the multifaceted aspects of pedagogical research effectiveness. Further research prospects involve
assigning weight coefficients to these criteria and developing a methodology that integrates digital technolo-
gies to streamline the assessment process.
1 INTRODUCTION
The Higher Education Development Strategy in
Ukraine for 2021-2031 identifies “low levels of motivation, including compensation for the work of teachers and university staff” as one of the weaknesses of higher education (Strategy, 2020). The need for
developing a “national system for rating the activi-
ties of HEIs” is emphasized. One of the most com-
mon approaches underlying the assessment of the
performance of academic and research staff (ARS)
is based on utilizing indicators obtained from bib-
liographic databases such as Scopus, Web of Sci-
ence, and Google Scholar. This approach is driven
by the clear interdependence between data from these
databases, institutional positions in domestic and in-
ternational rankings (Times Higher Education World
University Rankings, QS World University Rankings,
Transparent Ranking, Ranking Web, Webometrics,
Top-200 Ukraine based on Scopus bibliometric indi-
cators, Consolidated Ranking of Ukrainian HEIs), as
well as institutions’ funding.
Morze et al. (2022) have developed a structural-
functional model of a ranking system to analyze the
research activities of university lecturers, consider-
ing research and digital competencies. This model
is built on key indicators for research effectiveness,
including citation indicators from three major bib-
liographic databases: Scopus, Web of Science, and
Google Scholar.
However, academic and research staff engage
in a variety of activities beyond research publications.

Ivanova, S., Spirin, O., Shymon, O., Vakaliuk, T., Mintii, I. and Semerikov, S.
A Comprehensive Framework for Assessing Scientific Research Effectiveness Among Academic and Research Staff.
DOI: 10.5220/0012648300003737
Paper published under CC license (CC BY-NC-ND 4.0)
In Proceedings of the 4th International Conference on History, Theory and Methodology of Learning (ICHTML 2023), pages 156-162
ISBN: 978-989-758-579-1; ISSN: 2976-0836
Proceedings Copyright © 2024 by SCITEPRESS – Science and Technology Publications, Lda.

A more comprehensive approach to evaluating ARS performance has been proposed at Kryvyi
Rih State Pedagogical University. The ARS rank-
ing is constructed by considering data such as: ar-
ticles indexed in Scopus, Web of Science, foreign
journals, and professional journals indexed by Index
Copernicus; domestic/foreign monographs (single-
authored/collaborative); research projects (domes-
tic/foreign funded by the state budget of Ukraine
or grants from foreign entities); winners of the All-
Ukrainian competition of student research papers/All-
Ukrainian Olympiads; international competitions and
Olympiads (participation/winners); mobility (EU in-
ternational programs, exchanges, advanced training
courses, internships); participation in conferences and
events abroad (with presentations); research and cul-
tural projects without funding (in Ukraine/abroad);
patents, technology implementation, authorship cer-
tificates; state budget-funded research topics; PhD
thesis defenses; academic title attainment; train-
ing higher-qualified personnel (PhD defenses); in-
ternational contacts, cooperation, signed cooperation
agreements; dual degrees; membership in editorial
boards of journals indexed by Scopus or Web of
Science; professional publications indexed in biblio-
graphic databases; academic schools; research cen-
ters, laboratories, etc. (Order, 2021; Vakaliuk et al.,
2022).
2 LITERATURE REVIEW
Let’s begin by examining studies dedicated to the as-
sessment of the performance of ARS based on pub-
lication indicators. For instance, Moral-Muñoz et al.
(2020) emphasize the significance of scientometrics
as a crucial tool for evaluating and analyzing the out-
comes not only of individual researchers but also of
collaborations between institutions. They point out
its role in understanding the influence of state fund-
ing on research outcomes. Among the frequently used
criteria, the quantity of papers and the h-index (de-
rived from various bibliographic databases like Sco-
pus, Web of Science, Google Scholar) remain promi-
nent. Masic and Begic (2016) explored quantitative
indices (indicators) of research success and identi-
fied four indices: the number of papers, journal im-
pact factor, authorship order and quantity, and cita-
tion count. Bykov et al. (2021) and Vakaliuk et al.
(2021b) analyze the correlation between institutional
rankings and metrics from bibliographic databases as
well as the development of individual ARS rankings
using such data.
In recent times, as a supplement and/or alternative
to bibliometric data, alternative metrics, or “altmetrics”,
have gained attention for evaluating ARS per-
formance. Altmetrics are based on measuring the im-
pact and popularity of research and researchers us-
ing data from various social, professional, and online
platforms. Altmetrics serve as a complement or alter-
native to traditional bibliometric metrics such as cita-
tions in scholarly journals.
Altmetric indicators encompass a wide range of
data that can be used to gauge the impact of research:
1. Views: the number of views on scientific articles,
presentations, or other scientific materials. This
indicator reflects general interest in the research
and its accessibility.
2. Discussions: the quantity of comments, discus-
sions, or debates related to a scientific article or
other research materials. This indicator represents
the level of activity and interaction within the sci-
entific community concerning the research.
3. Saves: the number of times research has been
saved or added to users’ “favorites” on a given
platform. This indicator reflects interest in saving and later using the research.
4. Citations: the number of references to the re-
search in scholarly articles, books, or other aca-
demic sources. This indicator is considered a fundamental measure of scientific impact.
5. Recommendations: the number of recommenda-
tions (likes) that research has received on social
media or other platforms. This reflects the sat-
isfaction or endorsement of the research by the
community.
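As an illustration only (the paper prescribes no particular tooling, and all names here are our own), the five altmetric indicator types above could be collected per publication and combined into a single count:

```python
from dataclasses import dataclass

@dataclass
class AltmetricRecord:
    """Per-publication counts for the five altmetric indicator types."""
    views: int = 0
    discussions: int = 0
    saves: int = 0
    citations: int = 0
    recommendations: int = 0

    def total(self) -> int:
        # A plain unweighted sum; real altmetric services weight
        # sources differently, so treat this as a sketch only.
        return (self.views + self.discussions + self.saves
                + self.citations + self.recommendations)

record = AltmetricRecord(views=120, discussions=4, saves=15,
                         citations=7, recommendations=30)
print(record.total())  # → 176
```

In practice, each count would be harvested from a platform's data export rather than entered by hand.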
Altmetric indicators can be obtained from various
sources, including academic social networks such as
ResearchGate and Twitter, as well as specialized plat-
forms that collect data on research articles and their
impact, such as Altmetrics Explorer. For instance,
Singh et al. (2022) examined altmetric data from the
ResearchGate network, a popular professional net-
work for researchers. Similar to Google Scholar, Re-
searchGate indexing involves an automated scanning
algorithm that provides bibliographic data, citations,
and other information about research articles from
various sources. Wiechetek and Pastuszak (2022) also
analyzed the use of ResearchGate metrics and com-
pared them to metrics from the Academic Ranking
of World Universities. Although not directly address-
ing the participation of ARS in editorial boards, these
studies highlight the importance of leveraging social
networks for promoting research achievements and
enhancing visibility within the academic community.
Cao et al. (2022) suggested using Twitter as a source
of altmetric data. Shirazi and Valinejadi (2021) com-
pared altmetric indicators from the Altmetrics Ex-
plorer system with citation quality metrics from Clar-
ivate Analytics, Scopus, and Medline. Based on their
findings, they recommended that journal editors en-
sure their presence on social networks.
Integrating altmetric indicators can help re-
searchers, academic journals, and institutions gain a
more comprehensive understanding of the impact of
their research, demonstrating their visibility and pop-
ularity within the scientific community. Given the
widespread availability and use of social media, alt-
metrics increasingly serve as a vital tool for measur-
ing research impact and communication.
Other indicators mentioned in (Order, 2021) have
not been as extensively investigated. For instance, as-
sessing ARS performance based on project activities
and the preparation of winners of student research
paper competitions and All-Ukrainian Olympiads,
which are also considered in constructing domestic
rankings (Top 200 Ukraine, Consolidated Ranking
of Ukrainian HEIs), is discussed in (Vakaliuk et al.,
2022).
Currently, few works directly investigate the par-
ticipation of ARS in editorial boards and roles as re-
viewers, experts, or other functional positions in sci-
entific journals. However, some studies touch on this
topic and provide partial recommendations. Salinas
et al. (2020) tackled reviewer selection issues and
introduced the ReviewerNet system, an online inter-
active visualization tool designed to enhance the re-
viewer selection process in the academic sphere. Al-
though not directly focused on ARS involvement in
editorial boards, it could serve as a valuable instru-
ment for improving reviewer selection and evaluation
processes. Yu et al. (2021) examined the link between
organizational support and job burnout among aca-
demic journal editors, providing insights into factors
impacting effectiveness and satisfaction among edi-
tors in their roles. Additionally, Xu et al. (2021) iden-
tified challenges faced by academic journal editors
and their underlying reasons. This information can be
valuable for understanding the context in which ARS
engage in editorial boards and provide a contextual
foundation for future studies on this topic.
Despite some existing research highlighting ap-
proaches to assessing ARS performance, criteria and
indicators for such evaluations remain underdevel-
oped.
3 RESULTS
The analysis of the scientific activities of ARS at
higher education institutions and research institutions
allowed us to identify the relevant criteria and indica-
tors for evaluating the performance of pedagogical re-
search. Building upon previous research experience,
each criterion includes from three to eight indicators (Spirin
and Vakaliuk, 2017):
Project-Competition Criterion: preparation for
project competitions; participation in projects;
preparation of students for participation in student
research competitions;
Scientific-Publication Criterion: publications in
journals indexed in bibliographic databases such
as Web of Science, Scopus; publications in con-
ference proceedings indexed in Web of Science,
Scopus; publications in specialized scientific jour-
nals in Ukraine; publications in international peri-
odicals and conference proceedings; publications
in Ukrainian scientific journals not included in
the list of specialized publications and publica-
tions in domestic conference materials; publica-
tion of monographs in Ukraine / international pub-
lications; publication of educational manuals or
textbooks; supervision of students publishing re-
search results in various publications;
Scientometric Criterion: indexing in Scopus /
Web of Science / Google Scholar; citations in
Scopus / Web of Science / Google Scholar;
Altmetric Criterion: electronic libraries, reposito-
ries; electronic portfolio; number of downloads;
number of views; social media dissemination;
Expert Criterion: participation as a reviewer / ex-
pert / opponent in PhD thesis; participation in var-
ious commissions, expert councils under the Min-
istry of Education and Science (including project
selection), National Academy of Pedagogical Sci-
ences of Ukraine (NAPN), The National Research
Foundation of Ukraine (NRFU); editor-in-chief /
deputy editor-in-chief / editorial board member of
a professional journal; involvement in conference
organization;
Representative-Scientific Criterion: PhD thesis
defense; academic title attainment; honorary title
attainment; awards / distinctions / prizes / schol-
arships; supervision of a graduate student who de-
fended a PhD thesis; participation in international
internships; foreign language proficiency at the
B2 level.
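The enumerated criteria lend themselves to a simple data structure, which a future digital assessment tool might start from. The encoding below is a hypothetical sketch (the `CRITERIA` name and the abbreviated indicator wording are ours, not part of the framework):

```python
# Hypothetical encoding of the six criteria and their indicators,
# with wording abbreviated from the enumeration in the text.
CRITERIA: dict[str, list[str]] = {
    "project-competition": [
        "preparation for project competitions",
        "participation in projects",
        "preparation of students for research competitions",
    ],
    "scientific-publication": [
        "journals indexed in Web of Science / Scopus",
        "proceedings indexed in Web of Science / Scopus",
        "specialized scientific journals in Ukraine",
        "international periodicals and proceedings",
        "non-listed Ukrainian journals and domestic proceedings",
        "monographs (Ukraine / international)",
        "educational manuals or textbooks",
        "supervision of student publications",
    ],
    "scientometric": [
        "indexing in Scopus", "indexing in Web of Science",
        "indexing in Google Scholar", "citations in Scopus",
        "citations in Web of Science", "citations in Google Scholar",
    ],
    "altmetric": [
        "electronic libraries, repositories", "electronic portfolio",
        "number of downloads", "number of views",
        "social media dissemination",
    ],
    "expert": [
        "reviewer / expert / opponent in PhD theses",
        "commissions and expert councils",
        "editorial roles in professional journals",
        "conference organization",
    ],
    "representative-scientific": [
        "PhD thesis defense", "academic title", "honorary title",
        "awards / distinctions / prizes / scholarships",
        "supervision of a defended PhD thesis",
        "international internships", "B2 foreign language proficiency",
    ],
}

# Each criterion groups between three and eight indicators.
for name, indicators in CRITERIA.items():
    assert 3 <= len(indicators) <= 8, name
```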
We will describe each criterion and explore all the
indicators in more detail.
Project-Competition Criterion involves evaluating the performance of pedagogical research through participation in contests and projects, including:
1. The “Preparation for project competitions” indi-
cator assesses the researcher (whether ARS or re-
search staff) based on the number of projects pre-
pared for participation in competitions. The as-
sessment period can range from 1 to 5 years. This
is related to the fact that clause 38 of the Licens-
ing Conditions for Educational Activities requires
consideration of different types of activities over
5 years, while the contract between the institution
and the employee may be signed for only 1 year
or 2 years, and so on. This clarification applies to
all indicators and criteria described in this paper.
2. The “Participation in projects” indicator accounts
for the researcher’s participation in projects as a
simple performer, principal performer, or project
leader. If the researcher participates in multi-
ple projects simultaneously, this is also consid-
ered. This indicator can be taken into account if
the researcher participates not only in ministerial
projects but also in international ones.
3. The “Preparation of students for participation in
student research competitions” indicator stipu-
lates that ARS prepare students to participate in
nationwide and international competitions of var-
ious levels, including private competitions (e.g.,
Zavtra.Ua).
Scientific-Publication Criterion encompasses
the evaluation of performance within the realm of
publication activity, which includes the following in-
dicators:
1. Publications in journals indexed in bibliographic databases such as Web of Science, Scopus – this indicator entails having a certain number of publications in the specified journals.
2. Publications in conference proceedings indexed
in bibliographic databases such as Web of Sci-
ence, Scopus – this indicator differs from the pre-
vious one in that it refers to articles published in
books or conference journals (Proceeding Jour-
nals), which are also indexed in the mentioned
databases. Such articles in the Scopus database
are referred to as proceeding papers, although
they don’t significantly differ from full-fledged ar-
ticles.
3. Publications in specialized scientific journals in
Ukraine – this indicator entails having articles
published in journals listed as specialized publica-
tions approved by the Ministry of Education and
Science of Ukraine.
4. Publications in international periodicals and con-
ference proceedings – although this indicator may
not hold considerable value, in some HEIs, it is a
mandatory clause in contracts. This indicator in-
cludes publications not covered by the first two in-
dicators. While this point may seem less valuable,
the number of publications in the international
community also contributes to a researcher’s sta-
tus, even if not in bibliographic databases like
Scopus or Web of Science, at least in Google
Scholar. Not all educational institutions and re-
search establishments have subscribed access to
the mentioned databases to explore the research
output of a specific researcher, thus making these
indicators relevant.
5. Publications in Ukrainian scientific journals not
included in the list of specialized publications and
publications in domestic conference materials –
this indicator also combines two aspects, encom-
passing publications in sources not covered by the
previous indicators.
6. Publication of monographs in Ukraine / international publications – publishing a monograph
serves as a culmination of work on a specific
topic, hence its publication is one of the indica-
tors.
7. Publication of educational manuals or textbooks – this indicator is particularly important for ARS,
as the presence of such publications is significant
both for teaching activities and for meeting licens-
ing requirements.
8. Supervision of students publishing research re-
sults in various publications – this indicator is de-
signed for ARS who guide student research work,
resulting in publications by students in various
sources.
Scientometric Criterion involves evaluating the
performance of ARS and researchers in institutions of
higher education and research establishments based
on indexing and citation in various scientometric
databases, including:
1. “Indexing in Scopus / Web of Science / Google Scholar” – these indicators involve considering the researcher’s h-index in the corresponding bibliographic database.
2. “Citations in Scopus / Web of Science / Google Scholar” – these indicators entail determining the total number of citations in the corresponding bibliographic database.
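The h-index behind the indexing indicator has a standard definition: the largest h such that the researcher has at least h papers each cited at least h times. A minimal sketch (the function name and example data are hypothetical):

```python
def h_index(citations: list[int]) -> int:
    """Return the largest h such that at least h papers
    have at least h citations each."""
    cited_desc = sorted(citations, reverse=True)
    h = 0
    for rank, count in enumerate(cited_desc, start=1):
        if count >= rank:
            h = rank  # still at least `rank` papers with >= rank citations
        else:
            break
    return h

# Five papers with these citation counts give h = 3:
# three papers have 3 or more citations, but not four papers with 4+.
print(h_index([10, 8, 5, 2, 1]))  # → 3
```

Each database (Scopus, Web of Science, Google Scholar) computes this over its own citation records, which is why the three values for one researcher usually differ.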
Altmetric Criterion involves evaluating the per-
formance of HEI’s research and academic staff (RAS)
based on other equally important indicators:
1. “Electronic Libraries, Repositories” indicator en-
tails assessing the completeness of electronic li-
braries of research establishments and HEIs with
all published works.
2. “Electronic Portfolio” indicator entails having
a well-maintained personal electronic portfolio
(Vakaliuk et al., 2021a).
3. “Number of Downloads” indicator takes into ac-
count the number of downloads of research works
from repositories and electronic libraries. This in-
dicator should be considered cumulatively for all
of the author’s publications simultaneously.
4. “Number of Views” indicator, similarly to the previous one, involves considering the total number
of views of all of the author’s publications in a
repository or electronic library.
5. “Social Media Outreach” indicator involves hav-
ing a presence on social media platforms and dis-
seminating one’s research activity through them.
This indicator is evaluated for specific social me-
dia platforms like Facebook, LinkedIn, etc.
Expert Criterion is no less important than the
previous ones, as it considers the researcher’s involve-
ment in various expert roles, including:
1. Involvement as a reviewer / expert / opponent for a PhD thesis – this indicator entails the participation of RAS or researchers in these roles during the defense of a PhD thesis.
2. Participation in various committees, expert coun-
cils under the Ministry of Education (including
project selection) – this indicator involves participating in different expert councils or Accreditation Commissions:
Expert Council for Dissertation Examination of
the Ministry of Education and Science;
Branch Expert Council as an expert of the Na-
tional Agency for Quality Assurance in Higher
Education;
Expert commissions of the Ministry of Educa-
tion and Science or the National Agency for
Quality Assurance in Higher Education;
Interbranch Expert Council on Higher Educa-
tion of the Accreditation Commission;
Accreditation Commission;
Scientific and Methodological Council;
Scientific and Methodological Commissions
(subcommissions) on higher or specialized
postgraduate education of the Ministry of Ed-
ucation and Science;
Scientific or scientific-methodical or expert
councils of state authorities and local self-
government bodies;
State Service for Quality of Education for
conducting planned (unscheduled) measures of
state supervision (control), etc.
3. Editor-in-Chief / Deputy Editor-in-Chief / Edito-
rial Board Member of a specialized journal – this
indicator involves actual participation in one of
these roles for specialized journals in Ukraine;
4. Participation in conference organization – this in-
dicator involves participating in the organization
of conferences of various levels as a program
committee member or reviewer, which enhances
the researcher’s professional level.
Representative-Scientific Criterion involves assessing the performance of HEI researchers and scientific personnel based on specific achievements:
1. “PhD Thesis Defense” involves the presence of a
defended PhD thesis (for obtaining a doctoral or
candidate of science degree) within the reporting
period.
2. “Attainment of Academic Title” involves the ac-
quisition of a diploma confirming an academic ti-
tle (again, within the period specified by the insti-
tution or educational establishment).
3. “Attainment of Honorary Title” involves the con-
ferment of an honorary title on a researcher as pro-
vided by the Ministry of Education and Science of
Ukraine.
4. “Receipt of Awards / Honors / Prizes / Schol-
arships” entails researchers receiving various
awards, prizes, etc., as stipulated by the Ministry
of Education and Science of Ukraine, the Cabinet
of Ministers of Ukraine, the Verkhovna Rada of
Ukraine, etc.
5. “Supervision of a Defended PhD thesis” pertains
to the presence of a defended PhD thesis under
the guidance of the researcher. Additionally, this
indicator can also encompass the supervision of
a doctoral candidate’s defense under the guidance
of this personnel.
6. “Participation in International Internships” in-
volves possessing a certificate of participation in
international scientific or scientific-pedagogical
internships once every 5 years.
7. “Proficiency in a Foreign Language at B2 Level”
involves passing an examination to demonstrate
proficiency in a foreign language (such as English,
Polish, etc.) and obtaining the corresponding cer-
tificate.
All the identified criteria can be tentatively classi-
fied into criteria related to publication and dissemi-
nation (altmetric, scientific publication-related), uti-
lization (scientometric, project competition-related),
and impact on the academic community (expert-
related, representational-scientific).
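Assigning weight coefficients to these criteria is left to future research, but the intended aggregation can already be sketched as a weighted sum over normalized per-criterion scores. All weights and scores below are illustrative placeholders, not values proposed by the framework:

```python
# Hypothetical weight coefficients for the six criteria; determining
# real values is named as a direction for further research.
WEIGHTS = {
    "project_competition": 0.15,
    "scientific_publication": 0.25,
    "scientometric": 0.20,
    "altmetric": 0.10,
    "expert": 0.15,
    "representative_scientific": 0.15,
}

def composite_score(scores: dict[str, float]) -> float:
    """Weighted sum of per-criterion scores, each normalized to [0, 1]."""
    return sum(WEIGHTS[name] * scores.get(name, 0.0) for name in WEIGHTS)

example = {
    "project_competition": 0.6,
    "scientific_publication": 0.8,
    "scientometric": 0.5,
    "altmetric": 0.4,
    "expert": 0.7,
    "representative_scientific": 0.3,
}
print(round(composite_score(example), 3))  # → 0.58
```

Keeping the weights summing to 1 makes the composite score itself fall in [0, 1], which simplifies comparison across staff members and reporting periods.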
4 CONCLUSIONS AND
PROSPECTS
The article substantiates the necessity of identifying
criteria and indicators for assessing the effectiveness
of pedagogical research conducted by academic and
research staff.
The following criteria and corresponding indi-
cators have been identified and described: project
competition-related (preparation for project compe-
titions; participation in projects; preparing students
for participating in student research competitions);
scientific publication-related (publications in jour-
nals indexed in Web of Science, Scopus; publi-
cations in conference proceedings indexed in Web
of Science, Scopus; publications in domestic scien-
tific journals; publications in international periodi-
cals and conference materials; publications in do-
mestic non-listed journals and conference materials;
publication of monographs in domestic/international
publications; publication of educational guides or
textbooks; supervision of students publishing re-
search outcomes in various publications); scientomet-
ric (indexing in Scopus; indexing in Web of Sci-
ence; indexing in Google Scholar; citations in Sco-
pus; citations in Web of Science; citations in Google
Scholar); altmetric (electronic libraries, repositories;
electronic portfolio; download count; view count; so-
cial media dissemination); expert-related (participa-
tion as a thesis reviewer/expert/opponent; involve-
ment in different committees, expert councils un-
der the Ministry of Education (including project
selection); chief editor/deputy chief editor/editorial
board member of a professional journal; participa-
tion in conference organization); representational-
scientific (dissertation defense; attainment of aca-
demic titles; attainment of honorary titles; receipt of
awards/honors/prizes/scholarships; supervision of a
defending doctoral candidate; participation in interna-
tional internships; proficiency in a foreign language at
B2 level).
The identified criteria can be tentatively divided
into those pertaining to publication and dissemina-
tion, utilization, and impact on the academic commu-
nity.
The prospects for further research involve determining weight coefficients for the established criteria and indicators for HEI and research institution personnel. Additionally, a methodology for employ-
ing information and digital technologies to assess the
effectiveness of pedagogical research could be devel-
oped.
REFERENCES
Bykov, V. Y., Spirin, O. M., Ivanova, S. M., Vakaliuk, T. A.,
Mintii, I. S., and Kilchenko, A. V. (2021). Sciento-
metric indicators for evaluating the effectiveness of
pedagogical research of scientific institutions and ed-
ucational institutions. Information Technologies and
Learning Tools, 86(6):289–312.
Cao, R., Geng, Y., Xu, X., and Wang, X. (2022). How
does duplicate tweeting boost social media exposure
to scholarly articles? Journal of Informetrics, 16(1).
Masic, I. and Begic, E. (2016). Evaluation of scientific jour-
nal validity, it’s articles and their authors. Studies in
Health Technology and Informatics, 226:9–14.
Moral-Muñoz, J. A., Herrera-Viedma, E., Santisteban-
Espejo, A., and Cobo, M. J. (2020). Software tools for
conducting bibliometric analysis in science: An up-to-
date review. Profesional De La Informacion, 29(1).
Morze, N. V., Buinytska, O. P., and Smirnova, V. A. (2022).
Designing a rating system based on competencies for
the analysis of the university teachers’ research activ-
ities. CTE Workshop Proceedings, 9:139–153.
Order (2021). Rozporiadzhennia 04 vid 25 travnia 2021
roku [Order 04 of May 25, 2021].
Salinas, M., Giorgi, D., Ponchio, F., and Cignoni, P. (2020).
ReviewerNet: A visualization platform for the selec-
tion of academic reviewers. Computers and Graphics,
89:77–87.
Shirazi, M. S. and Valinejadi, A. (2021). Investigating
of Association between Altmetrics Activity Indica-
tors and Citation Quality Indicators in Iranian Med-
ical Journals. International Journal of Preventive
Medicine, 12(1):156.
Singh, V. K., Srichandan, S. S., and Lathabai, H. H. (2022).
ResearchGate and Google Scholar: How much do
they differ in publications, citations and different met-
rics and why? Scientometrics, 127(3):1515–1542.
Spirin, O. M. and Vakaliuk, T. A. (2017). Criteria of
open web-operated technologies of teaching the fun-
damentals of programs of future teachers of informat-
ics. Information Technologies and Learning Tools,
60(4):275–287.
Strategy (2020). Stratehiia rozvytku vyshchoi osvity v
Ukraini na 2021-2031 roky [Strategy for the Develop-
ment of Higher Education in Ukraine for 2021-2031].
Vakaliuk, T. A., Ivanova, S. M., and Kilchenko, A. V.
(2021a). Electronic portfolio as a tool of reflecting
the results of scientific and pedagogical activities of
teachers of higher education institutions. Scientific
Bulletin of Uzhhorod University. Series: “Pedagogy.
Social Work”, 1(48):53–58.
Vakaliuk, T. A., Mintii, I. S., Hamaniuk, V. A., and Ivanova,
S. M. (2022). Recording the practice of teaching staff
in project activities and the preparation of winners in
competitions and olympiads. Scientific innovations
and advanced technologies, 4(6):22–34.
Vakaliuk, T. A., Spirin, O. M., Mintiy, I. S., Ivanova, S. M.,
and Novytska, T. L. (2021b). Scientometric indi-
cators for evaluating the effectiveness of pedagogi-
cal research of scientists and research and teaching
staff. Modern Information Technologies and Inno-
vation Methodologies of Education in Professional
Training Methodology Theory Experience Problems,
60:167–184.
Wiechetek, L. and Pastuszak, Z. (2022). Academic social
networks metrics: An effective indicator for university
performance? Scientometrics, 127(3):1381–1401.
Xu, Z., Yang, D., and Chen, B. (2021). Career difficulties
that Chinese academic journal editors face and their
causes. Journal of Scholarly Publishing, 52(4):212–
232.
Yu, X., Wu, S., Chen, W., Zheng, W., Huang, M., Yang,
L., and Zhou, S. (2021). Exploring the associa-
tions between perceived organizational support and
job burnout among chinese academic journal editors:
A moderated mediation model. International Journal
of Environmental Research and Public Health, 18(22).