response page. This page can be tailored to show individual scores, as well as total-scale and sub-scale means, to both the respondent and the questionnaire administrator.
Figure 4 shows an example of detailed feedback in the case of the QTI questionnaire (Questionnaire on Teacher Interaction; see e.g. Wubbels et al., 2006).
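The quick feedback described above essentially aggregates a respondent's item scores into sub-scale and total-scale means. The following is a minimal sketch of such an aggregation, assuming Likert-type responses keyed by item identifier; the sub-scale names, item identifiers, and response format are illustrative only and do not reflect the actual CORF implementation.

# Minimal sketch of the feedback computation; scale names and items are hypothetical.
from statistics import mean

SUBSCALES = {
    "leadership": ["q1", "q2", "q3"],   # hypothetical sub-scale -> item identifiers
    "uncertainty": ["q4", "q5", "q6"],
}

def feedback_summary(responses):
    """Return individual item scores, sub-scale means and the total-scale mean."""
    subscale_means = {
        name: mean(responses[item] for item in items if item in responses)
        for name, items in SUBSCALES.items()
    }
    return {
        "items": responses,                      # individual item scores
        "subscale_means": subscale_means,        # per sub-scale means
        "total_mean": mean(responses.values()),  # total-scale mean
    }

# Example: one respondent's answers on a 1-5 scale
print(feedback_summary({"q1": 4, "q2": 5, "q3": 3, "q4": 2, "q5": 3, "q6": 4}))

For a given respondent this yields the individual item scores together with the sub-scale and total-scale means that the feedback page could display.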
5 CONCLUSIONS
The CORF system works reasonably well. The design and technical implementation meet expectations on all key issues, for example the conceptualization of the compound object 'research project', legal issues, metadata, administering questionnaires, and sharing instruments.
A first point of improvement is making the handling of versions more transparent. The user interface could also be improved. In addition, the set of educational metadata could be developed further, for example by using ontologies developed elsewhere (Duval, 2001).
Concerning the first research question, we conclude that CORF® technically provides the facilities for data sharing and open-access publications. It also boosts research efficiency, as is well recognized by senior researchers. CORF® also clearly helps to promote the use of standardized tests among PhD students, in particular when supported by supervisors.
Concerning the second research question, both teachers and young researchers appreciate the opportunities to get in touch with each other and the low-threshold publication opportunities that CORF® provides. The quick feedback page adds clear value to the electronic questionnaires.
Concerning the third research question, we find that teachers appreciate CORF® for providing access to acknowledged instruments. Indeed, (student) teachers started using scientific instruments, in particular standard questionnaires from the library and the storyline instrument.
Overall, however, the contribution CORF® makes to strengthening educational research and supporting teacher research appears limited. The CORF system as such performs adequately, but the scale of the active community seems too small to capitalize on the various opportunities offered.
Certificates, for example, are implemented in a manageable way. Nevertheless, in a small community users can still judge the various instruments directly; certification then appears a roundabout route, and the 'burden' of reviewing does not pay off.
It is observed that most CORF users are linked to a school, a teacher-training institute, or a research institute that actively promotes the use of CORF®. An obvious first step to enlarge the CORF community would therefore be to involve more institutes.
REFERENCES
Admiraal, W. (2013). Academisch docentschap: naar
wetenschappelijk praktijkonderzoek door docenten.
(Academic teachers: practical scientific research by
teachers). Inaugural lecture, Leiden University, Leiden, The Netherlands.
Arzberger, P., Schroeder, P., Beaulieu, A., Bowker, G.,
Casey, K., Laaksonen, L., ... & Wouters, P. (2004).
Promoting access to public research data for scientific,
economic, and social development. Data Science
Journal, 3, 135-152.
Ax, J., Ponte, P., & Brouwer, N. (2008). Action research
in initial teacher education: an explorative study.
Educational Action Research, 16(1), 55-72.
Beijaard, D. (1995). Teachers' prior experiences and actual
perceptions of professional identity. Teachers and Teaching: Theory and Practice, 1(2), 281-294.
Belhajjame, K., Zhao, J., Garijo, D., Hettne, K., Palma, R.,
Corcho, Ó., ... & Goble, C. (2014). The research
object suite of ontologies: Sharing and exchanging
research data and methods on the open web.
Best Survey Software Reviews and Comparisons (2015).
http://survey-software-review.toptenreviews.com/
Birnbaum, M. H., & Reips, U. D. (2005). Behavioral
research and data collection via the Internet. In: Vu,
K. P. L., & Proctor, R. W. (Eds.) The handbook of
human factors in Web design, pp. 471-492. Boca
Raton, FL: CRC Press.
Bosnjak, M. (2012, November). 'No response is also a response'. Interpretations for and implications of
(non)response in self-administered surveys. Invited
talk given at the 6th Tivian Symposium, November 9,
2012, Cologne, Germany.
Broekkamp, H., & van Hout-Wolters, B. H. A. M. (2007).
The gap between educational research and practice: A
literature review, symposium, and questionnaire.
Educational Research and Evaluation, 13(3), 203-220.
Chatzinotas, S., & Sampson, D. (2004). eMAP: Design and Implementation of Educational Metadata Application Profiles. In Proceedings of the 4th IEEE International Conference on Advanced Learning Technologies (ICALT 2004), Joensuu, Finland.
CoreIdeas (2014).
http://www.cirtl.net/CoreIdeas/teaching_as_research.
Duval, E. (2001). Standardized Metadata for Education: A
Status Report.
http://files.eric.ed.gov/fulltext/ED466155.pdf.
Feldman, A. (2007). Validity and quality in action
research. Educational Action Research, 15(1), 21-32.
Fraser, B. J. (2012). Classroom learning environments: