Teacher Experiences of Learning Computing using a 21st Century Model of Computer Science Continuing Professional Development
Lorraine Fisher, Jake Rowan Byrne and Brendan Tangney
Centre for Research in IT in Education (CRITE), School of Education and School of Computer Science & Statistics,
Trinity College Dublin, The University of Dublin, Dublin, Ireland
Keywords: Learning Computer Science, Teacher Professional Development, Social Constructivism, Evaluation.
Abstract: Computer Science (CS) is a subject which is perceived as difficult to learn and to teach. Building on previous
work (Fisher et al., 2015), which explored post-primary school teacher reactions to a social constructivist
Continuing Professional Development (CPD) Programme in CS, this paper explores the same teachers’
experiences of learning CS during the workshops. The CS CPD workshops were delivered using the Bridge21
model of 21st century teaching and learning. This paper examines the extent to which the Bridge21 activity
model proved effective in helping teachers learn computing knowledge and skills and explores teacher
attitudes towards applying their new learning in the classroom. Nine workshops took place over the 2013/2014
academic year, resulting in 45 teaching hours and 110 teacher engagements. An exploratory case study
approach informed data collection with comparative coding used to analyse results. Analysis indicates that
peer-collaboration played an important role in helping teachers develop computing knowledge and skills and
that teachers intend to use the Bridge21 model to teach computing in their own classrooms.
1 INTRODUCTION
This study is situated within the context of 21st century education, where teachers are encouraged to use student-centred, technology-mediated teaching and learning strategies to help students develop 21st century skills (Walser, 2008). Technology-mediated learning experiences create rich contexts that teachers can use with students to develop 21st century skills such as problem solving, teamwork and critical thinking (English and Sriraman, 2010). Problem solving activities used in a computing context enable students to put digital and critical thinking skills into practice.
Autonomous learning involves developing the confidence to apply learning strategies to solve complex problems (Boud, 1988). Teachers can empower students by facilitating technology-mediated lessons that use problem solving to help students develop their own problem-solving strategies (Smyth and Banks, 2012). Technology-mediated, student-centred learning environments enable teachers to help students learn problem solving while also developing other 21st century skills such as critical thinking, digital expertise and collaborative working.
Teacher adoption of teaching methods designed to support 21st century learning coincides with the push by the European Commission to encourage schools to offer computer programming lessons (EC, 2016). Against this backdrop, the re-emergence of computing at secondary level across the United Kingdom (Brown et al., 2014) and within the Republic of Ireland (NCCA, 2014) has prompted CS educators to source Continuing Professional Development (CPD) in order to upskill themselves to meet the challenge of teaching CS.
The body of the paper is structured as follows. The literature review provides the rationale which underpins the research questions and is followed by
an outline of the methodology used. The data analysis
section describes the process used to code and
interpret the data and the discussion section explores
the study findings.
2 LITERATURE
Computer Science (CS) is perceived as a difficult
subject to learn (Zendler et al., 2012), with computer
programming perceived as being particularly difficult
(Connell et al., 2015). Information Communication
Technology (ICT) can play a key role in helping
students access resources and is useful for helping
students work through complex tasks (Brinda et al., 2009). Student-centred learning approaches use practical activities to help students develop key skills (Baeten et al., 2010) and, when combined with ICT, enable teachers to use problem solving to encourage students to work through computing problems as a way of learning and understanding coding.
2.1 Teaching Coding in a 21st Century Style
Helping students learn computer programming, or coding, remains problematic (Connell et al., 2015), prompting teachers to seek assistance in designing innovative coding lessons. A 21st century approach to CS delivery, involving technology-mediated tasks structured around problem solving activities, provides one possible solution, enabling students to develop a combination of social, technical and cognitive skills (Hazzan et al., 2014). Teacher CPD programmes using a 21st century approach enable teachers to obtain hands-on coding expertise in an environment designed to support technical knowledge sharing through peer collaboration (Bryant et al., 2006).
Exploring teachers’ attitudes to CS CPD provides
insight into what supports teachers need in order to
teach computing in schools. This could range from assistance in developing content knowledge, through guidance in developing particular skills for working through computing tasks, to help in building the confidence to teach computing (Harland
and Kinder, 1997). This study sets out to explore
teacher attitudes in terms of understanding what
technical skills and computing knowledge teachers
learned from attending the Bridge21 CS CPD
programme.
2.2 Bridge21 CS CPD Programme
Bridge21 is a pragmatic model of 21st century
teaching and learning that has been used extensively
across a number of secondary schools in Ireland. It
uses a team-based model to promote peer-learning in
which the instructor orchestrates learning rather than
focusing on delivery of content (Lawlor et al., 2010).
It has been shown to encourage intrinsic motivation,
promote the development of 21C skills, and to be
suitable for delivering curriculum content (Lawlor et
al., 2015, Johnston et al., 2015). The Bridge21
activity model outlines the structural elements necessary to deliver an effective 21C learning experience and is partially inspired by ideas on
Design Thinking (Brown and Wyatt, 2010). It
consists of seven steps around which learning
activities can be designed: ‘Set-up, Warm-up,
Investigate, Plan, Create, Present and Reflect’.
2.2.1 Computing Workshops
The Bridge21 CS CPD programme discussed in this
paper consisted of six computing workshops. The first, "Digital Media and 21C Teaching and Learning", was designed to introduce the Bridge21 model through a hands-on, technology-mediated learning experience. The second workshop focused on "Problem Solving in the 21st Century"; the third offered an "Introduction to Programming through Animation using Scratch"; workshop four covered "Intermediate Programming through Game Design using Scratch"; workshop five focused on "Advanced Programming with Python"; and workshop six covered "Exploring Computer Systems with the Raspberry Pi" (Byrne et al., 2015).
2.2.2 Research Questions
Having already explored teacher reactions to the
workshops (Fisher et al., 2015), this paper explores
teacher learning through two research questions.
Question one examined the extent to which the
workshops proved effective in helping teachers learn
computing knowledge and skills, while question two
explored teacher attitudes towards applying their
learning from the workshops in their own classrooms.
3 METHODOLOGY
The evaluation framework used to address the
research questions was adapted from that of
Kirkpatrick (1994). The Kirkpatrick framework has
been used to evaluate educational phenomena across
a number of contexts including evaluating teacher
performance (Naugle et al., 2000) and measuring
learning outcomes in teacher professional
development programmes (Coldwell and Simkins,
2011). While Kirkpatrick is criticised for its
deterministic structure (Kaufman et al., 1996, Holton,
1996, Bates, 2004) it has been adapted to evaluate
Continuing Professional Development (CPD) for in-
service teachers (Guskey, 2000) and it is this work
that guides the design of methods used to evaluate
teacher learning in this research.
3.1 Kirkpatrick Adaptation
Kirkpatrick operates over four levels. Level 1
explores participant reactions to a training
intervention and Level 2 explores participant learning
(specified as attitudes, skills and knowledge). Level 3
examines perceived changes in behaviour while
Level 4 examines results in terms of changes made in
the workplace as a result of the training.
The authors are in the process of rolling out the full Kirkpatrick framework to evaluate the delivery of the Bridge21 CS CPD programme over a three-year period. Results from the administration of Level 1 Reaction Instruments report positive reactions to the CS CPD workshops (Fisher et al., 2015) and indicate that teachers intend to use the Bridge21 model to enhance their subject teaching (Byrne et al., 2015).
This paper focuses upon the perceptions of the
participants with regard to their learning.
3.1.1 Level 2 – Learning Evaluation
This paper presents results gained from the administration of two data collection instruments. Both were at Kirkpatrick Level 2 and explored 'Skills', 'Knowledge' and 'Attitudes'. The first instrument contained three open questions (Table 1).
Table 1: Bridge21-Individual Learning Form.
Open Questions
1 WHAT - What happened during this workshop?
What did you observe? What did you achieve?
What did your colleagues achieve? What went
well? What didn’t go well?
2 NOW WHAT- How will you apply what you
have learned today in your teaching? How will it
help you develop your students’ learning further?
How will you develop your learning further?
What information can you share with colleagues?
3 SO WHAT - What did you like / dislike about the
workshop? How did you respond? How did you
feel? Did you learn anything about yourself? Did
you learn anything about your colleagues?
Question 1 was mapped to the category of 'Knowledge'; question 2 was mapped to the category of 'Skills'. Both items addressed research question 1. The third question mapped to the 'Attitudes' category, and addressed research question 2. The instrument was administered per participant, per workshop.
The second instrument contained five open
questions (Table 2). Question 1 was mapped to the
category of ‘Knowledge’ and question 2 mapped to
the category of ‘Skills’, and together they addressed
research question 1. Questions 3, 4 and 5 were all
mapped to the category of ‘Attitudes’, and addressed
research question 2. This instrument was
administered per team, per workshop.
Table 2: Bridge21-Team Learning Form.
Open Questions
1 List 3 skills the team learned today.
2 List 3 skills the team would like to develop / improve on.
3 Overall, how would the team rate their performance?
4 Why does the team feel this way?
5 What was the team's best achievement today?
3.1.2 Data Gathering Procedures
Participants attended workshops of their own accord, thus samples were self-selecting. The authors provided an evaluation brief and issued participants with an ethics form at the start of each workshop. A total of N = 48 individual learning forms and N = 10 team learning forms were obtained from N = 110 attendees during the delivery of 9 workshops between October 2013 and May 2014. These numbers include responses from participants who attended more than one workshop.
4 DATA ANALYSIS
The authors acknowledge that the reconstruction of participant accounts is subject to author bias and presents one of multiple readings. Moreover, the authors use quotations as primary data, cognisant of possible misreadings or unintended interpretations made from the remodelling of participant data (LeCompte and Goetz, 1982). Data presented for analysis consists of text responses to questions determined by the authors, in an attempt to shine light on the phenomena described in the research questions. The following analysis may prove limited in supporting broader generalisations (Lewis et al., 2003), but instead attempts to render accounts accessible in a form that yields one reading with which to open further, more detailed conversations.
Text responses obtained from individual (Table 1) and team (Table 2) forms were transcribed, coded and then stored in a searchable database. A total of N = 227 database records were transcribed from hard copy individual and team learning forms. The authors used comparative coding to identify themes.
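As an illustration only, the following minimal sketch shows one way such a searchable database of coded responses might be organised; SQLite (via Python's standard sqlite3 module), the table layout and the sample row are assumptions for illustration, not details taken from the study.

import sqlite3

# Minimal sketch (assumed structure, not the study's actual database) of a
# searchable store of transcribed, coded form responses.
conn = sqlite3.connect("responses.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS responses (
        id INTEGER PRIMARY KEY,
        workshop TEXT,      -- e.g. 'Scratch' or 'Raspberry Pi'
        form_type TEXT,     -- 'individual' or 'team'
        question INTEGER,   -- question number on the form
        response TEXT,      -- transcribed free-text answer
        code TEXT           -- code assigned during comparative coding
    )
""")
# Dummy row for illustration only.
conn.execute(
    "INSERT INTO responses (workshop, form_type, question, response, code) "
    "VALUES (?, ?, ?, ?, ?)",
    ("Scratch", "individual", 1, "sample transcribed answer", "peer-collaboration"))
conn.commit()
# Searching the database: how often does each code occur?
for code, count in conn.execute(
        "SELECT code, COUNT(*) FROM responses GROUP BY code"):
    print(code, count)
conn.close()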
The qualitative data set comprised individual and team responses. The authors used comparative or analytical coding (LeCompte and Schensul, 1999) to reduce the data set. This involved open coding across all records to look for similar concepts in the data
(inductive coding cycle 1). Two further deductive
coding cycles merged similar codes together,
generating N = 6 sub themes. Table 3 illustrates the
process used to reduce the data.
Table 3: Coding Process.
Total Data Records: 227
Inductive Coding Cycle 1: 125
Deductive Coding Cycle 1: 56
Deductive Coding Cycle 2: 30
Themes: 6
The authors mapped each theme to one of Kirkpatrick's learning sub-categories of knowledge, skills and attitudes. While the authors acknowledge that code alignment is a subjective process (Goetz and LeCompte, 1981), the process of coding and theming enabled them to tease out and explore similarities and differences between themes.
4.1 Themes
Six themes emerged from comparative coding. Table
4 maps each theme to a learning category.
Table 4: Mapping Themes to Learning Categories.
Knowledge: Intrinsic Motivation; Computing Comprehension
Skills: Computer Programming; Hardware
Attitudes: Replicating Coding Activities; Social Constructivist Learning
The themes 'intrinsic motivation' and 'computing comprehension' relate to knowledge; the themes 'computer programming' and 'hardware' relate to skills; together these themes address research question 1. The themes 'replicating coding activities' and 'social constructivist learning' relate to 'attitudes' and speak to research question 2.
5 FINDINGS AND DISCUSSION
This section is organized as follows. Section 5.1
addresses research question 1 and examines the extent
to which workshops proved effective in helping
teachers learn computing knowledge and skills. The
next section (5.2) explores teacher attitudes towards
applying their learning in the context of teaching
students coding. The concluding discussion (Section
6) revisits the research questions and describes the
need for further research (Section 6.1).
5.1 Learning Computing Knowledge
and Skills
This section explores participant experiences of
learning computing, with particular focus on
developing computer programming / coding content
knowledge.
5.1.1 Knowledge
The workshops did enable participants to learn
computing concepts. For example, one participant reported obtaining a 'good, practical understanding of computer programming languages such as python', while another participant commented that they too
had learned ‘how to use Raspberry Pi better, (and had
developed a greater understanding) of the potential
of what you can do’. The same participant continued
enthusiastically ‘I want to know more about all of it’.
Teamwork played an important role in helping
participants learn computing, captured in the
following comment: ‘I like being taken outside my
comfort zone. I may not have skill set to do some of
the computing tasks well, but I have a better
understanding of what is involved and maybe
prompted to learn more. (I also learned the) benefits
of collaboration with other subject teachers’.
Working together created an opportunity for
participants to share their learning with their peers.
5.1.2 Intrinsic Motivation
Working in a team played a key factor in helping
participants stay motivated to complete tasks. One
participant commented that ‘group work is essential
to keep yourself motivated when the programs are too
complex for the individual’. Another participant
concurred with this statement, ‘I liked the hands on
element. Felt motivated as part of team collaboration
to achieve objectives. I like to get things done, (and)
I like defined roles with a team’. Role division within
teams also helped to keep teams on track: ‘we worked
on a video and audio clip around a topic. Achieved
objective using a variety of software programs and
techniques. Great team, all motivated and stuck to the
charter of rights, respectful of each other’s
differences’. One reported that the learning model
provided ‘excellent steps for learning. I’m motivated
to proceed with this process of using digital
technology in classroom’.
Team dynamics also played an important role in creating bonds and keeping teams on track to complete tasks: 'we stuck to our (team) motto! We achieved all the tasks; we worked well together'. Team bonding also
helped participants succeed with difficult tasks such as 'completing the ambitious radar task', which was a hardware configuration task completed as part of the Raspberry Pi workshop.
Figure 1: Participants setting up a Controller.
Working in teams to
solve complex computational problems also helped
participants to gain in confidence, as demonstrated in
the following comment: ‘I worked well with my
partner to complete tasks. Felt more confident and
able to do tasks’. Team working proved useful as a
motivational tool, helping some participants achieve
their goals and, in some cases, exceed them.
5.1.3 Computing Comprehension
Completing the tasks assigned required participants
to develop problem solving skills: ‘we worked as a
team to create a project in Scratch. We noticed that
there was a lot of trial and error involved. We all took
on board new skills through our exploration. We felt
a sense of accomplishment'. However, one participant expressed the need for 'more practice at tasks, building and improving basic skills'. Learning activities also provided participants with a context in which to learn at their own pace: 'I learned how to
program basic python tasks, and I felt that I could
pass on that learning to others’. Working together
and sharing tacit knowledge played a pivotal role in
helping participants develop the confidence to try out
new tasks or to jump in and offer assistance. Indeed,
the importance of peer-collaboration in the context of
learning computer programming is evident in the
following comment ‘I liked the teamwork, learning
from other people. I liked seeing the product of your
work. I learnt that my knowledge is limited and I
would like to learn more about programming’.
5.1.4 Skills
Participants enjoyed experiencing the Bridge21
approach, involving ‘Set-up, Warm-up, Investigate,
Plan, Create, Present and Reflect’ as a method for
learning computing skills. One participant reflected
that they had learned ‘about the bridge21 method.
Observed (use of the Bridge21) method in action. (I
also) achieved a very basic animation (and) the group
went well to share skills’. Another participant also
enjoyed the combined approach: ‘great learning
achieved, networking digital knowledge. A lot of
knowledge still to learn. Great capabilities in varying
skills in colleagues in group’. A further participant
had also enjoyed a collaborative approach to learning
computing: 'I learnt that it is possible to use the same
format, to take an unknown concept, research,
storyboard, records and present in a short time and
verify that learning has been achieved. With a group
of strangers, and quickly recognise, skills, aptitude,
have flexibility’.
5.1.5 Computer Programming
Participants again reported that teamwork played an
important factor in learning computer programming
skills. One participant commented that, when
learning programming ‘team work can be very
effective'. Practical programming tasks also facilitated
the ‘learning and sharing of expertise’ which helped
the same participant ‘achieve coding a set of activities
for animated characters in scratch. Teamwork went
well (however I will) need more time to consolidate
learning’. The workshops used cross over activities
to help participants make linkages between visual and
text based programming languages: ‘I learnt a nice
bridging approach to highlighting similarities
between scratch and python; I gained more
confidence with the syntax’. Using Scratch as an entry
point to the Python programming environment helped
one participant ‘engage with the software (and helped
me) learn to navigate the options …much is hidden.
More examples of good programming please…
‘Discovery’ takes time – 1 day not enough’!
Figure 2: Basic Scratch Animation.
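The Scratch-to-Python 'bridging' idea described above can be illustrated with a short sketch. The example below is not the workshop material itself; it simply shows how a familiar Scratch pattern (repeat / move / turn) maps onto a Python loop, here using the standard-library turtle module.

import turtle

# Sketch of a Scratch-to-Python bridge (illustrative, not workshop material):
# the Scratch blocks "repeat 36 [ move 10 steps, turn 10 degrees ]"
# written as a Python loop.
sprite = turtle.Turtle()   # stands in for a Scratch sprite
sprite.shape("turtle")

for _ in range(36):        # Scratch: repeat 36
    sprite.forward(10)     # Scratch: move 10 steps
    sprite.right(10)       # Scratch: turn 10 degrees

turtle.done()              # keep the window open until it is closed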
5.1.6 Hardware
The workshops also exposed participants to
hardware. This experience helped one participant
learn about 'the Raspberry Pi set up and Makey-Makeys. We successfully set up Raspberry Pi and used Scratch on it. The circuitry breadboard piece
was challenging’. The experience of configuring
hardware helped another participant ‘learn about the
Raspberry Pi. (I) got to use. I got to play with Makey-
Makey (which I had only heard about before). I also
watched as breadboard was wired’. Providing
participants with the opportunity to unbox computing
hardware, install devices and install software linked
to controllers, allowed participants to ‘explore the
potential of computing hardware through group
work, successfully install the raspberry pi, and
explore python’.
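To give a concrete sense of the breadboard work described above, the following minimal sketch blinks an LED wired to a Raspberry Pi GPIO pin; the pin number, the wiring and the RPi.GPIO library are assumptions for illustration rather than the workshop's actual exercise.

import time
import RPi.GPIO as GPIO

# Illustrative sketch (assumed wiring: LED plus resistor on BCM pin 18,
# not necessarily the workshop's configuration).
LED_PIN = 18

GPIO.setmode(GPIO.BCM)           # use Broadcom pin numbering
GPIO.setup(LED_PIN, GPIO.OUT)    # drive the LED pin as an output

try:
    for _ in range(10):          # blink ten times
        GPIO.output(LED_PIN, GPIO.HIGH)
        time.sleep(0.5)
        GPIO.output(LED_PIN, GPIO.LOW)
        time.sleep(0.5)
finally:
    GPIO.cleanup()               # release the GPIO pins on exit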
5.2 Attitudes towards Computing
Having examined participant perceptions of learning computing knowledge and skills, the following section explores participant attitudes.
5.2.1 Attitudes and Intentions
Participants shared the following views of the
workshop experience. One participant liked a project
orientated approach to learning where ‘problems
arise during the workshop (and through) discussion;
- trial and error the team overcame those problems’.
Another participant liked ‘having the opportunity to
create new content that isn’t directly related to my job
as a teacher / writer. I’ve learned that I am overly
analytical and tend to complicate topics. I’ve learned
that I am in a good place in terms of my content
knowledge and IT Skills'. However, one participant disliked the lack of direct instruction or teaching, which was perceived as necessary for encouraging peer collaboration within teams: 'I disliked how little
guidance / explanation was given on the actual
details. I know that is partly because as a teacher I’m
used to pushing through the syllabus with very little
time for self-learning or practicing! I expect
“teacher” to give out the formulas / examples’!
5.2.2 Replicating Coding Activities
Participants reported confidence that they could replicate workshop activities in their own teaching. One participant intended to use workshop activities to increase student engagement on return to the classroom: 'I will research potential lessons that
would translate well into the classroom environment
and engage children further. I will try to source
additional courses that will build on what I’ve
learned. I can give examples of the potential of the
raspberry pi in the classroom'. Another participant intended to use the learning model to deliver Scratch programming to their students: 'I will try to integrate
Scratch into my daily teaching through project work
for my more visual learners. I will share Scratches
cross-curricula usefulness with my colleagues’.
Another participant intended to use workshop ideas in the context of delivering history lessons: 'I will use
the ideas generated in our presentation in my
classroom and apply them across the history
curriculum. I will further delve into the materials
provided for my own development. I will share my
work with my colleagues’.
5.2.3 Social Constructivist Learning
Participants also expressed a range of views in
relation to using the Bridge21 model for teaching
computing. One participant stated that they would like to use the model 'to introduce hardware aspects and perhaps try to build some of the hardware (e.g. the controls) as part of a science / engineering
project’ while another participant would use the
model ‘to help in my approach to problem solving in
my present role. It’s also given me new methods of
working with teams and groups'. However, for some, further preparation was required to bring 21st century teaching and learning into the classroom: 'I won't be applying anything yet as I am not familiar enough with scratch – a lot more time and engagement is needed for use with the programme before I will be doing it with my students. I would not be confident in sharing any info with colleagues apart from telling them that scratch is an animation program'. For one
participant, the workshop experience had introduced
them to ‘new peers. Learn new concepts to introduce
into practice. Formed friendships and support
mechanisms and felt challenged at time. Overall
enjoyed the workshop and how it created a safe
learning environment'. The workshop experience helped participants explore ideas for teaching computing within a safe learning environment.
6 CONCLUSIONS
This paper set out to examine two research questions.
In relation to research question 1, team work enabled
teachers to discuss ideas, ask questions and draw from
the expertise of the group to solve problems and work
through issues, independent of the facilitator. Team
working also created a safe learning environment
where teachers with varying technical expertise could
work together and produce a technical product.
Learning activities played a key role in helping teachers apply computing skills, allowing them to work at their own level; however, facilitation was
sometimes needed to guide teams through unfamiliar
problems.
In terms of research question 2, teacher attitudes
towards the use of the Bridge21 model for teaching
computing were reported as largely positive.
Teachers enjoyed the relaxed atmosphere, and the
opportunity to explore concepts at their own pace.
Furthermore, the workshop experience exposed
teachers to open questioning, where facilitators would
guide problem-solving without necessarily providing
answers. This, in turn, encouraged teams to work together through problems in order to seek out and then report back answers. Certainly, some participants expected a more teacher-centred approach, and this influenced the reporting of some negative comments. Overall, teachers reacted warmly to the Bridge21 approach and reported time and time again the importance of teamwork in supporting discovery-oriented learning.
6.1 Next Steps
This evaluation paper is the second in a series, which
seeks to explore the influence of social constructivist
learning models on teaching Computer Science. This
paper explores the second level of the Kirkpatrick
framework to understand teacher perceptions of their
learning and attitudes to using a social constructivist
approach to teaching computing. The authors are in
the process of analysing Level 3 data to explore
implementation in the classroom, with follow up
interviews planned (Level 4).
ACKNOWLEDGEMENTS
This work is funded by Google and the authors would like to acknowledge that support, and that of the post-primary school teachers who generously gave their consent to include the written contributions which appear in this paper.
REFERENCES
Baeten, M., Kyndt, E., Struyven, K. & Dochy, F. 2010.
Using Student-Centred Learning Environments To
Stimulate Deep Approaches To Learning: Factors
Encouraging Or Discouraging Their Effectiveness.
Educational Research Review, 5, 243-260.
Bates, R. 2004. A Critical Analysis Of Evaluation Practice:
The Kirkpatrick Model And The Principle Of
Beneficence. Evaluation And Program Planning, 27,
341-347.
Boud, D. 1988. Moving Towards Autonomy. Developing Student Autonomy In Learning. 2nd Ed.: Taylor & Francis.
Brinda, T., Puhlmann, H. & Schulte, C. 2009. Bridging ICT And CS: Educational Standards For Computer Science In Lower Secondary Education. ACM SIGCSE Bulletin, 41, 288-292.
Brown, N., Sentance, S., Crick, T. & Humphreys, S. 2014. Restart: The Resurgence Of Computer Science In UK Schools. ACM Transactions On Computing Education, 14, 1-22.
Brown, T. & Wyatt, J. 2010. Design Thinking For Social
Innovation. Development Outreach, 12, 29-43.
Bryant, S., Romero, P. & Du Boulay, B. 2006. The
Collaborative Nature Of Pair Programming. 7th
International Conference On Extreme Programming
And Agile Processes In Software Engineering, 53-64.
Byrne, J. R., Fisher, L. & Tangney, B. 2015. Computer Science Teacher Reactions Towards Raspberry Pi Continuing Professional Development (CPD) Workshops Using The Bridge21 Model. IEEE 10th International Conference On Computer Science & Education, 267-272.
Coldwell, M. & Simkins, T. 2011. Level Models Of
Continuing Professional Development Evaluation: A
Grounded Review And Critique. Professional
Development In Education, 37, 143-157.
Connell, A., Edwards, A., Hramiak, A., Rhoades, G. & Stanley, N. 2015. Developing Your Capability To Teach Computing. A Practical Guide To Teaching Computing And ICT In The Secondary School. 2nd Ed.: Routledge.
EC. 2016. Coding 21st Century Skill [Online]. Brussels, Belgium: European Union, 1995-2016. Available: https://ec.europa.eu/digital-agenda/en/coding-21st-century-skill [Accessed 08/01/2016].
English, L. & Sriraman, B. 2010. Problem Solving For The
21st Century. Theories Of Mathematics Education.
Springer.
Fisher, L., Byrne, J. R. & Tangney, B. 2015. Exploring
Teacher Reactions Towards A 21st Century Teaching
And Learning Approach To Continuing Professional
Development Programme In Computer Science. 7th
International Conference On Computer Supported
Education, 22-31.
Goetz, J. P. & LeCompte, M. D. 1981. Ethnographic Research And The Problem Of Data Reduction. Anthropology & Education Quarterly, 12, 51-70.
Guskey, T. R. 2000. Kirkpatrick's Evaluation Model. Evaluating Professional Development. Corwin Press, Inc.
Harland, J. & Kinder, K. 1997. Teachers' Continuing
Professional Development: Framing A Model Of
Outcomes. British Journal Of In-Service Education, 23,
71-84.
Hazzan, O., Lapidot, T. & Ragonis, N. 2014. Active Learning And Active-Learning-Based Teaching Model. Guide To Teaching Computer Science: An Activity-Based Approach. 2nd Ed.: Springer-Verlag London Limited.
Holton, E. F. 1996. The Flawed Four-Level Evaluation Model. Human Resource Development Quarterly, 7, 5-21.
Johnston, K., Conneely, C., Murchan, D. & Tangney, B.
2015. Enacting Key Skills-Based Curricula In
Secondary Education: Lessons From A Technology-
Mediated, Group-Based Learning Initiative.
Technology, Pedagogy And Education, 24, 423-442.
Kaufman, R., Keller, J. & Watkins, R. 1996. What Works
And What Doesn't: Evaluation Beyond Kirkpatrick.
Performance And Instruction, 35, 8-12.
Kirkpatrick, D. L. 1994. The Four Levels: An Overview.
Evaluating Training Programs: The Four Levels.
Berrett-Koehler.
Lawlor, J., Conneely, C. & Tangney, B. 2010. Towards A Pragmatic Model For Group-Based, Technology-Mediated, Project-Oriented Learning – An Overview Of The B2C Model. Technology Enhanced Learning. Quality Of Teaching And Educational Reform. Springer.
Lawlor, J., Marshall, K. & Tangney, B. 2015. Bridge21 – Exploring The Potential To Foster Intrinsic Student Motivation Through A Team-Based, Technology-Mediated Learning Model. Technology, Pedagogy And Education, 1-20.
LeCompte, M. D. & Goetz, J. P. 1982. Problems Of Reliability And Validity In Ethnographic Research. Review Of Educational Research, 52, 31-60.
LeCompte, M. D. & Schensul, J. J. 1999. Using Constant Comparison And Analytical Induction To Identify Items. Analyzing & Interpreting Ethnographic Data. AltaMira Press.
Lewis, J., Ritchie, J., McNaughton Nicholls, C. & Ormston, R. 2003. Qualitative Research Practice: A Guide For Social Science Students And Researchers, Sage.
Naugle, K. A., Naugle, L. B. & Naugle, R. J. 2000.
Kirkpatrick's Evaluation Model As A Means Of
Evaluating Teacher Performance. Education, 121, 135-
144.
NCCA. 2014. Coding And Digital Media Literacy [Online]. Dublin: National Council For Curriculum And Assessment. Available: http://www.juniorcycle.ie/curriculum/short-courses [Accessed 09/02/2016].
Smyth, E. & Banks, J. 2012. ‘There Was Never Really Any
Question Of Anything Else': Young People's Agency,
Institutional Habitus And The Transition To Higher
Education. British Journal Of Sociology Of Education,
33, 263-281.
Walser, N. 2008. Teaching 21st Century Skills. Harvard
Education Letter, 24, 1-3.
Zendler, A., McClung, O. W. & Klaudt, D. 2012. Content And Process Concepts Relevant To Computer Science Education: A Cross-Cultural Study. International Journal Of Research Studies In Computing, 1, 27-47.