Risk Competence Training Intervention with Airport Security Professionals

Authors: Ab Bertholet, Nienke Nieveen & Birgit Pepin.

Abstract

Risk decisions by professionals in safety and security regularly appear to be more irrational and biased than effective and efficient. A sector where fallacies in risk decision making constitute a key issue for management is airport security. In this article we present the results of an exploratory mixed methods study regarding the question: How and to what extent can we train professionals, in order to help them make smarter risk decisions? The study was designed as a controlled experiment, preceded by qualitative preliminary phases with interviews and document analysis. The main conclusion is that the workshops we offered to airport security agents had a positive effect on the awareness and risk decisions of the intervention group. Next to the awareness effect, the intervention group showed less biased self-evaluation and was capable of identifying individual, collective and organisational points for improvement. We note that these results should only be considered as first indications of effect. As the experiment was embedded in the normal working day of the security agents, the context variables could not be entirely controlled. With a combination of quantitative and qualitative data we tried to compensate for this as much as possible.

1 Introduction

Risk management is a critical task in the fields of safety and security. This is the daily work of many thousands of professionals in healthcare and rehabilitation, child welfare, transport, industry, the police and fire services, events management, and prevention of terrorism. The decisions on risk are aimed at increasing safety and security, and at limiting potential damage. For some professions risk management is core business, but most often it is a secondary responsibility. In daily practice, however, risk decisions made by professionals in various sectors regularly appear to be more irrational and biased than effective and efficient. In this article, we present a generic process model of biased risk decision making by professionals in safety and security management, derived from scientific literature and empirical practice. It provides generic insights into a general problem with human factors in risk management practice.

In this study we concentrate on the example of security checks at the international airport of Amsterdam, in the form of a randomised controlled experiment with a risk competence training intervention. The focus is on effective learning of risk decision making skills. The central question of the experiment was: How and to what extent can we train professionals, in order to help them make smarter (i.e. more rational, effective and efficient) risk decisions? In the subsequent sections, we describe the preliminary study, the design of the training and the experiment, and we present the results of the study. Finally, the effect of the intervention is discussed.

2 Process Model of Biased Risk Decision Making

Over the past decades, evidence of mental obstacles and reflexes that hinder rational (risk) decision making has been found in numerous studies on this topic. Although the issue is well known and support tools (e.g. protocols and checklists) are available, in daily practice all kinds of professionals make risk decisions that are unsatisfactory, i.e. partly or entirely ineffective or inefficient (Bertholet, 2016a, 2016b). The main sources of bias found in the literature are insufficient numeracy and risk literacy. Increasing professionals’ theoretical knowledge by teaching reckoning and clear thinking is no conclusive remedy for professional practice, and cognitive biases cannot be cured on a cognitive level only (Kahneman, 2011). Specific risk competence training is therefore all the more important (Gigerenzer, 2003, 2015).

Our concept of risk competence, including a process model of biased risk decision making (displayed in Figure 1), is based on a theoretical and empirical literature review (Bertholet, 2016b).

Figure 1. Process model with indicators of Risk Competence and biases (Bertholet, 2016b, 2017).

From common risk management models, we focus on the two critical phases where fallacies lurk: 1. the analysis or judgement phase, and 2. the decision-making phase. In the safety and security domain, there are two main methods of risk analysis: calculation and estimation. Both methods can be distorted by mental patterns and fallacies in the human brain. The extent to which professionals are able to apply available methods and instruments of risk assessment we call risk intelligence (Evans, 2012). On the basis of their risk analysis, professional risk managers take a risk decision whether or not to implement a particular intervention. Biases also occur within the risk decision itself. If professionals react effectively and efficiently, then they are said to have a high level of risk skill.

In the model, we consider Risk Intelligence (RI) as the indicator of risk analysis and Risk Skill (RS) as the indicator of risk decision making. The product of both indicators we call Risk Competence (RC), which is the overall indicator for risk management: RC = RI x RS. (Evans, 2012; Gigerenzer, & Martignon, 2015; Bertholet, 2016b). Based on field interviews and the literature (Kahneman, 2011; Gigerenzer, 2003; Dobelli, 2011, 2012; Bertholet, 2016b, 2017), we selected eleven biases for our study that we regarded as most relevant for safety and security practice. To illustrate the process model, cartoons of all eleven biases were designed and are presented in an animated 9-minute video: https://youtu.be/4rWPppdJ3YQ. The eleven biases are clustered into three groups.

Calculation biases

Numbers and percentages appear to express a risk in a quantitative and precise manner. In practice, professionals find it hard to grasp the precise meaning of chance, probability and other quantitative data. Paulos (1988), Kahneman and Tversky (1979), Gigerenzer (2003), and Kahneman (2011) have been writing for decades about ‘innumeracy’, caused by cognitive illusions. Conditional and extremely small probabilities, statistical argumentation and causality biases cause the biggest problems in ‘reckoning with risk’ (Gigerenzer, 2003).

Estimation biases

Risks that cannot be calculated must be estimated. That is, for example, often the case with social safety (Gigerenzer, 2003; Dobelli 2011, 2012). In risk analysis, professionals often seek confirmation of risks they are already aware of. The danger is that in focusing on a single risk profile, they may miss the bigger picture. This is known as confirmation bias. Authority bias occurs when a professional who is higher in the hierarchy or more experienced is not corrected by colleagues, even when the colleagues’ analysis is superior: it is assumed that the authority’s analysis is more accurate. The overconfidence effect is the mirror image of authority bias. Even experienced professionals can sometimes wrongly assume that they are correct. They may overvalue their own capacities, or the probability of success of a project, and they may underestimate the risks involved. Availability bias in the analysis phase causes professionals to trust readily available information about risk rather than to be aware of less visible data.

Decision biases

Because of biases, fallacies, thinking errors or distortions that appear in the analysis phase, optimal and rational judgements can no longer be made during the decision phase. Apart from this, biases also lurk in the decision phase itself, when it comes to determining whether the risk analysis suggests intervention and, if so, which one (Gigerenzer, 2003; Dobelli 2011, 2012). At this point, confirmation bias of a second type can arise. When a professional’s assessments are endorsed, greater and more dangerous risks may be ignored. In the decision phase, we can also see a fallacy which we call availability bias of the second type. Professionals tend not to choose the best intervention, therapy or policy; they choose the remedy they already know, the one that is at the forefront of their mind. Hindsight bias is a fallacy exhibited by more than just professionals; it may affect public opinion, the media and politicians even more deeply. After the fact, it is easy to conclude that something else should have happened or that action should have been taken earlier.

3 Study with airport security agents: Context analysis

A sector where fallacies in risk analysis and decision making constitute a key issue for management is airport security. In the control of passengers and hand luggage in civil aviation, security agents are deployed to prevent persons or objects that may endanger safety from getting on board airplanes. Agents make risk decisions about passengers and luggage items by classifying persons, behaviour, situations and objects as safe, suspicious or dangerous. Their critical operational tasks in risk decision making are displayed in Figure 5. The quality indicators of the airport security operation are measured by inspections and by sampling. Samples are dangerous or otherwise prohibited items carried by so-called ‘mysterious guests’ on their body or in their hand luggage, which must then be intercepted by the security agents.

Improving security agents’ performance in risk decision making could make a significant contribution to the security of the airport. Although there is a focus on avoiding fallacies during the agents’ initial training and afterwards in periodic training, the problem of fallacies in the process of decision making often remains. Hence, human factors in airport security are considered a systematic, thus predictable weak link (Kahneman, & Tversky, 1979; Gigerenzer, 2003; Ariely, 2008; Kahneman, 2011; Dobelli, 2011, 2012). This is why we set up our study at the request of an airport security company.

First, we investigated the context and the target group of security professionals via observation, interviews and document analysis (e.g. manuals, working instructions, procedures, training materials). Based on the findings of the preliminary study, we translated the generic model (Figure 1) into an empirical model for the specific professional domain of the airport security agent, by ranking the biases by relevance. We identified confirmation, availability, authority, overconfidence and hindsight bias as crucial fallacies. For security agents, the focus in the analysis phase was on estimation rather than on calculation. Furthermore, particular preconditions and stress factors apply to the decision-making process, such as time and peak pressure. What the critical professional tasks had in common was that they were carried out according to established procedures, and that meta-level vigilance was required of agents to simultaneously watch for specific indicators of deviant or suspicious behaviour (Figure 3). These included, for example, luggage that did not seem to fit the specific passenger, or a passenger who seemed to be extremely hurried, curious or otherwise behaving differently.

4 Training intervention design

This section describes the design of the training intervention, including its objectives, the underlying training concepts, and corresponding features. The intervention focused on two short-term goals, set jointly by the security company and the research team: (a) agents performing better at selected critical professional tasks and (b) defining performance norms for agents. The specific learning goals of the training intervention for the security agents were: (a) to acquire knowledge of common fallacies in assessing risks and making risk decisions; (b) to recognise and be aware of these errors in their own work and behaviour; (c) to explore opportunities to avoid or reduce fallacies in the context of their own work.

Conceptual dimensions

The design of the intervention was based on four concepts regarding vocational learning and learning in general: competence-based learning (Mulder, 2000), self-responsible learning (Schön, 1983; Zimmerman, 1989), collaborative learning (Vygotsky, 1997) and concrete learning (Hattie, 2009; De Bruyckere, Kirschner, & Hulshof, 2015).

Competence-based learning refers to the European Qualifications Framework for Life Long Learning (European Commission, n.d.), where competences are regarded as a third qualification area, next to knowledge and skills. For risk decision making in general, and therefore also for the security checks at airports, awareness of fallacies and cognitive biases is essential. Both competence-based learning and self-responsible learning require awareness of the biasing effect of human intuition and perception. The theoretical dimension of self-responsible learning focuses on self-regulation, the professionalising effect on the individual (Schön, 1983; Vygotsky, 1997). Professionals who want to improve their work performance must be able to reflect on their professionalism. ‘Reflective practitioners’ (Schön, 1983) can evaluate their own actions at a metacognitive level and have a picture of the path that brings them to the level of the professionals that they would like to be. Self-constructs such as self-esteem and self-efficacy are important indicators in this context, which we used as a measure of the professionals’ self-image (Bandura, 1997; Zimmerman, 1989; Judge, & Bono, 2001; Ryan, & Deci, 2009). Where scores on an assignment or a test can be considered as an external measure of risk competence, self-scores by professionals themselves can be an internal mirror image. For that reason, we asked security agents to score their own self-esteem and self-efficacy (Table 2). Self-esteem refers to the more general ratings of the professional’s performance level and stage of professional development. Self-efficacy is the belief in one’s own capacity to succeed at specific tasks (ibid.).

Collaborative learning in strong partnership with others offers opportunities to achieve better results compared to individual learning, as long as applicable design principles are respected (Valcke, 2010; Johnson, & Johnson, 2009; Hattie, 2009). The instructional principle of ‘concrete learning’ (Hattie, 2009; De Bruyckere, Kirschner, & Hulshof, 2015) claims that learning achievement is better when working with realistic cases and training materials from practice rather than with academic and theoretical exercises. This means that agents needed to recognise and acknowledge assignments and case studies as originating from and relevant to their daily work. For the dimensions of competence-based and collaborative learning, authentic rather than academic content might have had a positive effect mainly on the motivation of the agents (Ryan, & Deci, 2009). Specific choices in the intervention design (e.g. visualisation as an instructional strategy, group assignments as a metacognitive reflection strategy) were based on insights of concrete learning as well (Hattie, 2009; Gigerenzer, 2015; Kirschner, 2017).

A teaching strategy that builds on spaced learning, repetition and cyclical training of the right way of thinking and decision making can ensure retention and secure the learning achievement. Training in groups can also lead to more active and metacognitive processing (Hattie, 2009; De Bruyckere, Kirschner, & Hulshof, 2015; Johnson, & Johnson, 1999). We used these principles in the instructional strategies and methods, as well as in the production of all the training materials (Figures 2, 3 and 4).

Content of the intervention

The training offered to an intervention group of about sixty agents consisted of two workshop sessions and an extensive training of four weeks, with one risk decision to be responded to by the participants every weekday. An intervention group and a control group both completed a pre-test and a post-test.

From the preliminary study, we compiled an inventory of problematic risk decisions as they occurred in the daily practice of the airport. Based on this inventory, we developed the training material: 50 test items around mini-cases, each with one realistic risk decision from daily practice. In addition, the biases from the generic process model were visualised in cartoons (Figure 2) and the ‘suspicious indicators’ were transformed into pictograms (Figure 3). We replaced the traditional form of knowledge transfer, written or spoken text, with visual transfer media containing the essential points (Figures 2, 3).

We transferred the eleven cognitive biases from the generic process model (Figure 1) into cartoons, which define the respective fallacies – symbolically, exemplarily and in an instantly recognisable manner, more directly than written text can do. Figure 2 is an example of this (Confirmation bias Type I). In risk analysis, safety and security professionals often look for confirmation of risks they are already aware of. The danger is that in focusing on a single risk profile they may miss the bigger picture. This is known as confirmation bias (Dobelli, 2011). It occurs in all kinds of profiling activities, from police surveillance to intelligence and security services. The cartoon does not present a theoretical concept; rather, the metaphor in the drawing should help professionals recognise the situation and apply the insight in their own practice. In the training session, we discussed the situation in the cartoon with the agents and asked them to apply it to their own daily practice: “What kind of risks in the bigger picture do my colleagues and I overlook, by focusing on common risks that are more likely to occur?”

Figure 2. Confirmation bias Type I in the analysis phase (MYRAAAB, 2016).

We replaced the signs of specific suspicious behaviour or situations at the airport with icons, which should provide a mental shortcut and help the agents recognise the situations. A visual stimulus can lead to a risk decision response via a shorter route (Gigerenzer, 2015).

Figure 3. Visualisation of indicators of deviant behaviour or dangerous intentions (MYRAAAB, 2016).

For the extensive training part with daily test items, we took photos of the security operation at the airport, along with the suspicious indicators: screenshots of the X-ray scanner and security scan, and photographs of hand luggage. For privacy reasons, we did not use any pictures of real passengers. Instead, students played the roles of passengers, and with photos from the public domain (Google) we were able to create a wide variety of ‘passengers’ as well as realistic test questions.

During the first workshop, conceptual knowledge was introduced in the form of cartoons of selected fallacies. No underlying theory was offered; instead, we drew attention to the fallacy embodied in each cartoon. The agents were then invited, in group assignments, to link the fallacies to their own practice and their own behaviour. The assignments focused on finding individual and collective answers to some key questions about awareness, responsibility, signalling, limiting conditions, professional reflection and teamwork. As a final group assignment in the second workshop, the agents were asked to optimise the airport security filter by redesigning it.

5 Design of the mixed methods study

Figure 4 shows the five phases of the study: preliminary study (1), pre-test (2), intensive training (3), extensive training (4) and post-test (5). The training interventions (3 and 4) were offered to the intervention group only; the tests (2 and 5) were performed by both the intervention and the control group. In our mixed methods study we also used triangulation, in order to obtain more balanced indications of the intervention effects (Creswell, 2013).

Figure 4. Phases of the study.

Interviews and context analysis

First we carried out a preliminary study (1), in the form of interviews, document analysis and observation of the security operation at one of the departure terminals of Amsterdam Airport Schiphol. In twenty-one iterative, semi-structured interview sessions with employees and managers of the airport security company, sampling all operational and management levels, we collected facts and opinions on working procedures, performance indicators, judgement and decision making, and workplace optimisation. The interviews (30–60 minutes each) were recorded and analysed with respect to selected topics: (reported) key factors for success and critical competences; professional attitude; specific tasks, judgements and decisions; personal and team biases. With these key topics, in combination with information on working routines and procedures from document analysis (e.g. manuals and in-company training materials), we selected the biases most likely to occur at the airport security check. We used this selection both for creating test items and for workshop materials. We also studied the procedures and work instructions, the key performance indicators and the service level agreement between the airport and the security company. At the end of the preliminary study, we identified four professional operations, which were confirmed by the respondents as critical tasks, covering the complete process of judgement and risk decision making by the security agents (Figure 5).

Figure 5. Critical professional operations of airport security agents

Randomised controlled experiment

The experiment (Figure 4) was designed as a randomised controlled trial, with intervention and control groups and pre- and post-tests. For this study we drew a random sample of 10 percent of all security agents of the security company (n=120). This sample of 120 employees was assigned randomly to an intervention group and a control group (n=60/60). Prior to the training intervention, we developed two similar tests with nine specific (part 1) and six general risk decision items (part 2), as pre- and post-test. Part 3 of the tests consisted of questions on self-constructs (self-esteem and self-efficacy) and background variables. The intervention group (IG) and control group (CG) both took the pre- and post-test. Only the intervention group was offered an intensive and an extensive training in between the two tests; the control group did not receive any specific treatment (Figure 4). The intensive first part of the training intervention was offered to the agents of the IG in groups of 10 to 15 people, in a standard training room at the airport. The agents took part in two workshops of three hours each, with a short interval of less than two weeks. All workshops were led by the same two trainers from Utrecht University of Applied Sciences. After the workshops, we offered an extensive training to the agents of the IG. They received one test item per day via email, on weekdays, for a period of four weeks. Test items were similar to the nine specific risk decisions of the pre- and post-tests. In total, the IG agents were asked to respond to 20 email items. We collected qualitative data from the interviews (Figure 4, step 1) and from the workshop sessions (Figure 4, step 3). During the sessions the members of the intervention group shared their professional opinions, both orally and in writing, both individually and in small groups.
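The exact allocation procedure is not described in more detail; as a minimal sketch, assuming a staff of roughly 1,200 agents (implied by the 10 percent sample of n=120) and hypothetical identifiers, the random sampling and 60/60 split could look as follows.

```python
# Minimal sketch (hypothetical identifiers, implied staff size): drawing a 10%
# random sample of agents and splitting it evenly into an intervention group
# (IG) and a control group (CG), as in the study design described above.
import random

all_agents = [f"agent_{i:04d}" for i in range(1200)]  # implied total staff

random.seed(42)                          # fixed seed for reproducibility
sample = random.sample(all_agents, 120)  # 10% random sample (n = 120)

intervention_group = sample[:60]         # IG, n = 60
control_group = sample[60:]              # CG, n = 60
```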

6 Findings

In this section, we present the quantitative and qualitative results of the experiment.

Quantitative results

Table 1 shows descriptive statistics of the research groups, as well as the results of a number of independent t-tests on differences between the control group and the intervention group. The (categories of) variables are:

  1. Background characteristics: gender (fraction of males), age (in years), function (fraction agents vs team leaders), country of birth (fraction Netherlands – NLD vs other);
  2. Education and work experience: education level (ascending levels of Dutch intermediate vocational education; fraction mbo2, mbo3, mbo4 vs others) and work experience (in years).

There were no significant differences between the two research groups; this was an expected result of the randomised allocation of participants to the intervention and control groups. The intervention group contained more men (64% vs 51%) and more agents who were native Dutch (i.e. born in The Netherlands: 75% vs 64%). Differences in average age (≈ 34 years) and position (≈ 75% agents vs ≈ 25% team leaders) were small. Professionals in the intervention group were higher educated and more experienced (+ .41 years on an average of ≈ 6.5 years of experience).
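As an illustration of the type of comparison reported in Table 1, the sketch below runs an independent-samples t-test on two hypothetical groups of 60 agents each; the data are placeholders, not the actual study data.

```python
# Illustrative sketch with hypothetical data: an independent-samples t-test
# of the kind used to compare intervention and control groups on background
# variables such as age.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical ages for two randomised groups of 60 agents (mean ~34 years).
age_ig = rng.normal(loc=34, scale=8, size=60)
age_cg = rng.normal(loc=34, scale=8, size=60)

t_stat, p_value = stats.ttest_ind(age_ig, age_cg)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
# A p-value above .10 would be reported as no significant difference (cf. Table 1).
```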

Table 1. Comparison of Intervention and Control Group: Background Characteristics, Education and Work Experience.

In Table 2 the two groups after the post-test are compared:

  1. Scores on the post-test: average number of correct answers to the specific risk decisions (airport security, 9 items) and general risk decisions (6 items), and the total of both categories (15 items).
  2. SEf is the indicator of professionals’ self-efficacy in the three categories (specific and general test items, and the total of both). Respondents estimated their own competence after completing the post-test. The fourth indicator of self-efficacy we used was a yes/no response to the question whether the agent felt she was making better risk decisions than she did two months earlier during the pre-test.
  3. Two indicators of self-esteem (SEst): self-positioning compared with colleagues (scale from low 1 to high 10), self-assessment on level of professional development (% 0–100).

Post-test scores on the risk decisions test did not show any significant differences between the research groups. Control group members had a higher score on specific airport risk decisions and on the total of the post-test; the intervention group members’ score was higher on general risk decisions. All agents had a higher percentage of correct answers on the specific airport risk decisions (≈ 7.5 of 9 = 83%) than on general risk decisions (≈ 2.8 of 6 = 47%). Significant differences occurred in the self-construct scores. In both groups, the self-efficacy scores substantially overestimated the test scores on general risk decisions; the overestimation by control group members was almost twice as high as that of the intervention group members. Self-efficacy on specific airport risk decisions was almost accurate in both groups. The self-esteem scores (position compared to colleagues on a scale of 1–10, and stage of professional development on a 0–100% scale) were higher in the control group. Intervention group members were about 15 months younger, higher educated and more experienced (all in Table 1), and had a more modest self-esteem (Table 2).

Table 2. Comparison of Intervention and Control Group based on Post-test Results.

Significance at the 10%, 5% and 1% levels (*p<.10, **p<.05, ***p<.01)

Extensive training 1-item test

After the workshop sessions, the intervention group (valid n = 35) underwent extensive training for four weeks, with a daily 1-item test on weekdays. In that period, each participant received twenty questions in total (Figure 4). Of the 700 items deployed in this way (35 agents × 20 items), we received 527 replies, of which 381 answers (72%) were correct. On average, every agent answered 15 of the 20 questions, of which 11 (73%) were correct. That score is somewhat higher than the average of 10 correct answers (68%) that the intervention group gave on the post-test. The 1-item tests consisted only of specific airport risk decisions.

Table 3. Effect of Intervention ‘Security Performance’ on Intervention Group.

Significance at the 10%, 5% and 1% levels (*p<.10, **p<.05, ***p<.01)

Table 3 shows the effect of the intervention on the intervention group: there was a significant difference of .949 points between the total test scores on the post-test (10.25) and the pre-test (9.30). The effect size, expressed as Cohen’s d, is .5. This measure indicates to what extent the training intervention has ‘made a difference’, by comparing the standardised means of the pre-test and post-test of the intervention group. An effect size of .5 is usually regarded as a small to medium effect of an intervention. According to Hattie (2009), .5 can be considered a medium to high effect for educational interventions in particular. Due to too many missing values in the pre-test data of the control group, a valid difference-in-differences analysis [(post-test minus pre-test of IG = .949) – (post-test minus pre-test of CG)] was not possible.
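For clarity, the two measures referred to above can be written out as follows; this is a sketch based on the reported means, in which the pooled standard deviation is not reported separately and is only implied by the reported effect size:

\[
d = \frac{M_{\text{post}} - M_{\text{pre}}}{SD_{\text{pooled}}} = \frac{10.25 - 9.30}{SD_{\text{pooled}}} \approx .5
\]
\[
\text{DiD} = \left(M^{IG}_{\text{post}} - M^{IG}_{\text{pre}}\right) - \left(M^{CG}_{\text{post}} - M^{CG}_{\text{pre}}\right)
\]

Here the first bracketed difference equals .949; the second could not be computed because of the missing pre-test values in the control group.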

Finally, the valid N in this study was smaller than expected. This was partly caused by operational issues: agents who were scheduled but, for various reasons, could not participate in the workshop sessions. No-shows were an issue in the pre- and post-tests too, as we saw in the extensive training with the 1-item tests. The operational planning of the security company is a very complicated process. Furthermore, the energy level and motivation of an agent varied, depending on whether the 3-hour training was attended at the start of the working day or at the end of a night shift. As far as the quantitative part is concerned, we can conclude that the statistical evidence might be less strong than intended in the experimental design of a randomised controlled trial. The fact that we conducted our study in the practice of a running airport business is certainly the main reason for that conclusion.

Qualitative analysis

In the workshop sessions, we collected information on how agents reflect on their daily tasks, what problems they experienced with risk decision making and what kind of solutions they would suggest. Agents wrote their opinions and solutions on big sheets of paper, which were collected and analysed afterwards. In Tables 4–7 we summarise the most important topics and the most frequent answers. Table 4 shows the most common answers to the question of what agents do to stay sharp at their job, and what they need from others.

Table 4. How to stay sharp at the job?
What did the agents do to stay sharp? | What did agents say they need from others?
Do not be distracted and think for yourself | You must be able to count on colleagues, with support, collegiality and involvement; both co-workers and executives
Take responsibility for a healthy lifestyle (rest, sleep, nutrition etc.) | Positive and also constructive feedback, given in a sympathetic way
Be honest with yourself and colleagues, if you actually know that you are not sharp and fit with circumstances | Good briefings and good planning / team layout / compliance with times / rest periods
Incorporate recreation moments and humour | Pleasant working environment

Reflection on the professionals’ responsibilities at work indicates what they feel responsible for. Table 5 shows a summary of individual and collective input from the workshops.

Table 5. Feeling responsible at the job.
What did agents feel responsible for? | What did agents not feel responsible for?
Safety of colleagues and passengers | Passenger flow
Intercepting prohibited items | Operation of equipment
For yourself: commitment, motivation, quality of your work, arriving on time, staying respectful etc. | Failure and mistakes of others outside the team

Table 6 shows what fallacies frequently occurred in daily practice, according to the participants of the workshops.

Table 6. Fallacies in daily practice.
What mistakes did the agents recognize in themselves and / or with their colleagues? | What typical examples did they share?
Focus on one item or subject (confirmation bias) | Focusing on a bottle with liquid from the X-ray scan; missing other suspicious contents of a bag
Making all types of assumptions (availability bias) | Colleagues from the security company do not need to be checked
Long-term work experience that can lead to automatism / routine leading to incorrect assessment / risk decision (overconfidence bias, authority bias) | Knowing for sure that families from a certain country are no risk at all
Being influenced by available information in the news and social media (availability bias) | After an attack abroad, focusing too much on the modus operandi of the perpetrators of that incident
Blaming others for missing a sample test, particularly by team leaders and executives (hindsight bias) | Commenting on a colleague missing a test sample at the X-ray scan: “How could you miss it? This is obvious to everyone!”

What professionals think actors at three levels (individual, team, company) could do to prevent fallacies and biased risk decision making is displayed in Table 7.

Table 7. Prevention of fallacies.
What could agents do themselves? | What could the team do? | What could the company do?
Take time to step back to assess the luggage / situation and see the complete picture | Give feedback, motivate and coach each other | Good communication and information
In case of doubt: check (again) | Talk to each other, give feedback, both positive and critical | Provide clear working procedures and refresh (keep them alive)
Accept help and ask for it if you are not sure | Actively point out risks and possible consequences to each other | Ensure good briefings and agreements on how information reaches everyone
Keep alert, curious and (self) critical | Continue training, motivating and keep remembering and refreshing good practice as a team |
Keep your background information updated (on attacks for example) | Give a colleague a break (after he or she missed a test sample, for example) |
Communicate clearly (with passengers, with colleagues) | |

7 Conclusion

The study aimed to find an answer to the question: How and to what extent can we train professionals, in order to help them make smarter (i.e. more rational, effective and efficient) risk decisions? The main short-term goal of the intervention was to achieve a higher level of risk competence. The quantitative results showed a positive effect (d = .5) of the training intervention on the intervention group, which can be considered a medium to high effect for an educational intervention (Hattie, 2009). This indicates that the complete intervention (workshops and tests) contributed to more awareness in the process of judgement and decision making, and to improved risk decisions by the security agents. The self-esteem and self-efficacy scores indicate that agents in the intervention group showed a more moderate, less biased self-evaluation. From the qualitative results, we conclude that agents were able to identify individual, collective and organisational points for improvement.

The intervention had a positive effect on the intervention group (Table 3; d = .5), although this effect could not be isolated by a difference-in-differences analysis. Even though there was no significant difference between the intervention and control groups, the scores on general and specific test items at the post-test (Table 2; ≈ 2.8 and ≈ 7.5) can be used by the security company as an indication of an agent’s average risk competence. For setting standard values, the tests may provide anchors, for example for use in the recruitment and selection of new agents in the future. However, they still need to be validated by replication.

The results provide points of reference for expecting that training such as this can be productive in various ways. Awareness can be considered the first step on the road to better performance, and it may be assumed that awareness has improved, as reflected in both the test scores and the self-esteem and self-efficacy scores. Even where the intervention group’s self-efficacy and self-esteem scores appear to have decreased after the intervention, this could still mean that the agents had become more conscious of their incompetence regarding biases.

The importance of good workplace conditions and encouraging leadership came up in all workshops, both in individual contributions and group discussions. This is an important perspective for the mid- and long-term success of a risk competence program. Where the process model (Figure 1) has proved to be adequate for evaluation of the risk management process itself, factors of organisational and managerial culture should be considered in a broader sense, since they may determine the circumstances under which judgement and risk decision making happens.

8 Discussion

Kahneman is not very optimistic about the possibility of improving people’s risk decision making competence. Gigerenzer, on the other hand, is convinced that educational strategies (such as visualisation and heuristics) are likely to lead to advanced risk competence (Bond, 2009; Kahneman, 2011; Gigerenzer, 2015).

In this study, the focus was on the design and testing of a training intervention for a specific professional setting. The scale of the experiment may be increased later, possibly in a modified setting. The quantitative results were modest. The effectiveness of the training in terms of statistical evidence and validity is not easy to demonstrate. Effectiveness could also have been affected because the intervention was dependent on the operational planning of the security company and the airport. Sickness absence was one of the factors that caused complications, alongside changes of position, function or shift among participants. In addition, there may have been variables outside our model and design, which therefore were not taken into account. As with many educational interventions, a Hawthorne effect could have occurred: participants in an experiment respond differently because they are aware it is an experiment (De Bruyckere, Kirschner, & Hulshof, 2015). In this case, this could apply to the intervention group as well as to the control group. Furthermore, during the workshops we noticed that not all agents felt free to give their true opinion. Whether they were right or not, some agents expressed fear that their contribution to the sessions would be taken into account by the management in one way or another.

A program of risk competence would benefit from improved internal communication. This starts with announcing the program and its background, and ends with communicating the achievements. Many agents did not know what training they were being sent to, with consequences for their attitude at the start of the training intervention. Explaining the rationale and the meaning of a training intervention to participants before it starts will make a big difference. It is also recommended to share the results of the experiment with the participants and the works council.

Cartoons and illustrations by MYRAAAB: Myra Beckers (myraaa.com) and Ab Bertholet.

Authors

Ab Bertholet, M.Sc., Lecturer, Researcher, Utrecht University of Applied Sciences (HU); Eindhoven University of Technology (TU/e), The Netherlands
Nienke Nieveen, PhD, Associate Professor, Netherlands Institute for Curriculum Development (SLO); Eindhoven University of Technology (TU/e), The Netherlands
Birgit Pepin, PhD, Professor of Mathematics/STEM Education, Eindhoven University of Technology (TU/e), The Netherlands


Ariely, D. (2008). Predictably Irrational. The hidden forces that Shape Our Decisions. New York: HarperCollins.

Bandura, A. (1997). Self-efficacy: the exercise of control. New York: W.H. Freeman.

Bertholet, A.G.E.M. (2016a). Exploring Biased Risk Decisions and (Re)searching for an Educational Remedy. EAPRIL Conference Proceedings 2015, 488‒499.

Bertholet, A.G.E.M. (2016b). Risico-intelligentie en risicovaardigheid van professionals in het veiligheidsdomein. Problemen bij het nemen van risicobeslissingen door denkfouten en beperkte gecijferdheid. Ruimtelijke Veiligheid en Risicobeleid, 7 (22), 60‒74. [Dutch]

Bertholet, A.G.E.M. (2017). Risk Management and Biased Risk Decision Making by Professionals. Educational video: https://youtu.be/4rWPppdJ3YQ.

Bond, M. (2009). Risk School. Nature 461 (29), 1189‒1191.

Bruyckere, P. de, Kirschner, P.A., & Hulshof, C. (2015). Urban Myths about Learning and Education. London: Academic Press.

Creswell, J. W. (2013). Research design: Qualitative, quantitative, and mixed methods approaches. Sage Publications, Incorporated.

Dobelli, R. (2011). Die Kunst des klaren Denkens. 52 Denkfehler, die Sie besser anderen überlassen. München: Carl Hanser Verlag. [German]

Dobelli, R. (2012). Die Kunst des klugen Handelns. 52 Irrwege, die Sie besser anderen überlassen. München: Carl Hanser Verlag. [German]

European Commission (N.d.). The European Qualifications Framework for Lifelong Learning. Brussels: European Commission.

Evans, D. (2012). Risk Intelligence: How to Live with Uncertainty. New York: Free Press.

Gigerenzer, G. (2003). Reckoning with Risk: Learning to Live with Uncertainty. London: Penguin Books Ltd.

Gigerenzer, G. (2015). Risk Savvy: How to make good decisions. London: Penguin Publishing Group.

Gigerenzer, G., & Martignon, M. (2015). Risikokompetenz in der Schule lernen. Lernen und Lernstörungen, 4 (2), 91‒98. [German]

Hattie, J.A.C. (2009). Visible Learning. A synthesis of over 800 meta-analyses relating to achievement. New York: Routledge.

Johnson, D.W., & Johnson, R.T. (1999). Making cooperative learning work. Theory into Practice, 38(2), 67‒73.

Johnson, D.W., & Johnson, R.T. (2009). An Educational Psychology Success Story: Social Interdependence Theory and Cooperative Learning. Educational Researcher, 38(5), 365‒379.

Judge, T.A., & Bono, J.E. (2001). Relationship of Core Self-Evaluations Traits—Self-Esteem, Generalized Self-Efficacy, Locus of Control, and Emotional Stability—With Job Satisfaction and Job Performance: A Meta-Analysis. Journal of Applied Psychology, 86(1), 80‒92.

Kahneman, D., & Tversky, A. (1979). Prospect Theory: An Analysis of Decision under Risk. Econometrica, 47(2), 263‒291.

Kahneman, D. (2011). Thinking, fast and slow. New York: Farrar, Straus and Giroux.

Mulder, P. (2000). Competentieontwikkeling in bedrijf en onderwijs; achtergronden en verantwoording. Wageningen: Wageningen University. [Dutch]

Paulos, J.A. (1988). Innumeracy: Mathematical Illiteracy and its Consequences. New York: Hill & Wang.

Ryan, R.M., & Deci, E.L. (2009). Promoting Self-Determined School Engagement: Motivation, Learning, and Well-Being. In K.R. Wentzel & A. Wigfield (Eds.), Handbook of Motivation at School, (pp. 171‒195). New York: Routledge, Taylor & Francis Group.

Schön, D. (1983). The Reflective Practitioner. How professionals think in action. New York: Basic Books.

Valcke, M. (2010). Onderwijskunde als ontwerpwetenschap. Gent: Academia Press. [Dutch]

Vygotsky, L.S. (1997). The collected Works of L.S. Vygotsky: Vol. 4: The history of the development of higher mental functions. New York: Springer US.

Zimmerman, B.J. (1989). Models of self-regulated learning and academic achievement. In B.J. Zimmerman & D.H. Schunk (Eds.). Self-regulated learning and academic achievement. Theory, research and practice (pp. 1‒25). New York: Springer.

 

Digital Solutions in Teacher Education enhance Wellbeing and Expertise

Authors: Essi Ryymin, Irma Kunnari and Alexandre Fonseca D’Andréa

Teacher education programme for Brazilian teachers

Häme University of Applied Sciences (HAMK) has coordinated The VET Teachers for the Future – Professional Development Programme for Brazilian teachers since 2014, together with its partner Tampere University of Applied Sciences (TAMK). The programme has been implemented three times so far. One training round lasts about 7–9 months and includes study sections in both Finnish and Brazilian learning environments. The programme is worth 30 credits.

Altogether 106 teachers have graduated from the programme so far, and thousands of students and teacher colleagues have contributed to regional development work in Brazil. The participants of the programme represent several disciplines and sciences, for example biotechnology, agricultural engineering, agronomy, computer science, chemistry, mathematics, linguistics, educational sciences and business administration. The teachers work in the Federal Institutes in Brazil, which are institutions for higher, basic and professional education specialized in offering vocational and technology education. The goal of the Federal Institutes is to respond to the social and economic demands of the region by using applied research to boost innovation and local development.

The goal of The VET Teachers for the Future programme is to encourage the participants to collaboratively rethink and design innovative education and learning environments in response to their ongoing regional and future challenges. The main contents of the programme include competency-based education with 21st century skills and cooperation between universities and the world of work. The teacher students create and implement an individual or a shared development project during the training. The projects cover a wide spectrum, from scientific research to high-tech and social innovations, for example new digital applications and games for education, school management models, new pedagogical practices and training programmes, as well as pedagogical models for preventing social exclusion.

Making professional development transparent by digital solutions

In the programme, digital solutions were consciously utilized in order to make teachers’ professional development visible, especially issues related to relatedness, social connections and networked expertise. For example, teachers were encouraged to solve educational challenges together and to share, and further develop, their thinking collaboratively and openly on different digital platforms. Hakkarainen and his team (2004) have developed a theoretical and methodological framework to examine networked expertise: higher-level competences that arise, in appropriate environments, from sustained collaborative efforts to solve problems and build knowledge together.

Many theorists have defined relatedness as a basic human need that is essential for wellbeing (Baumeister & Leary 1995; Deci & Ryan 2012), and others have suggested that having stable, satisfying relationships is a general resilience factor across the lifespan (Mikulincer 1998). The role of positive emotions in the formation of social bonds (Baumeister & Leary 1995) and in the creation of important skills and resources (Fredrickson 2001; Sheldon & King 2001) has been widely noted.

Creating wellbeing for members of the community can be understood as a learning process that enhances relatedness, competence and autonomy (Ryan & Deci 2001; Sheldon & King 2001; Hakkarainen, Palonen, Paavola & Lehtinen 2004; Seligman & Csikszentmihalyi 2000). These basic psychological needs are determinative with regard to optimal experience and wellbeing in daily life, including in educational environments. Creating wellbeing within a teacher education programme can thus be seen as an active, collaborative and situated process in which the relationship between individuals and their environment is constantly constructed and modified (Soini, Pyhältö & Pietarinen 2010).

The first study results reveal creative use of digital solutions

There is an ongoing study in which the Finnish and Brazilian programme partners try to capture optimal practices of teachers’ professional development, in terms of building relatedness, a feeling of competence, autonomy and networked expertise. A key question is also how digital solutions can be used in building wellbeing and networked expertise.

During the training programme, the group of teacher students from Brazil was personally interviewed. In addition, data from the use of different digital platforms and databases was gathered, e.g. from learning diaries (blogs), discussion forums and competence demonstrations in interactive applications. The transcripts and the digital data were analyzed qualitatively. The content analysis (Krippendorff 2004) aimed to identify the teachers’ professional development practices by using case analysis of each participant’s descriptions of key events promoting professional development during the education programme (Patton 1990, 376–377).

The first study results (Kunnari & Ryymin 2016; Ryymin, Kunnari, Joyce & Laurikainen 2016; Ryymin et al. 2015) reveal that practices such as building, caring for and respecting connections, creating positive interpretations and affordances together, and adapting practices to the perceived needs of the teachers had an impact on relationships that fostered the teacher students’ sense of relatedness, competence and autonomy. These relationships appeared to play an important role in creating successful social conditions for learning, wellbeing and pedagogical change. This can be seen as interpersonal flourishing, which is a core feature of quality living across cultures.

The preliminary findings also suggest that the teachers consciously constructed networked expertise and socio-psychological wellbeing by applying digital solutions creatively, and that this had a positive impact on their pedagogical practices. Creative, flexible and open use of digital solutions enhanced wellbeing, for example, by multiplying emotional, societal and cognitive support and by making peer support, positive feedback, reciprocal respect as well as cultural knowhow, knowledge, sensitivity and understanding transparent and accessible. Networked expertise evolved, e.g. through sharing connections and resources, consulting colleagues, linking people and solving relevant regional challenges together. The digital solutions seemed to facilitate this process effectively. The programme, as well as the applied research process, is ongoing, iterative and dynamic by nature, and more detailed findings and conclusions will be reflected on and discussed later in the process. It is also very important to analyze what the challenges and obstacles in teachers’ professional development and pedagogical change are, as well as what the qualities of successful international teacher education will be in the future.


Picture 1. The Graduation Seminar of the third The VET Teachers for the Future – Programme on 9th of December 2016 in Maceió, Brazil. HAMK Study Group together with their Tutor Teachers.

Authors

Essi Ryymin, Ph.D., Research and Development Manager, Principal Lecturer, Häme University of Applied Sciences, essi.ryymin(at)hamk.fi
Irma Kunnari, M.Ed., Principal Lecturer, Ph.D. Student, Häme University of Applied Sciences, irma.kunnari(at)hamk.fi
Alexandre Fonseca D’Andréa, Ph.D., Teacher of Basic, Technical and Technological Education, Federal Institute of Education, Science and Technology of Paraíba,
alexdandrea(at)gmail.com

Baumeister, R. F., & Leary, M. R. 1995. The Need to Belong: Desire for Interpersonal Attachments as a Fundamental Human Motivation. Psychological Bulletin, 117(3), 497-529.

Deci, E. L., & Ryan, R. M. 2012. Motivation, personality, and development within embedded social contexts: An overview of self-determination theory. In R. M. Ryan (Ed.), Oxford handbook of human motivation (pp. 85-107). Oxford, UK: Oxford University Press.

Fredrickson, B.L. 2001. The Role of Positive Emotions in Positive Psychology: The Broaden-and-Build Theory of Positive Emotions. American Psychologist, 56(3), 218-226.

Hakkarainen, K., Palonen, T., Paavola, S. & Lehtinen, E. 2004. Communities of networked expertise: Professional and educational perspectives. Advances in Learning and Instruction Series. Amsterdam: Elsevier.

Krippendorff, K. 2004. Content Analysis: An Introduction to Its Methodology (2nd ed.). Thousand Oaks, CA: Sage

Kunnari, I. & Ryymin, E. 2016. Successful Teacher Development in the Digital Era – The Role of Wellbeing and Networked Expertise. Paper presented in EAPRIL (The European Association for Practitioner Research on Improving Learning) Conference, 3.-25.11.2016, Porto.

Mikulincer, M. 1998. Attachment working models and the sense of trust: An exploration of interaction goals and affect regulation. Journal of Personality and Social Psychology, 74, 1209-1224.

Patton, M.Q. 1990. Qualitative evaluation and research methods (2nd ed.). Newbury Park, CA.

Ryan, R.M. & Deci, E.L. 2001. On Happiness and Human Potential: A Review of Research on Hedonic and Eudaimonic Well-Being. Annual Review of Psychology, 52, 141-166.

Ryymin, E., Kunnari, I., Joyce, B. & Laurikainen, M. 2016. Networked Expertise Empowering Brazilian Teachers’ Professional Development and Pedagogical Change. International Journal for Cross-Disciplinary Subjects in Education, 7(2), 2755-2760. DOI: 10.20533/ijcdse.2042.6364.2016.0375

Ryymin, E., Corado, C., Joyce, B., Kokkomäki, J., Kunnari, I., Laurikainen, M., Lianda, R. & Viskari, M. 2015. Finnish-Brazilian Learning Process as an Experimental Path towards Pedagogical Change. Paper presented in NOLAN, The 8th Nordic Latin American Research Network Conference, 11.-13.6.2015, Helsinki.

Seligman, M.E.P., & Csikszentmihalyi M. 2000. Positive Psychology. An Introduction. American Psychologist, 55 (1), 5-14.

Sheldon, K.M., & King, L.A. 2001. Why positive psychology is necessary. American Psychologist, 56, 216-217.

Soini, T., Pyhältö, K. & Pietarinen, J. 2010. Pedagogical well-being: Reflecting learning and well-being in teachers’ work. Teaching and teachers: theory and practice, 16, 735–751.

Bridging the research-to-practice gap in education: the design principles of mode-2 research innovating teacher education

Introduction

Current changes in society place new demands on professionals’ ability to respond to new and changing circumstances quickly and adequately (Coonen, 2006; Hargreaves & Fullan, 2012; 2002; OCW/EZ, 2009). This implies the necessity of continuous development to improve professional performance throughout the entire career. This general professional demand has consequences for teacher education (Darling-Hammond & Foundation, 2008; Scheerens, 2010). To support this lifelong professional learning, the development of an inquiry-based attitude (hereinafter: IA) is specifically recommended as a goal in teacher education (e.g. Cochran-Smith & Lytle, 2009). In Dutch teacher education at both initial and post-initial level, it is assumed that IA will allow teachers to continuously create new knowledge of practice, with the aim of developing themselves as professionals and improving their school context (Onderwijsraad, 2014). To gain more understanding of IA as a developable goal in teacher education, Meijer, Geijsel, Kuijpers, Boei and Vrieling (2016) conducted a multiannual empirical study and refined IA from an ill-defined global concept into a concept with reliable and valid characteristics. Their results indicated that IA is a concept with two dimensions: an internal reflective dimension and an external knowledge-sourcing dimension. The internal dimension concerns intentional actions to acquire new professional modes of understanding and behaviour. The external dimension concerns intentional actions to gain new information and knowledge from relevant knowledge sources. Our goal in this study was to create knowledge to support teacher educators in their pedagogical approaches to stimulating their students’ IA. However, the transfer of results from educational research into educational practice has proven to be complex (e.g. Broekkamp & van Hout-Wolters, 2007; OCW, 2011). To help bridge this gap, practice-based scientific mode-2 research has been presented as a suitable research method (Martens, Kessels, De Laat, & Ros, 2012). The assumption in this method is that partnership between researchers and practitioners will contribute to creating meaningful, generalisable knowledge and to the transfer of this knowledge into practice. We therefore used this research design in our two-year follow-up study. In partnership with educators, we designed, tested and redesigned a professional development programme and we conducted a multiple case study. In this study (Meijer, Kuijpers, Boei, Vrieling, & Geijsel, in press) we gained insight into specific characteristics of professional development interventions that encourage teacher educators’ deep learning in stimulating the IA development of their students.

To our knowledge, there are few studies that provide specific insight into the design of practice-based scientific mode-2 research (hereinafter: mode-2 research) or into the actual impact of this methodology. To contribute to an understanding of how mode-2 research can help to bridge the gap between educational research and practice, this conceptual paper reflects on how the partnership between the researcher and five educators resulted in the creation of practice-based scientific knowledge and the professionalisation of teacher educators, and simultaneously contributed to innovating teacher education practice. With this reflection, we aim to contribute to the development of mode-2 research as promoted in a research manifesto on practice-based scientific research (Martens et al., 2012). The study we are reflecting on is summarised in Table 1 and Table 2.

In what follows we first describe mode-2 research as a relatively new mode in social science and the general scientific requirements and usability criteria our research had to meet. Secondly, we report on the researcher’s role, the recruitment of the practitioners and the organisation of the research meetings. Thirdly, we reflect from theoretical perspectives on how and why our approach affected the educators’ professional development and brought innovation to teaching practice. In conclusion, we present our working hypothesis on design principles in mode-2 research and discuss the complexity of its design and the demands researchers must meet in order to simultaneously monitor and facilitate the quality of the research process and the learning of the practitioners.

Table 1. Process display of the mode-2 study we are reflecting on

1. Mode-2 research

Traditional methods of knowledge production and dissemination are the subject of debate in social science. Current scientific knowledge production does not transfer to practice adequately, and opinions differ regarding the measures that should be taken to close the gap (Broekkamp & van Hout-Wolters, 2007). To bridge this gap, fundamental changes in the interaction between science and society have been suggested in the form of a new research mode (Nowotny, Scott, & Gibbons, 2001). Social science production in which socially robust knowledge is produced through social interventions in the context of application was labelled by Gibbons et al. (1994) as mode-2 research. Martens et al. (2012) promote this mode-2 research as an alternative to traditional educational research, in which randomised controlled trials still seem to be the gold standard, despite the fact that the complexity of educational research makes it impossible to control all variables (Cochran-Smith & Zeichner, 2010). Research based on randomised controlled trials aims to prove universal causal patterns in teaching and neglects the need for a stronger body of knowledge with practical, context-related relevance. This lack of knowledge with practical relevance is seen as one of the causes of the gap between science and practice. Hargreaves (1999) therefore even urged teachers to produce the knowledge they need themselves.

Martens et al. (2012) assume that research for which the questions are provided by practice – a partnership between researchers and practitioners – will contribute to creating meaningful, generalisable knowledge. From the perspective of learning, they argue that if practitioners participate in the knowledge creation process of a practice-based scientific educational research project in their own context, practically relevant knowledge will not only be created but the transfer of scientific knowledge into practice will also be supported. Bronkhorst, Meijer, Koster, Akkerman and Vermunt (2013) found that collaboration with educators enabled the researcher to benefit from their expertise, and that the researcher’s position as a learner and appreciation of the partnership impact educators’ engagement and ‘agency’ in the research. This means being an ‘agent’ and ‘owner’ instead of being an ‘instrument’, or in other words ‘a tool for the researcher’ (p. 93). They also found that, compared to other research designs, collaboration supported the experience of research as an integrated part of everyday practice, which is also one of the goals in teacher education (Onderwijsraad, 2014). Researchers’ support of practitioner agency is thus seen as important: the more agency, the greater the chance that a solution will be found for the problem being researched (Bolhuis, Kools, Joosten-ten Brinke, Mathijsen, & Krol, 2012; Cochran-Smith & Lytle, 2009), and this will, as stated before, support the transfer of knowledge into practice.

1.1. Scientific requirements

Creating socially robust and practice-based educational scientific knowledge under mode-2 conditions has to meet the same generally accepted scientific standards as any other scientific research (Martens et al., 2012; Ros et al., 2012). However, in mode-2 research the relevance of the knowledge created is rooted in the (educational) context in which the ‘problem’ occurred (Martens et al., 2012; Nowotny et al., 2001). A characteristic of this process of ‘local’ knowledge creation is the striving for external validity (i.e. generalisable insights) beyond the locus of knowledge production. Because practice-based research often works with small populations, an attempt must be made, fitting within this type of research, to maximise generalisability without affecting the usability of the knowledge for the context in which the research took place (Ros et al., 2012; Verschuren, 2009). Furthermore, mode-2 research must be carried out in accordance with the scientific criteria relating to internal validity, controllability, cumulativeness and ethical aspects. The research must also meet usability criteria with a view to practice (Martens et al., 2012; Ros et al., 2012). The usability criteria require that the results be accessible and understandable for the field of education, that the results be perceived as relevant and legitimate, and that the research provide practical guidance for improving educational practice.

1.2. Meeting scientific requirements in our study

In our two-year mode-2 research, we secured internal validity by conducting the study in the educational context in which the issue occurred. The study was executed in collaboration with an expert group of five teacher educators as co-researchers (Meijer et al., in press). The research process was characterised by iterative cycles of design, evaluation and redesign (McKenney & Reeves, 2013) and consisted of two phases: (1) a preparatory phase of designing, testing, evaluating and improving a theory-based professional development programme and (2) a main study phase in which the designed development programme was carried out. To build a strong partnership between the researcher and the participating practitioners, we followed Eri’s (2013) advice and involved them in constructing the design, and not only in testing it, with the aim of supporting the practitioners’ agency and ownership of the subject of the study.

To create generalisable knowledge, we conducted the research as a parallel multiple case study (Swanborn, 2010) in four different teacher training courses. Four fairly homogeneous groups of teacher educators on four different teacher training courses, at Bachelor and Master level, at a professional university in the Netherlands were followed. The study resulted in clarification of the active ingredients of the designed interventions that supported the targeted development. We found that aligned ‘self-study’ interventions at personal, peer and group level, guided by a trained facilitator, supported the intended learning (Meijer et al., in press). To be able to reflect on this research from the perspective of the partnership between the researcher and the teacher educators as co-researchers (hereinafter: expert group), we recorded and transcribed the research meetings with the expert group (see Table 2).

To meet the usability criteria, we described our process of scientific knowledge construction and the associated ethical aspects in a scientific publication and shared the results in the locus of the research. The way in which we further comply with the usability requirements is in fact reflected in the focus of this paper: we look at how our collaboration with practitioners in the role of co-researcher resulted in socially robust scientific knowledge which contributed to professional development and is being implemented in practice. It should be noted that this implementation took place outside the scope of the research because of the time the implementation process has taken; it is still underway two years after the completion of the research.

2. Partnership between researcher and teacher educators in our study

The collaboration between practitioners and researchers is argued to be a driving force in developing new practices and in educational change. To reflect on this assumption from our own research experience, we will first successively report on the researcher’s role, the recruitment of the practitioners and the research meetings between the researcher and the practitioners. Subsequently, in section 3, we will reflect on how the partnership between researcher and practitioners contributed to bridging the gap between science and practice. We reflect from the theoretical perspectives of transfer of learning and development, practitioners’ knowledge creation, and innovation and organisational learning.

2.1. Researcher

For mode-2 research it is important that the researcher(s) has coaching and consultancy skills in addition to research expertise and is able to find a balance between the relevance for the participating practitioners and the precision required in scientific research (Martens et al., 2012). The researcher in this study (i.e. the first author) conducted the research in her own professional context. She has extensive experience as a teacher educator, is a trained supervisor/coach and is also responsible for the design of the professional Master’s curriculum in the faculty where this research was conducted. This dialectic and simultaneous relationship between being a scholar and a practitioner is an increasingly common phenomenon in educational research (Cochran-Smith, 2005). Before starting, and while conducting, our research, the interwoven roles of the researcher were an explicit object of attention and reflection.

2.2. Recruiting the Practitioners

As pointed out above, besides creating practice-based scientific knowledge, the professional development of the collaborating practitioners is also one of the goals of mode-2 research. For this reason, we firstly based our research design on two preconditions in teacher-professionalisation, as reported by Van Veen, Zwart, Meirink and Verloop (2010): the subject of our study was in line with school policy and the participants were facilitated adequately by the management. Secondly, we decided to use the model of a professional learning community because this supports professional development (Lunenberg, Dengerink, & Korthagen, 2014; Van Veen et al., 2010), it supports innovation processes (Hargreaves & Fullan, 2012; Mourshed, Chijioke, & Barber, 2010) and it supports collaboration in designing, experimenting and re-designing (McKenney & Reeves, 2013; Van den Akker, Gravemeijer, McKenney, & Nieveen, 2006).

To recruit practitioners as co-designers and co-researchers in our research project, we organised a meeting with five experienced educators who were proposed by the management for practical reasons such as availability. We presented our research goal, basic design principles and the requirements that the participants had to meet. By being clear about our expectations of the participants’ qualities and commitment, we aimed to avoid drop-out on account of disappointment (e.g. Walk, Greenspan, Crossley, & Handy, 2015). First, we presented our research goal: designing and redesigning a professional development programme based on theory and on practitioners’ knowledge, and exploring which specific intervention characteristics support teacher educators’ professional development in stimulating students’ IA (Meijer et al., in press). We explained the importance of commitment to participating in a professional learning community during a two-year educational design research project within their own context. We also explained the importance of being an experienced teacher educator, since we needed expert knowledge in designing a professional development programme. Experience was also important given the plan that, in the second phase of the study, the participants themselves would offer the designed programme to colleagues; we therefore assumed that their credibility as teacher educators should be beyond doubt. Furthermore, we highlighted the importance of being motivated to contribute to generalisable and reliable practice-based scientific knowledge by questioning their own practice systematically, verifiably and accurately. They also had to enjoy designing and redesigning interventions with the aim of improving them. Finally, we explained that they had to demonstrate commitment to participating in all the research meetings planned over the two years. Collaborating on this planning was presented as the first step in our partnership.

This meeting resulted in the voluntary participation of all five experienced educators (8-18 years of experience; hereinafter: expert group), aged between 43 and 58 and all female. They were facilitated with 90 hours of extra ‘professional development’ time over the two years, in addition to the standard annual time.

2.3. Research meetings

Before reflecting on ‘our’ partnership, we will give a short chronological overview of the research meetings between the researcher and the expert group (see Table 2, Overview of research meetings). All meetings can be characterised as ‘reflective dialogues’ (Mezirow & Taylor, 2009) between the researcher and the practitioners. Based on the practitioners’ wishes, we aligned our planning with the rhythm of the educational year. This meant no meetings during the busiest periods and none at the start and end of the year. The period between meetings varied from two to three weeks.

Table 2. Overview of research meetings

3. Transfer of scientific knowledge into practice

To understand how collaboration with practitioners supported the transfer of scientific knowledge into practice, we first need to understand the underlying theories on the transfer of learning and professional development. Secondly, we need to comprehend the theories of practitioners’ knowledge creation and, thirdly, the theories of innovation and organisational learning. In the next sections, we will reflect – through the lenses of these theories – on our research journey, and illustrate our experiences with some vignettes.

3.1. Transfer of learning

The idea that the “changed and more experienced person is the major outcome of learning” (Jarvis, 2006, p. 132) expresses an important goal of mode-2 practice-based scientific educational research. In our research design, this learning concerned the development of the teacher educators who participated as co-researchers. Since researchers in mode-2 research have to guide the participants’ learning and the transfer of this learning into educational practice, we built our research design on learning theories in which the transfer of learning is a key concept.

Transfer of learning, and its underlying mechanisms, is still one of the most important educational research themes of the 21st century (e.g. Lobato, 2006). Thorndike (1906) introduced the concept of transfer and stated that the transfer of what is learned depends on the extent to which the new situation is the same as the original learning context. Thorndike conducted various empirical experiments and found that if an individual learns something in task A, it can be of benefit in task B if there are similarities between the two tasks. Although Thorndike’s view of transfer has been around for a century, later follow-up research showed that people can abstract things they have learned previously and subsequently apply this knowledge in contexts that are not obvious (e.g. Tomic & Kingma, 1988); however, transfer is stronger the more alike the contexts are. According to Piaget (1974), transfer occurs only if it can be demonstrated that what was learned had an effect on the cognitive structure (knowing more) and that this knowledge can be operationalised in new situations. Piaget refers to this form of transfer as accommodating, by which he means the capacity to adjust or transform familiar strategies when a problem cannot (or can no longer) be resolved using the available tools and familiar methods. If this succeeds, previously acquired knowledge and insight is demonstrably transformed to a higher level.

The theory of the transfer of knowledge to other contexts was further illuminated by Bransford and Schwartz (1999) in their AERA award-winning review of research into transfer. They described Thorndike’s original view on transfer as the ‘direct application theory of transfer’, which means that a person can apply previous learning directly to a new setting or problem. Based on their review, Bransford and Schwartz proposed an alternative view of transfer that broadens this traditional concept by “including an emphasis on people’s ‘preparation for future learning’” (p. 68). They explicated the implications of this view for educational practice and elaborated Broudy’s (1977) instructional procedures with the aim of supporting the ability to adapt existing knowledge, assumptions and beliefs to new situations. Bransford and Schwartz highlight that people “actively interact” with their environment to adapt to new situations: “if things don’t work, effective learners revise” (Bransford & Schwartz, 1999, p. 83) (see for example vignette 1). This so-called active transfer involves openness to others’ ideas and perspectives and seeking multiple viewpoints, which are also important characteristics of critical reflection.

Vignette 1: Effective learners revise if things don’t work

From the perspective of transfer, Illeris (2003, 2004, 2007, 2009) analyses leading theories of learning, differentiates four different learning types and looks at them in relation to their transfer capabilities: mechanical learning, assimilating, accommodating and transforming. Each learning type is activated in different contexts, aims for different learning outcomes and varies according to the amount of energy the learning requires. His learning theory rests on three dimensions and two inseparable processes. He differentiates the cognitive (content), emotional (motivation) and social (interaction) dimensions, as well as the internal acquisition process, in which new impulses are linked to earlier learning outcomes, and the external interaction process, which plays out between the learner, the teaching material and the social environment. According to Illeris (2014), professional learning that includes a change in practitioners’ work identity takes place at the level of transformative learning. This happens only when the learner experiences a change in their own mental models with a perceivable impact on attitude or behaviour. The individual then looks at reality differently and also acts differently than before (see for example vignette 2).

Vignette 2: Transformative learning

3.1.1. Supporting Practitioners’ Transformative Learning

To facilitate transformative learning, Greeno (2006) calls for a learning environment in which stimulating and organising broad, meaningful domain knowledge and autonomously founded actions are applied as two inseparable, transfer-promoting factors. In this context, Kessels (2001) and Kessels and Keursten (2002) call for a knowledge-productive learning environment in which no educational material is prescribed and, instead, research and reflection are the prime tools used to stimulate and facilitate meaningful learning. This is in line with the meta-review by Taylor (2007), which indicates that accumulating personal learning experiences in a unique context, about which there is critical reflection from various perspectives, is one of the most powerful tools in promoting transformative learning. This is a process of communicative learning in which ideas, convictions, values and feelings are identified, problematised, critically analysed and given consideration. It requires a setting in which the participants dare to give themselves over to uncertainty and a certain degree of ‘discomfort’ so that they can learn personally. It is about daring to mutually question personal ‘truths’ and being prepared to modify existing paradigms on the basis of new insights. The shape transformative learning takes in education is in part dependent on the lecturer’s personal ideas about learning theories, combined with an understanding of the reciprocal relationships between (life) experience, critical reflection, dialogue, holistic orientation, context understanding and authentic relationships (Mezirow & Taylor, 2009).

“Transformative learning is always a combination of unlearning and learning” (Bolhuis, 2009, p. 62). It is a radical process of falling down and getting back up again. According to Bolhuis, the unlearning element receives too little attention in research into, and the forming of theories about, learning. The guidance that is offered with regard to ‘unlearning’ is implicit and focuses on reconstructing mental models and experimenting with new behaviour that can respond to behaviour and context through repetition and reflection.

In summary, this means that if mode-2 practice-based scientific educational research wants to contribute to the professionalisation of teachers, the research design must be based on learning theories appropriate to the level of learning that is intended. In research into the professional beliefs and behaviour of the educator, a research setting in which transformative learning by the practitioners is facilitated is one of the design principles. This means creating a knowledge-productive research setting, one which encourages and facilitates shared interactive research and the (re-)development of practical knowledge, beliefs and behaviour from different perspectives, with the aim of contributing to creating a ‘changed and more experienced person’ (see for example vignette 2).

Looking back over our research, we can typify our design of the learning environment, in which the researcher and educators design and research together, as a learning environment in which learning at various levels can take place. The accent was on (1) having reflective dialogues dominated by obtaining conceptual clarity about key concepts and their significance for practical actions, and by research into personal beliefs and their impact on actions; (2) the design of a theory-based analysis tool that, over a number of cycles, we tested, reflected on, modified and tested again until we could work satisfactorily with it and were confident that the participants in the follow-up study could use it effectively; (3) the design of interventions at ‘individual, peer and group level’ (Meijer et al., in press) via cycles of testing, reflecting on what worked, why it worked and how it could be improved; and (4) the design of a coherent professional development programme based on the interventions, with the associated supporting materials and the basic premises for supporting the participants’ learning.

Because the practitioners investigated, together with the researcher, which interventions had an impact on their own development, as well as how and when, they created new knowledge about professional development. They also integrated conceptual scientific knowledge about the subject of the research, ‘stimulating the inquiry-based attitude’, into their own educational repertoire.

3.2. Supporting Practitioners’ knowledge productivity

Following on from European and American examples (e.g. Cochran-Smith & Lytle, 2009; Loughran, 2007; Pickering et al., 2007), in the Dutch educational context and in teacher training we are increasingly seeing practitioner research used as a professional learning strategy to support individual and organisational learning. The teachers do their own research in their own context, and the research itself is seen as an intervention (Bolhuis et al., 2012). According to Bolhuis et al., practice-focused research by professionals contributes to a more conscious consideration of the aims and effects of the work and promotes an approach in which professionals create practical knowledge and make more use of other people’s knowledge in their work.

The concept of practitioners’ knowledge productivity as a process in which new knowledge is created to contribute to innovation in the workplace was introduced by Kessels (1995, 2001). It refers to using relevant information to develop and improve products, processes and services. Supporting processes of practitioners’ knowledge creation requires expertise, such as “making tacit knowledge explicit, facilitating work and teambuilding, and supplying mentors and coaches with appropriate guidance abilities” (Kessels, 1998, p. 2). Knowledge productivity refers to ‘breakthrough learning’, which means that learners develop new approaches and are able to break with the past (Verdonschot, 2009). Both Kessels and Verdonschot regard innovation processes as social communicative processes in which participants work in collaboration, whereby the quality of the interaction is important and should provide access to each other’s knowledge and connect it (see for example vignette 3). Paavola, Lipponen and Hakkarainen (2004) introduced the knowledge creation metaphor as a learning metaphor that concentrates on mediated processes of knowledge creation. A learning model based on knowledge creation conceptualises “learning and knowledge advancement as collaborative processes for developing shared objects of activity […] toward developing […] knowledge” (p. 569).

Vignette 3: Social communicative knowledge creation

3.2.1. Collaborative learning

In the literature on collaborative learning, frequent reference is made to professional learning communities, group learning or learning from peers, which are seen as the most powerful drivers of educational innovation (Hargreaves & Fullan, 2012; Mourshed et al., 2010). The concept of a professional community is multidimensional in nature and can be unpacked as practitioners’ peer learning with the goal of developing a shared vision that provides a framework for shared decision making on meaningful practice questions (see for example vignette 4). The aim is to improve practice from the perspective of collective responsibility, in which both group and individual learning are promoted (Hord, 1997; Stoll, Bolam, McMahon, Wallace, & Thomas, 2006).

The positive impact of collaborative learning methods is convincingly present in the research literature. The meta-analysis by Pai, Sears and Maeda (2015) showed that, compared to individualistic learning methods, learning in small groups (2-5 participants) promotes students’ acquisition of knowledge and also has positive effects on the transfer of students’ learning experiences and outcomes into practice. From the perspective of cognitive load theory, which considers a collaborative learning group as an information processing system (Janssen, Kirschner, Erkens, Kirschner, & Paas, 2010), students working in a group outperform students working individually, because a group has more processing capacity than individual learners. Sharing the cognitive load increases the cognitive capacity to understand the learning objectives at a deeper level (Kirschner, Paas, & Kirschner, 2009).

Pai, Sears and Maeda (2015) found that positive interdependence between the group members, interpersonal skills and carefully structured interaction contributed effectively to collaborative learning achievements. There is also general agreement that reflective dialogue plays a key role in the interaction in collaborative learning (e.g. Fielding et al., 2005; Lomos, Hofman, & Bosker, 2011) and that critical friendship, with the emphasis on ‘friendship’ in the sense of equality, trust, openness and vulnerability (Schuck, Aubusson, & Buchanan, 2008), is a prerequisite for collaborative learning. Personal commitment, in the sense of learner engagement (see for example vignette 5), is indicated as another precondition for resolving complex practice-based problems and finding acceptable solutions (Bolhuis et al., 2012; Fielding et al., 2005).

In their exploration of the relation between teacher learning and collaboration in innovative teams, Meirink, Imants, Meijer and Verloop (2010) found that collaboration in teams that focused on both “sharing of ideas and experiences” and “sharing identifying and solving problems” contributed to a higher level of interdependence. Collegial interaction that can be typified as ‘joint work’ is indicated as interaction with the highest level of interdependence. This is in line with other findings from research into factors that influence the transfer of good practice (e.g. Fielding et al., 2005). In that study, the transfer of good practice is seen as ‘joint practice development’, which depends on relationships, institutional and teacher identity, having time and, most importantly, learner engagement. The importance of “the quality of relationships between those involved in the process” (p. 3) is highlighted because the transfer of practice is relatively intrusive and hard to achieve.

Vignette 4: Developing a shared vision

 

Vignette 5: Personal commitment and agency

In summary, this means that supporting practitioners’ knowledge productivity during mode-2 research requires a research design that incorporates theoretical ideas about collaborative workplace learning. Here, the practitioners use practice-focused research as a professional learning strategy and not just as a tool to create knowledge.

Looking back on the knowledge productivity of the educators in our research design, we see strong parallels with, for example, the practitioner research self-study method (Loughran, 2007; Lunenberg, Zwart, & Korthagen, 2010). The aim of our research is very close to the central goal of the self-study methodology: to uncover deeper understandings of the relationship between teaching and learning about teaching, with the aim of improving the alignment between intentions and actions in the practitioners’ teaching practice. Like the self-study approach, our research design strongly appeals to individuals’ scholarly notions and qualities, where the systematic collation and analysis of personal data in a personal context supports a deeper personal professional understanding that can be shared with colleagues. However, where we differ explicitly from the self-study approach is that our research design centred on ‘collective’ learning in multiple settings, with the aim of creating a collective deeper understanding and generalisable scientific knowledge, and implementing this new knowledge in the practice of teacher educators. The importance of well-guided collaborative knowledge creation in small peer groups was thereby emphasised by the expert group. The expert group highlighted the importance of flexible research guidance that is aligned with the ‘reality of the daily working context’ as a precondition for staying motivated to participate in this research project (see for example vignette 6).

Vignette 6: Flexible guidance

3.3. Innovation in education

As well as professional development, mode-2 research also aims for innovation in the professional context. It is therefore relevant to understand the relationship between individual and collective organisational learning (Argyris, 2002; Senge, Cambron-McCabe, Lucas, Smith, & Dutton, 2012). Innovation in education programmes is a complex, broad concept and concerns multiple relations and dimensions within multiple programme components. For a definition of what we understand by innovation in education, we use Waslander’s (2007) description in her review of scientific research on sustained innovation in secondary education. To her, an innovation is a set of activities which together comprise a concept or an idea which, if implemented, improves practice. An innovation is something ‘new’ that has added value for the future. Further, there is only an innovation if this ‘newness’ manifests itself in people’s behaviour and is embedded in their day-to-day routines.

Innovations at the organisational level always relate to the relationship between individual and collective learning, and successfully triggering collective learning is a first step towards innovating. The research by Peck, Gallucci, Sloan and Lippincott (2009) into teacher education practices shows that problems related to individual practice (raised by new policies) are often the trigger for faculty (collective) learning. Even when collective learning delivers well-designed interventions and knowledge, this is no guarantee of successful implementation at the level of the organisation (Verdonschot, 2009). Based on her meta-analysis of innovation practices, Verdonschot established that the skills and ambition of the individual implementing the intervention influence its success. In addition, the new knowledge that is to be integrated must be well-timed, relevant and appropriate (Eraut, 2004, 2007; Peck et al., 2009). If the knowledge was not acquired in a personal context, but through formal learning such as schooling, it often has to be transformed to the personal situation because the new knowledge does not fit the actual situation in which it is required. Integrating the new knowledge therefore requires practitioners’ metacognitive skills in transforming knowledge and skills to their personal situation.

3.3.1. Supporting innovation in education

In supporting professional learning that is focused on innovating, it is essential to facilitate the generation of new reality constructions (Homan, 2005). Generating new reality constructions is central to the theory on organisational learning in the well-known work by Argyris and Schön (1978) and is aligned with the previously discussed theory on transfer of learning. Argyris (1992, 2002) differentiates between single-loop learning and double-loop learning. With single-loop learning, a lot is learned but nothing is learned about how to learn better; it is generally about solutions that are more of the same. Single-loop learning will therefore not contribute to innovations, because it concerns only correcting errors without altering underlying governing values. To resolve complex problems for which new solutions are needed, double-loop learning is required. This means calling on the ability to think the problem through fundamentally and to learn from this through critical reflection. Argyris stated that to change organisational routines successfully, organisational and individual double-loop learning processes should both be encouraged. In his opinion, it is impossible to change organisational routines without changing individual routines, and vice versa. Senge, Cambron-McCabe, Lucas, Smith and Dutton (2012) talk in this context about fundamental changes in mental models, systems and interactions as a prerequisite for redesigning and changing the current situation. To support double-loop learning, Argyris calls for an increase in people’s capacity “to confront their ideas, to create a window into their minds, and to face their hidden assumptions, biases, and fears by acting in these ways toward other people” (2002, p. 217). He highlights the importance of encouraging self-reflection and of advocating personal principles, values and beliefs in a way that invites inquiry into them. This is in line with Eraut’s research (2004, 2007), in which he emphasises the critical importance of support and feedback in enhancing organisational learning, especially within a working context of good relationships and supportive managers. In addition, opportunities for working alongside others or in groups, where it is possible to learn from one another, are important.

In summary, this means that if mode-2 practice-based scientific educational research wants to help innovate the educational context, more is needed than stimulating practitioners’ double-loop learning during joint design and research. Encouraging transfer between individual and collective learning, and securing its implementation in the professional context, requires a research design based on innovation theories that guide the monitoring of this complex form of learning.

Looking back over our research, we have experienced that the transfer of personal learning into organisational learning and innovation is highly complex and time-consuming. In our opinion, a well-designed implementation plan, guided by principles from theories on organisational learning and innovation, is needed prior to the start of the research. In our view, this plan must include management support and implementation facilities to ensure that the implementation does not come to a halt when the researcher leaves.
In the study we are reflecting on, the researcher had a management position in two of the four participating educational settings and was able to influence the organisational policy concerning educating teachers and the demands the educators have to meet. In these two settings, our mode-2 research resulted in a successful transfer of scientific knowledge into our practice policy (see for example vignette 7).

Vignette 7: Transfer of scientific knowledge into organisational policy

In the other two settings, our research design was only successful from the perspectives of knowledge creation and professional development. Once the (co-)researcher had left, further implementation came to a halt. Our explanation is that having an implementation plan that is supported by the management (e.g. Eraut, 2004, 2007; Van Veen et al., 2010) is a prerequisite for implementing the innovation at the organisational level. We recommend that if the researcher is not to execute the implementation plan personally, this should be done by an engaged practitioner who, in line with Verdonschot’s research (2009), has the courage, ambition and mandate to make the implementation a success. Looking back on our innovation, we can see that, like many other innovations, it was triggered by new policy (Peck et al., 2009). This policy concerns the ambition of the Dutch Education Council (Onderwijsraad, 2014) to promote the development of an inquiry-based attitude on the part of teachers.

4. Working hypothesis concerning design principles in mode-2 research

This conceptual paper is a reflection on our previous two-year mode-2 research journey (Meijer et al., in press), in which the partnership between researcher and practitioners successfully contributed to bridging the research-to-practice gap in education. That research concerned a multiple case study in which we worked with five experienced educators to design, test and explore a professional development programme. Our reflection shows that the partnership in our research helped to create socially robust scientific knowledge and that this collaboration contributed to the transfer of the knowledge created into the practice in which the research was conducted. The new knowledge was not just integrated into the practitioners’ actions; in two of the four settings where the research was conducted, it was also translated into internal policy documents. These policy documents are decisive in ensuring curriculum innovation, and thus the required educational behaviour, in the setting in which the researcher works.
Our contribution to shaping the theory regarding the design of mode-2 research comprises, firstly, the finding that a partnership between the researcher and practitioners in creating practice-based scientific knowledge succeeds in closing the gap between theory and practice if the research design includes objectives and a theoretically based approach for practitioners’ knowledge creation, practitioners’ development, and the proposed organisational learning and innovation. Secondly, our reflection on the partnership with practitioners from various theoretical perspectives resulted in concrete design principles, preconditions and recommendations for supporting and guiding practitioners during mode-2 research. We have set these out in Table 3, and they can be seen as a working hypothesis for designing and guiding this kind of research. The allocation to categories is not strict, because some of the recommendations apply within multiple categories.

Table 3: Design principles of mode-2 research

To summarise: in this conceptual paper we have reflected, from the theoretical perspectives of transfer of learning, professional development, practitioners’ knowledge creation, and innovation and organisational learning, on how partnership with practitioners can help in bridging the gap between theory and practice.

Our reflections have highlighted the importance of having three interwoven research designs in mode-2 research: (1) one design concerning the scientific knowledge creation process based on practitioners’ knowledge creation; (2) one design concerning the support of practitioners’ learning in knowledge creation, professional learning and knowledge transfer; and (3) one design that guarantees implementation in practitioners’ practice at the organisational level. To gain a deeper scientific understanding of the critical design variables in mode-2 research that at the same time help to create scientific practice-based knowledge, professionalise practitioners and ensure innovation, we recommend that mode-2 researchers write conceptual papers from the perspective of these three interwoven designs, to allow further meta-analysis to be carried out in the future. We also advise further investigation into the qualities a mode-2 researcher must demonstrate as a facilitator of professional development and innovation. Researchers can use the design principles we have proposed as a working hypothesis for designing and guiding their own mode-2 research. Follow-up research into these design principles can support a deeper understanding of how mode-2 research in education can bridge the gap between theory and practice.

Corresponding Author

Marie-Jeanne Meijer, PhD student, Curriculum Director at Windesheim University of Applied Sciences, Movement & Education, The Netherlands, mj.meijer(at)windesheim.nl

Author

Marinka Kuijpers, PhD, Professor at the Welten Institute, Open University, The Netherlands, marinka.kuijpers(at)ou.nl

Argyris, C. (1992). On Organizational Learning. Oxford: Blackwell Publishers.

Argyris, C. (2002). Double-loop learning, teaching, and research. Academy of Management Learning & Education, 1(2), 206-218.

Argyris, C., & Schön, D. A. (1978). Organizational learning: A theory of action perspective (Vol. 173): Addison-Wesley Reading, MA.

Bolhuis, S. (2009). Leren en veranderen. Bussum: Coutinho.

Bolhuis, S., Kools, Q., Joosten-ten Brinke, D., Mathijsen, I., & Krol, K. (2012). Praktijkonderzoek als professionele leerstrategie in onderwijs en opleiding.

Bransford, J. D., & Schwartz, D. L. (1999). Rethinking transfer: A simple proposal with multiple implications. Review of Research in Education, 24, 61-100.

Broekkamp, H., & van Hout-Wolters, B. (2007). The gap between educational research and practice: A literature review, symposium, and questionnaire. Educational Research and Evaluation, 13(3), 203-220.

Bronkhorst, L. H., Meijer, P. C., Koster, B., Akkerman, S. F., & Vermunt, J. D. (2013). Consequential research designs in research on teacher education. Teaching and Teacher Education, 33, 90-99.

Broudy, H. S. (1977). Types of knowledge and purposes of education. Schooling and the acquisition of knowledge, 1-17.

Cochran-Smith, M. (2005). Teacher educators as researchers: Multiple perspectives. Teaching and teacher education, 21(2), 219-225.

Cochran-Smith, M., & Lytle, S. L. (2009). Inquiry as stance: Practitioner research for the next generation. New York: Teachers College Press.

Cochran-Smith, M., & Zeichner, K. (2010). Studying teacher education: The report of the AERA panel on research and teacher education: Routledge.

Coonen, H. (2006). De leraar in de kennissamenleving: beschouwingen over een nieuwe professionele identiteit van de leraar, de innovatie van de lerarenopleiding en het management van de onderwijsvernieuwing: Garant.

Darling-Hammond, L., & Foundation, G. L. E. (2008). Powerful learning: What we know about teaching for understanding: Jossey-Bass San Francisco.

Eraut, M. (2004). Informal learning in the workplace. Studies in Continuing Education, 26(2), 247-273.

Eraut, M. (2007). Learning from other people in the workplace. Oxford Review of Education, 33(4), 403-422.

Eri, T. (2013). The best way to conduct intervention research: methodological considerations. Quality & Quantity, 47(5), 2459-2472.

Fielding, M., Bragg, S., Craig, J., Cunningham, I., Eraut, M., Gillinson, S., . . . Thorp, J. (2005). Factors influencing the transfer of good practice.

Gibbons, M., Limoges, C., Nowotny, H., Schwartzman, S., Scott, P., & Trow, M. (1994). The new production of knowledge: The dynamics of science and research in contemporary societies: Sage.

Greeno, J. G. (2006). Authoritative, Accountable Positioning and Connected, General Knowing: Progressive Themes in Understanding Transfer. The Journal of the Learning Sciences, 15(4), 537-547.

Hargreaves, A., & Fullan, M. (2012). Professional capital: Transforming teaching in every school: Teachers College Press.

Hargreaves, D. H. (1999). The knowledge‐creating school. British journal of educational studies, 47(2), 122-144.

Homan, T. H. (2005). Organisatiedynamica: Theorie en praktijk van organisatieverandering.

Hord, S. M. (1997). Professional learning communities: Communities of continuous inquiry and improvement.

Illeris, K. (2003). Towards a contemporary and comprehensive theory of learning. International Journal of Lifelong Education, 22(4), 396-406.

Illeris, K. (2004). The Three Dimensions of Learning: Roskilde University Press.

Illeris, K. (2007). How we learn: Learning and non-learning in school and beyond (2nd rev. English ed.). London: Routledge.

Illeris, K. (2009). Transfer of learning in the learning society: How can the barriers between different learning spaces be surmounted, and how can the gap between learning inside and outside schools be bridged? International Journal of Lifelong Education, 28, 137-148.

Illeris, K. (2014). Transformative learning and identity. New York: Routledge.

Janssen, J., Kirschner, F., Erkens, G., Kirschner, P. A., & Paas, F. (2010). Making the black box of collaborative learning transparent: Combining process-oriented and cognitive load approaches. Educational psychology review, 22(2), 139-154.

Kessels, J. (2001). Learning in organisations: a corporate curriculum for the knowledge economy. Futures, Vol. 33, 497-506.

Kessels, J., & Keursten, P. (2002). Creating a Knowledge Productive Work Environment. Lifelong Learning in Europe, 7(2), 104-112.

Kessels, J. W. M. (1995). Opleiden in arbeidsorganisaties. Het ambivalente perspectief van de kennisproductiviteit [Training in organizations: The ambivalent perspective of knowledge productivity]. Comenius, 15(2), 179-193.

Kessels, J. W. M. (1998). The development of a corporate curriculum: the knowledge game. Gaming/Simulation for Policy Development and Organizational Change, 29(4), 261-267.

Kessels, J. W. M. (2001). Verleiden tot kennisproductiviteit [Tempting towards knowledge productivity]. Inaugural lecture. Enschede.

Kirschner, F., Paas, F., & Kirschner, P. A. (2009). A cognitive load approach to collaborative learning: United brains for complex tasks. Educational Psychology Review, 21(1), 31-42.

Lobato, J. (2006). Alternative Perspectives on the Transfer of Learning: History, Issues, and Challenges for Future Research. Journal of the Learning Sciences, 15(4), 431-449.

Lomos, C., Hofman, R. H., & Bosker, R. J. (2011). Professional communities and student achievement–a meta-analysis. School Effectiveness and School Improvement, 22(2), 121-148.

Loughran, J. (2007). Researching teacher education practices responding to the challenges, demands, and expectations of self-study. Journal of teacher education, 58(1), 12-20.

Lunenberg, M., Dengerink, J., & Korthagen, F. (2014). The Professional Teacher Educator. Roles, Behaviour, and Professional Development of Teacher Educators (Vol. 13). Rotterdam, Boston, Taipei: Sense Publishers.

Lunenberg, M., Zwart, R., & Korthagen, F. (2010). Critical issues in supporting self-study. Teaching and Teacher Education, 26(6), 1280-1289.

Martens, R., Kessels, J., De Laat, M., & Ros, A. (2012). Praktijkgericht wetenschappelijk onderzoek. Practice-based scientific research. Research manifest LOOK. Heerlen, the Netherlands: Open Universiteit Nederland.

McKenney, S., & Reeves, T. (2013). Conducting educational design research: Routledge.

Meijer, M., Geijsel, F., Kuijpers, M., Boei, F., & Vrieling, E. (2016). Exploring teachers’ inquiry-based attitude. Teaching in Higher Education, 21(1), 64-78.

Meijer, M., Kuijpers, M., Boei, F., Vrieling, E., & Geijsel, F. (in press). Professional Development of Teacher-Educators towards Transformative Learning. Professional Development in Education.

Meirink, J. A., Imants, J., Meijer, P. C., & Verloop, N. (2010). Teacher learning and collaboration in innovative teams. Cambridge Journal of Education, 40(2), 161-181.

Mezirow, J., & Taylor, E. W. (2009). Transformative learning in practice: Insights from community, workplace, and higher education. San Francisco: Jossey-Bass.

Mourshed, M., Chijioke, C., & Barber, M. (2010). How the world’s most improved school systems keep getting better. London: McKinsey & Company.

Nowotny, H., Scott, P., & Gibbons, M. (2001). Re-thinking science: Knowledge and the public in an age of uncertainty: SciELO Argentina.

OCW/EZ. (2009). Naar een robuuste kenniseconomie, Brief aan de Tweede Kamer. Den Haag.

Onderwijsraad. (2014). Meer innovatieve professionals. Den Haag.

OCW. (2011). Nationaal Plan Toekomst Onderwijswetenschappen. Den Haag: ministerie van OCW.

Paavola, S., Lipponen, L., & Hakkarainen, K. (2004). Models of innovative knowledge communities and three metaphors of learning. Review of educational research, 74(4), 557-576.

Pai, H.-H., Sears, D. A., & Maeda, Y. (2015). Effects of small-group learning on transfer: A meta-analysis. Educational Psychology Review, 27(1), 79-102.

Peck, C. A., Gallucci, C., Sloan, T., & Lippincott, A. (2009). Organizational learning and programme renewal in teacher education: A socio-cultural theory of learning, innovation and change. Educational Research Review, 4(1), 16-25.

Piaget, J. (1974). The construction of reality in the child. New York: Ballantine Books.

Pickering, J., Daly, C., Pachler, N., Gowing, E., Hardcastle, J., Johns-Shepherd, L., . . . Simon, S. (2007). New designs for teachers’ professional learning: Institute of Education, University of London London.

Ros, A., Amsing, M., Ter Beek, A., Beek, S., Hessing, R., Timmermans, R., & Vermeulen, M. (2012). Gebruik van onderwijsonderzoek door scholen. Onderzoek naar de invloed van praktijkgericht onderzoek op schoolontwikkeling. ’s-Hertogenbosch.

Scheerens, J. (Ed.) (2010). Teachers Professional Development, Europe in international comparison. Luxembourg: European Commission.

Schuck, S., Aubusson, P., & Buchanan, J. (2008). Enhancing teacher education practice through professional learning conversations. European journal of teacher education, 31(2), 215-227.

Senge, P. M., Cambron-McCabe, N., Lucas, T., Smith, B., & Dutton, J. (2012). Schools that learn (updated and revised): A fifth discipline fieldbook for educators, parents, and everyone who cares about education: Crown Business.

Stoll, L., Bolam, R., McMahon, A., Wallace, M., & Thomas, S. (2006). Professional learning communities: A review of the literature. Journal of Educational Change, 7(4), 221-258.

Swanborn, P. (2010). Case study research: What, why and how? SAGE Publications Limited.

Taylor, E. W. (2007). An update of transformative learning theory: A critical review of the empirical research (1999-2005). International Journal of Lifelong Education, 26(2), 173-191.

Thorndike, E. L. (1906). Principles of teaching. New York: Seiler.

Tomic, W., & Kingma, J. (1988). Accelerating Intelligence Development through Inductive reasoning Training. Advances in Cognition and Educational Practice, 5, 291-305.

Van den Akker, J., Gravemeijer, K., McKenney, S., & Nieveen, N. (2006). Educational design research: Routledge.

Van Veen, K., Zwart, R., Meirink, J., & Verloop, N. (2010). Professionele ontwikkeling van leraren. Retrieved from Leiden:

Verdonschot, S. G. M. (2009). Learning to innovate: a series of studies to explore and enable learning in innovation practices: University of Twente.

Verschuren, P. (2009). Praktijkgericht onderzoek. Ontwerp van organisatie-en beleidsonderzoek [Practice-oriented research. Design of organisation and policy research]. The Hague, the Netherlands: Boom Lemma.

Walk, M., Greenspan, I., Crossley, H., & Handy, F. (2015). Mind the Gap: Expectations and Experiences of Clients Utilizing Job‐Training Services in a Social Enterprise. Annals of Public and Cooperative Economics, 86(2), 221-244.

Waslander, S. (2007). Leren over innoveren. Overzichtsstudie van wetenschappelijk onderzoek naar duurzaam vernieuwen in het voortgezet onderwijs.