Thursday 26 November 2015

Critically Reflecting in Practice



Photo source: Google images

Introduction


In this blog post I consider what critical reflection in learning practice entails, how it can foster learning, why it is useful, and how it can be applied. Finally, I explore what it means in my own practice as a learning practitioner.


What is critical reflection?


Critical reflection enables a learner to analyse what has been learned and how that learning fosters self-development. It is in light of these two strands of analysis that critical reflection is given such importance in the professional development of teaching practice. Cranton (1996) defines critical reflection as the “process by which adult learners engage in and identify the assumptions governing their actions, locate the historical and cultural origins of the assumptions, question the meaning of the assumptions, and develop alternative ways of acting”. Brookfield (1995) adds that part of the critically reflective process is to challenge the dominant social, political, cultural, or professional ways of acting. Through the process of critical reflection, adult learners come to interpret and create new meaning and actions from their experiences, and are able to create a strategy for lifelong learning.


How can it be fostered in the classroom?


Brookfield (1988) identified four activities central to becoming critically reflective:

  • Assumption analysis: the activity a learner engages in to bring to awareness the beliefs, values, cultural practices, and social structures that regulate behaviour, and to assess their impact on daily activities. Assumptions structure our way of seeing reality, govern our behaviour, and describe how relationships should be ordered.
  • Contextual awareness: achieved when adult learners come to realise that their assumptions are socially and personally created in a specific historical and cultural context.
  • Imaginative speculation: an opportunity for learners to challenge prevailing ways of knowing and acting by imagining alternative ways of thinking about phenomena (Cranton, 1996).
  • Reflective skepticism: the outcome of the first three activities, the questioning of any universal truth claims or unexamined patterns of interaction.

In other words, critical reflection enables us to locate ourselves within a broader social context, to understand our values, beliefs, and biases, and to assess our learning so that it informs our practice.

Good reasons for incorporating reflection into your own practice


The value of reflecting on practice lies in recognising the risks of taking action on the basis of unexamined assumptions. Critical reflection is one particular aspect of the larger process of reflection, so to understand it properly we first need to know something about the reflective process in general. Brookfield (1995) highlights why critical reflection matters: it helps us take informed actions based on assumptions that have been carefully and critically investigated; it helps us develop a rationale for practice that grounds not only our actions but also our sense of who we are as educators in an examined reality; and it grounds us emotionally, because when we neglect to clarify and question our assumptions, and when we fail to research our students, we have the sense that the world is governed by chaos. It also increases democratic trust: students learn whether independence of thought is really valued, or whether everything depends on pleasing the teacher; they learn either that success depends on beating someone to the prize using whatever advantage they can, or that it depends on working collectively.

How can critical reflection be applied?


The African Universities Research Approaches (AURA) programme fosters shared learning across partner institutions. Participants on a learning programme are encouraged to reflect, ask questions and draw on concepts that can help them understand learning in their own practice. Participants are introduced to the experiential learning model developed by Kolb (1984) to identify, investigate, reflect on and report on a learning dimension of their work. The ORID questioning process, which is based on Kolb’s experiential learning cycle, allows participants to apply simple steps that support their thinking in a critically reflective manner. The steps outlined below can assist the reflective writing process.

  • (Analysing) How did the experience make you feel? “I feel…” 
  • (Evaluating) What did you conclude from this experience? “I concluded…” 
  • (Creating) What will you do differently now? “I will do… in the future”. 

Through describing a critical incident arising from the practice learning environment, the participant is able to make sense of what has been shared. During an AURA learning intervention, learners engage in reflective practice within group and facilitated online discussions before and after the face-to-face intervention, and are encouraged to continue reflective writing when away from the learning environment. Reflective writing enables learners to pursue critical reflection at a deeper level and to confront the challenge of explaining their research ideas. During the online and face-to-face interventions, reflection underpins their understanding of theory and course content and helps them link experience and knowledge.

Why is critical reflection important to an educator?


Critical reflection blends learning through experience with theoretical and technical learning to form new knowledge constructions and new behaviours or insights. Reflection is necessary to develop the skills to become a lifelong learner. Through my observer role I feel I am in a good position to develop a learning strategy from my viewpoint and identity as a learning practitioner. Through the AURA programme I am able to observe a case study of my own ‘critical incident’. This may not always be the case: if tackling a learning dimension in which I have little experience, more research would be needed, for example finding out about others’ experiences as learners and/or learning practitioners in relation to that learning dimension in similar contexts. My own approach as a learning practitioner is informed in part by experience (my own or others’) and in part by the relevant and prevailing theoretical and conceptual perspectives.


References:                                                    


Brookfield, S. 1995. What it Means to be a Critically Reflective Teacher. In Becoming a Critically Reflective Teacher. San Francisco: Jossey-Bass.

Brookfield, S.D. 1990. Using Critical Incidents to Explore Learners’ Assumptions. In J. Mezirow (Ed.), pp. 177–193.

Brookfield, S. 1988. Developing Critically Reflective Practitioners: A Rationale for Training Educators of Adults. In S. Brookfield (Ed.), Training Educators of Adults: The Theory and Practice of Graduate Adult Education. New York: Routledge.

Cranton, P. 1996. Professional Development as Transformative Learning: New Perspectives for Teachers of Adults. San Francisco: Jossey-Bass.

Kolb, D. A. 1984. Experiential Learning: Experience as the Source of Learning and Development. Englewood Cliffs, NJ: Prentice Hall. Retrieved from the AURA programme course material.

How to be critical when reflecting on your teaching (2015). Retrieved 23 November 2015, from http://www.open.edu/openlearn/education/learning-teach-becoming-reflective-practitioner/content-section-2.1

Learning through Reflection (2015). Retrieved 23 November 2015, from http://www.nwlink.com/~donclark/hrd/development/reflection.html

Myrtle Adams-Gardner is the Training Quality Coordinator representing the South for the African Universities’ Research Approaches programme. She is experienced in mentoring and coaching, pedagogies and assessment of learning. She has been involved in developing capacity development programmes that promote teaching and learning capabilities in Sub-Saharan Africa.

Thursday 19 November 2015

Challenges of monitoring and evaluating: attributing causality



Photo:  DarkoStojanovic, Pixabay
Creative Commons CC0 Public Domain License
In popular culture the laboratory is a place with largely negative connotations: the word conjures up images of beakers, test tubes and white coats. Yet, as a social scientist, I look upon it with a slight sense of envy, because the laboratory represents a degree of experimental control rarely possible in the social sciences. As a monitoring and evaluation specialist on the African Universities’ Research Approaches (AURA) programme, however, I study human behaviour in its natural surroundings. These natural surroundings happen to be diverse: the first year of the programme focuses on multiple stakeholders at several universities spread across Eastern Africa. This familiar terrain of bureaucracy and institutional politics ensures our focus is far removed from the controlled environment of the laboratory. My argument is that the further your research moves from the laboratory, the harder it becomes to exercise control, and therefore the harder it becomes to attribute causal relations between variables. This is a problem given the prevalence of linearity-assuming quantitative indicators in donor reporting, particularly in logframes. So, given this degree of complexity in programmes such as AURA, it’s sensible to adopt mixed-methods approaches. This is the first in a series of blog posts I will be writing to make this case, using examples from my experience as an evaluator on the AURA programme.


Measuring short-term impact and long-term value
AURA is a capacity development programme aimed at shaping behavioural change. This will be achieved, in part, through the AURA suite of capacity development courses. One of the challenges of the evaluation effort lies in showing that these courses have been effective. This is done at different levels for different attributes. So let’s first consider an example in which the objective is to measure improvements in the research skills of our workshop participants. Firstly, this requires a benchmark, something to compare progress against. So, before participating in any sort of workshop, the participant takes a combination of test and self-assessment questions that give us an idea of their current skill set. The scores on these questions are then compared to scores obtained from a similar exercise immediately after the workshop. The difference between them is what we report in our logframe; this is the increase in skills measured, and typically it is expressed as a percentage. How well this method works depends on how sophisticated the test questions are, and also on how well it’s supported by data obtained from other, typically qualitative, methods. At its best this builds a strong case for attributing an increase in skills to the intervention. But this snapshot covers only a short space of time: the immediate pre- and post-intervention periods. Yet the real value of the AURA programme will depend on how these skills develop over a longer time period and what’s done with them. This will determine the impact the programme has had; and, for good reason, this is increasingly what donors are focusing on.
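To make that logframe figure concrete, the sketch below shows the kind of pre/post calculation described above: each participant's change in assessed skills expressed as a percentage of their pre-workshop score. The scores, participant IDs and the 0-100 scale are invented for illustration; none of this is AURA data.

```python
# Illustrative sketch only: scores and participant IDs are invented.

def percentage_change(pre_score: float, post_score: float) -> float:
    """Change from pre- to post-workshop score, as a percentage of the pre score."""
    if pre_score == 0:
        raise ValueError("Pre-workshop score must be non-zero to express change as a percentage.")
    return (post_score - pre_score) / pre_score * 100

# Hypothetical combined test and self-assessment scores (out of 100),
# taken immediately before and immediately after a workshop.
participants = {
    "P01": (42, 61),
    "P02": (55, 70),
    "P03": (68, 75),
}

for pid, (pre, post) in participants.items():
    print(f"{pid}: {percentage_change(pre, post):+.1f}% change in assessed skills")
```

Averaging those per-participant changes would give a single percentage figure of the kind reported against a logframe indicator.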

Correlation does not equal causation
Let’s carry on with our example. Our participants, having completed their workshop, go back into their everyday lives at their respective institutions. The easiest, and perhaps least rewarding, evaluation exercise might be to repeat the same test given to them previously, but several months after the workshop, to determine whether they have sustained their skill set. However, we are now in less secure territory as far as causality goes. What made the first assessment so useful is the very thing that hinders this one: the lapse of time. Earlier we could more confidently attribute the increase in skills to our intervention, because the testing was done immediately before and after it. Now, however, several months have passed and each of our participants has done different things in that time; some of these activities might have advanced their skills further, while others might have hindered them. So even if we have another snapshot of a participant’s skill set several months down the line, how do we know whether any shifts are because of, or in spite of, our intervention? Even if we found an upward surge in our participants’ skill sets, how can attribution be safe given the possibility of so many other variables? Perhaps we could correlate the rise in skills with the amount of post-workshop interaction we have had with the participant. This is the point at which I hear the voice of my statistics tutor, who repeated over and over in his class that correlation does not equal causation. Anyone who has studied the social sciences has their favourite examples of two independent patterns that are strongly correlated, and my favourites come from military intelligence analyst Tyler Vigen, in his book Spurious Correlations. Here we learn that the per capita consumption of mozzarella cheese is strongly correlated with the number of doctorates awarded in civil engineering, and that the greater the consumption of sour cream, the more motorcycle riders are killed in non-collision transport accidents. It makes you wonder: are the patterns I’m finding between our interventions and a participant’s long-term progress just as spurious as these examples?
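The point is easy to demonstrate with synthetic numbers. In the sketch below, two series that merely drift upward over the same months come out strongly correlated, even though they are generated completely independently of one another; the variable names are invented and nothing here comes from the programme.

```python
# A minimal illustration of why a strong correlation is weak evidence on its own.
# Both series are generated independently; they are "related" only in that each
# happens to trend upward over the same twelve months.
import numpy as np

rng = np.random.default_rng(0)
months = np.arange(12)

# e.g. hours of post-workshop contact with participants...
contact_hours = 2.0 * months + rng.normal(0.0, 1.5, size=months.size)
# ...and an unrelated score that also happens to rise over the year.
skill_scores = 50.0 + 3.0 * months + rng.normal(0.0, 4.0, size=months.size)

r = np.corrcoef(contact_hours, skill_scores)[0, 1]
print(f"Pearson r = {r:.2f}  (high, despite no causal link in how the data were made)")
```

A coefficient like this says nothing about which variable, if either, is doing the causing; it only records that the two move together.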

Towards a mixed methods approach
This needs to be investigated further. The evaluator has a choice to make at this point: to move either towards or away from the laboratory. The move towards is taken by the econometrician, only this time it is a figurative laboratory, and instead of using actual instruments of control we use numerical values in a regression analysis. A regression analysis is a tool used to investigate the relationship between different variables, which in our case might be the relationship between high performance on one of our courses and, say, an increase in earnings. However, there are several other factors that might explain an increase in earnings independently of the AURA programme, such as the participant’s prior level of education, or income differentials between geographical locations and sectors of the economy. The aim of the regression equation is to home in on the relationship between the variables we are interested in while controlling for those we are not. This approach, however, relies on being able to ascribe numerical values to your control variables, and this is not always possible. To push our example further: what if the difference in earnings is due not to what we have hypothesised or tried to control for, but simply to the participant having the right family connections? It is less clear how we can place numerical values on things like patronage; knowing this and conceding the point allows us to move away from the laboratory and embrace a mixed-methods approach.
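A rough sketch of those mechanics is given below: an ordinary least squares regression of earnings on course performance, with two other factors entered as controls. The data are synthetic and the variable names (course score, years of prior education, an urban/rural indicator) are made up for illustration; this is not a model fitted to AURA data.

```python
# Sketch of a regression with control variables, using synthetic data.
# Variable names and effect sizes are invented for illustration.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 200

course_score = rng.uniform(40, 95, n)        # performance on a (hypothetical) course
prior_education = rng.integers(12, 22, n)    # years of schooling (control)
urban = rng.integers(0, 2, n)                # location indicator (control)

# Earnings here are built mostly from the controls and only weakly from the course,
# mimicking a situation where a naive comparison would overstate the course effect.
earnings = (500 + 1.5 * course_score + 80 * prior_education + 300 * urban
            + rng.normal(0, 150, n))

X = sm.add_constant(np.column_stack([course_score, prior_education, urban]))
result = sm.OLS(earnings, X).fit()
print(result.params)  # slope on course_score, holding education and location fixed
```

The unquantifiable factor in the example, family connections, is precisely what such an equation cannot hold fixed, and that is the gap qualitative methods are meant to fill.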


Jagdeep Shokar is Monitoring and Evaluation Advisor for the African Universities’ Research Approaches (AURA) programme. He is an M&E specialist in the evaluation of capacity development programmes promoting research and communications capabilities. He has been involved in the monitoring and evaluation of programmes across South Asia and Sub-Saharan Africa.