Wednesday 23 December 2015

"Many cooks may not spoil the broth but may enhance it": A multidisciplinary approach to education


Adopting and modifying the best practices of teaching and learning may be the key to fostering innovation in education. Borrowing innovative experiential and interactive teaching and learning methods from unrelated disciplines may be a novel strategy for ensuring that graduates acquire the competencies they require.

This novel approach was the theme of this year's Global Knowledge Exchange Network (GKEN) 5th Conference. The conference created a platform for professionals from a multitude of unrelated disciplines to learn from one another and to share best practices of teaching and learning that foster the development of competencies in graduates. Educators had the opportunity to learn from each other's disciplines in a way that facilitated the establishment of collaborative multidisciplinary networks and groups dedicated to enhancing teaching across unrelated disciplines.

As educators, we view this approach as an inevitable extension of the global trend of fostering collaborative multidisciplinary research across seemingly unrelated disciplines. As educators, we sometimes forget that good research comes from acquiring good research skills. Adopting the GKEN model will not only increase the number of educational methods available to us, but also provide the opportunity to develop better educational methods through interdisciplinary collaboration.

In our view, that is the future of educational innovation: ensuring we have the best teaching methods to impart the professional competencies required.

Professor Lwoga holds a PhD in information studies from the University of KwaZulu-Natal, South Africa. She teaches and supervises both undergraduate and postgraduate students, has facilitated a number of workshops and short courses, has published widely, and has presented over 30 research papers at both international and local conferences. Professor Lwoga currently coordinates the African Universities' Research Approaches (AURA) programme at Muhimbili University of Health and Allied Sciences (MUHAS), Tanzania, together with an additional four projects working with international partners in Sweden, South Africa and the USA.



Dr. Doreen Mloka is a medical microbiologist and molecular biologist. She is a Medical Education Fellow and the Director of Continuing Education and Professional Development at Muhimbili University of Health and Allied Sciences (MUHAS), Tanzania.



AURA: developing a framework for strengthening a research culture


The African Universities' Research Approaches (AURA) programme completed its first research capacity intervention earlier this year in Nairobi and Dar es Salaam, working with African universities to enable faculty to do more research and to develop the researchers of the future.


How the study of people's information behaviour informs the programme


The AURA programme's strategy draws on the study of people's information behaviour, in particular the assumption that information behaviour is intrinsically related to how we navigate, and at the same time co-construct, our experience. This shared experience, although open to interpretation, is reflected in our thoughts and is informed and communicated through our actions, including speech, writing, the creation and use of images, and an array of technologies. The 'experience' that is the focus of the AURA programme is our learning experience, in particular the learning experience associated with conducting an inquiry into a particular topic or problem, i.e. conducting research and becoming informed.

As a consequence, the project draws on theories of people's information behaviour that explore and document the factors that influence and drive information behaviour, some generic and some context specific. It also draws on the sub-domain of information literacy, which concentrates on the cognitive, emotional and behavioural capabilities relating to being informed: people identifying their information needs, and appreciating and effectively utilising the socio-technical environment that can enable them to become informed, or that can be used to manage and communicate their 'research', i.e. the product of 'finding out' and learning.


Drawing on current approaches within higher education and research

The AURA programme also draws on knowledge of current approaches to becoming informed, largely within higher education and research environments. These include an overview of the ontological and epistemological orientations and approaches that are taken: the broad ontological distinction between Cartesian and non-Cartesian positions, and broad epistemological viewpoints within the post-positivist and interpretivist paradigms, such as social constructivism, critical realism and phenomenology.

Furthermore, the distinction between citizen-led, highly participative forms of research and researcher-led forms (which are associated with distinct epistemological orientations) is highlighted.

The theory is, however, grounded in the practice of conducting research and the process of developing research questions that will attract funding from donors and research sponsors, lead to publication in respected outlets, and ensure that research is communicated effectively and has impact. Hence, mapping the researcher's network, identifying stakeholders (other academics, policy makers, publishers, international organisations, etc.), and mapping the knowledge, information and data landscape are seen as fundamental. The academic's research is therefore seen as integral to the wider socio-economic and technological context. Research thus has to be contextualised and justified in terms of its potential impact on society, and be clear about the academic contribution it makes. Successful research is likely to stem from this contextualisation process; without it, research is less likely to succeed or be seen to contribute.

Using pedagogical theory and practice to enable learners to engage with interventions


Another key component of the AURA programme is pedagogy, the theory and practice of enabling learning and education. Pedagogy is intrinsic to the programme in that it allows us to implement interventions that are based on strong pedagogic theory and that enable learners to engage with the subject, i.e. research and how to go about it. Although there are examples of excellent practice, in many cases the current teaching of research fails to engage learners, who benefit little from research methods courses, and such courses tend to be patchy in their coverage. In the past the teaching of research has tended to take a teacher-centred approach rather than a participative and experiential one. It has also taken on board recent developments only to a limited extent, for example the importance placed on participative research, the blurring between quantitative and qualitative data, and the increased emphasis on holistic/systemic approaches in addition to the more traditional analytical approaches. These changes reflect an increased appreciation of the complexity of many of the problems we face, which need to be tackled from a multi-disciplinary, mixed-methodological perspective.

Furthermore, drawing on current knowledge about pedagogy enables the modelling of methods and techniques that can be applied in the universities, and allows the learning stemming from AURA to be cascaded and institutionalised. It also enables current practices to be adopted, such as reflective practice, the use of social media, and blended learning, which draws on different modalities of delivery and learning.

Concluding reflections

Strengthening research capacity within an institution is, however, challenging. Organisational factors impinge on the project, including each institution's history, culture, goals, organisation and infrastructure. One key challenge is the large number of students and the shortage of staff. A host of different players operate in this context: faculty/researchers, students, the research office, the library, continuing professional development, the graduate school, deputy vice chancellors of research and teaching, and the ICT providers. Each plays a role and has an interest in certain aspects of the 'problem', i.e. strengthening the research culture and the capacity to do research. As a consequence, AURA has involved representatives from these groups in each African university. They form the core group of partners and co-developers.


 This blog post was originally shared by Professor Mark Hepworth on LinkedIn.

Dr Mark Hepworth is Professor in People’s Information Behaviour, at the Centre for Information Management at Loughborough University, Loughborough, UK. He champions access to information for underrepresented groups in society, is passionate about research philosophy and methodology and about strengthening people’s capacity to conduct research in educational, workplace and community based contexts. He specialises in participative, qualitative research.

Read more at Mark Hepworth's Blog and on Twitter: @kampalamark.


Thursday 26 November 2015

Critically Reflecting in Practice



Photo source: Google images

Introduction


In this blog post I consider what critical reflection in learning practice entails, how it can foster learning, why it is useful and how it can be applied. Finally, I explore its meaning in my own practice as a learning practitioner.


What is critical reflection?


Critical reflection enables a learner to analyse what has been learned and how learning fosters self-development. It is in light of these two aspects of analysis that importance is placed on critical reflection in the professional development of teaching practice. Cranton (1996) defines critical reflection as the “process by which adult learners engage in and identify the assumptions governing their actions, locate the historical and cultural origins of the assumptions, question the meaning of the assumptions, and develop alternative ways of acting”. Brookfield (1995) adds that part of the critically reflective process is to challenge the dominant social, political, cultural, or professional ways of acting. Through the process of critical reflection, adult learners come to interpret and create new meaning and actions from their experiences, and are able to create a learning strategy for life-long learning.


How can it be fostered in the classroom?


Brookfield (1988) identified four activities central to becoming critically reflective:

  • Assumption analysis describes the activity a learner engages in to bring to awareness the beliefs, values, cultural practices and social structures regulating behaviour, and to assess their impact on daily activities. Assumptions structure our way of seeing reality, govern our behaviour, and describe how relationships should be ordered. 
  • Contextual awareness is achieved when adult learners come to realise that their assumptions are socially and personally created in a specific historical and cultural context. 
  • Imaginative speculation provides an opportunity for learners to challenge prevailing ways of knowing and acting by imagining alternative ways of thinking about phenomena (Cranton, 1996). 
  • Reflective skepticism, the outcome of the first three activities, is the questioning of any universal truth claims or unexamined patterns of interaction. 

In other words, critical reflection enables us to locate ourselves within a broader social context, to understand our values, beliefs and biases, and to assess our learning so that it informs our practice.

Good reasons for incorporating reflection into your own practice


The value of reflecting on practice lies in recognising the danger of taking action on the basis of unexamined assumptions. Critical reflection is one particular aspect of the larger process of reflection, and to understand it properly we first need to know something about the reflective process in general. Brookfield (1995) highlights why critical reflection is important: it helps us take informed actions that are based on assumptions that have been carefully and critically investigated; it helps us develop a rationale for practice that grounds not only our actions but also our sense of who we are as educators in an examined reality; it grounds us emotionally, for when we neglect to clarify and question our assumptions, and when we fail to research our students, we have the sense that the world is governed by chaos; and it increases democratic trust, as learners discover whether independence of thought is really valued or whether everything depends on pleasing the teacher, and learn either that success depends on beating someone to the prize using whatever advantage they can, or on working collectively.

How can critical reflection be applied?


The African Universities' Research Approaches (AURA) programme fosters shared learning across partner institutions. Participants on a learning programme are encouraged to reflect, ask questions and draw on concepts that can help them understand learning in their own practice. Participants are introduced to the experiential learning model developed by Kolb (1984) to identify, investigate, reflect on and report on a learning dimension of their work. The ORID questioning process, which is based on Kolb's experiential learning cycle, allows participants to apply simple steps that support their thinking in a critically reflective manner. The steps outlined below can assist the reflective writing process.

  • (Analysing) How did the experience make you feel? “I feel…” 
  • (Evaluating) What did you conclude from this experience? “I concluded…” 
  • (Creating) What will you do differently now? “I will do… in the future.” 

Describing a critical incident arising from the practice learning environment enables the participant to make sense of what has been shared. During an AURA learning intervention, learners engage in reflective practice within group and facilitated online discussions before and after the face-to-face intervention, and are encouraged to continue reflective writing when away from the learning environment. Reflective writing enables learners to pursue critical reflection at a deeper level and to confront the challenge of explaining their research ideas. During the online and face-to-face interventions, reflection underpinned participants' understanding of theory and course content and helped them to link experience and knowledge.

Why is critical reflection important to an educator?


Critical reflection blends learning through experience with theoretical and technical learning to form new knowledge constructions and new behaviours or insights. Reflection is necessary to develop the skills to become a lifelong learner. Through my observer role I feel I am in a good position to develop a learning strategy from my viewpoint and identity as a learning practitioner. Through the AURA programme I am able to observe a case study of my own ‘critical incident’. This may not always be the case: if I were tackling a learning dimension in which I have little experience, more research would be needed, for example finding out about others' experience as learners and/or learning practitioners in relation to that learning dimension in similar contexts. My own approach as a learning practitioner is informed in part by experience (my own or others') and in part by the relevant and prevailing theoretical and conceptual perspectives.


References:                                                    


Brookfield, S. 1995. 'What it Means to be a Critically Reflective Teacher', in Becoming a Critically Reflective Teacher. San Francisco: Jossey-Bass.

Brookfield, S.D. 1990. 'Using Critical Incidents to Explore Learners' Assumptions', in J. Mezirow (Ed.), Fostering Critical Reflection in Adulthood, pp. 177-193. San Francisco: Jossey-Bass.

Brookfield, S. 1988. 'Developing Critically Reflective Practitioners: A Rationale for Training Educators of Adults', in S. Brookfield (Ed.), Training Educators of Adults: The Theory and Practice of Graduate Adult Education. New York: Routledge.

Cranton, P. 1996. Professional Development as Transformative Learning: New Perspectives for Teachers of Adults. San Francisco: Jossey-Bass.

Kolb, D.A. 1984. Experiential Learning: Experience as the Source of Learning and Development. Englewood Cliffs, NJ: Prentice Hall. Retrieved from the AURA programme course material.

How to be critical when reflecting on your teaching (2015). Retrieved 23 November 2015 from http://www.open.edu/openlearn/education/learning-teach-becoming-reflective-practitioner/content-section-2.1

Learning through Reflection (2015). Retrieved 23 November 2015 from http://www.nwlink.com/~donclark/hrd/development/reflection.html

Myrtle Adams-Gardner is the Training Quality Coordinator representing the South for the African Universities’ Research Approaches programme. She is experienced in mentoring and coaching, pedagogies and assessments of learning. She has been involved in the development of capacity development programmes promoting teaching and learning capabilities in Sub-Saharan Africa.

Thursday 19 November 2015

Challenges of monitoring and evaluation: attributing causality



Photo:  DarkoStojanovic, Pixabay
Creative Commons CC0 Public Domain License
In popular culture the laboratory is a place with largely negative connotations: the word conjures up images of beakers, test tubes and white coats. Yet, as a social scientist I look upon it with a slight sense of envy, because the laboratory represents a degree of experimental control rarely possible in the social sciences. As a monitoring and evaluation specialist on the African Universities' Research Approaches (AURA) programme, however, I am engaged in a study of human behaviour in its natural surroundings. These natural surroundings happen to be diverse: the first year of the programme focuses on multiple stakeholders at several universities spread across Eastern Africa. This familiar terrain of bureaucracy and institutional politics keeps our focus far removed from the controlled environment of the laboratory. My argument is that the further your research moves from the laboratory, the harder it becomes to exercise control, and therefore the harder it becomes to attribute causal relations between variables. This is a problem given the prevalence of linearity-assuming quantitative indicators in donor reporting, particularly in logframes. So, given this degree of complexity in programmes such as AURA, it is sensible to adopt mixed methods approaches. This is the first in a series of blogs I will be writing to make this case, using examples from my experiences as an evaluator on the AURA programme.


Measuring short-term impact and long-term value
AURA is a capacity development programme aimed at shaping behavioural change. This will be achieved, in part, through the AURA suite of capacity development courses, and one of the challenges of the evaluation effort lies in showing that these courses have been effective. This is done at different levels for different attributes. Let's first consider an example in which the objective is to measure improvements in the research skills of our workshop participants. This requires a benchmark, something to compare progress against. So, before participating in any workshop, the participant takes a combination of test and self-assessment questions that give us an idea of their current skill set. The scores on these questions are then compared to scores obtained from a similar exercise immediately after the workshop. The difference between them is what we report in our logframe; this is the measured increase in skills, and typically it is expressed as a percentage. How well this method works depends on how sophisticated the test questions are and also on how well it is supported by data obtained from other, typically qualitative, methods. At its best this builds a strong case for attributing an increase in skills to the intervention. But this snapshot covers only a short space of time: the immediate pre- and post-intervention periods. The real value of the AURA programme will depend on how these skills develop over a longer period and what is done with them. This will determine the impact the programme has had; and for good reason this is increasingly what donors are focusing on.
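To make the arithmetic concrete, here is a minimal sketch of how such a pre/post comparison might be computed. The scoring scheme, numbers and variable names are purely hypothetical illustrations, not the AURA instruments themselves.

```python
# Minimal sketch of a pre/post skills comparison, using hypothetical scores.
# The scoring scheme and variable names are illustrative, not the AURA instruments.

def percentage_score(answers_correct: int, total_questions: int) -> float:
    """Express a participant's test result as a percentage."""
    return 100.0 * answers_correct / total_questions

def skills_change(pre_correct: int, post_correct: int, total_questions: int) -> float:
    """Difference between post- and pre-workshop scores, in percentage points."""
    return (percentage_score(post_correct, total_questions)
            - percentage_score(pre_correct, total_questions))

if __name__ == "__main__":
    # Hypothetical example: 12/20 before the workshop, 16/20 immediately after.
    change = skills_change(pre_correct=12, post_correct=16, total_questions=20)
    print(f"Measured increase in skills: {change:.0f} percentage points")  # 20 percentage points
```

Whatever the actual instrument looks like, the basic logic is the same: subtract the baseline score from the post-intervention score and report the difference.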

Correlation does not equal causation
Let’s carry on with our example. Our participants, having completed their workshop, go back into their everyday lives at their respective institutions. The easiest and perhaps least rewarding evaluation exercise might be to repeat the same test given to them previously, but several months after the workshop, to determine whether they have sustained their skill set. However, we are now into less secure territory as far as causality goes. What made the first assessment so useful is the very thing that hinders this one: the lapse of time. Earlier we could more confidently attribute the increase in skills to our intervention, because the testing was done immediately before and after it. Now, however, several months have passed and each of our participants has done different things in that time; some of these activities might have advanced these skills further, while others might have hindered them. So even if we have another snapshot of a participant's skill set several months down the line, how do we know whether any shifts are because of, or in spite of, our intervention? Even if we found an upward surge in our participants' skill sets, how can attribution be safe given the possibility of so many other variables? Perhaps we could correlate the rise in skills with the amount of post-workshop interaction we have had with the participant. This is the point at which I hear the voice of my statistics tutor, who reinforced over and over in his class that correlation does not equal causation. Anyone who has studied the social sciences has their favourite examples of two independent patterns that are strongly correlated, and my favourites come from military intelligence analyst Tyler Vigen, in his book Spurious Correlations. Here we learn that the per capita consumption of mozzarella cheese is strongly correlated with the number of doctorates awarded in civil engineering, and that the greater the consumption of sour cream, the more motorcycle riders are killed in non-collision transport accidents. It makes you wonder: are the patterns I'm finding between our interventions and a participant's long-term progress just as spurious as these examples?
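For readers who want to see the statistic itself, the short sketch below computes a Pearson correlation between two invented series. The numbers are made up purely for illustration; the point is that even a coefficient close to 1 cannot, on its own, tell us whether one variable caused the other.

```python
# Illustration that a strong correlation says nothing about causation.
# Both series below are invented; any resemblance to real data is accidental.
import numpy as np

# Hypothetical series: post-workshop contact hours vs. test score gains per participant.
contact_hours = np.array([2, 4, 5, 7, 9, 11, 12])
score_gains   = np.array([3, 6, 5, 9, 10, 13, 15])

r = np.corrcoef(contact_hours, score_gains)[0, 1]
print(f"Pearson r = {r:.2f}")  # close to 1.0 for these invented numbers

# A coefficient this high is consistent with a causal link, with reverse causation,
# or with a third variable driving both -- the statistic alone cannot distinguish them.
```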

Towards a mixed methods approach
This needs to be investigated further. The evaluator has a choice to make at this point: to move either towards or away from the laboratory. The move towards is taken by the econometrician, only this time it is a figurative laboratory, and instead of using actual instruments of control we use numerical values in a regression analysis. A regression analysis is a tool used to investigate the relationship between different variables, which in our case might be the relationship between high performance on one of our courses and, say, an increase in earnings. However, there are several other factors that might explain an increase in earnings and that are independent of the AURA programme, such as the participant's prior level of education, or income differentials between geographical locations and sectors of the economy. The aim of the regression equation is to home in on the relationship between the variables in which we are interested while controlling for those that we are not. This approach, however, relies on being able to ascribe numerical values to your control variables, which is not always possible. To use our example further: what if the difference in earnings is due not to what we have hypothesised or tried to control for, but simply to the participant having the right family connections? It is less clear how we can place numerical values on things like patronage; knowing this, and conceding the point, allows us to move away from the laboratory and embrace a mixed methods approach.
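As an illustration of what 'controlling for' looks like in practice, the sketch below runs a simple regression using the statsmodels library. The variables (course score, prior education, region, earnings) and the generated data are hypothetical stand-ins rather than AURA data, and the specification is only meant to show how observable confounders enter the equation; unobservable ones, such as patronage, cannot.

```python
# Sketch of a regression that controls for observable confounders.
# All data below are randomly generated stand-ins, not AURA programme data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 200

# Hypothetical variables: course performance, years of prior education, region.
df = pd.DataFrame({
    "course_score": rng.uniform(40, 100, n),
    "prior_education": rng.integers(12, 22, n),
    "region": rng.choice(["urban", "rural"], n),
})

# Synthetic earnings with noise, so the example runs end to end.
df["earnings"] = (
    200 * df["course_score"]
    + 1500 * df["prior_education"]
    + np.where(df["region"] == "urban", 5000, 0)
    + rng.normal(0, 4000, n)
)

# Regress earnings on course performance while controlling for education and region.
model = smf.ols("earnings ~ course_score + prior_education + C(region)", data=df).fit()
print(model.summary())

# Unobservable factors such as family connections cannot be entered as control
# variables here -- which is the gap that qualitative methods help to fill.
```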


Jagdeep Shokar is Monitoring and Evaluation Advisor for the African Universities’ Research Approaches (AURA) programme. He is an M&E specialist in the evaluation of capacity development programmes promoting research and communications capabilities. He has been involved in the monitoring and evaluation of programmes held across South Asia and Sub-Saharan Africa.