Six very common flaws of foreign language assessment


  1. Teaching – Assessment mismatch

Often foreign language instructors test students on competences that have not been adequately emphasized in their teaching or, in some cases, have not even been taught.

The most common example of this is task unfamiliarity, i.e. the use of an assessment tool or language task the students have never or rarely carried out prior to the test. This is an issue, as research clearly shows that the extent to which a learner is familiar with a task affects his/her performance. The reasons for this include the anxiety that unfamiliarity engenders, the higher cognitive load that grappling with an unfamiliar task imposes on working memory (especially when the task is quite complex) and, of course, the fact that memory (and, consequently, task knowledge) is context-dependent – which means that knowledge is not easily transferred from task to task (the so-called Transfer Appropriate Processing, or T.A.P., principle). By doing a task over and over again prior to an assessment involving that task, the student develops task-related cognitive and metacognitive strategies which ease the cognitive load and facilitate its execution.

Another common scenario is when students are not explicitly focused on, and provided with sufficient practice in, a given area of language proficiency (e.g. accuracy, fluency, vocabulary range, grammatical complexity), yet their teachers use assessment scales which emphasize performance in that area (e.g. by giving grammatical accuracy a high weighting in speaking performance whilst practising grammar only through cloze tasks). I have had several colleagues in the past who taught their students through project-based work involving little speaking practice even though they knew that the students would be assessed on fluency at the end of the unit. Bizarre!

Language unfamiliarity is, in my opinion, another instance of this mismatch. This refers to administering a test which requires students to infer from context or even use unfamiliar words, and which consequently assesses the learners not on the language learnt during the unit but on compensation strategies (e.g. guessing words from context). Although compensation strategies are indeed a very important component of autonomous competence, I do believe that a test needs to assess students only on what they have been taught and not on their adaptive skills – or the assessment might be perceived by the learners as unfair, with negative consequences for student self-efficacy and motivation. A test must have construct validity, i.e. it must assess what it sets out to assess. Hence, unless we explicitly provide extensive practice in inferential skills, we should not test students on them.

Some teachers feel that, since the students should possess the language knowledge required by the task, it will not matter whether or not they are familiar with the task itself; this assumption, however, is based on a misunderstanding of L2 language acquisition and task-related proficiency.

  2. Knowledge vs control

Very often teachers administer ‘grammar’ tests in order to ascertain whether a specific grammar structure has been ‘learnt’. This is often done through gap-fill/cloze tests or translations. This approach to grammar testing is valid if one purports to assess declarative (intellectual) knowledge of the target structure(s), but not the extent of the learners’ control over them (i.e. the ability to use grammar in real operating conditions, in relatively unmonitored speech or written output). An oral picture task or a spontaneous conversational exchange eliciting the use of the target structure would be a more accurate way to assess the extent of learner control over grammar and vocabulary. This is another common instance of construct invalidity.

  3. Listening vs Listenership

This refers less to a mistake in assessment design than to a pedagogical flaw and assessment deficit, and it is a very important issue because of its major wash-back effect on learning. Listening is usually assessed solely through listening comprehension tasks; however, these do not test an important set of listening skills, ‘listenership’, i.e. the ability to respond to an interlocutor (a speaker) in real conversation. If we only test students on comprehension, the grade or level we assign to them will reflect only one important set of listening skills (comprehending a text) but not the one they need the most in real-life interaction (listening to an interlocutor as part of meaning negotiation). Listening assessments need to address this important deficit, which, in my opinion, is widespread in the UK system.

  4. Lack of piloting

To administer a test without piloting it can be very ‘tricky’, even if the test comes from a widely used textbook assessment pack. Ambiguous pictures and solutions, speed of delivery, inconsistent and/or very subjective grading schemes and construct validity issues are not uncommon flaws of many renowned course-books’ assessment materials. Ideally, tests should be piloted by more than one person on the team, especially when it comes to the grading system; in my experience this is usually the most controversial aspect of an assessment.

  5. ‘Woolly’ assessment scales

When you have a fairly homogeneous student population, it is important to use assessment scales/rubrics which are as detailed as possible in terms of complexity, accuracy, fluency, communication and range of vocabulary. In this respect, the old UK National Curriculum Levels (still in use in many British schools) were highly defective, and so are the GCSE scales adopted by UK examination boards. MFL departments should invest some quality time in coming up with their own scales, making specific reference in the grade descriptors to the traits they emphasize the most in their curriculum (so as to satisfy the construct validity criterion).

  6. Fluency – the neglected factor

Just like ‘listenership’, fluency is another factor of language performance that is often neglected in assessment; yet it is the most important indicator of the level of control someone has achieved over TL receptive and productive skills. Whereas UK MFL departments do often include fluency amongst their assessment criteria for speaking, in writing and reading this is not often the case. Yet it can be done relatively easily. For instance, in essay writing, all one has to do is set a time and word limit for the task in hand and note down the time of completion for each student as they hand it in. Often teachers do not differentiate between students who score equally across accuracy, complexity and vocabulary but differ substantially in terms of writing fluency (i.e. the time-to-word ratio). In so doing we fail to assess one of the most important aspects of language acquisition: executive control over a skill. In my view, this is something that should not be overlooked in either low-stakes or high-stakes assessments.
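To illustrate, writing fluency can be operationalized as a simple words-per-minute calculation. The sketch below is a hypothetical illustration (the function name and figures are mine, not any examination board’s): two students with identical accuracy and complexity scores are differentiated once completion time is factored in.

```python
# Writing fluency as a time-to-word ratio (hypothetical figures for illustration).

def writing_fluency(word_count: int, minutes_taken: float) -> float:
    """Words produced per minute – a rough proxy for writing fluency."""
    return word_count / minutes_taken

# Both students wrote 250 words of comparable accuracy and complexity,
# but took very different times to hand the essay in:
student_a = writing_fluency(250, 25)  # 10.0 words per minute
student_b = writing_fluency(250, 50)  # 5.0 words per minute

# Identical accuracy scores, yet A wrote twice as fluently as B.
print(f"A: {student_a} wpm vs B: {student_b} wpm")
```

Noting completion times as essays are handed in is all the data collection this requires; the ratio can then inform a separate fluency descriptor in the marking scale.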

12 challenges of ICT integration in MFL instruction


INTRODUCTION

ICT integration in the MFL curriculum and its implementation in the classroom pose a series of important challenges which refer to the way digital tools and tasks affect language acquisition as well as the students’ general cognitive, emotional, social and moral development.

Unfortunately, not much is known about the impact of digitally assisted teaching and learning on language acquisition, and I am not aware of any research-based approaches to ICT integration in MFL. Teachers apply the intuitions and experience accumulated through non-digitally assisted teaching to emerging technologies, often taking a leap of faith.

In what follows I advocate a more systematic, thorough and reflective approach to ICT integration in MFL teaching and learning, one which considers cognitive, ethical, pragmatic and, of course, pedagogical factors. I also argue that, whilst emerging technologies can indeed enhance learning, they will not do so unless they are effectively harnessed. This requires (a) an understanding of language acquisition; (b) a grasp of how each digital medium affects acquisition; (c) an appreciation of how each tool fits an individual teacher’s ecosystem.

TWELVE CHALLENGES OF ICT INTEGRATION IN MFL

Please note that the list below is not exhaustive; it only includes what I deem to be the most important challenges I have identified based on my own experiences with ICT integration.

Challenge 1: Enhancement power vs Integrating ICT for ICT integration’s sake

The first question you have to ask yourself when planning to use an app or web-tool, or to carry out any digitally based task in your lessons, is: does it really enhance teaching and learning as compared to my usual non-ICT-based approaches? How can I be sure of that?

Teachers often feel compelled to use an app or web-tool for fear of not appearing innovative enough in the eyes of their managers; yet if their classroom presence, charisma, humour, drama techniques, etc. are more compelling than Kahoot, iMovie, Tellagami and the like, it would be silly to use less impactful teaching tools just to check the ICT-integration box – unless, of course, one is doing it for the sake of variety or to get some rest at the end of a long and tiring school day.

In my experience this is one of the most difficult challenges when a school goes 1:1 with laptops or tablets. Teachers feel that if they do not use ICT as much as possible they are somehow failing the school or will be left behind. This can cause some teachers, especially the less ICT-savvy ones, serious anxiety. And if this issue is not properly tackled from the beginning, learning will be affected quite negatively, especially during the first phase of integration, when teachers are trying things out.

Possible solutions

This issue can be addressed by:

  1. Providing training in how specific digital tools and tasks actually affect student cognition and language acquisition at different stages of MFL proficiency development. Emerging-technologies workshop facilitators and ICT integration coaches usually show teachers how to use a specific tool or conduct an ICT-based task, but do not delve into how it enhances language acquisition across the four macro-skills and why it is superior to its non-digital counterpart. To buy into ICT integration in MFL, teachers must be provided with a solid rationale as to why, how and when in a teaching sequence specific digital tools and tasks enhance learning. This is rarely done. Experimentation with a digital tool at the expense of student learning is ethically wrong, considering how little teacher contact time is usually available to MFL learners.
  2. Deciding on a set of clear pedagogic guidelines, both generic and subject-specific, for ICT integration in lessons (e.g. a generic guideline such as: Is there an expectation for the iPad to be used in every single lesson?; an MFL-specific guideline: When doing a project about school, students should not do the filming around school during lesson time). Whilst an overly prescriptive framework would impact teaching and learning negatively, in my view a clearly laid out pedagogic framework for ICT integration would prevent some teachers from over-using digital tools and others from under-using them. The elected framework will have to be convergent with the MFL department’s espoused methodological approach to language instruction.
  3. Producing an MFL-department ‘ICT handbook’ detailing the benefits and drawbacks of each digital tool, to be embedded in curriculum design – this would complement the above-envisaged guidelines. The ‘handbook’ could take the form of an interactive Google Sheet onto which each member of the MFL team logs the pros and cons for learning of each digital tool as their understanding of its impact on learner cognition increases. Unlike what often happens, this should not merely consist of a dry, detailed list of the latest and coolest apps with technical tips and a cryptic description like “Good for listening” or “Children find it fun”. After the first ICT integration cycle, embed the best recommendations found in the ‘handbook’ in your schemes of work.
  4. Piloting the digital tools prior to use – get a few students – including ‘weaker’ students, too – to try the app or website at lunchtime or after school, possibly in your presence; if that is not possible, ask them to do it at home. Then gather some feedback on the ‘mechanics’, how much they learnt, how enjoyable they found it and how they would improve it. Remember to ask them to compare the impact of that app with the non-digital approaches to the same tasks that you would normally use. Piloting digital apps and tasks has been a lifesaver for me on several occasions.
  5. Gathering student voice – see below (Challenge 7)

Challenge 2 – Language learning metaphors we live by

Digital literacy within and without the language learning domain is very important. So are e-learning strategies (i.e. the set of approaches and techniques which MFL students can deploy in order to independently use web-based resources to enhance their learning). However, if, as part of our teaching philosophy, we set out to forge effective lifelong language learning habits in our students, we also have the ethical imperative to impress on them the importance of human-to-human interaction as a fundamental lifelong learning skill. As discussed in a previous post (‘Rescuing the neglected skill’), ICT integration in MFL often results in oral proficiency, and particularly fluency (the proceduralization of speaking performance), being neglected. This is usually due to a tendency to involve students in (a) ludic activities (e.g. online games of low surrender value); (b) the creation of digital artefacts through single or multiple apps; (c) digitally assisted projects of the kind envisaged by Puentedura’s Redefinition criteria.

By skewing the classroom language acquisition process towards an interaction-poor and mainly digitally-based model of learning we give rise to metaphors of learning which do not necessarily truly reflect the way language is produced and learnt in the real world. These metaphors – rightly or wrongly – may form the core of our students’ beliefs as to how languages are learnt for the rest of their lives!

Possible solutions

Teachers ought to ensure that there is a balanced mix of digital and non-digital tasks and that oral activities aimed at developing oral competence and involving learner-to-learner interaction feature regularly in MFL lessons. After all, we want to forge students who can interact with other humans, don’t we?

Challenge 3 – Curricular imbalance

As mentioned above, MFL ICT integration has the potential to be detrimental to oral fluency development. In my experience, however, other aspects of language acquisition are penalized too; one is listening fluency and another is grammar proceduralization. As far as listening goes, one reason is that there are not many free apps and web-based resources which promote aural skills. Another issue relates to the fact that one very important aspect of listening competence, listenership, is often overlooked in the ‘techy’ classroom. Listenership refers to the ability to listen, comprehend and respond as part of an interaction with an interlocutor; when learners work with a machine, there is no interlocutor to react to.

As for grammar, there are verb conjugators like mine (www.language-gym.com) and apps and websites involving work on morphology and syntax; but these are not designed to develop cognitive control over the target grammar structures (i.e. accurate written or oral performance under real operating conditions).

Language and communicative functions (e.g. agreeing and disagreeing; sequencing information; comparing and contrasting) are also often neglected in digitally based project-based learning (PBL).

Possible solutions 

Ensure that, in every unit in your schemes of work, ICT integration does not result in an excessive focus on reading, writing, word-level vocabulary learning and grammatical drills of the gap-filling/mechanical kind.

Challenge 4 – Divided attention

Unless students are highly proficient in the target language, digital manipulation has the potential to hijack student attention away from language learning. Cutting, pasting, app-smashing, searching for sound effects, downloading images, etc. take up a lot of the students’ cognitive resources. My students also often complain about being distracted by the notifications they get from e-mail and social networks whilst on task. The obvious consequence is divided attention, which may lead to poor retention.

Another source of divided attention is the cue-poor nature of digital media. In a typical interaction with a teacher or peer an MFL student can rely on many verbal and non-verbal cues (e.g. facial expressions and gestures) which will enhance understanding or clarify any doubts s/he may have; however, when it comes to digital environments, apps or web-tools are not as cue-rich and may pose various challenges that may disorientate less ICT-savvy or confident learners. If one has twenty or more students, it is not rare to have two or three experiencing this sort of problem at the same time. Hence, yet another source of divided attention.

Possible solutions

First and foremost, get your students to turn off social network and e-mail notifications, and establish clear sanctions for those who do not.

Do not involve students in any digital manipulation in lesson unless absolutely necessary. Flip it: get it done at home!

Model the use of an app/web-tool to death, until you are 100% sure everyone has understood how to use it. Also, group the less ICT-savvy students together – you will spot them very soon, trust me. This will save you the hassle of running from one side of the classroom to the other and will prevent other students from being distracted by requests for help coming from those very students.

Challenge 5 – Time-cost

This point follows on from the previous one: in view of the very little teacher contact time available for MFL teaching, one has to put every single minute to very good use. Investing a lot of valuable lesson time in the mechanical manipulation of a digital tool which does not directly contribute to language acquisition verges on the unethical, considering that the ultimate objective of language learning is to automatize receptive and productive skills – which requires tons of practice.

Possible solutions

The same as those provided for the previous point.

Challenge 6 – Variety

I have observed many ‘techy’ lessons where students spend a whole hour on the iPad! This practice is counter-productive not only for the reasons I have discussed above, but also because there is research evidence that many students find it tedious and not conducive to learning (Seymour, 2015). In a small survey I carried out some time ago, my students recommended that continued use of the iPad should last no longer than 15-20 minutes.

Possible solutions

Even though you may change the app, game or task, you cannot keep your students on the iPad, laptop or whatever device you use for a whole lesson! As discussed in a previous post, it is good practice to alternate short and long focus, individual and group work, digital and non-digital tools/tasks, receptive and productive skills, ludic and academic activities, etc.

Challenge 7 – Student voice – How do we get the information?

At some point you will want to find out how ICT integration into your teaching ecosystem is being perceived by your students. Many education providers go about this by administering questionnaires, possibly a Google Form, as it is easy to set up, administer and analyze quantitatively. However, there are many problems with this if you want to obtain rich data which will help you improve the next cycle of ICT integration. The main problem is that a typical Google Form survey ‘imposes’ closed questions on the informants, requiring them to answer by choosing yes or no, or a frequency adverb (e.g. often, rarely) or quantity adverb (e.g. a lot, some). However, say 30% of the students indicate they do not find ICT useful; you will not know why, or which aspects they are referring to. Not to mention the fact that this kind of data elicitation encourages a very superficial ‘tick the box’ approach which does not spark off deep thinking on the issues in hand.

Hence, to enhance the richness of the data so obtained, course administrators ought at least to triangulate the Google surveys with one-on-one semi-structured interviews involving a cross-section of learners of different ability and gender. A former colleague of mine, in her master’s dissertation research, found that her interview findings on ICT integration often contradicted data on the same issues obtained through a Google Form survey.

Challenge 8 – Evaluation of ICT integration by peers/SLT

In many schools ICT integration in MFL is evaluated by using the SAMR model. This can be detrimental to foreign language learning for reasons that I have discussed at length here: http://tinyurl.com/o3yp64v (Of SAMR and SAMRitans – Why the adoption of the SAMR model may be detrimental to foreign language learning)

Possible solution

See solution to challenge 1, above (point 2)

Challenge 9 – Learner attitudes towards digital tools

If your school has gone 1:1 with the iPad or another tablet, you may find that in most of your classes there will be one or more students who do not enjoy using the device for language learning. This presents you with an ethical dilemma: if it is imperative to differentiate our teaching so as to cater for every learner’s cognitive and emotional needs, what can you do to suit their learning preferences?

Possible solution

If, after trying various tactics, you still have not managed to ‘win the learner over’ and their aversion to the mobile device remains, you have three options that I can think of. The first: tough, they need to adapt – not the most empathetic of approaches. The second: give them alternative tasks with similar learning outcomes using traditional media. The third: if you use the iPad for snappy activities of no more than 10-15 minutes alternated with other non-digital tasks, as I do, you will find that these students will not complain much at all in the end.

Challenge 10 – Lack of meta-digital language learning awareness

The internet offers a vast array of TL learning opportunities. However, students need to be equipped with the metacognitive tools which will enable them to seize those opportunities. This entails teaching learners how to learn from TL internet input sources, web-tools and apps so as to be able to use them autonomously (e.g. how best to use verb conjugation trainers, YouTube, Memrise, WordReference forums, etc. for independent learning).

Possible solutions

Embed one or more explicit learner-training programs in meta-digital learning strategies (e.g. e-learning strategies) per ICT integration cycle. Since learner-training programs do require extensive practice in the target strategies, one should offer instruction in only a limited range of strategies per cycle. A widely used instructional framework can be found in this post of mine: http://tinyurl.com/qxafy7m (‘Learner training in the MFL classroom’)

Challenge 11 – Ethical issues

Easy access to the digital medium and apps entails the danger of ‘cut and paste’ plagiarism and the use of online translators.

Possible solution

Besides alerting the students to the dangers of such practices, teachers may ask their students to sign a ‘contract’ in which they pledge never to commit plagiarism or to use online translators. During the initial phase of integration, teachers will have to remind students of their pledge, to keep it in their focal awareness, and spell out the sanctions for flouting the ‘rules’. In my experience this is a highly successful tactic for pre-empting such issues.

Challenge 12 – Product vs Process

Since a lot of apps and emerging-technologies workshops concern themselves with the production of a digital artefact (e.g. an iMovie), ICT integration in MFL is often characterized by an excessive concern with the product of learning. However, language teaching should focus mainly on the development of listening, speaking, reading and writing skills (the process of learning) rather than merely on putting together a digital movie, app-smash or presentation (the product).

Possible solution

The production of a digital artefact can be conducive to language acquisition if teachers and learners lay as much emphasis on the process of learning as they do on the final product. This involves identifying what levels of complexity and fluency in each macro-skill are expected of the learners at each stage in the development of a given project and providing extensive contextualised practice throughout the process with those aspirational levels in mind.

Conclusion

Successful ICT integration in the MFL curriculum and its practical implementation are no easy tasks. First and foremost, they require a pedagogical framework which is consonant with whatever instructional approach to L2 learning is espoused by the MFL team. Secondly, their effectiveness will be a function of the extent to which a teacher knows how, why and when digital tools enhance learning and affect learner cognition and language acquisition. Thirdly, teachers need to control the tendency, common during the first cycle of integration, to let the digital medium take over and drive teaching and learning; this is very common amongst emerging-technologies enthusiasts, it is highly detrimental to learning, and I have experienced first-hand the damage it causes to student enjoyment and motivation as well as to their levels of oral and aural fluency. Fourthly, learners’ meta-digital learning awareness and strategies need to be enhanced through structured learning-to-learn programs. Fifthly, teachers must not forget that they have the ethical imperative to form a balanced linguist versed in all four macro-skills and endowed with high levels of fluency and autonomous competence. ICT integration must be eclectic and mindful of this.

In conclusion, ICT integration in MFL is not an exact science as yet. For it to be successful, the best recipe is not to rush it for fear of being ‘left behind’ or not appearing an ‘innovative’ school or teacher. Course administrators and teachers must always have at heart the linguistic, cognitive, socio-affective and moral needs of their students and integrate ICT with those needs in mind – not merely integrate for integration’s sake. Ultimately, ICT integration should assist, not drive, teaching and learning.

13 key steps to successful vocabulary teaching


The following are the principles that underpin vocabulary teaching in my everyday practice. The reader may want to refer to my article “How the human brain stores and organizes vocabulary and implications for the EFL/MFL classroom” for the theoretical background to my approach.

1. Select the vocabulary based on:

  • Learnability (how easy or challenging lexis is in terms of length, pronunciation, spelling, meaning, grammar, word order in a sentence etc.);
  • Frequency;
  • Relevance to students’ interests, background, culture/sub-culture;
  • Semantic relatedness (the more strongly semantically inter-related the target words are, the stronger the chances of retention)

Since I usually create my own vocabulary teaching resources (see the work-out section of www.language-gym.com or https://www.tes.co.uk/member/gianfrancoconti1966), it is easy for me to keep track of the target lexis through each step of the lesson. If you do not make your own resources, and especially if you are a novice teacher, it may be useful to draw up a list of the words you intend the students to learn, to ensure that systematic recycling does occur throughout the lesson.
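Such a list can double as a simple recycling tracker. The sketch below is a hypothetical illustration (the French words and activity names are invented): it tallies how many times each target item is met across the planned activities and flags any item falling short of the 8-10 exposures recommended in step 7.

```python
from collections import Counter

# Target lexis for the lesson (hypothetical items for illustration).
target_lexis = ["manger", "boire", "souvent", "rarement"]

# Each planned activity lists the target items it exposes students to.
activities = {
    "listening gap-fill": ["manger", "boire", "souvent"],
    "paired survey":      ["manger", "boire", "souvent", "rarement"],
    "odd one out":        ["manger", "boire", "rarement"],
}

# Count exposures per item across the whole lesson plan.
exposures = Counter(w for items in activities.values() for w in items)

# Flag items that fall short of the 8-exposure minimum.
under_recycled = [w for w in target_lexis if exposures[w] < 8]
print(under_recycled)  # with only three activities, every item still needs more recycling
```

A spreadsheet column per activity achieves the same thing; the point is simply to make gaps in recycling visible before the lesson, not after it.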

2. Decide which lexical items you are planning for students to learn receptively (for recognition only) and productively (for use in speech and writing) – ‘receptive learning’ being obviously easier.

3. Decide how ‘deep’ your teaching of the target lexis is going to go. In other words, which levels of knowing a word you are going to teach. Nation (1990) identified the following dimensions of knowing a word:

Learner knows:

  1. Spoken form of a word;
  2. Written form of a word;
  3. Grammatical behavior of a word;
  4. Collocational behavior of a word;
  5. Frequency of a word;
  6. Stylistic appropriateness of a word;
  7. Concept meanings of a word;
  8. Association words have with other related words.

4. The number of words that you select per lesson will depend largely on the students you are teaching and on how systematically you want the target lexis (every single item) to be recycled. There is a myth that one should teach 7+/-2 words per lesson. This rule of thumb is based on a misunderstanding of Miller’s (1956) law, which posits that Working Memory can only hold and rehearse 7+/-2 digits at any one time. But Working Memory span has nothing to do with how many words one can learn in a lesson. In my experience, with an able group (i.e. students with highly efficient working memories) one can aim at as many as 20-25 words/lexical chunks receptively (especially if the lexis includes cognates) and around 10 to 15 productively. This is on condition that the words are recycled frequently and systematically and that as many retrieval cues as possible are provided (by building in lots of semantic associations);

5. In order to avoid the risk of cross-association, avoid selecting items which are similar in sound and/or spelling when teaching younger or less able learners;

6. Ensure that the lexical items selected include a good balance of nouns and verbs – as I discussed in previous posts, there is an unhealthy tendency in many MFL classrooms for vocabulary teaching to be noun-driven.

7. Plan for several recycling opportunities throughout the lesson through various modalities (e.g. listening, speaking, reading, writing, body gestures). Ensure students process each and every target lexical item receptively and productively 8 to 10 times during a given lesson – more if possible! Since research clearly shows that learners notice adjectives and adverbs less, greater attention should be given to these two word classes;

8.Ensure the recycling opportunities include activities which involve:

  • Higher order thinking skills / Depth of processing (e.g. odd one out; matching with synonyms; inferencing meaning from context, etc.) – the deeper and more elaborate the level of semantic analysis of the target lexis, the stronger the chances of ensuring retention;
  • Communicative activities involving information gaps and negotiation of meaning (surveys; find someone who; find who does what; etc.)– a substantial body of research indicates that these activities significantly enhance vocabulary acquisition;
  • Work on orthography;
  • Work on phonological awareness;
  • Work on the words’ grammar;
  • Work on the words’ collocational behavior;
  • Semantic and phonetic associations with previously learnt lexis;
  • Inferential strategies (e.g. understanding texts in which the target lexis is instrumental in grasping the meaning of unfamiliar words);
  • Creating associations with each individual learner’s personal life experiences;
  • A competitive element (games);
  • Personal investment / self-reliance (e.g. using dictionaries; creative use of the target words).

Give as much emphasis as possible to the correct pronunciation of the target lexis from the very early stages, as vocabulary recall is phonologically mediated. Also, ensure that when you teach vocabulary, work is as student-centred as possible in order to maximize the level of individual cognitive investment in the learning process. Do remember that retention is more likely to occur when learning involves deep levels of processing and substantial personal investment.

9. Ensure that words are practised in context, not in isolation – hence, if you are staging games or other ludic activities, ensure that they involve the processing or deployment of the target lexis within meaningful sentences. Since exposure to the target lexis through the listening medium is often neglected in the typical UK classroom, ensure that students get plenty of aural practice.

10. When using visuals, ensure they are as unambiguous as possible. If using visuals to present new lexis, make sure that students are not exposed to the spelling of the words until after you have practised their pronunciation a few times;

11. Draw on the distinctiveness principle as much as possible to ensure that, through visuals, anecdotes, jokes or special effects, the most challenging vocabulary items are made to stand out and become memorable;

12. Occasionally – not in every lesson – select a strategy or set of memory strategies (e.g. the keyword technique) to model to and train students in. If you do teach memory strategies, ensure that you recycle and scaffold practice in those strategies in several subsequent lessons to keep them in the learners’ focal awareness for as long as you deem necessary for uptake to occur;

13. Plan for systematic and distributed (a little bit every day rather than a lot in one go) practice/recycling of the target lexis in homework and future lessons. Remember Ebbinghaus' curve (figure 1, below), mapping out humans' rate of forgetting, and set homework accordingly so as to prevent memory decay. The fact that your students WILL forget around 67% of what they 'learnt' in a lesson after just one day should prompt you to plan your recycling carefully. Figure 2 shows how I do it for grammar and vocabulary, i.e. a spreadsheet that lists vertically the items to recycle and horizontally each week of the present term; the sheet allows you to keep track of how often you have been teaching a given set of vocabulary and to plan for future recycling.
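The forgetting curve is commonly approximated as exponential decay. The sketch below is illustrative only: the `stability` constant is an assumption I have chosen so that roughly a third of the material survives the first day, matching the ~67% one-day loss mentioned above; real rates vary with the learner and the material.

```python
import math

def retention(hours_elapsed, stability=22.0):
    """Exponential approximation of Ebbinghaus' forgetting curve.

    `stability` (in hours) is an illustrative constant, not an
    empirical value: it is picked so retention after 24h is ~34%.
    """
    return math.exp(-hours_elapsed / stability)

# How quickly material fades without any recycling:
for hours in (1, 24, 48, 144):
    print(f"after {hours:>3}h: ~{retention(hours):.0%} retained")
```

Plotting these figures against a recycling schedule makes it obvious why distributed practice (a little every day) beats massed practice at the end of the unit.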

Figure 1 – The rate of human forgetting

ebbinghaus-graph

Figure 2 – Recycling tracking sheet

ticket

Here are ten commonly made mistakes in vocabulary instruction that every EFL / MFL teacher ought to look out for: 10 commonly made mistakes in EFL vocabulary instruction

Ten questions you may want to ask yourself in planning a unit of work

548a8

Here are ten questions that I believe – ideally – curriculum designers should be asking themselves in planning and evaluating a topic-based unit of work in the context of an eclectic instructional approach with a communicative focus, where grammar is taught explicitly and whose ultimate aim is the acquisition of cognitive control over TL use. The reader should note that I have avoided any explicit reference to the expression ‘learning outcomes’ (although they are implicitly referred to throughout what follows). This is because this term in my perception – and maybe only in my perception – evokes an excessive concern with the product of learning, whereas I advocate that the pedagogic focus shift onto the process of learning, that is, the acquisition of procedural linguistic ability, i.e. effective executive control over language reception and production.

1. What kind of language do I expect my learners to be able to understand (in reading and listening) and produce (speaking and writing) in terms of:

  • Range and depth of vocabulary;
  • Range of grammar structures;
  • Range of tenses;
  • Range of language functions?

2. How complex should teacher input and learner output be at different stages in the development of the unit of work? This is a very useful question, as it usually results in the curriculum designer putting more thought and effort into the sequencing of activities based on the developmental needs of the learners. Teachers need to have this issue in their focal awareness so as to agree on what constitutes comprehensible input and achievable output. The most challenging issue is possibly – as I discussed in my last post – deciding on what constitutes a complex grammar structure.

3. What levels of fluency do I expect the learners to have acquired at key points in the development of the unit of work? This issue is hugely important as it refers to the notion of automatization of language production. Sadly, in my experience, this is a dimension of proficiency which is usually neglected by curriculum designers – with disastrous consequences for language learning. Teachers must agree where they would like their students to be in terms of spontaneity of production at key stages in the unfolding of the unit of work.

4. What levels of accuracy in and (cognitive) control over the target language do I expect students to exhibit at key stages in the development of the unit? This is linked to the previous point but develops it a notch or two further in that it does not refer simply to the ability to be fluent and comprehensible, but also to perform accurately under real operating conditions. In other words, it refers to the extent to which grammar structures, the phonology system, etc. are not merely learnt declaratively but also procedurally; i.e. where on the declarative (explicit knowledge of the TL) to procedural (automatic application of grammar rules) continuum do we expect our learners to be?

5. Which of the following areas of competence am I going to focus on, and to what extent?

  1. Phonology (awareness of the TL sound system) and Phonetics (production of the TL system)
  2. Orthography
  3. Morphology (word formation)
  4. Syntax (word order)
  5. Language functions (e.g. compare and contrast, asking questions, predicting, agreeing and disagreeing, sequencing, summarizing)
  6. Communication skills (e.g. listenership) – this is a crucial, yet very neglected skill-set, especially in PBL;
  7. Learner strategies / Life-long learning skills – this is another neglected area in curriculum planning;
  8. Learner autonomy (the extent to which we promote and scaffold independent learning beyond the classroom) – also a very neglected area of competence;
  9. Cross-cultural competence

6. Which of the above should be prioritized? – This decision will be dictated by the stakeholders’ demands/needs as well as by the espoused instructional methodology.

7. What would I like learners to do independently? – This does not include homework. It refers to what you would like students to do independently, without any teacher request, to enhance their own learning and how you would go about triggering this desire to learn out of the classroom.

8. How and how often am I going to recycle the target language items, competences and skills/strategies throughout and beyond the present unit of work? – My regular readers will know how obsessed I am with short-, medium- and long-term recycling. I strongly believe this is the most crucial and most neglected area of teaching and learning.

9. How am I going to resource the teaching and learning of the target language items, competences and skills/strategies?

10. How and how often am I going to assess all of the above? – This is the most challenging question to answer. However, if one has answered the previous questions, it will be relatively straightforward. The issues which are most often neglected in assessment design are:

  1. Does the test actually measure what it purports to measure? (construct validity) – hence, if following the above framework, issues a to f must be taken into account here;
  2. Is the assessment going to have a positive wash-back effect on learning? – it should, really;
  3. Is the assessment fair? – Students must be prepared effectively for a test; hence, the test papers should be ready and piloted way before the teaching of a unit begins;

Moreover, several low-stakes assessments (not too many, obviously) should take place before the end-of-unit test(s) in order to be fair to our students and not base our evaluation of their performance over a term on a single snapshot at the end of it.

Conclusion

It is obvious that the answer to the above questions will be to a great extent influenced by the teaching and learning methodology espoused and/or adopted by the department as well as by the demands and needs of the stake-holders. MFL curriculum designers may find certain areas more or less relevant to their learning context or more or less worthy of an explicit focus. To expect busy teachers to do all of the above thoroughly and ‘perfectly’ would be unrealistic, especially in view of the time constraints.

However, there is one thing that every curriculum designer MUST definitely plan carefully for: how to foster and facilitate the process of automatization of every skill and language item they set out to teach. This includes extensive recycling and practice and will often require going beyond the textbook and the time normally allocated to curriculum delivery (the ‘two-chapters-or-topics-per-term syndrome’, as I call it). As I have often reiterated in my posts, this is the most important, yet the most neglected, dimension of language learning.

Crucial issues in the assessment of speaking and writing (Part 1)

fluent-english-speech

In the last few weeks I have been thinking long and hard about the assessment of the productive skills (speaking and writing), dissatisfied as I am with the proficiency measurement schemes currently in use in many UK schools, which are either stuck in the former system (National Curriculum Levels) or strongly influenced by it (i.e. mainly tense-driven).

However, the challenge of finding a more effective, valid and cost-effective alternative for use in British secondary schools is no easy task. The biggest obstacles I have found in the process refer to the following questions that have been buzzing in my head for the last few weeks, the answers to which are crucial to the development of any effective approach to the assessment of proficiency in the productive skills.

  1. What is meant by ‘fluency’ and how do we measure it?
  2. How do we measure accuracy?
  3. What do we mean by ‘complexity’ of language? How can complexity be measured?
  4. How do we assess vocabulary richness and/or range? How wide should L2 learners’ vocabulary range be at different proficiency stages?
  5. What does it mean to ‘acquire’ a specific grammar structure or lexical item?
  6. When can one say that a specific vocabulary and grammar item has been fully acquired?
  7. What linguistic competences should teachers prioritize in the assessment of learner proficiency? Or should they all be weighted in the same way?
  8. What task-types should be used to assess learners’ speaking and writing proficiency?
  9. How often should we assess speaking and writing?
  10. Should we assess autonomous learning strategy use? If so, how?

All of the above questions refer to constructs commonly used in the multi-trait scales usually adopted by researchers, language education providers and examination boards to assess L2 performance and proficiency. In this post, for reasons of space, I will only concern myself with the first three questions, reserving the rest for future posts. The issues they refer to are usually acronymized by scholars as CAF (Complexity, Accuracy, Fluency), but I find the acronym FAC (Fluency, Accuracy, Complexity) much more memorable… Thus I will deviate from mainstream Applied Linguistics on this account.

  2. The issues

2.1 What do we mean by ‘fluency’ in speaking and writing? And how do we measure it?

2.1.1 Speaking

Fluency has been defined as ‘the production of language in real time without undue pausing or hesitation’ (Ellis and Barkhuizen 2005: 139) or, in the words of Lennon (1990), as ‘an impression on the listeners that the psycholinguistic process of speech planning and speech production are functioning easily and automatically’. Although many, including teachers, use the term ‘fluency’ as synonymous with competence in oral proficiency, researchers see it more as a temporal phenomenon (e.g. how effortlessly and ‘fast’ language is produced). In L2 research, fluency is considered a different construct from comprehensibility, although from a teacher’s point of view it is obviously desirable that fluent speech be intelligible.

The complexity of the concept of ‘fluency’ stems mainly from its being a multidimensional construct. Fluency is in fact conceptualized as:

  1. Break-down fluency – which relates to how often speakers pause;
  2. Repair fluency – which relates to how often speakers repeat words and self-correct;
  3. Speed fluency – which refers to the rate of speaker delivery.

Researchers have come up with various measures of fluency. The most commonly adopted are:

  1. Speech rate: total number of syllables divided by total time taken to execute the oral task in hand;
  2. Mean length of run: average number of syllables produced in utterances between short pauses;
  3. Phonation/time ratio: time spent speaking divided by the total time taken to execute the oral task;
  4. Articulation rate (rate of sound production): total number of syllables divided by the time taken to produce them;
  5. Average length of pauses.
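The five measures above are simple ratios, easy to compute once one has a syllable count, the task time and pause timings. A minimal Python sketch (the function name and the sample numbers are my own invention, for illustration only):

```python
def fluency_measures(syllables, total_time, pauses, runs):
    """Temporal fluency measures from L2 research.

    syllables  – total syllables produced during the task
    total_time – total task time in seconds
    pauses     – list of pause durations in seconds
    runs       – number of stretches of speech between pauses
    """
    speaking_time = total_time - sum(pauses)
    return {
        "speech_rate": syllables / total_time,           # syllables per second of task
        "phonation_time_ratio": speaking_time / total_time,
        "articulation_rate": syllables / speaking_time,  # syllables per second actually spoken
        "mean_length_of_run": syllables / runs,          # avg syllables between pauses
        "avg_pause_length": sum(pauses) / len(pauses) if pauses else 0.0,
    }

# An imaginary 90-second oral task: 180 syllables, four pauses, five runs.
m = fluency_measures(syllables=180, total_time=90.0,
                     pauses=[1.5, 2.0, 1.0, 1.5], runs=5)
print(m)
```

Even this toy version makes the multidimensionality of the construct concrete: a learner could score well on speech rate yet poorly on mean length of run.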

A seminal study by Towell et al. (1996) investigated university students of French. The subjects were tested at three points in time: (time 1) the beginning of their first year; (time 2) in their second year; and (time 3) after returning from their year abroad (in France). The researchers found that improvements in fluency occurred mainly in terms of speaking rate and mean length of run – the latter being the best indicator of development in fluency. Improvements in fluency were also evidenced by an increase in the rate of sound production (articulation rate), but not in a major way. In their investigation, Towell et al. found that assessing fluency based on pauses is not always a valid procedure because a learner might pause for any of the following reasons:

  1. The demands posed by a specific task;
  2. Difficulty in knowing what to say;
  3. An individual’s personal characteristics;
  4. Difficulty in putting into words an idea already in the brain;
  5. Getting the right balance between length of utterance and the linguistic structure of the utterance.

Hence, the practice of rating students’ fluency based on pauses may not be as valid as many teachers often assume. As Lambert puts it: “although speed and pausing measures might provide an indication of automaticity and efficiency in the speech production process with respect to specific forms, their fluctuation is subject to too many variables to reflect development directly.”

2.1.2 Writing

When it comes to writing, fluency is much more difficult to define. As Bruton and Kirby (1987) observe,

Written fluency is not easily explained, apparently, even when researchers rely on simple, traditional measures such as composing rate. Yet, when any of these researchers referred to the term fluency, they did so as though the term were already widely understood and not in need of any further explication.

In reviewing the existing literature I was amazed by how much disagreement there is amongst researchers on how to assess writing fluency, which begs the question: if it is such a subjective construct on whose definition nobody agrees, how can the raters appointed by examination boards be relied on to do an objective job?

There are several approaches to assessing writing fluency. The most commonly used in research is composition rate, i.e. how many words are written per minute. So, for instance, in order to assess the development of fluency a teacher may give his/her class a prompt, stop the writing after a few minutes and ask the students, after giving guidelines on how to carry out the word count, to count the words in their output. This can be done at different moments in time, within a given unit of work or throughout the academic year, in order to map out the development of writing fluency.
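Composition rate itself is trivial to compute. A sketch (counting words by whitespace splitting is an assumption – teachers and researchers may define a ‘word’ differently):

```python
def composition_rate(text, minutes):
    """Writing fluency as composition rate: words written per minute."""
    return len(text.split()) / minutes

# An invented two-minute timed-writing sample (13 words):
sample = "Me gusta mucho ir al cine con mis amigos los fines de semana"
print(composition_rate(sample, minutes=2.0))  # 6.5 words per minute
```

Logging this figure for each student two or three times a term gives a crude but serviceable progress curve.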

Initial implications

Oral fluency is a hugely important dimension of proficiency as it assesses the extent to which speaking skills have been automatized. A highly fluent learner is one who can speak spontaneously and effortlessly, with hardly any hesitation, backtracking and self-correcting.

Assessing fluency, as I have just discussed, is very problematic, as there is no international consensus on what constitutes best practice. The Common European Framework of Reference for Languages, which is adopted by many academic and professional institutions around the world, provides some useful – but not flawless – guidelines (http://www.coe.int/t/dg4/education/elp/elp-reg/Source/Key_reference/Overview_CEFRscales_EN.pdf). MFL departments could adapt them to suit their learning context, mindful of the main points put across in the previous paragraphs.

The most important implications for teachers are:

  1. Although we do not have to be as rigorous and pedantic as researchers, we may want to be mindful in assessing our students’ fluency of the finding (confirmed by several studies) that more fluent speakers produce longer utterances between short pauses (mean length of run);
  2. However, we should also be mindful of Towell et al.’s (1996) finding that there may be individuals who pause because of other issues not related to fluency but rather to anxiety, working memory issues or other personal traits. It is important in this respect to get to know our students and make sure that we have repeated oral interactions with them so as to get better acquainted with their modus operandi during oral tasks;
  3. In the absence of international consensus on how fluency should be measured, MFL departments may want to decide whether and to what extent frequency of self-repair, pauses and speed should be used in the assessment of their learners’ fluency;
  4. If the GCSE or A-level examination adopted by their school does include degree of fluency as an evaluative criterion – as Edexcel for instance does – then it is imperative for teachers to ask which operationalization of fluency is applied in the evaluation of candidates’ output, so as to train students accordingly in preparation for the oral and written exams;
  5. Although comprehensibility is a separate construct from fluency in research, teachers will want their students to speak and write at a speed as close as possible to native speakers’ but also to produce intelligible language. Hence, assessment criteria should combine both constructs.
  6. Regular mini-assessments of writing fluency of the kind outlined above (teacher giving a prompt and students having to write under time conditions) should be conducted regularly, two or three times a term, to map out students’ progress whilst training them to produce language in real operating conditions. If this kind of assessment starts at KS3 or even KS2 (with able groups and ‘easier’ topics), by GCSE and A-levels, it may have a positive washback effect on learner examination performance.

3. Accuracy

Accuracy might intuitively seem the easiest way to assess language proficiency, but it is not necessarily so. Two common approaches to measuring accuracy involve: (1) calculating the ratio of errors in a text/discourse to the number of units of production (e.g. words, clauses, sentences, T-units) or (2) working out the proportion of error-free units of production. This is not without problems, because it does not tell us much about the type of errors made, which may be crucial in determining the proficiency development of a learner. Imagine Learner 1, who has made ten errors with very advanced structures, and Learner 2, who has made ten errors with very basic structures without attempting any of the advanced structures Learner 1 has made mistakes with. To evaluate these two learners’ levels of accuracy as equivalent would be unfair.
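The two approaches just described reduce to simple ratios, as the Python sketch below shows (the function and all counts are invented for illustration). Note that identical counts yield identical scores whatever the nature of the errors – which is exactly the limitation discussed above:

```python
def accuracy_measures(errors, total_words, total_clauses, error_free_clauses):
    return {
        # approach (1): ratio of errors to units of production
        "errors_per_100_words": 100 * errors / total_words,
        # approach (2): proportion of error-free units of production
        "error_free_clause_ratio": error_free_clauses / total_clauses,
    }

# Learner 1 risks advanced structures; Learner 2 plays it safe.
# With the same raw counts, the measures cannot tell them apart.
learner_1 = accuracy_measures(errors=10, total_words=200,
                              total_clauses=40, error_free_clauses=30)
learner_2 = accuracy_measures(errors=10, total_words=200,
                              total_clauses=40, error_free_clauses=30)
print(learner_1 == learner_2)  # True
```

A fairer scheme would have to weight errors by the difficulty of the structure attempted, which these ratios simply cannot express.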

Moreover, this system may penalize learners who take a lot of risks in their output with highly challenging structures. So, for instance, an advanced student who tries out a lot of difficult structures (e.g. if-clauses, subjunctives or complex verbal subordination) may score less than someone of equivalent proficiency who ‘plays it safe’ and avoids taking risks. Would that be a fair way of assessing task performance/proficiency? Also, pedagogically speaking, this approach would be counter-productive in encouraging avoidance behavior rather than risk-taking, possibly the most powerful learning strategy ever.

Some scholars propose that errors should be graded in terms of gravity. So, errors that impede comprehension should be considered as more serious than errors which do not. But in terms of accuracy, errors are errors, regardless of their nature. We are dealing with two different constructs here, comprehensibility of output and accuracy of output.

Another problem with using accuracy as a measure of proficiency development is that learner output is compared with native-like norms. However, this does not tell us much about the learner’s Interlanguage development; only about the degree of accuracy with which s/he handles specific language items.

Lambert (2014) reports another important issue, pointed out by Bard et al. (1996):

In making grammaticality judgments, raters do not only respond to the grammaticality of sentences, but to other factors which include the estimated frequency with which the structure has been heard, the degree to which an utterance conforms to a prescriptive norm, and the degree to which the structure makes sense to the rater semantically or pragmatically. Such acceptability factors are difficult to separate from grammaticality even for experienced raters.

I am not ashamed to say that I have experienced this myself on several occasions as a rater of GCSE Italian oral exams. And to this day, I find it difficult not to let these three sources of bias skew my judgment.

3.1 Initial implications for teachers and assessment

Grammatical, lexical, phonological and orthographic accuracy are important aspects of proficiency included in all the examination assessment scales. MFL departments ought to decide collegially whether accuracy should play an equally important, more important or less important role in assessment than fluency/intelligibility and communication.

Also, once they have decided what constitutes more complex and easier structures amongst those the curriculum purports to teach for productive use, teachers may want to choose to focus in assessment mostly or solely on the accuracy of those structures – as this may have a positive washback effect on learning.

MFL teams may also want to discuss to what extent one should assess accuracy in terms of number of mistakes, types of mistakes or both, and whether mistakes with normally late-acquired, more complex structures should be penalized, considering that such an assessment approach might encourage avoidance behavior.

  4. Complexity

Complexity is the most difficult construct to define and use to assess proficiency because it can refer to different aspects of performance and communication (e.g. lexical, interactional, grammatical, syntactic). For instance, are lexical and syntactic complexity two different aspects of the same performance or two different areas altogether? Some researchers (e.g. Skehan) think so and I tend to agree. So, how should a student’s oral or written performance exhibiting a complex use of vocabulary but a not-so-complex use of grammar structures or syntax be rated? Should evaluative scales then include two complexity traits, one for vocabulary and one for grammar/syntax? I think so.

Another problem pertains to what we take ‘complex’ to actually mean. Does complex mean…

  • ‘the number of criteria to be applied in order to arrive at the correct form’, as Hulstijn and De Graaff (1994) posit? – In other words, how many steps does the application of the underlying rule involve? (e.g. the perfect tense in French or Italian with verbs requiring the auxiliary ‘to be’)
  • variety? Meaning, that, in the presence of various alternatives, choosing the appropriate one flexibly and accurately across different contexts would be an index of high proficiency? (this is especially the case with lexis)
  • cognitively demanding, challenging? Or
  • acquired late in the acquisition process? (which is not always easy to determine)

All of the above dimensions of complexity pose serious challenges in their conceptualization and objective application to proficiency measurement.

Standard ways of operationalizing language complexity in L2 research have also focused on syntactic complexity, and especially on verbal subordination. In other words, researchers have analyzed L2 learner output by dividing the total number of finite and non-finite clauses by sentential units of analysis such as terminal units, communication units, speech units, etc. One of the problems with this is that the number thus obtained is just a figure telling us that one learner has used more verbal subordination than another; it does not differentiate between types of subordination – so, if a learner uses fewer but more complex subordinate clauses than another, s/he will still be rated as producing less complex language.
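The standard subordination index reduces to a single ratio, which is precisely why it cannot distinguish between types of subordination. A sketch (the clause and T-unit counts are invented):

```python
def clauses_per_t_unit(total_clauses, t_units):
    """Syntactic complexity as mean clauses per T-unit
    (a T-unit = one main clause plus its attached subordinate clauses)."""
    return total_clauses / t_units

# Two hypothetical learners with identical ratios: the index treats
# them as equally complex even if one's subordinate clauses are far
# more sophisticated than the other's.
print(clauses_per_t_unit(total_clauses=18, t_units=12))  # 1.5
```

Any department adopting such an index should therefore pair it with a qualitative judgement about which subordinate structures were actually attempted.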

4.1 Implications for teachers

Complexity is a very desirable quality of learner output and a marker of progress in proficiency, especially when it goes hand in hand with high levels of fluency. However, in the absence of consensus as to what is complex and what is not, MFL departments may want to decide collegially which of the criteria suggested above (e.g. variety, cognitive challenge, number of steps required to arrive at the correct form and lateness of acquisition) they find most suitable for their learning contexts and curricular goals and constraints.

Also, they may want to consider splitting this construct into two strands, vocabulary complexity and grammatical complexity.

Finally, verbal subordination should be considered a marker of complexity and emphasized with our learners. However, especially with more advanced learners (e.g. AS and A2), it may be useful to agree on what constitutes more advanced and less advanced subordination.

In addition, since complexity of language does appear as an evaluative criterion in A-level examination assessment scales, teachers may want to ask the examination boards what complexity stands for and request a detailed list of which grammar structures are considered more or less complex.

Conclusions

Fluency, Accuracy and Complexity are very important constructs central to all approaches to the assessment of the two productive macro-skills, speaking and writing. In the absence of international consensus on how to define and measure them, MFL departments must come together and discuss assessment philosophies, procedures and strategies to ensure that learner proficiency evaluation is as fair and valid as possible and matches the learning context they operate in. In taking such decisions, the washback effect on learning has to be considered.

Having only dealt with three of the ten issues outlined at the beginning of this post, the picture is far from being complete. What is clear is that there are no clear norms as yet, unless one decides to adopt in toto an existing assessment framework such as the CEFR’s (http://www.coe.int/t/dg4/education/elp/elp-reg/Source/Key_reference/Overview_CEFRscales_EN.pdf ). This means that MFL departments have the opportunity to make their own norms based on an informed understanding – to which I hope this post has contributed – of the FAC constructs and of the other crucial dimensions of L2 performance and proficiency assessment that I will deal with in future posts.

A little experiment with Padlet. Of teacher myths and learning reality

download (1)

I often read on the Internet or hear from ‘techy’ teachers of the presumed learning benefits of various apps. More often than not, when I investigate those claims with my students, they are downsized. The truth is that the way adults’ cognition – especially teachers’ – interacts with a learning tool often differs from an adolescent’s.

A few months back, one such claim was made in my presence. The context: an ICT training session in which one of the participants criticized Google Classroom on the grounds that, unlike Padlet, it does not let pupils see what their peers are writing while working on an assignment, and that the opportunities for learning would consequently be drastically reduced.

Reflecting on that claim on my way home, I thought about myself, an avid and highly motivated learner of 8 languages – would I, whilst doing a Padlet task, actually actively look at my classmates’ output so as to learn from it? My answer was: possibly not; maybe a fancy idiom might attract my attention and I would ask the teacher about it, but nothing more than that. But maybe students would; so I decided to put my colleague’s claim to the test.

The next day I asked students from four of my Spanish classes (two year 8 and two year 9 groups), immediately after creating a Padlet wall to which all of them had contributed, to note down in their books or iPads three language items they had found in their classmates’ writing which they found interesting and/or useful. Two weeks later I asked them to recall the items they had noted down. Guess what? Only two of the 68 students involved in this experiment actually remembered something (one item each). Not surprising. Even when a student does notice something in another learner’s output, they must do something with it for some degree of deep processing to occur and retention to ensue.

I also asked my students how often they actually look at what their classmates write and ‘steal’ useful language items from them. The vast majority of them replied ‘very rarely’ or ‘never’. Yet another mismatch between teacher presumptions and foreign language learner cognition.

The lesson to be learnt: not much is known as yet about the way adolescent foreign language learners interact cognitively and affectively with many of the apps currently used in MFL classrooms. ‘Experiments’ like mine should be carried out as often as possible in order for teachers not to be misguided by Internet myths. Padlet can be a useful app, no doubt; but assumptions about what learners do with it should be rooted in evidence, not simply wishful thinking.

Five useful things many MFL teachers don’t do

images

  1. Teach word structural analysis

As I argued in my previous post ‘Why foreign language teachers may want to rethink their approach to reading instruction’, effective reading comprehension results from the successful use of Top-down and Bottom-up processing. In that post I did not provide an exhaustive list of all the bottom-up comprehension strategies L2 teachers can model to their students, but I did underscore the importance of enhancing learner word recognition skills.

One powerful strategy that can be taught to our students in order to enhance their chances of comprehending unfamiliar words is Structural Analysis, which is not often taught in the typical mainstream UK classroom and has literally ‘saved my life’ in many situations.

This consists of training students to analyze the morphology of the unfamiliar words they encounter in the written input they are exposed to by dividing them into parts, i.e. separating the root word from its prefixes and suffixes. Instruction in Structural Analysis will include teaching:

  1. what the most common prefixes in the target language mean (see: http://www.myenglishteacher.eu/blog/prefixes-suffixes-list/ for fairly comprehensive lists of prefix meanings in English);
  2. what the most common suffixes are and mean (see: http://takelessons.com/blog/french-vocabulary-prefixes-and-suffixes-z04 for a very exhaustive list of suffixes and prefixes in French); and,
  3. with the help of web-based resources, the most useful root words and how to use them in order to infer the meaning of unknown lexis (e.g. https://www.learnthat.org/pages/view/roots.html and http://www.readingrockets.org/article/root-words-roots-and-affixes ).
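As a toy illustration of what Structural Analysis does, the sketch below strips a known prefix and suffix to expose a candidate root. The affix lists are a handful of English examples, not an inventory, and a real implementation would need longest-match rules and exception handling:

```python
# Illustrative (and deliberately tiny) affix lists:
PREFIXES = ["un", "re", "dis", "pre", "mis"]
SUFFIXES = ["ness", "ment", "able", "ful", "less", "ing", "ed"]

def analyze(word):
    """Split a word into (prefix, root, suffix), using '' when absent."""
    prefix = next((p for p in PREFIXES if word.startswith(p)), "")
    rest = word[len(prefix):]
    suffix = next((s for s in SUFFIXES if rest.endswith(s)), "")
    root = rest[:len(rest) - len(suffix)] if suffix else rest
    return prefix, root, suffix

print(analyze("unhelpful"))   # ('un', 'help', 'ful')
print(analyze("preheating"))  # ('pre', 'heat', 'ing')
```

This is, of course, exactly the mental routine we want learners to internalize: peel off the affixes, then combine the root’s meaning with the context.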

Learner training in Structural analysis may be carried out using the following framework:

Step 1 – Rationale for the training (i.e. to enhance students’ ability to comprehend unknown words)

Step 2 – Raising awareness of what prefixes, suffixes and root words are;

Step 3 – Modelling strategy use (e.g. through think-aloud: the teacher shows examples of how, by applying one’s knowledge of a prefix/suffix/root word as well as the context, one can correctly infer the meaning of a word);

Step 4 – Extensive word-meaning-inference practice with written (or even spoken) texts, preferably in the context of the unit-of-work topics. Also, the target words should not occur in isolation but in short texts;

Step 5 – Recycling the same strategy in three or four subsequent lessons, in the context of 10–15 minute word-meaning-inference activities relevant to the topic in hand.

These kinds of activities develop important compensation strategies, which are essential life-long language learning skills; they involve a high degree of creativity, thereby tapping into higher-order thinking skills. They may also have the very positive effect of enhancing our learners’ self-efficacy as readers, as they will feel equipped with a powerful new tool for comprehending TL texts without having to resort to the dictionary. This effect will only come about, however, if the teacher plans the activities carefully, providing lots of initial ‘small wins’.

2. Train learners in oral compensation strategies

If the main aim of our language teaching is to develop our learners’ ability to cope with the linguistic demands of target language interaction, it should be imperative for us to train them regularly and systematically (not through the one-off tip or session) in the effective use of compensation (communication) strategies – an important life-long survival skill. These strategies refer to the ways in which individuals creatively make up for the expressive limitations of their target language competence (e.g. lack of vocabulary). Here are four useful compensation strategies to teach MFL learners:

  • Coinage – i.e. the learner makes up a word that does not exist using a related lexical item in their repertoire. For instance, a learner of French does not know the French verb ‘nourrir’ (= to feed) but knows the noun ‘nourriture’ (= food). They then make up the word ‘nourriturer’, which is wrong, but conveys the meaning;
  • Approximation – i.e., to go back to the above example, instead of saying ‘nourrir’ the learner says ‘donner de la nourriture’ (to give food), which is not exactly the best translation, but is very close in meaning, conveys the idea and is acceptable French;
  • Paraphrasing – i.e. the student does not know a given word so they explain it, a bit like a dictionary definition would;
  • Simplification – i.e. instead of using the same complex sentence structure they would use in their native language, the learner simplifies it so as to convey the basic meaning.

The same framework outlined in the previous paragraph can be applied here, except that step 4 would include productive rather than receptive practice.

Compensation strategies are a very important skill and just like any other skill they must be practised extensively and regularly in order to be learnt. Some teachers may frown upon this type of instruction on the grounds that it may lead to sloppy L2 output or pidginization. Nonetheless, these strategies are powerful learning tools; for example when you produce a wrong but intelligible word through coinage, a native or expert TL interlocutor will understand you and usually provide you with the correct L2 form. This lexical transaction is more likely to lead to retention than looking up the same word in the dictionary.

3. Explain how the brain works

Foreign language learners, especially the more committed and metacognitive ones, do, in my experience, enjoy knowing more about how their brain works when they learn the target language. This does not mean that one should spend a whole lesson talking about neuroscience… However, showing them the diagram below, which maps out how the brain retains vocabulary over time, and involving them in devising a personalised schedule based on it, would not take up too much of your lesson time, would not require scientific jargon, and would foster metacognition (thinking about and planning one’s learning).

With some of my groups I also venture into brief and concise explanations of (a) how Working Memory operates, in an attempt to hammer home the importance of concentration, good pronunciation and associative strategies; (b) how forgetting is often cue-dependent; and (c) why performance errors occur. Whenever one teaches any of the above, it is important to keep it as brief and visual as possible, not to over-intellectualize, and to relate the discussion to your students’ learning, showing them clearly why and how the knowledge you are imparting to them will benefit them in the short and long term.

4. Implement attitude-changing programs

In every class there are students who are less engaged, disaffected or even hostile. After a few pep talks, disciplinary measures and referrals to pastoral middle-managers, things may get a bit better, but there is rarely much change. This is often because these scenarios are not handled through a structured, principled, well thought-out attitude-change program.

It is beyond the scope of this post to delve into the ins and outs of implementing attitude-changing programs; here it will suffice to point out that for any attitudinal change to work, it must start by identifying the issues at the root of the negative attitude one wants to change, vis-à-vis the following metacomponents of attitude identified by Zimbardo and Leippe (1991):

  • Behaviours (what students do in their daily approach to languages)
  • Intentions (what their short-/medium-/long-term objectives are with the language)
  • Cognitions (their beliefs about languages)
  • Affective responses (how much they enjoy language learning)

The picture below (from http://sites.hamline.edu/~./psunnarborg/attchange.htm) illustrates where the most positive and committed of our students usually are in terms of these four attitudinal components:

The principles I outlined in my post on motivation (“Eight motivational theories and their implications for the classroom”) can then be applied to address the four areas one by one. For instance, one may want to start by inducing a level of cognitive dissonance to address learner beliefs (cognitions); by identifying what students excel at and enjoy, so as to cater for their preferences in lessons and enhance their sense of fulfillment and enjoyment; by setting short-term manageable goals in order to increase their sense of self-efficacy; etc.

Of course, most teachers are busy and overworked and may object that they do not have the time to do all this. My main point here is that if one does wish to turn around a disaffected student or group of students, one should at least first try to ascertain what the actual roots of the problem are, through a structured and deep inquiry process based on the above framework.

5. Ask older language students to observe you

We often have our colleagues or senior teachers observe us. A recent trend in some UK schools is also to have students observe us, using checklists on which they tick or cross the things they see us do or not do. A strategy I have used in the past, and which has paid huge dividends, is to ask older A2 language students (around 18 years old) to observe one or more of my lessons and give me feedback on one or more specific areas of my teaching in which a younger person’s perspective may be more useful than an adult’s, e.g. motivation, empathy towards students, levels of pupils’ engagement, etc. What I find useful about having an older language student observe me is that students of this age are more in sync with adolescent-learner mentality and affective responses than my colleagues, whilst still being fairly mature and cognizant of what language learning entails. Moreover, being former students of mine, these individuals are usually better able to relate the observed students’ experiences to their own when they were taught by me, and provide even more useful feedback as a result.

The best lesson planning ever…really? – Of the perils of educational sensationalism

As I was ‘snooping around’ the Internet I bumped into a title that attracted my attention: “9 ways to plan transformational lessons – Planning the best curriculum unit ever” (on the Edutopia site). At first I thought the author may have been a tad arrogant, but when I read that the article had been shared 4.9K times on various social media I said to myself, “Let’s read this”. And I did. After reading it I felt immediately compelled to write this post, as I got worried: were the language teachers amongst the 4.9K people who shared that article, and/or the people they shared it with, going to do what the article advises in the belief that they will, by so doing, plan the best lessons ever?

The following are the points in the article that I had the most issues with:

  1. A… ‘transformational’ lesson??!!

This was the first – but not the greatest – of my worries. Transparent learning; visible learning; dynamic learning; collaborative learning; positive education; and now… transformational learning? For real? Another label thrown at teachers by educational consultants wanting to create a brand? Yet another trendy category which, if you do not belong to it, makes you feel out-of-date and left out? EVERY SINGLE LESSON in which a student learns something you taught them is a transformational lesson, as you add new cognition to their brains – even if you do not do any fancy stuff, any backward design, or pedantically follow pseudo-scientific learning rubrics.

  2. For the best lessons ever shift from solo to collaborative design… really?

Whilst I do enjoy working collaboratively with my colleagues, especially doing long-term planning with them (e.g. preparing a unit of work), I am definitely at odds with the statement that, in order to plan the best lessons ever, I need to plan them with another teacher. When I plan my lessons I have every single student of mine in mind: the vocabulary and grammar structures they learnt with me in our last lesson, as well as the mistakes and problem areas I want to address in their output. I know what strategies and learning activities keep them focused and motivated; my colleagues would not know that. Also, not all of my colleagues share the same theory of, or approach to, language learning as me, and they often use or sequence the very same activities differently.

  3. Create the assessment before developing content… the Wiggins and McTighe curse

There is nothing less educational in language teaching than the adoption of Wiggins and McTighe’s backward design approach to MFL lesson planning – whilst I can see its merits in the planning of longer units of work. There is nothing more straitjacketing than teaching a language lesson with the assessment in mind. Whereas I do agree that one must have aspirational learning outcomes to work towards in a lesson or sequence of lessons, to set these in stone and let them drive the way we teach language is not only unethical but counter-productive in terms of language acquisition – unless, that is, we are willing to sacrifice sound cognitive and psycholinguistic development in the name of a trendy curriculum design principle, to teach robotically to our schemes of work, and to refuse to adapt our plans when we feel that our students need more practice.

A good L2-classroom practitioner teaches adaptively, creatively and should not teach with a specific test in mind; learning a language structure or function is a process which may need more time than planned. A good teacher must be prepared to change their plans if needs be. Teaching is not simply about planning and assessing.

  4. Don’t forget the introverts…

Quoting work by Susan Cain, the author points out how introverts enjoy working autonomously and how a lot of current teaching involves group work. Is the implicit recommendation for the best lesson ever not to involve introverts in group work? If so, I do not agree; whilst there has to be a balance of individual and collaborative work in most lessons, I deliberately encourage students who do not enjoy working collaboratively to do team work, so as to get them out of their comfort zone and help them learn a major lifelong skill. Also, learning a language is about communicating with others!

Then the author goes on to say that increasing wait time to seven seconds – when asking a question in front of the class – will play to the strengths of introverts. Really? I give any of my students, regardless of their personality, as much time as they need, unless they say ‘pass’… I would be interested to know why seven seconds is the ideal wait time, as for some of my students it would not be enough, and I would rather wait a bit longer than dent their self-esteem.

  5. Integrate productive struggle in the curriculum

This is a direct quote from the article:

Don’t lower the expectations of your next lesson plan. Instead, scaffold instruction and check to see that you are challenging students appropriately with Hess’ Cognitive Rigor Matrix.

In other words, in order to plan progression and challenge in the best lesson ever, teachers ought to refer to Hess’ Cognitive Rigor Matrix, one of many Bloom taxonomy surrogates… Speechless!

In conclusion, I apologize to the author of the article for sounding a bit too harsh. The article does contain some sound recommendations (e.g. points 7 and 8); however, my harshness was elicited by the sensationalist title, which clearly alludes to outstanding – the best ever – practice. When one is as passionate as I am about teaching and learning, one can only find it unacceptable, and even ‘dangerous’, for the pedagogic recommendations made in this article to be touted as ‘the best ever’ practice, especially when they are endorsed by such an authoritative educational website as Edutopia, followed by a vast number of teachers from all over the world.

What message do the author and Edutopia send novice teachers about what best teaching practice entails? A fixed wait time of seven seconds? Planning challenge in language learning based on Hess’ Matrix (please read my article on the Bloom Taxonomy and you will understand my ‘fury’)? Teaching a lesson based on a set-in-stone piece of assessment? Or that the best lessons must be planned collaboratively? Whilst many of these recommendations can be useful, they should not be presented as evangelical truths, the prerequisites for the best ever practice.

Edutopia should know better and put ethics before sensationalism.

 

The top 10 foreign language learning research areas MFL teachers are most interested in


When I started writing my blog I did not expect to get many readers. I started it much in the same spirit as one starts writing a journal: as a way to reflect on my practice and on my beliefs about language learning. When, however, people from all over the globe started contacting me, requesting to know more about specific areas of foreign language acquisition and pedagogy, and “Six useless things foreign language teachers do” got more than 9,000 readers in one night, it finally hit me: there are keen and reflective classroom practitioners out there who do want to know more about L2 acquisition theory, neuroscience and L2 research than what they were taught on their teacher training courses or during a few hours of CPD by educational consultants.

In actual fact, the reasons why I decided to embark on a Master’s course in Applied Linguistics after two years of UK comprehensive-school teaching relate mainly to my deep dissatisfaction with the way I was taught how to teach on my teacher training course (PGCE). My PGCE tutors were great, do not get me wrong; however, on my teacher training I was basically shown a few quick tricks of the trade – mostly through videos or demo lessons with the ‘perfect’ classes – equipped with a few ‘lesson templates’ and sent off to my teaching practice schools.

No need to tell you what happened there; you have all been through that nerve-racking baptism of fire; you have all come across the really motivated and inspirational teachers who tried their best to support you, and the demotivated and disgruntled ones who tried to discourage you. The outcome: I learnt more tricks of the trade, but not a shred of understanding of how the human brain acquires languages; not a hint of what research says sound MFL pedagogy looks like. I left my last teaching practice placement with glimpses and intuitions of what may or may not work, totally unsupported by a theory of learning or research evidence.

Things did not improve much in my first school as an NQT. More trial and error; more tricks of the trade; more useless training sessions; still no sound pedagogic framework, no theory rooted in neuroscience that I could refer to. I had to use quite a lot of my savings to finally get that framework and that research-based knowledge from some of the greatest names in EFL pedagogy and research at the University of Reading’s Centre for Applied Language Studies (CALS). Costly, but worth every penny!

Yet many of the colleagues I have worked with in 25 years of MFL teaching firmly believed that a good MFL teacher does not need to know about theory or research. I remember how one of them used to refer to the applied linguistics handbooks that I used to read during my frees as ‘silly books’. An understandable attitude considering how inaccessible many researchers have made their often very useful findings to classroom teachers with their convoluted jargon and obscure statistics.

A fairly recent survey carried out by my former PhD supervisor, Professor Macaro of Oxford University, found, however, that although only 3% of the teachers interviewed found research accessible, 80% of them were actually interested in what research has to say about language acquisition and pedagogy. The following are the top ten areas of research Professor Macaro’s informants identified as most useful. Please note that the sample was not huge – only about 100 respondents – and that the people who filled in the questionnaire were Heads of Department, i.e. very experienced teachers. The ranking is based on the mean of the scores each topic area received (1 being very useful and 4 being not at all useful):

  • Vocabulary acquisition – 74% of the teachers found this topic very useful
  • How the grammar rules of the language are best learnt – 73% of the teachers found this topic very useful
  • Motivation – 68% of the teachers found this topic very useful
  • How learners make progress with language learning – 58% of the teachers found this topic very useful
  • Differences amongst learners (e.g. age; gender) – 53% of the teachers found this topic very useful
  • Speaking – 51% of the teachers found this topic very useful
  • How the brain stores and retrieves language – 58% of the teachers found this topic very useful
  • KS4 (lower intermediate) research – 37% of the teachers found this topic very useful
  • Writing – 39% of the teachers found this topic very useful
  • KS3 (beginner) research – 29% of the teachers found this topic very useful

What is interesting is the absence from the above list of two topics that I have written about and that seem to have been very popular amongst my readers, i.e. listening and reading research. In fact, only 25% of Professor Macaro’s informants were interested in listening and 27% in reading.

What areas of research are you most interested in? I would love to have your input!

L.I.F.T. – an effective writing-proficiency and metacognition enhancer


Many years ago, as an L2 college student writer of English and French, I often had doubts about the accuracy of what I wrote in my essays, especially when I was trying out a new and complex grammar structure or an idiom I had heard someone use. However, the busy and underpaid native-speaker university language assistants charged with correcting my essays rarely gave me useful feedback on those adventurous linguistic exploits of mine. They simply underlined or crossed out my mistakes and provided the correct alternative. As an inquisitive and demanding language learner I was not satisfied. I wanted more.

So I decided to try out a different approach: in every essay of mine I asked my teachers questions about things I was not sure about, in annotations I would write in the margin (e.g. should I use ‘with’ or ‘by’ here? should this be ‘whose’ or ‘which’?), eagerly awaiting their replies – which I regularly got. Knowing my teachers were busy, I would focus only on five or six things I had particular issues with, and only after looking through my books and dictionaries in search of clues as to whether I was right or wrong.

This process ‘forced’ my teachers to give me more feedback than I had been getting; consequently, not only did I learn more, but I also became more ‘adventurous’ and ‘daring’ in my writing. This strategy really helped me a lot.

Later on in life, when I became a foreign language teacher, I recycled this strategy with my students. I call it L.I.F.T. (Learner Initiated Feedback Technique). Although I had been using it for a long time already, a few years ago I decided to put its effectiveness to the test by conducting a little experiment. I used L.I.F.T. with one of two groups of able 14-year-olds I was teaching, whilst I used traditional error correction with the other. The students were asked to underline anything they were not sure about and write a question in the margin explaining briefly what their problem or doubt was; one condition I set was that they had to research the issues they were asking me about using web-based resources (e.g. the www.wordreference.com forums). I used exactly the same teaching materials and covered the same topics with both groups.

When I compared how the two groups evolved over the time of the ‘experiment’, what I found out was interesting: they both made more or less the same type and number of mistakes; however, the group I had tried L.I.F.T. with had generally written longer and more complex sentences using more ambitious grammar structures and idioms. Moreover, I found that the questions the students were asking in their annotations had become increasingly more complex; a sign that they were becoming more inquisitive, ambitious and risk-taking. Why?

One reason relates, I think, to the fact that learners, especially the less confident ones, usually tend to avoid structures or idioms they are not sure about. L.I.F.T. counteracts this avoidance behaviour, as it encourages them to try new things out and take risks; knowing that the teacher endorses this kind of risk-taking by ‘pushing’ them to ask for feedback encourages the use of the technique even more.

Moreover, since I made it clear to them that they had to try to solve the problems by themselves first and then write down their questions, my students reported doing more independent study than before, especially the less committed ones. I also felt that they became more inquisitive as a result of the process, as they were asking themselves and me more questions about grammar and vocabulary usage that Google had no answer for (not a straightforward and easy-to-find one, at least).

Finally, quite a few of them reported paying more attention to my corrective feedback than before as they had requested it in the first place!

Another benefit was that giving feedback became more interesting and enjoyable for me, because it felt like a real dialogue between the students and myself, especially with the more adventurous and ambitious linguists – not just a top-down process totally directed and owned by the teacher. Also, because I felt that – most of the time, not always – they had indeed tried to answer the questions themselves, I put more effort into it.

Also, and more importantly, this process gave me access to some of my students’ thinking processes and to the kind of hypotheses they formulated about how French worked.

Recently I discovered a study by Andrew Creswell (2000), who used a very similar approach with students of higher proficiency than mine and reported very similar gains. His students reacted very favourably to the technique, and he states that training learners in it created ‘a context in which students were able to work responsibly’.

I strongly recommend this technique not only for the benefits reported above, but also because if your students eventually do incorporate this technique into their learning strategies repertoire, they will acquire a powerful life-long metacognitive strategy that they might transfer to other domains of their learning and professional life.

More on this and on my approach to language teaching and learning in the book I co-authored with Steve Smith, ‘The Language Teacher Toolkit’, available here.