The writing skill most foreign language teachers don’t teach: interactional writing


1. What is writership?

Chatting online or texting via SMS, WhatsApp, etc. has become part and parcel of our daily life, the verb and noun ‘chat’ alluding to the fact that, although we are writing, we are in fact ‘talking’ to someone. Just like in a face-to-face conversation, when chatting online we have to respond to our interlocutor in real time if we want to ‘stay’ in the conversation and, most importantly, if we want to keep him or her engaged.

Since applied linguists refer to interactional listening in real-life face-to-face communication as ‘Listenership’, I will henceforth call the set of skills involved in interactional writing ‘Writership’.

Listenership and Writership have many similarities in terms of the cognitive processes they involve. There are, however, important differences, too. Besides the most obvious difference, i.e. the fact that communication does not happen through the oral medium, there is another important one: when texting or chatting we do not usually see our interlocutor. This means that the all-important non-verbal aspects of communication (e.g. the cues we get from our interlocutor’s body language) are missing – which often leads online chatters to use imagery as a compensatory strategy. This entails that effective writership must include not simply fluency (as in: speed of production) but also a level of mastery of TL vocabulary and discourse functions which makes up for the lack of non-verbal cues and pre-empts ambiguity.

2. Should we be concerning ourselves with interactional writing?

Whilst skyping with Steve Smith of www.frenchteacher.net last night, we were talking about the time teachers should allocate to writing. Steve made a very important observation that echoed what I have always thought: writing is less important than listening, speaking and reading, as students are not very likely to write in the target language after passing their GCSEs; hence, not much time at all was the answer – writing can be done by students at home. But then it suddenly dawned on me that that was true of Steve’s generation and mine, but not of the current one.

In a highly inter-connected world where global communication happens within milliseconds on Facebook, Twitter, WhatsApp and SMS, many of our students are very likely to engage in interactional TL (target language) writing in the future – in fact, many already do on a daily basis. This is yet another example of how emerging technologies affect not only the way we communicate in real life but also, inevitably, the way we teach foreign languages.

3. But how do we teach interactional writing?

First of all let us consider what it involves.

First and foremost, obviously, the ability to understand an interlocutor’s input.

Secondly, the ability to respond to that input in real time, maybe not necessarily at the same speed as one would do in oral interaction but still quite rapidly. In other words, effective writership requires writing fluency.

Thirdly, effective writership requires a high level of intelligibility of output – not necessarily grammatical accuracy and complexity. Spelling becomes more important than it is in essay writing in that the speed of the interactional exchange does not allow the interlocutor a lot of time for working out ambiguous items.

Fourthly, the command of a sizeable repertoire of high frequency lexical items, a fairly wide range of discourse functions, the basic tenses and communication strategies (e.g. ways to compensate for lack of vocabulary).

Fifthly, in dealing with a TL native speaker an effective interactional writer must be able to grasp cultural features in their input, including the jargon and abbreviations used in TL instant-messaging communication (e.g. knowing that in French LOL is MDR).

The obvious corollary of the above is that most of the traditional communicative activities we use to foster autonomous communicative competence (through both the oral and written medium) and listenership apply to the teaching of writership too (e.g. information gap tasks and role-plays). In fact the oral communicative practice that takes place in your classroom will have a major impact on learner writership.

These are some of the activities I use in the classroom to promote writership:

  1. As a starter or plenary I stand in front of the class and type questions in the TL on the classroom screen. The students, equipped with mini-boards or iPads, have two minutes to write an answer including three details. In order to differentiate I usually ask two questions, the second being an extension for the more fluent students. Accuracy is not a concern, but intelligibility is.
  2. Picture tasks. This is similar to the previous task, except that the stimulus the students have to respond to is visual. The rationale for this task is that (a) in social media students often do have to respond to a visual stimulus; (b) it taps into their creativity; (c) it may elicit language that transcends the boundaries of the topic-at-hand.
  3. ‘What is the question?’ tasks. Students are provided with a very short dialogue in the TL where the questions have been omitted. Their task is to provide the missing questions.
  4. Social media slow chat. Using Edmodo I ask my students to chat with each other about a given topic. I give out red cards to the chat-initiators and blue cards to the responders. The initiators are in charge of asking questions to any of the responders in the class. Every ten minutes the initiators and responders switch cards. The students are given a time limit to answer, which varies from group to group – this is hard to monitor, of course, but my students are usually honest. The reason why I use Edmodo rather than Twitter is that (a) Edmodo allows the teacher to edit mistakes (please note: I only correct major intelligibility mistakes); (b) it looks a lot like Facebook but is safer; (c) the teacher has total control over everything that happens in the interaction. Last year I paired up my class with a class from an overseas school and we chatted on Edmodo for thirty minutes – a great experience that I intend to repeat this year. This activity also works well as a prelude to oral activities, as it allows the students more time to make the same communicative choices they will have to make in oral interaction whilst still putting communicative pressure on them.
  5. Very short translations under time constraints; students need to translate the teacher’s input on mini-boards. Again, the focus here is on intelligibility and fluency rather than on 100% accuracy.
  6. Agree or disagree. A simple statement appears on the screen (e.g. Tennis is very enjoyable; I like it when it rains; the food in the canteen is great) and the students have to write a response on their mini-boards under time constraints.
  7. Fluency assessment. At key stages in the unfolding of a unit of work I use the activity described in point 1, above, but ask students a much broader question and give them a lot more time to answer it on paper or on Google Classroom. At the end of the allocated time I ask the students to stop and note down how many words they wrote. The word-to-time ratio (i.e. words written per minute) gives me an indication of the levels of writing fluency in my class at that moment in time. I value this activity as fluency is an important prerequisite of effective writership (a minimal sketch of how this ratio can be computed appears after this list).
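
For colleagues who record these fluency scores in a spreadsheet or script rather than by hand, the snippet below is a minimal sketch of the arithmetic described in point 7; it is only an illustration, and the student names, answers and task length are invented.

```python
# Minimal sketch: words written per minute in a timed writing task.
# The names, answers and task length below are invented examples.

def words_per_minute(text: str, minutes: float) -> float:
    """Return the word-to-time ratio for one timed writing sample."""
    return len(text.split()) / minutes

# Hypothetical answers produced in a five-minute timed task.
samples = {
    "Student A": "Je joue au foot le week-end avec mes amis parce que c'est amusant",
    "Student B": "J'aime le sport",
}

TASK_MINUTES = 5

for student, answer in samples.items():
    print(f"{student}: {words_per_minute(answer, TASK_MINUTES):.1f} words per minute")
```

Tracking the same figure for the same students at several points in the year makes it easy to see whether writing fluency is actually moving.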

Conclusion

Emerging technologies, especially the internet and social media, have transformed the way we live and the way we use language to communicate. On a daily basis I find myself chatting on social media in four different languages, and I find the linguistic challenges this poses quite taxing, as it requires faster language processing and sociolinguistic competences that I do not always possess.

Whether we like it or not, the vast majority of our students communicate via social media or other forms of instant messaging. Hence, if we are to prepare them for communication in the real world, this phenomenon cannot be ignored. Teaching interactional writing skills is therefore, in my opinion, a must.

Teaching this set of skills also has the added benefit of preparing our students for oral communication, as it requires them to process language in real operating conditions whilst allowing more time than the oral medium does. In this sense, the attainment of effective writership may be seen not just as an end in itself but also as instrumental to the attainment of oral fluency, understood as the ability to retrieve language from long-term memory under communicative pressure. Do you currently work on developing TL writing fluency in your learners?


Seven metacognition-enhancing interventions I will implement this year


Metacognition enhancement is the area of teaching and learning that has always interested me the most as a teacher. As a researcher I investigated this area as part of my PhD 15 years ago and, as a research officer, in a classroom-based research study involving six English state schools under the supervision of Professor Ernesto Macaro of Oxford University – one of the greatest authorities in the realm of learning-to-learn research.

This year, as one of my professional development targets, I intend to embed the following metacognition-enhancing interventions in my teaching practice to test their effectiveness. My ‘guinea pigs’ will be a class of 18 Year 10 students of French preparing for their IGCSE examination. The reason for wanting to enhance my students’ self-regulation and meta-learning skills has to do with the nature of the examination they will sit next year and the relatively limited contact time available (two hours per week). I do believe that these students – especially the weaker ones – will benefit from the following interventions, as they need to become more responsible for their own learning and more aware of their problem areas, and must learn to optimize their use of the little teaching and learning time available.

The reader should note that, in many cases, as a result of the self-regulatory processes and metacognitive dialogues that the activities below will spark off, I will also have to model specific cognitive and affective strategies to my students in order to address any issues identified in their learning.

1. Reflective journal – Every week I will ask my students a different question which will ‘force’ them to reflect on how their learning is going. I have set up a Google folder and a Google Doc per student in which they will write a 50-word-minimum answer to each question. The first question – next week – will be: What aspects of French learning cause you the most anxiety? How can I help you? How can you help yourself? Every week I will change the focus of the students’ reflection; but every so often I will go back to an ‘old’ question to see if there has been any progress in a specific area.

I will not ‘mark’ the students’ journal entries, but will give them a rapid read and respond with a concise comment and/or a request for clarification or expansion. If I do think that the quality of the reflection is below my expectations of a given student, I will have a chat with them at the end of the next lesson.

2. Retrospective verbal reports on essay-writing – The week after next I will ask my students to write a short essay (around 150 words) under time constraints, which I will assess using the same criteria as our examination board. At the end of the essay I will ask them to reflect on and write about, in as much detail as possible, any issues they encountered in carrying out the task in the areas of grammar and vocabulary, as well as any other problem they experienced (e.g. stress; cognitive block). As they write I will walk around and scaffold the process by asking probing questions if I feel they need prodding. I intend to do this twice a term.

Every time I have carried out retrospective verbal reports they have yielded valuable data and have served another very important goal: enhancing students’ awareness of their problem areas. This has always provided me with a very useful platform for starting a very productive metacognitive dialogue with my students.

3. Think-aloud protocols – Later on in the term, after identifying the three students who are most seriously underachieving in reading and/or writing, I will involve them in think-aloud sessions in which they will perform a reading or writing task whilst verbalizing their thoughts; I will often intervene in the process by asking probing questions to delve further into their thinking. This technique, as I have already discussed in a previous post, has a double effect: firstly, it yields incredibly useful data as to how the students tackle the tasks and where they go wrong or experience linguistic and/or cognitive deficits; secondly, it engages their metacognition.

I will only focus on three extreme cases not because there is something special about this number, but merely for reasons of manageability/time constraints. I tried bigger numbers before and did not cope very well.

4. L.I.F.T. – I will encourage the students to use L.I.F.T. in every essay as much as possible – although I will not make it compulsory. As I have already discussed in another post, L.I.F.T. stands for Learner Initiated Feedback Technique: whenever a student has a doubt about a grammatical or lexical structure, she asks the teacher a question which she annotates in the margin (e.g. ‘Was I right to use the present subjunctive here?’). The teacher then answers the questions in her written or oral feedback on that essay. L.I.F.T. enhances students’ metacognition by scaffolding their ownership of the corrective process whilst fostering risk-taking and task-related awareness.

5. Error log – In order to raise their awareness of their problematic areas – a process which hopefully will have started with the first retrospective verbal report (see 2, above) – I am going to ask my students, when I give their essays back, to log in a Google document five different types of mistakes I highlighted in their essays, along with a concise explanation of the possible cause of those mistakes (e.g. didn’t know the rule; got confused with Spanish) and a reminder of the grammar rule broken. The process will enhance their awareness of their problem areas and may trigger the future deployment of editing strategies aimed at addressing them.

6. Lesson videoing + student ‘pet hates’ – I will video one lesson per term and ask my students to write down – anonymously – one or more things about that lesson that they found useful and enjoyable and one or more things they found annoying, tedious and/or not very useful. I will then go through the students’ comments and view the video to get a better grasp of the issues they refer to; I might do this with colleagues to get their opinions and suggestions.

This process will serve three important purposes. Firstly, it will involve students more actively in the learning process by getting them to think about how my teaching impacts their learning; secondly, it will give them the feeling that I heed their opinion; thirdly, it will pave the way for the kind of activities illustrated in the next point.

7. Videoing of student speaking performance with introspection – After showing the students that I am willing to be videoed, evaluated and assessed by them, I am less likely to encounter resistance when I ask to do the same to them. Once a term I will video, for five minutes, the students I have concerns about as they converse with me in French, then spend 15-20 minutes going through the video with them, discussing key points in their performance and possible strategies to address any issues identified. The metacognitive element of this process lies not simply in problem identification but also in the introspection that my questions will trigger.

At the end of the year I will interview my students in order to find out how the above interventions impacted their attitudes to French and their learning.

Although the above list may look like a tall order, it is much more manageable than it seems as it is mostly student-led. I am particularly looking forward to activities 6 and 7.

Why narrowing the speaking assessment focus can have a positive washback effect on L2 learning


Every assessment we carry out in an MFL classroom ought to have a positive washback effect on learning. In this post I argue that, with pre-intermediate to intermediate students, the way MFL learners are typically assessed does not impact learning as much as it ought to, due to a failure to consider the complex nature of oral skill acquisition and the cognitive demands it places on learners.

What I mean is that learners are most often assessed using complex multi-trait or holistic scales which are designed to rate their performance across a number of dimensions of proficiency, such as fluency, intelligibility, grammatical accuracy, pronunciation, complexity, range of vocabulary, ability to comprehend and respond to an interlocutor, etc. However, this approach has the potential to have a negative washback effect on learning when we deal with novice to intermediate learners, due to the huge cognitive demands it puts on them.

This is because, as I have often reiterated in my posts, at this level of proficiency foreign language learners struggle to cope effectively with all of the demands posed by oral production in real operating conditions. Hence, by assessing them using multi-trait or holistic scales which simultaneously assess all of the above components of oral proficiency, we are being hugely unfair to them, as we are not taking into account how finite their cognitive resources are.

Although there is indeed a place for a more multi-dimensional type of assessment in high-stake tests (e.g. end-of-unit tests), when it comes to the all-important low-stake tests we should administer throughout the learning cycle, I advocate a different approach which takes into account developmental factors in the acquisition of cognitive control, i.e. a type of assessment which focuses on one or, at most, two traits at a time. For instance, at one key stage in the unfolding of a unit of work one would focus on the assessment of fluency + intelligibility of output; at another, on range of vocabulary and pronunciation; etc. Obviously, your students will be informed at all times as to which trait will constitute the focus of the forthcoming assessment; this will channel their cognitive resources in one or two directions, thereby pre-empting the risk of them chasing too many rabbits at the same time and ending up catching none.

This approach, which I have been using for years, not only has the advantage of focusing the learners on one aspect of cognitive control over oral production at a time – with an obvious positive washback effect on learning – but also addresses another important pitfall of oral performance assessment carried out using complex multi-trait scales: the cognitive overload such scales cause oral test raters. Unless raters record the students’ oral performances and listen to them over and over again after the test – which rarely happens with low-stake assessments – complex assessment scales are very likely to cause divided attention, as it is extremely challenging to attend to speaker output and evaluate it at the same time across all of the traits and criteria.

In this sense, low-stake speaking tests assessed using a narrow focus approach do kill two birds with one stone. On the one hand, they optimize the use of the student’s cognitive resources; on the other, they facilitate the test-rater’s task. I will add another advantage I have experienced whilst using this approach, which refers to my professional development: by focusing on a different aspect of oral proficiency at a time for each low-stake assessment one gains a higher level of awareness vis-à-vis the variables affecting its development than one would normally do when focusing on several oral proficiency components at the same time.

It goes without saying that in high-stake assessments the use of multi-trait or holistic rubrics is more useful, as we do want a more comprehensive view of how our students are faring across all the major components of oral proficiency. However, I do feel that many of the holistic and analytical scales adopted by MFL teachers with novice to intermediate learners share a common shortcoming, which has a detrimental washback effect on learning: they do not lay enough emphasis on fluency and on the ability to effectively comprehend and respond to an interlocutor. In their quest for comprehensiveness and for a ‘one size fits all’ solution, they fail to consider that each level of proficiency has different developmental features and should therefore be approached differently in terms of assessment. The more novice the learner, the more skewed towards fluency the scale should be, with the emphasis on accuracy and complexity gradually increasing as we progress further along the language acquisition continuum.

Why the reliability of UK Examination Boards’ assessment of A Level writing papers is questionable



Often, our Year 12 or Year 13 students who have consistently scored high in mock exams or other assessments in the writing component of the A Level exam paper do significantly less well in the actual exam. And when the teachers and/or students, in disbelief, apply for a remark, they often see the controversial original grade reconfirmed or, as actually happened to two of my students in the past, even lowered. In the last two years, colleagues of mine around the world have seen this phenomenon worsen: are UK examination boards becoming harsher or stricter in their grading? Or is it that the essay papers are becoming more difficult? Or could it be that current students are generally less able than previous cohorts?

Although I do not discount any of the above hypotheses, I personally believe that the phenomenon is also partly due to serious issues…


Ten questions to ask foreign language teaching CPD providers


Intro

Throughout my career I have attended lots of MFL PD sessions dishing out plenty of WHATs (i.e. activities) and HOWs (i.e. their classroom implementation). What was usually missing was that essential something that has the power to transform teaching, i.e. the answers to the following difficult questions, which everyone attending such PD sessions should ask. In writing our ‘MFL teacher’s toolkit’, Steve Smith of http://www.frenchteacher.net and I are keeping these questions very much in our focal awareness throughout the whole process.

Ten questions to ask your CPD provider

  1. Why this approach? – What is the rationale for this approach? Why should I use the activities you are recommending? How do you know they are going to work? Teachers are rarely told this by PD facilitators. This is, in my view, the greatest shortcoming of all.
  2. Where is the evidence that this approach ACTUALLY works? – By this I do not mean ‘conclusive’ evidence with long lists of references and statistics, but at least some indication based on classroom research, some objective data that the recommended approach has worked with at least some foreign language students. Teachers, in my experience, need some degree of ‘certainty’ that something they are expected to use in their classrooms actually ‘works’ in order to buy into a new methodology or technology.
  3. How do I sequence the great activities you are recommending? Why? – This is one of the most important questions for many teachers, as it affects the nitty-gritty of their daily practice. Teachers are very busy people; as much as we want them to be reflective and work the ideal sequencing out by themselves, they want – and must – be provided with some sort of reference framework by the people running INSET training sessions.
  4. How do these activities affect students’ cognition and language acquisition? Why? – MFL PD facilitators usually tell you things like ‘This activity develops your students’ vocabulary. It is really effective and fun’; then they show us a video or ask us to try it out with our neighbours. But they never tell us which aspects of grammar or vocabulary learning these activities impact and why. This, in my opinion, is crucial in order to empower teachers with the all-important ability to use those activities effectively and flexibly across contexts in the future.
  5. How do I get my students to ACQUIRE the target language grammar, not just LEARN grammar rules? – As usually happens in MFL PD sessions, you never get to hear how to take students from declarative knowledge (knowing grammar rules) to actual acquisition (using those rules automatically and accurately in spontaneous speech). You are at best shown activities which aim at memorizing grammar rules and practising them (e.g. mechanical or gap-filling drills; fun games), but you are never told how to get students to use them correctly in real-time communication.
  6. How do the AFL strategies and the assessment rubrics you are showing actually help me assess my students’ development in terms of FLUENCY, COMPLEXITY, ACCURACY, VOCABULARY RANGE and DEPTH and COGNITIVE CONTROL in a principled, valid and consistent way? – The AFL strategies and assessment rubrics typically shown by UK MFL consultants are usually very limited in their power to assess performance and proficiency and to help teachers identify at which developmental stage along the language acquisition continuum MFL learners are located.
  7. How are language skills acquired? How do we scaffold and monitor skill acquisition? – More often than not, UK PD providers will show you tons of (very simplistic) rubrics and how you should use them to scaffold skill learning. I wish language learning were that simple…
  8. How can I get learners to acquire the memory strategies you recommend and use them autonomously? – PD facilitators often show scores of slides with memory techniques, but they regularly fail to tell teachers how to get students to use them autonomously without any teacher prodding. Strategy training in memory strategies requires a specific set of knowledge and skills that the average teacher has not been trained in; moreover, to be effective it requires extensive training (lasting months) and intensive scaffolding. Teachers are rarely – if ever – told this.
  9. How can we ensure that PBL implementation actually integrates all four skills and effectively develops and assesses fluency and cognitive control? – This is a crucial and challenging question that everyone should ask anyone showing them great examples of projects integrating emerging technologies. Yet I have never attended or heard of a PD session where this issue is effectively tackled.
  10. Does the kind of differentiated teaching you propose actually work? Where is the evidence and/or theoretical rationale behind it? – None of the PD sessions on differentiation I have ever attended has attempted to provide me with any research evidence that the recommended differentiation strategies actually achieve their intended purpose, or at least with a theoretical rationale for the approach. Yet, since differentiated learning is very laborious and time-consuming to set up, one does expect these questions to be answered.

Conclusion

The above ten questions refer to only a few of the many shortcomings of the PD sessions I have been involved in over 25 years of teaching. It is bizarre how, despite so much research having been carried out in L2 acquisition and pedagogy in the last twenty years or so, lots of MFL PD in the UK seems to have been recycling the same old topics ad nauseam – the only notable additions being PBL and emerging technologies.

As I have often reiterated in my blogs, for MFL PD to be effective it has to empower teachers with the WHY of language acquisition and pedagogy and the WHEN. The HOW should focus more on the process of learning than on how to use an activity or app, i.e. on how language learning is impacted by each step we decide to take in our planning, execution and assessment. Take PD in emerging technologies, for instance: the facilitator comes in, shows you a few apps and web-tools and how to use them; teachers try them out and… that’s it! How about: how do they impact the process of learning at different stages of proficiency, and why?

MFL CPD providers should not presume that teachers are not capable of or interested in learning how MFL students learn. For transformational professional development to work, it must provide a clear and convincing rationale as to why the methodological framework or principles proposed have the potential to be effective. Teachers must feel a sense of empowerment which cannot simply be brought about by being provided with new teaching strategies; rather, first and foremost it requires an understanding of how language learning happens; how what we do in the classroom affects acquisition and cognition; what fuels and sustains the development of cognitive control over language reception and production skills; what the markers of fluency, accuracy and complexity are at the various stages of language acquisition; how we bring about learner autonomy, etc.

On the subject of learner autonomy, I find it scandalous that after thirty years of research in learner training (or learning to learn) UK PD providers’ knowledge of this area of research and pedagogy can be so inadequate, especially considering how some very well-known learner-training researchers are actually UK based (e.g. Ernesto Macaro, Vee Harris, Suzanne Graham).

Of course, another major pitfall of PD relates to its follow-up: how firmly the content of the training session(s) is kept by teachers in their focal awareness, and implemented and self-monitored in their daily practice, well after the PD event(s). But this is beyond the scope of this post.

Six very common flaws of foreign language assessment


  1. Teaching – Assessment mismatch

Often foreign language instructors test students on competences that have not been adequately emphasized in their teaching or, in some cases, have not even been taught.

The most common example of this is the issue of task unfamiliarity, i.e. the use of an assessment tool or language task the students have never or rarely carried out prior to the test. This can be an issue, as research clearly shows that the extent to which a learner is familiar with a task will affect his/her performance. The reasons for this relate to the anxiety that the unfamiliarity engenders, the higher cognitive load that grappling with an unfamiliar task places on working memory (especially when the task is quite complex) and, of course, the fact that memory (and, consequently, task-knowledge) is context-dependent – which means that knowledge is not easily transferred from task to task (the so-called T.A.P. or Transfer Appropriate Processing principle). By doing a task over and over again prior to an assessment involving that task, the student develops task-related cognitive and metacognitive strategies which ease the cognitive load and facilitate its execution.

Another common scenario is when students are not explicitly focused on, and provided with sufficient practice in, a given area of language proficiency (e.g. accuracy, fluency, vocabulary range, grammatical complexity), yet their teachers use assessment scales which emphasize performance in that area (e.g. by giving grammatical accuracy a high weighting in speaking performance whilst practising grammar only through cloze tasks). I have had several colleagues in the past who taught their students through project-based work involving little speaking practice even though they knew that the students would be assessed in terms of fluency at the end of the unit. Bizarre!

Language unfamiliarity is another instance of this mismatch, in my opinion. This refers to administering a test which requires students to infer from context or even use unfamiliar words, and which therefore assesses the learners not on the language learnt during the unit but on compensation strategies (e.g. guessing words from context). Although compensation strategies are indeed a very important component of autonomous competence, I do believe that a test needs to assess students only on what they have been taught and not on their adaptive skills – or the assessment might be perceived by the learners as unfair, with negative consequences for student self-efficacy and motivation. A test must have construct validity, i.e. it must assess what it sets out to assess. Hence, unless we explicitly provide extensive practice in inferential skills, we should not test students on them.

Some teachers feel that, since the students should possess the knowledge of the language required by the task, it will not matter whether they are familiar with the task or not; this assumption, however, is based on a misunderstanding of L2 acquisition and task-related proficiency.

  2. Knowledge vs control

Very often teachers administer ‘grammar’ tests in order to ascertain whether a specific grammar structure has been ‘learnt’. This is often done through gap-fill/cloze tests or translations. This approach to grammar testing is correct if one is purporting to assess declarative (intellectual) knowledge of the target structure(s), but not the extent of the learners’ control over them (i.e. the ability to use grammar in real operating conditions, in relatively unmonitored speech or written output). An oral picture task or a spontaneous conversational exchange eliciting the use of the target structure would be a more accurate way to assess the extent of learner control over grammar and vocabulary. This is another common instance of construct invalidity.

  3. Listening vs Listenership

This refers less to a mistake in assessment design than to a pedagogical flaw and assessment deficit, and it is a very important issue because of its major washback effect on learning. Listening is usually assessed solely through listening comprehension tasks; however, this does not test an important set of listening skills, ‘listenership’, i.e. the ability to respond to an interlocutor (a speaker) in real conversation. If we only test students on comprehension, the grade or level we assign to them will reflect one important set of listening skills (comprehending a text) but not the one they need the most in real-life interaction (listening to an interlocutor as part of meaning negotiation). Listening assessments need to address this important deficit, which, in my opinion, is widespread in the UK system.

  4. Lack of piloting

To administer a test without piloting it can be very ‘tricky’ even if the test comes from a widely used textbook assessment pack. Ambiguous pictures and solutions, speed of delivery, inconsistent and/or very subjective grading of tests and construct validity issues are not uncommon flaws of many renowned course-books’ assessment materials. Ideally, tests should be piloted by more than one person on the team, especially when it comes to the grading system; in my experience this is usually the most controversial aspect of an assessment.

  5. ‘Woolly’ assessment scales

When you have a fairly homogeneous student population, it is important to use assessment scales/rubrics which are as detailed as possible in terms of complexity, accuracy, fluency, communication and range of vocabulary. In this respect, the old UK National Curriculum Levels (still in use in many British schools) were highly defective, and so are the GCSE scales adopted by UK examination boards. MFL departments should invest some quality time in coming up with their own scales, making specific reference in the grade descriptors to the traits they emphasize the most in their curriculum (so as to satisfy the construct validity criterion).

  6. Fluency – the neglected factor

Just like ‘listenership’, fluency is another factor of language performance that is often neglected in assessment; yet it is the most important indicator of the level of control someone has achieved over TL receptive and productive skills. Whereas in speaking UK MFL departments do often include fluency amongst their assessment criteria, in writing and reading this is not often the case. Yet it can be done relatively easily. For instance, in essay writing, all one has to do is set a time and word limit for the task in hand and note down the time of completion for each student as they hand it in. Often teachers do not differentiate between students who score equally across accuracy, complexity and vocabulary but differ substantially in terms of writing fluency (i.e. their word-to-time ratio). By so doing we fail to assess one of the most important aspects of language acquisition: executive control over a skill. In my view, this is something that should not be overlooked, in either low-stake or high-stake assessments.

Micro-listening skills (Part 2) – More micro-listening tasks for the foreign language classroom


As a follow-up to my post ‘Micro-listening enhancers you may not be using in your foreign language lessons’, here is a new list of micro-listening enhancers I frequently use in my lessons.

  1. Parallel sentences

I usually read out – or record myself or a native speaker reading out – ten or more sets of two semi-identical sentences, which differ only slightly, at native-speaker speed. For instance, in the example below the difference is ‘vers’ (around) vs ‘à’ (at). The students have to identify the differences and note them down on mini-boards. I am not at all bothered about the spelling of the target word.

Je suis allée au cinéma vers huit heures

Je suis allée au cinéma à huit heures  

  2. Sudden stop

I give students a transcript of a listening text, then play or read the text at native speed, suddenly stopping when I see fit. The students write on mini-boards the last word I uttered.

  3. Spot the wrong sound

I pronounce a target language sentence or short text making sure that I make a typical L1 transfer phonetic error. For instance, in the sentence below, I would pronounce the ‘h’ the English way. The students need to identify my pronunciation mistake.

J’habite en Malaisie

  4. Spot the correct transcription

The teacher reads out a sentence at near native speed. The students are provided with a gapped version of that sentence and with three near-homophones (words that sound very similar). The task is to choose the correct option.

Teacher says : J’y vais avec lui

Students see (on screen/whiteboard) : J’y vais avec _______

Options to choose from : Louis – lui – l’huile

  5. Sound-tracking

Students are given a short target language text. The teacher says a sound, for instance /wa/ (as in ‘moi’), and the students, under time conditions, have to scan the text in search of any letter or combination of letters that corresponds to that sound and highlight it.

  6. Silent letters (group work)

Students are given a set of sentences and, working in groups, take turns to read them out to each other, circling the letters they think will not be pronounced. The teacher then provides the answers to confirm or correct the students’ assumptions.

  7. Break the flow

This is a classic. Students are presented with sentences (on whiteboard/screen) written out as shown below, with no spacing between the words. The teacher utters the sentence two or three times at native speed and the students have to rewrite it with the appropriate spacing.

What students get: Jenefaispasdesportcarjesuisparesseux (=I do not do sport because I am lazy)

What they are meant to do: je ne fais pas de sport car je suis paresseux

  8. Anagrams

Students are given a set of anagrams of words they have never come across before, which the teacher pronounces one by one, two or three times. Based on the target sounds – which will have been practised beforehand – the students have to rewrite the words in their correct form. I like this exercise because, when the words are truly unfamiliar, it does require a good grasp of TL phonology and inferential skills.

  9. IPA practice – back-transcription

This is meant as training in the International Phonetic Alphabet, or IPA. After much work on phonological awareness, and practice and teaching of the IPA equivalent of each target sound (e.g. oi = /wa/; en = /ɑ̃/), the students are presented with a list of phonetic transcriptions of words, which I usually get from www.wordreference.com. Their task is to write them back in their normal graphemic form (one way of storing and checking such an answer key digitally is sketched after this list).

What students are given: [mwa], [pɑ̃dɑ̃], etc.

The correct answers: moi, pendant, etc.

  10. IPA practice – IPA transcription

This is best done after extensive practice with the previous activity and with the IPA symbols in general. The teacher utters a series of words and the students write them out on whiteboards using the symbols provided. Sheets with the target IPA symbols can be given to students as scaffolding. A tip: do not deal with too many symbols at any one time and keep the words as short as possible.
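
Teachers who prepare or mark the back-transcription items (activity 9) digitally might find a small answer key useful. The snippet below is only a minimal sketch, not something the activity requires: the two IPA–spelling pairs are the ones quoted above, while the checking function and the sample student attempt are invented for illustration.

```python
# Minimal sketch: checking back-transcriptions (IPA -> normal spelling).
# The two key items come from the examples above; the student attempt is invented.

ANSWER_KEY = {
    "mwa": "moi",
    "pɑ̃dɑ̃": "pendant",
}

def check_attempt(student_answers):
    """Compare a student's spellings against the answer key, item by item."""
    for ipa, expected in ANSWER_KEY.items():
        given = student_answers.get(ipa, "").strip().lower()
        if given == expected:
            verdict = "correct"
        else:
            verdict = f"check spelling (expected '{expected}')"
        print(f"[{ipa}] -> {given or '(no answer)'}: {verdict}")

# Hypothetical attempt by one student.
check_attempt({"mwa": "moi", "pɑ̃dɑ̃": "pendent"})
```

The same key, read in reverse, can also be used to generate items for the IPA-transcription variant in activity 10.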

12 challenges of ICT integration in MFL instruction


INTRODUCTION

ICT integration in the MFL curriculum and its implementation in the classroom pose a series of important challenges which refer to the way digital tools and tasks affect language acquisition as well as the students’ general cognitive, emotional, social and moral development.

Unfortunately, not much is known about the impact of digitally assisted teaching and learning on language acquisition, and I am not aware of any research-based approaches to ICT integration in MFL. Teachers apply their intuitions and the experience accumulated through non-digital teaching to emerging technologies, often taking a leap of faith.

In what follows I advocate a more systematic, thorough and reflective approach to ICT integration in MFL teaching and learning, one which considers cognitive, ethical, pragmatic and, of course, pedagogical factors. I also believe that, whilst emerging technologies can indeed enhance learning, they will not do so unless they are effectively harnessed. This requires (a) an understanding of language acquisition; (b) a grasp of how each digital medium affects acquisition; and (c) a sense of how it fits an individual teacher’s ecosystem.

TWELVE CHALLENGES OF ICT INTEGRATION IN MFL

Please note that the list below is not exhaustive; it only includes what I deem to be the most important challenges I have identified based on my own experiences with ICT integration.

Challenge 1 – Enhancement power vs Integrating ICT for ICT integration’s sake

The first question you have to ask yourself when planning to use an app or web-tool or carrying out any digitally-based task in your lessons is: Does it really enhance teaching and learning as compared to my usual non-ICT based approaches? How can I be sure of that?

Teachers often feel compelled to use an app or web-tool for fear of not appearing innovative enough in the eyes of their managers; however, if their classroom presence, charisma, humour, drama techniques, etc. are more compelling than Kahoot, iMovie, Tellagami and the like, it would be silly to use less impactful teaching tools just to tick the ICT-integration box – unless, of course, one is doing it for the sake of variety or to get some rest at the end of a long and tiring school day.

In my experience this is one of the most difficult challenges when a school goes 1:1 with laptops or tablets. Teachers feel that if they do not use ICT as much as possible they are somehow failing the school or will be left behind. This can cause some teachers, especially the less ICT-savvy ones, serious anxiety. And if this issue is not properly tackled from the beginning, learning will be affected quite negatively, especially during the first phase of integration, when teachers are trying things out.

Possible solutions

This issue can be addressed by:

  1. Providing training in how specific digital tools and tasks actually affect student cognition and language acquisition at different stages of MFL proficiency development. Emerging technologies workshop facilitators and ICT integration coaches usually show teachers how to use a specific tool or conduct an ICT-based task, but do not delve into how it enhances language acquisition across the four macro-skills and why it is superior to its non-digital counterpart. To buy into ICT integration in MFL, teachers must be provided with a solid rationale as to why, how and when in a teaching sequence specific digital tools and tasks enhance learning. This is rarely done. Experimentation with a digital tool at the expense of student learning is ethically wrong, considering how little teacher contact time is usually available to MFL learners.
  2. Deciding on a set of clear pedagogic guidelines, both generic and subject-specific, for ICT integration in lessons (e.g. a generic guideline: Is there an expectation for the iPad to be used in every single lesson?; an MFL-specific guideline: when doing a project about school, students should not do the filming around school during lesson time). Whilst an overly prescriptive framework would, in my view, impact teaching and learning negatively, a clearly laid out pedagogic framework for ICT integration would prevent some teachers from over-using digital tools and others from under-using them. The chosen framework will have to be consistent with the MFL department’s espoused methodological approach to language instruction.
  3. Producing an MFL-department ‘ICT handbook’ detailing the benefits and drawbacks of each digital tool, and embedding it in curriculum design – this would complement the above-envisaged guidelines. The ‘handbook’ could take the form of an interactive Google sheet onto which each member of the MFL team logs the pros and cons for learning of each digital tool as their understanding of its impact on learner cognition increases. Unlike what often happens, this should not merely consist of a dry list of the latest and coolest apps with technical tips and a cryptic description like “Good for listening” or “Children find it fun”. After the first ICT integration cycle, embed the best recommendations found in the ‘handbook’ in your schemes of work.
  4. Piloting the digital tools prior to use – Get a few students – including ‘weaker’ students, too – to try the app or website at lunchtime or after school, possibly in your presence; if that is not possible, ask them to do it at home. Then gather some feedback about the ‘mechanics’, how much they learnt, how enjoyable they found it and how they would improve it. Remember to ask them to compare the impact of that app with the non-digital approaches to the same tasks that you would normally use. Piloting digital apps and tasks has been a lifesaver for me on several occasions.
  5. Student voice – see below (Challenge 7).

Challenge 2 – Language learning metaphors we live by

Digital literacy within and without the language learning domain is very important. So are e-learning strategies (i.e. the set of approaches and techniques which MFL students can deploy in order to use web-based resources independently to enhance their learning). However, if, as part of our teaching philosophy, we set out to forge effective lifelong language learning habits in our students, we also have the ethical imperative to impress on them the importance of human-to-human interaction as a fundamental lifelong learning skill. As discussed in a previous post (‘Rescuing the neglected skill’), ICT integration in MFL often results in oral proficiency, and particularly fluency (the proceduralization of speaking performance), being neglected. This is usually due to a tendency to involve students in (a) ludic activities (e.g. online games of low surrender value); (b) the creation of digital artefacts through single or multiple apps; (c) digitally assisted projects of the kind envisaged by Puentedura’s Redefinition level.

By skewing the classroom language acquisition process towards an interaction-poor and mainly digitally based model of learning, we give rise to metaphors of learning which do not truly reflect the way language is produced and learnt in the real world. These metaphors – rightly or wrongly – may form the core of our students’ beliefs as to how languages are learnt for the rest of their lives!

Possible solutions

Teachers ought to ensure that there is a balanced mix of digital and non-digital tasks and that oral activities aimed at developing oral competence and involving learner-to-learner interaction feature regularly in MFL lessons. After all, we want to forge students who can interact with other humans, don’t we?

Challenge 3 – Curricular imbalance

As mentioned above, ICT integration in MFL has the potential to be detrimental to oral fluency development. In my experience, however, other aspects of language acquisition are penalized too; one is listening fluency and another is grammar proceduralization. As far as listening goes, one reason is that there are not very many free apps and web-based resources which promote aural skills. Another issue is that one very important aspect of listening competence, listenership, is often overlooked in the ‘techy’ classroom. Listenership refers to the ability to listen, comprehend and respond as part of an interaction with an interlocutor; when learners work with a machine, there is no interlocutor to react to.

As for grammar, there are verb conjugators like mine (www.language-gym.com) and apps and websites involving work on morphology and syntax; but these are not designed to develop cognitive control over the target grammar structures (i.e. accurate written or oral performance under real operating conditions).

Language and communicative functions (e.g. agreeing and disagreeing; sequencing information; comparing and contrasting) are also often neglected in digitally-based PBL.

Possible solutions 

Ensure that in every unit in your schemes of work ICT-based integration does not result in an excessive focus on reading, writing, word-level vocabulary learning and grammatical drills of the gap-filling/mechanical kind.

Challenge 4 – Divided attention

Unless students are highly proficient in the target language, digital manipulation has the potential to hijack student attention away from language learning. Cutting, pasting, app-smashing, searching for sound effects, downloading images, etc. take up a lot of the students’ cognitive resources. My students also often complain about being distracted by the notifications they get from e-mail and social networks whilst on task. The obvious consequence is divided attention, which may lead to poor retention.

Another source of divided attention is the cue-poor nature of digital media. In a typical interaction with a teacher or peer an MFL student can rely on many verbal and non-verbal cues (e.g. facial expressions and gestures) which will enhance understanding or clarify any doubts s/he may have; however, when it comes to digital environments, apps or web-tools are not as cue-rich and may pose various challenges that may disorientate less ICT-savvy or confident learners. If one has twenty or more students, it is not rare to have two or three experiencing this sort of problem at the same time. Hence, yet another source of divided attention.

Possible solutions

First and foremost, get your students to turn off social networks and social network notifications, and establish some clear sanctions for those who do not.

Do not involve students in any digital manipulation in lessons unless absolutely necessary. Flip it: get it done at home!

Model the use of an app or web-tool to death, until you are 100% sure everyone has understood how to use it. Also, group the less ICT-savvy students together – you will spot them very soon, trust me. This will save you the hassle of running from one side of the classroom to the other and will prevent other students from being distracted by requests for help coming from those very students.

Challenge 5 – Time-cost

This point follows on from the previous one: in view of the very little teacher contact time available for MFL teaching, one has to put every single minute to very good use. Investing a lot of valuable lesson time in the mechanical manipulation of a digital tool which does not directly contribute to language acquisition verges on the unethical, considering that the ultimate objective of language learning is to automatize receptive and productive skills – which requires tons of practice.

Possible solutions

The same as those provided for the previous point.

Challenge 6 – Variety

I have observed many ‘techy’ lessons where students spend a whole hour on the iPad! This practice is counter-productive not only for the reasons I have discussed above, but also because there is research evidence that many students find it tedious and not conducive to learning (Seymour, 2015). In a small survey I carried out some time ago, my students recommended that continuous use of the iPad should last no longer than 15-20 minutes.

Possible solutions

Even though you may change app, game or task, you cannot keep your students on the iPad, laptop or whatever device you use for a whole lesson! As discussed in a previous post, it is good practice to alternate short and long focus, individual and group work, digital and non-digital tools/tasks, receptive and productive skills, ludic and academic activities, etc.

Challenge 7 – Student voice – How do we get the information?

At some point you will want to find out how ICT integration into your teaching ecosystem is being perceived by your students. Many education providers will go about it by administering questionnaires, possibly a Google Form, as it is easy to set up, administer and analyze quantitatively. However, there are many problems with that if you want to obtain rich data which will help you improve the next cycle of ICT integration. The main problem is that a typical Google Form survey ‘imposes’ closed questions on the informants, requiring them to answer by choosing yes or no, or a frequency (e.g. often, rarely) or quantity (e.g. a lot, some) adverb. However, say 30% of the students indicate they do not find ICT useful; you will not know why, or which aspects they are referring to. Not to mention the fact that this kind of data elicitation encourages a very superficial ‘tick the box’ approach which does not spark off deep thinking on the issues in hand.

Hence, to enhance the richness of the data obtained, course administrators ought at least to triangulate the Google Form surveys with one-to-one semi-structured interviews involving a cross-section of learners of different abilities and genders. A former colleague of mine, in her master’s dissertation research, found that her interview findings on ICT integration often contradicted the data on the same issues obtained through a Google Form survey.

Challenge 8 – Evaluation of ICT integration by peers/SLT

In many schools ICT integration in MFL is evaluated by using the SAMR model. This can be detrimental to foreign language learning for reasons that I have discussed at length here:  http://tinyurl.com/o3yp64v (Of SAMR and SAMRitans – Why the adoption of the SAMR model may be detrimental to foreign language learning)

Possible solution

See solution to challenge 1, above (point 2)

Challenge 9 – Learner attitudes towards digital tools

If your school has gone 1:1 with the iPad or another tablet, you may find that in most of your classes there will be one or more students who do not enjoy using the device for language learning. This presents you with an ethical dilemma: if it is imperative to differentiate our teaching so as to cater for every learner’s cognitive and emotional needs, what can you do to suit their learning preferences?

Possible solution

If, after trying various tactics, you still have not managed to ‘win the learner over’ and their aversion to the mobile device remains, you have three options that I can think of. First option: tough, they need to adapt – not the most empathetic of approaches. Second option: give them alternative tasks with similar learning outcomes using traditional media. Third option: if you use the iPad for snappy activities of no more than 10-15 minutes alternated with other non-digital tasks, as I do, you will find that these students will not complain much at all, in the end.

Challenge 10 – Lack of meta-digital language learning awareness

The internet offers a vast array of TL learning opportunities. However, students need to be equipped with the metacognitive tools which will enable them to seize those opportunities. This entails teaching learners how to learn from TL internet input sources, web tools and apps so as to be able to use them autonomously (e.g. how best to use verb conjugation trainers, YouTube, Memrise, Wordreference forums, etc. for independent learning).

Possible solutions

Embed one or more explicit learner training programs in meta-digital learning strategies (e.g. e-learning strategies) per ICT integration cycle. Since learner training programs require extensive practice in the target strategies, one should offer instruction in only a limited range of strategies per cycle. A widely used instructional framework can be found in this post of mine: http://tinyurl.com/qxafy7m (‘Learner training in the MFL classroom’)

Challenge 11 – Ethical issues

Easy access to the digital medium and apps entails the danger of ‘cut and paste’ plagiarism and the use of online translators.

Possible solution

Besides alerting the students to the dangers of such practices, teachers may ask their students to sign a ‘contract’ in which they pledge not to commit plagiarism or use online translators. During the initial phase of integration, teachers will have to remind students of their pledge to keep it in their focal awareness and spell out the sanctions for flouting the ‘rules’. In my experience this is a highly successful tactic to pre-empt such issues.

Challenge 12 – Product vs Process

Since many apps and emerging-technologies workshops concern themselves with the production of a digital artefact (e.g. iMovies), ICT integration in MFL is often characterized by an excessive concern with the product of learning. However, language teaching should focus mainly on the development of listening, speaking, reading and writing skills (the process of learning) rather than merely on putting together a digital movie, app-smash or presentation (the product).

Possible solution

The production of a digital artefact can be conducive to language acquisition if teachers and learners lay as much emphasis on the process of learning as they do on the final product. This involves identifying what levels of complexity and fluency in each macro-skill are expected of the learners at each stage in the development of a given project and providing extensive contextualised practice throughout the process with those aspirational levels in mind.

Conclusion

Successful ICT integration in the MFL curriculum and its practical implementation are no easy tasks. First and foremost, they require a pedagogical framework which is consonant with whatever instructional approach to L2 learning is espoused by the MFL team. Secondly, their effectiveness will be a function of the extent to which a teacher knows how, why and when digital tools enhance learning and affect learner cognition and language acquisition. Thirdly, teachers need to resist the tendency, common during the first cycle of integration and especially amongst emerging-technologies enthusiasts, to let the digital medium take over and drive teaching and learning. This is highly detrimental to learning and I have experienced first-hand the damage it causes to student enjoyment and motivation as well as to their levels of oral and aural fluency. Fourthly, learners’ meta-digital learning awareness and strategies need to be enhanced through structured learning-to-learn programs. Fifthly, teachers must not forget that they have the ethical imperative to form a balanced linguist versed in all four macro-skills and endowed with high levels of fluency and autonomous competence. ICT integration must be eclectic and mindful of this.

In conclusion, ICT integration in MFL is not yet an exact science. For it to be successful, the best recipe is not to rush it for fear of being ‘left behind’ or of not appearing to be an ‘innovative’ school or teacher. Course administrators and teachers must always have at heart the linguistic, cognitive, socio-affective and moral needs of their students and integrate ICT with those needs in mind – not merely integrate for integration’s sake. Ultimately, ICT integration should assist, not drive, teaching and learning.

13 key steps to successful vocabulary teaching


The following are the principles that underpin vocabulary teaching in my everyday practice. The reader may want to refer to my article “How the human brain stores and organizes vocabulary and implications for the EFL/MFL classroom” for the theoretical background to my approach.

1. Select the vocabulary based on:

  • Learnability (how easy or challenging lexis is in terms of length, pronunciation, spelling, meaning, grammar, word order in a sentence, etc.);
  • Frequency;
  • Relevance to students’ interests, background, culture/sub-culture;
  • Semantic relatedness (the more strongly semantically inter-related the target words are, the stronger the chances of retention).

Since I usually create my own vocabulary teaching resources (see the work-out section of www.language-gym.com or https://www.tes.co.uk/member/gianfrancoconti1966), it is easy for me to keep track of the target lexis through each step of the lesson. If you do not make your own resources, especially if you are a novice teacher, it may be useful to draw up a list of the words you intend the students to learn to ensure that systematic recycling occurs throughout the lesson.

2. Decide which lexical items you are planning for students to learn receptively (for recognition only) and productively (for use in speech and writing) – ‘receptive learning’ being obviously easier.

3. Decide how ‘deep’ your teaching of the target lexis is going to go – in other words, which levels of knowing a word you are going to teach. Nation (1990) identified the following dimensions of knowing a word:

Learner knows:

  1. Spoken form of a word;
  2. Written form of a word;
  3. Grammatical behavior of a word;
  4. Collocational behavior of a word;
  5. Frequency of a word;
  6. Stylistic appropriateness of a word;
  7. Concept meanings of a word;
  8. Association words have with other related words.

4. The number of words that you select per lesson will depend largely on the students you are teaching and on how systematically one wants the target lexis (every single item) to be recycled. There is a myth that one should teach 7 +/- 2 words per lesson. This rule of thumb is based on a misunderstanding of Miller’s (1956) law, which posits that Working Memory can only hold and rehearse 7 +/- 2 digits at any one time. But Working Memory span has nothing to do with how many words one can learn in a lesson. In my experience, with an able group (i.e. students with highly efficient working memories) one can aim at as many as 20-25 words/lexical chunks receptively (especially if the lexis includes cognates) and around 10 to 15 productively. This is on condition that the words are recycled frequently and systematically and that as many retrieval cues as possible are provided (by building in lots of semantic associations);

5. To reduce the risk of cross-association, avoid selecting items which are similar in sound and/or spelling, especially with younger or less able learners;

6. Ensure that the lexical items selected include a good balance of nouns and verbs – as I discussed in previous posts, there is an unhealthy tendency in many MFL classrooms for vocabulary teaching to be noun-driven.

7. Plan for several recycling opportunities throughout the lesson through various modalities (e.g. listening, speaking, reading, writing, body gestures). Ensure students process each and every target lexical item, receptively and productively, 8 to 10 times during a given lesson – more if possible! Since research clearly shows that learners notice adjectives and adverbs less, greater attention should be given to these two word classes;

8. Ensure the recycling opportunities include activities which involve:

  • Higher order thinking skills / depth of processing (e.g. odd one out; matching with synonyms; inferring meaning from context, etc.) – the deeper and more elaborate the level of semantic analysis of the target lexis, the stronger the chances of ensuring retention;
  • Communicative activities involving information gaps and negotiation of meaning (surveys; find someone who; find who does what; etc.) – a substantial body of research indicates that these activities significantly enhance vocabulary acquisition;
  • Work on orthography;
  • Work on phonological awareness;
  • Work on the words’ grammar;
  • Work on the words’ collocational behavior;
  • Semantic and phonetic associations with previously learnt lexis;
  • Inferential strategies (e.g. understanding texts in which the target lexis is instrumental in grasping the meaning of unfamiliar words);
  • Creating associations with each individual learner’s personal life experiences;
  • A competitive element (games);
  • Personal investment / self-reliance (e.g. using dictionaries; creative use of the target words).

Give as much emphasis as possible to the correct pronunciation of the target lexis from the very early stages, as vocabulary recall is phonologically mediated. Also, ensure that vocabulary work is as student-centred as possible in order to maximize the level of individual cognitive investment in the learning process. Do remember that retention is more likely to occur when learning involves deep levels of processing and substantial personal investment.

9. Ensure that words are practised in context, not in isolation – hence, if you are staging games or other ludic activities, ensure that they involve the processing or deployment of the target lexis within meaningful sentences. Since exposure to the target lexis through the listening medium is often neglected in the typical UK classroom, ensure that students get plenty of aural practice.

10. When using visuals, ensure they are as unambiguous as possible. If using visuals to present new lexis, make sure that students are not exposed to the spelling of the words until after you have practised their pronunciation a few times;

11. Draw on the distinctiveness principle as much as possible to ensure that, through visuals, anecdotes, jokes or special effects, the most challenging vocabulary items are made to stand out and become memorable;

12. Occasionally – not in every lesson – select a strategy or set of memory strategies (e.g. the keyword technique) to model to and train students in. If you do teach memory strategies, ensure that you recycle and scaffold practice in those strategies in several subsequent lessons to keep them in the learners’ focal awareness for as long as you deem necessary for uptake to occur;

13. Plan for systematic and distributed (a little bit every day rather than a lot in one go) practice/recycling of the target lexis in homework and future lessons. Remember Ebbinghaus’ curve (Figure 1, below), mapping out humans’ rate of forgetting, and set homework accordingly so as to prevent memory decay. The fact that your students WILL forget around 67% of what they ‘learnt’ in a lesson after only one day should prompt you to plan your recycling carefully. Figure 2 shows how I do it for grammar and vocabulary, i.e. a spreadsheet that lists vertically the items to recycle and horizontally each week of the present term; the sheet allows you to keep track of how often you have been teaching a given set of vocabulary and to plan for future recycling.
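For readers who like to see the curve formalised, a common simplified rendering of Ebbinghaus’ findings – my own gloss for illustration, not a formula lifted from his work – is the exponential decay

R(t) = e^{-t/S}

where R(t) is the proportion of the ‘learnt’ material still retrievable t days after the lesson and S is the relative strength of the memory trace, which deep processing and every recycling episode increase. On this reading, the roughly 67% loss after one day mentioned above would correspond to an S of about 0.9 days, since e^(-1/0.9) ≈ 0.33; each round of recycling effectively raises S and so flattens the curve.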

Figure 1 – The rate of human forgetting


Figure 2 – Recycling tracking sheet

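For teachers who would rather keep the tracking sheet digital, here is a minimal sketch in Python of the same idea – items down the side, weeks across the top – which flags anything that has not been recycled recently. All the names and figures in it are invented purely for illustration; adapt the structure to whatever spreadsheet or tool you actually use.

```python
# Minimal sketch of a recycling tracking sheet: rows = lexical items or
# grammar points, columns = weeks of the current term. Each cell holds the
# number of times the item was recycled in that week.

from typing import Dict, List

def items_needing_recycling(sheet: Dict[str, List[int]],
                            current_week: int,
                            max_gap: int = 2) -> List[str]:
    """Return the items not recycled at all in the last `max_gap` weeks."""
    neglected = []
    for item, weekly_counts in sheet.items():
        recent = weekly_counts[max(0, current_week - max_gap):current_week]
        if sum(recent) == 0:
            neglected.append(item)
    return neglected

# Hypothetical example: a six-week term, tracked up to week 4.
tracking_sheet = {
    "daily routine verbs":   [3, 1, 0, 0, 0, 0],
    "perfect tense (avoir)": [0, 2, 2, 1, 0, 0],
    "opinion phrases":       [4, 0, 0, 0, 0, 0],
}

print(items_needing_recycling(tracking_sheet, current_week=4))
# -> ['daily routine verbs', 'opinion phrases']
```

The same logic can of course live in a spreadsheet formula; the point is simply to make gaps in recycling visible at a glance so that distributed practice can be planned rather than left to chance.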

Here are ten commonly made mistakes in vocabulary instruction that every EFL / MFL teacher ought to look out for: 10 commonly made mistakes in EFL vocabulary instruction

Ten questions you may want to ask yourself in planning a unit of work


Here are ten questions that I believe – ideally – curriculum designers should be asking themselves in planning and evaluating a topic-based unit of work in the context of an eclectic instructional approach with a communicative focus, where grammar is taught explicitly and whose ultimate aim is the acquisition of cognitive control over TL use. The reader should note that I have avoided any explicit reference to the expression ‘learning outcomes’ (although they are implicitly referred to throughout what follows). This is because this term in my perception – and maybe only in my perception – evokes an excessive concern with the product of learning, whereas I advocate that the pedagogic focus shift onto the process of learning, that is, the acquisition of procedural linguistic ability, i.e. effective executive control over language reception and production.

1. What kind of language do I expect my learners to be able to understand (in reading and listening) and produce (in speaking and writing) in terms of:

  • Range and depth of vocabulary;
  • Range of grammar structures;
  • Range of tenses;
  • Range of language functions?

2. How complex should teacher input and learner output be at different stages in the development of the unit of work? This is a very useful question as it usually results in the curriculum designer putting more thought and effort into the sequencing of activities based on the developmental needs of the learners. Teachers need to have this issue in their focal awareness so as to agree on what constitutes comprehensible input and achievable output. The most challenging issue is possibly – as I discussed in my last post – deciding on what constitutes a complex grammar structure.

3. What levels of fluency do I expect the learners to have acquired at key points in the development of the unit of work? This issue is hugely important as it refers to the notion of automatization of language production. Sadly, in my experience, this is a dimension of proficiency which is usually neglected by curriculum designers – with disastrous consequences for language learning. Teachers must agree where they would like their students to be in terms of spontaneity of production at key stages in the unfolding of the unit of work.

4. What levels of accuracy in and (cognitive) control over the target language do I expect students to exhibit at key stages in the development of the unit? This is linked to the previous point but develops it a notch or two further in that it refers not simply to the ability to be fluent and comprehensible, but also to the ability to perform accurately under real operating conditions. In other words, it refers to the extent to which grammar structures, the phonology system, etc. are not merely learnt declaratively but also procedurally; i.e. where on the declarative (explicit knowledge of the TL) to procedural (automatic application of grammar rules) continuum do we expect our learners to be?

5. Which of the following areas of competence am I going to focus on, and to what extent?

  1. Phonology (awareness of the TL sound system) and Phonetics (production of the TL sound system)
  2. Orthography
  3. Morphology (word formation)
  4. Syntax (word order)
  5. Language functions (e.g. compare and contrast, asking questions, predicting, agreeing and disagreeing, sequencing, summarizing)
  6. Communication skills (e.g. listenership) – this is a crucial, yet very neglected skill-set, especially in PBL;
  7. Learner strategies / Life-long learning skills – this is another neglected area in curriculum planning;
  8. Learner autonomy (the extent to which we promote and scaffold independent learning beyond the classroom) – also a very neglected area of competence;
  9. Cross-cultural competence

6. Which of the above should be prioritized? – This decision will be dictated by the stakeholders’ demands/needs as well as by the espoused instructional methodology.

7. What would I like learners to do independently? – This does not include homework. It refers to what you would like students to do independently, without any teacher request, to enhance their own learning, and how you would go about triggering this desire to learn outside the classroom.

8. How and how often am I going to recycle the target language items, competences and skills/strategies throughout and beyond the present unit of work? – My regular readers will know how obsessed I am with short-, medium- and long-term recycling. I strongly believe this is the most crucial and most neglected area of teaching and learning.

9. How am I going to resource the teaching and learning of the target language items, competences and skills/strategies?

10. How and how often am I going to assess all of the above? – This is the most challenging question to answer. However, if one has answered the previous questions, it will be relatively straightforward. The issues which are most often neglected in assessment design are:

  1. Does the test actually measure what it purports to measure? (construct validity) – hence, if following the above framework, the areas of competence listed under question 5 must be taken into account here;
  2. Is the assessment going to have a positive wash-back effect on learning? – it should, really;
  3. Is the assessment fair? – Students must be prepared effectively for a test; hence, the test papers should be ready and piloted way before the teaching of a unit begins;

Moreover, several low-stakes assessments (not too many, obviously) should take place before the end-of-unit test(s) in order to be fair to our students and not to base our evaluation of their performance over a term on one snapshot at the end of it.

Conclusion

It is obvious that the answers to the above questions will be to a great extent influenced by the teaching and learning methodology espoused and/or adopted by the department, as well as by the demands and needs of the stakeholders. MFL curriculum designers may find certain areas more or less relevant to their learning context or more or less worthy of an explicit focus. To expect busy teachers to do all of the above thoroughly and ‘perfectly’ would be unrealistic, especially in view of the time constraints.

However, there is one thing that every curriculum designer MUST definitely plan carefully for: how to foster and facilitate the process of automatization of every skill and language item they set out to teach. This includes extensive recycling and practice and will often require going beyond the textbook and the time normally allocated to curriculum delivery (the ‘two-chapters-or-topics-per-term syndrome’, as I call it). As I have often reiterated in my posts, this is the most important, yet most neglected, dimension of language learning.