EN ROUTE TO SPONTANEITY (PART 1): THE CURVE OF SKILL ACQUISITION AND ITS IMPLICATIONS FOR THE LANGUAGE CLASSROOM

The curve of skill-acquisition

You may have heard the expression ‘it’s been a learning curve’. Cognitive psychologists working in the Skill Theory paradigm – which I am currently reading and writing about for my forthcoming book – have observed, for skills as different as rolling cigars from tobacco leaves and writing computer programs, that learning follows a power function, like the curve in Figure 1 below.

Figure 1: the curve representing the power law of learning


The same is true of language learning. After intensive training (massed practice), L2 learners first experience a drastic improvement in task performance, with reaction times and errors decreasing rapidly (the steep initial decline in the curve); this indicates that, after performing a task a few times, the learners have routinized it (what we call ‘proceduralization’).
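For readers who like the maths: a common formulation of the power law of practice (the exact parameterisation varies from study to study, so treat this as a sketch rather than the definitive equation) is

$$ T(N) = a + b\,N^{-c}, \qquad a \ge 0,\; b > 0,\; c > 0 $$

where T(N) is the time taken (or the error rate) after N practice trials, a is the asymptotic ‘floor’ performance, b the improvement available at the outset, and c the learning rate. Large early gains followed by ever smaller later ones fall directly out of this shape.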

For instance, a student may have learned to use the perfect tense of French regular verbs in the context of a structured communicative drill. Whilst at first s/he performed the task slowly, having to retrieve and apply the relevant grammar rule consciously (declarative knowledge), after several repetitions s/he has now mastered it.

When mastery has occurred – when your students have routinized the task at hand (i.e. can perform it fairly effortlessly and speedily) and errors have decreased drastically – the curve starts to flatten (around point 20 on the curve). This is where the typical grammar test, for instance, indicates the student has ‘got’ it and can apply the grammar rule fairly accurately and with some ease in the task they have been practising (although not necessarily in other tasks).

Automatization is a long process

At this point, if practice is continued, research shows that the curve flattens out. This flattening indicates that the process of automatization is very slow: the gains in speed and accuracy are steady but minimal. Why? The brain is slow to automatize things because anything we automatize – including mistakes – is extremely difficult to unlearn later (a phenomenon known as ‘fossilization’ in language learning).

The problem with much language learning in English schools at Key Stage 3 (beginner to intermediate stage) is that, once the students evidence ‘mastery’, teachers often move on to the next topic/grammar structure instead of providing automatization practice – what cognitive psychologists often refer to as ‘overlearning’.

Automatization as fluency

In language learning, automatization practice equals work on fluency: getting the students to use the target vocab and/or structures under communicative pressure and time constraints, what Johnson (1996) calls R.O.C.s (Real Operating Conditions). ROCs are ‘desirable difficulties’ (Bjork and Linn 2006) which place variable demands on learners’ processing ability as they perform the targeted behaviour or task, resembling real-life conditions. Pedagogically speaking, applying ROCs to language teaching results in task grading: manipulating different factors to vary the complexity of tasks. Among these factors, Johnson (1996) highlights degree of form focus, time constraints, affective factors, and cognitive and processing complexity.

Paul Nation calls this all-important dimension of language learning the ‘fluency strand’.

Figure 2: the fluency strand


Most teachers – partly because they feel under pressure to ‘cover the syllabus’ and partly because they are reassured by progress tests that their students ‘know’ the target L2 items – neglect this area of L2 proficiency. Yet automatization practice is key, because (1) it is essential for spontaneity; (2) what is automatized is never forgotten; and (3) if fluency training aims – as it should – at automaticity across a range of linguistic contexts and tasks, it enables students to use what they have learnt more flexibly in real-life interactions.

Basically, and tragically, practice with the target items ends exactly when it is needed the most! And, more importantly, that practice needs to be sustained over a long period of time (through distributed practice).

Automatization is more than speed of retrieval

Eventually, this kind of training does not merely lead students to fast retrieval requiring little or no conscious awareness (what psychologists call ‘ballistic processing’); more interestingly, it produces qualitative changes in the way we retrieve and apply the vocab/structures we need to successfully execute the target task. In other words, automaticity means that the brain finds a more efficient way to produce that vocabulary and those structures in the context of a given task. We know this because neuroimaging studies show that when L2 learners have become fluent in the execution of a task, the brain areas activated during its execution shrink – a sign that the brain is making less effort and needs to recruit fewer neural circuits. Researchers call this qualitative change ‘restructuring’.

Main implications for language pedagogy

The main implications for language learning are obvious: we need to spend more time on automaticity (aka fluency) training. This means:

(1) cutting down the curriculum to allow for lots of recycling, task repetition and fluency practice once the students show they have mastered the content of a unit of work. Textbooks go way too fast!

(2) lots of recycling and task repetition at planned intervals. These will be very close together at the beginning of the curve and gradually more spaced out as the curve flattens (see the sketch after this list);

(3) deliberate work on fluency/automaticity, by increasing the communicative pressure and time constraints in the execution of tasks (e.g. the 4,3,2 technique, Messengers, Market place, Speed dating, my mixed-skill ‘Spot the difference’, etc.)

(4) ensuring that the same set of vocabulary and grammar structures is practised across different contexts, as learning is context- and skill-dependent (e.g. what is learnt in reading and listening does not transfer automatically to writing and speaking; what is learnt practising one task won’t transfer to another, even when the two tasks are very similar);

(5) teaching chunks (e.g. sentence frames and heads), as this reduces cognitive load, thereby speeding up processing and fluency. Teaching multi-word chunks means that there are fewer grammar rules (if any) to apply and automatize, as opposed to teaching single words that the learners must learn to bind together grammatically in real time.
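To make point (2) above more concrete, here is a minimal sketch – in Python, with purely illustrative numbers, not a prescription – of an expanding-interval recycling schedule of the kind the curve suggests: short gaps between retrieval sessions at first, progressively longer gaps as performance stabilises.

```python
# A minimal sketch of an expanding-interval recycling schedule (illustrative only).
# The starting gap, multiplier and number of sessions are made-up values, not recommendations.

def recycling_schedule(first_gap_days: float = 1.0, multiplier: float = 2.0, sessions: int = 6) -> list[int]:
    """Return the days (counted from initial mastery) on which a set of target
    items is recycled: sessions are close together at first, further apart later."""
    schedule = []
    day, gap = 0.0, first_gap_days
    for _ in range(sessions):
        day += gap
        schedule.append(round(day))
        gap *= multiplier  # each gap is longer than the previous one
    return schedule

if __name__ == "__main__":
    # e.g. recycle on days 1, 3, 7, 15, 31 and 63 after mastery is first shown
    print(recycling_schedule())
```

The point of the sketch is the shape of the schedule, not the specific numbers: retrieval sessions cluster near the steep part of the curve and thin out as the curve flattens.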

The curriculum-design matrix in Figure 3 below (aka the “Conti Matrix”) makes provision for all of the above. As explained in previous posts, in my approach automatization occurs in the context of tasks designed to elicit language processing and production under what Keith Johnson calls R.O.C.s (real operating conditions), i.e. in communicative drills and tasks I will discuss in greater detail in my next post.

Figure 3: the Conti Matrix


Practice with the language they KNOW

Of course, before venturing into this type of training, it is crucial that the students have mastered the target items, i.e. they can recall them with some ease and without the help of reference materials.

This is unfortunately one of the most common problems in much of the communicative language learning I have observed in 25+ years of teaching: the students are interacting orally, yes, but all too often they have to resort to word/phrase lists for help. When you stage fluency-development activities, the students shouldn’t need to do this. They should already have had plenty of retrieval practice to wean them off such lists. That’s why, in my Recycling Matrix, the students get to the automatization phase at the very end of a macro-unit (i.e. sub-unit 5).

Make time for fluency training

Some may object that there is not enough time for this type of training. My response is that it all comes down to effective curriculum design and smart use of lesson time. But it also depends on your mission as a language teacher: are you imparting abstract knowledge of grammar structures or are you forging confident and effective L2 speakers? Are you preparing students for exams or for real-life use? Are you teaching to cover the syllabus or for durable learning and spontaneity?

In the early years of language instruction, when exams are not and should not be a concern, you can and must make time for fluency practice. Focus on the target items and tasks your students truly must learn to perform confidently, spontaneously and as accurately as possible, and cut down the superfluous.

Less is more.


Book review: Dannielle Warren’s “100 ideas for secondary teachers – Outstanding MFL lessons”

I rarely write reviews of books, but I have decided to make an exception for Dannielle Warren’s “100 ideas for secondary teachers – Outstanding MFL lessons”, published by Bloomsbury. The reason: it is a concise, clearly written book that every teacher, both novice and experienced – regardless of their geographical location or working context – will find useful. It is, of course, a real treasure trove for pre-service teachers looking for a versatile and varied repertoire of tested instructional techniques and strategies.

I have always appreciated the contributions Dannielle has made to the UK modern foreign languages community over the years on various social media and teacher platforms such as TES. I have also always liked the persona she has displayed in the process: a passionate language educator, willing to share free resources and always humbly acknowledging the work of others, which she dutifully magpies and adapts creatively, often surpassing the original. I remember when, years ago, she politely asked me if she could copy and adapt my Spanish GCSE revision quickies ideas and came up with her own improved version of the original Conti format.

The book is a well-thought-out ensemble of teaching ideas for the classroom, categorised as follows: Speaking; Listening; Reading; Writing; Grammar; Translation; Vocabulary; Marking, Feedback and Improvements; Revision. It must not have been easy for Dannielle to pick and choose only 100 ideas, as I am sure she knows and uses many hundreds more. However, the activities she ended up selecting constitute a formidable language-teacher toolkit which includes some classics and some less well-known but very effective games, tasks and strategies that, having tried many of them myself, I know work, even with the most challenging classes.

These are my favourites in the very comprehensive collection the book offers:

  • Chatty Jenga
  • Speaking ladders
  • Talking frames
  • Dictation drawing
  • What’s next
  • Break it up
  • The detectives
  • Mosaic writing (of course)
  • One pen one dice
  • Verb towers
  • Translation grids
  • Spot the error race
  • Revision pong

The book is reasonably priced and has had fantastic reviews from practising teachers on various social media and on Amazon, which confirm my opinion of it as an extremely useful resource to have on your desk at home or in your department, to dip into during lesson planning when you are short of ideas or looking for inspiration.

What you will not find in the book is a rationale for each activity or where and why they should occur in an instructional sequence. This is possibly my only ‘even better if’ for Dannielle, should she plan a second edition.

In conclusion, Dannielle’s book is a little masterpiece that every language teacher should read. It is good value for money, very accessible, very clearly written and contains ideas adaptable to any teaching context I can think of. What is more, the book has an accompanying website with links to relevant online resources:

https://www.bloomsbury.com/cw/100-ideas-for-secondary-teachers-outstanding-mfl-lessons/online-resources/

WHY I AM STILL DISAPPOINTED WITH THE NCELP RESOURCES BUT WHY IT IS STILL A LAUDABLE INITIATIVE

So I browsed the NCELP website and, whilst there are many good things a language teacher can get out of it – and for free! – especially the research, the visuals and some nice vocabulary-building ideas, I was very disappointed with the schemes of work they published. This is a link to one of them, which is emblematic of my concerns.

So whilst I do believe NCELP to be a very laudable initiative, one that every language teacher should be aware of and take advantage of, I think it has serious shortcomings relating to its claims, principles, resources and the transformative power of the whole initiative. These are the major issues I see with the Schemes of Work and resources found on the website:

  • The pace of the SOWs is too fast for acquisition and fluency to happen, especially in view of the ineffective and insufficient recycling;
  • The sequencing of the content and activities often doesn’t make sense from a language acquisition point of view. The claim is that the NCELP material is evidence-based; however, there is little research evidence guiding the selection and sequencing of the linguistic content: no reference whatsoever to research into the natural order of acquisition of French, German or Spanish morphemes, and no apparent consideration of learnability issues in the acquisition of L2 morphemes, e.g. as per processability theory or the very useful heuristic found in Nation and Newton (2013);
  • The recycling is limited and it is largely recycling of single words, not lexical or syntactic patterns;
  • The sequences found in the PowerPoints (old-school PPP) that are supposed to supplement the SOWs are light on the receptive skills, especially listening as modelling. It is not clear where the students get the aural modelling they need for speaking. The modelling phase is short and not very substantive. It reminds me of the pre-NCELP classic Rachel-Hawkes-style PPTs from TES – so where’s the innovation?
  • The variety of resources is limited, and the activities on the slides are frankly a bit dull;
  • Quite a few of the oral activities, e.g. ‘Yes or No’ pair-work tasks, (1) do not allow for effective retrieval practice, because the students can basically answer ‘yes I can’ or ‘no I can’t’ without evidencing that they understand what the other person is saying; and (2) do not elicit forced output, so the students can basically choose to say what they like and avoid the more challenging items. This reduces the chances of recycling the target items effectively in the productive tasks;
  • Both encoding and retrieval practice are insufficient – quite superficial work which often barely scratches the surface. A lot of supplementation is required of teachers;
  • There is no deliberate, planned effort to achieve spontaneity; no time is built into the units of work for that. In fact, you hardly find any rich communicative tasks, not even at more advanced proficiency levels;
  • It is not at all obvious how they actually apply the skill-theory framework that appears to underpin their pedagogical approach. There is no attempt at automatization – not even a sense that they try to build up to it;
  • The frequency word lists are not very useful, because knowing how frequently a word is used without knowing what it is most frequently used with presupposes that words are used in isolation. One wonders how useful it is for a teacher to know that a word is number 106 in terms of frequency of use without also listing the phrases/patterns it most frequently occurs in. Also, since examination boards are not aligned with such lists, they are not going to be useful for examination purposes either;
  • There is no work on collocations or colligations – this focus on single words is, to me, rooted in the dark ages of language pedagogy, ignoring all the great research done by Michael Lewis and other great minds in SLA research. I have made my argument against single-word teaching many times over on this blog, e.g. here;
  • There is no mention of a comprehensible-input threshold. In fact, they recommend the use of authentic texts, which we know for a fact are not conducive to learning if they contain less than 95% comprehensible input. Nor do they suggest a substantive pre-reading/pre-listening sequence (e.g. the ones Steve and I lay out in “Breaking the Sound Barrier”). The belief that training students in reading and listening strategies with native or native-like texts can effectively enhance receptive skills and learning is based on findings from research studies in which the so-called “Hawthorne effect” (the fact that trainer and subjects know they are part of a research project), the substantial injection of external support and resources, and hand-picked groups of students all end up skewing the outcomes in favour of the intervention (the training); reproducing the same type of training with every single class of yours, on a full timetable, with the limited time and resources available to the average teacher and without specialised training, is much harder – especially with more challenging classes (I speak from experience!). One major piece of advice I would like to give anyone wanting to use the NCELP reading and listening comprehension tasks: try to supplement the listening activities with some of my L.A.M. (Listening-As-Modelling) tasks, which focus students on the levels of processing that most NCELP activities neglect or only superficially touch upon (i.e. the syllable, lexical-retrieval, morphological/syntactic parsing and discourse levels);
  • Vocabulary is not modelled sufficiently through the aural medium. This is bizarre considering the emphasis laid by NCELP on phonics and decoding skills; surely the students’ decoding skills would benefit from getting lots of aural input, no?
  • They mention the importance of recycling vocabulary across a wide range of contexts (based, I presume, on the Transfer-Appropriate Processing principle) – something I have been advocating in every single post of mine. Yet this is not applied at all in the schemes of work or PowerPoints;
  • Although NCELP claims that the end of a typical instructional sequence should integrate all four skills, that is hardly ever the case. The best they can do is Dictogloss, which does not really integrate all four skills unless the whole activity is carried out in the target language – and what are the chances of that happening in the average classroom?
  • There is no serious work on oral fluency anywhere, as already mentioned; it seems spontaneity will magically happen. Task-based learning is the dominant fad in SLA research at the moment, yet NCELP ignores it altogether, whilst Australia and New Zealand are making it a mandatory and crucial aspect of their national curricula. I deliver workshops and keynotes in Australia on a regular basis and I am always impressed by their attempts to integrate rich communicative tasks into their lesson sequences;
  • There is a massive emphasis on phonics but little on phonotactics and on reading aloud; they could do with integrating some of these interactive read-aloud tasks a la Conti into their instructional sequences. There is no point in mastering phonics without extensive oral practice in the phonotactics of the language. Moreover, phonics can be boring unless you make it more playful and interactive;
  • The resources appear rushed, as if the need to populate the website were driving the production of PPTs (an example of a PPT here) and worksheets, rather than the attainment of a quality product;
  • One of the principles they claim underpin their pedagogic practice echoes the point I have been making forever on this blog and in my latest book with Steve Smith, which they state as follows: establish grammatical knowledge in reading and listening before expecting learners to produce the grammar in writing and speaking. Yet this is not done in any PPT of theirs – not a single one of the fifty I reviewed! This is one of the most important problems in the lesson sequences they propose; finally,
  • I have had several complaints from the NCELP portal users about the user-friendliness of the website.

There are other issues too: (1) the assessments often do not reflect valid testing practice; (2) the pedagogic principles that NCELP claims are based on research are actually based on hand-picked research (Emergentism, Comprehensible Input, Task-based learning, Lexical priming, Statistical learning, the effectiveness of implicit instruction, the natural order of acquisition and other research are completely ignored); (3) there is a gap – quite big at times – between what is preached in the principles and the actual implementation in the resources; and (4) the CPD PowerPoints need spicing up, and more practical examples ought to be provided for them to be more impactful. The list could go on.

In conclusion, I do honestly think that NCELP is a great initiative and there are lots of freebies that can benefit teachers, especially those on a low budget. However, the resources are often botched, fairly repetitive and not very inspiring. The most disappointing thing for me is the schemes of work, especially the pace, the poor recycling and the disregard for some key research and language acquisition theory. I don’t see anything massively innovative about the resources; but this wouldn’t be a problem if they looked like they could be effective in modelling, consolidating, recycling and retrieving the instructional input. The truth is: they don’t.

I don’t believe the stuff I see on the NCELP website at the moment has transformative potential, nor that it has the power to inspire and enthuse teachers and pupils in a big way, which is really the objective a transformative initiative such as this one should prioritize. However, it is still early days and the website has a plethora of resources which teachers can tweak and adapt to their contexts. Of course, each PPT will require a lot of supplementation, adaptation and spicing up, especially when it comes to reading and listening.

 

PLEASE NOTE: you can find out more about my approach to modern language instruction, especially in the areas of Listening and Speaking, in the book I co-authored with the legendary Steve Smith. Here’s a link to a very comprehensive review of the book, authored by Rebeca Arndt of UCF (Florida), recently published in the prestigious journal Applied Linguistics.

 

 

 

My approach: Extensive Processing Instruction (E.P.I.) – an important clarification in response to many queries

Introduction

This post was written in response to a query by a Modern Language teacher on the professional platform I co-founded with Dylan Vinales, the 11,000-strong Facebook group Global Innovative Language Teachers. The query echoed many others I have received in the last few months and have not had time to answer due to my touring commitments, so I finally felt compelled to respond.

The query

The query reads as follows:

“Hello! It’s been a year and half now that we have CONTIFIED our lessons and our curriculum in my Department and we are so happy to see the benefits that this is having on pupils’ listening and reading skills as well as their fluency at writing. However, pupils feel that the sentence builders are not having quite the same impact on their ability to speak more spontaneously (the A.R.S. of the EARS part of the MARS EARS?) – especially GCSE groups – and they think that the speaking activities such as “read my mind” or “find someone who” are fake speaking activities, as they are actually reading/ listening. Can I ask you what would you suggest to address this? Can you bombard me with successful speaking activities you do in your classes? Gianfranco Conti / Steve Smith any articles/ blogs/opinions on this matter? Any chapter from your latest book? Thanks for your help”

The response

Sentence builders and parallel texts are merely MODELLING tools for presenting the target L2 chunks and patterns and how they work in highly comprehensible and structured contexts. They include worked examples which reduce cognitive load and enhance language awareness; whilst they contribute to spontaneity – as scaffolding tools – they are by no means sufficient to develop it. There is much more to it!

After the modelling, in my approach, one needs to stage an intensive phase of listening and reading tasks (the RECEPTIVE PROCESSING PHASE) involving lots of comprehensible input, thorough processing and input flooding (lots of repetition – quite repetitive and structured for weaker learners, less structured for stronger ones). The interactive reading-aloud activities in this phase (e.g. Mind reading, Sentence stealer, Sentence chaos, Liar liar) are only meant to practise decoding skills and articulatory fluency, not spontaneity. They are desirable with weaker learners with poor or emerging decoding skills and are solely aimed at developing the students’ mastery of the phonotactics of the language (an important sub-set of decoding skills), which is an important prerequisite of fluency. This phase would last one lesson or even longer, until you are satisfied that receptive mastery has been attained/fine-tuned. The activities I envisage for this phase are described in great detail in our latest book.

After this receptive phase you will do lots of highly structured forced (controlled) output tasks which recycle every single chunk you have just modelled as well as ‘old’ ones (from previous lessons). This INTENSIVE retrieval practice phase, which will involve at least one full lesson, is KEY if you want to attain fluency and spontaneity. In this phase you gradually wean the students off the sentence builder. I find that a lot of teachers who claim to be espousing my approach neglect this bit. This phase includes my oral/interactive translation games (No snakes no ladders, Communicative translation drills, Chain reaction, Oral Ping-Pong, etc.), traditional drills and highly structured communicative tasks which – based on the principles of ‘task naturalness’, ‘task utility’ or ‘task essentialness’ – force the students to use the target chunks. If you don’t stage this phase, you will never wean your students off the sentence builder. You will move on to the next phase only when you have verified that your students have retained the target chunks successfully.

In the third main phase you may (but don’t have to, if you adopt a radical lexicogrammar approach) want to focus in more detail on the grammar underlying the chunks and patterns and provide consolidating practice (still highly controlled, to keep cognitive load down). In some cases (e.g. verb conjugations and agreements in French) you have to, if you want to increase the generative power of the target L2 chunks.

Finally, you will work on fluency and spontaneity through gradually less controlled but usually PLANNED communicative tasks, until you get to complete autonomy (this starts in the unit at hand but will continue later on). At that point, task planning shouldn’t be necessary any longer. This task-supported phase is key and is often neglected by many on the basis that there is not much time available. My counter-arguments are laid out below:

(1) with year 7 to 9 (UK system) students, scrap lengthy, cumbersome, useless and time-consuming end-of-unit assessments; instead, assess the students every five or six lessons through short, easy-to-mark, low-stakes assessments worth about 10 to 20% of the overall final grade – this helps you keep track of your students’ progress – and through frequent retrieval practice (verifying the uptake of the target chunks);

(2) in years 7 to 9 high-stakes tests are not that important and, if done in massive doses, they can be counterproductive;

(3) in Stimmt, Viva, Studio, Mira, Expo, etc. the end-of-unit assessments are UNSCIENTIFIC and UNRELIABLE on a number of counts (e.g. construct validity, internal and external validity, face validity, etc.) – hence, wasting three lessons on them doesn’t yield useful data at all;

(4) automaticity (fluency) training yields learning that is long-lasting;

(5) any assessment data you obtain at the end of a unit is of little use in terms of advancing learning, as you can’t go back and re-teach a unit, can you, if you find out several of your students are not doing well;

(6) ending a term with the thing students hate the most (tests) is the worst thing you can do for students’ motivation – the last few lessons of a term should be used to celebrate learning and show students they CAN DO languages, thereby enhancing their self-efficacy (you can still assess them in the process, if you really want, in less threatening ways, by observing and listening in as they carry out tasks);

(7) you still get the data your senior management wants by having several smaller and easy-to-mark low-stake assessments;

(8) less is more: there is no harm in staying on a unit longer if you have a wide repertoire of interesting and challenging (but still within the learners’ zone of optimal development) tasks that students enjoy, e.g. Things in common, Messengers, the 4,3,2 technique, Speed dating, Market place, Role plays, Post and praise, Chain reaction, Find someone who (without cards), Alibi, etc.

In this phase you will recycle materials from previous units too and will be more tolerant of errors. Task repetition is a must to enhance fluency development (Bygate, 2009, 2015).

In conclusion, sentence builders and my reading-aloud games are no panacea; they are only the first step in a gradual fluency and spontaneity build-up, from modelling and controlled input tasks to unplanned and less structured tasks which focus on autonomous competence. You can truly say you have ‘contified’ your lessons if you stage all the phases outlined above. I have used this approach for years, so I know it can be done, even with low-ability and challenging students, if one is brave enough to (1) cut down content; (2) create comprehensible-input materials and resources; and (3) scrap traditional assessment. It is not easy; that’s why, if you really buy into this, you should start one year at a time. CONSTANT RECYCLING IS KEY!

The whole pedagogical cycle – as outlined above – is time-consuming, because achieving fluency and spontaneity is a time-consuming business. You can’t move from unit to unit every six weeks hoping to achieve durable learning and fluency, and hoping that a couple of self-quizzing tasks a la Michaela School, Quizlet activities or Kahoots a day recycling previous items will suffice. That’s a foolish assumption. This may work in geography and science, but not in languages, which require skill automaticity (fluent L2 readers recognize around 250 vocabulary items per minute!). At KS3 (UK system), i.e. 11 to 13/14-year-olds, in the absence of high-stakes national examination constraints, this can be achieved.

The Conti Recycling Matrix below shows how I envisage the planning of a unit of work with middle-school learners (yr 7 to yr 9, UK system) and intermediate learners. In each sub-unit, the first two lessons do not recycle ‘old’ items, in order to avoid interference; they only focus on the new target items. As you can see, the Fluency/Spontaneity phase occurs at the end of a unit (sub-unit 5 in the picture), whilst the constant recycling across all sub-units (represented by the ticks) keeps the items learnt in every previous sub-unit alive. So, every time you move to the next sub-unit, the items from the previous sub-units are constantly recycled through retrieval practice in which they are interwoven with the items at hand in the receptive and productive activities you stage – something textbooks NEVER do.

Figure: the Conti Recycling Matrix
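For readers who prefer to see the structure spelled out, here is a minimal sketch of such a recycling grid in code – an illustration of the logic described above, assuming a macro-unit of five sub-units, not a reproduction of the actual matrix:

```python
# Illustrative sketch of a recycling grid for a macro-unit of five sub-units.
# In each sub-unit, items from all previous sub-units are recycled alongside the new ones
# (except in the first two lessons of each sub-unit, which focus only on new items);
# the final sub-unit is reserved for fluency/spontaneity work.

SUB_UNITS = 5

def recycling_grid(sub_units: int = SUB_UNITS) -> dict[int, dict]:
    grid = {}
    for current in range(1, sub_units + 1):
        grid[current] = {
            "recycles": list(range(1, current)),  # the 'ticks': every previous sub-unit
            "focus": "fluency/spontaneity" if current == sub_units else "new items + recycling",
        }
    return grid

if __name__ == "__main__":
    for unit, info in recycling_grid().items():
        ticks = ", ".join(str(u) for u in info["recycles"]) or "none"
        print(f"sub-unit {unit}: recycles sub-unit(s) {ticks} ({info['focus']})")
```

Printed out, this gives one line per sub-unit showing which earlier sub-units’ items are woven back in – the ‘ticks’ of the matrix.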