Why Do So Many UK Students Drop Modern Foreign Languages?

In a country facing a rising language skills deficit, the dramatic drop in the number of students taking modern foreign languages at GCSE and A-level has become a matter of serious concern. Despite efforts like the English Baccalaureate (EBacc) to incentivise uptake, MFL remains one of the most frequently dropped subjects in the UK curriculum. But why?

Research spanning more than a decade reveals that the problem is not unidimensional. It is systemic, rooted in curriculum design, assessment practices, social factors, and pedagogical traditions. Below are ten of the most widely evidenced reasons, followed by a research-informed summary table.

1. Perceived Difficulty

Students routinely identify languages as one of the hardest subjects, especially when compared to other GCSE or A-level options. The unfamiliar vocabulary, grammar rules, and listening/speaking components contribute to this perception. Many students feel overwhelmed and opt for subjects in which success appears more attainable. In many of my previous blogs and in my workshops I have tried to show that traditional teaching practices often contribute to this, especially an excessive reliance on explicit grammar instruction and overloaded curricula.

2. Lower Predicted Grades and Grading Severity

MFL subjects are among those with the harshest grade distributions. Students often achieve lower predicted or actual grades in languages than in other subjects. This discourages them from continuing with the subject, especially when university admissions and sixth-form entry hinge on strong academic performance.

3. Lack of Immediate Relevance

Many students fail to see the practical application of learning a foreign language. In a predominantly English-speaking society, languages are viewed as less relevant to daily life or future career plans, particularly when compared to STEM subjects or vocational courses that have clearer job pathways.

4. Limited Curriculum Time and Specialist Support

In many schools, particularly non-selective state schools, MFL has a reduced presence on the timetable. Cuts in lesson time, lack of qualified language teachers, and underfunding result in poor continuity, patchy provision, and limited support — all of which negatively affect motivation and attainment.

5. Peer Influence and Social Stigma

Peer pressure plays a major role in subject choices. MFL is sometimes seen as a “geeky” or “uncool” subject, and boys in particular may feel socially discouraged from taking it up. The perception that “nobody else is doing it” leads many to opt out regardless of personal interest or aptitude.

6. Teaching Approach and Curriculum Design

Traditional MFL instruction in the UK has long been criticised for its overemphasis on grammar, memorisation, and rote learning, rather than meaningful communication. When lessons are dominated by worksheets, verb tables, and repetitive drills, many students disengage — particularly lower-attaining learners.

7. Perceived Elitism and Lack of Inclusion

Languages are often associated with grammar schools, private schools, and higher-income families. Students in comprehensive schools — particularly those from disadvantaged backgrounds — are statistically less likely to continue with MFL, reinforcing the idea that languages are “not for people like me.”

8. Lack of Parental Encouragement and Role Models

In homes where parents don’t speak or value additional languages, learners are less likely to view language learning as important. The absence of visible multilingual role models in the community or media further contributes to low motivation and cultural disconnect.

9. Post-16 Curriculum Narrowing

The narrowing of subject options after age 16 means that students often prioritise core academic subjects or those with perceived higher value. Even when students enjoy MFL, the pressure to choose three A-levels typically leads them to drop languages in favour of subjects that better align with university or career ambitions.

10. Poor Transition from KS2 to KS3

The lack of consistency and continuity between primary and secondary language instruction is a major stumbling block. Students often restart from scratch in Year 7 regardless of prior learning, leading to frustration and the sense that their earlier efforts were pointless.

Summary Table: Why Students Drop Languages in the UK

Reason | Explanation | Key Research / Source
1. Perceived Difficulty | Languages are seen as harder than other subjects. | DfE (2019); Tinsley & Board (2017); Taylor & Marsden (2014)
2. Lower Predicted Grades / Grading Severity | Languages yield lower grades, deterring students. | Ofqual (2016); Routledge & Searle (2019)
3. Lack of Immediate Relevance | Learners struggle to see real-life applications. | Tinsley & Doležal (2018); British Council (2022)
4. Limited Curriculum Time / Support | Reduced lesson time and lack of specialist staff affect learning. | ALL (2021); Ofsted (2016)
5. Peer Influence and Social Stigma | Seen as “uncool,” particularly by boys. | Courtney (2017); British Council (2020)
6. Teaching Approach / Curriculum Design | Rote grammar learning over communicative use. | Graham et al. (2016)
7. Perceived Elitism / Lack of Inclusion | MFL is skewed toward selective schools. | Strand (2015); Hodgen et al. (2018)
8. Lack of Parental Encouragement / Role Models | Fewer multilingual influences at home. | Murphy & Unwin (2019); Tinsley & Doležal (2018)
9. Post-16 Curriculum Narrowing | Fewer subject slots at A-level lead to early dropout. | DfE (2017); Hodgen et al. (2018)
10. Poor Transition from KS2 to KS3 | Gaps between primary and secondary learning disrupt continuity. | Graham et al. (2021); Ofsted (2021)

What Now?

Reversing the trend will require more than just policy mandates. Schools need intelligent curriculum redesign, pedagogical innovation, and greater systemic support — especially in state schools serving disadvantaged communities. Unless these barriers are tackled head-on, the UK risks falling further behind in global language competence, limiting both individual opportunity and national competitiveness.

The decline is not inevitable. But the solution demands more than rhetoric — it demands reform.

What makes certain L2 learners better editors of their writing than others? – Part 1: Cognitive and affective factors identified by research

Why Some MFL Students Are Better at Monitoring Their Writing: A Research-Informed Guide for Teachers

In 2004 I completed my PhD investigation of L2 students’ self-monitoring habits as essay writers. One of the most striking differences I observed during my study was that while some learners instinctively spot and fix their errors as they write, others barely notice a mistake even when it stares them in the face. Why is that? Why do some learners consistently revise, self-correct, and refine, while others either can’t or won’t? In this article, I explore what the research says about the cognitive and affective factors that affect learners’ ability to monitor their own output during writing. In a follow-up post I will explore the implications for teaching self-monitoring skills with specific reference to A2-B1 L2 learners (GCSE level in England).

What Makes Some Learners Better Self-Monitors? Key Research Insights

Working Memory Capacity

Learners with greater working memory can retain and manipulate multiple elements at once — such as subject-verb agreement, spelling accuracy, and word order. This supports real-time monitoring. Studies by Miyake & Friedman (1998) and Robinson (2002) show that working memory correlates strongly with writing fluency and grammatical accuracy in L2 learners.

Metalinguistic Awareness

Metalinguistic awareness refers to a learner’s ability to reflect on and manipulate the structural features of language. Roehr-Brackin (2018) found that learners with high metalinguistic ability are significantly better at identifying and correcting errors. Ellis (2004) also links it to increased success in rule-based tasks.

L2 Proficiency Level

Beginners often struggle with output monitoring simply because their cognitive resources are consumed by word retrieval and syntactic construction. Ortega (2009) and Kormos (2012) argue that fluency allows learners to ‘free up’ mental space for revision.

Motivation and Attitude

Learners who are more motivated to improve often engage more deeply in self-regulation, including self-monitoring. Dörnyei (2001) and Ushioda (2011) highlight the strong link between motivation and strategy use, especially in revision. Learners with a growth mindset are more likely to see error correction as a path to mastery rather than failure.

Strategy Use and Training

Proficient self-monitors employ specific strategies: rereading, pausing to plan, and rephrasing. Research by Graham & Harris (2005) and Oxford (1990) confirms that explicit strategy instruction significantly improves writing outcomes.

Can Self-Monitoring Be Taught? What the Research Says About Monitoring Strategy Training

Over the past two decades, a growing body of research has highlighted the potential of explicit strategy training to improve learners’ ability to monitor their output in second language writing. While early SLA research focused largely on input, more recent studies have turned their attention to metacognition — particularly the kinds of strategies learners can be taught to deploy during written production.

One of the earliest and most cited contributions comes from Graham and Harris (2005), who developed a Self-Regulated Strategy Development (SRSD) model for writing. Their research, primarily with L1 learners but extended to L2 contexts, showed that when students were taught to plan, monitor, evaluate, and revise their writing using checklists and self-talk routines, their writing quality significantly improved.

In L2-specific contexts, Manchón and Roca de Larios (2007) investigated how learners approach self-monitoring during composition and found that students often do not spontaneously engage in rereading or revision unless explicitly taught to do so. Their study called for more structured integration of strategy instruction in writing curricula.

Schoonen et al. (2003) conducted research on L1 and L2 writers and found that cognitive fluency (speed and ease of lexical and syntactic retrieval) is a predictor of successful monitoring behaviour. Their findings imply that strategy instruction needs to be paired with fluency-building activities.

Sasaki (2000) carried out a longitudinal study of Japanese EFL learners and demonstrated that explicit instruction in writing strategies, including self-monitoring and planning, had a long-term positive impact on both writing fluency and grammatical accuracy.

More recently, Fernandez Dobao (2012) examined peer feedback as a scaffold for self-monitoring and found that learners who engaged in structured peer-review tasks were better able to internalise error detection strategies and apply them independently.

Lastly, Conti (2001) proposed a principled framework for teaching self-monitoring in L2 classrooms which included the following features:

(a) Enhancement of learner error-related metacognition

(b) A long-term process of self-monitoring

(c) Modelling of and extensive practice in the use of effective self-correction / editing strategies

(d) Personalisation of error treatment

(e) Focus on the process rather than the product of writing and learning in general

(f) Synergistic use of various forms of error correction (EC)

Like many other explicit strategy training programmes, his training consisted of the following phases (O’Malley & Chamot, 1990; Cohen, 1998; Macaro, 2001):

(1) Pre-test needs assessment: The learners’ needs are assessed, usually through a combination of different instruments (e.g. questionnaires, interviews, think-aloud protocols and other forms of self-reports) in order to strengthen the validity of the data.

(2) Introductory phase: The rationale for the training is given and the target strategies are presented and modelled.

(3) Scaffolding phase: The learners receive extensive practice in the target strategies with the help of “scaffolding”, i.e., activities and materials which remind and encourage the learners to apply the target strategies.

(4) Autonomous phase: The learners are left to their own devices without any intervention on the part of the teacher.

(5) Evaluative phase: The learners’ use of the target strategies and their impact on their performance are verified. Normally the same diagnostic instruments used at pre-test are re-deployed here.

Conclusion: Monitoring Is Learnable

Self-monitoring isn’t a fixed trait. It emerges from a complex interplay of cognitive factors (like working memory), affective factors (like motivation), and educational experiences (like strategy training and task design). Every learner can get better at it, but only if we teach it. Our job as MFL teachers is not to expect self-monitoring as a by-product of instruction, but to teach it as a skill in its own right. With time, modelling, and structured reflection — much like the strategies outlined in Conti (2001) — we can nurture that quiet, internal editor that turns output into real learning.

References

  • Dörnyei, Z. (2001). Teaching and Researching Motivation. Harlow: Pearson Education.
  • Ellis, R. (2004). The Study of Second Language Acquisition. Oxford: Oxford University Press.
  • Fernandez Dobao, A. (2012). Collaborative writing tasks in the L2 classroom: Comparing group, pair, and individual work. Journal of Second Language Writing, 21(1), 40–58.
  • Graham, S., & Harris, K. R. (2005). Improving the writing performance of young struggling writers: The Self-Regulated Strategy Development model. Teaching Exceptional Children, 38(1), 19–27.
  • Grabe, W., & Zhang, C. (2013). Reading and writing together: A critical component of English for academic purposes teaching and learning. TESOL Journal, 4(1), 9–24.
  • Kormos, J. (2012). The role of individual differences in L2 writing. Journal of Second Language Writing, 21(4), 390–403.
  • Manchón, R. M., & Roca de Larios, J. (2007). Writing-to-learn in instructed language learning contexts. Journal of Second Language Writing, 16(4), 225–250.
  • Miyake, A., & Friedman, N. P. (1998). Individual differences in second language proficiency: Working memory as language aptitude. In A. F. Healy & L. E. Bourne Jr. (Eds.), Foreign Language Learning: Psycholinguistic Studies on Training and Retention (pp. 339–364). Mahwah, NJ: Erlbaum.
  • Ortega, L. (2009). Understanding Second Language Acquisition. London: Hodder Education.
  • Oxford, R. L. (1990). Language Learning Strategies: What Every Teacher Should Know. New York: Newbury House.
  • Sasaki, M. (2000). Toward an empirical model of EFL writing processes: An exploratory study. Journal of Second Language Writing, 9(3), 259–291.
  • Schoonen, R., van Gelderen, A., de Glopper, K., Hulstijn, J., Simis, A., Snellings, P., & Stevenson, M. (2003). First language and second language writing: The role of linguistic knowledge, speed of processing, and metacognitive knowledge. Language Learning, 53(1), 165–202.

10 Research-Backed Principles for Effective Grammar Assessment in MFL Classrooms

Introduction


When it comes to assessing grammar in a Modern Foreign Language (MFL) classroom, especially with beginner to intermediate learners, assessment should be much more than ticking boxes. Grammar learning is messy and gradual, and it happens best through real use, not through artificial drills. Good grammar assessments need to reflect how languages are actually acquired. After all, in real communication we use grammar to tell stories, explain ideas and argue our points, not just to get full marks on a worksheet. In this article, drawing on some of the most trusted research in second language acquisition (SLA) and cognitive psychology, we’ll walk through ten key principles that can help make grammar assessment in MFL classrooms more meaningful, motivating, and effective.

1. Focus on Meaningful Use


Grammar should never be an end in itself. It’s a tool to communicate. So, assessments should check whether students can use grammar meaningfully. For example, instead of giving students random sentences to correct, why not ask them to write a blog post about their weekend, naturally applying the passé composé (Samedi, je suis allé au cinéma)? Assessments should make grammar “task-essential,” meaning students can’t succeed without using it. In other words, the task must naturally require the target structure for successful completion. For instance, a beginner Spanish activity might ask students to describe their family members using simple adjectives (Mi madre es simpática; Mi padre es alto), thereby assessing adjective-noun agreement authentically. When tasks demand real application of grammar for communicative purposes, learners are more likely to process the form deeply and retain it.

2. Prioritise Core Structures and 3. Put Mastery Before Breadth


Let’s be realistic: some grammar structures are far more useful than others. Assessment should zoom in on high-frequency, high-utility structures: the ones students will actually need in everyday life. Think present tense -er verbs, basic negatives (ne… pas), and simple prepositions. Nation (2001) argues that focusing on what students use most pays off much more than chasing rarer forms. Focus on a handful of non-negotiables every year, keep them in view, and keep assessing them until your students deploy them fluently. Bear in mind that the acquisition of a grammar structure may take months or even years; hence, assessing fewer foundational structures several times a year is likely to pay higher dividends than assessing a multitude of structures (à la NCELP/LDP). The former approach ensures that mastery of key grammar structures is attained and retained in the long term, so that you don’t have to keep re-teaching them; the latter yields short-term gains that are likely to fade away within a few weeks.

4. Assess Receptive and Productive Knowledge


Grammar competence usually shows up first in receptive skills like listening and reading before becoming visible in speaking and writing. That’s why it’s so important to assess both. For example, you could combine a listening task where students spot tense errors with a writing task where they narrate a past event. Larsen-Freeman (2015) and Hulstijn (2001) highlight that this sequence — recognising first, producing later — is how grammar naturally develops. Learners often notice forms months before they can reliably produce them. By embedding both recognition and production in assessments, we mirror this developmental path and avoid penalising students unfairly for being in the early phases of acquisition. Sadly, in many MFL classrooms this principle is ignored, and grammar assessments tend to be exclusively productive in nature.

5. Reward Emerging Accuracy and Scaffold Risk-Taking


Language learning is a messy, step-by-step process. Full accuracy doesn’t happen overnight! That’s why assessments should reward partial success. If a student writes je parl avec mon ami, we should acknowledge the correct stem, even if the ending is missing. Giving credit where it’s due builds confidence and encourages students to take risks with new structures, which is crucial for growth (Pienemann, 1998; Ellis, 2008; Dörnyei, 2005). Risk-taking is essential because learners often avoid using newly learned forms unless they feel safe experimenting. A reward system that notices approximate success gives learners emotional permission to stretch themselves beyond their current comfort zone.

6. Vary Modes of Assessment


Real language use involves all four skills: listening, speaking, reading, and writing. So grammar assessment should do the same, while keeping the same structure in focus. For example, a suite of listening, speaking, reading and writing tasks could all target the passé composé.


This varied approach also keeps students engaged and gives them different ways to show what they know (Dörnyei, 2009; Purpura, 2004). Exposure across skills also strengthens grammatical automaticity and reduces the reliance on slow, conscious monitoring.

7. Provide Immediate, Formative Feedback Where Possible (e.g. When Preparing for High-Stakes Examinations)


Students benefit massively from getting feedback while they’re still working things out. During mock speaking exams, for instance, teachers could gently recast errors: il allé au parc? → il est allé au parc? This kind of immediate feedback supports restructuring and stops mistakes from fossilising (Hattie & Timperley, 2007; Lyster & Ranta, 1997). It’s important that feedback be timely, encouraging, and specific, drawing learners’ attention not only to what went wrong but to how the correct form feels and sounds in authentic usage.

8. Track Progress Over Time


Grammar learning doesn’t follow a straight line. It zigzags, loops, and sometimes even goes backwards before moving forward again. One-off tests can’t tell the full story. Instead, keep “Grammar Growth Trackers” where students log milestones—like the first time they correctly used a tricky structure. This aligns better with the staged, dynamic way grammar develops (Pienemann, 1998; Ellis, 2008). Using trackers also empowers learners to take ownership of their grammatical progress, making growth visible and therefore more motivating.

This links up with what we said above about prioritising a set of non-negotiable structures every year to assess several times at spaced intervals in order to ensure that the students become fluent with them.

9. Ensure Validity of Scoring in Mixed-Skill Tasks


In translation tasks, it’s common for students to make vocabulary and grammar mistakes in the same sentence. To be fair, assessments should separate the two: mark grammar independently of vocabulary. If a student mistranslates a noun but uses the correct verb tense, they should still earn grammar marks. Clear rubrics are key here (Alderson, 2005; Purpura, 2004). Without such clarity, learners risk being penalised twice for a single idea error, leading to skewed assessment outcomes that demoralise rather than diagnose accurately.

10. Avoid Over-Reliance on Guessable Formats


Tasks like grammaticality judgment activities (“Is this sentence correct or incorrect?”) can seem quick and easy, but they come with a big downside: students have a 50/50 chance of guessing the right answer! That’s not a true test of knowledge. If you do use grammaticality judgment tasks as assessment tools, you can enhance their validity by asking the students to justify their answers and correct the mistakes in the target sentences where applicable.

To really assess what students know, it’s always good practice to mix comprehension tasks with production tasks that require active use (Alderson, 2005; Hulstijn, 2001). Production tasks (like rewriting or filling blanks contextually) force deeper retrieval and construction processes, giving a much clearer window into learners’ true grammatical competence.

Conclusion


Moving towards smarter, research-informed grammar assessment means making a few key shifts: focusing on meaning over isolated accuracy, prioritising frequent structures, rewarding effort and risk-taking, and tracking growth over time. It also means creating tasks that reflect real communication—and avoiding shortcuts that let students succeed through guessing. With these principles in mind, grammar assessment can become a powerful tool not just for measuring learning, but for building it.


Summary Table: Key Principles for Grammar Assessment

References

Alderson, J. C. (2005). Assessing Reading. Cambridge University Press.

Doughty, C., & Williams, J. (1998). Focus on Form in Classroom Second Language Acquisition. Cambridge University Press.

Dörnyei, Z. (2005). The Psychology of the Language Learner: Individual Differences in Second Language Acquisition. Routledge.

Dörnyei, Z. (2009). The L2 Motivational Self System. In Z. Dörnyei & E. Ushioda (Eds.), Motivation, Language Identity and the L2 Self (pp. 9–42). Multilingual Matters.

Ellis, R. (2005). Measuring Implicit and Explicit Knowledge of a Second Language: A Psychometric Study. Studies in Second Language Acquisition, 27(2), 141–172.

Ellis, R. (2008). The Study of Second Language Acquisition (2nd ed.). Oxford University Press.

Hattie, J., & Timperley, H. (2007). The Power of Feedback. Review of Educational Research, 77(1), 81–112.

Hulstijn, J. H. (2001). Intentional and Incidental Second Language Vocabulary Learning: A Reappraisal of Elaboration, Rehearsal and Automaticity. In P. Robinson (Ed.), Cognition and Second Language Instruction (pp. 258–286). Cambridge University Press.

Larsen-Freeman, D. (2015). Research into Practice: Grammar Learning and Teaching. Language Teaching, 48(2), 263–280.

Lyster, R., & Ranta, L. (1997). Corrective Feedback and Learner Uptake: Negotiation of Form in Communicative Classrooms. Studies in Second Language Acquisition, 19(1), 37–66.

Marsden, E., & Kasprowicz, R. (2017). Foreign Language Practice Activities for Beginner Learners: A Comparison of Three Types of Activity. The Language Learning Journal, 45(1), 20–40.

Nation, I. S. P. (2001). Learning Vocabulary in Another Language. Cambridge University Press.

Smith, S., & Conti, G. (2020). Memory: What Every Language Teacher Should Know. Independently published.

What Really Matters in Vocabulary Acquisition? A Ranked Analysis of Key Influencing Factors

Introduction

As I often reiterate in my posts, vocabulary acquisition is the lifeblood of successful second language learning. It underpins listening comprehension, reading fluency, expressive speaking, and even grammatical development. Yet, while it is clear that acquiring a large and usable lexicon is essential, it is less obvious which factors most powerfully drive vocabulary growth.

Over decades, research in second language acquisition (SLA), cognitive psychology, and educational linguistics has revealed that some variables matter far more than others. This article ranks fourteen such variables by their likely impact on L2 vocabulary learning, based on accumulated empirical evidence. Along the way, it seeks to offer teachers, curriculum designers, and language learners themselves a roadmap to more efficient, strategic word acquisition.

1. Word Frequency

As discussed extensively in a previous post, word frequency is consistently identified as the most powerful predictor of successful vocabulary acquisition. Nation (2001) and Webb (2007) demonstrated that words occurring frequently in input are learned faster and remembered longer than low-frequency words, even when difficulty is controlled. Learners generally need at least 8–12 meaningful encounters to establish a stable mental representation of a word, and around 20–30 spaced repetitions to embed it into long-term memory.

However, not all exposures are equally useful: the diversity of contexts matters. Seeing se débrouiller (to cope, manage) in various settings like Il s’est bien débrouillé au travail, Je dois me débrouiller seul, and Comment te débrouilles-tu ? helps learners build rich semantic networks. Similarly, a word like chômage (unemployment) is better consolidated when used across structures like le taux de chômage, être en situation de chômage, and lutter contre le chômage.

Empirical studies show that with fewer than six spaced encounters, learners tend to remember fewer than 30% of new words after a week (Webb, 2007), whereas with ten or more spaced encounters, recall rises above 80%. Frequency in input therefore remains a non-negotiable driver of vocabulary learning.

2. Quality of Vocabulary Instruction

Exposure alone is insufficient. The quality of vocabulary instruction dramatically modulates how efficiently and deeply words are acquired. Schmitt (2010) identified that explicit phonological modelling, semantic explanation, morphological decomposition, and repeated retrieval all significantly boost vocabulary retention compared to simple exposure.

Barcroft (2004) found that learners exposed to multimodal presentation — image, gesture, spelling, pronunciation, and sentence use — recalled 34% more words after two weeks than learners taught using isolated translation.

Meanwhile, Karpicke and Roediger (2008) demonstrated that retrieval practice — the act of recalling a word actively — produced 150% better long-term retention than re-reading or re-exposing learners to the word passively.

In practical terms, presenting épargner (to save money) by showing a photo of someone saving coins, miming the action, writing the word, pronouncing it carefully, and embedding it in J’épargne chaque mois pour mes vacances allows multiple memory routes to be built simultaneously, strengthening the trace exponentially.

Table 1 – Top 5 Effective Practices for Vocabulary Instruction

Practice | Description | Key Research Support
Pre-teaching Vocabulary | Introducing and explaining key vocabulary before exposing learners to complex input tasks (e.g., texts, videos) | Nation (2001); Graves (2006)
Multimodal Presentation | Engaging visual, oral, physical (gestures) and textual channels to teach new words | Paivio (1991); Mayer (2001)
Retrieval Practice | Encouraging effortful recall through quizzes, low-stakes tests, gapfills, and oral production | Kang (2016); Bahrick (1979)
Rich and Varied Contexts | Recycling words in multiple genres, syntactic frames, and communicative tasks | Webb (2007); Schmitt (2010)
Spaced Repetition | Revisiting vocabulary over expanding intervals (e.g., 1 day, 3 days, 7 days, 14 days) | Cepeda et al. (2006); Nation (2001)
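To make the expanding-intervals idea concrete, here is a minimal sketch of a review-date calculator. It assumes the common interpretation that each gap is counted from the previous review; the function name and the default gaps (taken from the example in the table) are purely illustrative, not a prescribed schedule.

```python
from datetime import date, timedelta

def review_schedule(start, gaps=(1, 3, 7, 14)):
    """Return the dates on which a word first met on `start`
    should be revisited, with each gap (in days) counted from
    the previous review, so the intervals expand over time."""
    day = start
    schedule = []
    for gap in gaps:
        day += timedelta(days=gap)
        schedule.append(day)
    return schedule

# A word introduced on 1 September 2025 would be revisited on
# 2, 5, 12 and 26 September.
print(review_schedule(date(2025, 9, 1)))
```

A teacher planning a half-term of recycling could run this once per batch of new words to decide when each batch returns in starters or retrieval quizzes.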

3. Time Spent on Word Learning in the Classroom

A major body of research links time on vocabulary learning with larger vocabulary gains — but only when the time is cognitively engaged (Nagy & Townsend, 2012).

Spending as little as 10 additional minutes per session on structured, meaningful vocabulary activities can lead to 20–40% greater acquisition rates across a semester (Graves et al., 2011).

However, Pashler et al. (2007) and others stress that distributed retrieval is essential: learners must recall and use words actively at spaced intervals, not merely re-read them.

Whiteboard quizzes, retrieval flashcards, oral sentence building races, and quickfire memory games are among the classroom strategies that most efficiently drive vocabulary retention. Research indicates that when vocabulary practice involves active, varied retrieval and manipulation, retention rates can triple compared to passive exposure (Webb, 2007).

4. Word-Related Factors

Not all words are equally difficult to learn. Properties like concreteness, part of speech, morphological transparency, and polysemy make substantial differences.

Research grounded in Paivio’s (1991) dual-coding theory shows that concrete words like bousculer (to jostle) or gronder (to scold) — easily visualised actions — are remembered more easily than abstract verbs like ressentir (to feel) or soutenir (to sustain/support emotionally).

Similarly, morphologically transparent words (prévoir — to foresee) are acquired faster than opaque forms like se méfier (to distrust), where no immediate morphology clues are available.

Polysemous words — like livre (book/pound) — add complexity: learners must develop multiple semantic traces to master full meaning usage, which research shows may require 20% more exposures on average (Crossley et al., 2009).

Table 2 – Word-related factors affecting learnability

5. Time Spent by Learners Outside the Classroom

Outside engagement is critical. Nation (2001) and Webb (2007) found that learners who studied independently for even 10–15 minutes per day using structured activities gained 25–30% more vocabulary per term than their peers relying only on lessons.

Webb (2008) found that reading one graded reader (around 8,000–10,000 running words) resulted in incidental acquisition of 40–50 new words.

Listening to podcasts like InnerFrench with transcripts, maintaining a digital spaced repetition deck (e.g., The Language Gym or Anki), or writing short diary entries using newly learned words powerfully extend classroom learning into deep consolidation.

6. Learners’ Mastery of Word-Learning Strategies

Research consistently shows that learners who use conscious strategies — like creating mental imagery, using semantic grouping, deducing from context, and self-quizzing — outperform peers by margins as large as 50% (Gu & Johnson, 1996; Schmitt, 2000).

Explicit instruction and modelling of strategies (e.g., visualising lutter by imagining a boxing match, or mapping synonyms and antonyms) can dramatically enhance vocabulary depth and flexibility.

Learners must not only be taught words, but also taught how to learn words effectively.

Table 3 – Most researched word learning strategies

7. Learners’ Current Level of Proficiency

Higher proficiency enables deeper and faster vocabulary acquisition. Laufer (1998) found that intermediate learners could infer meaning from context approximately twice as accurately as beginners.

Advanced learners can make use of suffixes (-ment, -tion), prefixes (re-, dé-) and infer unknown words by analogy with known ones, making them far more efficient in incidental vocabulary learning.

Curriculum scaffolding should therefore carefully match input difficulty to learners’ inferencing abilities.

8. Learners’ L1 Vocabulary and Literacy

Strong L1 literacy correlates positively with L2 vocabulary acquisition.
Bilinguals or literate L1 learners more readily notice morphological patterns (e.g., entraide — mutual help, métissage — cultural mixing) and leverage semantic clues to decode new words.

Cummins’ (2000) Common Underlying Proficiency model suggests that cross-linguistic transfer of cognitive skills explains much of the variance in vocabulary learning success rates.

9. Learners’ Working Memory

Working memory capacity — particularly phonological memory — predicts the ability to rehearse and consolidate novel word forms.

Gathercole and Baddeley (1990) showed that phonological memory accounted for 40–50% of differences in new word learning rates among children and adults.
Training memory using rhythmical drills, chants, and repetition bursts significantly boosts early-stage vocabulary acquisition.

10. Learners’ Attitudes to Word Learning

Dörnyei (2005) emphasised that motivation was one of the strongest mediators of vocabulary acquisition: highly motivated learners devoted up to twice as much time to vocabulary practice outside class and retained words 25–30% better.

Simple motivational practices — setting achievable goals, celebrating milestones (e.g., Word Wizard awards), linking vocabulary to learners’ interests (e.g., sports, music, travel) — foster higher learner engagement and persistence.

11. Learners’ L1(s)

The influence of a learner’s first language (L1) on vocabulary acquisition depends largely on the degree of typological similarity between the L1 and the target language. Ringbom (2007) showed that when learners’ L1 and L2 share cognates or similar morphosyntactic structures, vocabulary transfer is facilitated, sometimes accounting for up to 25% faster acquisition during early learning stages.

However, when no such cognate scaffolding exists, as in the case of learners moving from a non-Indo-European L1 (e.g., Mandarin, Korean) to French, explicit vocabulary learning becomes significantly harder.
Thus, a Spanish learner may infer déjeuner (to have lunch) through analogy with desayuno, whereas a Japanese learner has no such linguistic support and must build word knowledge from scratch.

Nevertheless, studies (Lindqvist, 2009) demonstrate that cross-linguistic training — teaching students to consciously notice similarities and differences — can substantially reduce this gap, boosting the speed of acquisition even for distant-language learners.

12. Curriculum and Assessment Focus

Curriculum design and assessment philosophy exert profound effects on vocabulary learning trajectories. When curricula focus heavily on explicit vocabulary recycling, strategy training, and meaningful usage, learners’ gains in both receptive and productive vocabulary are markedly higher (Carlo et al., 2004).

Conversely, traditional curricula that prioritize memorisation of isolated word lists, followed by discrete-point testing (e.g., fill-the-blank items requiring only recall of definitions), tend to produce shallow and transient vocabulary knowledge (Schmitt, 2010).

Evidence from Stæhr (2009) shows that when assessments reward active, flexible usage of vocabulary (e.g., writing tasks, oral summaries), learners acquire and retain 20–30% more words compared to cohorts evaluated solely through isolated vocabulary tests.

Curriculum designers must therefore weave rich, usage-oriented vocabulary tasks into both teaching sequences and assessments to optimise acquisition.

13. Learners’ Socio-Economic Factors

Socio-economic status (SES) undeniably impacts early vocabulary development. Hart and Risley (1995) found that by age three, children from high-SES households had been exposed to around 30 million more words than children from lower-SES backgrounds, a gap that persists and compounds into adolescence.

However, in the context of second language learning, SES effects can be mitigated substantially.
Structured interventions focusing on explicit vocabulary teaching, extensive listening and reading programs, strategy instruction, and repeated retrieval practice have been shown to narrow vocabulary gaps within as little as two years (Stahl & Fairbanks, 1986).

Thus, while learners’ socio-economic backgrounds shape their starting points, principled teaching can dramatically alter their vocabulary growth trajectories.

14. Learners’ Age

Age affects vocabulary acquisition differently depending on the stage of learning and the nature of exposure.

Adults often demonstrate faster initial acquisition rates due to more developed working memory, greater metalinguistic awareness, and stronger inferencing skills (Singleton, 2001). For instance, older learners can explicitly deduce that entraîner and entraîneur share the root entraîne- and thus infer meanings more easily.

Greater working memory capacity does give older learners (see Table 4 below) a significant advantage in formal, acquisition-poor environments. However, children who are immersed in a rich L2 environment from an early age often surpass adults in achieving native-like depth, automaticity, and nuance over time (Muñoz, 2006). They benefit from neural plasticity and may build implicit, proceduralised word knowledge more effectively if exposed early and intensively.

Thus, while age offers certain cognitive advantages or disadvantages depending on the stage and setting, it is the quality and quantity of input, engagement, and recycling that ultimately determine long-term vocabulary outcomes.

Table 4 – Ease of vocabulary learning as a function of age-related working memory capacity

| Age | Typical Working Memory Capacity | Ease of Vocabulary Learning | Research Evidence |
|---|---|---|---|
| 8 years old | Limited (2–3 new lexical items reliably at once); phonological loop still developing | Struggles to retain multiple unfamiliar words without very heavy scaffolding and multimodal reinforcement; benefits from visual aids, repetition, and chunking | Gathercole & Baddeley (1990); Alloway et al. (2006) |
| 11 years old | Improved (3–5 new lexical items at once); faster rehearsal speed; better chunking ability | Can retain moderate sets of new words (~4–6 words per exposure) if supported by strong context and phonological support; still vulnerable to overload | Gathercole (1995); Service (1992) |
| 16 years old | Adult-like capacity (5–7 new lexical items at once); strategic rehearsal and retrieval emerge | Can learn larger vocabulary batches (~6–10 words) with limited scaffolding; benefits more from self-testing and morphological awareness strategies | Baddeley (2003); Ellis (2002); Nation (2001) |

Conclusion

Vocabulary acquisition is a multidimensional process, shaped by a combination of word-related, learner-related, instructional, and environmental variables.
However, the factors do not carry equal weight.

Frequency of exposure, high-quality instruction, time-on-task, and deep, meaningful engagement with vocabulary emerge as the most powerful and controllable levers for promoting robust lexical growth.
While learner-specific factors like L1 background, working memory, age, and socio-economic context influence the rate and route of vocabulary learning, they are consistently secondary to the curriculum, pedagogy, and cognitive practices implemented.

For teachers, this ranking provides a clear roadmap: focus energy and time on ensuring rich input, principled recycling, frequent retrieval, strategy training, and motivation-enhancing practices.
By doing so, we dramatically increase our learners’ chances of building a powerful, flexible, and enduring lexicon — the foundation upon which true language mastery rests.

Summary table

The Impact of Listening-Strategy Instruction on Lower-Intermediate Students in Graham and Macaro (2008)

Introduction

As I gear up for my upcoming speaking tour—six cities across Australia over the next two weeks—I’ve been reflecting on the foundational studies that have shaped the way we teach and think about instructed second language acquisition. In doing so, I’ve returned once more, this time with a particular focus on aural strategy instruction, to a study that has been quietly transformative in how we approach listening: Graham and Macaro (2008).

Why this study? Because listening comprehension, though undeniably central to language learning, is often the least explicitly taught of the four core skills. For beginner and lower-intermediate learners, the challenges are considerable: decoding fast speech, parsing unfamiliar vocabulary, coping with reduced forms, and making meaning with limited phonological awareness. These are compounded by low self-confidence and scarce exposure to naturalistic input.

In their 2008 paper, Graham and Macaro proposed a practical and well-structured model of listening strategy instruction tailored to the secondary school classroom. Their research showed that explicitly teaching students how to listen—rather than simply testing whether they had understood—led not only to improved comprehension outcomes but also to increased confidence and greater strategic awareness. The study’s findings have since been echoed in subsequent replications by other researchers, underscoring its potential.

But as any teacher will ask: is this model realistic in the daily grind of an overloaded classroom? Can its benefits be maintained when scaled down or adapted? And what does it actually look like in practice?

In this article, I explore these questions by offering a thorough summary of the study, unpacking its methodology and findings, and evaluating the feasibility of adapting its core principles into mainstream language teaching contexts.

Summary of Graham & Macaro (2008)

Graham and Macaro’s study investigated the effects of explicit strategy instruction on the listening skills of Year 9 French learners in the UK. Based on Vandergrift’s metacognitive framework, the researchers developed a 12-week programme in which learners were taught to plan, monitor, problem-solve, and evaluate their listening. Students were guided through a strategy cycle comprising pre-listening prediction, first listening for gist, post-listening strategy reflection, second listening with focused goals, and an evaluative phase.

Pre-Test Findings

Before the intervention, both the experimental and control groups completed a diagnostic listening comprehension assessment. The findings revealed that learners struggled particularly with understanding spoken French when confronted with longer texts, fast speech, or tasks requiring inference. The majority of students approached listening as a test of word recognition and expressed frustration when they could not understand every word. Metacognitive awareness, such as the ability to reflect on strategies or evaluate performance, was low across both groups, and few students reported using prediction, selective attention, or inference strategies. These pre-test results provided a strong rationale for introducing explicit strategy training.

Methodology

The study employed a quasi-experimental design with two matched groups: one receiving strategy-based instruction (experimental group) and one following traditional listening pedagogy (control group). Both groups consisted of Year 9 learners with comparable attainment levels.

The experimental group received 12 weeks of strategy instruction embedded in their regular listening lessons. Each session followed a structured cycle:

  1. Pre-listening: Students activated prior knowledge and predicted content based on titles and visuals.
  2. First listening: Students listened for gist without writing answers.
  3. Reflection: Learners discussed what they understood and the strategies they used.
  4. Second listening: Focused listening tasks aimed at identifying detail or specific language features.
  5. Post-listening: Learners evaluated their comprehension and reflected on the effectiveness of the strategies used.

The control group received the same listening texts and tasks but without explicit strategy teaching or metacognitive reflection. Assessments included pre- and post-tests of listening comprehension, a metacognitive awareness questionnaire, and follow-up interviews.

Materials Used

The materials for the intervention were carefully designed to be age-appropriate, linguistically accessible, and pedagogically structured. Audio recordings consisted of scripted and semi-authentic dialogues between native French speakers, selected to include a range of topics relevant to the learners’ syllabus, such as school life, leisure activities, and personal descriptions. Each audio track was accompanied by visual prompts (e.g., pictures, headlines, or comprehension grids) to aid prediction and focus attention.

Listening tasks were designed to assess both gist and detail comprehension and included true/false statements, multiple-choice items, and sequencing tasks. Additionally, learners were given printed worksheets that prompted pre-listening predictions and post-listening evaluations. These materials also embedded strategy prompts such as “What do you already know about this topic?” and “What helped you understand the second time?”—thereby reinforcing strategic habits.

Figure 1 – Listening reflection sheet (listening strategy checklist)

Target Strategies Modelled

Throughout the intervention, teachers explicitly modelled and scaffolded the following strategies:

  • Prediction: Using context and visual prompts to anticipate content before listening.
  • Selective Attention: Focusing on key words and ignoring unimportant details.
  • Inference: Making educated guesses about unfamiliar language using known vocabulary and tone.
  • Monitoring: Checking for understanding and recognising when something doesn’t make sense.
  • Problem-solving: Using a combination of strategies to recover lost meaning or continue listening despite gaps.
  • Evaluation: Reflecting on how well a strategy worked and adjusting approach accordingly.

Each strategy was not only introduced and explained but also consistently revisited across lessons. Teachers modelled strategies using think-aloud techniques and then encouraged learners to articulate their own strategic thinking during reflection phases.

Results

The results showed measurable differences between the two groups. Students in the experimental group showed significantly greater improvement on post-listening comprehension tests than those in the control group. These students demonstrated gains in both gist and detail comprehension, as recorded through test scores and self-reports. They were also more likely to report applying strategies such as prediction, inferencing, and monitoring during listening tasks.

The metacognitive awareness questionnaire revealed measurable gains in the experimental group in areas such as planning before listening, evaluating after listening, and remaining focused despite difficulties. In contrast, the control group showed minimal change. Qualitative data from student interviews reinforced these findings: learners in the strategy group reported feeling better equipped to cope with difficult passages and were more willing to persevere when faced with difficult listening passages. Some students expressed a sense of increased satisfaction and motivation, though such statements should be interpreted cautiously given the self-reported nature of the data.

Feasibility in Real-Classroom Settings

Despite its strengths, replicating this model on a daily basis in the typical classroom is not straightforward. Unlike the tightly controlled and researcher-supported context of the original study, real classrooms often struggle with behavioural disruptions, student disengagement, time constraints, and teacher workload.

Teachers in mainstream settings typically teach multiple year groups, plan lessons across key stages, and are accountable for high-stakes exam results. Embedding multi-phase listening cycles and structured metacognitive reflection into every lesson would require a level of consistency and time that most timetables do not allow. Furthermore, students may lack the metacognitive maturity to engage meaningfully with strategy talk, particularly in schools where language learning is not strongly valued.

It is also important to consider researcher bias. Graham and Macaro were highly motivated to make their intervention work: the time and effort invested in design, delivery, and analysis needed to produce observable results to justify publication. This level of intensity and motivation is rarely transferable to busy school settings. While the intervention was undoubtedly effective, the conditions of its delivery were unusually favourable.

Effort vs Outcome: A Cost-Benefit Evaluation

When evaluating the cost-benefit balance of this intervention, the picture is mixed. On the one hand, the learning gains were statistically significant and pedagogically meaningful. On the other, the intervention was labour-intensive, and its full implementation is unlikely to be sustainable at scale. A more pragmatic approach is to identify core components of the strategy cycle that can be maintained regularly, while spacing out or simplifying the more demanding elements.

Scalable Listening Strategy Implementation Matrix

| Component | Effort Required | Impact on Learning | Feasibility in Busy Classroom | Recommended Frequency |
|---|---|---|---|---|
| Pre-listening prediction routine | Low | Moderate | High | Every lesson (5 mins) |
| Multiple listening cycles (gist/detail) | Medium | High | Medium | 1–2x per week |
| Post-listening metacognitive discussion | High | High | Low | Occasional (after assessments) |
| Strategy checklist use (see Figure 1 above) | Medium | Moderate | High | Weekly self-check |
| Teacher modelling of strategy use | Medium | High | Medium | Weekly modelling |
| Student reflective logs | High | Moderate | Low | End of unit |

Conclusion

Graham and Macaro’s 2008 study remains a cornerstone of strategy-based listening instruction, highlighting the value of teaching learners how to think about and manage their listening processes. Their 12-week intervention provided learners with tools that improved both performance and confidence, and encouraged greater listening autonomy. However, its full implementation is pedagogically ambitious and structurally difficult in many secondary schools. A strategic compromise is both necessary and desirable: teachers can adopt a reduced set of high-impact, low-effort routines that nurture listening autonomy without overwhelming their schedule. With careful planning and judicious adaptation, the core principles of metacognitive listening training can still yield meaningful gains—even under typical classroom constraints. The key is not to replicate the study in its entirety, but to extract and embed what works, sustainably.

A very important lesson to be learnt from this study, which confirms what I have observed from my own implementation of a metacognitive strategy-training programme as part of my PhD, is that teaching strategies is a challenging, time-consuming and effortful process which must be sustained over time in order for it to pay substantive dividends. Hence, if we want listening strategies to be taught effectively, we need to design a bespoke programme based on the needs of our students, delivered systematically over a period of three months or even longer (my hunch is more like six months!).

References

Graham, S., & Macaro, E. (2008). Strategy instruction in listening for lower-intermediate learners of French. Language Learning, 58(4), 747–783.

Vandergrift, L. (1997). The strategies of second language (French) listeners: A descriptive study. Foreign Language Annals, 30(3), 387–409.

Vandergrift, L., Goh, C. C. M., Mareschal, C. J., & Tafaghodtari, M. H. (2006). The metacognitive awareness listening questionnaire: Development and validation. Language Learning, 56(3), 431–462.

Goh, C. C. M. (2000). A cognitive perspective on language learners’ listening comprehension problems. System, 28(1), 55–75.

Cross, J. (2011). Metacognitive instruction and second language listening: Theory, practice and research implications. TESOL Quarterly, 45(2), 269–279.

Using Confidence Rating Scales to Deepen Vocabulary and Grammar Learning in the MFL Classroom

Introduction: What if students could tell us not just what they know, but how sure they are about it?

In modern language classrooms, we work hard to help our students master vocabulary and grammar, but how often do we ask them how confident they are in their knowledge? Confidence rating scales are a simple but powerful way to do just that. By getting students to reflect on how certain they are when recalling a word or grammar rule, we give them the chance to take ownership of their learning. Even better, the research backs us up.

What are confidence rating scales?

Confidence rating scales are tools that ask students to rate how sure they are about an answer, a word, or a grammar structure. You might use a simple three-point scale:

  • 😊 I’m sure I know this
  • 😐 I think I know it, but I’m not sure
  • 😟 I don’t know this yet

Or you might go a little deeper with a 0–3 scale:

| Score | What it means |
|---|---|
| 3 | I knew this instantly and confidently |
| 2 | I remembered part of it with hesitation |
| 1 | I needed help or guessed |
| 0 | I didn’t recall it at all |

Students use these after a quiz, during flashcard work, in retrieval grids, or even as part of dictation or translation tasks.
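As a rough illustration (the thresholds and column names below are my own assumptions, not taken from the research cited), ratings on the 0–3 scale could be turned into a prioritised review queue and the three revision columns in just a few lines:

```python
# Sketch: turn 0-3 confidence self-ratings into a prioritised review list.
# Words rated 0-1 need re-teaching, 2 needs light review, 3 can wait.
ratings = {
    "bousculer": 3,
    "gronder": 1,
    "ressentir": 0,
    "soutenir": 2,
}

def review_queue(ratings: dict[str, int]) -> list[str]:
    """Lowest-confidence words first; ties broken alphabetically."""
    return sorted(ratings, key=lambda w: (ratings[w], w))

def triage(ratings: dict[str, int]) -> dict[str, list[str]]:
    """Group words into three revision columns by confidence score."""
    columns = {"revise thoroughly": [], "needs review": [], "confident": []}
    for word, score in sorted(ratings.items()):
        if score <= 1:
            columns["revise thoroughly"].append(word)
        elif score == 2:
            columns["needs review"].append(word)
        else:
            columns["confident"].append(word)
    return columns

print(review_queue(ratings))  # ['ressentir', 'gronder', 'soutenir', 'bousculer']
```

The same logic works just as well on paper: the point is simply that the rating, not the test score, decides what gets revisited first.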

Why should we use them in MFL?

1. They promote real learning, not false confidence. Sometimes students get things right by chance or because it “looks familiar.” But they might not be able to use it fluently or accurately later. Confidence scores help them realise the difference between recognising a word and truly knowing it.

2. They improve memory. Cognitive scientists like Roediger and Butler (2011) have shown that the harder your brain has to work to recall something, the stronger the memory becomes. When students notice which words they’re shaky on and review them again later, they’re more likely to remember them long-term.

3. They support spaced retrieval. Instead of cramming, students revisit tricky words over time—especially the ones they’re not confident with. This helps beat the forgetting curve (Ebbinghaus) and means they’re more prepared for later assessments.

4. They build metacognition. Metacognition means thinking about your own thinking. When students reflect on how sure they are, they learn to study more strategically. Research shows this self-awareness makes learners more effective (Dunlosky & Metcalfe, 2009).
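The forgetting curve mentioned in point 3 is often modelled as exponential decay, R = exp(−t/S), where t is the time since study and S is the stability of the memory. This toy sketch (an illustration only; treating each successful spaced recall as roughly doubling S is my assumption, not a finding from the literature) shows why revisiting shaky words pays off:

```python
import math

# Illustrative exponential model of the Ebbinghaus forgetting curve:
# R = exp(-t / S), with t = days since study, S = memory "stability".
def retention(days_elapsed: float, stability: float) -> float:
    return math.exp(-days_elapsed / stability)

s = 2.0  # initial stability in days
print(round(retention(7, s), 2))   # a week later, little is left

s *= 2   # after one successful spaced recall (assumed doubling)
s *= 2   # after a second
print(round(retention(7, s), 2))   # the same week-long gap now costs far less
```

Whatever the exact numbers, the qualitative lesson matches the research above: each spaced, effortful retrieval flattens the curve.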

How can we use confidence ratings in practice?

Here are a few examples that are easy to implement in your classroom:

  • During vocabulary quizzes: After each answer, students rate their confidence—even if they got it right. This tells them which words to review again.
  • In vocab books or learning journals: Students can keep a running record of how confident they feel about new words, updating scores over time.
  • With retrieval grids or sentence builders: After completing a task, students mark confidence levels for each word or structure.
  • At revision time: Ask students to sort their vocabulary into three columns: “I’m confident,” “I need a little review,” and “I need to revise this thoroughly.”

What does the research say?

There’s a growing body of research supporting this approach:

  • Roediger & Butler (2011) found that retrieval combined with self-monitoring leads to better long-term retention.
  • Dunlosky & Metcalfe (2009) show that confidence ratings help students target what they really need to work on, reducing wasted time.
  • Pyc & Rawson (2009) found that recalling difficult or low-confidence items leads to stronger memory than reviewing what’s already known.
  • Nation (2007) and Webb (2007) both emphasise that multiple exposures and reflection are key to deep vocabulary learning.

Final thoughts: A small change with a big impact

Confidence rating scales are easy to use, cost nothing, and take very little time. Yet they can dramatically improve how our students learn and revise. They also give us teachers valuable insight: not just what students got right, but what they’re unsure about—even if they’re not saying it out loud.

In the context of GCSE or KS3 MFL, especially with spaced retrieval cycles for vocabulary (like holidays, media, or school life), confidence ratings add a layer of depth. They help us and our students focus not just on performance, but on progress.

Want to try it? Start small: at the end of your next vocab lesson, ask students to rate how confident they feel about 10 key words. Use smiley faces, numbers, or thumbs up/down. Then revisit those low-confidence words in a week.

It’s a simple change. But in language learning, simple and consistent often wins the race.

References

  • Roediger, H. L., & Butler, A. C. (2011). The critical role of retrieval practice in long-term retention. Trends in Cognitive Sciences.
  • Dunlosky, J., & Metcalfe, J. (2009). Metacognition. Sage Publications.
  • Pyc, M. A., & Rawson, K. A. (2009). Testing the retrieval effort hypothesis. Journal of Memory and Language.
  • Nation, I. S. P. (2007). The vocabulary learning strand.
  • Webb, S. (2007). The effects of receptive and productive vocabulary learning. Studies in Second Language Acquisition.

Cognates: Their Power, Pitfalls and Untapped Potential, and Implications for the Classroom

Introduction

Let’s face it: in the chaotic symphony of vocabulary lists, verb drills, and assessment pressure, cognates are the low-hanging fruit we often forget to pick. And yet, they’re the closest thing to linguistic gold we’ve got in the MFL classroom—a built-in bridge between English and French (or Spanish, or German) just waiting to be walked across.

Cognates are the words that look like English, sound like English, and—hallelujah—mean the same thing as English. Nation, communication, participer, pollution… You say them, and students get them. Or do they?

Because here’s the twist: while cognates can fast-track recognition, build confidence, and turbo-charge reading and listening skills, they can also trip learners up, mislead them, or lull them into false fluency. Not all that glitters is gold—especially if it turns out to be a false friend in disguise.

This article dives into the power, the pitfalls, and the untapped potential of cognates. We’ll unpack why they work, when they fail, and how to teach them with more than just a passing nod. You’ll get research, strategy grids, and even a few false friends to watch out for.

So whether you’re new to teaching or seasoned in the MFL trenches, read on: it’s time to give cognates the spotlight they deserve.

What Are Cognates, and Why Do We Use Them?

Cognates are words in two different languages that look or sound similar and mean the same thing. For example, information in English and information in French are cognates. Teachers love using cognates because they help learners feel successful early on. Up to 30–40% of words in French GCSE lists have a cognate in English, so it’s tempting to highlight them often.

Table 1 – Percentage of English cognates in French, Spanish, Italian and German

Psycholinguists say that when we see a cognate, both our first language (L1) and second language (L2) “light up” in the brain. This speeds up word recognition. In terms of brain science, this effect is tied to how bilinguals store and retrieve words across both languages. According to the BIA+ model (Dijkstra & Van Heuven, 2002), word recognition happens in a shared network that handles both languages simultaneously, allowing for what’s called non-selective lexical access. In short, the brain doesn’t “switch off” one language when using the other—it activates both. Brain imaging studies (van Hell & Tanner, 2012) confirm this, showing that learners process cognates with less effort, especially in reading tasks, due to overlapping neural representations in areas like the left inferior frontal gyrus.

What Makes Cognates Helpful?

a. They speed up understanding

When we see a familiar-looking word in a new language, our brain recognises it faster. Costa et al. (2000) found that learners were 30% faster at recognising cognates than non-cognates in word recognition tasks. This is because both language systems in the brain activate together, allowing a kind of shortcut through the brain’s semantic network.

b. They support memory

Because cognates activate both L1 and L2 memory at the same time, they are more likely to stick. This is called co-activation, and it helps learners retrieve the word more easily later. Connectionist models of language processing (like those developed by McClelland & Elman, 1986) suggest that overlapping nodes in a neural network strengthen memory links through repeated exposure, making cognates more retrievable than unrelated new words.

c. They create early success

In beginner classrooms, recognising a word like chocolat or télévision can boost students’ confidence. This can increase motivation and encourage learners to keep going. From a psycholinguistic point of view, this early success taps into the brain’s reward system, reinforcing continued effort through a feedback loop of positive emotional response and reduced cognitive load.

Why Cognates Are Not Always Easy: The Hidden Challenges

Despite these advantages, recognising and using cognates isn’t always as easy as it seems. First, students often fall for false friends—words that look the same but don’t mean the same. For instance, actuellement in French means currently, not actually. These “traps” can mislead learners and cause misunderstandings.

Table 2 – 15 tricky false friends in French

| English Word | French False Friend | Real Meaning in French |
|---|---|---|
| Actually | Actuellement | Currently |
| Library | Librairie | Bookshop |
| Coin | Coin | Corner |
| Sympathetic | Sympathique | Nice, friendly |
| College | Collège | Secondary school |
| Attend | Assister | To be present (not to help) |
| Pretend | Prétendre | To claim (not to act) |
| Sensible | Sensible | Sensitive |
| Name | Nom | Name |
| Fabric | Fabrique | Factory |
| Delay | Délai | Deadline / time limit |
| Injury | Injure | Insult |
| Large | Large | Wide |
| Resume | Résumer | To summarise |
| Eventually | Éventuellement | Possibly |

Second, research shows that even true cognates aren’t always recognised correctly. A study by Otwinowska (2015) found that Polish learners of English recognised only 45% of potential cognates in a text, even at intermediate level. This shows that simply pointing them out once isn’t enough. Why? Because even though certain cognates might look and sound familiar in French or Spanish, students will only successfully recognise them as “safe matches” if they’ve built up a strong, reinforced mental representation of the word in their first language (L1) and done so in the same modality (reading, listening, etc.). Let’s unpack this with research:

  1. Input modality matters:
    According to Carroll (2017) and Otwinowska (2015), learners are far more likely to correctly recognise a cognate in reading if they’ve encountered the English equivalent in print. Conversely, if they’ve mostly heard a word orally (e.g., through TV or conversation), they’re better at recognising the cognate in listening tasks. The brain’s phonological and orthographic pathways are somewhat separate in early L2 development.
  2. Frequency builds strength:
    In connectionist models of language acquisition (e.g. McClelland & Elman, 1986), word recognition relies on the strength of neural activation pathways, which grow through repetition and context diversity. If a learner has read “information” in ten different articles, the concept and form become tightly linked. When they see information in French, their brain is more likely to light up with the correct match.
  3. Cross-modal mismatch leads to errors:
    Learners who hear a word often but have never seen it written may not recognise a cognate when reading—and vice versa. This is especially problematic in French, where spelling-sound mismatches are common. For example, a student who hears “accident” on the news might not recognise accident in French reading tasks if they’ve never seen the English version written down.
  4. Shallow L1 familiarity = higher error rate:
    Otwinowska (2015) also showed that learners with lower L1 vocabulary depth had up to 40% lower cognate recognition rates, even when the words were formally transparent. In other words, even a perfect formal match doesn’t guarantee transfer—semantic familiarity is key.

Third, hearing cognates in spoken language is even harder. In listening tasks, differences in pronunciation often mask the similarity. Carroll (2017) found that learners recognised only 28% of oral cognates they had previously identified in writing.

This is because phonological input relies heavily on auditory processing, and the brain must match incoming sounds to stored word forms across two languages. This can be especially difficult if the sound patterns differ significantly from those in the learner’s L1. Neural studies (Proverbio et al., 2004) show increased activation in the anterior cingulate cortex—a region linked to conflict monitoring—when learners process false or unfamiliar cognates, especially under time pressure.

Why Cognate Training Needs Time and Structure

Many teachers assume that students will naturally spot cognates, but research into strategy training shows the opposite. Learners need structured, repeated practice in identifying and using cognates across different skills—especially listening.

Research indicates that metacognitive strategy training—such as learning how to guess meaning from context or sound out similarities—must be taught explicitly over several weeks. Without this, students revert to shallow guessing or ignore difficult words.

Gu & Johnson (1996) found that learners who received at least 6 weeks of metacognitive strategy training showed significantly greater vocabulary retention and inference accuracy—including the ability to use cognates appropriately.

Macaro (2001) emphasised that lexical inferencing (the process of guessing meaning, often via cognates) is not an innate skill and requires explicit modelling, repeated practice, and metacognitive reflection across at least 5–6 weeks.

Otwinowska (2015) reported that effective cognate identification required multimodal exposure (reading, listening, speaking) over time, with structured support, particularly for learners with lower L1-L2 language distance.

Table 3 – Recommended duration of a cognate-recognition training based on research

| Focus Area | Minimum Effective Duration | Research Source |
|---|---|---|
| Written cognate recognition (reading) | 4–6 weeks | Gu & Johnson (1996); Otwinowska (2015) |
| Oral cognate recognition (listening) | 6–8 weeks | Carroll (2017); Macaro (2001) |
| Cross-skill integration | 6+ weeks | Macaro (2001); Nation (2007) |

Note: Most MFL classrooms do not offer this kind of long-term structured training, especially at KS3 and KS4. Cognates are often mentioned in passing, but not practised systematically.

Conclusions

Cognates can be a powerful teaching tool—when used with intention. They offer early wins, speed up recognition, and make learners feel more confident. But they can also lead to confusion, mislearning, and overconfidence if used without care.

To make the most of cognates:

  • Teach the difference between true and false cognates
  • Train reading and listening strategies that help students spot cognates in both print and speech. Listening with transcripts is particularly effective for training oral recognition of cognates.
  • Use retrieval and metacognition (like confidence scales) to track which ones stick
  • Don’t assume students “just see them”—train them to do it

Used well, cognates help unlock the language. Used poorly, they can block real understanding.

Teachers cannot expect students to recognise cognates easily, especially under the demanding cognitive constraints of an exam, without substantive and systematic training in cognate-recognition strategies.

Some key stats and research facts that every teacher should always bear in mind

  • Students recognise cognates around 30% faster than non-cognates (Costa et al., 2000)
  • In practice, learners only recognise 45% of true cognates in texts (Otwinowska, 2015)
  • In listening, recognition drops to just 28% (Carroll, 2017)
  • A strategy-training programme in cognate recognition needs to last at least 6 weeks
  • Students from low socio-economic backgrounds are less likely to recognise cognates, as these tend to be Latinate words used in higher-register English

References (Selected)

  • Carroll, G. (2017). Cognate recognition in L2 listening. Journal of French Language Studies.
  • Costa, A., Caramazza, A., & Sebastián-Gallés, N. (2000). The cognate facilitation effect. Journal of Experimental Psychology.
  • Dijkstra, T., & Van Heuven, W. J. B. (2002). The architecture of the bilingual word recognition system: From identification to decision. Bilingualism: Language and Cognition.
  • Gu, Y., & Johnson, R. K. (1996). Vocabulary learning strategies and language learning outcomes. Language Learning.
  • Macaro, E. (2001). Learning strategies in foreign and second language classrooms.
  • McClelland, J. L., & Elman, J. L. (1986). The TRACE model of speech perception. Cognitive Psychology.
  • Otwinowska, A. (2015). Cognate vocabulary in language acquisition and use. Multilingual Matters.
  • Proverbio, A. M., et al. (2004). Electrophysiological analysis of semantic processing in bilinguals. Psychophysiology.
  • van Hell, J. G., & Tanner, D. (2012). Neural activity during cognate processing in L2 learners. Language Learning.

Input Flood: A Research-Based Strategy for Enhancing Language Acquisition

Introduction

Input flood is a technique used in second language acquisition (SLA) in which learners are repeatedly exposed to a specific linguistic feature embedded naturally in comprehensible input. Unlike traditional grammar teaching that isolates and explains rules, input flood saturates learners with high-frequency occurrences of a target form across meaningful contexts. Over time, this facilitates both unconscious acquisition and eventual accurate production. As learners internalise these structures through multiple exposures, they develop both greater receptivity to form and more intuitive usage.
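As a rough illustration of what “saturating the input” means, the sketch below counts occurrences of one target form per 100 words of a practice text. The regex (French avoir + past participles in -é) and the density metric are illustrative assumptions for this post, not part of any published operationalisation of input flood.

```python
import re

# Illustrative sketch: checking how densely a practice text "floods"
# learners with a target form -- here, French perfect-tense forms with
# "avoir" (e.g. "a mangé", "ont fini" -> this regex only catches -é
# participles). Pattern and metric are assumptions for demonstration.
TARGET = re.compile(r"\b(ai|as|a|avons|avez|ont)\s+\w+é\b", re.IGNORECASE)

def flood_density(text: str) -> float:
    """Occurrences of the target form per 100 words of text."""
    words = len(text.split())
    hits = len(TARGET.findall(text))
    return 100 * hits / words if words else 0.0

sample = ("Hier, Marie a mangé une pizza. Ses amis ont regardé un film, "
          "puis ils ont joué au foot. Moi, j'ai travaillé toute la soirée.")
print(f"{flood_density(sample):.1f} target forms per 100 words")
```

A quick check like this can help when writing or selecting flooded texts: if the target structure barely appears, the text is ordinary input, not an input flood.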

In recent years, input flood has received growing attention not just as a standalone instructional tool but as a foundational principle to be integrated into a broad array of pedagogical approaches. In this article, I explore the theoretical underpinnings, empirical evidence, practical advantages, and synergistic effects of input flood with related instructional techniques such as narrow reading/listening, processing instruction, structural priming, and its central role in my own EPI (Extensive Processing Instruction) approach.

Theoretical Foundations

Input flood draws its conceptual foundation from Krashen’s (1982) Input Hypothesis, which holds that acquisition occurs when learners are exposed to language that is comprehensible but slightly beyond their current level (i+1). According to Krashen, such exposure must be extensive, contextualised, and focused more on meaning than form. Input flood satisfies these criteria by embedding repeated target structures within rich, meaningful contexts.

Additionally, usage-based theories of language acquisition (Ellis, 2002; Tomasello, 2003) support the role of frequency and distribution in shaping mental grammar. Learners abstract rules from repeated input—a process known as entrenchment—and begin to notice regularities without overt instruction. Ellis (2005) further notes that learners develop implicit grammatical knowledge more effectively through frequency-based exposure than through metalinguistic explanation. These foundational principles are also echoed in my EPI framework (Conti & Smith, 2021), where input flood is employed as a core tool to facilitate structured exposure to syntactic patterns across all modes of input.

Empirical Support for Input Flood

Research into input flood demonstrates its potential to promote robust form acquisition. Trahey and White (1993) found that French-speaking ESL learners showed significant improvements in adverb placement after exposure to input floods, even without any explicit instruction. Their study provided early evidence that frequency and input salience can alter learner output.

Doughty and Varela (1998) examined the use of input flood in combination with task-based interaction and corrective feedback, showing substantial gains in learners’ use of the English past tense. Similarly, Hernández (2008) demonstrated that input flood delivered via reading and listening texts targeting the Spanish subjunctive resulted in significant improvements in learner accuracy and recognition.

Studies by Han, Park, and Combs (2008) and others confirm that when relative clauses or tense forms are embedded repeatedly in context, learners show both immediate and delayed gains in their comprehension and production. These effects have also been replicated across multiple learner age groups and instructional settings, suggesting wide applicability. Within my EPI model, input flood is operationalised through sentence builders, L.A.M. (listening-as-modelling) tasks, narrow reading texts, and retrieval-based repetition—designed to maximise cognitive uptake of key patterns.

Benefits of Input Flood

  1. Implicit Learning Support: Repeated exposure to linguistic features in input enables learners to acquire grammatical forms subconsciously (Hulstijn, 2005). This mirrors how first languages are learned and helps avoid cognitive overload.
  2. Low Cognitive Load: By bypassing the need for conscious form analysis, input flood reduces processing burden and facilitates more natural uptake (Sweller, 1994).
  3. Durability: Acquisition through input flood tends to result in longer-lasting knowledge compared to rule memorisation or output drills, particularly when exposure is meaningful (Ellis, 2005).
  4. Engagement and Motivation: When input flood is implemented using authentic and engaging materials—such as narratives, dialogues, and songs—it increases learner motivation and contextualises grammar in communicative use.
  5. Accessibility Across Levels: Unlike grammar instruction requiring a threshold of metalinguistic knowledge, input flood can be implemented with learners of all proficiency levels.
  6. Facilitates Noticing: Learners become sensitised to recurring forms, which promotes deeper processing and eventual rule abstraction.
  7. Structured Implementation within EPI: In my EPI framework, input flood is meticulously sequenced across layers of receptive and productive practice, ensuring multiple memory traces are established before output is expected.

Conclusion

Input flood is a powerful, flexible, and research-validated method for supporting naturalistic language acquisition in the classroom. It offers a low-stress, high-exposure route to acquisition that aligns with core cognitive and linguistic theories. While effective on its own, input flood’s true potential is realised when it is synergised with input enhancement, processing instruction, narrow input strategies, and structural priming.

In my EPI framework, input flood is the backbone of receptive and early productive stages, designed to create robust memory traces through exposure, modelling, repetition, and retrieval. When implemented thoughtfully and supported by purposeful output tasks, input flood can transform the classroom into a rich ecosystem for language acquisition—promoting fluency, accuracy, and long-term retention. Teachers seeking to align instruction with second language acquisition research would do well to embrace this principle as a cornerstone of their practice.

References

  • Conti, G., & Smith, S. (2021). Breaking the Sound Barrier. Routledge.
  • Bock, K. (1986). Syntactic persistence in language production. Cognitive Psychology, 18(3), 355–387.
  • Doughty, C., & Varela, E. (1998). Communicative focus on form. In C. Doughty & J. Williams (Eds.), Focus on Form in Classroom SLA. Cambridge University Press.
  • Ellis, R. (2002, 2003, 2005). Task-Based Language Learning and Teaching. Oxford University Press.
  • Fernández, C., & Schmitt, N. (2015). Narrow reading and vocabulary acquisition. Reading in a Foreign Language, 27(2), 129–145.
  • Han, Z., Park, E. S., & Combs, C. (2008). Textual enhancement of input: Issues and possibilities. Applied Linguistics, 29(4), 597–618.
  • Hernández, T. (2008). The effect of input-based instruction on the acquisition of the Spanish subjunctive. Language Teaching Research, 12(3), 365–385.
  • Hulstijn, J. (2005). Theoretical and empirical issues in the study of implicit and explicit learning. Studies in SLA, 27.
  • Krashen, S. (1982, 2004). Principles and Practice in Second Language Acquisition. Pergamon.
  • Lightbown, P., & Spada, N. (1990). Focus on form and corrective feedback in communicative language teaching. Studies in SLA, 12(4).
  • McDonough, K., & Mackey, A. (2008). Syntactic priming and ESL question development. Studies in SLA, 30(1), 31–47.
  • Rodrigo, V., Krashen, S., & Gribbons, B. (2007). The effectiveness of narrow reading on language acquisition. International Journal of Foreign Language Teaching, 3(1), 1–5.
  • Sharwood Smith, M. (1993). Input enhancement in instructed SLA. Studies in SLA, 15(2), 165–179.
  • Shin, Y., Saito, K., & Aubrey, S. (2021). Structural priming and L2 oral grammar development. Language Learning, 71(2), 291–333.
  • Sweller, J. (1994). Cognitive load theory, learning difficulty, and instructional design. Learning and Instruction, 4(4), 295–312.
  • Tomasello, M. (2003). Constructing a Language: A Usage-Based Theory of Language Acquisition. Harvard University Press.
  • Trahey, M., & White, L. (1993). Positive evidence and preemption in the second language classroom. Studies in SLA, 15, 181–204.
  • VanPatten, B. (2004). Input Processing in SLA. Erlbaum.
  • White, L. (1998). Getting the learners’ attention: Input enhancement in SLA. In C. Doughty & J. Williams (Eds.).

Unlocking Fluency through Extensive Processing Instruction: A Two-Part Training with me

Are you looking to revolutionise your language teaching practice with a proven, evidence-informed approach? Join us on the 29th and 30th of April for a comprehensive workshop on Extensive Processing Instruction (EPI), delivered by me, the creator of the approach.

Hosted on networkforlearning.org.uk, this exclusive online training opportunity will give you both the theoretical foundation and the practical tools to design and deliver lessons that build strong, lasting linguistic competence. Whether you’re a classroom teacher, a department lead, or a curriculum designer, this course offers invaluable insight into how to create sequences of instruction that lead to genuine, long-term language acquisition. You can enrol here.

Why EPI

EPI is a pedagogical approach that places processing at the centre of language learning. Steeped in sound SLA theory and research, it promotes deep, repeated exposure to meaningful input and structured output activities designed to progressively build fluency.

Rather than rushing into production, EPI sequences learning through carefully scaffolded stages that support long-term acquisition. It has been widely adopted across classrooms in the UK, Australia, and beyond for one key reason: it works.

With EPI, learners:

  • Build automaticity through repetition with variation
  • Develop grammatical competence in context
  • Gain confidence before being expected to speak or write
  • Experience success in manageable, motivating steps
  • Engage meaningfully with tasks that match their level and cognitive maturity

EPI also represents a meaningful shift away from shallow, performance-driven tasks towards activities that promote depth of processing and long-term retention. The approach encourages students to notice language features, retrieve vocabulary with support, and gradually move from supported rehearsal to spontaneous communication.

Theories and Research Behind EPI

My model is not only intuitive—it’s deeply rooted in some of the most influential theories in second language acquisition. EPI draws strength from a confluence of research traditions in cognitive psychology, applied linguistics, and educational neuroscience. What makes EPI particularly distinctive is the way it operationalises these theories into teachable routines and sequences that address both comprehension and production.

At the core is the skill-acquisition theory (Anderson), which posits that language knowledge must be proceduralised through repeated, meaningful practice. This principle is echoed in the carefully graduated steps of the MARS EARS cycle, which begin with heavy scaffolding and gradually foster autonomy and spontaneity.

EPI also incorporates elements of VanPatten’s input processing theory, which highlights the cognitive limitations learners face when trying to process form and meaning simultaneously. In EPI, both listening and reading are approached as modelling experiences — not comprehension tests — allowing students to absorb form-function relationships through repeated, comprehensible input.

John Field’s process-based approach to listening adds further support, positioning listening as an active, trainable skill. EPI’s Listening as Modelling phase aligns with this view by focusing on decoding, segmentation and phonological pattern recognition rather than mere gist-taking.

The Noticing Hypothesis (Schmidt) is reflected in the inclusion of awareness-raising tasks that follow the modelling phase. These tasks draw learners’ attention to specific grammatical or lexical patterns they have already encountered in context, preparing them for more autonomous production later.

From a cognitive angle, working memory theory (Baddeley) justifies EPI’s emphasis on recycling, repetition, and low cognitive-load activities in the initial stages. Tasks are carefully designed to optimise processing and retention by staying within learners’ cognitive capacity.

The approach also draws on Michael Hoey’s Lexical Priming theory, which underscores the role of repeated, context-rich exposure in vocabulary acquisition. EPI ensures high-frequency items and chunks reappear across lessons in familiar but varied linguistic environments.

Krashen’s comprehensible input hypothesis informs the initial modelling and receptive phases of EPI. However, unlike approaches that stop at input, EPI builds systematically toward output, ensuring that learners are prepared to speak and write with increasing spontaneity.

Significantly, EPI integrates Swain’s concept of pushed output by incorporating activities that challenge learners to stretch their language resources. Structured and semi-spontaneous tasks gradually reduce support while increasing communicative demands.

Finally, Paul Nation’s fluency training principles are central to the final phases of the MARS EARS cycle. Repetition with variation, time constraints, and the use of familiar language all serve to automate recall and build processing speed.

Here is a breakdown of the major research foundations EPI draws upon:

| Theory / Research Area | Core Idea | How EPI Applies It |
|---|---|---|
| Skill-Acquisition Theory (Anderson, 1982) | Learning moves from declarative to procedural knowledge via practice | MARS EARS stages mirror this progression |
| Input Processing (VanPatten, 1996) | Learners must process form and meaning in input | Listening as Modelling / Reading as Modelling |
| Working Memory (Baddeley, 2000) | Repetition and cognitive load affect language retention | Recycling and repetition built into all stages |
| Noticing Hypothesis (Schmidt, 1990) | Awareness of form is crucial to acquisition | Awareness-raising tasks included after initial modelling |
| Comprehensible Input (Krashen, 1982) | Input must be understandable and meaningful | Highly scaffolded, high-frequency structures |
| Lexical Priming (Hoey, 2005) | Language is learned through repeated exposure in context | EPI recycles lexis and grammar in varied but familiar contexts |
| Process-Based Listening (Field, 2008) | Listening involves teachable, trainable sub-skills | EPI explicitly teaches decoding and form recognition via modelling |

Workshop Overview: What You’ll Learn

🗓 Dates: Monday 29th & Tuesday 30th April
🕐 Time: 5pm–8pm AEST / 8am–11am UK time
🖥 Location: Online via networkforlearning.org.uk

This two-part course, delivered over two evenings, offers a step-by-step guide to designing and implementing an EPI-based curriculum.

Session 1: Planning and Priming

  • How to design a coherent EPI unit
  • Choosing and sequencing high-frequency content
  • Introducing Modelling, Listening as Modelling and Reading as Modelling
  • Priming techniques that activate prior knowledge and prep for success

Sessions 2 & 3: Grammar, Production, and Fluency

  • Structured production activities that boost accuracy and confidence
  • Grammar teaching aligned with processing principles
  • Tasks to promote fluency, spontaneity, and automaticity
  • A full walk-through of the MARS EARS cycle:

MARS – Modelling, Awareness-Raising, Receptive Processing, Structured Production
EARS – Expansion, Autonomy, Routinisation, Spontaneity

Throughout, I will link every activity to its pedagogical rationale, ensuring that you not only know what to do but also why.

This course counts towards EPI CPD-provider accreditation and provides an ideal foundation for those wishing to become an EPI Accredited Teacher.

👉 For more on becoming an accredited EPI teacher, visit the course portal at networkforlearning.org.uk.

Conclusion

Whether you’re new to EPI or looking to deepen your understanding of its application, this workshop offers the perfect opportunity to engage with the pedagogy in a practical and research-informed way. You’ll leave with a clear, actionable roadmap for designing units, delivering instruction, and supporting learners through every phase of the MARS EARS cycle, all deeply steeped in SLA theory and research (see the table above).

More than just a methodology, EPI is a mindset—one that places meaningful input, scaffolded output, and cognitive development at the heart of language learning. If you’re ready to move away from traditional textbook routines and start building confident, spontaneous language users, I look forward to welcoming you to this highly interactive, energising two-part training.

How Long Does It Take to Learn a Language? Understanding the Factors That Make Some Languages Harder Than Others

Introduction

Learning a language is one of the most enriching things a person can do — and also one of the most misunderstood. In an age of language apps, TikTok polyglots, and soundbite promises of “fluency in 30 days,” it’s easy to lose sight of the reality: becoming proficient in a second language takes time, consistency, and the right conditions. For learners in school systems like the UK, this journey is even more complex, shaped by external constraints and the inherent properties of the target language. So how long does it really take to learn a language? And why do some languages seem much harder than others?

This article draws on research from the Foreign Service Institute (FSI), CEFR, and classroom-based studies to explore what affects the pace of language acquisition. It also offers a grounded, realistic picture of what learners and teachers can expect — particularly in contexts where input is limited and achievement targets are set high.

What Does “Proficiency” Really Mean?

Proficiency, in the context of language learning, is not a vague sense of “being good at it.” It refers to a learner’s ability to understand and use the target language accurately and fluently across listening, speaking, reading, and writing. Most countries and education systems in Europe (including the UK) define proficiency using the Common European Framework of Reference for Languages (CEFR), which spans six levels:

| CEFR Level | Description | Estimated Guided Learning Hours |
|---|---|---|
| A1 | Beginner: basic everyday expressions | 90–100 |
| A2 | Elementary: simple tasks, routine exchanges | 180–200 |
| B1 | Intermediate: deal with familiar topics | 350–400 |
| B2 | Upper Intermediate: interact with fluency | 500–600 |
| C1 | Advanced: complex ideas, nuanced discourse | 700–800 |
| C2 | Mastery: near-native precision and style | 1,000+ |

According to Council of Europe (2001) guidelines and subsequent classroom-based research (North, 2014), these estimates are based on well-structured and consistent instruction. In practice, real-world learners — especially adolescents in school systems — often take significantly longer due to fragmented exposure, limited hours, and inconsistent practice. 

In theory, reaching B1 level proficiency requires approximately 350–400 guided learning hours (Council of Europe, 2001). However, in England, current constraints in curriculum time, coupled with recent reforms such as the reduced lexical scope of GCSE word lists (e.g. the 2024 MFL Subject Content), mean that learners have access to far fewer vocabulary items than would typically be required to support full B1-level interaction.

If we consider that most students receive around 2.5 hours per week over five years of Key Stage 3 and 4 (roughly 36 weeks per year), this only amounts to around 450 contact hours — and that’s an optimistic estimate that doesn’t account for absences, disruptions, or limited practice time. Even with full attendance, much of this time is spent revisiting basic content, rehearsing exam techniques, and teaching to the test.
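The contact-hours estimate above is easy to verify; all the figures in the snippet below are the article's own (optimistic) assumptions.

```python
# Back-of-envelope check of the contact-hours estimate in the text.
hours_per_week = 2.5
weeks_per_year = 36
years = 5                     # Key Stages 3 and 4 combined

contact_hours = hours_per_week * weeks_per_year * years
print(f"Nominal contact hours: {contact_hours:.0f}")   # 450

# CEFR guided-learning estimate for B1 (Council of Europe, 2001)
b1_low, b1_high = 350, 400
print(f"On paper this clears the B1 band of {b1_low}-{b1_high} hours, "
      "but absences, revision and exam rehearsal eat into it.")
```

In other words, the nominal total only just clears the theoretical B1 threshold, and the real figure, once disruptions and exam rehearsal are subtracted, falls well short.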

Research by Graham et al. (2017) and Mitchell & Marsden (2019) shows that in the current climate, learners in England rarely exceed A2, and only a small subset achieve even partial B1 functionality by the end of GCSE.

One important lens researchers use to evaluate developing proficiency — especially in speaking and writing — is the CAF framework, which breaks performance down into Complexity, Accuracy, and Fluency. Studies applying CAF (e.g., Housen, Kuiken & Vedder, 2012) have shown that genuine B1-level production requires a level of automaticity and flexibility rarely seen in GCSE learners, who are often trained in short, memorised chunks with limited scope for spontaneous, accurate, or complex output.

How Long Does It Typically Take?

The most widely cited estimates come from the Foreign Service Institute (FSI), which trained US diplomats in foreign languages and ranked languages into categories based on the average number of guided learning hours required to reach Professional Working Proficiency (B2–C1 on the CEFR scale) for English-speaking adults.

| FSI Category | Language Examples | Hours to Proficiency |
|---|---|---|
| Category I – Easy | French, Spanish, Italian, Portuguese | 600–750 hours |
| Category II – Moderate | German, Swahili, Indonesian, Malay | 750–900 hours |
| Category III – Difficult | Russian, Turkish, Polish, Romanian, Greek, Hindi | 1,100 hours |
| Category IV – Very Difficult | Arabic, Mandarin, Cantonese, Japanese, Korean | 2,200+ hours |

These figures are based on full-time, intensive study environments typical of diplomatic or military training — around 25 hours of instruction per week, plus self-study and immersion, over a period of several months to a few years. They reflect a scenario where language learning is the learner’s primary focus, and the teaching methods are structured, high-quality, and consistent.

In contrast, learners in school systems — like secondary students in the UK — operate under very different conditions. Time on task is far more limited, often fragmented, and diluted by competing curricular demands. As such, these FSI estimates should be interpreted as best-case scenarios rather than realistic expectations for the average school learner.

Moreover, the FSI’s classifications are based on distance from English and do not account for variables such as learner motivation, teaching quality, or access to authentic input — all of which have been shown to significantly influence rate of progress (Muñoz, 2014; Lightbown & Spada, 2013).

Taken together, the FSI framework is best viewed as a useful guideline for comparing relative difficulty, not absolute timelines. For most learners in non-immersive settings, the number of hours required to achieve the same levels of proficiency is likely to be significantly higher.

What Makes a Language Intrinsically Harder?

Not all languages are created equal in terms of learnability — especially for English speakers. Several linguistic and cognitive factors play a major role in determining how quickly a learner can internalise a new language system.

Orthographic Distance – How different is the writing system?

Languages like Mandarin, Arabic, and Japanese use writing systems that differ significantly from the Roman alphabet. Learning to decode and produce logographic characters (e.g. 汉字 in Chinese) or abjads (where vowels are omitted, as in Arabic) increases the cognitive load.

Phonological Complexity – How hard is it to pronounce and distinguish sounds?

Languages with a high number of phonemes, unfamiliar consonant clusters, or tonal variation can be harder to acquire. For instance, Mandarin Chinese has four lexical tones, meaning the pitch pattern of a syllable changes the word’s meaning. Meanwhile, Arabic includes sounds such as the pharyngeal fricatives /ʕ/ and /ħ/, which don’t exist in English.

Morphological Complexity – How many forms do words take?

Highly inflected languages like Russian or Polish require learners to memorise numerous case endings, gender rules, and verb conjugations. This contrasts with relatively analytic languages like English or Mandarin, where word order plays a bigger role than word form.

Syntactic Distance – How different is the sentence structure from English?

Languages with subject-object-verb (SOV) word order, like Japanese and Korean, or those with flexible word order, pose difficulties for English speakers used to the relatively rigid subject-verb-object (SVO) pattern. Embedded clauses, topic-prominent constructions, and honorific systems further complicate the learner’s task.

Lexical Similarity – How much vocabulary is familiar?

Languages that share cognates with English (due to Latin or Germanic roots), like French or Spanish, offer a significant head start. In contrast, languages from different families — like Korean or Hungarian — offer few lexical connections and require learning thousands of entirely new word forms.

Environmental, Learner, Curriculum and Methodological Factors That Prolong the Process

Beyond the linguistic complexity of a given language, numerous external and internal factors significantly influence how long it takes to become proficient — often extending the process well beyond theoretical estimates.

Learning Environment

Many students in mainstream education are limited by the constraints of the school timetable. Language learning is typically allotted no more than two or three sessions a week, and even that is subject to cancellations, assessments in other subjects, or pastoral disruptions. Additionally, exposure to the language outside the classroom is often minimal, especially in predominantly monolingual environments like the UK, where daily encounters with the target language are rare.

Teaching Contact Time

Teaching contact time is arguably the most significant structural factor limiting progress in mainstream language classrooms. Research by Muñoz (2014) and Graham et al. (2017) shows that consistent, high-frequency exposure to the target language is essential for building automaticity and long-term retention. Yet in the UK, learners typically receive as little as two to three hours per week — far below the threshold required to consolidate grammar, expand vocabulary, and develop fluency. These limitations are compounded by frequent timetable interruptions, non-specialist teaching in earlier years, and limited opportunities for structured retrieval practice. In such low-input settings, progress is inherently slow and fragile unless significantly supplemented through immersion, digital exposure, or extracurricular reinforcement.
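To put those contact-time figures in perspective, a rough back-of-the-envelope calculation is instructive. The sketch below compares a typical UK secondary allocation (around two to three hours per week over a 39-week school year, across Years 7–11) with the US Foreign Service Institute's widely cited classroom-hour estimates. The specific hour figures and category examples are approximations — published FSI figures vary slightly between versions — and the weekly allocation is an assumption drawn from the discussion above, not an official statistic.

```python
# Illustrative comparison of UK classroom contact time with approximate
# FSI classroom-hour estimates for English-speaking learners.
# All figures are rough assumptions for the purpose of the comparison.

FSI_HOURS = {  # approximate hours of instruction per FSI difficulty category
    "Category I (e.g. French, Spanish)": 600,
    "Category III (e.g. Russian, Polish)": 1100,
    "Category IV (e.g. Mandarin, Arabic, Japanese)": 2200,
}

HOURS_PER_WEEK = 2.5   # assumed typical UK secondary MFL allocation
SCHOOL_WEEKS = 39      # standard UK school year
YEARS = 5              # Years 7-11, i.e. KS3 plus a full GCSE course

hours_per_year = HOURS_PER_WEEK * SCHOOL_WEEKS
total_hours = hours_per_year * YEARS
print(f"Total classroom hours over {YEARS} years: {total_hours:.0f}")

for category, needed in FSI_HOURS.items():
    years_needed = needed / hours_per_year
    print(f"{category}: ~{needed} h, i.e. ~{years_needed:.1f} school years at this pace")
```

Even on these generous assumptions, five full years of secondary teaching deliver fewer hours than the estimate for the *easiest* category of language — which helps explain why classroom-only progress feels so slow and fragile.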

Learner Variables

Individual differences such as working memory, motivation, learning strategies, prior knowledge of other languages, and even learner beliefs about language learning (Dörnyei & Ushioda, 2011) play a significant role. Learners who are motivated, self-regulated, and have access to technology-enhanced tools progress more quickly than those with lower levels of engagement and confidence. However, most classroom learners remain largely dependent on the teacher and textbook.

Curriculum Design

In recent years, curriculum reforms have introduced more ambitious intentions (such as the MFL Pedagogy Review’s call for a ‘rich diet’ of language), but these often clash with restricted word lists, excessive focus on grammar manipulation, or high-stakes exam pressures. In practice, much teaching time is still devoted to exam rehearsal, leaving little room for meaningful communicative interaction or vocabulary expansion.

Methodology and Input

The method of delivery also matters immensely. Approaches rooted in traditional PPP (presentation–practice–production) are less effective than those based on input-rich, communicative, or task-based instruction (Ellis, 2003). Learners need repeated exposure to comprehensible input, scaffolded output opportunities, and structured retrieval of key language forms. When methodology relies too heavily on isolated grammar exercises or rote translation, progression slows.

Teacher Expertise and Confidence

The skills and confidence of the teacher have a profound impact on the learner’s experience. Many secondary MFL teachers report gaps in their training — especially when it comes to spontaneous speaking, teaching phonics, and recycling vocabulary effectively (Tinsley & Doležal, 2018). When confidence is low, teachers may default to safer, less dynamic methods such as worksheets or rote grammar tasks, which limit opportunities for interaction and personalisation.

Assessment Systems and Accountability Pressures

The nature of assessment also plays a role. In the UK, GCSE and A-level exams remain heavily focused on discrete grammatical accuracy and tightly controlled writing tasks. This emphasis often narrows the scope of what is taught and assessed. As a result, authentic communication, fluency, and risk-taking — which are critical for long-term language development — are often deprioritised in favour of predictable exam outcomes.

Parental and Societal Attitudes

Finally, the broader social and cultural context cannot be ignored. In a largely monolingual society, languages are sometimes perceived as non-essential. Without strong parental support or visible social value attached to language learning, students may approach the subject with limited motivation. Research has shown that learner belief in the usefulness of the subject is a key predictor of sustained engagement (Dörnyei & Ushioda, 2011).

Summary Tables: Ranking Intrinsic and Extrinsic Factors

Intrinsic Linguistic Factors Affecting Learning Difficulty (Ranked by Impact)

| Factor | Description | Relative Impact | Sources |
| --- | --- | --- | --- |
| Morphological complexity | Number of inflections, cases, and verb forms | High | Karlsson (2008); Odlin (1989) |
| Orthographic distance | Difference in writing systems and scripts | High | DeFrancis (1984) |
| Phonological complexity | Unfamiliar sounds, tones, and consonant clusters | High | Wong & Perrachione (2007) |
| Syntactic distance | Word order differences and sentence structure | Moderate–High | Odlin (1989) |
| Lexical similarity | Degree of shared vocabulary with English | Moderate | Nation (2001) |

Extrinsic Learning Factors Affecting Attainment (Ranked by Impact)

| Factor | Description | Relative Impact | Sources |
| --- | --- | --- | --- |
| Contact time and curriculum constraints | Limited instructional hours and fragmented delivery over the school year | High | Graham et al. (2017); Mitchell & Marsden (2019) |
| Methodology and input quality | Emphasis on grammar over meaningful interaction | High | Ellis (2003); Lightbown & Spada (2013) |
| Assessment pressure | Teaching to the test and focus on form over function | High | Tinsley & Doležal (2018); Mitchell et al. (2019) |
| Teacher expertise | Confidence and training in modern pedagogy | Moderate–High | Tinsley & Doležal (2018) |
| Learner motivation & autonomy | Individual differences in engagement and regulation | Moderate | Dörnyei & Ushioda (2011) |
| Exposure beyond the classroom | Access to authentic input and real-world communication opportunities | Moderate | Muñoz (2014) |
| Societal and parental support | Perceived value of language learning in wider cultural and family environment | Moderate | Dörnyei & Ushioda (2011) |

Conclusion

Learning a language is not a sprint — it’s a marathon with detours, plateaus, and the occasional uphill struggle. Despite what catchy slogans and language apps suggest, becoming proficient in a second language takes time, practice, and above all, meaningful exposure. While frameworks like the CEFR and FSI provide useful benchmarks, they must be interpreted through the lens of the learner’s context — including how the language is taught, how often it’s used, and how strongly it’s valued both inside and outside the classroom.

The good news? Every hour spent meaningfully engaged with a language — whether reading, listening, speaking, or thinking — builds momentum. And while reaching fluency might take longer than hoped, especially in classroom settings with limited input, the benefits far outweigh the effort. Multilingual learners outperform their monolingual peers in metalinguistic awareness, memory, and cultural knowledge, and enjoy broader career opportunities. The journey might be long, but it is unquestionably worth it.

References

  • Council of Europe. (2001). Common European Framework of Reference for Languages: Learning, Teaching, Assessment. Cambridge University Press.
  • Dörnyei, Z., & Ushioda, E. (2011). Teaching and Researching Motivation. Routledge.
  • Ellis, R. (2003). Task-Based Language Learning and Teaching. Oxford University Press.
  • Graham, S., Courtney, L., & Tonkyn, A. (2017). Motivational challenges experienced by lower-level learners of French at Key Stage 4. The Language Learning Journal, 45(2), 228–243.
  • Housen, A., Kuiken, F., & Vedder, I. (2012). Dimensions of L2 performance and proficiency: Complexity, accuracy and fluency in SLA. John Benjamins.
  • Lightbown, P. M., & Spada, N. (2013). How Languages are Learned (4th ed.). Oxford University Press.
  • Mitchell, R., Byrne, C., & Marsden, E. (2019). Foreign Language Learning: Research, Policy and Practice. Palgrave.
  • Muñoz, C. (2014). Exploring young learners’ foreign language learning awareness. Language Awareness, 23(1-2), 24–40.
  • Nation, P. (2001). Learning Vocabulary in Another Language. Cambridge University Press.
  • North, B. (2014). The development of a common framework scale of language proficiency. Peter Lang.
  • Odlin, T. (1989). Language Transfer: Cross-Linguistic Influence in Language Learning. Cambridge University Press.
  • Tinsley, T., & Doležal, N. (2018). Language Trends 2018: Language teaching in primary and secondary schools in England. British Council.