
Sunday, 14 September 2014

International reading comparisons: is England really doing so poorly?

I was surprised to see a piece in the Guardian stating that "England is one of the most unequal countries for children's reading levels, second in the EU only to Romania". This claim was made in an article about a new campaign, Read On, Get On, that was launched this week.

The campaign sounds great. A consortium of organizations and individuals have got together to address the problem of poor reading: the tail in the distribution of reading ability that seems to stubbornly remain, despite efforts to reduce it. Poor readers are particularly likely to come from deprived backgrounds, and their disadvantage will be perpetuated, as they are at high risk of leaving school with few qualifications and dismal employment prospects. I was pleased to see that the campaign has recognized weak language skills in young children as an important predictor of later reading difficulties. The research evidence has been there for years (Kamhi & Catts, 2011), but it has taken ages to percolate into practice, and few teachers have any training in language development.

But! You knew there was a 'but' coming. It concerns the way the campaign has used evidence. They've based most of what they say on the massive Progress in International Reading Literacy Study (PIRLS), and my impression is that they have exaggerated the negative in order to create a sense of urgency.

I took a look at the Read On Get On report. The language is emotive and all about blame: "The UK has a sorry history of educational inequality. For many children, this country provides enormous and rich opportunities. At the top end of our education system we rival the best in the world. But it has long been recognised that we let down too many children who are allowed to fall behind. Many of them are condemned to restricted horizons and limited opportunities." I was particularly interested in the international comparisons, with claims such as "The UK is one of the most unfair countries in the developed world."

So how were such conclusions reached? Read On, Get On commissioned the National Foundation for Educational Research (NFER) to compare levels of reading attainment in the UK with those in other developed countries, with a focus on children approaching the last year of primary schooling.

Given the negative tone of "letting down children", it was interesting to read that "In terms of its overall average performance, NFER’s research found England to be one of the best performing countries." I put that in bold because, somehow, it didn't make it into the Guardian, so is easy to miss. It is in any case dismissed by the NFER report in a sentence: "As a wealthy country with a good education system, that is to be expected."

The evidence of the parlous state of UK education came from consideration of the range of scores from best (95th percentile) to worst (5th percentile) for children in England. Now this is where I think it gets a bit dishonest. Suppose there were a massive improvement in scores for a subset of children, such that the mean and highest scores went up, but with the lowest scoring still doing poorly; presumably, the shrill voices would get even shriller, because the range would extend even further. This seems a tad unfair: yes, it makes sense to stress that the average attainment doesn't capture important things, and that a high average is not a cause for congratulation if it is associated with a long straggly tail of poor achievers. But if we want to focus on poor achievers, let's look at the proportion of children scoring at a low level, and not at some notional 'gap' between best and worst, which is then translated into 'years' to make it sound even more dramatic.
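The problem with the range metric can be made concrete with a small simulation (entirely hypothetical numbers, sketched in Python): if an intervention improves only the top half of the distribution, the 5th percentile is untouched, yet the 95th–5th percentile 'gap' widens, so by the range metric things look worse even though no child is reading less well.

```python
import numpy as np

# Hypothetical illustration: simulated reading scores for 10,000 children.
rng = np.random.default_rng(42)
before = rng.normal(500, 60, 10_000)

# Suppose an intervention improves only the top half of the distribution;
# the weakest readers are exactly as they were.
after = before.copy()
after[after > np.median(after)] += 40

p5_before, p95_before = np.percentile(before, [5, 95])
p5_after, p95_after = np.percentile(after, [5, 95])

print(f"5th percentile: {p5_before:.0f} -> {p5_after:.0f}")      # unchanged
print(f"95th-5th gap:   {p95_before - p5_before:.0f} -> "
      f"{p95_after - p5_after:.0f}")                             # wider
```

The 5th percentile (the absolute level of the weakest readers) is a more honest target for a campaign about poor readers than the gap, which punishes improvement at the top.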

The question is how England compares with other countries if we just look at the absolute level of the score corresponding to the 5th percentile. Answer: not brilliant – 16th out of the 24 countries featured in the subset considered by the NFER survey. But, rather surprisingly, we find that the NFER survey excluded New Zealand and Australia, both of which did worse than England.

So do we notice anything about that? Well, in all three countries, children are learning English, a language widely recognized as creating difficulty for young readers because of the lack of consistent mapping between letters (orthography) and sounds (phonology). In fact, when looking for sources for this blogpost, I happened upon a report from an earlier tranche of PIRLS data, which examined this very topic by assigning an 'orthographic complexity' score to different languages. The authors found a correlation of .6 between the range of scores (5th to 95th percentile again, this time for 2003 data) and a measure of complexity of the orthography. I applied their orthography rating scale to the 2011 PIRLS data and found that, once again, the range of reading scores was significantly related to orthography (r = .72), with the highest ranges for those countries where English was spoken – see Figure below. (NB it would be very interesting to extend this to include additional countries: I was limited to the languages with an orthographic rating from the earlier report.)
PIRLS 2011 data: range of reading attainment vs. orthographic complexity
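For readers who want to try this kind of check themselves, the calculation is straightforward. The sketch below uses made-up illustrative numbers (the real orthographic ratings and PIRLS score ranges are in the reports cited): one value pair per country, and a Pearson correlation computed with NumPy.

```python
import numpy as np

# Illustrative (invented) data, one entry per country:
# an orthographic complexity rating (5 = deep orthography, e.g. English)
# and the 95th-5th percentile range of reading scores.
# Real values come from the PIRLS reports; these are for demonstration only.
complexity  = np.array([1, 1, 2, 2, 3, 3, 4, 5, 5, 5])
score_range = np.array([190, 205, 210, 220, 235, 240, 255, 270, 280, 290])

# Pearson correlation between orthographic complexity and score range.
r = np.corrcoef(complexity, score_range)[0, 1]
print(f"r = {r:.2f}")
```

With real data one would also want to inspect the scatterplot, since a single outlying country can drive a correlation of this kind.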
International comparisons have their uses, and in this case they seem to suggest that a complex orthography widens the gap between the best and worst readers. However, they still need to be treated with caution. I haven't had time to delve into PIRLS in any detail, but just looking at how samples of children were selected, it is clear that criteria varied. In particular, there were differences from country to country in terms of whether they excluded children who were non-native speakers of the test language, and whether they included those with special educational needs. Romania, which had the most extreme range of scores between best and worst, excluded nobody. Finland, which tends to do well in these surveys, excluded "students with dyslexia or other severe linguistic disorders, intellectually disabled students, functionally disabled students, and students with limited proficiency in the assessment language." England excluded "students with significant special educational needs". Needless to say, all of these criteria are open to interpretation.

I'm not saying that the tail of the distribution is unimportant. Yes, of course, we need to do our best to ensure that all children are competent readers, as we know that poor literacy is a major handicap to a person's prospects for employment, education and prosperity. But let's stop beating ourselves over the head about this. Research indicates that the reasons for children's literacy problems are complex and will be influenced by the writing system they have to learn (Ziegler & Goswami, 2005) and constitutional factors (Asbury & Plomin, 2013), as well as by the home and school environment: we still have only a poor grasp of how these different factors interact. Until we gain a better understanding, we should of course put in our best efforts to help those children who are struggling. The enthusiasm and good intentions of those behind Read On, Get On are to be welcomed, but their spin on the PIRLS data is unhelpful in implying that only social factors are important.

References
Asbury, K., & Plomin, R. (2013). G is for genes: The impact of genetics on education and achievement. Chichester: Wiley Blackwell.

Kamhi, A. G., & Catts, H. W. (2011). Language and Reading Disabilities (3rd ed.). Allyn & Bacon.

Ziegler, J. C., & Goswami, U. (2005). Reading acquisition, developmental dyslexia, and skilled reading across languages: A psycholinguistic grain size theory. Psychological Bulletin, 131(1), 3-29. PMID: 15631549

Saturday, 5 October 2013

Good and bad news on the phonics screen



Teaching children to read is a remarkably fraught topic. Last year the UK Government introduced a screening check to assess children’s ability to use phonics – i.e., to decode letters into sounds. Judging from the reaction in some quarters, they might as well have announced they were going to teach 6-year-olds calculus. The test, we were told, would confuse and upset children and not tell teachers anything they did not already know. Some people implied that there was an agenda to teach children to read solely using meaningless materials. This, of course, is not the case. Nonwords are used in assessment precisely because you need to find out whether the child has the skills to attack an unfamiliar word by working out the sounds. Phonics has been ignored or rejected for many years by those who assumed that teaching it would doom the child to an educational approach involving boring drills with meaningless materials. Again, this is not the case: for instance, Kevin Wheldall argues that phonics teaching should be combined with training in vocabulary and comprehension, and that storybook reading with real texts should be a key component of reading instruction.
There is evidence for the effectiveness of phonics training from controlled trials, and I therefore regard it as a positive move that the government has endorsed the use of phonics in schools. However, phonics continues to meet resistance from many teachers, for a whole range of reasons. Some just don’t like phonics. Some don’t like testing children, especially when the outcome is a pass/fail classification. Many fear that the government will use results of a screening test to create league tables of schools, or to identify bad teachers. Others question the whole point of screening: this recent piece from the BBC website quotes Christine Blower, the head of the National Union of Teachers, as saying: "Children develop at different levels, the slow reader at five can easily be the good reader by the age of 11.” To anyone familiar with the literature on predictors of children’s reading, this shows startling levels of complacency and ignorance. We have known for years that you can predict with good accuracy which children are likely to be poor readers at 11 years from their reading ability at 6 (Butler et al., 1985).
When the results from last year's phonics screen came out I blogged about them, because they looked disturbingly dodgy, with a spike in the frequency distribution at the pass mark of 32. On Twitter, @SusanGodsland has pointed me to a report on the 2012 data where this spike was discussed. This noted that the spike in the distribution was not seen in a pilot study where the pass mark had not been known in advance. The spike was played down in this report, and attributed to “teachers accounting for potential misclassification in the check results, and using their teacher judgment to determine if children are indeed working at the expected standard.” It was further argued that the impact of the spike was small, and would lead to only around 4% misclassification.
However, a more detailed research report on the results was rather less mealy-mouthed about the spike and noted “the national distribution of scores suggests that pupils on the borderline may have been marked up to meet the expected standard.” The authors of that report did the best they could with the data and carried out two analyses to try to correct for the spike. In the first, they deleted points in the distribution where the linear pattern of increase in scores was disrupted, and instead interpolated the line. They concluded that this gave 54% rather than 58% of children passing the screen. The second approach, which they described as more statistically robust, was to take all the factors that they had measured that predicted scores on the phonics screen, ignoring cases with scores close to the spike, and then use these to predict the percentage passing the screen in the whole population. When this method was used, only 46% of children were estimated to have passed the screen when the spike was corrected for.
Well, this year’s results have just been published. The good news is that there is an impressive increase in percentage of children passing from 2012 to 2013, up from 58% to 69%. This suggests that the emphasis on phonics is encouraging teachers to teach children about how letters and sounds go together.
But any positive reaction to this news is tinged with a sense of disappointment that once again we have a most peculiar distribution with a spike at the pass mark. 
 
Proportions of children with different scores on phonics screen in 2012 and 2013. Dotted lines show interpolated values.

I applied the same correction as had been used for the 2012 data, i.e. interpolating the curve over the dodgy area. This suggested that the proportion of cases passing the screen was overestimated by about 6% for both 2012 and 2013. (The precise figure will depend on the exact way the interpolation is done). 
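As a rough sketch of how such a correction works (with invented counts, not the real national data), one can linearly interpolate the histogram across the anomalous region around the pass mark and recompute the pass rate:

```python
import numpy as np

# Invented score distribution for marks 0-40: a smooth baseline distorted
# by a dip just below the pass mark (32) and a matching spike at 32,
# mimicking the pattern described in the published data.
counts = (100 + 10 * np.arange(41)).astype(float)
counts[29:32] -= 150   # dip: borderline children 'marked up'...
counts[32] += 450      # ...into a spike at the pass mark

raw_pass = counts[32:].sum() / counts.sum()

# Correction: interpolate the curve over the dodgy region (marks 29-32)
# from the unaffected neighbouring marks at 28 and 33.
corrected = counts.copy()
corrected[29:33] = np.interp(np.arange(29, 33), [28, 33], counts[[28, 33]])
corrected_pass = corrected[32:].sum() / corrected.sum()

print(f"pass rate: raw {raw_pass:.1%}, corrected {corrected_pass:.1%}")
```

With these invented numbers the correction lowers the estimated pass rate by a few percentage points; as noted above, the exact figure depends on how the interpolation is done.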
Of course I recognise that any pass mark is arbitrary, and children’s performance may fluctuate and not always represent their true ability. The children who scored just below the pass mark may indeed not warrant extra help with reading, and one can see how a teacher may be tempted to nudge a score upward if that is their judgement. Nevertheless, teachers who do this are making it difficult to rely on the screen data and to detect whether there are any improvements year on year. And it undermines their professional status if they cannot be trusted to administer a simple reading test objectively.
It has been announced that the pass mark for the phonics screen won’t be disclosed in advance in 2014, which should reduce the tendency to nudge scores up. However, if the pass mark differed from previous years the tests wouldn’t be comparable, so teachers will probably be able to guess that it will remain at 32. Perhaps one solution would be to ask the teacher to rate whether or not the test result agrees with their judgement of the child’s ability. If they have an opportunity to give their professional opinion, they may be less tempted to tweak test results. I await with interest the results from 2014!

Reference
Butler, S. R., Marsh, H. W., Sheppard, M. J., & Sheppard, J. L. (1985). Seven-year longitudinal study of the early prediction of reading achievement. Journal of Educational Psychology, 77, 349-361. DOI: 10.1037//0022-0663.77.3.349

Thursday, 26 September 2013

Raising awareness of language learning impairments


A couple of years ago I did a Google search for ‘Specific language impairment’. I was appalled by what I found. The top hit was a video by a chiropractor who explained he’d read a paper about the neurological basis of language difficulties; he proceeded to mangle its contents, concluding that cranial osteopathy would help affected children.

I’ve previously described how I got together with colleagues in 2012 to try and remedy this situation, culminating in a campaign for Raising Awareness of Language Learning Impairments (RALLI). The practicalities have sometimes been challenging but I’m pleased to say that the collection of videos on our RALLI site has now attracted over 90,000 hits, providing an accessible and evidence-based source of information about developmental language impairments. As well as research-based films we have videos with practical information for parents, children and teachers.

So here, for those of you interested in this topic, is an index of what we have so far:

Background to RALLI

Research topics

Information for teachers

Support for parents and children

International
     Spanish translations/subtitled versions
Reference  
Bishop, D. V. M., Clark, B., Conti-Ramsden, G., Norbury, C., & Snowling, M. J. (2012). RALLI: An internet campaign for raising awareness of language learning impairments. Child Language Teaching & Therapy, 28(3), 259-262. DOI: 10.1177/0265659012459467

Saturday, 17 August 2013

Changing children's brains

Portraits of Serafino & Francesco Falzacappa; Pier Leone Ghezzi [1674 - 1755]
Source: J. Paul Getty Museum


Our children are being exposed to an experience that alters their brains in ways we do not fully understand. There is now strong evidence that patterns of brain connectivity are different in individuals who have been exposed compared to those who have not1. Influential figures have expressed concern that people’s memories will be restricted by this experience, which removes the need for them to memorise material, and allows them to look things up instead2. And indeed, there is clear evidence of changes in cognitive processing, as predicted3. Furthermore, instead of learning in a social context, our children are increasingly being encouraged to engage in solitary activities that deprive them of the benefits of interacting with other people. And, rather than embracing traditional influences, they are exposed to alien ideas from other cultures4,5.

Does this sound familiar? Are you thinking computer games, iPads, smartphones? If so, then consider: we are changing children’s brains, altering their memories, and influencing their ideas by exposing them to books.

References
1. Dehaene, S., Pegado, F., Braga, L. W., Ventura, P., Filho, G. N., Jobert, A., Dehaene-Lambertz, G., Kolinsky, R., Morais, J., & Cohen, L. (2010). How learning to read changes the cortical networks for vision and language. Science, 330, 1359-1364. DOI: 10.1126/science.1194140
2. http://outofthejungle.blogspot.co.uk/2007/11/socrates-objections-to-writing.html
3. Ong, W. J. (1982). Orality and Literacy. London and New York: Routledge.
4. http://yalebooks.wordpress.com/2012/06/11/a-history-of-women-readers-belinda-jack-discusses-the-relationship-between-gender-and-literacy/
5. Nafisi, A. (2003). Reading Lolita in Tehran. New York: Random House.



Tuesday, 3 April 2012

Phonics screening: sense and sensibility

There’s been a lot written about the new phonics test that is being introduced in UK schools in June. Michael Rosen cogently put the arguments against it on his blog this morning. A major concern is that the test involves asking children to read a list of items, and takes no account of whether they understand them. Indeed, the list includes nonwords (i.e. pronounceable letter strings, such as "doop" or "barg") as well as meaningful words. So children will be “barking at print” - a very different skill from reading for meaning.

I can absolutely see where Rosen is coming from, but he’s missing a key point. You can’t read for meaning if you can’t decode the words. It’s possible to learn some words by rote, even if you don’t know how letters and sounds go together, but in order to have a strategy for decoding novel words, you need the phonics skills. Sure, English is an irritatingly irregular language, so phonics doesn’t always give you the right answer, but without phonics, you have no strategy for approaching an unfamiliar word.
Back in 1990, Hoover and Gough wrote an influential paper called “The Simple View of Reading”. This is clearly explained in this series of slides by Morag Stuart from the Institute of Education. It boils down to saying that in order to be an effective reader you need two things: the ability to decode words, and the ability to understand the language in a text. Some children can say the words but don’t understand what they’ve read. These are the ones Michael Rosen is worried about. They won’t be detected by a nonword reading test. They are all too often missed by teachers who don’t realise they are having problems, because when asked to read aloud, they do fine. There’s a fair bit of research on these so-called “poor comprehenders”, and how best to help them (some of which is reviewed here). But there are other children with the opposite pattern: good language understanding but difficulties in decoding: this corresponds to classic dyslexia. There are decades of research showing that one of the most effective ways of identifying these children is to assess their ability to read novel letter sequences that they haven’t encountered before - nonwords. Nonword reading ability has also been shown to predict which children are at risk for later reading failure. It's useful precisely because it tests children's ability to attack unfamiliar material, rather than testing what they have already learned. It's a bit like a doctor giving someone a stress test on a treadmill. They may never encounter a treadmill in everyday life, but by observing how they cope with it, the doctor can tell whether they are at risk of cardiovascular problems.

Some children don’t need explicit teaching of phonics - they pick it up spontaneously through exposure to print. But others just don’t get it unless it is made explicit. I’m coming at this as someone who sees children who just don’t get past first base in learning to read, and who fall increasingly far behind if their difficulties aren’t identified. A nonword reading test around age 6 to 7 years will help identify those children who could benefit from extra support in the classroom.
So that’s the rationale, and it is well-grounded in a great deal of reading research. But is there a downside? Potentially, there are numerous risks. It would be catastrophic if teachers got the message from this exercise that reading instruction should involve training children to read lists of words, or worse still, nonwords. Unfortunately, testing in schools is increasingly conflated with evaluation of the school, and so teaching-to-the-test is routinely done. The language comprehension side of reading is hugely important, and shouldn't be neglected. Developing children’s oral language skills is an important component of making children literate. It is also important for children to be read to, and to learn that books are a source of pleasure.
Another concern is children being identified at an early age as failing. The cutoff that is used is crucial, and there are concerns that the bar may be set too high.  Children at real risk are those who bomb on nonword reading, not those who are just a bit below average.
The impact on children’s self-perception is also key. There is already evidence that some primary school children are unduly stressed by SATS. There’s nothing more likely to put a child off reading than being given a test that they don’t understand and being told they’ve failed it. When I was at school, we had the 11+ examination that divided children into those who went to grammar school and those who didn’t. I had friends whose parents promised them a bicycle if they passed - even though there was precious little practice that you could do for the 11+, which was designed to test skills that had not been explicitly taught. Schoolfriends who failed were left with a chip on their shoulder for years. I’d hope that this reading screen is introduced in a more sensitive manner, but the onus is on parents, teachers and the media to ensure this happens. This screening test should serve as a simple diagnostic that will allow teachers to identify those children whose weak letter-sound-knowledge means that they could benefit from extra support. It should not be used to evaluate schools, make children feel they are failures, worry their parents, or support a sterile phonics-only approach to reading.

References
Connor, M. J. (2003). Pupil stress and standard assessment tasks (SATs): An update. Emotional and Behavioural Difficulties, 8(2), 101-107. doi: 10.1080/13632750300507010
Hoover, W. A., & Gough, P. B. (1990). The simple view of reading. Reading and Writing, 2, 127-160.
Nation, K., & Angell, P. (2006). Learning to read and learning to comprehend. London Review of Education, 4(1), 77–87. doi: 10.1080/13603110600574538
Rack, J. P., Snowling, M. J., & Olson, R. K. (1992). The nonword reading deficit in developmental dyslexia. Reading Research Quarterly, 27, 29-53.
Snowling, M., & Hulme, C. (2012). Interventions for children's language and literacy difficulties. International Journal of Language & Communication Disorders, 47(1), 27-34. doi: 10.1111/j.1460-6984.2011.00081.x


Saturday, 15 October 2011

Lies, damned lies, and spin

©www.cartoonstock.com

The Department for Education (DfE) issued a press release this week entitled “England's 15-year-olds' reading is more than a year behind the best”. The conclusions were taken from analysis of data from the PISA 2009 study, an OECD survey of 15-year-olds in the principal industrialised countries.

The DfE report paints a dire picture: “GCSE pupils' reading is more than a year behind the standard of their peers in Shanghai, Korea and Finland….Fifteen-year-olds in England are also at least six months behind those in Hong Kong, Singapore, Canada, New Zealand, Japan and Australia, according to the Department for Education's (DfE) analysis of the OECD's 2009 Programme for International Student Assessment (PISA) study.” The report goes on to talk of England slipping behind other nations in reading.
Schools Minister Nick Gibb is quoted as saying: “The gulf between our 15-year-olds' reading abilities and those from other countries is stark – a gap that starts to open in the very first few years of a child's education.”
I started to smell a rat when I looked at a chart in the report, entitled “Attainment gap between England and the countries performing significantly better than England” (my emphasis). This seemed an odd kind of chart to provide if one wanted to evaluate how England is doing compared to other countries. So I turned to the report provided by the people who did the survey.
Here are some salient points taken verbatim from their summary on reading:
  • Twelve countries had mean scores for reading which were significantly higher than that of England. In 14 countries the difference in mean scores from that in England was not statistically significant. Thirty-eight countries had mean scores that were significantly lower than England.
  • The mean score for reading in England was slightly above the OECD average but this difference was not statistically significant.
  • England’s performance in 2009 does not differ greatly from that in the last PISA survey in 2006.
There is, of course, no problem with aiming high and wanting our children to be among the top achievers in the world. But that’s no excuse for the DfE's mendacious manipulation of information.

Reference
Bradshaw, J., Ager, R., Burge, B. and Wheater, R. (2010). PISA 2009: Achievement of 15-Year-Olds in England. Slough: NFER.