A blog by the Centre for Literacy’s Administrative and Marketing Assistant, Caterina Incisa.

Whether it’s a scary jack-in-the-box or an unexpected phone call from an old friend, surprises come in all shapes and sizes. In the recent results from PIAAC (also known as the Survey of Adult Skills), much of the data was expected, but there were a few surprises. At our recent Fall Institute – Interpreting PIAAC Results: Understanding Competencies of the Future, we invited participants each to write down one thing that surprised them about the PIAAC results. Here are a few:

“The drop in results from Norway.”

“… that no great differences were found between the proficiency of recent and established immigrants.”

“What most surprised me about PIAAC? The Canadian media’s overall lack of reportage, particularly given the disproportionate investment and oversampling in Canada relative to all other OECD member nations.”

“Canada normally performs above the OECD average in international surveys. So the fact that in PIAAC we performed below the OECD average was a shock for the general public.”

“Variability within countries.”

“My surprise: The remarkably high literacy scores of NS, PEI, and NB respondents whose first language is not the same as the test language.”

One “surprise” which came up repeatedly was that many young respondents had not done better, especially in the PS-TRE (Problem Solving in Technology Rich Environments) component of the survey:

“Youth versus 55 – 64 showing poor literacy.”

“Low performance of those aged 16 – 24 in Canada.”

“Results for youth opposite of what I expected.”

“I was surprised by the low scores of the younger cohort in England.”

“I was surprised that the 16-24 year old results were not better.”

These phrases floated among tables throughout the three days of the Institute, concern on people’s faces as participants wondered why many young people weren’t doing better than the older generations, especially in digital literacy. People were puzzled: isn’t this the Millennial Generation, the generation with more Facebook friends than real ones? Shouldn’t they be “computer literate”? Well, perhaps this is the exact problem. On Day Two of the Institute, David Rosen suggested that young people may no longer be using digital technologies in the same way, and that IT has evolved into ICSET (Information, Communication, Shopping, and Entertainment Technology). There was a murmur of agreement at my table as people traded stories of sullen teenagers hunched over Facebook and online shopping. One task in the PS-TRE component of the PIAAC survey asked respondents to sort a series of emails into folders; perhaps if the problem had involved hashtags or memes, the youth might have performed better?


The low youth scores were a particular problem in England: The Guardian reported that “England is the only country in the developed world where the generation approaching retirement is more literate and numerate than the youngest, according to the first skills survey by the Organisation for Economic Co-operation and Development.” The BBC noted that “(this) younger group will have many more qualifications, but the test results show that these younger people have no greater ability than those approaching retirement who left schools with much lower qualifications in the 1960s and 1970s.”

During the Fall Institute, people suggested that the technology used in the PIAAC survey may have been outdated for younger generations: accustomed to more recent platforms, they found the technology they were being tested on unfamiliar. It brought to mind the following image:


Image taken from WeKnowMemes.com

How can we expect the youth and older adults to take the test in the same medium if they are not familiar with the same technology/platforms? Might this be a reason that some youth respondents did worse than their older counterparts in the PS-TRE component of the PIAAC survey? The answer is not clear, but the question at least has sparked discussion.

The Centre’s Executive Director, Linda Shohet, has been invited to participate in the Illiteracy: Grand National Cause 2013 National and European Conference, 13 to 15 November 2013, in Lyon, France. This conference will also welcome representatives from other European countries. As we discovered at our 2013 Summer Institute, France has taken an approach to basic skills very different from that of Canada, the US, and the OECD. While PIAAC looks at variations in the skill levels of the adult population as a whole, French national surveys and interventions have focused on identifying and helping adults with the lowest skill levels.

The conference is the culmination of a national mobilization against “illiteracy” in France:

“Because public opinion needs to be better informed about this widespread phenomenon that is still something of a taboo in our society, because it is necessary to explain that we can continue to learn at any age and that solutions do exist, and because only collective action can allow us to improve and strengthen the measures to prevent and fight illiteracy, the “Fighting Illiteracy Together, Grand National Cause 2013” alliance brought together by the ANLCI has been working hard throughout 2013: For the first time in France, a large national awareness campaign has been publicized by the press, radio, television and on the internet.” (ANLCI, 2013)

This mobilization also included regional conferences involving more than 4,000 participants, held between June and September 2013.

One of the findings of PIAAC was that in every country surveyed, those who already have high skill levels are the ones who benefit most from adult education opportunities; those with low skills benefit the least. There may be advantages to focusing attention on the people with the lowest skill levels.

One component of PIAAC that distinguishes it from its predecessors (IALS, IALSS) is PS-TRE (Problem-Solving in a Technology Rich Environment). This section of the PIAAC survey assessed the ability of adults to accomplish certain tasks and solve problems using a laptop computer and some common software applications such as e-mail or Word. Respondents who indicated on the background questionnaire that they did not use computers, or who were found unable to use them well enough to take the computer-assisted version of PIAAC, were excluded from the PS-TRE component and completed the rest of the survey on paper. One surprising Canadian finding was that 7% of those aged 16-24 fell into this category.

What is PS-TRE? As we note in our Fall Institute 2013 Research Scan: Problem-Solving in a Technology Rich Environment and Related Topics, the term seems to have originated in the U.S. National Assessment of Educational Progress in the early 2000s, in which “(n)ationally representative samples of 8th graders were assessed on two computer-delivered, extended problem solving scenarios […] The two main components of PS-TRE assessed were computer skills and scientific inquiry, and performance was judged by both the quality of answers given to open-ended and multiple-choice questions and of the process undertaken to reach those answers.” The OECD version of PS-TRE similarly combines technological aptitude with abstract problem-solving skills. But is it possible to abstract a generic set of “problem-solving” skills from the context of the actual problem-solving that people do? Since those aged 16-24 are generally assumed to be “digital natives”, some participants at the Fall Institute wondered why they had not done better in PS-TRE, especially in the United States. Could it be that younger people in particular tend to use digital technologies for things that don’t match the “problem-solving” scenarios in PIAAC, as David Rosen suggested during a discussion on PS-TRE on Day 2 of the Institute? Does this reflect another possible kind of mismatch?

After the October 8, 2013 release of PIAAC results, we put together a web page of analyses of reactions to the results in Canada and other countries. In response, some experts in the field sent us their own reactions to PIAAC and to the responses from media and policymakers. As we note in the introduction:

The OECD also did an early analysis of media reactions to PIAAC in different countries. You can read it at http://oecdinsights.org/2013/10/10/how-the-world-reported-the-oecd-skills-survey/. McGill University professor Ralf St. Clair offers his own analysis on his Literacy and Learning Blog, while noted New Literacy Studies theoretician Brian Street has shared with us the comments on PIAAC that he submitted to the e-consultation on the upcoming United Nations Development Program Gender Equality Strategy [pdf document], focusing on the gender differences found in the PIAAC results as well as the need to make lifelong learning opportunities available to all.


Gail Spangenberg, President of the Council for Advancement of Adult Literacy (CAAL) in the United States, has written a couple of blog posts about the PIAAC results. In the most recent, What’s The Story?, she argues that it is important to help low-skilled adults and that an exclusive focus on “fixing” the K-12 education system won’t do that:

“Discussions about our low-skilled adults and what we can do to help lift them up tend to revert mistakenly to what we can do to improve K-12. I stress this point because historically and in most current media coverage, we fail over and over again to grasp the importance of differentiating adult education from K-12 or colleges. We need to coalesce around the real and very urgent need to upgrade the basic foundational skills of our adults: our current and future workforce, the parents of our children, and to put it altruistically, the keepers of our freedom.”

We are now putting together a page of links to the presentations and discussions that took place at the Institute, as well as background documentation. Have a look! French presentations are also going up, albeit more slowly. Videos will also be embedded on that page.



From left to right: Beautiful City Theatre actors Adam Daniel Koren, Alyson Leah, Maija Sidial Whitney, and Samantha Chaulk in “Measure for Measure”. Photograph by Tam Lan Truong

To kick off Fall Institute 2013 – Interpreting PIAAC Results: Understanding Competencies of the Future, the Beautiful City Theatre troupe performed Measure for Measure, which included the players repeatedly listening intently for messages of meaning at the feet of a “PIAAC” teddy bear. Sabadooey PIAAC?, they intoned each time, speaking in tongues. And the bear whispered back disembodied words and phrases: “skills mismatch”, “technology-rich environments”, and the like. Each pronouncement from the PIAAC bear led into a short mime with sound, and finally led the audience to ask: What is PIAAC telling us? Is it telling us things that challenge our pre-conceived ideas, or are we mainly hearing what we already believe to be true?

Perhaps a more relevant question than “what does PIAAC tell us?” is “What are people seeing and hearing in PIAAC?”  We heard from a British participant that the poor results for people aged 18-24 in England and Northern Ireland are seen by the Conservative-led government as confirmation that the reforms they have already proposed for British education are indeed necessary. American results were greeted with general dismay by the few who noticed them amidst the political drama over the US government shutdown. Heidi Silver-Pacuilla from the U.S. Office of Vocational and Adult Education said there was virtually “no good news” for the U.S. in PIAAC. In Canada, reaction to the PIAAC results has been subdued: in fact, government officials from Nova Scotia and New Brunswick reported that there was virtually “no buzz” at all about PIAAC in their respective provinces. This may be in part because, as Patrick Bussiere, from Employment Resources and Skills Canada, reported early on Day One, Canada’s results were average overall: above average in Problem-Solving in a Technology Rich Environment (PS-TRE), below average in numeracy, and about average (but rather polarized between the highly-skilled and the poorly skilled) in literacy — no shocks here.

Other possible reasons suggested for the restrained PIAAC coverage in Canada included the loss of shock headlines, since the OECD and the government of Canada no longer propose Level 3 as a benchmark. Hence no one can claim that PIAAC tells us that nearly half the Canadian population are “unable to function in a modern economy”. Still others suggested that there is general “survey fatigue”.

Whatever the reason, Measure for Measure ended with a light-hearted caution — “Don’t jump to conclusions!” The Institute ended with general agreement that we need more analysis and reflection before we can interpret what the results might mean for policy and program, and that no single survey, however rich in data, can be treated as an oracle on literacy and learning, which are so deeply interconnected with social, economic and human issues such as productivity, poverty, exclusion, family life, health and aging.