Where is the Focus of OLPC in Peru and ICT4E in General?
A fascinating discussion has emerged around one laptop per child programs – much has been said in previous articles, and there have been many posts in the past several days on OLPC in Peru and similar efforts elsewhere. Research has been cited, and personal experiences and opinions shared. Where do we go from here?
I want to specifically address the broader ICT for Education (ICT4E) question of whether or not a single intervention can have an impact on student achievement.
In looking at the discussions and research to date, I suggest an ICT4E intervention can have (and should have!) an impact, but only through focus. Specifically, focus on:
- Specific problems and explicit objectives
- Student achievement
- Monitoring and evaluation
To start, let me clarify what I mean by ICT4E initiatives for the purpose of this article: I am talking about comprehensive ICT interventions at the classroom/student level such as OLPC (but not only one-to-one programs). Such efforts would, at a minimum, combine the provision of electronic teaching and learning resources (content), devices to access that content, and related teacher training. Under the paradigm of focus, I also propose that we get more precise when we talk about ICT4E, as one intervention type or implementation model does not necessarily compare to another.
Focus ICT4E interventions on specific problems and formulate explicit objectives
The OLPC Peru program aims “To improve the quality of public primary education, especially that of children in the remotest places and in extreme poverty, prioritizing multi-grade schools with only one teacher”. The program description does not mention specific subjects or explicit skills in any of its objectives, and there are plenty of programs out there with similarly vague statements.
I think we need to get more evidence-based, specific, and realistic when talking about anticipated impacts and avoid overgeneralization in our program descriptions and general ICT4E rhetoric.
In a focused approach to ICT4E intervention planning and program design, we would unpack the broader goal, identify the key problem area(s) at the classroom level, and address them in a systematic manner. For example, children’s early reading skills are low: “…most children read badly, with poor fluency and limited comprehension”. Oscar Becerra in his contribution also mentions the challenges of teachers’ reading levels. Further, the effort to add 200 e-Books to the other content provided on the XOs – and the fact that the section on how to use them offers the most detailed pedagogical guidance in the OLPC manuals – indicates that reading has already been singled out for specific attention, although not explicitly or systematically.
Continuing with early reading as an example, a focused ICT intervention could draw on the research to date on how to improve reading in the early grades, even in low-resource environments. From this work, we can learn that a focus on the classroom level and on the very specific, underlying skills of early reading (such as phonemic awareness and letter recognition) – combined with simple, direct approaches to teaching (and teacher training), low-cost materials, and continuing assessment to monitor progress – can make a significant difference in students’ early reading outcomes within a relatively short period of time.
A role for an ICT intervention in this scenario could then be to achieve such results in a more effective and efficient manner, and/or to reach specific groups of students whose needs are not yet fully addressed by existing, albeit working, approaches.
In an effort to respond to the (very valid) sustainability and scale-up requirements, to adhere to best practices in institutionalization and sector-wide approaches, and to deal with the significant challenges related to procurement – of equipment and content – ICT4E initiatives get caught up investing resources and effort on too many fronts from the very beginning, without ever having had the chance to reach the classroom and validate their impact where it should matter most: at the student level.
Focus on student achievement
Much of the discussion related to this month’s topic raised the question of whether ICT4E initiatives should be assessed with regard to their contribution to improved student learning outcomes, and specifically academic achievement.
In environments where resources are extremely limited, the choice for implementing a certain program is de facto a decision against an alternative approach. I question how we justify not expecting comprehensive classroom-level ICT initiatives to focus on learning outcomes when even the most foundational skills critical for future learning, such as early reading, are not being acquired by large proportions of children.
Clearly, other skills – including cognitive skills, critical thinking skills, etc. – are important for learning and child development. Thus it is important to move the body of knowledge on ICT for Education further into exploring such new domains. However, should this really be done at the expense of education systems that already struggle with basic service delivery and with ensuring universal learning of the most foundational skills? As many of us remember, there has already been much discussion around the added value of technology in good schools versus struggling schools (and systems).
I propose to go back to the drawing board and try to get more specific about where we expect ICT can make an impact, define impacts more narrowly, and develop a roadmap of the steps including outputs and outcomes to get there.
A revised ICT4E initiative objective could, under such an approach, read: “… aims to improve early reading skills, specifically fluency, among students in grades 1 and 2 of public primary schools” within the average timeline of a donor-funded project of some 5 years. The OLPC Peru program sets a good example in regard to focusing on a specific target group: “…children in the remotest places and in extreme poverty, prioritizing multi-grade schools with only one teacher.”
While ICT is still not a magic bullet to resolve this issue, such an objective is at least measurable, as there are internationally validated tools, and – as we can learn from reading curricula around the world – results should be detectable within a reasonable timeframe of 1–2 years for most languages and scripts. Other objectives may take longer to achieve and may need to be broken down into intermediate results and steps to strike a balance between being ambitious and being realistic.
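To make this concrete, one internationally validated family of measures for early reading is oral reading fluency, typically scored as correct words per minute from a timed passage. A minimal, illustrative sketch of that scoring (the example numbers are hypothetical, not taken from any official instrument):

```python
def correct_words_per_minute(words_attempted, errors, seconds):
    """Score a timed oral reading passage as correct words per minute (CWPM)."""
    if seconds <= 0:
        raise ValueError("reading time must be positive")
    correct = words_attempted - errors
    return correct * 60.0 / seconds

# Hypothetical example: a student attempts 45 words in 60 seconds with 5 errors.
score = correct_words_per_minute(words_attempted=45, errors=5, seconds=60)
print(score)  # 40.0 correct words per minute
```

A simple, objective score like this is exactly what makes a fluency-focused objective measurable and comparable across baseline and follow-up assessments.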
Establishing priorities, validating impact on specific aspects of larger objectives, and expanding on these over time will go a long way toward building credibility for ICT4E and the larger ICT for Development practice area, and toward rallying the national support needed for a program.
With such a focused objective in hand, program implementers can then align activities (provision of content, equipment, training, etc.) and design monitoring and evaluation frameworks that provide relevant information to the project internally and to the larger body of knowledge.
Focus on rigorous Monitoring and Evaluation in ICT4E interventions
Much has already been said about the lack of systematic Monitoring and Evaluation (M&E) of progress and impact in OLPC programs and elsewhere.
Implementing good M&E is difficult. It is hard to design relevant and appropriate frameworks with coherent indicators, realistic data collection methods, and validated tools. M&E also costs money – more or less depending on the complexity of the program and on the objectives and questions driving its implementation. And it is even harder to focus on M&E when, at the same time, the available resources and the people engaged in the program are tied up with the enormous logistics of getting hardware devices into children’s hands.
In spite of all this, the absence of systematic M&E deprives a program of the opportunity to tell a story much beyond anecdotal evidence once it comes to evaluation.
Furthermore, without systematic monitoring there is no opportunity to take corrective measures on program parameters, or to continually check whether activities and resources are moving the initiative toward project objectives before scale-up and institutionalization. Monitoring can take many forms, and each can be strategically employed at different points in a project life cycle to help improve the program. It may involve low-effort usability tests with target-group users on ease of use of the equipment, user acceptance tests for new content and approaches, pilots for new curricula, or comprehensive randomized controlled trials that allow a more precise look at the effect of the equipment versus that of the content or the teacher training on the overall project objective before scale-up.
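For the randomized controlled trial option, even a rough power calculation at the design stage tells planners whether a pilot could detect the expected effect at all. A minimal sketch using the standard normal-approximation formula for a two-arm trial (the effect sizes, alpha, and power here are illustrative assumptions, not figures from the OLPC Peru program):

```python
import math
from statistics import NormalDist

def sample_size_per_arm(effect_size, alpha=0.05, power=0.80):
    """Approximate participants needed per arm to detect a standardized
    mean difference (Cohen's d) in a two-arm trial, via the normal
    approximation: n = 2 * ((z_{1-alpha/2} + z_{1-beta}) / d) ** 2,
    where 1 - beta is the desired power."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided test
    z_beta = NormalDist().inv_cdf(power)
    return math.ceil(2 * ((z_alpha + z_beta) / effect_size) ** 2)

# A "small" effect (d = 0.2) on a reading test needs far more students
# per arm than a "medium" effect (d = 0.5):
print(sample_size_per_arm(0.2))  # 393
print(sample_size_per_arm(0.5))  # 63
```

The asymmetry is the practical point: a pilot sized to detect only large effects will report “no impact” for interventions whose true effects are modest but real, which is why the monitoring design has to be matched to the effect one can plausibly expect.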
Although no written M&E framework was found on the OLPC Peru website, some monitoring-type activities have taken place. From an outsider’s vantage point, however, it seems these have not yet been fully capitalized on to improve the implementation model.
In 2007, in advance of the main roll-out, a pilot was conducted in Arahuay with 48 students for several months. The pilot unearthed a number of hardware design and technical installation issues that were subsequently fixed. However, there seems to be little documentation of any change in the pedagogical model related to the educational resources or teacher training provided as a result of the pilot.
Further, the first IDB study, conducted in 2009, found that “Concerning use, it is worth making note of what looks like a decreasing utilization of computers in the classroom, which could be a reflection of the need for more technical and pedagogical support for the teachers, as well as of the lack of planning sessions, activities and digital resources appropriate for educational use.” Although other sources describe an iterative improvement process to the roll-out, the IDB 2012 report does not indicate any implementation design adjustments toward more pedagogical support or resources appropriate for educational use between 2009 and 2010.
The present IDB study gives the OLPC Peru program a potentially huge opportunity, and given the rigor of its methodology it should be taken very seriously. For this, however, the community would need to agree that the current study is conceptualized as a monitoring effort – providing relevant feedback at a strategic point in the program’s timeline – rather than constituting an “end of program” evaluation. On the side of project implementers, it will require some potentially hard strategic decisions about the need for, and focus of, future efforts and resources.
In general for the ICT4E practice area, I propose we adopt a more focused approach to establish the added value ICT provides in low-income countries, and:
- Avoid generalizing results from studies done on specific models/intervention types (and their particular choices for equipment, content and implementation) to “ICT for Education” as a whole.
- Focus first of all on very explicit problems and measurable objectives for ICT interventions, and avoid general, overambitious objective statements.
- Get more systematic and rigorous in monitoring ICT for Education initiatives to find out first, on a small scale, what works (or not) for specific contexts before scale up.
- Increase expectations on comprehensive classroom-level ICT initiatives to contribute to measurable results – specifically in key learning areas – instead of applying different standards to ICT4E than to other educational improvement efforts.
Perhaps the most glaring issue raised in many of the OLPC (and ICT4E in general) reports is the pervasive lack of internet access and library resources at these underserved schools.
Why is everyone harping on about maths and literacy results, with or without ICTs in the classroom, when – as reported in Wayan’s news of the destructive Peruvian warehouse fire the other day – the cost of 61,000 “US$200” OLPC laptops accounted for less than 15% of the total cost of destruction cited in the article, a loss which also included 500,000 extremely expensive books and 6,000 solar panels destined for these schools!
I really think we should be rethinking one laptop per child and other ICT4E projects with the following formula:
one library/media centre + a well-paid, suitably motivated, skilled ‘informationist’ per school, with
FREE high-speed Internet access at these sites for folks to access increasingly more OERs –
OERs which are adapted for use on an increasingly diverse suite of (Bring Your Own) Devices… informationists are there to guide such use!
Remember the following?
"critical component of OLPC will be its ability to connect to wireless networks and in particular to mesh networks. This is expected to significantly lower the costs of connectivity for the adopting schools or school districts, since in a mesh network each machine assists each other in transmitting information through the network. This feature also creates the requirement of massive distribution in particular locations, enough to create a dense connectivity environment in which most homes will end up having a machine that at the same time benefits from and supports the network"
wishful thinking … I sometimes get the feeling that most developing country governments (read politicians in power) are simply in no great rush to get their voting masses to become internet-savvy…
Your first point should count double for the OLPC Peru activity, which is an outlier among ICT4E interventions:

1. Avoid generalizing results from studies done on specific models/intervention types (and their particular choices for equipment, content and implementation) to “ICT for Education” as a whole.

The approach taken in Peru can be generalized only to the approach promoted by OLPC’s leadership – that laptops alone, without integration into the curriculum and the teacher training to facilitate it, will somehow create an implementation miracle:

<img src="http://www.olpcnews.com/images/olpc-plan.jpg" width="550px">

Best represented by this equation, it was marketing brilliance – it sidestepped the hard work of deployment with a politician-friendly solution: just hand out a thing and skip the messy human issues. As we now have seen, this didn’t work.

However, this is not the approach of most ICT4E interventions, which are working on human change management as much as on the technology problems. I would prefer more of a focus on the human side, but at least we’re getting past the helicopter-drop deployment conversation – finally.
Although it is now widely claimed that OLPC in Peru was a failure, I do not agree. My point of view is that those making that claim are asking the wrong questions and getting less than useful answers as a direct result. What of the benefits other than those shown by standardized tests? What of the connections that children in Peru are making with each other and the rest of the world? What of children of subsistence farmers finding better agricultural information on the Internet? What of the development of the Free Software culture in Peru, which has long championed freedom from commercial software and data format lockin? What of the hundred other questions that I could ask, but do not have room for here?
And who can seriously claim that we will not rewrite the textbooks, rework the curriculum, and retrain the teachers? Only those who refuse to pay attention to the existing programs to do so (such as the Sugar Labs program that I lead for Replacing Textbooks), and who demand that we complete these tasks by next Tuesday at the latest. Such a project cannot be completed before deployment. It will take at least a generation to assimilate what children can learn using computers, when, for example, freed from pencil and paper drudgery in arithmetic; or when able to join together in mapping their country's culture, or agriculture, or health issues, or whatever the students and their teachers discover to be necessary and possible; or when students anywhere in a country or anywhere in the world can form friendships, start bands, learn languages, or prepare to create businesses, NGOs, and civil society organizations.
I agree that interventions focused on obtaining results in particular learning areas may have a much better chance of finding positive effects than those that are not targeted. However, measurement issues are in some ways tilted toward finding that targeted interventions are more effective: it is easier to identify large effects in one dimension than many small effects across a range of dimensions. Ideally, we would experiment with targeted and non-targeted interventions and look at impacts on adult outcomes (the long-term effect we may agree is the ultimate goal). However, this will take 20 years… In the meantime, we will need to specify some short-term areas of impact to focus measurement on. Efforts to enlarge the range of skills that are measured could be useful, but at the end of the day we will need to set some priorities in terms of skills. Carmen makes a strong argument that in environments where children are hardly learning to read or to do basic math, we may need to devote the limited resources to solving these problems. In other scenarios, where children are acquiring these skills, we may focus more on other dimensions such as general cognitive skills or creativity.