{"id":2350,"date":"2012-03-20T09:39:36","date_gmt":"2012-03-20T13:39:36","guid":{"rendered":"https:\/\/edutechdebate.org\/?p=2350"},"modified":"2012-09-27T10:39:00","modified_gmt":"2012-09-27T14:39:00","slug":"where-is-the-focus-of-olpc-in-peru-and-ict4e-in-general","status":"publish","type":"post","link":"https:\/\/edutechdebate.org\/olpc-in-peru\/where-is-the-focus-of-olpc-in-peru-and-ict4e-in-general\/","title":{"rendered":"Where is the Focus of OLPC in Peru and ICT4E in General?"},"content":{"rendered":"
A fascinating discussion has emerged around one laptop per child programs. Much has been said in the previous articles, and there have been many posts in the past several days on OLPC in Peru and similar efforts elsewhere. Research has been cited; personal experiences and opinions have been shared. Where do we go from here?

I want to specifically address the broader ICT for Education (ICT4E) question of whether or not a single intervention can have an impact on student achievement.

In looking at the discussions and research to date, I suggest an ICT4E intervention can have (and should have!) an impact, but only through **focus**. Specifically, focus on:

- specific problems, with explicit objectives
- student achievement
- rigorous monitoring and evaluation

To start, let me clarify what I mean by ICT4E initiatives for the purpose of this article: I am talking about comprehensive ICT interventions at the classroom/student level, such as OLPC (but not only one-to-one programs). Such efforts would, at a minimum, combine electronic teaching and learning resources (content), devices to access that content, and related teacher training. Under the paradigm of *focus*, I also propose that we get more precise when we talk about ICT4E, as one intervention type or implementation model does not necessarily compare to another.

**Focus ICT4E interventions on specific problems and formulate explicit objectives**

The OLPC Peru program aims "to improve the quality of public primary education, especially that of children in the remotest places and in extreme poverty, prioritizing multi-grade schools with only one teacher." The program description does not mention specific subjects or explicit skills in any of its objectives, and there are plenty of programs out there with similarly vague statements.

I think we need to get more evidence-based, specific, and realistic when talking about anticipated impacts, and avoid overgeneralization in our program descriptions and in general ICT4E rhetoric.

In a focused approach to ICT4E intervention planning and program design, we would unpack the broader goal, identify the key problem area(s) at the classroom level, and address them in a systematic manner. Children's early reading skills, for example, are low: "…most children read badly, with poor fluency and limited comprehension." Oscar Becerra, in his contribution, also mentions the challenge of teachers' reading levels. Further, the effort to add 200 e-books to the other content provided on the XOs, and the fact that the section on how to use them offers the most descriptive and detailed pedagogical guidance in the OLPC manuals, indicate that reading has already been singled out for specific attention, although not explicitly and systematically.

Continuing with early reading as an example, a focused ICT intervention could draw on the research to date on how to improve reading in the early grades, even in low-resource environments.
From this work, we can learn that a focus on the classroom level and on the very specific underlying skills of early reading (such as phonemic awareness and letter recognition), combined with simple-to-implement, direct approaches to teaching (and teacher training), low-cost materials, and continuing assessment to monitor progress, can make a significant difference in students' early reading outcomes within a relatively short period of time.

A role for an ICT intervention in this scenario could then be to reach such results in a more effective and efficient manner, and/or for specific groups of students whose needs are not yet fully addressed through existing, albeit working, approaches.

In an effort to respond to the (very valid) requirements of sustainability and scale-up, to adhere to best practices in institutionalization and sector-wide approaches, and to deal with the significant challenges of procuring equipment and content, ICT4E initiatives get caught up in spreading resources and effort across too many fronts at the very beginning, before they have had a chance to even get to the classroom and validate their impact where it should matter most: at the student level.

**Focus on student achievement**

Much of the discussion on this month's topic raised the question of whether or not ICT4E initiatives should be assessed with regard to their contribution to improved student learning outcomes, and specifically to academic achievement.

In environments where resources are extremely limited, the choice to implement a certain program is de facto a decision *against* an alternative approach. I question how we can justify **not** expecting comprehensive classroom-level ICT initiatives to focus on learning outcomes when even the most foundational skills critical for future learning, such as early reading, are not being acquired by large proportions of children.

Clearly, other skills, including cognitive skills, critical thinking skills, and so on, are important for learning and child development, and it is important to move the body of knowledge on ICT for Education further into such new domains. However, should this really be done at the expense of education systems that already struggle with basic service delivery and with ensuring universal learning of the most foundational skills? As many of us remember, there has already been much discussion around the added value of technology in good schools versus challenged schools (and systems).

I propose we go back to the drawing board, get more specific about where we expect ICT to make an impact, define impacts more narrowly, and develop a roadmap of the steps, including outputs and outcomes, to get there.

A revised ICT4E initiative objective could, under such an approach, read: "…aims to improve early reading skills, specifically fluency, among students in grades 1 and 2 of public primary schools," within the average timeline of a donor-funded project of some five years.
The OLPC Peru program sets a good example in terms of focusing on a specific target group: "…children in the remotest places and in extreme poverty, prioritizing multi-grade schools with only one teacher."

While ICT is still no magic bullet for this issue, such an objective is at least measurable, as there are internationally validated tools, and, as reading curricula around the world suggest, results should be detectable within a reasonable timeframe of 1-2 years for most languages and scripts. Other objectives may take longer to achieve and may need to be broken down into intermediate results and steps to strike a balance between being ambitious and being realistic.

Establishing priorities, validating impact on specific aspects of larger objectives, and expanding on these over time will go a long way toward building credibility for ICT4E and the larger ICT for Development practice area, and toward rallying the national support a program needs.

With such a focused objective in hand, program implementers can then align activities (provision of content, equipment, training, etc.) and design monitoring and evaluation frameworks that provide relevant information both to the project internally and to the larger body of knowledge.

**Focus on rigorous Monitoring and Evaluation in ICT4E interventions**

Much has already been said about the lack of systematic Monitoring and Evaluation (M&E) of progress and impact in OLPC programs and elsewhere.

Implementing good M&E is difficult. It is hard to design relevant and appropriate frameworks with coherent indicators, realistic data collection methods, and validated tools. M&E also costs money, more or less depending on the complexity of the program and on the objectives and questions driving implementation. And it is even harder to focus on M&E when the available resources, and the people engaged in the program, are tied up with the enormous logistics of getting hardware devices into children's hands.

In spite of this, however, the absence of systematic M&E deprives a program of the opportunity to tell a story much beyond anecdotal evidence once it comes to *evaluation*.

Furthermore, without systematic *monitoring*, there is no opportunity to take corrective measures on program parameters and to continually check whether activities and resources are moving the initiative toward its objectives before scale-up and institutionalization. Monitoring can take many forms, each of which can be strategically employed at different points in a project life-cycle to improve the program: low-effort usability tests with target-group users on the ease of use of equipment, user acceptance tests for new content and approaches, pilots of new curricula, or comprehensive randomized controlled trials that allow a more precise look at the effect of the equipment versus that of the content or the teacher training on the overall project objective before scale-up.

Although no written M&E framework could be found on the OLPC Peru website, some monitoring-type activities have taken place. From an outsider's glance, however, it seems these have not yet been fully capitalized on to improve the implementation model.

In 2007, in advance of the main roll-out, a pilot was conducted in Arahuay with 48 students for several months.
The pilot unearthed a number of hardware design and technical installation issues that were subsequently fixed. However, there seems to be little documentation of any resulting change to the pedagogical model, the educational resources, or the teacher training provided.

Further, the first IDB study, conducted in 2009, found that "Concerning use, it is worth making note of what looks like a decreasing utilization of computers in the classroom, which could be a reflection of the need for more technical and pedagogical support for the teachers, as well as of the lack of planning sessions, activities and digital resources appropriate for educational use." Although other sources describe an iterative improvement process for the roll-out, the IDB 2012 report does not indicate implementation design adjustments toward more pedagogical support or educationally appropriate resources between 2009 and 2010.

The present IDB study hands the OLPC Peru program a potentially huge opportunity, and given the rigor of its methodology it should be taken very seriously. For this, however, the community would need to agree that the current study is conceptualized as a *monitoring* effort, providing relevant feedback at a strategic point in the program, rather than as an "end of program" type of *evaluation*. On the side of project implementers, it will require some potentially hard strategic decisions about the need for, and the focus of, future efforts and resources.

In general for the ICT4E practice area, I propose we adopt a more focused approach to establish the added value ICT provides in low-income countries and:
- focus interventions on specific problems and formulate explicit objectives,
- focus on student achievement, and
- focus on rigorous monitoring and evaluation.