{"id":2340,"date":"2012-03-15T09:32:58","date_gmt":"2012-03-15T13:32:58","guid":{"rendered":"https:\/\/edutechdebate.org\/?p=2340"},"modified":"2012-09-27T10:39:00","modified_gmt":"2012-09-27T14:39:00","slug":"what-do-olpc-peru-results-mean-for-ict-in-education","status":"publish","type":"post","link":"https:\/\/edutechdebate.org\/olpc-in-peru\/what-do-olpc-peru-results-mean-for-ict-in-education\/","title":{"rendered":"What Do OLPC Peru Results Mean for ICT in Education?"},"content":{"rendered":"
One Laptop Per Child (OLPC) has been part of a larger ICT4E discussion, which has included ongoing debate over the effectiveness of the XO and its various deployments. Since its inception, OLPC has relied mainly on aspirations, visions, and projections to attract investment from partners across the globe. Pilot programs were conducted at various levels of deployment, programming, and stakeholder engagement. More recently, larger and longer-standing deployments have reached a point where assessments are coming to fruition.

Concurrently, as the OLPC offering developed and evolved, so too did a variety of education technology initiatives and device platforms. More specifically, the presence of similar programs and other form factors (tablets, mobile phones) increased the channels through which activities for education, instruction, communications, and business could be conducted.

These processes contribute to shifting expectations which, whatever the support or critique of any one assessment, highlight the importance of setting clear objectives that are cognizant of both the array of available tools and the surrounding systems that have an indirect, but no less significant, effect on final outcomes.

A recent Technology Salon in Washington, D.C. touched on various topics, including the current landscape of ICT developments around the world and the findings and characteristics of the IDB's 2011 randomized control experiment on Peru's XO deployment.

ICT Landscape

When the XO concept first hit the market in 2005, laptops had just passed desktop machines in overall retail sales in personal computing. The device landscape at the time was characterized by the mobility of the laptop and the power of the desktop computer. Netbooks would surface several years later, accentuating the value of mobility over power.
Seven years later, that landscape has changed with the introduction of tablet PCs, not to mention the prevalence of smartphones. With netbooks displacing laptops (sort of) and tablets displacing both, all while smartphones provide prevalent and complementary channels of information, the market is far more diverse than the one the XO first entered in 2005.

What are the implications of this mixed learning environment for ICT4E? The answer will no doubt vary. In higher disposable income environments, it is completely feasible for a primary school child to check out an iPad at school, own a smartphone, and return to a home equipped with a tablet, netbook, and/or laptop. In less developed areas, on the other hand, this is not necessarily the reality for individuals, NGOs, or state education authorities on the ground.

For the education sector, the result is an increased variety of tools with overlapping abilities to engage educational goals and objectives. Whether the tablet proves to be an effective tool in ICT4E remains to be seen in projects like the Aakash tablet and OLPC's more recent release of the XO 3.0.

Assessment in Context

For the XO deployment in Peru, the IDB's randomized control experiment sampled five students per grade per school from 320 schools (two-thirds of which were in the treatment group receiving the intervention) at intervals of 3 and 15 months. (A back-of-the-envelope sketch of the sample's scale appears at the end of this section.) A 2010 assessment brief was released based on the 3-month data, and last Thursday's assessment brief covered the 15-month data. The findings showed little effect on national math and language test results, with slight advancement in cognitive abilities based on IDB methodologies. A natural question follows: was 15 months a long enough period to observe a significant effect on child learning?

Part of the answer may rest in the level of technical and pedagogical support indicated in both briefings. In the 2010 assessment, only 10.5% of teachers reported receiving technical support (with 7% receiving pedagogical support). Thursday's discussion revealed that only one-third of the schools had received pedagogical support, and only one-third had actually used the technical training and manuals provided to them. Further discussion revealed that training totaled just 40 hours, while the percentage of teachers who had received training dropped from 80% in 2010 to 71% by the 2011 assessment.

Considering the role teachers play in traditional systems of instruction and assessment, a lack of technical and pedagogical support could, at the very least, delay integration of the XO as a significant tool in those systems. It can be argued that the use of XOs outside the classroom could produce other benefits that complement classroom learning. However, given that almost half the students were prohibited from taking the XOs home and that half of all teachers did not use the XO in the classroom, would a larger assessment time frame produce significantly different findings?
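To make the scale of the IDB sample concrete, here is a minimal back-of-the-envelope sketch. The figures from the briefs discussed above are 320 schools, a two-thirds treatment share, and five sampled students per grade per school; the number of primary grades is not stated in this post, so six grades is purely an assumption for illustration.

```python
# Rough arithmetic on the IDB sample described above.
# Known from the discussion: 320 schools, two-thirds in treatment,
# five students sampled per grade per school.
# Assumption (not stated in the post): six primary grades.

TOTAL_SCHOOLS = 320
TREATMENT_SHARE = 2 / 3
STUDENTS_PER_GRADE_PER_SCHOOL = 5
GRADES = 6  # assumed, for illustration only

treatment_schools = round(TOTAL_SCHOOLS * TREATMENT_SHARE)  # ~213
control_schools = TOTAL_SCHOOLS - treatment_schools         # ~107
sampled_students = TOTAL_SCHOOLS * STUDENTS_PER_GRADE_PER_SCHOOL * GRADES

print(f"treatment schools: {treatment_schools}")
print(f"control schools:   {control_schools}")
print(f"sampled students:  {sampled_students}")  # 9,600 under these assumptions
```

Nothing in this sketch changes the substantive questions above; it simply makes the scale of the evaluation easier to picture.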
Outputs vs. Outcomes

Core to program logic is a program theory that identifies the problem a program is designed to address, complete with inputs, activities, and outputs. In assessing a program's effectiveness, it is important to distinguish between outputs and outcomes, and to ensure that both align with measurement and evaluation criteria. The two largest XO deployments, in absolute numbers and as a percentage of population, are Peru and Uruguay.

Thursday's discussion pointed out Uruguay's clear objective of social inclusion, which produced a near 100% primary school penetration rate through a national 1-to-1 program. The Uruguay assessment focused on access, use, and experience, reflecting social inclusion as the intended outcome. In the case of Peru's assessment, the math, language, and cognitive test results measured outputs, with no clear connection to Peru's stated 2007 objectives, which targeted pedagogical training and application. If objectives and outcomes are not clearly aligned with assessment criteria, can "effectiveness" be appropriately measured?

Objects vs. Processes

It is important to be clear about what is being measured before measurement begins. Is it the insertion of an object or the introduction of a process? Thursday's discussion touched on the lack of clarity over this question. Was it the sheer presence of laptops that would dramatically "empower children to learn"? Placing a laptop next to one's head to demonstrate expectations of the XO is, of course, an exaggeration of the self-learning model.

An analogy that better captures the situation is a teacher being given a chalkboard and chalk at the beginning of the school year. There is an inherent assumption that the teacher knows how to write on the board and has been trained with curriculum content to put on it. The point is that the intervention is much more complex than the introduction of a single object, be it a chalkboard or an XO. In some sense, this is recognized by the stated objectives of the Peruvian program, which target both training and pedagogical application surrounding the XOs. Based on the numbers, however, that realization seems distant.

The statement "computers will happen" raised an interesting idea: shifting focus away from the question of whether the XO's presence had an effect and towards evaluating the effect of the content and overall experience. Assessing the experience, rather than the hardware, would more specifically target the processes designed to support pedagogical use and curriculum development. The XO does not stand alone; it depends on an ecosystem of processes that affect child learning. More specifically, outcomes related to XO use would more appropriately be measured not only by comparing groups that received the XO with those that did not, but by examining which surrounding processes have contributed to child learning.

Walter Bender spoke at an mEducation event where, despite the XO 3.0 being the theme of the event, he emphasized the capabilities of the Sugar platform and of the applications surrounding it. Perhaps focusing on the evaluation of training, content, and support surrounding XO deployments can provide more insight into the measured
effects of the processes that sit on top of the object?

One-to-What Computing

The name "One Laptop Per Child" explicitly commits the project to a "one-to-one" computing model as an education solution. The discussion over instructional format is not new and reflects the complex reality of program design. The right "solution" will be considerate not only of training and curriculum, but also of the resources and capacity constraints of school programs and the communities surrounding them. Curriculum can be designed for individual or collaborative learning environments. In my three years of experience teaching computer studies in Western Samoa, I used different classroom formats, employing both a one-to-one model and a one-to-many model. Admittedly, applying the one-to-many model was a compromise between limited resources (20 computers) and overwhelming demand (300 students).

However, I observed just as much "learning" happening in both models, in conjunction with adjusted lesson and activity plans. The value of collaboration cannot be discounted, especially when students in a one-to-one model often still talk amongst themselves. The issue then becomes a matter of how much "face time" the student has with the computer and how much value that provides for their overall learning process. Is one hour of use a day necessarily less valuable than 24-hour access? The reality is that each student's circumstances will affect their ability to use the device.

This variability in circumstance highlights how classroom designs, where the primary influencer (at least for the class period) is the teacher, differ from home-use designs. The debate between one-to-one and one-to-many computing formats is important and will no doubt be expanded by studies in the more measurable and controllable classroom environment. To be clear, I definitely value having a computer at home (as a child, I was fortunate enough to have a Tandy 1000EX), but I am unsure how the effect of such exposure can be accurately measured when the "home" environment can vary so greatly.

Shifting a Paradigm

In 2007, Uruguay's Plan Ceibal and Peru's XO program designs looked very similar, but Plan Ceibal evolved with input from the community, which helped shape its activities and enabled Uruguay to achieve its objectives. Perhaps larger geographical, socio-economic, or political factors held Peru back from achieving its goals?
Or perhaps, as one attendee asked, Uruguay's success was a reflection of a social structure that existed prior to OLPC.

The reality is that the technological landscape is rapidly evolving, challenging both the "form" (netbook) and "fashion" (1-to-1 deployment) of OLPC deployments. Moving forward, ICT4E projects will need to focus on alignment between a growing array of tools, clear objectives, and assessment criteria in order to ensure measurable effectiveness, and to consider cost-effectiveness in the presence of alternatives. A focus on the curriculum and pedagogical experience may provide a better understanding of process interventions, rather than object insertions.

The cases of Uruguay and Peru could serve as first steps in appreciating that the success of a program does not lie solely in a single machine, but rather in engaging the stakeholders and conditions surrounding it.