Let’s Make Tech All About Learning

I have found myself lately in several conversations about the price of technology. The conversations have focused on laptops and tablets and folks wondering if we could find devices that were less expensive.

And I realized that, in their thinking, all laptops and devices are created equal, such that the only variable is cost (and, if that were true, I would have to agree).

But it made me realize that we were having the wrong conversation completely. The conversation shouldn’t be about price; it should be about value.

Further, I realized that we miss the boat on the value conversation when we spend too much time talking about the technology and the tools, or about providing and procuring technology. We need to spend most of our time talking about the kinds of learning we would like to make happen with the technology. You can only get to the value conversation when you can discuss what you want to do with the devices and compare different devices on how well suited they are to those purposes.

I used to teach with a really wonderful professor of elementary educational technology, named Ralph Granger. He used to say, when you go to the hardware store to buy a new drill bit, you don’t really want a new drill bit. You want a hole. When it comes to educational technology, we need to talk less about our “drill bits” and more about the “holes” we want.

Or as Marc Prensky says, we need more verbs and fewer nouns.

And, as TPACK reminds us, when we align our educational arrows, we are talking about content, pedagogy, and technology (What instructional strategies might we use to teach this learning target, and what role could our devices play?).

I believe that part of that conversation needs to be around student engagement and motivation.

So I was very happy to see that the National Association of State Boards of Education is pointing out that student engagement needs to be a critical criterion for judging the value of our educational investments (including technology). One article on their recent report starts, “Education is a $600 billion a year industry, but that investment means little unless students are physically and mentally present and engaged to benefit from it.”

How are you prepared to help make our educational technology conversations focus more on learning?

 

Does Technology Improve Learning – No! A Keynote

I recently had the honor of keynoting at the Illinois Computing Educators (ICE) conference.

My message was that technology alone will not improve learning; only teachers improve learning. But technology can be a wonderful tool for teachers and for students under the guidance of teachers.

Watch the keynote here. And related resources are down below.

 

If we want to leverage technology well for learning, then these are the components we should attend to:

  • Focus on Learning
  • Deliberate, Shared Leadership
  • Community Engagement
  • How You REALLY Protect Stuff
  • Support the Heck Out of Folks

Resources

Technology:

Learning:

Leadership:

Community Engagement:

Supporting Educators (Professional Development):

 

A Child Struggles in School: Where Does the Problem Lie?

In a conversation recently with a caring, conscientious teacher, she commented that she had success working with struggling learners and helping to make them feel smart.

But when they got to the next grade and perhaps had a teacher who wasn't as effective at reaching those children, or perhaps thought there was a pace for learning and students should stick to it, or perhaps simply saw the onus for learning as being on the student, the students really struggled again.

She worried that perhaps she had led those students to have an unrealistic view of themselves by not being more up front with them about being struggling learners. She wondered, despite her success helping those students to learn, to feel successful, and to feel smart, if she shouldn't be more direct with them about being struggling learners, to prepare them for possible pain and disappointment later.

And I caught myself wondering, is the problem that each child isn't where the school is in the curriculum?

Or is the problem that the school isn't where the child is in the curriculum?

 

How Will We Use Our Technology? – 7 Powerful Uses

Whether you have technology for your students, or you are thinking about getting technology for your students, “How will we, or should we, use our technology?” is an important question.

The answers to that question need to come from what we know about learning, more than what we know about technology. Recently, I have written about how we should focus on learning when we try to answer this question; that we should think about how technology has changed how students learn outside of school; and if we are having problems with our technology, that it might be that our vision for learning is lacking.

And I think it is important to articulate how we would like technology to be used in our classrooms, partly because personal technology skill is not the same as teaching-with-technology skill. Just because a teacher can use an iPad herself doesn't mean that she knows how to leverage that same iPad for student learning. Articulating how we might expect teachers to use those devices helps provide teachers targets for their own professional learning.

We are currently working with the idea that there are 7 powerful uses of technology:

  1. Tech for Foundational Knowledge: How can we help students learn the basics?
  2. Tech for Using Knowledge: How can we contextualize learning and make learning engaging and meaningful? How can students use their knowledge? What is the role for creating and creativity, and for project-based learning?
  3. Tech for Learning Progress Management: How do we keep track of student learning? Promote a transparent curriculum? Make learning progressions clear? Help students navigate their learning? Maintain evidence of mastery?
  4. Tech for Personalizing Learning: How does technology help us tailor the learning to the student?
  5. Tech for Supporting Independent Learning: How can technology help the student do more on their own and need the teacher less?
  6. Tech for Assessment: How can technology help us capture what students know and can do?
  7. Tech for Home/School Connection: How can technology help us stay better connected to parents?

Again, note the pedagogical focus, not a technology focus. In other words, the technology isn't the end or the desired outcome, rather the technology is in service to desirable educational outcomes.

How are you leveraging technology for each of these 7 uses?

 

Is the Problem Your Students, the Device, or Your Vision for Learning?

There has been a mixed bag of results for technology in schools lately. You certainly hear about districts creating exciting learning opportunities for their students by leveraging technology. But you also read about LA Unified's problems with their iPad initiative, or Miami-Dade schools putting their initiative on hold because of the troubles in LA and in North Carolina.

The blame for the failures in these districts is pointed in lots of directions, but includes students as “hackers” (although there was no hacking, just clever students figuring out how to make locked-down devices function as designed), or lack of keyboards (don't get me started on how stupid that issue is – it comes from adults who haven't sat with a tablet long enough to know how easy the virtual keyboard is to use). Diane Ravitch points to overly aggressive timelines, poor project management, poor contract management, and a failure to evaluate curriculum resources, especially against district curriculum standards.

But I believe there is a much deeper problem at the root of these disastrous educational technology initiatives.

Let me come at this from a different direction… Recently a friend contacted me, saying she was working with a district that was trying to decide what device to invest in. Tablets? Chromebooks? Laptops?

Based on 13 years of working with 1to1 initiatives and all the lessons learned, my reply was to ask, “What's their vision for learning? Frankly, without such a vision, I'm not sure it would matter what they bought; it will be equally unsuccessful…”

How do you know what you want technology for if you haven't decided what learning should look like in your classrooms? A tool bought for no other purpose than to have the tool (or because you believe it is good to have the tool) fulfills its purpose simply by being there. Yet, later, purchasers are surprised that amazing things haven't happened simply from being in the tool's presence…

Or maybe you have what I have come to think of as a “default learning vision.” In the absence of a vision for learning driving the instructional use, the instructional use becomes the vision for learning. The vision defaults to what you do when what you do isn't informed by a vision.

So, what may be the default vision for learning in these initiatives?

I look at these three well-publicized initiatives and I see a vision of learning that boils down to this: electronic workbooks.

There is no doubt that access to digital content and resources should be one slice of how schools leverage technology for learning. But workbooks (of any variety!) have always been wholly insufficient for quality learning programs. (If they were sufficient, we would have the best educational system in the world by simply dropping a box of textbooks and workbooks at each student's home each year…)

Or as Diane Ravitch points out about this problem:

…the content of the tablets must allow for teacher creativity, not teacher scripting… The time will come when tablets replace the bulky, puffed-up textbooks that now burden students’ backpacks. The time will come when tablets contain all the contents of all the textbooks, as well as a wealth of additional resources, in multiple subjects. But they must encourage exploration and inquiry, not fidelity to a packaged program. Customized and individualized must become a reality, not a sales pitch for programmed learning.

Is it any wonder that these technology initiatives are a train wreck, given their vision for learning?

 

iPads in Primary Grades: What Veteran Teachers Think – Stephanie

This is the third installment in a series of interviews with veteran teachers to get their perspective on our iPads in primary grades initiative, Advantage 2014. Is the initiative really having the impact our early adopters would have you believe? Would our more cautious or hesitant teachers agree? Here are the first and second posts in the series.

Stephanie Hathaway teaches kindergarten. Here are her thoughts on the initiative.

Highlights from Stephanie’s interview:

  • She felt there was a lot of pressure to succeed, which she found daunting, since she wasn’t familiar with iPads before the initiative.
  • But the district provided lots of professional development.
  • Impact: Assessment (time 0:48)
  • Impact: Like having 18 teachers in the room – interventions & individualization (time 2:18)
  • Impact: Motivation factor and creativity factor (time 4:07)
  • Also supports the learning of handwriting.

iPads in Primary Grades: What Veteran Teachers Think – Jean & Chris

Auburn has had some real success with Advantage 2014, our iPads in primary grades initiative. Although many folks like hearing about the enthusiastic teachers who have done many inventive things with the iPads and their students, others wonder what veteran teachers might think: teachers who may not be so enthusiastic.

In March of 2013, I interviewed a handful of such teachers to see what their perspective was. This is the first in a series highlighting the veteran teachers' perspective of teaching and learning with iPads in kindergarten and first grade.

Both Christine Gagne and Jean Vadeboncoeur have taught first grade “for a long time,” as Chris says. Both were skeptical of having to use the iPads with students, and Jean admits that she is not a “pro screen kind of person.” In this video, Chris and Jean talk about their experience in the first year of using the iPads, and the impact the iPad, apps, and their professional development had on their students.

 

Highlights of their comments:

  • By March, all their students were meeting or exceeding standards.
  • The apps and using the iPads generated a lot of excitement in the students.
  • They saw students try harder and work more diligently to figure out the work on their own.
  • They were surprised at this year's students' progress compared to previous years.
  • They thought the amount of practice and the immediate feedback were secrets of the success.

 

The Connection Between Facilities and Learning

Auburn needs a new high school, and we're working through the process to get a new one built. The issues were especially brought to light by our accreditation, which placed us on warning status in the curriculum and program category because of our facility. Also, until recently, we thought we'd have to go it on our own, without state funding.

This led (naturally) to questions from the public about what the connection might be between facilities and learning. Plenty of folks believe that you can throw a tent up in the ball field and teach kids (effectively) there…

So I did a little digging.

Turns out there's strong research on the connection between the quality and condition of a school building and student academic achievement, student behavior, and teacher stress levels.

Key elements that impact learning include natural lighting, noise reduction, heating, cooling, and air quality, and overall conditions, such as maintenance and cleanliness. (Maybe this is why an academically oriented accreditation process examines the state of the facilities…)

Studies have controlled for family factors (such as family background, free and reduced lunch rates, race/ethnicity, attendance, and suspension rates), and found that building condition not only significantly impacted achievement and behavior, but was a stronger predictor of academic achievement than many family background factors and socioeconomic conditions.

Researchers also found that many of the environmental factors that contribute to student learning can be improved with proper building maintenance, construction, or renovations.

See Barnes, R., Chandler, J., & Thomsen, B., A Problem Based Learning Project Analyzing State Assessment Instruments Used for School Facilities, pp. 32-35, for a summary of the research.

 

The Series on the New MLTI: Choice, Auburn, and Learning

Maine has long had the first (and, unfortunately, only) statewide 1to1 learning with technology initiative: MLTI.

The MLTI contract was up for renewal this year, and, for the first time, Maine is allowing each district to choose from 5 finalist proposals, producing a lot of conversation about the choices and how to choose.

Below is the series of blog posts I have written about the MLTI renewal, Auburn's choice and choice process, and my interest that MLTI selection focus on learning:

 

What We Want from Technology – MLTI, Customized Learning, and School Vision

There have been many discussions around Maine since the Governor announced schools would have choice over which solution they select for MLTI for the next four years. But most of those conversations have focused on the device, or its capabilities, or why it is “my preferred device,” or why people are worried that the device they aren't that familiar with will not be sufficient for the task at hand…

I wish so much more of those conversations had instead been about school visions for learning, and what we hope to get from technology for learning. What role can technology play in learning? What is your school's or district's vision (ours is here), and what is the role of technology in fulfilling that vision?

And for Auburn, as I would guess for other districts in the Maine Cohort for Customized Learning, we are concerned about technology's role in helping us succeed with implementing Customized Learning (such a critical part of our vision).

Here is what we think the roles for technology are for learning, especially for Customized Learning:

  • Instructional Resources for Building Foundational Knowledge
  • Instructional Resources for Using Knowledge, Creating, Complex Reasoning, and Projects
  • Learning Progress Management
  • Supporting Independent Learning
  • Assessment
  • Home School Connection
  • Student Motivation

How are you currently using technology for each of these? What are teachers doing (maybe in your district, but maybe in another) that shows you exciting ways technology could be used for each of these? What is best technology practice for each of these roles?

But much more importantly, as Maine's districts think about selecting a solution for MLTI, how does each proposed solution measure up against each of these roles for technology?

You don't have to be interested in Customized Learning to be interested in these roles. But I don't believe a school can make a satisfactory decision about which solution to select if they are only thinking about the device or the operating system…

 

Auburn’s Data Shows (Again) The Positive Impact of iPads

Our School Committee wants to know if there has been an impact of having iPads in the primary grades classrooms, and there has been!

In fact, we recently presented those findings to the School Committee.

All our primary grades students participate in CPAA testing (Children's Progress Academic Assessment). It is a test meant to be used as formative assessment to let teachers know where their students are in their literacy and math learning, giving them information about student mastery of specific concepts, helping inform teachers' instruction.

As we look back over the CPAA data from past years and compare it to the cohorts of students who have had iPads, we find that a larger percentage of students have reached proficiency, and have reached it sooner, than in the years before we had iPads. For kindergarten, this is true for 6 out of 8 concepts. For first grade, it is true for 5 out of 7 concepts.

We know it hasn't just been the iPads. We have done a ton of professional development on literacy best practices, math best practices, and educational technology best practices.

But what this data does tell us is that when we combine teachers with professional development and 1to1 iPads, our students learn more, faster.

In other words, Advantage 2014, our literacy, math, and iPad initiative, is having a positive effect on student achievement.

So when we ask for iPads for second grade, we aren't just asking for tech or gadgets. We are asking for a proven educational resource that helps our students learn better.

More Indications of Positive Results from Auburn’s iPads

We’ve had iPads in our Kindergarten classrooms for more than a year now. This fall, we also rolled out iPads to our 1st grade students. All in the name of improving students’ mastery of literacy and math.

We know that we have too many students who aren’t demonstrating proficiency, so for several years, we’ve been making sure that teachers are getting quality training in literacy and math instruction, and we’re hopeful that, combined with the access to educational resources made possible through iPads, we’ll increase that level of proficiency.

And when we examined gains made by last year’s kindergarten students, that’s what we found. Our kindergarten students had made more gains than in years past, leading our Curriculum Director to proclaim that taxpayers’ money is well spent.

Read more about our gains in the Sun Journal article Educators Say iPads Help Scores, and the MPBN radio story Auburn Educators Tout Benefits of iPads for Kindergartners (sorry iPad users; you need flash to listen to the story, but you can still peruse the article).

Keep the MLTI RFP focused on Learning: Talking Points

I have had some great conversations and email exchanges with many of you since posting my concerns about keeping the new MLTI RFP focused on learning. Some of you have asked: if you wanted to reach out to the Commissioner to express similar views, what might you say?

Here are my talking points:

  • Instead of tech specs, the RFP should describe what we would like to do with the devices (what is the change in learning that we would like to see?)
  • Technology is expensive, and we should not invest in it if we are simply going to use it to do what we do without it (what is the change in learning that we would like to see?)
  • Looking at the work in Maine, perhaps that change in learning should be Customized Learning and the Education Evolving recommendations
  • In keeping with the components of Customized Learning, the learning activities described should include both those for low level learning and for high level learning.
  • Low level activities (recall, understanding, simple application) could include the following: access to online resources, information gathering, note taking, communicating, studying, accessing online educational tools, etc.
  • High level activities (non-routine application, analysis, evaluation, creating) could include the following: creating simulations, project-based with multimedia, coding and programming, writing for a purpose and audience, digital storytelling, engineering and design, etc.

 

MLTI: What Change in Learning Would You Like to See?

I think one of the things that MLTI, the Maine Learning Technology Initiative, did well, right out of the gate, was to say it isn't a “tech buy,” but rather a learning initiative. I think this one point is a major reason why the first (and still only) statewide learning with laptop initiative did so well and is more than a decade old. Even the first RFP to prospective vendors focused on what we wanted to do with the technology, rather than tech specs.

And the focus on learning was especially evident in our professional development.

Our PD focused on project-based learning, and the writing process, and mathematical problem solving, etc. We focused on how to teach with technology, not so much on how to use it. And when we did focus on how to use it, it was in the context of how to teach with that tool. We didn't do workshops on how to use a spreadsheet; we did workshops on how to analyze data and the participants left also knowing how to do spreadsheets.

But I've grown concerned that MLTI may be moving away from that focus on learning. To listen to conversations about the initiative, they seem to focus much more on the “stuff” (comparing devices, network and filtering solutions, and discussing software fixes and specifications…) than on teaching and learning. I am not saying that I've heard that from Jeff Mao, Maine's Tech Director, or the DOE, as much as from the general public. But even so, it has me worried a little…

I think one of the tricks of keeping a mature initiative going is to reflect on what made it great in the first place, and to refresh those pieces if they have gotten a little stale. That's not to say that the MLTI team isn't doing their job. Every initiative needs freshening up when things have been routine for a while!

Right now, the MLTI contract is getting ready to run out and the Department of Education is working to craft a new RFP. What better (and perhaps more appropriate!) time to freshen up an initiative than when designing that initiative's RFP.

So I recently had conversations with both Commissioner of Education Bowen and Jeff Mao, asking them to please consider framing the new MLTI RFP around the change in learning they would like to see in our classrooms. This post reflects some of what I shared with them, first in my phone conversations, and then in a follow up email.

So, I'm hoping that MLTI is still committed to being a “learning initiative” and not a “tech buy.” And if it is, I'm hoping that the RFP can be crafted in such a way that this is evident.

And if so, then what is the change in learning that the Commissioner and the MLTI team are hoping will come about by leveraging the technology? Is it Customized Learning? What would Education Evolving, Maine's new education strategic plan, look like in action, and how could technology help bring it about? Is it the practices highlighted in the DOE's new Center for Best Practices? What are we hoping students would be doing each day, both on and off their devices, that we would recognize as a change in learning?

Or as I say in presentations, if we're just going to use technology to do what we're already doing, why put the money into technology?

I'm hoping that the Commissioner and the MLTI team will consider framing the RFP in such a way as to make obvious that we are looking for a change in learning, and allow the responding vendors to propose the technical solutions that they think can help get us there.

So, if you think that MLTI should be more than a tech buy, please contact the Commissioner of Education (624-6620; commish.doe@maine.gov) and state Tech Director (624-6634; jeff.mao@maine.gov) to encourage them to frame the RFP around desired changes in learning.

 

 

Auburn’s iPad Research Project on the Seedlings Podcast

Seedlings is a great little podcast that, although about educational technology, is really about good teaching and learning.

So I felt honored when the Seedlings hosts invited me to return to talk about Auburn’s research on their Advantage 2014 program, best known for giving iPads to Kindergartners. You can download that podcast and access related links here.

This was a follow up to the previous podcast, where we talked both about Advantage 2014, and Projects4ME, the statewide virtual project-based non-traditional program, where students can earn high school credit by designing and doing projects, instead of taking courses.

Responding to Critiques of Auburn’s iPad Research Claims

When we announced our research results last week, Audrey Watters was one of the first to cover it. Shortly thereafter, Justin Reich wrote a very thoughtful review of our research and response to Audrey’s blog post at his EdTechResearcher blog. Others, through comments made in post comments, blogs, emails, and conversations, have asserted that we (Auburn School Department) have made claims that our data don’t warrant.

I’d like to take a moment and respond to various aspects of that idea.

But first, although it may appear that I am taking on Justin’s post, that isn’t quite true (or fair to Justin). Justin’s is the most public comment, so the easiest to point to. But I actually believe that Justin’s is a quite thoughtful (and largely fair) critique from a researcher’s perspective. Although I will directly address a couple things Justin wrote, I hope he will forgive me for seeming to hold up his post as I address larger questions of the appropriateness of our claims from our study.

Our Research Study vs. Published Research
Our results are initial results. There are a lot of people interested in our results (even the initial ones – there are not a lot of randomized control trials being done on iPads in education), so we decided to share what we had so far in the form of a research summary and a press release. But neither of these would be considered “published research” by a researcher (and we don’t either – we’re just sharing what we have so far). Published research is peer reviewed and has to meet standards for the kinds of information included. We actually have more data to collect and analyze (including more analyses on the data we already have) before we’re ready to publish.

For example, Justin was right to point out that we shared no information about scales for the ten items we measured. As such, some of the measures may seem much smaller than when compared proportionally to their scale (because some of the scales are small), and we were not clear that it is inappropriate to try to make comparisons between the various measures as represented on our graph (because the scales are different). In hindsight, knowing we have mostly a lay audience for our current work, perhaps we should have been more explicit around the ten scales and perhaps created a scaled chart…

Mostly, I want my readers to know that even if I’m questioning some folks’ assertions that we’re overstating our conclusions, we are aware that there are real limitations to what we have shared to date.

Multiple Contexts for Interpreting Research Results
I have this debate with my researcher friends frequently. They say the only appropriate way to interpret research is from a researcher’s perspective. But I believe that it can and should also be interpreted from a practitioner’s perspective, and that such an interpretation is not the same as a researcher’s. There is (and should be) a higher standard of review among researchers for what any results may mean. But practical implementation decisions can be made without such a high bar (and this is what makes my researcher friends mad, because they want everyone to be just like them!). This is just like how lawyers often ask you to stand much further back from the legal line than you need to. Or like a similar debate mathematicians have: if I stand some distance from my wife, then move half way to her, then move half way to her again, and on and on, mathematicians would say (mathematically) I will never reach her (which is true). On the other hand, we all know I would very quickly get close enough for practical purposes! 😉
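The halving example is easy to check numerically. Here is a toy sketch (the starting distance and "close enough" threshold are arbitrary assumptions, just for illustration):

```python
# Toy illustration of the "move half way" debate: the remaining
# distance never reaches exactly zero, but it drops below any
# practical threshold in just a few steps.
distance = 10.0  # starting distance in feet (arbitrary)
steps = 0
while distance > 0.1:  # "close enough for practical purposes"
    distance /= 2      # move half way
    steps += 1
print(steps, round(distance, 4))  # prints: 7 0.0781
```

Mathematically the distance is always positive, but after just seven halvings it is under a tenth of a foot: close enough for practical purposes.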

Justin is very correct in his analysis of our research from a researcher’s perspective. But I believe that researchers and practitioners can, very appropriately, draw different conclusions from the findings. I also believe that both practitioners and researchers can overstate conclusions from examining the results.

I would wish (respectfully) that Justin might occasionally say in his writing, “from a researcher’s perspective…” If he lives in a researcher world, perhaps he doesn’t even notice this, or thinks it implied or redundant. But his blog is admittedly not for an audience of researchers, but rather for an audience of educators who need help making sense of research.

Reacting to a Lay Blog as a Researcher
I think Justin has a good researcher head on him and is providing a service to educators by analyzing education research and offering his critique. I’m a little concerned that some of his critique was directed at Audrey’s post rather than directly at our research summary. Audrey is not a researcher. She’s an excellent education technology journalist. I think her coverage was pretty on target. But it was based on interviews with the researchers, Damian Bebell (one of the leading researchers on 1to1 learning with technology), Sue Dorris, and me, not a researcher’s review of our published findings. At one point, Justin suggests that Audrey is responding to a graph in our research summary (as if she were a researcher). I would suggest she is responding to conversations with Damian, Sue, and me (as if she were a journalist). It is a major fallacy to think everyone should be a researcher, or think and analyze like one (just as it is a fallacy that we all should think or act from any one perspective, including as teachers, or parents, etc.). And it is important to consider an individual’s context in how we respond to them. Different contexts warrant different kinds of responses and reactions.

Was It the iPads, or Was It Our Initiative?
Folks, including Audrey, asked how we knew what portion of our results came from the iPads and what portion from the professional development, etc. Our response is that it is all these things together. The lesson we learned from MLTI, the Maine Learning Technology Initiative, Maine’s statewide learning with laptop initiative that has been successfully implemented for more than a decade, is that these initiatives are not about a device, but about a systemic learning initiative with many moving parts. We have been using the Lead4Change model to help ensure we are taking a systemic approach and attending to the various parts and components.

That said, Justin is correct to point out that, from a research (and statistical) perspective, our study examined the impact of the iPad alone on our students (one group of students had iPads, the other did not).

But for practitioners, especially those who might want to duplicate our initiative and/or our study, it is important to note that, operationally, our study examined the impact of the iPads as we implemented them, which is to say, systemically, including professional development and other components (Lead4Change being one way to approach an initiative systemically).

It is not unreasonable to expect that a district that simply handed out iPads would have a hard time duplicating our results. So although, statistically, it is just the iPads, in practice, it is the iPads as we implemented them, as a systemic initiative.

Statistical Significance and the Issue of “No Difference” in 9 of the 10 Tests
The concept of “proof” is almost nonexistent in the research world. The only way you could truly prove something would be to test every person who might be impacted, in every situation. Instead, researchers have rules for selecting a subset of the entire population, rules for collecting data, and rules for running statistical analyses on those data. Part of why these rules are in place is that, when you are examining only a small subset of your population, you want to control for the possibility that pure chance produced your results.

That’s where “statistical significance” comes in. This is the point at which researchers say, “We are now confident that these results are very unlikely to be explained by chance alone, and can reasonably be attributed to the intervention.” Therefore, researchers have little confidence in results that do not show statistical significance.
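As an illustration only (this is not part of the Auburn study, and the numbers are invented), a quick simulation shows why researchers insist on this safeguard. Even when two small groups are drawn from the exact same population, chance alone produces a gap between their average scores; a permutation test asks how often shuffling the group labels produces a gap at least that large:

```python
import random
import statistics

random.seed(42)  # fixed seed so the illustration is reproducible

# Two hypothetical groups of 15 students drawn from the SAME population,
# so any gap between their mean scores is due to chance alone.
population_mean, population_sd = 100, 15
group_a = [random.gauss(population_mean, population_sd) for _ in range(15)]
group_b = [random.gauss(population_mean, population_sd) for _ in range(15)]

observed_gap = statistics.mean(group_a) - statistics.mean(group_b)

# Permutation test: shuffle the labels many times and count how often a
# gap at least this large appears by chance. A small proportion (commonly
# below 0.05) is what researchers call "statistically significant."
scores = group_a + group_b
trials = 10_000
extreme = 0
for _ in range(trials):
    random.shuffle(scores)
    gap = statistics.mean(scores[:15]) - statistics.mean(scores[15:])
    if abs(gap) >= abs(observed_gap):
        extreme += 1

p_value = extreme / trials
print(f"observed gap: {observed_gap:.2f}  p-value: {p_value:.3f}")
```

Because both groups come from the same population here, the gap is pure noise, which is exactly the possibility a significance test is designed to rule out before crediting an intervention.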

Justin is right to say, from a researcher’s perspective, that a researcher should treat the 9 measures that were not statistically significant as if there were no difference in the results.

But that slightly overstates the case to the rest of the world, who are not researchers. For the rest of us, the one thing that is accurate to say about those 9 measures is that the results could be explained either by the intervention or by chance. It is not accurate for someone to conclude (and this is not what Justin wrote) that there is no positive impact from our program, or that there is no evidence the program works. It is accurate to say we are unsure of the role chance played in those results.

This comes back to the idea about how researchers and practitioners can and should view data analyses differently. When noticing that the nine measures trended positive, the researcher should warn, “inconclusive!”

It is not on a practitioner, however, to make all decisions based solely on whether data are conclusive. If that were true, there would be no innovation (because there is never conclusive evidence that a new idea works before someone tries it). A practitioner should look at this from the perspective of making informed decisions, not of requiring conclusive proof. “Inconclusive” is very different from “you shouldn’t do it.” For a practitioner, the fact that all measures trended positive is itself information to consider, side by side with whether those trends are conclusive.

“This research does not show sufficient impact of the initiative” is as overstated from a statistical perspective as “We have proof this works” is from a decision-maker’s perspective.

We don’t pretend to have proof our program works. What is not overstated, however, and what Auburn has stated since we shared our findings, are these appropriate conclusions from our study: Researchers should conclude we need more research. But the community should conclude that we have shown modest positive evidence of iPads extending our teachers’ impact on students’ literacy development, and should take this as a sign we are good to continue our program, including into 1st grade.

We also think it suggests that other districts should consider implementing their own thoughtfully designed iPads-for-learning initiatives.

More News on Auburn’s iPad Research Results

The other day, I blogged about our Phase 1 research results on the impact of Advantage 2014, our literacy and math initiative that includes 1to1 iPads in kindergarten. Now the press and blogosphere are starting to report on it, too.

Auburn’s press release, the research summary, and slides from the School Committee presentation are here.

It’s Your Turn:

Have you found press about this elsewhere? Please share!

Confirmed: iPads Extend a Teacher’s Impact on Kindergarten Literacy

I’m excited! I’m REALLY excited!

Our “Phase I” research results are in…

iPads in Kindergarten

We (Auburn School Department) took a big risk last May when we started down the path to have the first 1to1 kindergarten learning with iPads initiative. We had confidence it would help us improve our literacy and math proficiency rates. One of our literacy specialists had used her own iPad with students to great success (one of the big reasons we moved forward). But there were also segments of the community that thought we were crazy.

Now we have pretty good evidence it works!

We did something not a lot of districts do: a randomized control trial. We randomly selected half our kindergarten classrooms to get iPads in September. The other half used traditional methods until December, when they received their iPads. We used our regular kindergarten literacy screening tools (CPAA, Rigby, Observation Survey) for the pre-test and post-test. And across the board, the results trended positive for the iPad classrooms, with one area reaching statistical significance.
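For readers curious what random assignment looks like in practice, here is a minimal sketch of the kind of coin-flip assignment behind a design like this. The classroom labels and the count of sixteen classrooms are hypothetical, purely for illustration, not Auburn’s actual rosters or procedure:

```python
import random

random.seed(7)  # fixed seed so the assignment is reproducible

# Hypothetical classroom labels -- any identifiers would do.
classrooms = [f"Classroom {i}" for i in range(1, 17)]

# Shuffle, then split in half: the first half receives iPads in
# September (treatment group); the second half continues with
# traditional methods until December (delayed/control group).
shuffled = random.sample(classrooms, k=len(classrooms))
half = len(classrooms) // 2
september_ipads = sorted(shuffled[:half])
december_ipads = sorted(shuffled[half:])

print("September iPads:", september_ipads)
print("December iPads: ", december_ipads)
```

Because every classroom has an equal chance of landing in either group, differences between the groups at post-test can be attributed to the intervention rather than to how classrooms were chosen.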

These results are a strong indication that the iPad and its apps extend the impact our teachers have on our students’ literacy development. We definitely need more research (and will be continuing the study through the year, including comparing this year’s results to past years’), but these results should be more than enough evidence to address the community’s question, “How do we know this works?”

And I’m especially excited that we went all the way to the Gold Standard for education research: randomized control trials. That’s the level of research that can open doors to funding and to policy support.

Why do we think we got these results?

We asked our kindergarten teachers that question. Anyone walking by one of the classrooms can certainly see that student engagement and motivation are up when students are using the iPads. But our kindergarten teachers teased it out further. Because they are engaged, students are practicing longer. They are getting immediate feedback, so they are practicing better. Because we correlate our apps to our curriculum, they are practicing the right stuff. Because we select apps that won’t let students do things just any way, we know the students are practicing the right way. And because students are engaged, teachers are freer to work one-on-one with the students who need extra support at that moment.

We also believe we got the results we got because we have viewed this as an initiative with many moving parts that we are addressing systemically. A reporter asked me, “How do you know how much of these results are the iPad, how much the professional development, and how much the apps?” I responded that it is all those things together, on purpose. We are using a systemic approach that recognizes our success is dependent on, among other things, the technology, choosing apps wisely, training and supporting teachers in a breadth of literacy strategies (including applying the iPad), partnering with people and organizations that have expertise and resources they can share with us, and finding data wherever we can so we can focus on continuous improvement.

And we’re moving forward – with our research, with getting better at math and literacy development in kindergarten, with figuring out how to move this to the first grade.

So. We have what we were looking for:

Confirmation that our vision works.

It’s Your Turn:

What do you think the implications of our research are? What do our findings mean to you?