3 lessons from data on how students are actually using educational apps and software at school

Teachers and students aren't using all the pricey software that school budgets buy, researchers say

By: Jill Barshay
Original Post from Hechinger Report

BrightBytes Inc. is a for-profit company that sells data analysis to public schools. One of its products monitors which websites students visit and which apps they open on their tablets. The company’s marketing pitch is that it can tell school administrators which educational software is actually being used, how much they’re spending on it and whether the ed tech they’re buying is boosting student performance — the education sector’s version of “return on investment.”

It’s not perfect. A lot of computer usage isn’t captured, especially at home. Higher test scores could be caused by things other than the online software, such as great teaching. Despite these drawbacks, the company has an interesting repository of technology usage data from roughly 400,000 students, kindergarten through high school, across 26 states. (Yes, even kindergarteners are using apps at school.) The company hired Ryan Baker, director of the Penn Center for Learning Analytics, and another data scientist to mine the data and create a national snapshot of technology use for the 2017-18 school year. A report was released in November 2018.

Baker began by calculating how much each student improved on standardized assessments between the fall of 2017 and the spring of 2018 in both math and reading. (In addition to the annual state test each spring, many schools administer additional assessments throughout the year to track progress.) The researchers had enough test score data to analyze roughly 150 of the 2,500 education apps in the marketplace.
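In rough outline, that kind of analysis pairs each student’s fall-to-spring score gain with his or her logged usage of each app and looks for a correlation. Here is a minimal sketch in Python; the data files, column names and sample-size cutoff are hypothetical placeholders, not BrightBytes’s actual schema or code.

```python
# Minimal sketch of a gain-score analysis like the one described above.
# The CSV files and column names are hypothetical, not BrightBytes's schema.
import pandas as pd
from scipy.stats import pearsonr

scores = pd.read_csv("scores.csv")  # student_id, fall_math, spring_math
usage = pd.read_csv("usage.csv")    # student_id, app, minutes, days_used

# Fall-to-spring improvement for each student.
scores["math_gain"] = scores["spring_math"] - scores["fall_math"]

# Attach each student's gain to their per-app usage totals.
merged = usage.merge(scores[["student_id", "math_gain"]], on="student_id")

# For each app, correlate total minutes of use with score gains.
for app, group in merged.groupby("app"):
    if len(group) < 30:  # skip apps with too few users to say anything
        continue
    r, p = pearsonr(group["minutes"], group["math_gain"])
    print(f"{app}: r = {r:.3f} (p = {p:.3g}, n = {len(group)})")
```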

Here are the takeaways:

1. Most software is drastically underused by schools

Although some apps are designed to be used daily or for many minutes each week, most aren’t used very often. Even the most intensely used app in the study, Carnegie Learning’s digitalACE, was used on fewer than 32 days and for a total of 804 minutes, on average. Spread over a roughly 36-week school year, that works out to about 22 minutes a week — less than a half hour. The vast majority of the apps were used for fewer than seven days during the school year and for less than 200 minutes in total.

Another way of expressing underuse is to look at the software licenses that schools buy. (Sometimes each student needs a license, but often multiple students can share one.) Some 70 percent of the licenses schools purchased weren’t used by anyone, Baker found. Among the licenses that were used, most were used for fewer than 10 hours during the school year.

“Talk to teachers,” said Baker. “Pay attention to what your teachers are actually using. I think there’s a lot of cases where someone in the district thinks it’s a good idea and so they buy it for everybody. And most of the teachers don’t want anything to do with it.”

Baker says teachers are “smart” not to assign software to students if they themselves haven’t received enough training on how to use it well.

For some apps, however, Baker found that more licenses were used than the school purchased. That’s an indication that teachers are independently selecting their own apps and assigning free versions to students, even though the school hasn’t purchased premium access.

Calculating financial waste is tricky. The price of licenses ranged from 14 cents to $367, and schools often buy many of them. Some of the most expensive ones were purchased by high schools for credit recovery, which gives students a second chance to pass classes they failed, and to provide Advanced Placement courses not offered at the school.

2. More upside potential for math, less in reading

The researchers found a correlation between rising math scores and more time spent on the software, but the correlation was tiny. Among the sites or apps showing the strongest correlations between usage and math scores were ALEKS, Wikipedia, LearnZillion, DreamBox, Seesaw and Starfall. However, certain online programs were conspicuously missing from BrightBytes’s list, such as ASSISTments, a free math program that has performed well in randomized controlled trials. It’s designed for homework, but BrightBytes’s monitoring technology primarily captured activity at schools.

In reading, a positive association between online activity and learning improvements was less common. Indeed, when the researchers compared reading test score gains across all the apps students used, there was no overall correlation at all. Students posted similar reading gains regardless of how much time they spent learning online. That echoes more rigorous scientific research, which has consistently found better outcomes for some math software but not for reading software.

It would be a mistake to conclude from this study that online software is producing any test score gains. The kids who are assigned to use software more might be in classrooms with better teachers, and it could be the human teachers who are producing the learning gains, not the apps and websites. It’s also possible that the kinds of students who use educational software the most are more motivated learners and would have had higher test scores even without the software. A study that compared these test score gains with those of similar kids who didn’t use the software would be more conclusive. This study didn’t do that.

However, correlations like these send out important signals. “The fact that we’re not seeing a lot of correlations is a sign that the systems aren’t being used effectively or they’re not effective,” said Baker. “There are a lot of systems out there that do well in controlled settings, but they don’t do so well in the real world because of issues like teacher training or teachers choosing not to use the system.”

Not all reading apps were useless. Of the more than 100 apps and sites analyzed, some were associated with higher reading scores. Among the top ones were Varsity Tutors, LearnZillion, Wikipedia, Brainingcamp, Google Classroom and TED-Ed. In the case of LearnZillion, higher test score gains were associated with both the number of days students logged in and the total time spent on the app.

Sometimes frequency mattered more than minutes. For example, students who visited Varsity Tutors more frequently had higher reading scores, but the amount of time they spent on the app, which connects students to live tutors over video for one-on-one sessions, didn’t matter. With other apps, the pattern flipped: frequent visits were sometimes associated with lower test scores, while spending a lot of time on a particular topic seemed to be beneficial.
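Disentangling the two measures is straightforward in principle: put days of use and total minutes into the same regression and see which predictor carries the weight. Below is a minimal sketch under that assumption; the file and column names are hypothetical, and this is not the study’s actual code.

```python
# Hypothetical sketch separating frequency (days of use) from duration
# (total minutes) as predictors of reading gains for a single app.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("one_app_usage.csv")  # reading_gain, days_used, minutes

# Standardize both predictors so their coefficients are directly comparable.
X = df[["days_used", "minutes"]]
X = (X - X.mean()) / X.std()
X = sm.add_constant(X)

model = sm.OLS(df["reading_gain"], X).fit()
print(model.params)   # a Varsity Tutors-like pattern would show a sizable
print(model.pvalues)  # coefficient on days_used and a near-zero one on minutes
```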

3. Wikipedia pops to the top

Note that the lists of top apps include several free ones. That too confirms other research, which has found that cheap tools can be effective. But it was odd to see Wikipedia listed among the top three apps for both reading and math achievement. Perhaps it’s a sign that educational software is so ineffective that even a crowd-sourced encyclopedia can do better!

“The second biggest surprise of the whole investigation was how well Wikipedia did,” said Baker. “The word ‘app’ is a misnomer in this case. Wikipedia has a lot of mathematics definitions.”

Students might be looking up terms they don’t understand during a math lecture. Some Wikipedia entries include examples of how to do calculations, such as adding fractions or figuring out the slope of a line. But the explanations are extremely sophisticated, quickly heading into college-level math, so it’s likely that the brightest students are best able to take advantage of them.

It’s another example of how technology use at schools might be helping the best and the brightest to surge ahead. That’s good for motivated students, but it could also increase the achievement gap.