
Invasive New Education Technology Turns Students Into Lab Rats

A teacher conducts a class at a public U.S. elementary school. | (Photo: Reuters)

As education-technology lobbyists persuade policy-makers that "personalized learning" is the golden ticket to academic achievement, parents must understand exactly what this concept is and what it means for children. What they discover should trigger a rebellion.

As we've explained (see here, here, and here, for example), personalized learning means using technology to deliver instruction based on the "needs" of the student. A child will be propped in front of a screen and directed to interact with a sophisticated software program that records thousands of behavioral data points. His physiological data can be harvested via the "Internet of Things," which refers to a connected network of devices, such as wristbands, that deliver physiological readings to a smartphone or computer.

All this psychological and physiological data will be fed into an algorithm that not only analyzes how the child's brain works but predicts what he'll do in the future. None of this is explained to parents when they're told "personalized learning" is the wave of the future.


Ed-tech wizards offer a plethora of untested but supposedly transformational plans. The latest comes from Massachusetts-based BrainCo, Inc., which promises a headband that students can wear during lessons to transmit electroencephalography (EEG) data to teachers. This data will provide real-time information on how well the kids are paying attention. Seriously.

An EdSurge.com article on the BrainCo scheme illuminates multiple problems – imprecision of the EEG data, teachers' lack of training to interpret it, and undeveloped or nonexistent privacy policies, among others. But those gaping holes in the plan haven't stopped investors from China (appropriately, given China's affinity for invasive surveillance) and even a Harvard dean from signing on. Wave of the future.

BrainCo is part of a much bigger effort. An alliance between the University of Southern California's schools of education and engineering also plans to connect children to devices and analyze them like lab rats. The USC project, called the Center for Human-Applied Reasoning and the Internet of Things, or CHARIOT, is described as a "moonshot."

The federal government advocates peering into children's psyches, having endorsed the idea almost six years ago. The feds celebrate devices such as facial-expression cameras, wireless skin sensors, and eye-trackers to measure how "engaged" students are with their lessons. According to USC's press release, this is also CHARIOT's focus: "An essential aspect of personalized learning . . . is being able to evaluate such things as the level of student engagement and emotions."

"Using wearables and other sensors to collect physiological and cognitive data that can be analyzed and interpreted to measure mental effort," the press release enthuses, "provides opportunities to use artificial intelligence, or AI" to show teachers and students how best to present and learn information. So now the government school won't measure just knowledge but also "mental effort."

For example, the press release explains, "as a student does a reading assignment, eye-tracking technology might tell a teacher when the student is losing focus . . . ." Maybe the technology can be enhanced to give him an electric jolt to get him back on track?

But the techsters plan to use a positive approach as well. "AI technology will also be used to provide students with additional assistance with content, learning strategies and motivation messages." All of that was formerly provided by something called "teachers," but maybe that concept, relying as it does on human contact rather than machines, is too 20th-century.

CHARIOT's directors believe that with "pervasive" sensors, "there are currently unforeseen ways by which data can be gathered, interpreted and deployed . . . ." No kidding. When this intensely personal and possibly erroneous data is collected on children – with minimal restrictions on how it can be used or to whom it can be disclosed – the "unforeseen" possibilities are frightening.

Mathematician and data scientist Cathy O'Neil warns about the dangers posed by feeding this data into algorithms that may influence a child's future. Education technocracy "raises some serious privacy concerns – particularly if you consider that it can involve tracking kids' every click, keystroke and backspace from kindergarten on."

O'Neil asks about the late bloomer who couldn't read until third grade, or the 10th-grader whose "persistence" score droops on the software. "Should colleges," she asks, "have access to that information in making their admissions decisions?" And remember the ill-fated inBloom, a data scheme that, until derailed by privacy advocates, included tags identifying children as "tardy" or "autistic." How long do those tags last? Forever?

As O'Neil warns, "tech moguls are conducting an enormous experiment on the nation's children," without parental understanding or consent. Policy-makers should pull the plug.

Emmett McGroarty and Jane Robbins are senior fellows at American Principles Project.
