Curmudgucation: Big Brother Knows What's in Your Heart
Well, this is creepy.
Before the pandemic, Ka Tim Chu, teacher and vice principal of Hong Kong's True Light College, looked at his students' faces to gauge how they were responding to classwork. Now, with most of his lessons online, technology is helping Chu to read the room. An AI-powered learning platform monitors his students' emotions as they study at home.
The software is called 4 Little Trees, and the CNN article only scratches the surface of how creepy it is. So let's work our way down through the levels of creepiness.
4 Little Trees is a product of Find Solution AI, a company founded in 2016. This product appears to be the heart and soul of the company, though their "about us" mission statement is "FSAI consistent vision is to solve the difficulties that the society has been encountered with technology." They might want to look at their placement of "with technology" in that sentence. Anyway, on to 4 Little Trees.
It uses the computer webcam to track the movement of muscles on the student's face to "assess emotions." With magical AI, which means it's a good time for everyone to remember that AI is some version of a pattern-seeking algorithm. AI doesn't grok emotions any more than it actually thinks--in this case, it maps the points it spots on the student's face and compares them to a library of samples. And as with all AI libraries of samples, this one has issues--mainly, racial ones. 4 Little Trees has been "trained" on a library of Chinese faces. The company's founder, Viola Lam, is aware "that more ethnically-mixed communities could be a bigger challenge for the software."
But aren't emotions complicated? The sample image shows the software gets to choose from varying amounts of anger, disgust, fear, happiness, sadness, surprise and neutral. The company calls these "primary" emotions. More complex emotions "like irritation, enthusiasm, or anxiety" are tougher to read, though in a great case of techno-whataboutism, one commenter notes that "human beings are not good at reading facial expressions." So there.
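To see how little "reading emotions" there is under the hood, here's a toy Python sketch of the general pattern-matching approach described above: reduce a face to a vector of landmark coordinates, then find the closest match in a labeled library. This is not the company's actual code--the landmark layout, the sample library, and every number here are invented for illustration.

```python
import numpy as np

# The seven "primary" emotions the software reportedly chooses from.
EMOTIONS = ["anger", "disgust", "fear", "happiness",
            "sadness", "surprise", "neutral"]

rng = np.random.default_rng(0)

# Stand-in "library of samples": 50 vectors per emotion, each 136 numbers
# (think 68 facial points x 2 coordinates). A real system would extract
# these from labeled photos; here they are just random clusters.
library = {emotion: rng.normal(loc=i, scale=1.0, size=(50, 136))
           for i, emotion in enumerate(EMOTIONS)}

def classify(face_vector: np.ndarray) -> str:
    """Nearest-neighbor match: whichever labeled sample sits closest
    (Euclidean distance) decides the 'emotion.' There is no
    understanding here--just distance in a feature space."""
    best_label, best_dist = None, float("inf")
    for label, samples in library.items():
        dist = np.linalg.norm(samples - face_vector, axis=1).min()
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label

# A webcam frame would be boiled down to the same kind of vector.
query = rng.normal(loc=3, scale=1.0, size=136)
print(classify(query))  # should land near the "happiness" cluster (loc=3)
```

The toy also shows where the racial-bias problem comes from: a face that doesn't resemble anything in the library still gets assigned whatever label happens to be nearest, with full confidence and no way to say "I don't know."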
Privacy concerns? Particularly in China, where Big Technobrother is already pioneering all sorts of crazy-creepy techno-surveillance? 4 Little Trees only stores the data points for the different muscle points on the face, not the face itself. Feel better yet? There are 83 schools in Hong Kong using it, at anywhere from $10 to $49 per student.
The applications touted in this article are simple enough. The software can perform emotional surveillance on your remote students, but the company suggests that it could also be useful in a large class, where it's hard for a teacher to catch the face of every student. Basically, just another handy support tool to help teachers better understand where their students are, right?
Well, no. That's the picture in the CNN profile, and that's pretty creepy. But if we go to the actual website for Find Solution AI and 4 Little Trees--okay, I can't actually read any of that one. But it turns out they've got an English-language site as well.
Turns out that emotion detection can be helpful for teachers, but it can also be helpful for eliminating the need for them, because Find Solution AI is in the personalized learning biz.
Here, for instance, is their quick four-step plug for their "Adopting Motivation Model" (a rough sketch of the loop they're describing follows the list):
When learner deals with hard questions...
Our emotion detection technology notices the frustration
The AI technology will analyze the learner's progress and provide them the most suitable question, and give challenges when it is appropriate.
Compare with the traditional way of learning, learner will be self-motivated and learn more effectively!
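Strip away the marketing and those four steps are a simple feedback loop: read an emotion label off the webcam, then move a difficulty dial. A minimal Python sketch of that loop, with every function, threshold, and question bank invented for illustration (the company publishes none of this):

```python
import random

# Hypothetical five-level question bank; the company doesn't say how many
# "suitable questions" it actually holds at each level.
QUESTIONS = {level: [f"level-{level} question #{i}" for i in range(3)]
             for level in range(1, 6)}

def detect_emotion() -> str:
    """Stand-in for the webcam emotion-detection step. A real system
    would return a label like the 'primary' emotions above; here
    it's random."""
    return random.choice(["frustration", "neutral", "happiness"])

def next_question(difficulty: int) -> tuple[str, int]:
    """Steps 2-3 of the pitch: if the detector reports frustration,
    serve an easier question; otherwise, 'give challenges.'"""
    emotion = detect_emotion()
    if emotion == "frustration":
        difficulty = max(1, difficulty - 1)   # back off
    elif emotion == "happiness":
        difficulty = min(5, difficulty + 1)   # push harder
    return random.choice(QUESTIONS[difficulty]), difficulty

difficulty = 3
for _ in range(5):
    question, difficulty = next_question(question := None) if False else next_question(difficulty)
    print(question)
```

Note the failure mode baked into the design: one false "frustration" reading and the software quietly serves easier material, and neither the student nor the teacher ever learns why.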
There's also some noise here about how their model raises "learning efficiency" to 12%. How did they measure that? Lord only knows, but they do mark "tradition learning" as 0% efficient. There is also a promise of Big Data Analytics which, among other things, will "forecast students' learning performance from the data collected, enhance further training to the students on their weaker areas." There appears to be a bunch of earn-a-badge gamification as well.
Basically, Find Solution AI has augmented algorithm-driven education with some AI that is supposed to read your heart. As always, there are many, many, many questionable links in the chain. Is the education programming itself valid and effective? How deep is the software's ability to adapt to students (when it's time to present the next exercise, are there dozens, hundreds, or thousands of possibilities)? Is there any basis for believing the software can really read human emotions? And how will humans behave when they realize that software is trying to read their hearts by the expressions on their faces?
How do you feel about this? Lean in a little closer to your screen, so that Big Brother can better read your heart.