
Dancey applies best practices. For example, after a lecture or discussion, she asks students to identify their “muddy spot” — a concept or idea they are still struggling with. “I ask them to write it down, share it and we’ll address it as a class for the benefit of all,” she said.

But Intel and Classroom Technologies, which sells virtual-school software called Class, think there might be a better way. The companies have teamed up to integrate an Intel-developed, AI-based technology with Class, which runs on Zoom. Intel claims its system can tell whether students are bored, distracted, or confused by rating their facial expressions and how they interact with educational content.

“We can give the teacher additional insights so they can communicate better,” said Michael Chasen, co-founder and CEO of Classroom Technologies, who said teachers have struggled to connect with students in virtual classrooms during the pandemic.

His company plans to test Intel’s Student Engagement Analytics technology, which captures images of students’ faces using a computer camera and computer-vision technology and combines them with contextual information about what a student is working on to assess the student’s level of understanding. Intel hopes to turn the technology into a product it can distribute more widely, said Sinem Aslan, a research scientist at Intel who helped develop it.
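As a rough illustration of how such a pipeline could fuse the two kinds of signal, here is a minimal sketch. Intel has not published its implementation, so every name, score, weight, and threshold below is an assumption for illustration only, not Intel’s actual system.

```python
from dataclasses import dataclass

# Hypothetical sketch only: Intel has not published this pipeline. All
# names, scores, and thresholds below are illustrative assumptions.

@dataclass
class FrameSignals:
    boredom: float      # 0-1 score from a hypothetical expression classifier
    confusion: float    # 0-1
    distraction: float  # 0-1

def engagement_estimate(frames: list[FrameSignals], seconds_on_task: float) -> str:
    """Fuse per-frame expression scores with simple activity context."""
    if not frames:
        return "no-signal"  # e.g. camera off: nothing to infer from
    avg_confusion = sum(f.confusion for f in frames) / len(frames)
    avg_boredom = sum(f.boredom for f in frames) / len(frames)
    # Context matters: lingering on one task while showing high confusion
    # is the kind of pattern the article says would flag "needs help."
    if avg_confusion > 0.6 and seconds_on_task > 120:
        return "may-need-help"
    if avg_boredom > 0.7:
        return "disengaged"
    return "engaged"

if __name__ == "__main__":
    frames = [FrameSignals(0.2, 0.8, 0.1), FrameSignals(0.3, 0.7, 0.2)]
    print(engagement_estimate(frames, seconds_on_task=300))  # -> may-need-help
```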

“We’re trying to enable one-on-one tutoring at scale,” Aslan said, adding that the system is designed to help teachers recognize when students need help and to inform how they might change course materials based on how students interact with the lesson content. “A high level of boredom will lead [students] to tune out of educational content entirely,” Aslan said.

However, critics argue that it is not possible to accurately determine whether someone is feeling bored, confused, happy, or sad based on facial expressions or other external cues.

Some researchers have found that reducing a person’s state to a single label is an inappropriate approach, because people express themselves through tens or hundreds of subtle and complex facial expressions, body gestures, or physiological cues. Other research shows that people communicate emotions like anger, fear, and surprise in ways that differ across cultures and situations, and that how they express emotion can vary at an individual level.

“Students have different ways of showing what’s going on inside them,” said Todd Richmond, a longtime educator, director of the Tech and Narrative Lab and a professor at the Pardee RAND Graduate School. Being distracted at a given moment may be the appropriate and necessary state for a student at that point in their life, he said, when they are dealing with personal issues, for example.

Controversial emotion AI invades everyday technology

The classroom is just one arena where controversial “emotion AI” is finding its way into everyday tech products and sparking investor interest. The technology is also making its way into delivery and passenger vehicles and into virtual sales and customer-service software. Following Protocol’s report last week on the use of this technology in sales pitches, Fight for the Future launched a campaign urging Zoom not to include it in the company’s near-ubiquitous video-conferencing software.

At this early stage, it’s not clear how Intel’s technology will be integrated into the Class software, said Chasen, who expects the company to work with one of the colleges it already serves to evaluate the Intel system. Chasen told Protocol that Classroom Technologies isn’t paying Intel to test the technology. Class is backed by investors including NFL quarterback Tom Brady, AOL co-founder Steve Case and Salesforce Ventures.

Intel has established partnerships to help spread other emerging forms of AI it has developed. For example, the company has partnered with Purdue University and the soccer scouting app AiScout in hopes of developing a system that turns data visualizing joint and skeletal movement into analytics for monitoring and improving athletic performance.

Educators and advocacy groups have raised alarms about excessive student surveillance and invasions of privacy related to facial recognition, which is used in schools for identification and security purposes. These concerns have accelerated as AI-based software has been used more than ever during the pandemic, including technologies that monitor student behavior in hopes of preventing cheating during virtual tests and systems that track the content students view on their laptops to identify whether they are at risk of self-harm.

Class already tracks how many times students raise their hands during a session and offers a “supervision” feature that teachers can use to monitor what students see on their computers if students agree to share their desktop screen with teachers.

“I think we have to be very sensitive to people’s privacy rights and not be overly intrusive with these systems,” Chasen said.

Cameras as a social justice issue

As virtual teaching has become the norm in recent years, a debate has arisen among educators as to whether or not students need to turn on their cameras during class. Today, cameras are optional in Dancey’s English program, in part because students can communicate with teachers in virtual environments using their microphones or via chat.

But to capture students’ facial expressions, Intel’s technology would need to have those cameras turned on.

“The thing about turning on cameras almost became a social justice issue,” Dancey said. Not only are some students concerned about others being able to see where or how they live, but keeping the camera on also consumes data, which can be an issue for students who use a mobile hotspot to connect to class, she said.

“It’s kind of an invasion of privacy, and there are accessibility issues because having the camera on uses a huge amount of bandwidth. It could literally cost them money,” Dancey said.


“Students shouldn’t have to control how they look in the classroom,” said Nandita Sampath, a policy analyst at Consumer Reports who focuses on algorithmic bias and accountability, adding that she wondered whether students would have the ability to contest inaccurate results if the Intel system led to negative consequences. “What cognitive and emotional states do these companies say they can assess or predict, and what is the accountability for that?” she said.

Aslan said the goal of Intel’s technology isn’t to monitor or punish students, but to coach teachers and provide additional information so they can better understand when students need help. “We didn’t start this technology as a surveillance system. In fact, we don’t want this technology to be a surveillance system,” Aslan said.

Sampath said Intel’s technology can be used to judge or punish students even when that’s not the intention. “Maybe they don’t intend for that to be the final decision maker, but that doesn’t mean the teacher or administrator can’t use it that way,” she said.

Dancey said teachers were also concerned that surveillance could be used against them. “Often surveillance is used very unfairly against instructors,” she said. “I don’t think it would be paranoid to ask, especially when it comes to measuring ‘student engagement,’ TM, in quotes: if I apply for a promotion or a job, will that be part of my evaluation? Could they say, ‘So-and-so had a low comprehension quotient?’”

When Intel tested the system in a physical classroom, some teachers who participated in the study suggested it provided useful information. “I’ve witnessed how I was able to capture some of the students’ emotional challenges that I didn’t anticipate [before],” said one teacher, according to a document provided by Intel.

But while some teachers found it helpful, Dancey said she wouldn’t want to use the Intel system. “I think most teachers, especially at the university level, would find this technology morally reprehensible, like the panopticon. Honestly, if my institution offered it to me, I would turn it down, and if we had to use it, I would think twice about continuing to work here,” she said.

AI data labeled by psychologists

At this early stage, Intel wants to find the best ways to implement the technology in a way that is most useful for teachers, Aslan said: “How do we make it consistent with what the teacher does on a daily basis?”


Intel developed its adaptive-learning analytics system using data collected from students in real classroom sessions on laptops equipped with 3D cameras. To label the ground-truth data used to train its algorithmic models, the researchers hired psychologists to watch videos of the students and categorize the emotions they detected in their expressions.

“We don’t want to start with assumptions. That’s why we hired the subject matter experts to label the data,” said Nese Alyuz Civitci, a machine learning researcher at Intel. The researchers used data only when at least two out of three labelers agreed on how a student’s expression should be categorized.
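As a hedged sketch of that filtering rule (the actual labeling pipeline is not public), keeping only the clips where at least two of three annotators agree might look like this; the clip names and labels are made up for illustration:

```python
from collections import Counter

# Illustrative sketch of the "2 of 3 labelers must agree" filter described
# above; the clips and labels are invented, not Intel's actual data.

def majority_label(labels):
    """Return the label at least two of three annotators chose, else None."""
    label, count = Counter(labels).most_common(1)[0]
    return label if count >= 2 else None

annotations = {
    "clip_01": ["bored", "bored", "engaged"],     # kept: 2/3 agree on "bored"
    "clip_02": ["bored", "confused", "engaged"],  # discarded: no majority
}

ground_truth = {
    clip: label
    for clip, labels in annotations.items()
    if (label := majority_label(labels)) is not None
}
print(ground_truth)  # {'clip_01': 'bored'}
```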

“It was really interesting to see those emotions — the states are really subtle, they’re really tiny differences,” Civitci said. “It was really hard for me to see those differences.”

Rather than judging Intel’s AI models on whether they accurately reflect students’ actual emotions, the researchers evaluated them on how instrumental they are and how much a teacher can trust them, Aslan said.

“I don’t think it’s a technology that’s fully matured yet,” Chasen said of Intel’s system. “We need to see whether the results are relevant to student performance and whether we can extract data from them that’s useful for teachers. That’s what we’re testing to find out.”

Ultimately, he said, the Intel system will provide an element of data that Classroom Technologies and its customers will combine with other signals to create a holistic assessment of students.

“There’s never a single piece of data,” he said. He also suggested that the information surfaced by Intel’s technology should not be used on its own, without context, to assess a student’s performance, such as “when the AI says they’re not paying attention and they all have an A.”
