Media contact

Rachel Gray
Media and Content
0411 987 771
rachel.gray1@unsw.edu.au

People will have a greater understanding of their data, and greater control over how it is used by artificial intelligence technology, if UNSW PhD student Kevin Witzenberger’s plans come to fruition.

The Scientia Scholarship student from UNSW Arts & Social Sciences is conducting this research as part of his PhD titled The Impact of Artificial Intelligence on Education Policy.

Mr Witzenberger says he is looking at the various forms of AI taken up in education, then questioning what they promise, what they deliver and what implications these technologies might have in an educational setting.  

Automated roll calls via facial recognition, online student verification through keystroke analysis, and student dropout or success predictions are just a few examples, he says.

“I want to know what is happening with some of these technologies and what impacts they have on education,” he says. 

At the start of his research, Mr Witzenberger visited the world’s biggest EdTech trade show in London to look at emerging education products featuring AI.

“What really stood out to me was that while these systems are very complex, they’re often based on simple ideas and are not always appropriate for education,” he says.

He gives the example of an AI program he once saw that gave the teacher a read-out of students’ emotions in the classroom based on facial recognition technology. 

“But what value does that really add?” he asks.  “I am critical of a machine that says a student is 67.8% happy.”

“We should not see these numbers as complete truths when they come from a machine. We should instead be asking what is the value in something like an emotional score,” he says.

“Because aren’t teachers better placed to deal with the emotional wellbeing of their students than a camera paired with an algorithm?”

Another area of concern for AI in education, Mr Witzenberger says, is the potential for bias in automated systems. 

“Machine learning is the part of AI that learns to make its own rules and decisions, or to make predictions,” he says.

“So, if I want to identify faces in the classroom, then I’d have to feed it a lot of faces that I choose first, which will then determine the way the system identifies people.”

Sadly, this can often be discriminatory, he says. 

Mr Witzenberger says one example of this is when Amazon used an AI algorithm in their recruiting process and later discovered it screened out female applicants.

“They tried to build an AI that determined employability based on the CVs people sent in. But the definition of employability was built around only the previously successful applicants,” he says.  

And because they had hired more men in the past, it became a very sexist form of screening, he says. 
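The dynamic Mr Witzenberger describes can be sketched in a few lines of Python. This is a toy illustration, not Amazon’s actual system: the CV snippets, vocabulary and scoring rule below are invented for the example, but they show how a screener “trained” only on past hiring outcomes inherits the skew of those outcomes.

```python
# Toy illustration of training-data bias (hypothetical data, not a real system).
# A naive screener weights each word by how often it appears in past hires
# versus past rejects. If past hires skew male, words merely correlated with
# women pick up negative weight, even when qualifications are identical.

from collections import Counter

past_hires = [
    "football captain engineering degree",
    "engineering degree rugby club",
    "football engineering internship",
]
past_rejects = [
    "netball captain engineering degree",
    "engineering degree netball club",
]

def word_weights(hires, rejects):
    """Weight each word by (count in hired CVs) minus (count in rejected CVs)."""
    hire_counts = Counter(w for cv in hires for w in cv.split())
    reject_counts = Counter(w for cv in rejects for w in cv.split())
    words = set(hire_counts) | set(reject_counts)
    return {w: hire_counts[w] - reject_counts[w] for w in words}

def score(cv, weights):
    """Sum the learned weights of the words in a CV."""
    return sum(weights.get(w, 0) for w in cv.split())

weights = word_weights(past_hires, past_rejects)

# Two equally qualified applicants; only the sport differs.
print(score("engineering degree football", weights))  # positive score
print(score("engineering degree netball", weights))   # negative score
```

The screener never sees gender directly; it penalises “netball” simply because past rejected applicants mentioned it, which is exactly how historical skew becomes an automated rule.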

“Looking at these issues shows us the problems we need to avoid in education,” he says. “Therefore, the development of these technologies also needs to be monitored.”

Mr Witzenberger says we would not want a system that steers female students away from STEMM subjects just because there have been more male enrolments in the past.

“I'm not always trying to look at just the negative side of AI, but it's just that there’s so much of it.” 

Mr Witzenberger says he hopes this research will give policy makers and educators greater awareness over the AI technology they choose. 

For the project component of his PhD, Mr Witzenberger intends to develop a prototype that would give students more control over the way their data is analysed, a process that is otherwise often concealed in a black box.

He says commercial organisations tend to hide how decisions are made behind a black box of intellectual property, which means users don’t really know how their data is being used.

“And the owners of these black boxes do not always want the user to know what data they’re actually using,” he says. “It’s all supposed to operate a little bit under your consciousness.”

Mr Witzenberger says he wants to build a prototype with students that is open, consensual and free of invasive features.

“AI as we know it in education is deeply embedded into the context of powerful technology companies such as Google, Microsoft and Amazon,” he says.

For this reason, his research will focus on rethinking “some of the recent trends around the datafication and digitalisation of education by giving the power back to the user.”