Adrian Camm, principal of Westbourne Grammar

Banning AI tools has become the default position of public school systems across the country, with all states and territories except South Australia moving to curb their use over cheating fears and ethical concerns.

But Camm says that is exactly the wrong approach.

“When you talk about pausing, banning and regulating things, it just goes underground. We need understanding; we need education on how to navigate this new environment in safe, effective and ethical ways,” he says.

“If schools aren’t at the forefront of having good conversations with young people, who is?”

This view is shared by Vitomir Kovanovic, Senior Lecturer in Learning Analytics at the University of South Australia. He says the public school system risks leaving kids unfit for the future workforce.

“We need to do more than just block technology,” he says. “What’s the point of doing that when the kids are going to go to jobs that are using AI?

“If students in public schools don’t engage in AI where students in private schools do, they will be at a disadvantage in job opportunities.”

A similar approach was adopted by South Australian Education Minister Blair Boyer, who earlier this year began an eight-week trial of a ChatGPT-style AI bot called “EdChat” in a handful of public schools in the state.

“AI has already become a feature of working life and will continue to be so in the future,” Boyer said in announcing the trial, emphasizing the importance of ensuring that children are equipped with the skills to protect themselves online.

“Without teaching our young people how to use AI in a safe way, we may be doing students a disservice by not preparing them for the jobs of the future.”

The release of ChatGPT in November 2022 put the power of AI in the global spotlight.

Beyond equipping students with workforce skills, Kovanovic says AI tools can add value to teachers and students in the classroom today.

He points to teachers using AI to generate different essay questions for students based on their interests. For example, one question might use cricket as a theme, while another might use a student’s favorite television show.

“That can be a very powerful thing,” he says.
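As a rough illustration of the kind of use Kovanovic describes, the short Python sketch below asks a large language model to rewrite one essay question around different student interests. The OpenAI client, the model name and the prompt wording are assumptions made for this example, not a description of any tool actually used in classrooms.

# Illustrative sketch only: theming one essay question to different student interests.
# Assumes the OpenAI Python client (pip install openai) and an OPENAI_API_KEY in the
# environment; the model name and prompts are placeholders, not a real classroom system.
from openai import OpenAI

client = OpenAI()

BASE_QUESTION = "Explain how statistics can be used to support an argument."
INTERESTS = ["cricket", "a favorite television show"]

def themed_questions(base_question: str, interests: list[str]) -> dict[str, str]:
    """Ask the model to rewrite the same question around each interest."""
    questions = {}
    for interest in interests:
        response = client.chat.completions.create(
            model="gpt-4o-mini",  # placeholder model name
            messages=[
                {"role": "system",
                 "content": "You help teachers rewrite essay questions for students."},
                {"role": "user",
                 "content": f"Rewrite this essay question so it is themed around "
                            f"{interest}, keeping the same learning objective: "
                            f"{base_question}"},
            ],
        )
        questions[interest] = response.choices[0].message.content
    return questions

if __name__ == "__main__":
    for interest, question in themed_questions(BASE_QUESTION, INTERESTS).items():
        print(f"[{interest}] {question}")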

For students, the ability to access AI tutors who can provide near-instant guidance and feedback 24-7 is also seen as a major drawcard, while chatbots are already being used in classrooms to test students’ critical thinking skills.

But while AI’s power to shape education is being hyped by some, others are taking a more measured view. A 2019 report prepared for the federal Education Department says it ultimately boils down to trust.

“How school systems engage with AI will be largely determined by broad public education that can build trust in the technology,” it says.

“In schools, this trust must be founded on the potential of AI-enabled systems for fair and equitable learning opportunities and the well-being of students and their entire school community.

“AI can potentially offer benefits to teachers and students in the form of personalized learning and pedagogical agents designed to deliver appropriate and targeted content and feedback to students.”

Erica Southgate, an associate professor at the University of Newcastle and a co-author of the report, says that while AI has real benefits, there is also a mountain of unresolved issues, both known and potential.

For example, when it comes to delivering appropriate and targeted content and feedback, she says there have been cases where every student at a school using a particular tool has scored 95 percent on its tests.

“You might think the system is doing a great job of teaching kids how to do math, or it could be that the curriculum is too simple,” she says. In other words, the targeted material may not have challenged the students enough.

“What we don’t want to do is disable teachers. They need to be really skilled and able to intervene and speak up when it’s good for learning and for the students,” she says.

Southgate also says teachers and parents should understand that AI is already present in many applications used in schools, and that while generative AI such as chatbots has received a lot of attention, AI in education is not new.

“One of the main issues facing the education sector is that AI already exists and is really bringing to light what digital rights are for students and teachers,” she says.

AI is an extractive technology, she says, working by extracting data, whether it’s what people write or their biometric data, and in many cases people don’t realize what data is being collected.

“How do we use it well, but also how do we challenge the use of it? Is it fair and ethical, and who gets access?” Southgate says.

“All we can do is trust that the education departments and the education institutions are doing the right things, but we don’t really know. There are a lot of complicated questions about who gets access to AI.”

Southgate says a big issue is whether fairness can be automated.

She cites a recent paper by Sandra Wachter, a professor at Oxford University, which “found a serious discrepancy between European notions of discrimination and existing work on algorithmic and automated fairness.”

While AI algorithms are often black boxes, Southgate says they have the potential to discriminate in ways we’ve never imagined.

“The history of AI is full of examples of discrimination and bias, and they’ve really harmed real humans, and they often go undetected because we lack oversight,” she says.

“I think there are a lot of great uses for this technology, but we need to approach it really carefully. We need to have more discussions about accountability, governance and transparency, because it’s not clear what departments are buying and how they’re going to affect (things).”

When state and federal education ministers met earlier this month, they agreed that a national AI task force would consult on a draft framework for schools, which will be considered by the end of the year.

But Westbourne principal Camm is already on board. He has launched an AI academy from Year 5 that he hopes will ensure its students are more than just passive users of the technology.

“We don’t want students to learn how to type prompts in ChatGPT to write an essay,” he says. “We want to say to them: How do we build our own artificial intelligence? How can we ensure that we are thinking ethically when we build artificial intelligence?”

Camm says he wants to inspire philosophical and ethical conversations among tomorrow’s ethicists and futurists because, ultimately, “the opportunities are great” when it comes to AI.
