EDITORIAL: Alaska’s education commissioner’s AI blunder has lessons for us

It was a scenario worthy of a TV sitcom: In making a case to the state Board of Education for cell phone restrictions in Alaska schools, state education commissioner and former Anchorage School District superintendent Deena Bishop leaned heavily on an AI text generator – and failed to remove the fabricated quotes it added to support her arguments. If she had been a high school student, Bishop would have received an F on the assignment and a stern lecture about doing her own work. It is embarrassing that our top education official lacks a working knowledge of the proper use of AI, and we should not send our students out into the world similarly unequipped.

The AI debacle was doubly unfortunate because it distracted from two more valuable discussions we should be having about education and technology. The first is the issue Bishop enlisted AI to help address: cell phone restrictions in schools. It’s ironic that the quotes hallucinated by Bishop’s AI helper were fake, because there is plenty of real-world data indicating that cell phone restrictions in schools are good for student success and social-emotional well-being. Banning cell phone use on school grounds is strongly correlated with higher math scores and is widely supported by teachers who see firsthand how phones distract their students. The State Board of Education should not let Bishop’s misstep distract from the serious issue at hand – and from the opportunity to undo some of the distractions that have crept into the classroom.

The other unfortunate aspect of Bishop’s citation fabrication is that it reveals a lack of maturity in how we use artificial intelligence – even at the highest levels of our government. While the temptation to impose a blanket ban on AI in schoolwork has been great, this is not a technology that will go away. On the contrary, we should expect it to become more deeply embedded in daily life in the years to come.

With that in mind, the solution may not lie in imposing some kind of monastic moratorium on the technology, but in carefully integrating it into the curriculum and teaching students how to use it responsibly. In the face of such a groundbreaking development, the impulse to panic is powerful, and – especially in schools – we are wary of doing things differently from the way we were taught. But just as calculators did not stop students from learning math, language and image generation tools, used wisely, will not stop students from thinking critically.

It is incumbent upon us, as parents and educators, to think of ways in which AI can be a valuable learning tool rather than a crutch used merely to save time and effort. Consider how a student using a chatbot as a partner in Socratic dialogue about a lesson topic might reach insights that would otherwise be out of reach, given the limits on a teacher’s time in any one class period.

The road between where we are today and the point where AI is seamlessly integrated into our society will certainly be bumpy, but it will only get bumpier if we don’t focus on using our technological tools properly. We need to think carefully about the ways we ask AI to help us, making sure we don’t simply hand our work over to it, but instead use its capabilities to broaden our own horizons – synthesizing information we might not otherwise encounter and treating its output as a springboard to solve our problems creatively, a valuable human skill.

And whether the person using AI is a student drafting an outline for an essay or an education commissioner briefing the state school board on policy, we’d be wise to double-check what it tells us, lest we be embarrassed by our naive confidence that the friendly machine spitting out suggestions would never lead us astray. After all, who among us has never been told by a GPS driving assistant to take a road that didn’t exist?