Will AI change what it means to be human?


Studying the role of artificial intelligence in educational technology raises fundamental questions about what it means to be human

Artificial intelligence technologies are quickly changing the way we teach and learn, and they may also be changing something much more fundamental: who we are as humans.

These questions are important areas of research for Dr. Jon Dron, a professor in the School of Computing and Information Systems in Athabasca University’s Faculty of Science and Technology. He has spent most of his academic career focused on education, technology, and educational technology.

Early in his career, this work included developing algorithms for collective-intelligence applications. For example, an algorithm might tell a user, "If you like this book, then you might also like these other books." This is essentially what generative AI like ChatGPT does today.
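As a rough illustration of that collective-intelligence idea (a generic sketch, not Dron’s actual algorithm), the snippet below recommends books by counting which titles were liked together across a few hypothetical reading histories.

```python
# Illustrative sketch only: a minimal "people who liked this also liked..."
# recommender based on co-occurrence counts across hypothetical reading lists.
from collections import Counter

# Hypothetical data: each set holds the books one reader liked.
histories = [
    {"Dune", "Foundation", "Hyperion"},
    {"Dune", "Foundation", "Neuromancer"},
    {"Dune", "Hyperion"},
    {"Neuromancer", "Snow Crash"},
]

def recommend(book: str, histories: list[set[str]], top_n: int = 3) -> list[str]:
    """Return the books most often liked alongside `book`."""
    co_counts: Counter[str] = Counter()
    for liked in histories:
        if book in liked:
            co_counts.update(liked - {book})  # count every co-liked title
    return [title for title, _ in co_counts.most_common(top_n)]

print(recommend("Dune", histories))
# ['Foundation', 'Hyperion', 'Neuromancer'] -- ranked by how often each title
# was liked by the same (hypothetical) readers who liked "Dune".
```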

“What I’ve found interesting with this explosion of stuff in generative AI is that it is exactly this collective intelligence stuff I was doing, but at a massively, massively greater scale and sophistication,” he said. 

The trouble is this is very much not human, but it plays a very human role in the kinds of things we do. And I am in equal parts enthralled and terrified by the possibilities of this.

Dr. Jon Dron, associate dean of learning and assessment, Faculty of Science and Technology

Can AI replace teachers?

Dron’s focus has shifted more recently away from developing algorithms—generative AI does this much more quickly and effectively than a human can—and towards trying to understand how this kind of technology can help us teach and learn. 

It certainly does have its advantages. He said you can ask a large language model like ChatGPT to write a university course on a given topic, with specified learning outcomes and metrics through which to measure those outcomes, and it will do a decent job in a fraction of the time it would take for a human to do that work.  
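As a rough sketch of the kind of request Dron describes, the snippet below asks a model to draft a course outline with learning outcomes and metrics. It assumes the OpenAI Python SDK and an API key in the environment; the topic, prompt wording, and model name are placeholders, not a description of how any course is actually built.

```python
# Illustrative sketch only: prompting a large language model to draft a course
# outline with learning outcomes and assessment metrics.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

prompt = (
    "Draft an outline for a 13-week undergraduate course on introductory "
    "statistics. Include 4-6 learning outcomes and, for each outcome, a "
    "metric (quiz, assignment, or exam) for measuring whether it was met."
)

response = client.chat.completions.create(
    model="gpt-4o",  # placeholder model name
    messages=[{"role": "user", "content": prompt}],
)

print(response.choices[0].message.content)
```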

The problem is that for nearly every technology in human history, the technology is not just the artifact itself. Rather, it is the process of humans using those artifacts to meet their needs.

For example, an alphabet has little value without a human to put the letters in order and create meaning, and likewise a pen and paper have little value without a human using them to create some sort of output. But this has changed with large language models, where the artifact itself is now participating in creating meaning with its responses to different prompts. 

“The trouble is this is very much not human, but it plays a very human role in the kinds of things we do,” Dron said. “And I am in equal parts enthralled and terrified by the possibilities of this.”


Rethinking teaching and learning

For Dron, this development emphasizes the importance of understanding how the process of teaching and learning works, and how best to achieve the outcomes we want from education. He argues that learning simply to demonstrate proficiency against a defined set of educational outcomes, measured through quiz and examination scores, ultimately misses the point of education.

“Education is about creating safe societies, productive societies, and creative societies,” he said. “And it’s about helping us collectively figure out how to be human together.” 

If we judge the value of education by how many students achieve a certain level of results, expressed in percentages and graphs, we are not reaching that goal. And if we start to depend on generative AI for different parts of the process, we drift even further from it.

“It’s really cookie cutter stuff, which is a real pity,” Dron said. “That’s not how teaching really works, and it’s not the best way to be learning.” 

Children’s stories aren’t about learning that giants can be dangerous, and beans can sometimes grow really, really high. They’re about what it means to be human.

Dr. Jon Dron

AI cannot teach us how to be human

Dron points to the example of children’s stories to illustrate the challenge posed by a technology like generative AI producing output, such as writing, that closely mimics what a human can do.

“Children’s stories aren’t about learning that giants can be dangerous, and beans can sometimes grow really, really high. They’re about what it means to be human,” he said. 

“When that is being generated by essentially a collective algorithm, what our kids are learning isn’t quite what it means to be human. It’s about what it means to be an AI.” 

As these generative AI models continue to develop, each learning in part from content that earlier models have produced, their outputs will become less and less human. And as far as Dron can tell, these models will keep growing quickly in sophistication, which means we as end users must be increasingly careful about how we use these tools and how we think about them.

We must keep in mind that we’re not using these tools to do something like write software. Rather, we work with the tools to write the software. 

“It’s an important and subtle distinction,” he said. “In doing that, we maintain a bit of a separation between us and the objects that are changing how we think. And I think that’s a healthier way of thinking about these things.” 

Explore AI with a master's degree

Athabasca University’s Master of Science in Information Systems (MScIS) is a graduate program like no other, with many routes and options to meet your unique educational goals.  

Using AI to personalize learning

AU researcher Dr. Oscar Lin uses AI to explore adaptive learning and delivering a more personalized education experience for online learners.
