This is the first of a podcast series dedicated to global perspectives on educational innovation launched by the Centro de Innovación and I₂ + E Clinic at CUCEA, University of Guadalajara, Mexico, in conjunction with MED, Finland.
It argues that any decisions we make on generative AI in education need to include a holistic understanding of real life in and out of school, recognition that education is sometimes dragged down by overfull and outdated curricula and teaching methods, understanding of the pressures students and teachers face, and a commitment to further maximize the role of human interaction, rather than technology, in education.
Transcript:
This podcast is about the coming together of two phenomena.
Education, the tortoise that moves slowly, and generative AI, the cheetah that has been fast for a long time and is now getting faster.
Let’s start with the tortoise. Some educational systems are acclaimed as being effective, but only if we are talking about life in the 19th century. Others are highly acclaimed as fit for the 21st century, but one thing is for sure: it can take a long time to create schools and universities that are both effective today and future-ready.
The problem now is that we don’t have the luxury of time. The world is changing too fast, and something needs to be done. This is not just about the global environmental crisis or geopolitical powerplay, but, mainly, because of the nature of education, it is about the worlds of young people.
Some educationalists say – “surely we can do better” – and now in 2023 they are looking to see if the cheetah, generative AI, could be a tool to make that happen.
We have increasingly been living with what is called Artificial Narrow Intelligence (ANI) since the term artificial intelligence was coined in 1956. Organisations and companies throughout our societies use it for question-and-answer services, often to the frustration of consumers, people like me.
But in November 2022 something happened. A highly publicised type of ANI called ChatGPT became freely available. There was a lot of talk, sometimes almost hysterical, in and outside education.
Some of this talk was not actually about artificial narrow intelligence like ChatGPT but about what is called Artificial General Intelligence (AGI).
AGI, if and when it comes, would involve applications operating with the capacity of highly intelligent humans. It is not here yet.
ChatGPT is one of many available chatbots, such as Bard, Bing, Chatsonic, Jasper, Perplexity, and YouChat. ‘Chat’ means conversation; ‘bot’ means software that imitates a human user’s behaviour.
Some of these chatbots have been around for a long time. A basic version called ELIZA, a conversational interface simulating a psychotherapist, was developed back in 1966 at MIT. But now things are moving fast, and the media have engaged in a dramatic frenzy about AGI, predicting mass unemployment and even the end of life when AI turns against humanity and destroys our world. At one point it was declared that ‘AI chatbots will soon be more intelligent than humans’.
Meanwhile back in the world of education, those people thinking and saying ‘surely we can do better’ were exploring Chatbots like ChatGPT.
To get a perspective on what is happening, it is worth thinking about the recent speed of technological development because it is exponential.
In 2018, the first version of the GPT model behind ChatGPT was built with 117 million parameters. Parameters are, roughly speaking, the internal values a model learns from its training data; more parameters generally mean greater complexity and capacity to learn.
The highly publicised GPT-3, on which the ChatGPT launched in November 2022 was based, involved 175 billion parameters, and only about five months later, in March 2023, a new version, GPT-4, was released, reported (though never confirmed by its makers) to involve an almost unbelievable 100 trillion parameters.
So, what was the talk in education about this form of generative AI as we went into early 2023?
At the start it was heavily reactive, focusing on potential problems and threats, such as predicting a so-called tsunami of cheating amongst students. Interestingly, it did not include cheating by academics and researchers, but that is another story. The mantras were ‘ban it’ – drive it away, punish people for using it, and so on. Then there were others who simply said ‘ignore it, it will pass’.
As the months rolled on, the talk became more proactive, probably because people had had time to dig deeper and understand chatbots better, and it focused on how these chatbots could be tools to enhance both learning and teaching.
Mid-year these more positive and optimistic perspectives extended to thinking beyond teaching and learning, and into student wellbeing and competence-building for future lives.
It included re-conceptualizing our understanding of academic integrity, admitting that there are many tools already available that can give students forms of advantage, and looking to see how AI, present and future, could be harnessed to enhance education by reimagining outdated approaches to teaching, helping teachers with their workload, and enabling them to have more ‘quality time’ to function even more effectively.
By the time you listen to this podcast both AI development and educational discourse on its use may have evolved considerably. However, such discussion is likely to revolve around one key constant.
This constant is seeing that any decisions we make need to include a holistic understanding of real life in and out of school, of sometimes overfull and outdated curricula and teaching methods, of the pressures students and teachers face, and of the need to further maximize the role of human interaction, rather than technology, in education.
This constant is important because learning to live with generative AI is not just about technology, but about humanity. The importance of the teacher, as a person who students believe cares about them, who can be trusted and through this respected, is expanding as new technological developments impact lives, in and out of school.
And in response to the teacher who asks, ‘Am I going to be replaced by AI?’, the answer is no; on the contrary, your long-standing, often undervalued role in society is going to become ever more important. The question now is: should we be using AI intelligently to both maximize that human interaction and change education for the better?
Let’s just think about young people in our societies. Across the world we have a new global culture which is driven by access to technologies, often from a very young age, and a high level of usage.
At the same time, we have major studies reporting rapid increases in levels of anxiety and depression amongst young people.
One study published this year by the US Centers for Disease Control and Prevention reports that the proportion of adolescent females saying they experience persistent sadness or hopelessness was already a high 36% in 2011; by 2021 this had risen to 57%.
The proportion who had seriously considered suicide was 19% in 2011, rising to 30% in 2021. Male adolescent levels are also reported as rising, but the increases are smaller and the rates not as high as for females.
And these latest figures are from 2021 – some two years ago. If the upwards trajectory is continuing, the figures now are likely to be higher. Frankly, the statistics from English-speaking countries are appalling. Maybe they are not found in other countries, but as one researcher, Jonathan Haidt, comments, ‘the arrival of smartphones rewired social life for an entire generation. What did we think would happen to them?’
Looking back at recent history, he points to 2010 as a decisive year. ‘Instagram was founded in 2010. The iPhone 4 was released then too – the first smartphone with a front-facing camera. In 2012 Facebook bought Instagram, and that’s the year the user base exploded’.
And it was around 2010 that health statistics on anxiety, self-harm, depression and worse started rising. If these figures are replicated in other countries, then this means that some 30-50% of adolescents are facing challenges with mental wellbeing.
Put simply, a world of digital connectivity may not be leading to a world in which people experience human connectedness.
When we think now about education and have to make decisions on banning, living with, or ignoring AI like ChatGPT, statistics like these on mental wellbeing need to be part of the picture.
So, what do we see?
We can see that young people are acquiring new and crucial skills from their experience of living with integrated technologies from a very early age. This influences how they process and use information. Significant studies report that use of the Internet can result in acute and sustained changes in cognition, specifically in attentional capacities, memory processes, and social cognition.
We can see that these technologically astute young people are increasingly generating their own contexts for and habits of learning, and that in many countries the speed of this is outpacing how we are responding in education (PISA 2018). So, if they don’t learn the way we teach, maybe we should teach the way they learn. People have said, ‘No, we must maintain our standards.’ But first, how relevant are the standards? And second, as I said earlier, some of these standards may be relics from another century, another reality.
We can see that state-of-the-art curricula fit for this new age include life-centric intended learning outcomes that blend both knowledge and competences.
We can see that generative AI can be used as a powerful tool to effectively operationalize these new curricula in many ways.
One is enabling high levels of personalised learning through diversified learning activities and processes, and through this opening up new learning opportunities, especially for those who have specific learning needs or neurodiverse learning preferences, or who are otherwise marginalized.
And it is now being shown how AI can contribute to building healthy, supportive learning environments for those who are negatively affected, emotionally or psychologically, by forces in their environment, whether social media or something else, such as fear of the future.
No teacher can do this alone when facing classes of 30-40 students.
Now, towards the latter part of 2023, teachers, researchers, and students themselves are giving examples of how generative AI can open doors to effective learning and future-readiness in five ways.
These involve enhancing student agency (students taking responsibility and making choices); engagement (students being involved in constructing teaching and learning processes); efficacy (students having self-belief in their capacity to reach goals); cognition (students developing more sophisticated analytical and systems thinking); and competences (students applying knowledge and know-how to solve problems and achieve goals).
These people are not arguing that more class contact time should have students using computers.
What they are saying is that the potential of forms of AI for young people is so huge that it can be utilized in or out of school, and that it could act as a significant change agent in revitalizing fit-for-purpose education in the 2020s.
They also recognize that the potential for teachers is high: reducing slow-time work such as administration, speeding up the development of resources and activities, and helping them break away from what has been called the ‘tyranny of the textbook’, managing the curriculum as a tool, not just a rule.
We need to listen to the educational professional communities now talking, studying, and publishing. This could be a window of opportunity to realize innovative educational practices that benefit the mental wellbeing and the intellectual and skills capacities of students, and the professional demands of the human teacher, who will inevitably be ever more important in the lives of the young people in their care.
These communities need to include the so-called gatekeepers, those responsible for curricula, teacher education, and testing, because the educational tortoise is actually capable of moving fast at times, if it is not blocked by yesterday’s curricula, yesterday’s testing, and yesterday’s teacher training structures.
And it will inevitably mean government entities enabling access to very specific forms of generative AI which are safe to use, accessible, and co-developed with educators.