ChatGPT from OpenAI is an AI-powered chatbot that can provide human-like answers to all kinds of questions. Students can also take advantage of this: "Write me an essay in the first person on the topic of climate change", "Explain the cardiovascular system to me in a few words" or "What is the solution to 2346^3 x 456-213?" are just a few examples of how the chatbot can be used in a school context. ChatGPT works in multiple languages and mimics the writing style of its users, enabling a credible, natural interaction.
In the meantime, students have discovered the chatbot as a practical helper in everyday school life. The following scenarios are currently particularly popular:
- Writing summaries of books, plays, etc.
- Getting an overview of a chapter of material, preparing and writing presentations
- Programming small programs
But it's not just students who are using the chatbot - teachers are also already using the software, for example to create teaching materials or for teacher training. Conveniently, within a conversation the chatbot takes previous inputs into account, so follow-up questions can yield increasingly precise answers.
Can a chatbot really be recommended as a helper for school questions?
One central question is how strongly such chatbots affect students' motivation to solve tasks independently. Experts also warn against using texts generated by ChatGPT in everyday school and university life without further revision, because they are not always factually correct: facts may be mixed up, and sources may even be invented to make a text sound plausible. In addition, the training data of the current version only extends to the year 2021, so the bot cannot provide more recent information. Having texts pre-structured, complex issues explained in simple points, or tricky math problems broken down step by step can be quite useful - but a bot can never replace human thinking.
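Numeric answers are a case in point: a result like the one for the example prompt from the introduction ("What is the solution to 2346^3 x 456-213?") should not be taken on faith, because language models frequently get large arithmetic wrong. A few lines of Python (a simple illustration, not part of any curriculum) are enough to compute the exact value and check a bot's reply against it:

```python
# Independently verify the arithmetic from the example prompt
# "What is the solution to 2346^3 x 456 - 213?" instead of trusting a chatbot.
reference = 2346**3 * 456 - 213  # Python integers are exact at any size
print("2346^3 * 456 - 213 =", reference)

# Compare a (hypothetical) chatbot reply against the exact value:
def check_answer(claimed: int) -> bool:
    return claimed == reference

print(check_answer(reference))      # a correct answer passes: True
print(check_answer(reference + 1))  # an off-by-one answer is caught: False
```

The same idea - recomputing or cross-checking a bot's answer with an independent tool - carries over to facts and sources, as discussed below.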
Information evaluation as a challenge
ChatGPT is well suited to questions about broad, widely available knowledge. More specific questions, however, often produce incorrect answers. The challenge is that the chatbot is designed to make its answers sound as "human" as possible. This often makes them seem more credible than the results of a Google search - even when, for example, sources have been invented. In the future, this may make it even harder for teachers and students alike to evaluate the information they find and to practice source criticism in the classroom.
Therefore, it will become even more important in the future to discuss with students how to compare content with other sources, check sources for credibility, and identify false reports.
What will certainly also become more relevant: focusing on discussion of and engagement with content, rather than testing knowledge that can easily be looked up online via chatbots (but also Google and the like).
Actively incorporate the topics of ChatGPT and information literacy into your lessons.
- Show students a text created by a chatbot and have them search for the original sources of the information mentioned. Discuss and evaluate sources found with each other in class.
- Choose a topic that you have recently discussed in class and have a bot answer follow-up questions about it. Discuss and question the answers given by the bot with the class.
Programs like ChatGPT can change the teaching and the information behavior of students and teachers - similar to the digital encyclopedia Wikipedia a few years ago.
We advise the following
- Do not ignore: ChatGPT is already being used by many students. Moreover, it is safe to assume that not only will ChatGPT itself continue to develop, but that other, similar tools will emerge. This makes it all the more important to engage with the opportunities and risks of such programs now. See it as an opportunity: together with your students, you can shape how such applications will be used in the future and what value source criticism will have.
- Don't ban them: Banning such programs makes little sense, because a ban takes away the opportunity to actively engage with the issue. While there are programs that attempt to detect AI-generated texts (similar to plagiarism checkers), even these can be tricked - so you will probably never be able to say with certainty that your students did not consult an AI for their homework.
- Use together: Explore the possibilities and limitations of such applications with your students. Especially at this stage, it is important to develop rules together - on how to deal with the information found and how to integrate such programs into everyday school life.
ChatGPT can only be used after registration and collects usage data, which is transferred to the USA, where it is stored and processed. The minimum age for use specified by the operator OpenAI is 13 years; for registration it is as high as 18 years.
Teachers are not allowed to oblige students under 18 to create an account with the ChatGPT operator OpenAI. Students over 18 may create an account, but the decision to do so must be voluntary. There must be no negative consequences (e.g. exclusion from participation in classes) if students do not want to use OpenAI's service.
Is there a way to use ChatGPT in class in a privacy compliant way?
Teachers can create an account at OpenAI on their own responsibility and use it in class for demonstration purposes. They can also make the account available to their students. However, two aspects must be taken into account: the students must be at least 13 years old, and they must use the account on non-personalized devices. Use via private devices or personalized school devices could, in theory, allow personal data to be collected.
These assessments on the subject of data privacy are based, among other things, on this article from unterricht.digital (as of January 23, 2023).