Universiteit Leiden


AI in education

Generative AI such as ChatGPT has been part of our working and learning environment since 2022. Its use in the educational setting by students, teachers and staff members brings a number of new challenges. The creative process of writing a "classic paper", for instance, can now be carried out partly with AI. This challenges us to rethink how we handle the use of AI within the academic community. What could AI mean for the education of our students? What are the opportunities and limitations of AI for teachers and other colleagues?

IMPORTANT NOTICE: Summative tasks - i.e. assignments that are officially graded and/or assessed - always need to be carried out according to the applicable exam regulations. Furthermore, the extent to which AI can or cannot be used needs to be determined by the teacher and the exam board. 

Task force AI in education

The AI in education task force consists of a diverse group of university colleagues. They have put together a strategy for the implementation of AI in our educational system, which we will gradually continue to implement and develop.

Please know that we are aware of the steps that still need to be taken before we can use AI safely within the educational setting. For now, you will find here the first guidelines on integrating and making practical use of generative AI in your classes and assignments. We will update this information regularly and keep our focus on the educational possibilities of AI.

Potential risks in the use of commercial and freely accessible AI

Since we have no user agreements with suppliers, we cannot be certain of the way our data is processed when using freely accessible, commercial AI platforms. We therefore need to proceed cautiously regarding the use of such platforms. 

We currently advise you to avoid the use of commercial and freely accessible AI platforms. 

Educational AI use: 'high risk' category

The use of AI can create privacy- and expertise-related risks, and it is highly important to be aware of these. After all, we process a lot of data relating to individuals and research results, as well as data that can be traced back to a specific context. For this reason, EU regulations on the use of AI classify AI systems used within the educational setting as 'high risk'.

As a university, we are obliged to check the use of software applications containing AI elements against the relevant legislation, and to ensure that all required arrangements for the safe use of these systems are made with suppliers. At present, these arrangements are not yet in place.

Microsoft Copilot is an example of an AI element within a software application that we cannot use safely at the moment: Microsoft has not clarified exactly how our data is processed when using Copilot.

In addition to licensed software, there are quite a number of online, freely accessible AI systems. The use of these systems still carries many privacy-related risks, because we cannot be sure how data from chats, prompts or output is used for the further development of the AI model.

The online, freely accessible versions of ChatGPT are a good example of such unsafe applications: we cannot be sure how our data is processed and possibly reused in the output of other users' chats.

Guidelines and regulations

We are in the process of establishing further regulations and guidelines, using the five levels of AI integration defined by Perkins et al.[1] as our starting point.

For now, this means:

  1. There is no 'white list' of reliable AI systems. Any and all use of AI carries potential risks. 
  2. When experimenting with or using AI, you must always avoid using data that relates to individuals or research, or that can be traced back to a specific context. 

The following guides (in English) provide further tips and information. They will be updated regularly in light of ongoing AI developments.

AI Guide for teachers: 

A guide for determining the ground rules for using and handling AI in your teaching methods. 

Implementing AI in your curriculum:

Examples and tips for implementing AI within the educational setting, for example when it comes to written assignments.  

Scale of AI use at Leiden University:

A summary of AI use according to the five levels of AI integration defined by Perkins et al.


[1] Perkins, M., Furze, L., Roe, J., & MacVaugh, J. (2024). The Artificial Intelligence Assessment Scale (AIAS): A Framework for Ethical Integration of Generative AI in Educational Assessment. British University Vietnam (Vietnam), Deakin University (Australia), James Cook University (Australia).
