
XAI – Explainable AI

Research group

Artificial intelligence is increasingly used in society, business and industry. Explainable AI is about making the decisions or actions taken by an AI system understandable to the people who use it. The European Union, as well as many individual countries, is now moving towards requiring such functionality from AI systems.


Growing need for explainable AI

An increasing number of functions in our society are now run by AI-based systems, which has raised concerns about how reliable these systems really are. A basic prerequisite for trusting an AI system is being able to gain insight into its underlying reasoning and to get answers to questions such as "why was this decision made?", "why not?", "what would happen if?" and "why A and not B?"

End users in focus

We believe that current research in explainable AI largely ignores the end users and focuses instead on the creators of AI systems. In our research, we aim to develop methods that allow AI systems to justify and explain their decisions and actions in a way similar to how humans explain things to each other. This means enabling a dialogue in which the AI system takes into account people's background knowledge, their capacity to handle different amounts of information at the same time, and their reactions during the dialogue.

Depending on how AI systems have been programmed, or on the data used for their learning, they may contain errors, biases or simply 'opinions', just like humans. Explainable AI is therefore necessary to truly understand the reasoning of an AI system and to judge whether we agree with it or not.

Unique research group

The eXplainable Artificial Intelligence (XAI) team at Umeå University was founded in 2017 by Professor Kary Främling when he took up his WASP professorship in Data Science, specialising in data analysis and machine learning. XAI is a natural focus area, as Främling has been an active and internationally recognised researcher in artificial intelligence since the 1980s, working on topics such as neural network learning, multiple criteria decision support and reinforcement learning.

The first research initiative on the subject

Kary Främling's 1996 PhD thesis, entitled "Learning and Explaining Preferences with Neural Networks for Multiple Criteria Decision Making" (in French: "Modélisation et apprentissage des préférences par réseaux de neurones pour l'aide à la décision multicritère"), can be considered one of the first research initiatives to explicitly address the topic of explainable artificial intelligence, including aspects such as robustness and reliability of reasoning.

The CIU Method

An important part of the team's research is based on the Contextual Importance and Utility (CIU) method, which makes it possible to explain and justify the outcome of an AI system in a given situation: how much each input can change the outcome in that particular context (importance), and how favourable the input's current value is for the outcome (utility).
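To give a feel for what CIU computes, the sketch below estimates the two quantities for a single input of a black-box model. It is a minimal illustration only, assuming the model is exposed as a Python callable that returns one score; the function name, its parameters and the brute-force sampling over one feature are assumptions made here for illustration and do not represent the team's published implementation.

import numpy as np

def contextual_importance_utility(predict, context, feature, feature_range,
                                  out_min=0.0, out_max=1.0, n_samples=100):
    # Estimate Contextual Importance (CI) and Contextual Utility (CU) for one
    # input feature of a black-box model by sampling that feature over its
    # allowed range while the rest of the context is kept fixed.
    #   predict         : callable taking a 1-D feature vector, returning a score
    #   context         : the input instance to explain (sequence of numbers)
    #   feature         : index of the feature whose influence is estimated
    #   feature_range   : (low, high) interval the feature may vary over
    #   out_min/out_max : absolute output range of the model (0..1 for probabilities)
    context = np.asarray(context, dtype=float)
    y_current = predict(context)

    low, high = feature_range
    ys = []
    for value in np.linspace(low, high, n_samples):
        perturbed = context.copy()
        perturbed[feature] = value
        ys.append(predict(perturbed))
    y_min, y_max = min(ys), max(ys)

    # Contextual Importance: the share of the model's total output range that
    # this feature alone can cover in the current context.
    ci = (y_max - y_min) / (out_max - out_min)

    # Contextual Utility: where the current output lies within that
    # context-specific range (1.0 = as favourable as it can get).
    cu = (y_current - y_min) / (y_max - y_min) if y_max > y_min else 0.5
    return ci, cu

# Toy usage with a simple scoring function standing in for a trained model.
model = lambda x: 1.0 / (1.0 + np.exp(-(2.0 * x[0] - 0.5 * x[1])))
ci, cu = contextual_importance_utility(model, context=[0.2, 0.8],
                                       feature=0, feature_range=(0.0, 1.0))
print(f"CI = {ci:.2f}, CU = {cu:.2f}")

In this sketch, a feature with CI close to 1 can swing the output across its full range in the current situation, while CU indicates whether the feature's current value pushes the output towards the favourable or the unfavourable end of that range.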

Present and former members of the team

Kary Främling

Kary Främling is the head of the eXplainable Artificial Intelligence (XAI) team at Umeå University in Sweden. He has been an active researcher in artificial intelligence since the 1980s, focusing on topics such as neural network learning, multiple criteria decision support and reinforcement learning. His 1996 PhD thesis, entitled "Learning and Explaining Preferences with Neural Networks for Multiple Criteria Decision Making" (in French: "Modélisation et apprentissage des préférences par réseaux de neurones pour l'aide à la décision multicritère"), can be considered one of the first research initiatives to explicitly address the topic of explainable artificial intelligence, including aspects such as confidence, robustness and reliability of reasoning. He is currently Professor in Data Science at Umeå University.

Research leader

Kary Främling
Professor

Overview

Participating departments and units at Umeå University

Department of Computing Science

Research area

Computing science
Latest update: 2025-11-21