In the context of explainable AI (XAI), little research has shown what user-role-specific explanations look like. This research aims to identify the explanation needs of a user role called "model manager": a user who monitors multiple AI-based systems for quality assurance in manufacturing. The question this research attempts to answer is: what are the explainability needs of the model manager? In a qualitative user study (cognitive walkthrough) with ten participants, it was investigated which explanations are needed to support understandability, trustworthiness, and actionability. Several options for presenting confidence levels were also explored. Additionally, a causal chain model was created and used as an assumed representation of the mental model for explanations. Using a design analysis technique (task questions), a concept (UI mockup) was created in a controlled way. The research concludes with four findings, among them: F1) A mental model for explanations is an effective way to identify uncertainty-addressing explanation content that meets the specific needs of the target user group.