Human-centered AI: how can we support end-users in interacting with AI?
This document discusses how to design human-centered AI systems that support end-users. It explores how explaining model outcomes can increase trust and acceptance, and how users can be enabled to interact with the explanation process itself. Personal characteristics such as need for cognition shape how users respond to explanations, so explanations should be personalized and offer different levels of detail. Evaluations show that explanations improve understanding but also increase cognitive load, which makes simplification important. The overall goal is to preserve human control and ensure that AI meets user needs.
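To make the idea of personalized, multi-level explanations concrete, the following is a minimal sketch (not taken from the source) of how an explanation component might adapt its level of detail to a user's need-for-cognition score. The Explanation structure, the 0.4 threshold, and the example feature-importance values are illustrative assumptions.

```python
from dataclasses import dataclass


@dataclass
class Explanation:
    """Feature attributions for a single model prediction (illustrative structure)."""
    prediction: str
    feature_importance: dict[str, float]  # feature name -> signed importance score


def render_explanation(exp: Explanation, need_for_cognition: float) -> str:
    """Adapt explanation detail to an assumed need-for-cognition score in [0, 1].

    Low scores get a one-line summary; higher scores get the full ranked breakdown.
    """
    ranked = sorted(exp.feature_importance.items(), key=lambda kv: -abs(kv[1]))
    if need_for_cognition < 0.4:  # assumed threshold for "brief" users
        top_feature, _ = ranked[0]
        return f"Predicted '{exp.prediction}', mainly because of {top_feature}."
    # Detailed view: ranked list of all feature contributions.
    lines = [f"Predicted '{exp.prediction}'. Contributing features:"]
    lines += [f"  {name}: {weight:+.2f}" for name, weight in ranked]
    return "\n".join(lines)


if __name__ == "__main__":
    exp = Explanation(
        prediction="loan approved",
        feature_importance={"income": 0.62, "credit_history": 0.25, "age": -0.08},
    )
    print(render_explanation(exp, need_for_cognition=0.2))  # brief summary
    print(render_explanation(exp, need_for_cognition=0.8))  # detailed breakdown
```

The design choice here mirrors the document's point about cognitive load: users with low need for cognition see only the dominant factor, while users who want more detail can drill into the full attribution list, keeping the default view simple.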