From the course: Navigating the EU AI Act

Remedies

- [Narrator] If a serious incident occurs due to the use of a high-risk AI system, providers and deployers are responsible for reporting it and addressing the situation. But what happens if I, as a consumer, feel I've been wronged by a system and the issue doesn't bubble up as a serious incident? Do I have a way to share my views or request more information? The answer is yes. In Section 4 of Chapter 9, the AI Act provides remedies for consumers of AI systems. I'll introduce all three in order and provide additional details about each. First, anyone, whether you are a provider, deployer, or bystander, has the right to file a complaint with the Market Surveillance Authority (MSA). Complaints sent to the MSA must identify the provider or deployer responsible. You are expected to lodge complaints that identify issues with conformity assessments or a lack of safety, fairness, and transparency. If you feel that the system is using a prohibited AI practice, for example, emotion recognition, you…