AI Ministry

AI Across Industries

Sector-specific AI use cases, regulatory requirements, and implementation pathways

Real-World Context

AI in Action: Industry Examples

Leading European companies are already deploying AI systems that fall under EU AI Act obligations. Understanding the regulatory implications is no longer optional — it is a market requirement.

HIGH RISK · Automotive · Safety-Critical AI

Automated Driving & AI Safety Systems

European automotive OEMs and Tier-1 suppliers are deploying AI systems for automated driving — processing real-world sensor data, training perception models, and making safety-critical decisions at high speed. These systems directly affect human lives on public roads and fall under the strictest EU AI Act obligations.

REGULATORY OBLIGATIONS
  • Mandatory conformity assessment before market placement
  • Registration in EU high-risk AI database
  • Continuous post-market monitoring & incident reporting
  • UNECE WP.29 R155/R156 cybersecurity compliance
  • Full technical documentation and audit trail required
Applicable to: ADAS systems, autonomous lane keeping, collision avoidance, automated parking

LIMITED RISK · Automotive · Conversational AI

AI-Powered In-Vehicle Virtual Assistants

Premium automotive manufacturers are integrating AI-powered virtual assistants that conduct natural conversations, remember previous interactions, and personalise the driving experience using cloud AI infrastructure and mapping services. Built on automotive AI agents and cloud platforms, these systems process personal data, including location, preferences, and conversation history, which triggers GDPR and EU AI Act transparency obligations.

REGULATORY OBLIGATIONS
  • Transparency: users must know they interact with AI
  • GDPR compliance for conversation and location data
  • Data minimisation and purpose limitation principles
  • Right to explanation for AI-driven recommendations
  • Cross-border data transfer rules (EU–US cloud providers)
Applicable to: AI voice assistants with contextual memory, personalised navigation, preference learning

REGULATORY GAP (2026) · Automotive · AI Testing & Validation · AI Ministry Insight

AI Testing in Vehicles: The Standard That Doesn't Exist Yet

Leading European OEMs are actively developing AI systems for automated driving — yet a critical gap exists: no unified standard currently defines how AI and machine learning models in vehicles must be tested for functional safety compliance. ISO/PAS 8800:2024 (AI safety for road vehicles) explicitly does not define ASIL-level compliance for ML models. ISO 26262 covers functional safety but predates modern AI architectures. SOTIF (ISO 21448) addresses intended functionality but lacks ML-specific test protocols.

Standard              Coverage      Gap
ISO/PAS 8800:2024     Partial       AI safety in road vehicles; does NOT define ASIL for ML models
ISO 26262:2018        Insufficient  Functional safety; predates modern ML architectures
ISO 21448 (SOTIF)     Partial       Intended functionality; no ML-specific test protocol
EU AI ACT OBLIGATION — AUGUST 2026

Autonomous and semi-autonomous driving systems are classified as High-Risk AI under EU AI Act Annex III. From August 2026, providers must demonstrate conformity — including testing and validation — yet the testing methodology standard remains undefined. AI Ministry is actively engaging with industry stakeholders and EU bodies to develop practical guidance for this gap.

Tags: ISO/PAS 8800 · SOTIF · ISO 26262 · EU AI Act Annex III · ML Validation · Automotive AI Testing
Free Assessment Tool

Where does your AI system stand?

Use our interactive EU AI Act Risk Checker to get an instant classification and required compliance actions.
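The classification flow behind such a checker can be sketched as a simple lookup from use case to risk tier and key obligations. The categories, tier labels, and obligation lists below are illustrative assumptions drawn from the examples in this article, not legal guidance and not the actual logic of the AI Ministry tool.

```python
# Hypothetical sketch of a simplified EU AI Act risk-tier lookup.
# Use-case keys and obligation texts are illustrative only.

RISK_TIERS = {
    "automated_driving": ("high", [
        "Conformity assessment before market placement",
        "Registration in EU high-risk AI database",
        "Post-market monitoring and incident reporting",
    ]),
    "in_vehicle_assistant": ("limited", [
        "Transparency: users must know they interact with AI",
        "GDPR compliance for conversation and location data",
    ]),
}

def classify(use_case: str) -> tuple[str, list[str]]:
    """Return (risk tier, key obligations) for a known use case."""
    if use_case not in RISK_TIERS:
        raise ValueError(f"Unknown use case: {use_case}")
    return RISK_TIERS[use_case]

tier, obligations = classify("automated_driving")
print(tier)           # high
print(obligations[0])  # Conformity assessment before market placement
```

A real assessment would weigh many more factors (deployment context, affected persons, Annex III categories), which is why an interactive questionnaire, rather than a static table, is the appropriate tool.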
