Legal Industry Insight

Healthcare Sector: Privacy and AI in 2025

Healthcare sector, Spain, 2025: GDPR Article 9 obligations on health data are converging with the EU AI Act's high-risk requirements for diagnostic and clinical AI systems, and a DPIA is in practice expected for systems processing the data of 5,000+ patients.

5 min read

In 2025, the private healthcare sector in Spain faces an unprecedented regulatory convergence: on one side, the already established obligations of Regulation (EU) 2016/679 (GDPR) and Organic Law 3/2018 (LOPD-GDD, Spain's national data protection act) on health data protection; on the other, the new obligations of Regulation (EU) 2024/1689 on Artificial Intelligence (AI Act), whose provisions become applicable on a staggered basis through August 2026. The intersection of these two regulatory frameworks requires integrated legal risk management that compliance services and Data Protection Officers in the sector must address in a coordinated manner.

Health Data: Special Protection Under GDPR and LOPD-GDD

Article 9 of the GDPR classifies health-related data as special category data and prohibits its processing except where one of the exceptions provided in Article 9.2 applies. In the healthcare context, the most common legal bases are:

Explicit consent (Article 9.2.a GDPR): valid for processing not directly related to the provision of care, such as sending commercial communications on complementary health services, participation in epidemiological studies for private research purposes, or processing data to improve clinical AI systems. Consent must be specific to each purpose, informed, and freely given. In the healthcare context, the dependency relationship between patient and provider can compromise the freedom of consent, so quality standards are particularly demanding.

Necessity for preventive medicine or diagnosis purposes (Article 9.2.h GDPR): covers processing performed by healthcare professionals or under their responsibility for the management of healthcare systems and services. This legal basis does not cover subsequent processing of the data for purposes unrelated to direct patient care, such as commercial analysis or training third-party AI models.

Public interest in the field of public health (Article 9.2.i GDPR): reserved for public bodies or entities with a public service mandate; it does not cover processing by private entities unless they are expressly acting under a mandate from the health authority.

Spain’s LOPD-GDD, at Article 9, specifies that healthcare centres may process their patients’ data for disease prevention, the provision of healthcare, and the management of healthcare system services, and for compliance with Social Security financing obligations. The Spanish Data Protection Agency (Agencia Española de Protección de Datos, AEPD) has issued sector-specific guidance — including its Guide on Data Processing in the Healthcare Sector (2021) — which constitutes the practical reference for compliance decision-making.

The AI Act and AI Systems in Healthcare

Regulation (EU) 2024/1689, published in the Official Journal of the EU on 12 July 2024 and in force since 1 August 2024, establishes a progressive obligations framework based on the risk level of AI systems. For the healthcare sector, the most relevant aspects are:

Prohibited AI systems (applicable from 2 February 2025, Article 5): no AI systems particularly relevant to mainstream healthcare fall into this category, though the prohibitions on subliminal manipulation techniques and on exploitation of vulnerabilities could become relevant in mental health or addiction contexts.

High-risk AI systems (Article 6 and Annex III): the AI Act classifies as high-risk those AI systems that are medical devices, or safety components of medical devices, subject to Regulation (EU) 2017/745 and to third-party conformity assessment (Article 6(1) in conjunction with Annex I). Separately, Annex III, point 5 covers systems used for triage or for establishing priority in emergency healthcare. In practice, AI for differential diagnosis of serious diseases or for interpretation of medical images (radiology, pathology) will generally reach high-risk status through its qualification as a medical device.

The requirements for high-risk systems under Articles 9 to 15 of the AI Act, which providers must ensure and deployers must respect, include: establishing a risk management system; using training data that is relevant, representative, and examined for possible biases; maintaining updated technical documentation; preserving activity logs enabling traceability of system operation; ensuring effective human oversight; and completing conformity assessment before the system is placed on the market or put into service.

GDPR–AI Act interaction: the AI Act expressly preserves the application of EU data protection law (Article 2(7) AI Act). When an AI system processes health data, both the AI Act framework and GDPR obligations must be satisfied simultaneously. A Data Protection Impact Assessment (DPIA) under Article 35 GDPR is mandatory when processing is likely to result in a high risk to individuals' rights, which Article 35(3)(b) presumes for large-scale processing of special category data. As a practical rule of thumb, any AI system processing clinical data for more than 5,000 patients should be subject to a DPIA.
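For compliance teams building internal screening tools, the DPIA trigger logic described above can be sketched as a simple check. This is an illustrative rule of thumb only: the 5,000-patient threshold is this article's practical benchmark, not a statutory figure, and the parameter names are assumptions — the legal assessment always rests with the DPO.

```python
# Illustrative DPIA screening check, loosely modelled on Article 35 GDPR
# triggers plus the practical 5,000-patient rule of thumb discussed above.
# The threshold and parameter names are assumptions, not statutory values.

LARGE_SCALE_PATIENT_THRESHOLD = 5_000  # practical benchmark, not a legal limit

def dpia_required(patient_count: int,
                  processes_health_data: bool,
                  automated_decisions_with_significant_effect: bool) -> bool:
    """Return True if a DPIA should be carried out before deployment."""
    # Article 35(3)(b) GDPR: large-scale processing of special category data.
    if processes_health_data and patient_count >= LARGE_SCALE_PATIENT_THRESHOLD:
        return True
    # Article 35(3)(a) GDPR: systematic automated evaluation producing
    # legal or similarly significant effects on individuals.
    if automated_decisions_with_significant_effect:
        return True
    return False

print(dpia_required(12_000, True, False))  # large-scale clinical AI -> True
print(dpia_required(800, True, False))     # small cohort, no automated effects -> False
```

A check like this only flags cases for review; it cannot replace the case-by-case analysis the AEPD expects in its DPIA guidance.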

Regulatory Enforcement: Precedents from Across Europe

The AEPD sanctioned a private clinic in 2022 with €150,000 for unauthorised access to clinical records by non-clinical staff, and in 2023 investigated several telemedicine service providers for deficiencies in identity verification and consent management procedures. In Italy, the Garante has imposed fines of up to €4.5 million on clinical analysis laboratories for large-scale processing of health data without an adequate legal basis.

These enforcement actions underline that the healthcare sector is under active regulatory scrutiny on data protection matters, and that the arrival of AI Act enforcement from 2026 will add a new layer of supervisory exposure.

Practical Recommendations for Healthcare Organisations

Healthcare sector entities should, in the near term:

  1. Conduct an inventory of all AI systems in use or under evaluation and classify them against Annex III of the AI Act.
  2. Audit contracts with AI-assisted diagnostic software providers to verify that the provider acts as a provider within the meaning of the AI Act and has obtained, or commits to obtaining, conformity assessment.
  3. Update Records of Processing Activities (Article 30 GDPR) to incorporate AI systems as specific processing activities.
  4. Update privacy policies to include clear information about AI systems that make or support decisions with effects on the patient.
  5. Train the DPO and compliance team on AI Act requirements prior to August 2026.
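The inventory-and-classification exercise in steps 1 and 2 can be sketched as a minimal first-pass script. The screening questions below are a deliberately simplified reading of Article 6 and Annex III, and the system records are hypothetical examples, not legal determinations:

```python
# Hypothetical first-pass AI inventory screening against the AI Act.
# Field names and categories are illustrative assumptions; a legal
# review of each system is still required.
from dataclasses import dataclass

@dataclass
class AISystem:
    name: str
    is_medical_device: bool      # subject to Regulation (EU) 2017/745 (MDR)?
    does_emergency_triage: bool  # Annex III, point 5 use case?

def screen_risk(system: AISystem) -> str:
    """Very rough screening label for inventory purposes."""
    # Article 6(1) AI Act: AI that is (a safety component of) a medical
    # device subject to third-party conformity assessment is high-risk.
    if system.is_medical_device:
        return "high-risk (Article 6(1) / MDR)"
    # Annex III, point 5: triage / prioritisation of emergency healthcare.
    if system.does_emergency_triage:
        return "high-risk (Annex III, point 5)"
    return "further analysis needed"

inventory = [
    AISystem("radiology image analysis", is_medical_device=True, does_emergency_triage=False),
    AISystem("ER patient prioritisation", is_medical_device=False, does_emergency_triage=True),
    AISystem("appointment no-show predictor", is_medical_device=False, does_emergency_triage=False),
]
for s in inventory:
    print(f"{s.name}: {screen_risk(s)}")
```

The value of such a script is organisational rather than legal: it forces every system into the Record of Processing Activities and surfaces the ones that need formal classification before August 2026.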

At BMC, our legal team advises healthcare organisations on GDPR compliance, AI Act readiness, and integrated data protection governance. Learn about our legal services.

Want to learn more?

Let us discuss how to apply these ideas to your business.
