Life sciences companies and private healthcare providers in Spain face an unprecedented regulatory convergence in 2025–2026: the GDPR, the NIS2 Directive (cybersecurity), the DORA Regulation (digital resilience) and the AI Act apply simultaneously. BMC advises clinics, laboratories, medtech and pharmaceutical companies on the integrated implementation of these frameworks to avoid sanctions and ensure operational continuity.
Life sciences companies and private healthcare organisations in Spain face an unprecedented regulatory convergence in 2025–2026: on one side, the established obligations of Regulation (EU) 2016/679 (GDPR) and Organic Law 3/2018 (LOPD-GDD) on health data protection; on the other, the new requirements of Directive (EU) 2022/2555 (NIS2) on cybersecurity, Regulation (EU) 2022/2554 (DORA) on digital operational resilience for the financial sector and its ICT providers, and Regulation (EU) 2024/1689 (AI Act) on artificial intelligence systems. For life sciences organisations in Spain — pharma, medtech, clinical laboratories, and private hospital networks — all four frameworks can apply simultaneously, requiring coordinated compliance governance. See our sector overview at Healthcare & Life Sciences.
Health Data: Special Protection Under GDPR and LOPD-GDD
Article 9 of the GDPR classifies health-related data as special category data and prohibits its processing except where one of the exceptions provided in Article 9.2 applies. In the healthcare context, the most common legal bases are:
Explicit consent (Article 9.2.a GDPR): valid for processing not directly related to the provision of care, such as sending commercial communications on complementary health services, participation in epidemiological studies for private research purposes, or processing data to improve clinical AI systems. Consent must be specific to each purpose, informed, and freely given. In the healthcare context, the dependency relationship between patient and provider can compromise the freedom of consent, so quality standards are particularly demanding.
Necessity for preventive medicine or diagnosis purposes (Article 9.2.h GDPR): covers processing performed by healthcare professionals or under their responsibility for the management of healthcare systems and services. This legal basis does not cover subsequent processing of the data for purposes unrelated to direct patient care, such as commercial analysis or training third-party AI models.
Public interest in the field of public health (Article 9.2.i GDPR): reserved for public bodies or entities with a public service mandate; it does not cover processing by private entities unless they are expressly acting under a mandate from the health authority.
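The three legal bases above map to distinct processing purposes, and a common compliance failure is reusing the Article 9.2.h care basis for secondary purposes. The following sketch (illustrative only, not legal advice; the purpose labels are assumptions, not GDPR terminology) shows how a compliance register might encode that mapping so that unclassified purposes are flagged rather than silently processed:

```python
# Illustrative purpose-to-legal-basis lookup for Art. 9(2) GDPR exceptions.
# Purpose labels and the escalation message are assumptions for demonstration.
LEGAL_BASIS_BY_PURPOSE = {
    "direct_patient_care": "Art. 9(2)(h) — provision of healthcare",
    "marketing_health_services": "Art. 9(2)(a) — explicit consent",
    "private_epidemiological_study": "Art. 9(2)(a) — explicit consent",
    "public_health_mandate": "Art. 9(2)(i) — public interest (requires authority mandate)",
}

def legal_basis(purpose: str) -> str:
    # Unknown purposes get no default basis: processing should stop until
    # the DPO or legal team assigns one.
    return LEGAL_BASIS_BY_PURPOSE.get(purpose, "no basis identified — escalate to DPO")

print(legal_basis("direct_patient_care"))   # Art. 9(2)(h) — provision of healthcare
print(legal_basis("ai_model_training"))     # no basis identified — escalate to DPO
```

The deliberate design choice is the fail-closed default: a purpose absent from the register returns an escalation flag, mirroring the legal position that secondary uses of care data need their own documented basis.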
Spain’s LOPD-GDD, at Article 9, specifies that healthcare centres may process their patients’ data for disease prevention, the provision of healthcare, and the management of healthcare system services, and for compliance with Social Security financing obligations. The Spanish Data Protection Agency (Agencia Española de Protección de Datos, AEPD) has issued sector-specific guidance — including its Guide on Data Processing in the Healthcare Sector (2021) — which constitutes the practical reference for compliance decision-making.
The AI Act and AI Systems in Healthcare
Regulation (EU) 2024/1689, published in the Official Journal of the EU on 12 July 2024 and in force from 1 August 2024, establishes a progressive obligations framework based on the risk level of AI systems. For the healthcare sector, the most relevant aspects are:
Prohibited AI systems (applicable from 2 February 2025, Article 5): No mainstream healthcare AI systems fall into this category, although the prohibitions on subliminal manipulation techniques and on exploitation of vulnerabilities could be relevant in mental health or addiction contexts.
High-risk AI systems (Article 6 and Annexes I and III): The AI Act classifies as high-risk those AI systems intended to be used as safety components of products — or that are themselves products — covered by the EU harmonisation legislation listed in Annex I and subject to third-party conformity assessment; that list includes the medical devices regulation, Regulation (EU) 2017/745 (Article 6(1) route). Separately, Annex III, point 5(d) covers systems used for triage or prioritisation of emergency healthcare. In practice, AI for differential diagnosis of serious diseases or interpretation of medical images (radiology, pathology) will typically qualify as a medical device and fall under the Article 6(1) route.
Requirements for high-risk AI systems, which providers must meet under Articles 9 to 15 of the AI Act, include: establishing a risk management system, using relevant, representative training data governed to minimise bias, maintaining updated technical documentation, preserving activity logs enabling traceability of system operation, ensuring effective human oversight of the system, and completing conformity assessment before the system is placed on the market or put into service.
GDPR–AI Act interaction: the AI Act itself (Article 2(7)) makes clear that it applies without prejudice to Union data protection law. When an AI system processes health data, both the AI Act framework and GDPR obligations must therefore be satisfied simultaneously. A Data Protection Impact Assessment (DPIA) under Article 35 GDPR is mandatory, in particular, for large-scale processing of special category data (Article 35(3)(b)). In practice, virtually any AI system processing clinical data at scale should be subject to a DPIA.
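The DPIA screening logic described above can be expressed as a simple decision rule. The sketch below is illustrative only (the field names and the "two criteria" heuristic are drawn from the WP29/EDPB guidance pattern, not from statutory text; this is a first-pass screen, not a substitute for legal analysis):

```python
# Rough DPIA screening sketch (not legal advice). Field names are
# illustrative assumptions; "large scale" is a qualitative assessment
# under the AEPD/EDPB criteria, not a fixed patient-count threshold.
from dataclasses import dataclass

@dataclass
class ProcessingActivity:
    special_category_data: bool   # e.g. health data (Art. 9 GDPR)
    large_scale: bool             # assessed qualitatively per regulator guidance
    uses_new_technology: bool     # e.g. a clinical AI system
    systematic_monitoring: bool

def dpia_required(p: ProcessingActivity) -> bool:
    """True means: run a full DPIA under Art. 35 GDPR."""
    # Explicit trigger under Art. 35(3)(b): large-scale special-category data.
    if p.special_category_data and p.large_scale:
        return True
    # Heuristic from WP29/EDPB guidance: two or more risk criteria met.
    criteria = [p.special_category_data, p.large_scale,
                p.uses_new_technology, p.systematic_monitoring]
    return sum(criteria) >= 2

clinical_ai = ProcessingActivity(True, True, True, False)
print(dpia_required(clinical_ai))  # True
```

A clinical AI system deployed across a hospital network will almost always satisfy the explicit Article 35(3)(b) trigger, so in practice the heuristic branch matters mainly for pilots and small-scale deployments.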
Regulatory Enforcement: Precedents from Across Europe
The AEPD sanctioned a private clinic in 2022 with €150,000 for unauthorised access to clinical records by non-clinical staff, and in 2023 investigated several telemedicine service providers for deficiencies in identity verification and consent management procedures. In Italy, the Garante has imposed fines of up to €4.5 million on clinical analysis laboratories for large-scale processing of health data without an adequate legal basis.
These enforcement actions underline that the healthcare sector is under active regulatory scrutiny on data protection matters, and that the arrival of AI Act enforcement from 2026 will add a new layer of supervisory exposure.
NIS2 Directive: Cybersecurity Obligations for Life Sciences in Spain
Directive (EU) 2022/2555 (NIS2), whose transposition into Spanish law is being completed through the draft Ley de Coordinación y Gobernanza de la Ciberseguridad, introduces mandatory cybersecurity requirements for entities classified as "essential" or "important" in critical sectors. The health sector — including hospitals, clinical laboratories, pharmaceutical manufacturers, and manufacturers of medical devices deemed critical during a public health emergency — is expressly listed among the sectors of high criticality in Annex I of NIS2 (general medical device manufacturing falls under Annex II).
Obligations for life sciences entities under NIS2 include: adopting risk-based cybersecurity governance policies; implementing technical and organisational security measures (covering network security, access controls, incident detection and response, and supply chain security); and notifying the competent CSIRT (in Spain, INCIBE-CERT or CCN-CERT, depending on the entity) of significant incidents with an early warning within 24 hours of becoming aware, a full incident notification within 72 hours, and a final report within one month. Management bodies bear direct liability for compliance — a material change from the NIS1 framework.
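The staged reporting clock above is strict enough that many incident-response runbooks compute the deadlines automatically at detection time. A minimal sketch, assuming naive datetime handling (the transposing law may define how periods are counted, and "one month" is approximated here as 30 days):

```python
# NIS2 Art. 23 reporting clock sketch: early warning at 24 h, incident
# notification at 72 h, final report at one month (approximated as 30 days).
from datetime import datetime, timedelta

def nis2_reporting_deadlines(aware_at: datetime) -> dict:
    """Return the three NIS2 reporting deadlines from the moment of awareness."""
    return {
        "early_warning": aware_at + timedelta(hours=24),
        "incident_notification": aware_at + timedelta(hours=72),
        "final_report": aware_at + timedelta(days=30),
    }

deadlines = nis2_reporting_deadlines(datetime(2026, 3, 1, 9, 0))
print(deadlines["early_warning"])  # 2026-03-02 09:00:00
```

Note that the clock runs from awareness of the significant incident, not from full triage, which is why the 24-hour early warning is deliberately lightweight in the Directive.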
In Spain, supervisory roles follow the national cybersecurity framework — INCIBE for private-sector entities and the CCN for the public sector, pending final allocation in the transposition law. Non-compliance can result in administrative fines of up to €10 million or 2% of global annual turnover, whichever is higher, for essential entities.
DORA: Digital Operational Resilience for Life Sciences Adjacent to Financial Services
Regulation (EU) 2022/2554 (DORA) — fully applicable from 17 January 2025 — is primarily directed at financial entities but has material implications for life sciences companies that: (a) provide digital health financing or insurance-linked services, (b) operate as ICT third-party service providers to regulated financial entities (e.g., hospital fintech or MedPay platforms), or (c) are subsidiaries of financial groups subject to DORA consolidation.
Key DORA requirements relevant to this sector include: ICT risk management frameworks, contractual obligations when acting as third-party ICT providers to financial entities (including right-to-audit clauses and exit strategies), and participation in threat-led penetration testing (TLPT) programmes for critical ICT systems. Life sciences companies should review their customer and partner base to assess whether DORA supply-chain obligations flow down to them contractually.
Practical Recommendations for Life Sciences Organisations
Life sciences entities operating in Spain should, as immediate priorities:
- Conduct an inventory of all AI systems in use or under evaluation and classify them against Annex III of the AI Act.
- Audit contracts with AI-assisted diagnostic software providers to verify that the provider acts as a manufacturer under the AI Act and has obtained or commits to obtaining conformity assessment.
- Update Records of Processing Activities (Article 30 GDPR) to incorporate AI systems as specific processing activities and verify that health data legal bases under Article 9.2 are correctly documented.
- Conduct a NIS2 classification assessment to determine whether the entity qualifies as essential or important, and implement the required cybersecurity governance programme.
- Review DORA applicability for any financial-adjacent digital services or ICT provider relationships with regulated financial entities.
- Update privacy policies to include clear information about AI systems that make or support decisions with effects on the patient.
- Train the DPO and compliance team on GDPR–AI Act interaction and NIS2 requirements prior to August 2026.
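The first two recommendations — inventorying AI systems and verifying their classification route — can be operationalised as a triage pass over the inventory. The sketch below is hypothetical (field names are illustrative shorthand, not AI Act terminology, and it covers only the two high-risk routes discussed earlier):

```python
# Hypothetical triage of an AI-system inventory against the AI Act's two
# high-risk routes: Art. 6(1) (product covered by Annex I legislation such
# as MDR 2017/745) and Art. 6(2) via Annex III (e.g. point 5(d), emergency
# triage). Field names are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class AISystem:
    name: str
    is_medical_device: bool   # covered by Regulation (EU) 2017/745
    emergency_triage: bool    # Annex III, point 5(d) use case

def risk_route(s: AISystem) -> str:
    if s.is_medical_device:
        return "high-risk (Art. 6(1) via Annex I / MDR)"
    if s.emergency_triage:
        return "high-risk (Art. 6(2) via Annex III, point 5(d))"
    return "not high-risk on these criteria — assess remaining Annex III points"

inventory = [
    AISystem("radiology-image-reader", True, False),
    AISystem("er-triage-assistant", False, True),
    AISystem("appointment-scheduler", False, False),
]
for system in inventory:
    print(f"{system.name}: {risk_route(system)}")
```

The fall-through result deliberately avoids clearing a system: the remaining Annex III use cases (and the prohibited-practices list) still need review before a system is documented as out of scope.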
At BMC, our legal team advises life sciences organisations in Spain on GDPR, NIS2, DORA, AI Act compliance, and integrated data protection governance. Learn about our data protection services and criminal compliance services.