Privacy by Design: Cheaper to Prevent Than to Remediate
Article 25 GDPR implementation: privacy by design and by default for digital products, software, apps, and internal processes. Direct integration with product and engineering teams.
Why privacy by design reduces costs and regulatory risk
Does this apply to your business?
Do your product and engineering teams consult the DPO or privacy advisor before beginning development of features that process personal data?
Is the default configuration of your products the most privacy-protective option, or do users have to actively search for how to reduce data sharing?
Have you defined data retention periods at every layer of your architecture (database, backups, logs, analytics) with a technical process to apply them automatically?
Does your development process include a privacy assessment before launching new features that might require a DPIA?
Our privacy by design integration process
Privacy requirements analysis
In the product definition phase, we identify planned personal data processing activities, applicable legal bases, purposes, and data flows between systems, services, and third parties.
Compliant data architecture design
We define the data architecture that meets the principles of minimisation, purpose limitation, and storage limitation, and design the technical measures for pseudonymisation, encryption, and access control.
Impact assessment (if required) and design reviews
We determine whether the product requires a DPIA under Article 35 GDPR, conduct it where necessary, and participate in design reviews to verify that privacy requirements are maintained throughout development.
Launch and accountability documentation
We accompany the product launch with updated compliance documentation: privacy notices, informational clauses, records of processing activities, and DPIA report where applicable.
The challenge
Article 25 of the GDPR requires that data protection be considered from the moment of designing any product, service, or process that handles personal data. In practice, the vast majority of organisations follow the reverse sequence: they launch the product and then try to retrofit compliance onto an architecture that was not designed for it. The result is costly remediation, complex technical changes, and compliance that is frequently incomplete.
Our solution
We integrate privacy requirements into the product development cycle from the earliest design phases. We work directly with product, UX, and engineering teams to define the data architecture, technical and organisational measures, and information flows that ensure GDPR compliance without sacrificing product functionality.
Privacy by design and by default is a legally binding obligation under Article 25 of the EU General Data Protection Regulation (GDPR, Regulation 2016/679), which requires controllers to implement appropriate technical and organisational measures designed to give effect to data protection principles — such as data minimisation, purpose limitation, and storage limitation — both at the time of designing the processing and at the time of the processing itself. "Privacy by default" additionally requires that, by default, only personal data necessary for each specific purpose is processed. Failure to implement privacy by design and by default is a sanctionable GDPR infringement, independent of whether a data breach has occurred, and the AEPD has issued fines specifically for this violation.
Privacy by design is not a voluntary best practice — it is a legal obligation under Article 25 of the GDPR that creates liability for controllers who fail to implement it. And yet the majority of organisations continue to treat privacy as a post-development remediation exercise rather than a design requirement present from the earliest architectural decisions.
The True Cost of Getting the Sequence Wrong
The cost of the incorrect sequence is systematically underestimated. An architectural change that would have taken hours at the design stage — separating identification data from functional data, applying pseudonymisation from the source, implementing retention policies in the data model — can take weeks or months of engineering work when the system is already in production, with live data, dependent processes, and third-party contracts that constrain every change.
Beyond the direct engineering cost, post-launch privacy remediation is frequently incomplete. An architecture not designed for data minimisation cannot be made minimalist without rebuilding the data model. A system without audit logging cannot retroactively produce the access records that accountability requires. These structural deficiencies are visible to the AEPD in an inspection and are treated as evidence that privacy was not, in fact, built into the design.
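What "separating identification data from functional data" and "pseudonymisation from the source" can look like in practice is sketched below. This is a hypothetical illustration (assuming Python; store and field names are invented for the example): identifying data and operational data live in separate stores, linked only by a keyed pseudonym derived at the point of ingestion.

```python
import hashlib
import hmac
import os

# Hypothetical sketch: identifiers and functional data are kept in separate
# stores, linked only by a keyed pseudonym. In a real system the key would
# come from a secrets manager, and the stores would be separate databases.
PSEUDONYM_KEY = os.environ.get("PSEUDONYM_KEY", "demo-key-only").encode()

identity_store = {}    # pseudonym -> identifying data (tightly restricted)
functional_store = {}  # pseudonym -> operational data (no direct identifiers)

def pseudonymise(user_id: str) -> str:
    """Derive a stable, non-reversible pseudonym via HMAC-SHA256."""
    return hmac.new(PSEUDONYM_KEY, user_id.encode(), hashlib.sha256).hexdigest()[:16]

def ingest(user_id: str, email: str, activity: dict) -> str:
    """Split the record at the point of entry, not as a later migration."""
    token = pseudonymise(user_id)
    identity_store[token] = {"user_id": user_id, "email": email}
    functional_store[token] = dict(activity)  # identifiers never enter this store
    return token
```

Deleting the `identity_store` entry (or rotating the key) severs the link to the individual while the functional data remains usable in aggregate — the kind of change that takes hours at design time and months as a production migration.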
Integration Without Bureaucracy
Our integration into product and engineering teams is structured around a lightweight process that generates real protections without bureaucratic overhead. For each new feature or product with a personal data component, we work with the team to answer four questions at the design stage: what data is collected and why, on what legal basis, for how long it is retained, and who has access. This exercise, conducted during design, rarely requires more than an hour. Conducted after launch, it can require weeks of audit and months of remediation.
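The four design-stage questions can be captured as a lightweight artefact in the team's own tooling. The sketch below is a hypothetical template (assuming Python; not our actual process documentation) showing how the answers might be recorded and sanity-checked before a feature leaves design:

```python
from dataclasses import dataclass

# Hypothetical design-stage record; the set of bases mirrors Art. 6 GDPR.
LEGAL_BASES = {"consent", "contract", "legal obligation",
               "vital interests", "public task", "legitimate interest"}

@dataclass
class PrivacyDesignRecord:
    feature: str
    data_collected: list   # what data is collected, and why
    legal_basis: str       # which Art. 6 GDPR basis applies
    retention_days: int    # how long it is retained
    access_roles: list     # who has access

    def validate(self) -> list:
        """Flag incomplete answers before development begins."""
        issues = []
        if not self.data_collected:
            issues.append("no data inventory")
        if self.legal_basis not in LEGAL_BASES:
            issues.append(f"unrecognised legal basis: {self.legal_basis!r}")
        if self.retention_days <= 0:
            issues.append("retention period not defined")
        if not self.access_roles:
            issues.append("access roles not defined")
        return issues
```

A record that fails `validate()` is a design conversation waiting to happen; a record that passes is the start of the accountability documentation.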
The sprint review integration — where a privacy advisor reviews product demos when data processing changes are involved — is the mechanism that catches compliance issues when they are still inexpensive to address. A data field added to a user record, a new third-party integration, or a change to the analytics model can each trigger GDPR implications that are visible in a demo but invisible in a code review.
Privacy by Design for AI Systems
For artificial intelligence systems, data protection impact assessments and privacy by design are especially critical because the architecture decisions made at model design time determine whether the system can be GDPR-compliant in a structural sense. A model trained without data minimisation cannot be made minimalist retrospectively without complete retraining. Differential privacy, federated learning, pseudonymised training datasets, and explainable AI (XAI) design are tools that must be chosen at the outset — not added after the model is in production.
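As one illustration of such a design-time choice, the toy sketch below (assuming Python; not a production mechanism) shows the core idea of differential privacy: an aggregate statistic is only ever released with Laplace noise calibrated to a privacy budget epsilon, and that budget must be fixed before the system ships:

```python
import random

def dp_count(values, epsilon: float, sensitivity: float = 1.0) -> float:
    """Release a count with Laplace(0, sensitivity/epsilon) noise added.

    epsilon is the privacy budget: smaller epsilon means more noise and
    stronger privacy. Choosing it is an architecture decision, not a patch.
    """
    scale = sensitivity / epsilon
    # Laplace noise sampled as the difference of two exponential variates
    noise = random.expovariate(1 / scale) - random.expovariate(1 / scale)
    return len(values) + noise
```

The point of the example is structural: the noise is applied at release time by design, so no downstream consumer ever sees the exact count. Bolting this on after exact counts have already been exposed offers no protection.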
Privacy by default in the user experience is a component that product teams frequently underestimate. The product’s default privacy configuration is not just a legal requirement — it is also a signal to users of the organisation’s genuine commitment to their data. Platforms that share data with third parties by default, that activate advertising tracking without consent, or that make privacy controls difficult to find generate greater distrust and greater regulatory exposure than those that adopt the opposite model.
The Cost of Retrofitting Privacy: Why the Design Stage Is the Right Moment
The practical implication is that privacy by design is not just a legal compliance requirement: it is also a cost management discipline. The AEPD's guidance on privacy by design in digital products and AI systems is the reference standard in Spain, and the sanction for non-compliance with Article 25 GDPR can reach EUR 10 million or 2% of global annual turnover, whichever is higher. But the business case for early privacy integration is economic before it is legal.
Data Protection Impact Assessments (DPIAs) as a Design Prerequisite
The Data Protection Impact Assessment (DPIA, or EIPD in Spanish regulatory terminology) is a mandatory step under Article 35 GDPR before implementing any processing activity that is likely to result in a high risk to the rights and freedoms of individuals. The AEPD has published a list of processing activities that require a DPIA, including: large-scale processing of sensitive data, systematic monitoring of publicly accessible spaces, profiling that produces legal or similarly significant effects, processing of biometric data for identification, and processing involving AI or new technologies.
The DPIA is not a bureaucratic formality — it is a structured risk management exercise that identifies the data protection risks of a processing activity and the measures available to mitigate them. A well-conducted DPIA at design stage can avoid the regulatory crisis of launching a product that creates high privacy risk without appropriate safeguards. Our outsourced DPO service includes DPIA supervision as a standard function, ensuring that impact assessments are conducted before irreversible design decisions are made, are documented in a format that withstands supervisory scrutiny, and are updated when the processing activity changes materially.
Regulatory Framework: GDPR Art. 25 and AEPD Guidance
Article 25 of the GDPR imposes two distinct obligations:
Privacy by design (Art. 25.1): the controller must, both at the time of determining the means of processing and at the time of processing itself, implement appropriate technical and organisational measures — such as pseudonymisation — designed to implement data protection principles effectively and to integrate the necessary safeguards into the processing, in order to meet the requirements of the GDPR and protect the rights of data subjects.
Privacy by default (Art. 25.2): the controller must implement appropriate technical and organisational measures for ensuring that, by default, only personal data which are necessary for each specific purpose of the processing are processed. This obligation applies to the amount of personal data collected, the extent of their processing, the period of their storage, and their accessibility.
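In code, Art. 25.2 translates into a simple discipline: every optional data flow ships disabled. A hypothetical sketch (assuming Python; the setting names are illustrative, not a prescribed schema):

```python
from dataclasses import dataclass, fields

@dataclass
class PrivacySettings:
    """Defaults are the most protective option; users opt in, never out."""
    marketing_emails: bool = False       # Art. 25.2: never pre-enabled
    behavioural_analytics: bool = False
    third_party_sharing: bool = False
    profile_public: bool = False         # most restrictive visibility

def assert_protective_defaults(cls) -> None:
    """Guard (e.g. run in CI) that no optional flow is enabled by default."""
    enabled = [f.name for f in fields(cls) if f.default is True]
    if enabled:
        raise ValueError(f"non-protective defaults: {enabled}")
```

A check like `assert_protective_defaults(PrivacySettings)` in the test suite turns the legal default into a regression test: a developer who flips a default to `True` breaks the build, not the compliance posture.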
The AEPD has published detailed guidance on the implementation of Art. 25, including specific guidance for software and application development (Guía para el desarrollo de aplicaciones con privacidad por diseño y por defecto, 2020). The guidance establishes a risk-based approach: privacy by design measures must be proportionate to the risks that the processing presents to the rights and freedoms of data subjects.
AEPD sanction precedents for Art. 25 violations (as distinct from data breaches under Art. 5/6/9) include fines for:
- Default privacy settings that share user data with third parties without active consent.
- Data retention policies that retain data beyond the stated purpose without a documented legal basis.
- Architecture decisions that process more personal data than necessary for the stated purpose.
- Lack of pseudonymisation in development and testing environments that use production personal data.
Sectors Most Affected
HealthTech and digital health: electronic health records, patient-facing applications, telemedicine platforms, and connected medical devices process special-category data under Art. 9 GDPR in large volumes. Privacy by design obligations are most demanding in this sector: data minimisation, purpose limitation, pseudonymisation of records, and patient access rights must all be embedded in the product architecture from the design phase.
FinTech and financial services: credit scoring apps, open banking platforms, investment advisory tools, and insurance comparison engines process financial and behavioural data at scale. GDPR Art. 22 (right not to be subject to solely automated decisions) creates a specific privacy by design challenge: automated decision systems must be architected to allow human review as a genuine (not perfunctory) intervention.
HR technology: applicant tracking systems, employee monitoring software, performance management platforms, and people analytics tools are consistently cited by the AEPD as areas of high enforcement risk. The combination of GDPR obligations and specific Spanish employment law provisions on algorithmic transparency (RDL 9/2021) creates a dual compliance requirement that must be addressed at product design stage.
EdTech and education platforms: platforms serving minors require heightened privacy protections under GDPR Recital 38 and the LOPDGDD. Data minimisation and age verification by design are mandatory; any processing that exceeds what is strictly necessary for the educational service is non-compliant by definition.
Company Size Segmentation
Startups developing digital products face the most significant privacy by design challenge relative to their stage: the architectural decisions made during the MVP phase typically persist through the product’s entire lifecycle. A data model designed without pseudonymisation cannot easily be retrofitted; a data retention policy not embedded in the product logic from launch requires engineering work to implement retrospectively. Privacy by design integrated from the first sprint is the most cost-effective compliance approach for early-stage companies.
SMEs with established digital products often have inherited technical debt — features built without privacy review, third-party integrations added without DPA assessment, data retention policies not enforced by the system. Our privacy audit for established digital products identifies these debt items and prioritises remediation by regulatory risk and engineering effort.
Corporate groups with multiple digital products require an enterprise privacy architecture programme: consistent data classification, standard privacy controls implemented across products, group-level DPIA methodology, and coordinated training for product teams across business units.
Common Mistakes We Fix
- Using production personal data in development and test environments. Development and testing environments that contain real personal data are a significant source of data exposure risk and a clear Art. 25 GDPR violation (processing more data than necessary, failure to pseudonymise). Anonymisation or synthetic data generation for non-production environments is the correct approach.
- Tracking and marketing enabled by default. Platforms that pre-enable marketing communications, behavioural analytics, or third-party tracking cookies violate Art. 25.2 (privacy by default). Consent must be affirmative and specific; default settings must be the most privacy-protective available.
- Collecting data “just in case.” The data minimisation principle requires that only data strictly necessary for the defined purpose is collected. Forms with optional data fields that are never used, analytics tracking that records user behaviour beyond what the stated purpose requires, or profile completeness systems that incentivise users to provide more data than necessary all violate Art. 25 by design.
- Not applying retention periods technically. A privacy policy that states a retention period is insufficient if the system does not technically enforce deletion when the period expires. Retention must be enforced in the data architecture: through automatic deletion schedules, data lifecycle management, or equivalent technical mechanisms.
- Treating third-party API integrations as outside the privacy by design scope. Third-party services integrated into a product (analytics platforms, advertising networks, customer support tools, payment processors) receive personal data from the product. The privacy by design obligation extends to the data flows to these third parties: the DPA framework must be in place, data minimisation principles applied to the data shared, and the integration designed to share only what is necessary for the defined purpose.
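Of the mistakes above, retention is the one most often left unenforced. A minimal sketch of what technical enforcement means (assuming Python; the categories and in-memory storage model are hypothetical stand-ins for a real scheduled job):

```python
from datetime import datetime, timedelta, timezone

# Hypothetical retention schedule per data category, fixed at design time.
RETENTION = {
    "audit_log": timedelta(days=365),
    "analytics_event": timedelta(days=90),
    "support_ticket": timedelta(days=730),
}

def purge_expired(records: list, now: datetime = None) -> list:
    """Return only records still inside their category's retention window.

    In a real system this runs as a scheduled job against every storage
    layer (database, backups, logs, analytics), not one in-memory list.
    """
    now = now or datetime.now(timezone.utc)
    return [
        r for r in records
        if now - r["created_at"] <= RETENTION[r["category"]]
    ]
```

Because the schedule is data, not prose, the privacy policy and the deletion behaviour cannot silently drift apart: changing a retention period means changing `RETENTION`, which is reviewable in version control.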
Geographic Coverage
We provide privacy by design advisory across Spain, with a primary focus on technology companies and digital product teams in Madrid, Barcelona, and Valencia. For companies developing products for multiple EU markets, we advise on privacy by design in the context of the GDPR as applied by different national supervisory authorities, noting that interpretive positions on specific design questions (particularly around consent and purpose limitation) vary between the AEPD and other EU data protection authorities.
Worked Example: Privacy by Design Integration for a HealthTech App
A Spanish digital health company (40 employees, EUR 3 million ARR) developing an occupational health monitoring application engaged BMC at the prototype stage. The application collected data on employees’ physical activity, sleep patterns, and stress indicators, shared with employers as aggregated workforce health reports and with individual employees as personal health insights.
BMC’s privacy by design integration:
- Data flow mapping: identified 7 distinct personal data processing activities, including health data (Art. 9 GDPR special category) and behavioural data.
- Legal basis architecture: individual employee consent for personal health insights (granular, revocable consent UX designed into the onboarding flow); legitimate interest analysis for employer aggregate reporting (anonymised data does not trigger GDPR, but the anonymisation standard had to be technically verified).
- DPIA: conducted for the health data processing and the potential re-identification risk of “anonymous” aggregate reports in small teams (teams of fewer than 10 raise re-identification concerns even with aggregation).
- Data minimisation: the original data model collected 23 data fields; privacy by design review reduced this to 14 essential fields, with 9 optional fields behind an explicit in-app opt-in.
- Pseudonymisation architecture: individual health data stored with pseudonymised identifiers; the mapping table held separately with access restricted to a defined security role.
- Third-party integration review: the app used a US-based analytics SDK that transferred personal data to the US. Standard Contractual Clauses (SCCs) put in place; data transfer impact assessment (DTIA) conducted; alternative EU-based SDK identified as fallback.
- Result: launched compliant. First inspection by the AEPD (triggered by a competitor complaint about the industry) found no violations. Legal costs of the privacy by design integration: approximately EUR 18,000. Estimated cost of equivalent remediation after launch: EUR 60,000-90,000 in engineering and legal costs, plus potential AEPD sanction exposure.
How We Work
Our privacy by design practice integrates directly into product and engineering teams. The service is available in three engagement models:
Sprint review integration: a privacy advisor reviews new features and data processing changes during the development sprint, providing real-time design guidance. Typically 4-8 hours per sprint for active development teams. Catches compliance issues at the lowest-cost correction point.
DPIA as a service: for new products or significant new features, we conduct the full Art. 35 GDPR impact assessment, including the risk assessment, the measures consultation, and the DPO supervision. Output is a completed DPIA document in AEPD-compliant format.
Privacy architecture audit: for established digital products, we audit the existing data architecture, data flows, and third-party integrations against privacy by design requirements, producing a prioritised remediation plan with engineering effort estimates.
All three models can be provided in coordination with our outsourced DPO service, where the DPO provides formal oversight and sign-off on DPIAs and major processing decisions.
Real results from privacy by design implementation
When we started developing our occupational health app, we brought BMC in during the design phase. They defined the data architecture, conducted the DPIA, and reviewed every sprint with the team. We launched compliant from day one without a single post-launch architectural change. Far less expensive than waiting.
Experienced team with local insight and international reach
What our privacy by design service includes
Development Cycle Integration
Defining the privacy process for agile teams: privacy review criteria in the definition of done, privacy analysis templates for new features, and workshops for product and engineering teams.
Compliant Data Architecture
Design or review of the product's data architecture to ensure the principles of minimisation, purpose limitation, storage limitation, and pseudonymisation or encryption where applicable.
Privacy by Default in UX
Review of the user experience design to ensure that default settings are the most protective and that the interface does not incorporate dark patterns that undermine consent.
Data Protection Impact Assessment
Determination of the DPIA requirement and, where triggered, completion of the assessment integrated into the design process before development begins.
Accountability Documentation
Records of processing activities update, product privacy notice drafting, and documentation of technical and organisational measures implemented.
Results that speak for themselves
GDPR Healthcare Spain: Compliance Case Study
AEPD investigation closed with no sanction. Full GDPR compliance achieved across all group centres within 6 months.
Criminal Compliance Spain: Construction Group Case
Criminal compliance program implemented in 6 months, whistleblower channel operational, AENOR certification obtained, and prosecution risk effectively mitigated.
AML compliance program for a real estate development group
SEPBLAC inspection passed with minor observations only, zero sanctions. Full AML program operational within 90 days.
Reference guides
Post-Brexit: your British company operating in Spain with the right structure
Post-Brexit advisory for UK companies operating in Spain: entity structuring, customs and VAT, work permits for British nationals, UK-Spain tax treaty optimisation, and data protection compliance.
View guide
AML compliance in Spain 2026: what your business must know about anti-money laundering regulation
Spain AML compliance 2026: SEPBLAC obligations, risk-based approach, PBC manual, UBO verification, and suspicious transaction reporting. Expert service from BMC.
View guide
Comprehensive legal services for businesses
Comprehensive legal advisory for businesses: commercial, employment, contracts, regulatory compliance, and dispute resolution. A dedicated legal team to protect your company.
View guide
Buy property in Spain with confidence — and without the horror stories
Buying property in Spain 2026: NIE, conveyancing, ITP tax, mortgage advice, and due diligence for foreign buyers. Step-by-step guide from BMC property lawyers.
View guide
The collective agreement that governs your workforce: understand it and negotiate from strength
Spain collective bargaining guide: union negotiation obligations, ERE/ERTE triggers, works council rights, agreement registration, and how BMC protects employer interests.
View guide
Your commercial lease agreement: get the clauses right before you sign
Spain commercial lease guide: LAU legal framework, rent review clauses, break options, guarantee structures, and key negotiation points for tenants and landlords.
View guide
Analysis and perspectives
Frequently asked questions about privacy by design
Start with a free diagnostic
Our team of specialists, with deep knowledge of the Spanish and European market, will guide you from day one.
Request your diagnostic
You may also be interested in
EU AI Act Compliance
Full compliance with the EU Artificial Intelligence Act: risk classification, conformity assessments, transparency obligations, and prohibited practice audits.
Learn more
Criminal Compliance
Corporate criminal compliance programmes to exempt or mitigate the criminal liability of legal entities under Article 31 bis of the Spanish Criminal Code.
Learn more
Cybersecurity Audit
Security posture assessment, compliance audits (ENS, ISO 27001, NIS2), vulnerability assessment, penetration testing management, and third-party risk evaluation.
Learn more
Data Protection & Privacy
GDPR and LOPDGDD compliance, outsourced DPO, and comprehensive privacy management for businesses.
Learn more
Key terms
EU AI Act
The EU Artificial Intelligence Act (Regulation EU 2024/1689) is the world's first comprehensive…
Read definition
Data Protection Officer (DPO)
A Data Protection Officer (DPO) is a designated individual responsible for overseeing an…
Read definition
Privacy by Design
A GDPR principle (Article 25) requiring data protection to be integrated into the design of…
Read definition
Standard Contractual Clauses (SCCs)
Model contracts adopted by the European Commission that provide adequate safeguards for transferring…
Read definition
Talk to the partner in charge
Response within 24 business hours. First meeting free.