Privacy by Design: Cheaper to Prevent Than to Remediate

Article 25 GDPR implementation: privacy by design and by default for digital products, software, apps, and internal processes. Direct integration with product and engineering teams.

Why privacy by design reduces costs and regulatory risk

90+
Products and systems with privacy by design implemented
Art. 25
GDPR mandate for privacy from the design stage
60%
Typical reduction in remediation costs vs post-launch compliance
4.8/5 on Google · 50+ reviews · 25+ years experience · 5 offices in Spain · 500+ clients
Quick assessment

Does this apply to your business?

Do your product and engineering teams consult the DPO or privacy advisor before beginning development of features that process personal data?

Is the default configuration of your products the most privacy-protective option, or do users have to actively search for how to reduce data sharing?

Have you defined data retention periods at every layer of your architecture (database, backups, logs, analytics) with a technical process to apply them automatically?

Does your development process include a privacy assessment before launching new features that might require a DPIA?

Our approach

Our privacy by design integration process

01

Privacy requirements analysis

In the product definition phase, we identify planned personal data processing activities, applicable legal bases, purposes, and data flows between systems, services, and third parties.

02

Compliant data architecture design

We define the data architecture that meets the principles of minimisation, purpose limitation, and storage limitation, and design the technical measures for pseudonymisation, encryption, and access control.

03

Impact assessment (if required) and design reviews

We determine whether the product requires a DPIA under Article 35 GDPR, conduct it where necessary, and participate in design reviews to verify that privacy requirements are maintained throughout development.

04

Launch and accountability documentation

We support the product launch with updated compliance documentation: privacy notices, information clauses, records of processing activities, and the DPIA report where applicable.

The challenge

Article 25 of the GDPR requires that data protection be considered from the moment of designing any product, service, or process that handles personal data. In practice, the vast majority of organisations follow the reverse sequence: they launch the product and then try to retrofit compliance onto an architecture that was not designed for it. The result is costly remediation, complex technical changes, and compliance that is frequently incomplete.

Our solution

We integrate privacy requirements into the product development cycle from the earliest design phases. We work directly with product, UX, and engineering teams to define the data architecture, technical and organisational measures, and information flows that ensure GDPR compliance without sacrificing product functionality.

Privacy by design and by default is a legally binding obligation under Article 25 of the EU General Data Protection Regulation (GDPR, Regulation 2016/679), which requires controllers to implement appropriate technical and organisational measures designed to give effect to data protection principles — such as data minimisation, purpose limitation, and storage limitation — both at the time of designing the processing and at the time of the processing itself. "Privacy by default" additionally requires that, by default, only personal data necessary for each specific purpose is processed. Failure to implement privacy by design and by default is a sanctionable GDPR infringement, independent of whether a data breach has occurred, and the AEPD has issued fines specifically for this violation.

Privacy by design is not a voluntary best practice — it is a legal obligation under Article 25 of the GDPR that creates liability for controllers who fail to implement it. And yet the majority of organisations continue to treat privacy as a post-development remediation exercise rather than a design requirement present from the earliest architectural decisions.

The True Cost of Getting the Sequence Wrong

The cost of the incorrect sequence is systematically underestimated. An architectural change that would have taken hours at the design stage — separating identification data from functional data, applying pseudonymisation from the source, implementing retention policies in the data model — can take weeks or months of engineering work when the system is already in production, with live data, dependent processes, and third-party contracts that constrain every change.

Beyond the direct engineering cost, post-launch privacy remediation is frequently incomplete. An architecture not designed for data minimisation cannot be made minimalist without rebuilding the data model. A system without audit logging cannot retroactively produce the access records that accountability requires. These structural deficiencies are visible to the AEPD in an inspection and are treated as evidence that privacy was not, in fact, built into the design.
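
To make the separation concrete, below is a minimal sketch of splitting identification data from functional data behind keyed-hash pseudonymisation. The store layout, key handling, and function names are our illustration, not a prescribed architecture; a real deployment would keep the key in a secrets manager and the identity store behind restricted roles.

```python
# Sketch: identification data and functional data in separate stores,
# linked only by a keyed-hash pseudonym. Illustrative names throughout.
import hashlib
import hmac

PSEUDONYM_KEY = b"example-key-hold-in-secrets-manager"  # hypothetical key

def pseudonymise(user_id: str) -> str:
    """Stable pseudonym so functional records never carry the raw identifier."""
    return hmac.new(PSEUDONYM_KEY, user_id.encode(), hashlib.sha256).hexdigest()

# Identification store: access restricted to a narrow security role.
identity_store: dict[str, dict] = {}  # pseudonym -> {"user_id", "email"}

# Functional store: product features and analytics see pseudonyms only.
activity_store: list[dict] = []

def register_user(user_id: str, email: str) -> None:
    identity_store[pseudonymise(user_id)] = {"user_id": user_id, "email": email}

def record_event(user_id: str, event: str) -> None:
    activity_store.append({"pseudonym": pseudonymise(user_id), "event": event})
```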

Integration Without Bureaucracy

Our integration into product and engineering teams is structured around a lightweight process that generates real protections without bureaucratic overhead. For each new feature or product with a personal data component, we work with the team to answer four questions at the design stage: what data is collected and why, on what legal basis, for how long it is retained, and who has access. This exercise, conducted during design, rarely requires more than an hour. Conducted after launch, it can require weeks of audit and months of remediation.
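
One lightweight way to capture those four questions is a structured record completed per feature at design time; the sketch below uses field names of our own choosing and is not a prescribed template.

```python
# Design-stage privacy record answering the four questions per feature.
# Field names and the example values are illustrative only.
from dataclasses import dataclass

@dataclass
class FeaturePrivacyRecord:
    feature: str
    data_collected: list[str]   # what data is collected
    purpose: str                # and why
    legal_basis: str            # e.g. "consent", "contract", "legitimate interest"
    retention: str              # how long it is retained, and how that is enforced
    access_roles: list[str]     # who has access

record = FeaturePrivacyRecord(
    feature="in-app activity reminders",
    data_collected=["activity timestamps", "device timezone"],
    purpose="schedule reminder notifications",
    legal_basis="consent",
    retention="90 days, deleted by a nightly lifecycle job",
    access_roles=["backend-service", "support-tier-2"],
)
```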

The sprint review integration — where a privacy advisor reviews product demos when data processing changes are involved — is the mechanism that catches compliance issues when they are still inexpensive to address. A data field added to a user record, a new third-party integration, or a change to the analytics model can each trigger GDPR implications that are visible in a demo but invisible in a code review.

Privacy by Design for AI Systems

For artificial intelligence systems, data protection impact assessments and privacy by design are especially critical because the architecture decisions made at model design time determine whether the system can be GDPR-compliant in a structural sense. A model trained without data minimisation cannot be made minimalist retrospectively without complete retraining. Differential privacy, federated learning, pseudonymised training datasets, and explainable AI (XAI) design are tools that must be chosen at the outset — not added after the model is in production.
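
To illustrate why such tooling has to be a design-time choice, here is a minimal differential-privacy sketch: Laplace noise added to an aggregate count, with the privacy parameter epsilon fixed at design time. This is a teaching sketch only; a production system would use a vetted DP library and track a privacy budget across queries.

```python
# Minimal Laplace-mechanism sketch for a differentially private count.
# Illustrative only; use a vetted DP library in production.
import random

def dp_count(flags: list[bool], epsilon: float) -> float:
    """Noisy count: a count query has sensitivity 1, so the Laplace scale is 1/epsilon."""
    # Difference of two Exponential(epsilon) draws is Laplace(0, 1/epsilon).
    noise = random.expovariate(epsilon) - random.expovariate(epsilon)
    return sum(flags) + noise

# Example: number of users who enabled stress tracking, released with noise.
print(dp_count([True, False, True, True], epsilon=0.5))
```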

Privacy by default in the user experience is a component that product teams frequently underestimate. The product’s default privacy configuration is not just a legal requirement — it is also a signal to users of the organisation’s genuine commitment to their data. Platforms that share data with third parties by default, that activate advertising tracking without consent, or that make privacy controls difficult to find generate greater distrust and greater regulatory exposure than those that adopt the opposite model.
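
In implementation terms, privacy by default largely reduces to the initial settings object shipped with the product. A minimal sketch, with flag names of our own invention:

```python
# Privacy-by-default sketch: the most protective configuration is the default,
# and any relaxation requires an explicit user action. Names are illustrative.
from dataclasses import dataclass

@dataclass
class PrivacySettings:
    share_with_third_parties: bool = False  # off unless the user opts in
    advertising_tracking: bool = False      # requires affirmative consent
    analytics_mode: str = "anonymised"      # identified analytics only on consent
    profile_visibility: str = "private"     # most restrictive visibility first
    security_notifications: bool = True     # protective features on by default

def relax_setting(settings: PrivacySettings, flag: str, value: object,
                  user_confirmed: bool) -> None:
    # Moving away from the default must be an active, attributable user choice.
    if not user_confirmed:
        raise PermissionError("privacy settings may only be relaxed by the user")
    setattr(settings, flag, value)
```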

The Business Case: Why Design Stage Is the Right Moment

The practical implication is that privacy by design is not just a legal compliance requirement; it is also a cost management discipline. The AEPD's guidance on privacy by design in digital products and AI systems is the reference standard in Spain, and the sanction for non-compliance with Article 25 GDPR can reach EUR 10 million or 2% of global annual turnover, whichever is higher. But the business case for early privacy integration is economic before it is legal.

Data Protection Impact Assessments (DPIAs) as a Design Prerequisite

The Data Protection Impact Assessment (DPIA, or EIPD in Spanish regulatory terminology) is a mandatory step under Article 35 GDPR before implementing any processing activity that is likely to result in a high risk to the rights and freedoms of individuals. The AEPD has published a list of processing activities that require a DPIA, including: large-scale processing of sensitive data, systematic monitoring of publicly accessible spaces, profiling that produces legal or similarly significant effects, processing of biometric data for identification, and processing involving AI or new technologies.

The DPIA is not a bureaucratic formality — it is a structured risk management exercise that identifies the data protection risks of a processing activity and the measures available to mitigate them. A well-conducted DPIA at design stage can avoid the regulatory crisis of launching a product that creates high privacy risk without appropriate safeguards. Our outsourced DPO service includes DPIA supervision as a standard function, ensuring that impact assessments are conducted before irreversible design decisions are made, are documented in a format that withstands supervisory scrutiny, and are updated when the processing activity changes materially.
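
As an illustration of making the screening routine, the AEPD trigger list can be encoded as a design-stage gate. The criteria below paraphrase the list above and are not exhaustive; a positive result means escalate to the DPO, not that the Art. 35 analysis is done.

```python
# DPIA screening sketch based on the trigger categories listed above.
# Non-exhaustive and illustrative; a hit means "consult the DPO".
TRIGGERS = {
    "large_scale_sensitive_data": "large-scale processing of special-category data",
    "public_space_monitoring": "systematic monitoring of publicly accessible spaces",
    "significant_profiling": "profiling with legal or similarly significant effects",
    "biometric_identification": "biometric data processed for identification",
    "ai_or_new_tech": "processing involving AI or other new technologies",
}

def dpia_screening(answers: dict[str, bool]) -> tuple[bool, list[str]]:
    hits = [desc for key, desc in TRIGGERS.items() if answers.get(key)]
    return bool(hits), hits

needed, reasons = dpia_screening({"ai_or_new_tech": True})
if needed:
    print("DPIA screening positive:", "; ".join(reasons))
```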

Regulatory Framework: GDPR Art. 25 and AEPD Guidance

Article 25 of the GDPR imposes two distinct obligations:

Privacy by design (Art. 25.1): the controller must, both at the time of determining the means of processing and at the time of processing itself, implement appropriate technical and organisational measures — such as pseudonymisation — designed to implement data protection principles effectively and to integrate the necessary safeguards into the processing, in order to meet the requirements of the GDPR and protect the rights of data subjects.

Privacy by default (Art. 25.2): the controller must implement appropriate technical and organisational measures for ensuring that, by default, only personal data which are necessary for each specific purpose of the processing are processed. This obligation applies to the amount of personal data collected, the extent of their processing, the period of their storage, and their accessibility.

The AEPD has published detailed guidance on the implementation of Art. 25, including specific guidance for software and application development (Guía para el desarrollo de aplicaciones con privacidad por diseño y por defecto, 2020). The guidance establishes a risk-based approach: privacy by design measures must be proportionate to the risks that the processing presents to the rights and freedoms of data subjects.

AEPD sanction precedents for Art. 25 violations (as distinct from infringements of the data protection principles and lawfulness rules under Arts. 5, 6, and 9) include fines for:

  • Default privacy settings that share user data with third parties without active consent.
  • Data retention policies that retain data beyond the stated purpose without a documented legal basis.
  • Architecture decisions that process more personal data than necessary for the stated purpose.
  • Lack of pseudonymisation in development and testing environments that use production personal data.

Sectors Most Affected

HealthTech and digital health: electronic health records, patient-facing applications, telemedicine platforms, and connected medical devices process special-category data under Art. 9 GDPR in large volumes. Privacy by design obligations are most demanding in this sector: data minimisation, purpose limitation, pseudonymisation of records, and patient access rights must all be embedded in the product architecture from the design phase.

FinTech and financial services: credit scoring apps, open banking platforms, investment advisory tools, and insurance comparison engines process financial and behavioural data at scale. GDPR Art. 22 (right not to be subject to solely automated decisions) creates a specific privacy by design challenge: automated decision systems must be architected to allow human review as a genuine (not perfunctory) intervention.

HR technology: applicant tracking systems, employee monitoring software, performance management platforms, and people analytics tools are consistently cited by the AEPD as areas of high enforcement risk. The combination of GDPR obligations and specific Spanish employment law provisions on algorithmic transparency (RDL 9/2021) creates a dual compliance requirement that must be addressed at product design stage.

EdTech and education platforms: platforms serving minors require heightened privacy protections under GDPR Recital 38 and the LOPDGDD. Data minimisation and age verification by design are mandatory; any processing that exceeds what is strictly necessary for the educational service is non-compliant by definition.

Company Size Segmentation

Startups developing digital products face the most significant privacy by design challenge relative to their stage: the architectural decisions made during the MVP phase typically persist through the product’s entire lifecycle. A data model designed without pseudonymisation cannot easily be retrofitted; a data retention policy not embedded in the product logic from launch requires engineering work to implement retrospectively. Privacy by design integrated from the first sprint is the most cost-effective compliance approach for early-stage companies.

SMEs with established digital products often have inherited technical debt — features built without privacy review, third-party integrations added without DPA assessment, data retention policies not enforced by the system. Our privacy audit for established digital products identifies these debt items and prioritises remediation by regulatory risk and engineering effort.

Corporate groups with multiple digital products require an enterprise privacy architecture programme: consistent data classification, standard privacy controls implemented across products, group-level DPIA methodology, and coordinated training for product teams across business units.

Common Mistakes We Fix

  1. Using production personal data in development and test environments. Development and testing environments that contain real personal data are a significant source of data exposure risk and a clear Art. 25 GDPR violation (processing more data than necessary, failure to pseudonymise). Anonymisation or synthetic data generation for non-production environments is the correct approach.

  2. Tracking enabled by default for marketing and analytics. Platforms that activate marketing communications, behavioural analytics, or third-party tracking cookies by default violate Art. 25.2 (privacy by default). Consent must be affirmative and specific; default settings must be the most privacy-protective available.

  3. Collecting data “just in case.” The data minimisation principle requires that only data that is strictly necessary for the defined purpose is collected. Forms that collect optional data fields that are never used, analytics tracking that records user behaviour beyond what the stated purpose requires, or profile completeness systems that incentivise users to provide more data than necessary — all violate Art. 25 by design.

  4. Not applying retention periods technically. A privacy policy that states a retention period is insufficient if the system does not technically enforce deletion when the period expires. Retention must be enforced in the data architecture through automatic deletion schedules, data lifecycle management, or equivalent technical mechanisms; a sketch of this enforcement appears after this list.

  5. Treating third-party API integrations as outside the privacy-by-design scope. Third-party services integrated into a product (analytics platforms, advertising networks, customer support tools, payment processors) receive personal data from the product. The privacy by design obligation extends to the data flows to these third parties: the DPA framework must be in place, data minimisation principles applied to the data shared, and the integration designed to share only what is necessary for the defined purpose.
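
For mistake 4 above, a minimal sketch of retention enforced in the data layer rather than promised in a policy, assuming a relational store; the table names and periods are illustrative.

```python
# Retention enforcement sketch (mistake 4): a scheduled job deletes rows
# whose retention window has expired. Table names and periods illustrative.
import sqlite3
from datetime import datetime, timedelta, timezone

RETENTION = {
    "activity_events": timedelta(days=90),
    "support_tickets": timedelta(days=365),
}

def enforce_retention(conn: sqlite3.Connection) -> None:
    now = datetime.now(timezone.utc)
    for table, period in RETENTION.items():
        cutoff = (now - period).isoformat()
        # Deletion happens in the system itself, not only in the privacy policy.
        conn.execute(f"DELETE FROM {table} WHERE created_at < ?", (cutoff,))
    conn.commit()
```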

Geographic Coverage

We provide privacy by design advisory across Spain, with a primary focus on technology companies and digital product teams in Madrid, Barcelona, and Valencia. For companies developing products for multiple EU markets, we advise on privacy by design in the context of the GDPR as applied by different national supervisory authorities, noting that interpretive positions on specific design questions (particularly around consent and purpose limitation) vary between the AEPD and other EU data protection authorities.

Worked Example: Privacy by Design Integration for a HealthTech App

A Spanish digital health company (40 employees, EUR 3 million ARR) developing an occupational health monitoring application engaged BMC at the prototype stage. The application collected data on employees’ physical activity, sleep patterns, and stress indicators, shared with employers as aggregated workforce health reports and with individual employees as personal health insights.

BMC’s privacy by design integration:

  • Data flow mapping: identified 7 distinct personal data processing activities, including health data (Art. 9 GDPR special category) and behavioural data.
  • Legal basis architecture: individual employee consent for personal health insights (granular, revocable consent UX designed into the onboarding flow); legitimate interest analysis for employer aggregate reporting (anonymised data does not trigger GDPR, but the anonymisation standard had to be technically verified).
  • DPIA: conducted for the health data processing and the potential re-identification risk of “anonymous” aggregate reports in small teams (teams of fewer than 10 raise re-identification concerns even with aggregation; see the aggregation sketch after this list).
  • Data minimisation: the original data model collected 23 data fields; privacy by design review reduced this to 14 essential fields, with 9 optional fields behind an explicit in-app opt-in.
  • Pseudonymisation architecture: individual health data stored with pseudonymised identifiers; the mapping table held separately with access restricted to a defined security role.
  • Third-party integration review: the app used a US-based analytics SDK that transferred personal data to the US. Standard Contractual Clauses (SCCs) put in place; data transfer impact assessment (DTIA) conducted; alternative EU-based SDK identified as fallback.
  • Result: launched compliant. First inspection by the AEPD (triggered by a competitor complaint about the industry) found no violations. Legal costs of the privacy by design integration: approximately EUR 18,000. Estimated cost of equivalent remediation after launch: EUR 60,000-90,000 in engineering and legal costs, plus potential AEPD sanction exposure.
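
As flagged in the DPIA bullet, one mitigation for small-team re-identification is a minimum group size below which aggregates are suppressed. A minimal sketch; the threshold of 10 mirrors the concern noted above, though the exact value is a design decision for each deployment.

```python
# Minimum-group-size sketch for aggregate reports: suppress any figure
# computed over fewer than MIN_GROUP_SIZE individuals. Threshold illustrative.
MIN_GROUP_SIZE = 10

def team_average(values: list[float]) -> float | None:
    """Return the aggregate only when the group is large enough to resist
    re-identification; otherwise suppress the figure entirely."""
    if len(values) < MIN_GROUP_SIZE:
        return None  # the report shows "insufficient group size" instead
    return sum(values) / len(values)
```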

How We Work

Our privacy by design practice integrates directly into product and engineering teams. The service is available in three engagement models:

Sprint review integration: a privacy advisor reviews new features and data processing changes during the development sprint, providing real-time design guidance. Typically 4-8 hours per sprint for active development teams. This catches compliance issues at the lowest-cost correction point.

DPIA as a service: for new products or significant new features, we conduct the full Art. 35 GDPR impact assessment, including the risk assessment, the measures consultation, and the DPO supervision. Output is a completed DPIA document in AEPD-compliant format.

Privacy architecture audit: for established digital products, we audit the existing data architecture, data flows, and third-party integrations against privacy by design requirements, producing a prioritised remediation plan with engineering effort estimates.

All three models can be provided in coordination with our outsourced DPO service, where the DPO provides formal oversight and sign-off on DPIAs and major processing decisions.

Track record

Real results from privacy by design implementation

When we started developing our occupational health app, we brought BMC in during the design phase. They defined the data architecture, conducted the DPIA, and reviewed every sprint with the team. We launched compliant from day one without a single post-launch architectural change. Far less expensive than waiting.

WorkHealth Technologies S.L.
CTO

Experienced team with local insight and international reach

What our privacy by design service includes

Development Cycle Integration

Defining the privacy process for agile teams: privacy review criteria in the definition of done, privacy analysis templates for new features, and workshops for product and engineering teams.

Compliant Data Architecture

Design or review of the product's data architecture to satisfy the principles of minimisation, purpose limitation, and storage limitation, applying pseudonymisation or encryption where appropriate.

Privacy by Default in UX

Review of the user experience design to ensure that default settings are the most protective and that the interface does not incorporate dark patterns that undermine consent.

Data Protection Impact Assessment

Determination of the DPIA requirement and, where triggered, completion of the assessment integrated into the design process before development begins.

Accountability Documentation

Updates to the records of processing activities, drafting of the product privacy notice, and documentation of the technical and organisational measures implemented.

Guides

Reference guides

Post-Brexit: your British company operating in Spain with the right structure

Post-Brexit advisory for UK companies operating in Spain: entity structuring, customs and VAT, work permits for British nationals, UK-Spain tax treaty optimisation, and data protection compliance.

View guide

AML compliance in Spain 2026: what your business must know about anti-money laundering regulation

Spain AML compliance 2026: SEPBLAC obligations, risk-based approach, PBC manual, UBO verification, and suspicious transaction reporting. Expert service from BMC.

View guide

Comprehensive legal services for businesses

Comprehensive legal advisory for businesses: commercial, employment, contracts, regulatory compliance, and dispute resolution. A dedicated legal team to protect your company.

View guide

Buy property in Spain with confidence — and without the horror stories

Buying property in Spain 2026: NIE, conveyancing, ITP tax, mortgage advice, and due diligence for foreign buyers. Step-by-step guide from BMC property lawyers.

View guide

The collective agreement that governs your workforce: understand it and negotiate from strength

Spain collective bargaining guide: union negotiation obligations, ERE/ERTE triggers, works council rights, agreement registration, and how BMC protects employer interests.

View guide

Your commercial lease agreement: get the clauses right before you sign

Spain commercial lease guide: LAU legal framework, rent review clauses, break options, guarantee structures, and key negotiation points for tenants and landlords.

View guide

Service Lead

Bárbara Botía Sainz de Baranda

Senior Lawyer — Legal Division

Registered no. 11,233, Málaga Bar Association · Law Degree, University of Murcia · BBA in Business Administration, University of Murcia

FAQ

Frequently asked questions about privacy by design

What does Article 25 GDPR actually require?

Article 25 GDPR imposes two complementary obligations. Privacy by design: the controller must implement appropriate technical and organisational measures to ensure that data protection principles are built in from the design stage. Privacy by default: the default settings of the product or service must ensure that only the personal data necessary for each specific purpose is processed. Both obligations apply before processing commences, not merely once it is underway.

How does privacy by design fit into agile development?

Integration into agile methodologies operates at several levels: privacy criteria in the team's definition of done, a privacy review step in the design review process before beginning development of each feature, DPO or privacy advisor participation in product demos when data processing changes, and a standard rapid assessment process for new user stories with a personal data component.

What technical measures does privacy by design typically involve?

Typical technical measures include: pseudonymisation of personal data in development and test environments, encryption of sensitive data in transit and at rest, role-based access control with minimum privilege, audit logging of personal data access, technically secure deletion procedures when retention periods expire, anonymisation of data for individual-level analytics, and separation of identification data from functional data.

Is privacy by design cost-effective?

Yes, when integrated correctly from the outset. The cost of integrating privacy at the design stage is systematically lower than remediating non-compliance on a live system. Post-launch architectural changes (separating data, adding encryption, implementing retention policies, redesigning data models) are technically complex, expensive, and frequently imperfect. The investment in a correct privacy process from the start is recovered in the first remediation avoided.

How does privacy by design apply to AI systems?

AI systems present specific privacy by design challenges: a tendency to memorise training data in ways that can reveal individual information, the need for large data volumes that conflicts with minimisation, and model opacity that complicates explainability. Differential privacy techniques, federated learning, pseudonymisation of training data, and explainable AI (XAI) design are the primary privacy by design tools for AI. We combine these with EU AI Act compliance where the system is high-risk.

What does privacy by default mean in practice?

Privacy by default means the initial configuration of the product or service must be the most privacy-protective, without the user having to do anything to activate it. In practice: data sharing with third parties disabled by default, the highest privacy setting in profile visibility options, analytics in anonymised mode by default where the user has not consented, and security notifications enabled by default. Users may reduce the privacy level if they actively choose to, but the starting point must be maximum protection.

Can privacy by design be applied to a product that is already live?

Yes. Although the ideal is to integrate privacy from the design stage, we also conduct privacy assessments of live systems to identify existing gaps and prioritise corrective measures. This product privacy audit covers the data architecture, information flows, technical security measures, and retention policies. The output is a remediation plan prioritised by risk and implementation effort.

Does privacy by design involve UX design?

Yes. UX design is a critical component: how consent is requested, the clarity of user-facing information, the ease of exercising access and erasure rights, and the absence of dark patterns in privacy settings all have direct compliance implications. We work with UX teams to ensure the user interface reinforces rather than undermines the privacy framework.
First step

Start with a free diagnostic

Our team of specialists, with deep knowledge of the Spanish and European market, will guide you from day one.

Talk to the partner in charge

Response within 24 business hours. First meeting free.
