
Privacy by Design: Cheaper to Prevent Than to Remediate

Article 25 GDPR implementation: privacy by design and by default for digital products, software, apps, and internal processes. Direct integration with product and engineering teams.

90+
Products and systems with privacy by design implemented
Art. 25
GDPR mandate for privacy from the design stage
60%
Typical reduction in remediation costs vs post-launch compliance
4.8/5 on Google · 50+ reviews · 25+ years experience · 5 offices in Spain · 500+ clients
Quick assessment

Does this apply to your business?

Do your product and engineering teams consult the DPO or privacy advisor before beginning development of features that process personal data?

Is the default configuration of your products the most privacy-protective option, or do users have to actively search for how to reduce data sharing?

Have you defined data retention periods at every layer of your architecture (database, backups, logs, analytics) with a technical process to apply them automatically?

Does your development process include a privacy assessment before launching new features that might require a DPIA?


Our approach

Our privacy by design integration process

01

Privacy requirements analysis

In the product definition phase, we identify planned personal data processing activities, applicable legal bases, purposes, and data flows between systems, services, and third parties.

02

Compliant data architecture design

We define the data architecture that meets the principles of minimisation, purpose limitation, and storage limitation, and design the technical measures for pseudonymisation, encryption, and access control.

03

Impact assessment (if required) and design reviews

We determine whether the product requires a DPIA under Article 35 GDPR, conduct it where necessary, and participate in design reviews to verify that privacy requirements are maintained throughout development.

04

Launch and accountability documentation

We accompany the product launch with updated compliance documentation: privacy notices, informational clauses, records of processing activities, and DPIA report where applicable.

The challenge

Article 25 of the GDPR requires that data protection be considered from the moment of designing any product, service, or process that handles personal data. In practice, the vast majority of organisations follow the reverse sequence: they launch the product and then try to retrofit compliance onto an architecture that was not designed for it. The result is costly remediation, complex technical changes, and compliance that is frequently incomplete.

Our solution

We integrate privacy requirements into the product development cycle from the earliest design phases. We work directly with product, UX, and engineering teams to define the data architecture, technical and organisational measures, and information flows that ensure GDPR compliance without sacrificing product functionality.

Privacy by design and by default is a legally binding obligation under Article 25 of the EU General Data Protection Regulation (GDPR, Regulation 2016/679), which requires controllers to implement appropriate technical and organisational measures designed to give effect to data protection principles — such as data minimisation, purpose limitation, and storage limitation — both at the time of designing the processing and at the time of the processing itself. "Privacy by default" additionally requires that, by default, only personal data necessary for each specific purpose is processed. Failure to implement privacy by design and by default is a sanctionable GDPR infringement, independent of whether a data breach has occurred, and the AEPD has issued fines specifically for this violation.

Privacy by design is not a voluntary best practice — it is a legal obligation under Article 25 of the GDPR that creates liability for controllers who fail to implement it. And yet the majority of organisations continue to treat privacy as a post-development remediation exercise rather than a design requirement present from the earliest architectural decisions.

The True Cost of Getting the Sequence Wrong

The cost of the incorrect sequence is systematically underestimated. An architectural change that would have taken hours at the design stage — separating identification data from functional data, applying pseudonymisation from the source, implementing retention policies in the data model — can take weeks or months of engineering work when the system is already in production, with live data, dependent processes, and third-party contracts that constrain every change.
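The separation of identification data from functional data mentioned above is straightforward when designed in from the start. The sketch below is illustrative only, with hypothetical record names, and the key handling is an assumption; in a real system the pseudonymisation key would live in a key-management service so the link can be severed by destroying the key:

```python
import hashlib
import hmac
from dataclasses import dataclass

# Illustrative secret; in production this lives in a key-management service.
PSEUDONYMISATION_KEY = b"managed-secret-placeholder"

def pseudonym(email: str) -> str:
    """Derive a stable pseudonymous key so functional records never
    store the direct identifier."""
    return hmac.new(PSEUDONYMISATION_KEY, email.encode(), hashlib.sha256).hexdigest()

@dataclass
class IdentityRecord:
    """Identification data: kept in a separate, tightly access-controlled store."""
    pseudonym_id: str
    email: str

@dataclass
class ActivityRecord:
    """Functional data: keyed only by the pseudonym, never by the identifier."""
    pseudonym_id: str
    event: str

# Re-linking requires access to the identity store, which is exactly
# the barrier pseudonymisation is meant to create.
pid = pseudonym("ana@example.com")
identity = IdentityRecord(pseudonym_id=pid, email="ana@example.com")
activity = ActivityRecord(pseudonym_id=pid, event="login")
```

Retrofitting this split onto a live schema means migrating every table, backup, and downstream consumer, which is why the design-stage hour is so much cheaper.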

Beyond the direct engineering cost, post-launch privacy remediation is frequently incomplete. An architecture not designed for data minimisation cannot be made minimalist without rebuilding the data model. A system without audit logging cannot retroactively produce the access records that accountability requires. These structural deficiencies are visible to the AEPD in an inspection and are treated as evidence that privacy was not, in fact, built into the design.

Integration Without Bureaucracy

Our integration into product and engineering teams is structured around a lightweight process that generates real protections without bureaucratic overhead. For each new feature or product with a personal data component, we work with the team to answer four questions at the design stage: what data is collected and why, on what legal basis, for how long it is retained, and who has access. This exercise, conducted during design, rarely requires more than an hour. Conducted after launch, it can require weeks of audit and months of remediation.
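Teams that want to make the four-question exercise repeatable can capture the answers in a lightweight record per feature. The schema below is a hypothetical Python sketch, not a standard artefact; every field name is illustrative:

```python
from dataclasses import dataclass, field

@dataclass
class PrivacyDesignReview:
    """One record per feature, answering the four design-stage questions.
    Field names are illustrative, not a standard schema."""
    feature: str
    data_collected: list = field(default_factory=list)  # what data is collected
    purpose: str = ""                                   # and why
    legal_basis: str = ""                               # e.g. "consent", "contract"
    retention_days: int = 0                             # for how long it is retained
    access_roles: list = field(default_factory=list)    # who has access

    def is_complete(self) -> bool:
        """A feature should not enter development until all four are answered."""
        return bool(self.data_collected and self.purpose and self.legal_basis
                    and self.retention_days > 0 and self.access_roles)

review = PrivacyDesignReview(
    feature="export-payslips",
    data_collected=["employee_id", "salary"],
    purpose="payroll compliance",
    legal_basis="contract",
    retention_days=365,
    access_roles=["hr_admin"],
)
```

A gate like `is_complete()` in the definition of done is what keeps the hour-long exercise from being skipped under deadline pressure.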

The sprint review integration — where a privacy advisor reviews product demos when data processing changes are involved — is the mechanism that catches compliance issues when they are still inexpensive to address. A data field added to a user record, a new third-party integration, or a change to the analytics model can each trigger GDPR implications that are visible in a demo but invisible in a code review.

Privacy by Design for AI Systems

For artificial intelligence systems, data protection impact assessments and privacy by design are especially critical because the architecture decisions made at model design time determine whether the system can be GDPR-compliant in a structural sense. A model trained without data minimisation cannot be made minimalist retrospectively without complete retraining. Differential privacy, federated learning, pseudonymised training datasets, and explainable AI (XAI) design are tools that must be chosen at the outset — not added after the model is in production.
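As a concrete illustration of one of these tools, the Laplace mechanism of differential privacy adds calibrated noise to an aggregate before release. The sketch below assumes a counting query with sensitivity 1; it is a teaching example, not a hardened implementation:

```python
import random

def dp_count(true_count: int, epsilon: float) -> float:
    """Release a count under epsilon-differential privacy by adding
    Laplace noise of scale 1/epsilon (sensitivity-1 query)."""
    # The difference of two exponential variates with rate epsilon follows
    # a Laplace(0, 1/epsilon) distribution, so the stdlib suffices.
    noise = random.expovariate(epsilon) - random.expovariate(epsilon)
    return true_count + noise
```

Smaller epsilon means more noise and stronger privacy; that trade-off has to be chosen at design time, which is precisely the point made above.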

Privacy by default in the user experience is a component that product teams frequently underestimate. The product’s default privacy configuration is not just a legal requirement — it is also a signal to users of the organisation’s genuine commitment to their data. Platforms that share data with third parties by default, that activate advertising tracking without consent, or that make privacy controls difficult to find generate greater distrust and greater regulatory exposure than those that adopt the opposite model.

Track record

Real results from privacy by design implementation

When we started developing our occupational health app, we brought BMC in during the design phase. They defined the data architecture, conducted the DPIA, and reviewed every sprint with the team. We launched compliant from day one without a single post-launch architectural change. Far less expensive than waiting.

WorkHealth Technologies S.L.
CTO

Experienced team with local insight and international reach

What you get

What our privacy by design service includes

Development Cycle Integration

Defining the privacy process for agile teams: privacy review criteria in the definition of done, privacy analysis templates for new features, and workshops for product and engineering teams.

Compliant Data Architecture

Design or review of the product's data architecture to ensure the principles of minimisation, purpose limitation, storage limitation, and pseudonymisation or encryption where applicable.

Privacy by Default in UX

Review of the user experience design to ensure that default settings are the most protective and that the interface does not incorporate dark patterns that undermine consent.

Data Protection Impact Assessment

Determination of the DPIA requirement and, where triggered, completion of the assessment integrated into the design process before development begins.

Accountability Documentation

Updating the records of processing activities, drafting the product privacy notice, and documenting the technical and organisational measures implemented.

FAQ

Frequently asked questions about privacy by design

What exactly do privacy by design and by default require under Article 25 GDPR?

Article 25 GDPR imposes two complementary obligations. Privacy by design: the controller must implement appropriate technical and organisational measures to ensure that data protection principles are built in from the design stage. Privacy by default: the default settings of the product or service must ensure that only the personal data necessary for each specific purpose is processed. Both obligations apply before processing commences, not merely once it is underway.
How do you integrate privacy by design into agile methodologies?

Integration into agile methodologies operates at several levels: privacy criteria in the team's definition of done, a privacy review step in the design review process before beginning development of each feature, DPO or privacy advisor participation in product demos when data processing changes, and a standard rapid assessment process for new user stories with a personal data component.
What technical measures does privacy by design typically involve?

Typical technical measures include: pseudonymisation of personal data in development and test environments, encryption of sensitive data in transit and at rest, role-based access control with minimum privilege, audit logging of personal data access, technically secure deletion procedures when retention periods expire, anonymisation of data for individual-level analytics, and separation of identification data from functional data.
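One of these measures, automatic deletion when retention periods expire, can be sketched as a periodic sweep. The store names and policy map below are hypothetical, purely to show the shape of the mechanism:

```python
from datetime import datetime, timedelta, timezone

# Illustrative policy: each data store maps to a retention period.
RETENTION = {
    "access_logs": timedelta(days=90),
    "analytics_events": timedelta(days=365),
}

def expired_record_ids(records, store, now=None):
    """Return IDs of records whose retention period has lapsed.
    `records` is an iterable of (record_id, created_at) pairs."""
    now = now or datetime.now(timezone.utc)
    cutoff = now - RETENTION[store]
    return [rid for rid, created in records if created < cutoff]
```

Running such a sweep in every layer (primary database, backups, logs, analytics) is what turns a retention policy on paper into storage limitation in practice.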
Is privacy by design cost-effective compared with post-launch remediation?

Yes, when integrated correctly from the outset. The cost of integrating privacy at the design stage is systematically lower than remediating non-compliance on a live system. Post-launch architectural changes — separating data, adding encryption, implementing retention policies, redesigning data models — are technically complex, expensive, and frequently imperfect. The investment in a correct privacy process from the start is recovered in the first remediation avoided.
How does privacy by design apply to AI systems?

AI systems present specific privacy by design challenges: a tendency to overfit on training data in ways that can reveal individual information (memorisation), a need for large data volumes that conflicts with minimisation, and model opacity that complicates explainability. Differential privacy techniques, federated learning, pseudonymisation of training data, and explainable AI (XAI) design are the primary privacy by design tools for AI. We combine these with EU AI Act compliance where the system is high-risk.
What does privacy by default mean in practice?

Privacy by default means the initial configuration of the product or service must be the most privacy-protective, without the user having to do anything to activate it. In practice: data sharing with third parties disabled by default, the highest privacy setting in profile visibility options, analytics in anonymised mode by default where the user has not consented, and security notifications enabled by default. Users may reduce the privacy level if they actively choose to, but the starting point must be maximum protection.
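In code terms, privacy by default simply means the zero-argument configuration is the protective one. The settings object below is an illustrative sketch with hypothetical field names:

```python
from dataclasses import dataclass

@dataclass
class PrivacySettings:
    """Constructed with no arguments, this yields the most protective
    configuration; loosening any setting requires an explicit user choice."""
    third_party_sharing: bool = False    # off unless the user opts in
    ad_tracking: bool = False            # no tracking without consent
    profile_visibility: str = "private"  # most restrictive visibility
    analytics_mode: str = "anonymised"   # no individual-level analytics
    security_alerts: bool = True         # protective notifications stay on

default_settings = PrivacySettings()
```

Any screen that ships these defaults pre-loosened, or hides the controls to restore them, is the kind of dark pattern the UX review is meant to catch.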
Can privacy by design be applied to a product that is already live?

Yes. Although the ideal is to integrate privacy from the design stage, we also conduct privacy assessments of live systems to identify existing gaps and prioritise corrective measures. This product privacy audit covers the data architecture, information flows, technical security measures, and retention policies. The output is a remediation plan prioritised by risk and implementation effort.
Does the service cover UX and interface design?

Yes. UX design is a critical component: how consent is requested, the clarity of user-facing information, the ease of exercising access and erasure rights, and the absence of dark patterns in privacy settings all have direct compliance implications. We work with UX teams to ensure the user interface reinforces rather than undermines the privacy framework.
First step

Start with a free diagnostic

Our team of specialists, with deep knowledge of the Spanish and European market, will guide you from day one.


25+
years experience
5
offices in Spain
500+
clients served

Request your diagnostic

We respond within 4 business hours

Or call us directly: +34 910 917 811
