The DORA Regulation (EU 2022/2554) strengthens digital operational resilience across the EU financial sector, requiring demonstrable controls to prevent, manage, and recover from ICT disruptions. In practice, its requirements converge with data security: reducing sensitive data exposure, maintaining control when third-party ICT providers are involved, and sustaining consistent records across the ecosystem, not only in production.
What DORA Changes for Technology and Data Teams
DORA raises expectations in three dimensions that directly impact how datasets are managed outside production:
- ICT risk management and control: defined, operational, and reviewable policies, processes, and technical controls.
- Verifiable assurance: traceability and reproducible outcomes to demonstrate control execution.
- Third-party ICT risk: controls must remain in place when data is consumed by vendors and external services.
Execution typically spans three workstreams:
- Security: defines policies, segregation, least privilege, monitoring, and logging.
- Data platform / DevSecOps: implements delivery pipelines, transformations, and validations.
- Third-party management: governs vendor inventory, access, retention/expiry, and supporting records.
Scope in Spain: Banking, Payments, and Insurance
As an EU regulation, DORA applies directly in Spain. The regulation explicitly includes the insurance domain (insurers and reinsurers, among other entities), alongside a broad range of financial institutions.
For ICT incidents, national authorities publish operational guidance for in-scope entities. For example, the Banco de España has issued process documentation for reporting major incidents under DORA within its remit.
Data Control in Non-Production Environments: An Operational, Verifiable Requirement
DORA’s supporting technical standards for ICT risk management specify controls that apply to non-production environments. In particular, they state that non-production environments should store only anonymized, pseudonymized, or randomized production data, protecting the integrity and confidentiality of that data.
They also acknowledge that using untransformed production data may be permitted as a tightly scoped exception in specific situations and for limited periods, under defined governance.
Operationally, control is not measured only by environment hardening. It is measured by the dataset lifecycle: what enters, how it is transformed, who uses it, how long it exists, and what records are retained.
How to Implement Secure Data in Any Environment Without Slowing Delivery
The most scalable approach is to treat secure data provisioning as an internal service: deliver fit-for-purpose datasets with embedded safeguards, automated and auditable.
Data Classification and Treatment Rules
Start with an actionable, automatable data classification:
- Personal data and special categories (depending on business context).
- Financial and operational data (e.g., accounts, policies, claims, payments, pricing, fraud).
- Identifiers and relational keys (needed for consistency).
- Operational secrets: credentials, tokens, API keys, certificates (must not propagate).
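As an illustration, the classification above can live as code so it is automatable from day one. The sketch below assumes Python; every table and field name in it is hypothetical.

```python
# A minimal sketch of a field-level classification map. Table and
# field names are illustrative; each field is tagged with the class
# that will later select its default transformation rule.
CLASSIFICATION = {
    "customers.full_name":   "personal",
    "customers.iban":        "financial",
    "customers.customer_id": "identifier",  # relational key, must stay consistent
    "policies.policy_id":    "identifier",
    "claims.paid_amount":    "financial",
    "app_config.api_key":    "secret",      # operational secret, must not propagate
}

def classify(table: str, field: str) -> str:
    """Return the data class for a field, defaulting to 'unclassified'
    so unknown fields are surfaced rather than silently copied."""
    return CLASSIFICATION.get(f"{table}.{field}", "unclassified")
```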
For each class, define a default transformation rule:
- Data masking to preserve formats and operational usability.
- Pseudonymization when you need consistency and controlled traceability.
- Anonymization or randomization when risk and the use case require it.
The technical goal is to preserve utility while minimizing re-identification: field consistency, format preservation, and—where required—relationship preservation.
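A minimal sketch of the class-to-rule mapping follows. The key handling, the rule implementations, and the simplified account masking are assumptions for illustration; production-grade masking and key management are considerably more involved.

```python
import hashlib
import hmac
import random
import string

PSEUDO_KEY = b"example-key"  # hypothetical; in practice, sourced from a secrets manager

def pseudonymize(value: str) -> str:
    # Keyed hash: the same input always yields the same token, which
    # preserves joins and controlled traceability without exposing data.
    return hmac.new(PSEUDO_KEY, value.encode(), hashlib.sha256).hexdigest()[:16]

def mask_account(value: str) -> str:
    # Format-preserving masking: keep the leading country/check part,
    # randomize the rest so the value remains operationally usable.
    tail = "".join(random.choices(string.digits, k=max(len(value) - 4, 0)))
    return value[:4] + tail

DEFAULT_RULES = {
    "personal":   pseudonymize,
    "financial":  mask_account,       # real deployments use per-field rules
    "identifier": pseudonymize,       # consistent keys keep relationships intact
    "secret":     lambda _: None,     # secrets are dropped, never copied forward
}
```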
Governed Dataset Delivery by Environment and Purpose
Standardize a delivery workflow that produces ready-to-use datasets per destination:
- Select source and scope (tables/fields, time windows, subsets).
- Apply transformations with versioned rules.
- Run automated validations: referential integrity, constraints, distributions, and leakage checks.
- Deliver to the target environment or destination (including vendors where applicable).
- Enforce retention and rotation: automatic expiry and access revocation.
This reduces uncontrolled duplication and makes control records a by-product of the process.
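The workflow can be sketched so that the control record is produced by the run itself. The sketch below assumes the classification and rules from the previous sketches, elides extraction and transport (steps 1 and 4), and uses illustrative names throughout.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta, timezone
from typing import Any, Callable

@dataclass
class DeliveryRun:
    # Every field is captured at execution time, so the audit record
    # is a by-product of the run rather than a separate task.
    dataset_id: str
    rules_version: str
    destination: str
    ttl: timedelta
    validations: dict[str, str] = field(default_factory=dict)
    expires_at: datetime | None = None

def execute(rows: list[dict[str, Any]],
            rules: dict[str, Callable],
            class_of: Callable[[str], str],
            run: DeliveryRun) -> DeliveryRun:
    # Steps 1 and 4 (extraction and transport to the destination) are elided.
    # 2. Apply the versioned transformation rules by field class.
    transformed = [
        {k: rules.get(class_of(k), lambda v: v)(v) for k, v in row.items()}
        for row in rows
    ]
    # 3. Leakage check: no value in a governed class may survive unchanged.
    leaked = sorted({k for out, src in zip(transformed, rows)
                     for k in out if class_of(k) in rules and out[k] == src[k]})
    run.validations["leakage"] = "pass" if not leaked else f"fail: {leaked}"
    # 5. Expiry is fixed at delivery time so revocation can be automated.
    run.expires_at = datetime.now(timezone.utc) + run.ttl
    return run
```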
Exceptions with Approval, Expiry, and Traceability
When an exception is required, treat it as a controlled procedure:
- Request with rationale, scope, duration, and environment.
- Approval by data owner and Security.
- Reinforced controls: segregation, least privilege, exhaustive logging.
- Automatic expiry, with verifiable records of removal or deletion.
This supports the requirement for tightly scoped, governed exceptions as reflected in the technical standards.
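A minimal sketch of such an exception as a record, with illustrative field names, shows how the expiry becomes enforceable rather than merely procedural:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class ProductionDataException:
    # All fields are illustrative; the point is that rationale, scope,
    # approval, and expiry are captured as data, not as a document.
    rationale: str
    scope: str                    # tables/fields covered
    environment: str
    approved_by: tuple[str, str]  # (data owner, Security)
    expires_at: datetime

    def is_active(self) -> bool:
        # Past expiry the exception lapses automatically: access is
        # revoked and a verifiable deletion record must be produced.
        return datetime.now(timezone.utc) < self.expires_at
```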
Third-Party ICT: Maintaining Control When Data Leaves Your Perimeter
DORA strengthens third-party ICT risk management and the need to maintain a register of ICT agreements and dependencies. For data teams, the critical point is ensuring controls remain effective when datasets are consumed in external platforms (cloud, managed services, consultancies, or tooling).
Operationally, third-party control rests on three elements:
- Destination inventory and classification: which vendors access data, for what purpose, and under what context (environment, project, service).
- Outbound conditions: what data categories may be shared, which transformations are mandatory, and what exceptions are permitted.
- Vendor-linked traceability: the ability to reconstruct which dataset was delivered to which third party and under what retention/expiry terms, so access can be revoked and control demonstrated.
This complements contractual registers with technical proof of real usage.
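As a sketch, these three elements can converge in a single outbound gate. The registry structure and vendor names below are hypothetical:

```python
# Hypothetical destination inventory: who may receive data, for what
# purpose, under which transformation classes and retention limit.
VENDOR_REGISTRY = {
    "vendor-analytics": {
        "purpose": "model testing",
        "allowed_classes": {"anonymized", "pseudonymized"},
        "max_ttl_days": 30,
    },
}

def check_outbound(vendor: str, data_classes: set[str], ttl_days: int) -> dict:
    """Refuse any delivery that is not registered, not permitted for the
    requested data classes, or held longer than the agreed retention."""
    entry = VENDOR_REGISTRY.get(vendor)
    if entry is None:
        raise PermissionError(f"{vendor} is not in the destination inventory")
    excess = data_classes - entry["allowed_classes"]
    if excess:
        raise PermissionError(f"classes {sorted(excess)} not permitted for {vendor}")
    if ttl_days > entry["max_ttl_days"]:
        raise PermissionError(f"retention of {ttl_days} days exceeds the agreed limit")
    # Returning the entry links the delivery record to the vendor, so the
    # dataset-to-third-party trail can be reconstructed later.
    return entry
```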
Audit-Ready Evidence: What an Organization Should Be Able to Produce
Design evidence by default. A minimum package per execution (delivery run) should include:
- Dataset identifier and version (source, scope, timestamp).
- Applied rules (policy set and transformation version).
- Validation outputs (referential integrity, constraints, leakage checks).
- Destination and retention/expiry (environment, permissions, TTL).
- Approvals and justification where applicable.
- Verifiable records of expiry, access revocation, and deletion.
This reduces friction: internal audit and compliance reviews rely on consistent, traceable, reproducible artifacts.
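Reusing the DeliveryRun sketch from earlier, the minimum package can be serialized per run. The field names and the digest-based tamper evidence are illustrative assumptions, not a prescribed format:

```python
import hashlib
import json
from datetime import datetime, timezone

def evidence_manifest(run, approvals: list[str]) -> str:
    # Assemble the minimum evidence package for one delivery run; `run`
    # is any object shaped like the DeliveryRun sketch above.
    record = {
        "dataset_id": run.dataset_id,
        "rules_version": run.rules_version,
        "validations": run.validations,
        "destination": run.destination,
        "expires_at": run.expires_at.isoformat() if run.expires_at else None,
        "approvals": approvals,
        "generated_at": datetime.now(timezone.utc).isoformat(),
    }
    # A digest over the canonical form makes the record tamper-evident.
    payload = json.dumps(record, sort_keys=True)
    record["sha256"] = hashlib.sha256(payload.encode()).hexdigest()
    return json.dumps(record, indent=2)
```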
How Gigantics Fits Into a Digital Operational Resilience Program
Gigantics centralizes governed dataset delivery to internal and third-party environments, applying transformation policies and traceability to reduce sensitive data exposure without sacrificing operational utility.
- Consistent, versioned policies for data masking, pseudonymization, anonymization, or randomization by data domain.
- Integrity for use: preservation of valid formats, cross-field consistency, and relationships (including referential integrity where applicable).
- Execution records: what dataset was delivered, to which environment or vendor, which rules were applied, and its retention/expiry—supporting audit and third-party control when data leaves the perimeter.

