
Article 32 GDPR and data masking

Juan Rodriguez · 4 min read


In previous articles, we have covered the importance of using a data masking tool within a company. We have gone through the following points:

  • How to gain speed and reliability when it comes to generating data for testing.
  • How to avoid data leaks and, in turn, reputational crises.
  • How to make data more secure and different techniques to do so.
  • Several concepts on how to implement a cybersecurity strategy.

But until now, we had not talked about another key concept: legislation.

In May 2018, the now-famous General Data Protection Regulation, generally known by its English abbreviation GDPR, became applicable in all member states of the European Union.

This regulation had a major impact on many companies, since it governs the personal data that companies collect from consumers and how they may use it. It arrived at a time when that use was already under scrutiny and abuses by many companies had come to light.

This paradigm shift is usually associated with the use of personal data for commercial or marketing purposes, because that is where it affected people most visibly, but it also reaches other uses of data. And that is where data masking and Article 32 of the GDPR come in.

Article 32 deals with the security of data processing. One of the paradigm shifts introduced by the Regulation is that data belong not to companies but to users. Users entrust their data to companies, and companies are therefore obliged to safeguard it in order to guarantee users' rights. With that in mind, let us unravel the parts of this article that relate to data masking:

Paragraph 1. [...] to ensure a level of security appropriate to the risk, which shall include, inter alia, as appropriate:

  • a) the pseudonymisation and encryption of personal data.

  • b) the ability to ensure the continuing confidentiality, integrity, availability, and resilience of processing systems and services.

  • c) the ability to restore availability and access to personal data in a timely manner in the event of a physical or technical incident.

  • d) a process for regularly testing, evaluating, and assessing the effectiveness of technical and organizational measures to ensure the security of the processing.

Paragraph 2. In assessing the appropriate level of security, account shall be taken, in particular, of the risks presented by the processing, especially those resulting from the accidental or unlawful destruction, loss, alteration, unauthorized disclosure of, or access to personal data transmitted, stored, or otherwise processed.

Paragraph 3. Adherence to an approved code of conduct referred to in Article 40 or to an approved certification mechanism referred to in Article 42 may be used as evidence of compliance with the requirements set out in paragraph 1 of this Article.

Paragraph 4. The controller and the processor shall take steps to ensure that any natural person acting under the authority of the controller or the processor who has access to personal data does not process them except on instructions from the controller, unless required to do so by Union or Member State law.
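To make the measures in paragraph 1(a) concrete, here is a minimal sketch of pseudonymisation via keyed hashing: direct identifiers are replaced with stable, non-reversible tokens while non-identifying fields pass through. This is an illustrative assumption about how such a measure could be implemented, not the method of any specific product; the key, field names, and record are invented for the example.

```python
import hmac
import hashlib

# Hypothetical secret key; in practice this would live in a key vault,
# never in source control.
SECRET_KEY = b"example-key-from-a-vault"

# Fields treated as direct identifiers in this sketch (an assumption).
IDENTIFYING_FIELDS = {"name", "email"}

def pseudonymise(value: str) -> str:
    """Replace a value with a stable token derived via HMAC-SHA256.

    The same input always yields the same token (so joins across tables
    still work), but the token cannot be reversed without the key.
    """
    return hmac.new(SECRET_KEY, value.encode("utf-8"), hashlib.sha256).hexdigest()[:16]

def mask_record(record: dict) -> dict:
    """Pseudonymise identifying fields; leave the rest untouched."""
    return {
        field: pseudonymise(value) if field in IDENTIFYING_FIELDS else value
        for field, value in record.items()
    }

record = {"name": "Alice Example", "email": "alice@example.com", "country": "ES"}
masked = mask_record(record)

print(masked["country"])              # non-identifying field is unchanged
print(masked["name"] != record["name"])  # identifier replaced by a token
```

Because the tokens are deterministic, masked datasets remain usable for testing and analytics, which is precisely the use case data masking tools target.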

From these paragraphs, what would we be covering by adopting software like Gigantics?

  1. We would be anonymizing and encrypting data for internal use.

  2. No unauthorized person would have access to sensitive data.

  3. The treatment of anonymized data would be governed by a control system, with every operation recorded in reports.

  4. We can guarantee efficiency, since it is an automatic process controlled by software.

  5. We would mitigate as far as possible, or eliminate, the risks of destruction, alteration, disclosure of, or access to sensitive data by unauthorized persons.
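The control system and reporting mentioned in point 3 can be sketched as an audit trail: every masking operation is logged so the treatment of the data can be reviewed later. The log structure, field names, and hashing scheme below are illustrative assumptions, not the behavior of any particular tool; note that the log records *that* a value was masked without storing the raw value itself.

```python
import hashlib
from datetime import datetime, timezone

# In-memory audit trail; a real system would persist this to durable,
# access-controlled storage.
audit_log: list = []

def mask_field(record_id: str, field: str, value: str) -> str:
    """Mask a single value and record the operation in the audit log.

    The log entry deliberately omits the raw value, so the trail itself
    contains no sensitive data.
    """
    token = hashlib.sha256(value.encode("utf-8")).hexdigest()[:12]
    audit_log.append({
        "record_id": record_id,
        "field": field,
        "operation": "mask",
        "at": datetime.now(timezone.utc).isoformat(),
    })
    return token

masked_email = mask_field("user-001", "email", "alice@example.com")
print(len(audit_log))  # one entry per masking operation
```

Reports like those in point 3 would then simply be aggregations over this trail (operations per dataset, per field, per time window).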
