For some businesses, the GDPR might look a bit like a new EU-driven digital Armageddon. The Regulation looks so tough that many companies are concerned about whether they will be able to meet its stringent new requirements, and many technology companies are changing their products in an attempt to appease Brussels and comply with the GDPR. While the new Regulation is clearly far-reaching, don’t panic! Let’s look at the technological options (and especially the smart ones) available to companies to help them meet the GDPR.

The Regulation itself is relatively vague with regard to technologies; even its authors had trouble describing how technological solutions should be implemented. That is both good news and bad news. The bad news is that, at least in the beginning, we have to treat both structured and unstructured data as targets for our solution. One common problem, for example, is that client contracts and other personal data are frequently left on shared drives. The good news is that this lack of technological standards creates greater space for smart and practical solutions. It also gives managers an impetus to justify investing significant time and resources in resolving issues with data and role classification – a common problem for organisations.

The challenge is that sensitive data is scattered all over the infrastructure, and you need tools to find, convert, operate on and audit/control it. This article looks at how this can be handled.


To identify the data, you need smart solutions that will scan your infrastructure – but be aware that such solutions usually create indexes that will themselves be full of sensitive personal data, so the hope is that process-based protection of those indexes will be acceptable to the regulatory authorities. Solutions will also have to be smart enough to look for integrations and linkages between data sources, which will need to be reflected in any reports they produce. This is the tricky part for organisations that rely on numerous reports, custom spreadsheets crunching data on local machines and so on. Solutions that monitor operations and use deep packet inspection can then be a good way to approach this problem.
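As a minimal sketch of the discovery step, the snippet below walks a directory tree and flags files that look as though they contain personal data. The patterns and the `scan_tree` helper are purely illustrative assumptions – real discovery tools use far richer dictionaries, checksum validation and machine learning – but it shows one way to build an index that records only counts and locations, so the index itself does not become another pool of personal data:

```python
import re
from pathlib import Path

# Illustrative patterns only -- real discovery tools use far richer
# dictionaries, checksum validation (e.g. for national ID numbers) and ML.
PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.\w+"),
    "phone": re.compile(r"\+?\d[\d\s-]{7,14}\d"),
}

def scan_tree(root):
    """Walk a directory tree and report files that appear to hold
    personal data, without copying the matched values into the index."""
    findings = []
    for path in Path(root).rglob("*"):
        if not path.is_file():
            continue
        try:
            text = path.read_text(errors="ignore")
        except OSError:
            continue  # unreadable file: skip, do not crash the scan
        hits = {name: len(rx.findall(text)) for name, rx in PATTERNS.items()}
        hits = {k: v for k, v in hits.items() if v}
        if hits:
            # Store only counts and locations, not the matches themselves,
            # so the index does not become a pool of personal data.
            findings.append((str(path), hits))
    return findings
```

The same report format can later feed the conversion step: each entry tells you where personal data lives and roughly how much of it there is.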


Once you have identified all the data you have to convert, you have to choose your strategy. How will you convert frequently used dynamic/operational data, and how will you handle the generic data you store for ‘just in case’ scenarios (i.e. archives)? Legacy data can be a big issue, especially backups. Again, the GDPR does not clearly answer the ‘backup question’, and in some cases it will not even be technically possible to convert backups. It is also useful to know that you cannot keep older versions of databases purely for ‘just in case’ scenarios.
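For archival data, one common conversion strategy is keyed pseudonymisation: direct identifiers are replaced with pseudonyms so that joins and statistics over the archive still work, while the mapping cannot be reversed without the key. The helper below is a sketch of this idea using HMAC-SHA256 (the function name and key handling are my assumptions, not a prescribed GDPR mechanism); note that destroying the key effectively de-identifies the whole archive in one operation:

```python
import hmac
import hashlib

def pseudonymise(identifier: str, secret_key: bytes) -> str:
    """Replace a direct identifier with a keyed pseudonym (HMAC-SHA256).

    The same identifier always maps to the same pseudonym under the same
    key, so referential integrity across archived tables is preserved.
    Without the key, the mapping cannot be reversed or re-created.
    """
    return hmac.new(secret_key, identifier.encode(), hashlib.sha256).hexdigest()
```

A keyed construction matters here: a plain unsalted hash of an email address could be reversed by simply hashing candidate addresses, whereas an attacker without the HMAC key cannot run that dictionary attack.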


You have to look for smart solutions that can combine data integration with encryption/decryption of database fields fast enough not to interfere with your operations. Unfortunately, this will force process and role changes: you will probably not be able to keep your good old root account, and any solution will create massive indexes of sensitive data. This is a problem because of the ‘right to be forgotten’. You will not only have to control access to personal data and provide ‘proper data protection’; you will also have to provide a service that deletes sensitive personal data from your records on request, and you have to be able to provide audit information that demonstrates you have done so! Hence, yes, there is a little chicken-and-egg problem in those requirements, but safely stored indexes and audit logs should spare you any trouble.
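One pattern that reconciles the ‘right to be forgotten’ with backups and indexes is crypto-shredding: encrypt each data subject’s fields with a per-subject key, and honour an erasure request by destroying that key, which renders every copy of the ciphertext (backups included) unreadable. The class below is a toy sketch of the idea – the names are mine, and the keystream cipher is for illustration only; a production system would use an authenticated cipher such as AES-GCM from a vetted library:

```python
import hashlib
import secrets

class PersonalDataVault:
    """Toy crypto-shredding store: each subject gets their own key, and
    'forgetting' a subject means destroying that key.

    The SHA-256 keystream cipher below is illustrative only; real systems
    should use an authenticated cipher such as AES-GCM.
    """

    def __init__(self):
        self._keys = {}   # subject_id -> per-subject key
        self._audit = []  # erasure log: no personal data, only key hashes

    def _keystream(self, key: bytes, nonce: bytes, length: int) -> bytes:
        out = b""
        counter = 0
        while len(out) < length:
            out += hashlib.sha256(key + nonce + counter.to_bytes(4, "big")).digest()
            counter += 1
        return out[:length]

    def encrypt(self, subject_id: str, plaintext: bytes) -> bytes:
        key = self._keys.setdefault(subject_id, secrets.token_bytes(32))
        nonce = secrets.token_bytes(16)
        stream = self._keystream(key, nonce, len(plaintext))
        return nonce + bytes(a ^ b for a, b in zip(plaintext, stream))

    def decrypt(self, subject_id: str, blob: bytes) -> bytes:
        key = self._keys[subject_id]  # raises KeyError once forgotten
        nonce, body = blob[:16], blob[16:]
        stream = self._keystream(key, nonce, len(body))
        return bytes(a ^ b for a, b in zip(body, stream))

    def forget(self, subject_id: str) -> None:
        """Honour an erasure request by shredding the subject's key and
        recording the event without storing any personal data."""
        del self._keys[subject_id]
        digest = hashlib.sha256(subject_id.encode()).hexdigest()[:12]
        self._audit.append(f"erased:{digest}")
```

Note how the audit entry stores only a truncated hash of the subject identifier: the log can demonstrate that an erasure happened without itself becoming a record of who the subject was, which addresses the chicken-and-egg problem mentioned above.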

Vendors will provide some tools for their own systems, but not many will help you with integration. The hope is that software vendors will soon come up with solutions, and the open source community is also beginning to rise to the challenge.


Integration is also tricky at the level of control and audit, as companies have to be able to control employees’ handling of data and provide proper audit trails for future checks. The problem is that audit trails tend to contain a lot of sensitive personal data, so again solutions have to either encrypt them or split them and store them with limited access. Deep packet inspection tools, integration platforms and process monitoring can help with monitoring daily activities. Smart solutions are also a must for handling unstructured data. When analysing data to identify sensitive information, it is better to apply big data analytics principles and work with probability rather than attempt 100% precision.
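The point about working with probability rather than precision can be sketched as follows: instead of requiring an exact match, combine several weak signals into a single score and triage documents above a threshold. The signals and weights below are hypothetical placeholders (in practice they would be tuned on labelled data), but the combining rule is a standard one, treating each signal as an independent piece of evidence:

```python
import re

# Hypothetical signals and weights -- in practice these would be tuned
# on labelled data and include many more detectors.
SIGNALS = [
    (re.compile(r"[\w.+-]+@[\w-]+\.\w+"), 0.6),                 # email address
    (re.compile(r"\b\d{2}[./-]\d{2}[./-]\d{4}\b"), 0.3),        # date (birth date?)
    (re.compile(r"\b(?:Mr|Mrs|Ms|Dr)\.?\s+[A-Z][a-z]+"), 0.4),  # honorific + name
]

def pii_score(text: str) -> float:
    """Combine weak signals into a probability-like score that the text
    contains personal data: 1 - product of (1 - weight) over signals seen."""
    miss = 1.0
    for rx, weight in SIGNALS:
        if rx.search(text):
            miss *= 1.0 - weight
    return 1.0 - miss
```

A scanner built this way can rank a shared drive by risk and send only the top of the list for human review, which scales far better than trying to classify every document with certainty.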

In a nutshell, the GDPR will not just be about process changes and consulting. Technical implementations will require creativity, including plenty of smart solutions to sort out unstructured data and common operational practices that are not compliant with the Regulation. It also presents a huge opportunity for organisations to make the right architectural changes to their data and infrastructure. Implementing integration platforms can support business development and transformation towards more secure and flexible IT.
