Regulation often drives evolution in organisations, but never before has regulation been so close to shaping every organisation’s approach to data. The General Data Protection Regulation (GDPR), the EU’s answer to the rapid evolution of data collection and processing in our modern world, aims to govern the collection and use of personal data through a comprehensive approach. All public and private organisations must be ready by 25 May 2018, when the regulation takes effect.

A data protection regulation? What exactly does it protect?

The GDPR intends to protect privacy by applying rules to the collection, storage, processing and use of personal data, substantially altering the terms of previous European and national regulations on the topic. Personal data is defined as any information permitting the direct or indirect identification of an individual. This encompasses the obvious names and identification numbers, but also covers location data and specific characteristics of persons such as physical traits, health information or socio-economic data. Anything that links to a person is considered personal data, whatever the context and whatever the reason the information was collected in the first place.

This means that information on the private or professional context of a person is still personal data. As a consequence, not only is client or prospective-client data concerned, but also data on human resources, providers and partners, whether the relationship is commercial or non-profit.

Processing of personal data

As is now commonplace, the regulation encourages organisations to enforce privacy management through a risk-based approach, meaning that the level of control and protection must be commensurate with the sensitivity of data.

The key objective of the data controller is to ensure that data subjects’ rights are protected on an ongoing basis, both through day-to-day safeguards and through specific actions such as participating in projects that alter the data environment, in order to assess risks and set up safeguards. Even though the GDPR aims at protecting privacy, the regulation remains mindful of the importance of data analysis in today’s connected world, allowing some leeway through pseudonymisation, encryption, management of client consent and privacy-by-design concepts. The data controller is responsible for safeguarding personal data as well as for demonstrating compliance upon request from a data subject or a regulator.
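Pseudonymisation is worth a concrete illustration. Below is a minimal sketch, assuming a secret key held only by the data controller and a hypothetical `pseudonymise` helper; it is not a full implementation, merely a way to picture how analysis can proceed without exposing a direct identifier.

```python
import hmac
import hashlib

# Assumption: a secret key managed by the data controller (e.g. in a key vault).
# Only key holders can re-link a token to its original identifier by recomputing it.
SECRET_KEY = b"managed-by-the-data-controller"

def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier with a stable, keyed token."""
    return hmac.new(SECRET_KEY, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

# Analysts work on the token, never on the raw identifier.
record = {"customer": pseudonymise("jane.doe@example.com"), "basket_value": 142.50}
print(record)
```

Note that under the GDPR pseudonymised data remains personal data: the technique reduces risk, it does not remove the data from the regulation’s scope.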

Where relevant, i.e. in public institutions or certain organisations handling sensitive personal data, a data protection officer must be appointed. The data protection officer has an overarching responsibility for privacy protection and should be considered, for all intents and purposes, the GDPR compliance officer. The role encompasses the ongoing training of and advice to the functions concerned across the organisation, as well as a front-line role towards the relevant regulatory bodies.

Roadmap to compliance

Achieving regulatory compliance usually starts with a gap analysis, which helps identify the key items to address and paves the way for remediation. In the case of the GDPR, the data-centric approach to privacy protection means that any future data source or repository underlying existing or new infrastructure will in turn be subject to the same regulatory requirements.

Building a future-proof infrastructure today seems like a vain promise, and it probably is, given the increasing pace of regulatory review, the speed of technological evolution and the relative difficulty organisations have in marshalling their forces into projects. dFakto offers a solution that is easily updated and easily connected to the existing infrastructure, with the sole purpose of cataloguing and analysing all data in the light of the regulation, shifting the challenge from transforming the entire organisation to simply maintaining a single, dedicated application.

Platform-based analytics

Maintaining compliance standards in an environment where both the infrastructure and the compliance rules may evolve rapidly is probably the greatest GDPR challenge for any private or public organisation.

Maintaining compliance in a traditional project-steering environment implies embedding privacy-centric functions in each and every project, substantially slowing project roll-out through additional governance, compliance gap analyses and potential re-engineering. Even though these constraints are common to project management across most sectors, they do not constitute a sound base on which to grow a future-proof, privacy-minded business and technological environment.

Experience tells us that both the infrastructure and the regulation will evolve over time, probably even faster than we would expect. With that in mind, it seems reasonable to assume that the most relevant action today is to implement a platform whose role is limited to connecting all data sources across the organisation to detect personal data, store an audit trail and issue actionable reporting to data controllers.

Today’s technology and process expertise can help us move beyond the limitations of the past. Today we can build a living catalogue of data able to:

– connect easily to all existing and future data sources,
– apply rules categorising data depending on their compliance risk level (see the sketch after this list),
– store encrypted or pseudonymised data,
– report to data controllers their list of required actions based on compliance rules,
– and structure the follow-up to ensure that actions are executed in due time.
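As a sketch of the rule-based categorisation in the second bullet, here is what such rules could look like, assuming hypothetical column-name patterns and risk levels; a real catalogue would need far richer rules and source connectors.

```python
import re

# Hypothetical rules mapping column-name patterns to a compliance risk level.
# Keeping the rules as data means a periodic review can update them without
# touching any project code.
RULES = [
    (re.compile(r"ssn|national_id|passport|health", re.I), "high"),
    (re.compile(r"name|email|phone|address", re.I), "medium"),
    (re.compile(r"city|postcode|birth_year", re.I), "medium"),  # indirect identifiers
]

def categorise(column_name: str) -> str:
    """Classify a column as high/medium/low compliance risk by its name."""
    for pattern, level in RULES:
        if pattern.search(column_name):
            return level
    return "low"

# Applied to whatever schema a newly connected source exposes:
for column in ["customer_email", "basket_value", "passport_no", "delivery_city"]:
    print(column, "->", categorise(column))
```

Name-based matching is of course only a first pass; content profiling and manual review would complete the picture.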

In this way, we can move from a state of ongoing monitoring, regular reporting and one-off analyses to a single, dedicated, continuous process, limiting the operational and project impact on the whole organisation while enforcing a privacy-centric approach to data management.

Conclusion: the case for platform-based personal data management

Connecting any tool to a new source will always represent some IT work to integrate the data. But that work is limited in scope and cost and, above all, does not hinder the progress of new implementation projects or the maintenance of the existing infrastructure.

Likewise, updating the detection and compliance rules that define the actions to be taken always represents analysis work, sometimes even requiring dedicated regulation specialists. But that work too is limited in scope and time, and can be performed in a yearly review process, meaning that it does not need to be restarted for each new project in the organisation.

Shifting the burden of compliance implementation and maintenance from a large project team to a small taskforce of dedicated individuals makes sense not only from an organisational point of view but also from a customer-centric one, since the cost of regulatory compliance usually ends up being borne by the client.

All these elements build the case for a platform-based answer to the issue of regulatory requirements on data management – a solution rooted in data management to answer the challenges of privacy protection.

Want to learn more about GDPR?

dFakto is organising a workshop to review and discuss the main issues that need to be tackled when it comes to the GDPR. For more information, please contact Joana Schmitz at jsc@dfakto.com or +32(0)2.290.63.90.

More about the author: LinkedIn – Dorian de Klerk


As you will surely know, the overall objective of the BCBS239 regulation is to strengthen banks’ risk data aggregation capabilities and risk reporting practices as they relate to credit risk, operational risk, market risk and liquidity risk. The regulation is mandatory for the ‘global systemically important banks’ – known as G-SIBs – such as BNP Paribas, ING and Belfius (in Belgium). A full listing of these banks is available here. It is, however, also strongly recommended that national regulators apply the same principles to domestic systemically important banks (D-SIBs) three years after their identification as such.

The new regulation insists that G-SIBs should be able to produce their aggregated risk numbers ‘within a short time, like in a crisis mode’. The overall aim is that a better understanding of their risks will help improve decision-making processes at banks. Remember 2008? Well, the financial authorities still remember it well too… BCBS239 is a step to ensure that history does not repeat itself.

What is the BCBS239 regulation?

A reaction to the 2008 crisis, the BCBS239 regulation outlines 14 principles under four main chapter headings: I. Overarching governance and infrastructure; II. Risk data aggregation capabilities; III. Risk reporting practices; and IV. Supervisory review, tools and cooperation. Even though it is a principle-based regulation with few clear, defined metrics that can be used to monitor compliance, it is designed to improve the reporting and supervision of risk within large banks.

Do you have time to wait for someone to give you one single version of the truth?

In the long term, the solution seems simple: streamlined data aggregation as an input for all BCBS239 requirements. However, this is easier said than done, as many banks have legacy IT systems that are not adapted to the stricter regulatory reporting demands. Moreover, the data needed to feed the reports is not available from one single source but is historically spread throughout the organisation.
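To make the aggregation point concrete, here is a minimal sketch in Python with pandas, assuming three hypothetical extracts from legacy systems; the entity codes and figures are invented for illustration.

```python
import pandas as pd

# Hypothetical extracts, one per risk type, as they might come out of
# separate legacy systems.
credit = pd.DataFrame({"entity": ["BE01", "NL02"], "risk_type": "credit",
                       "exposure_mio": [120.0, 85.5]})
market = pd.DataFrame({"entity": ["BE01", "NL02"], "risk_type": "market",
                       "exposure_mio": [40.2, 12.9]})
liquidity = pd.DataFrame({"entity": ["BE01"], "risk_type": "liquidity",
                          "exposure_mio": [300.0]})

# One streamlined aggregation feeding every downstream report, instead of
# each report re-collecting data from the source systems.
aggregated = pd.concat([credit, market, liquidity], ignore_index=True)
summary = aggregated.groupby(["entity", "risk_type"], as_index=False)["exposure_mio"].sum()
print(summary)
```

The hard part in a real bank is of course not the `groupby` but getting each source to deliver consistent, reconciled inputs in the first place.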

Making faster and more accurate reporting a reality is one reason large banks have appointed a Chief Data Officer (CDO). The CDO needs to make sure that the risk numbers are regularly available, accurate and verified. Legacy IT systems, with data spread over a multitude of sources, and additional pressure on the operational and commercial use of data only add to the difficulties facing the CDO.

We need those numbers… now

Let’s imagine that you have all the data available, and that it has been aggregated and verified. In itself, that already seems like a great achievement. But now imagine that you need to produce the necessary dashboards and insights on the spot. The dream seems a bit further away now, doesn’t it? The reality is that many G-SIBs cannot currently deliver accurate numbers within a limited time frame, and it would be even harder in a difficult ‘crisis’ period. Such a capability will inevitably take time to develop and even longer to set up and put in place. Can the G-SIBs afford to wait for IT to deliver concrete results on risk aggregation?

If you think the answer is yes… then that’s fine, you don’t need to read any further. But if you think the answer is no… then dFakto has a cost-effective solution that can be worked out and delivered within a short time frame.

dFakto are experts in managing data and aggregating results. They have a long track record of sourcing confidential data from within IT systems, storing it, verifying it, enhancing it as necessary and then presenting it in a way that everyone can easily understand. Their data factory and data vaulting techniques allow for real-time dashboards, meaning that if you have a risk aggregation question tomorrow, you can also answer it tomorrow.
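To picture the data-vaulting idea, here is a minimal insert-only sketch using SQLite; the table layout is a simplified assumption for illustration, not dFakto’s actual model.

```python
import sqlite3
from datetime import datetime, timezone

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE sat_risk_exposure (
        entity_key    TEXT NOT NULL,
        risk_type     TEXT NOT NULL,
        exposure_mio  REAL NOT NULL,
        record_source TEXT NOT NULL,
        load_ts       TEXT NOT NULL
    )
""")

def load(entity, risk_type, exposure, source):
    # Insert-only: earlier values are never overwritten, so the full
    # history doubles as an audit trail.
    conn.execute(
        "INSERT INTO sat_risk_exposure VALUES (?, ?, ?, ?, ?)",
        (entity, risk_type, exposure, source, datetime.now(timezone.utc).isoformat()),
    )

load("BE01", "credit", 120.0, "legacy_loans_db")
load("BE01", "credit", 121.7, "legacy_loans_db")  # a later, corrected figure

# A dashboard reads the latest value per key; an auditor can replay the history.
latest = conn.execute("""
    SELECT entity_key, risk_type, exposure_mio FROM sat_risk_exposure s1
    WHERE load_ts = (SELECT MAX(load_ts) FROM sat_risk_exposure s2
                     WHERE s2.entity_key = s1.entity_key
                       AND s2.risk_type = s1.risk_type)
""").fetchall()
print(latest)
```

Because history is never destroyed, the same store answers both the dashboard question (what is the number now?) and the regulator’s question (what was the number then, and where did it come from?).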

If you are looking for a real answer to this problem within a short time frame, don’t just put more consultants on the job: hire people who can deliver an effective solution combining business intelligence and technical know-how.

Ask dFakto to explain how their expertise and experience could be put to good use and help you solve a pressing regulatory issue within just a few months.