Wouldn’t you like to have all of your processes working synchronously off of a conformed set of Master Data?

With a bit of ambition, you may even envision parallel processes utilizing this master data to harmonize the customer and business experience into something more efficient. If you’ve tried, I bet you’ve found that it’s like herding kittens. The moment something changes, synchronisation becomes an issue because so many departments have very particular perspectives on Master Data.

If you could easily achieve a Master Data ‘golden record’, the customer experience would evolve seamlessly. In terms of record capture and utilization, there would be a real possibility of running parallel business processes. It would reduce customer-perceived response times, whether for fulfillment or for single sign-on!

But what if one of your processes needs a modification due to internal or external influences?

Maybe you will need to capture additional information due to company structural transformation, or limit information capture due to regulation. Perhaps you have acquired a new business asset which is producing or consuming additional data that requires that you rethink your data model.

Your business-process-driven data capture and MDM (Master Data Management) is now in a state of change. This puts the overall data system at risk of no longer reflecting reality.

At the very least it must be scrutinized for impacts of change.

Build change into the heart of Master Data

MDM is clearly achievable if you can harness change, but change is a big challenge. You first have to hit upon an approach that incorporates change into the heart of data management, and that keeps up with your process changes. If you don’t, then there is an implicit resistance and latency between reality and your concept of Master Data.

If you are using old “Star-Schema” or “Inmon” approaches, the changes you need will take some time. Probably a lot longer than you expect, if the experience of most enterprises is anything to go by.

These approaches are broadly accepted as ‘tried and tested’ largely because they have been in the industry for several decades. However, we all know that longevity is not the same as suitability. The level of change we are experiencing today was unthinkable when these older methodologies were conceived. Plus, we’ve even moved on from their updates (read this for more).

If your goal is to allow your business processes to change without affecting your MDM integrity, then you need a methodology which locks in data so that it remains trustworthy, while flexing to accommodate changes and updates.

If you can achieve both this stability and flexibility to feed your MDM, then you are on the path to a sustainable system of record upon which to build your current and future MDM and synchronised processes.

Virtualized perspectives of data history

Data Vault is the methodology increasingly being adopted by perceptive companies who realise that change is inevitable and who embrace it. It is a discipline that is a hybrid of the old “Star-Schema” and “Inmon” techniques, though in fact it is more closely related to modern “Graph” structures, which capture both structure and relationship to reduce the impact of change seen in older approaches. By separating out the concerns of structure and relationship, change becomes a fluid part of the model.

Consequently, changes DO NOT impact current data, introducing finely tuned control.

Or find out more from our friendly team of business and technical experts at info@dfakto.com or +32(0)

So you actually bought into that new technology stack that was really going to improve your analytics and help you make better decisions? The truth is, while it may well help, resolving the conflict between flexible enterprise change and accurate historical reporting will always be a matter of discipline rather than technology when supporting a successful modern business. Sadly, no technology is the holy grail. After all, none of them has yet managed to fully put you at ease, has it?

Just as you cannot expect a project to run itself even though you have the latest project management software, the same goes for data warehouse management: it is the methodology that counts. And just like modern project management methodologies, Data Vault is agile, compartmentalizing change to encourage flexibility.

Pitfalls and inadequacies of steady state

In the best of cases it probably takes months to organise changes within your data warehouse. And the crazy thing is that, even if you are making only simple changes, it may be incomprehensible why ‘just adding a few fields’ (or updating one, heaven forbid!) can take such a long time.

The main problem is the methodology of storing, not the technology. The fact is that old Star Schemas, 3NF (Third Normal Form) systems and Snowflakes just aren’t cut out for change, as they were (and still are) designed for analysis of consistent data rather than data capture. So while Star Schemas and Snowflakes are especially good at some analytical tasks, and 3NF is great for enforcing cascading point-in-time structure, none of these methods copes well with change. And none is made to accommodate (and reconcile) the data structure from 5 years ago with the data you have today and the requirements of tomorrow … they are simply too prescriptive, designed for a single, current way of doing things. As a consequence, if it changes, you essentially have to (carefully) throw out the old and start again, hence all that (expensive) time.

The best answer is to accept change, and embrace it.

Enter Data Vaulting. (Applause). It is able to capture data from anywhere, and extract virtualised views as moving snapshots when required.

‘But our data is always changing.’

It always seems to be about ‘new’ and ‘the next big thing’, but glancing backward at the ‘history’ or considering ‘change’ is often an afterthought, and best left to ‘others’ to reconcile. How are we to understand how well the business is doing over time without some consistency amidst the technology transformation? Of course it is interesting and important to adapt so that you can be interoperable to the best standards, but how can we understand (or allow!) real growth and change while still accurately tracking history?

The older methods like Kimball and Inmon (Star Schema, 3NF and Snowflake, anyone?) were created a long time ago, in times when change wasn’t so rapid and we didn’t have such volumes of data. Back then, you were designing a data model for specific single solutions and could afford the time to throw out the old data model and start again. Their continued appeal is that lots of people use them – but they are being used for the wrong purpose! People are effectively trying to use buckets to grab rivers of information, attempting to read snapshots of a data stream as if it were a fixed data model rather than a continuum of change. They try to put the snapshots together into some sort of continuous historical view of business information – and it all begins to look a bit odd! Data Vault changes the game from old-school disciplines and bends to the flow of the river, making the technology irrelevant.

Distinct purpose: capture raw history, flexibly

Emerging from these older disciplines, Data Vault is a hybrid evolution, designed from the ground up for change management and for organizing diverse sources. So whereas older methods attempted to shoehorn the data streams into their existing models to make data analytically usable, Data Vault is all about embracing change across the entire enterprise. The old methodologies are still very useful – they are good at transforming data into insights – but they are not good as raw systems of record. Data Vault’s sole purpose is to structure heavy workloads of changing, disparate data sources, handle them in an agile way, and promote onward data quality and processing. Its purpose is primarily as its name suggests: to take raw ‘Data’ (rather than information) and put it in a ‘Vault’ (captured and safe) that stores everything in its purest original state – an immovable, yet flexibly structured, system of record.

By borrowing modern social data concepts like ‘relationships’ and ‘entities’ and combining them with the older methodologies, the structure of your business concepts is separated out from the sources and forms the skeleton on which the data is linked. It is possible, and recommended, for a knowledgeable business user to design the intuitive core data concepts behind the Data Vault on a whiteboard before involving the data professionals to fill in the details. Data Vault is a ‘business first’ methodology, organising data around the business ideas rather than conforming the business ideas to the data model.

Business may change superficially, but the concepts that underpin it, aspects such as customer, product, etc., do not. It is these that form the backbone of the Data Vault.

You want to learn more about Data Vault?

Find out more from our friendly team of business and technical experts at: info@dfakto.com or +32(0)

dFakto asked Guido Dauchy, former Head of Transformation at BNP Paribas Fortis, to share his experiences of working with dFakto on the transformation plans at BNP Paribas Fortis. BNP Paribas Fortis is a long-term client of dFakto and we have been fortunate to work with them even before the transformation plans of 2012 were announced. In short, dFakto has been instrumental in keeping senior executives abreast of what was really going on with the programme, and this on an almost daily basis. For as Guido explains, every company needs as much help as they can get.

Starting in June 2012, the “Simple & Efficient” initiative launched with two objectives:

Simplify the Group, particularly after several successful integrations and several years of growth
Improve efficiency, in light of an accelerated regulatory pressure and an uncertain economic environment
The reality is most transformation projects are not successful; indeed the probability of delivering as expected is low (~30%). When you look at the numbers in detail, about 70% of failures are due to poor execution rather than poor strategy.

In Guido’s opinion, two aspects contribute to this: the first is a lack of commitment to deliver from management, and the second is the fact that plans are outlined at a level that is too high and too strategic, and therefore not actionable. He recommends ensuring that governance meetings are fed with “valuable, fresh, frequent & qualitative insights” from the field, and that actuals and forecasts are monitored frequently. This ensures an automatic measurement of strategic gaps ‘to date’ and ‘at completion’, and, as a consequence, an ability to follow the evolution of these gaps and converge them.

In 2012, BNP Paribas launched the Simple & Efficient transformation project: the aim was to generate some 2 billion € of savings by 2016. Now, five years later, the programme has not only reached its initial objective but exceeded it, creating savings of 3.3 billion €, 65% above the initial target. How did this happen?

with Guido Dauchy, former Head of Transformation at BNP Paribas Fortis, 20th April 2017

Four key success factors

Four key success factors were identified: an evolving governance, a centralised budget, a simple transversal methodology and a dedicated resources team.

It is important to have governance that can evolve. At BNP Paribas, the governance changed according to the phase and stage of the programme. Initially, the executive committee oversaw the planning of the work, while the heads of the different business units worked on the plan. The former gave support to the latter, who then had the responsibility of monitoring what actually happened in the field. This ensured that top management had the right information in the right format as soon as possible, and as directly as possible.

The fact is, however, it’s not always easy in big organisations to have the right information at the right time. This is why there is the need for an incentive. This is where the centralised budget came to play an essential role because by funding specific projects, you trigger the feedback you need for those projects. We speak here of a certain kind of mutual agreement between the steering committee and the heads of business units.

It’s important to remember, though, that top management should not be made to play the role of the police; their role is to make sure that whatever projects have been promised are also executed. Furthermore, it was important that reporting be as simple as possible: with only the details that are relevant, including whether or not the business unit was on track to reach the objectives of the overall programme.

A third condition to ensure success in any transformation is using the right methodology. The reality is that in big organisations, you don’t have the time to convince people to be part of something. You therefore have to create a dynamic to make sure that the organisation starts to work within the thinking of the programme from the very beginning. In the case of BNP Paribas, this methodology was based on 3 phases. Initially, of course, the overall goal had to be defined. Once this was agreed, the 1st phase was a top-down overview plan that included all the potential savings for each business, function and territory. The goal of this phase was to decide what kind of initiatives could be taken to reach the overall objectives; in other words, to pinpoint the savings that could be delivered and at what cost. Ideally this phase should last only about 6 to 8 weeks. The recommendation is not to take too much time. Make it easy for those doing the job, and because transformation is not their normal job, give them the support they need. The 2nd phase consisted of implementing the ideas from phase 1. More people are involved in this phase; indeed, it’s where you bring in people who are capable of turning the initiatives into projects.

At BNP Paribas, it took 12 weeks to translate everything into project programmes. The reality is that you have to know what you’re doing, how you’re doing it and when you’re doing it, otherwise the initiatives simply won’t move forward. Also, you shouldn’t allow initiatives outside the scope of the plan. Finally, the last phase is the implementation phase. This phase is longer than phases 1 & 2. It’s in phase 3 that you’ll make the changes in the governance.

The last key success factor we identified in the BNP Paribas case is the necessity of having a dedicated team. The beating heart of the programme, they are the people who bring everything together: managing the programme, giving information to the governance, and presenting possible solutions. They were also the people that made sure that decisions were acted upon on the ground.

These 4 factors helped BNP Paribas exceed the initial objective of 2 billion € of savings and reach 3.3 billion €. Projects that exceed their initial objectives represent only a small percentage compared to those that merely reach their objectives (and an even smaller one compared to those that fail). So how did the programme exceed its objectives?

3 ways to exceed objectives

The first is the extension capability. It refers to the dynamics created once phases 1 & 2 of the methodology have started. During these phases, you see lots of questions being raised and many people starting to communicate with each other; indeed, all begin to be more and more convinced of the real purpose of the programme. The result is that they begin to think in another way. Their mindset changes to be in line with the dynamics of the programme. At that moment it’s important to start a new wave of phases 1 & 2 to generate (more) new ideas and solutions. As a result, people will come up with better and more valuable ideas.

The second is the compensation capability. If a project fails to deliver for whatever reason, the loss needs to be compensated by a new project or by extra effort in another project in order to respect the initial commitment. This capability implies that all parties are fully transparent on the progress of their projects, identifying any potential problems at the earliest possible moment so that they can respond quickly and limit the risk of “non-achievement” of the objectives. With transparency and non-judgemental communication, it’s possible to admit that a project will deliver less than expected. And experience showed us that people always find compensation elsewhere, because they don’t want to show failure before the executive committee!

A transformation doesn’t happen in a vacuum. It occurs in a world that is changing and disruptive. This is why a dedicated team is essential. In this way you can be certain of continuous transformation within the company, as if you don’t adapt, you’ll disappear.

At dFakto, the way we see transformation involves several things. First, we encourage pro-active management. We believe that if you can identify issues before they have an impact, it’s easier to resolve them (hence the real value of supplying valuable, fresh, frequent & qualitative insights). When people realise they won’t succeed on their part of the programme, they will report it, and by doing so avoid negative figures in the results. Furthermore, if you can identify what’s wrong as early as possible, there is a strong incentive to find ways to solve the problems, and deliver.

In short, as Guido says,

“We have a more organic approach to transformation. We see it as a living organism. It’s something reactive: you can do something, or you can do something different. We used to plan, execute, and then give feedback. Today, we don’t have this ‘only’ linear approach to transformation. We see it, thanks to dFakto data, as something much more cyclical, and that includes a higher reporting frequency, and consequently a high-planning frequency.”


Data Vaults are agile, but what does that mean in simple terms?

In simple terms, it means you don’t have to “eat the whole whale” in one sitting. Indeed, small bites are more effective for achieving data visibility. You can even share the meal among a collection of people all at once, and the pace can increase without errors. It’s because the Data Vault is highly structured that you can split it up into pieces and (without any difficulty) join them back together again. It means you can do it all at once, in small increments, or something in between, at your own pace and according to your own resource constraints.

Just like Lego (snap it together, build out)

Your business decides the core structure of business concepts, and the data snaps onto these concepts rather than being forced into more traditional styles of data modelling. You can start right now with as little or as much data as you have, and add to it as you figure out your future requirements instead of designing a full data model first. And because Data Vault is atomically structured, you can use the tiniest piece in a dashboard from the first moment that smallest bit of data is available. If your data grows, it won’t affect what you have already measured.

You might want to do a trial on a small scale to evaluate the benefits before expanding to the entire enterprise? No problem. Data Vault won’t waste your time: whatever you build now will not need to change, allowing you to grow and expand your enterprise data warehouse once, and only once. You may choose to scale quickly by setting up parallel teams, each building a separate agile data store. With Data Vault, these can all be easily synchronised through the separation of the core structure from the data, and can be merged together afterwards. If teams agree on a simple core structure of business concepts and relationships, they can each develop on top of the shared construct. It isn’t a data model, it’s an agreed way of connecting business entities: something you can achieve on a whiteboard!

  • Start anywhere, evolve elsewhere, bring it all together anytime: Integrate business domains driven by real business priorities. Relations between different business domains can be established at any later point in time (doesn’t need to be thought of upfront).
  • Start small, grow any size: It doesn’t matter whether you’re building a small data warehouse for some Master Data or a full enterprise data warehouse. The advantage of Data Vault projects is that they bring results very early after you start the project, and because they are based on business concepts more than data, they are highly flexible as they grow.
  • Automate from the start for fast iterations: The key to Data Vault is deconstructed activities which are rapidly repeatable. Data Vaults and automation go hand in hand, indeed the simplicity and consistency of the approach encourages automation. Standardise your business concepts (on a whiteboard), generate the structure, automate the integration routines and then start growing your agile data warehouse.

Deconstructed data models improve automation

Once the key components of the business and their relationships are understood (keys or IDs, like the customer reference number on an invoice), you simply hang data off of them as you find it. It means that a disjointed approach to data gathering, if that’s all that is possible, doesn’t undermine the final Data Vault, because that’s how the method is meant to work! Fast, slow, big or small: it doesn’t matter. Work on different areas independently and then bring it all together, or start by attaching all of your existing data marts as sources and keep going from there, but with agility. It’s a very intuitive and flexible process for the business, who can then follow the data modelling with no requirement other than understanding how the business processes work and being able to operate a whiteboard.

Breaking down and segregating the work in this way makes it a very repeatable process which scales well with automation. In fact, automation is highly recommended, as the core structure of the Data Vault is simply meant to extract and load data ‘Satellites’ onto the lattice-work of ‘Hubs’ and ‘Links’. By segregating the methodology into a network of data tables, the system almost requires automation to fulfil its ultimate goal of near-real-time business data for analytics. You’ll be pleased to hear that dFakto has already developed the key automation features you need to build your first Data Vault in just a few weeks!
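To make the ‘Hubs’, ‘Links’ and ‘Satellites’ just mentioned a little more concrete, here is a minimal, hypothetical sketch using SQLite from Python. All table and column names are illustrative inventions, not a dFakto schema: a hub holds only the business key, a link records a relationship between hubs, and satellites hang descriptive, history-tracked data off the hub.

```python
import sqlite3

# Illustrative Data Vault structures (hypothetical names throughout).
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE hub_customer (          -- business key only
    customer_hk TEXT PRIMARY KEY,    -- surrogate/hash key
    customer_ref TEXT NOT NULL,      -- the business key itself
    load_date TEXT, record_source TEXT);

CREATE TABLE hub_invoice (
    invoice_hk TEXT PRIMARY KEY,
    invoice_no TEXT NOT NULL,
    load_date TEXT, record_source TEXT);

CREATE TABLE link_customer_invoice ( -- relationship between hubs
    link_hk TEXT PRIMARY KEY,
    customer_hk TEXT REFERENCES hub_customer(customer_hk),
    invoice_hk TEXT REFERENCES hub_invoice(invoice_hk),
    load_date TEXT, record_source TEXT);

CREATE TABLE sat_customer_details (  -- descriptive, history-tracked data
    customer_hk TEXT REFERENCES hub_customer(customer_hk),
    load_date TEXT,
    name TEXT, city TEXT,
    PRIMARY KEY (customer_hk, load_date));
""")

# Loading never updates in place: each change is a new satellite row.
con.execute("INSERT INTO hub_customer VALUES ('h1', 'CUST-001', '2024-01-01', 'crm')")
con.execute("INSERT INTO sat_customer_details VALUES ('h1', '2024-01-01', 'Acme', 'Brussels')")
con.execute("INSERT INTO sat_customer_details VALUES ('h1', '2024-06-01', 'Acme', 'Antwerp')")

# The full history is preserved; the 'current' view is simply the latest row.
rows = con.execute(
    "SELECT city FROM sat_customer_details WHERE customer_hk='h1' ORDER BY load_date"
).fetchall()
print([r[0] for r in rows])  # ['Brussels', 'Antwerp']
```

Because every load is an insert keyed by business key and load date, the raw history stays intact and any point-in-time or ‘current’ view can be derived as a query, which is exactly the kind of repetitive pattern that automation thrives on.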


dFakto explains how you can grow profits while lowering costs by responding more rapidly to your customers’ needs

Many companies are under constant pressure, and this is felt right throughout the organisation. While boards and CEOs continue to insist on increased revenues and lower costs, the need to delight customers, multiply customer growth and put them at the centre of your preoccupations is becoming increasingly critical to the well-being of the business.

The problem is that while companies are trying hard to become better, the results often fall short. dFakto believes that to be successful in building value and providing compelling customer experiences at a lower cost, companies need to use the power of digital and data to transform themselves into a more agile and more effective organisation devoted to customer growth.

Boosting customer growth – It’s not about big data, it’s about the right data

Sadly, a company’s internal systems are too often disconnected from the reality of the customer. This is because most of these processes grew from internal requirements and a traditional focus on siloed products and services. This inside-out focus is at odds with winning over the customers of today, who are increasingly demanding and tech-savvy.

Even those companies that have taken the decision to build a ‘live’ picture of their customer have run into difficulties. There is a natural desire to link legacy ERP systems with a new CRM approach, yet the results have been patchy to say the least. This kind of decision often means years of development and technology choices that are quickly superseded and behind the times. In short, there is quite a distance between the internal systems and processes (with legacy systems and rigid IT infrastructure) and the demands of the marketing department and ‘customer experience’ management.

Customer growth thanks to a fast, evolutive and tailor-made data management platform

A rapid and efficient solution to this problem is a dynamic data warehouse from dFakto. It sources the required data, no matter what type of system it comes from, and puts it in a unique “data factory”. There the data is enriched as necessary, quality controlled, and converted into advanced reports that offer genuine customer insights.

The result is near real time data on your customers and a far greater marketing agility letting you make decisions and respond to changing customer expectations rapidly. The system offers insights based on real data, real customer-centric data and is highly secure.

The payback to you is operational excellence and a far more nimble organization: dFakto can develop a proposal in days and deliver first data insights within a matter of months. We’ve done it before for the civil engineering, aerospace, finance and automotive sectors, and we can do it for you.

Ask us for a confidential conversation and see how we can help you generate growth beyond your expectations: Thibaut Ladry – tla@dfakto.com, m: +32 473 514 730.

Is your company’s data warehouse working the way you want it to? Are you happy with your dashboards? Do all the sources of data you produce feed into it? It’s possible that if you’re using the ‘old methods’, the answer is ‘probably not’. Sadly, this isn’t going to change anytime soon if your business changes at any pace. Indeed, you likely have a series of overlapping change requests and a reluctance to introduce more until you can have assurance of some progress, or even some semblance of currency.
Rising above the change requests, a more pertinent question is: ‘Does your data warehouse reflect the current reality of your business?’ Certainly not if you are still waiting on those pending changes. In fact, I’ll bet you have probably resigned yourself to all of this lag and labelled it ‘business as usual’. You might, if you are proactive, be looking out for technology solutions, but it need not be like that.

Old dogs, new tricks?

Using just the ‘tried and tested’ approaches, any time you change one thing in your data warehouse there are usually knock-on effects on dependent systems. As a result, your data warehouse must usually be altered (and tested!) iteratively until its data model can cope. This happens a lot, and many have resigned themselves to accepting it as the price of change in ‘modern’ data warehouse technology. We know it’s not true.
New regulations, compliance requirements, connected external systems, business mergers, process changes or simply expanding requirements introduce a need for modifying how you capture, store and analyse your data. In fact the most significant challenge of any data warehouse is to capture change while maintaining continuity with historical data so that reporting is consistent and accurate. Sounds impossible? Not really.
Making changes using traditional methods will nearly always require a review of data models, impact assessments and even complete rewrites when new requirements are implemented – it’s all very slow, tedious and sometimes painful. And when you are done, the model changes must be tested for assurance that nothing has been broken, before moving from staging to live analytics. Weeks turn into months with all of this, and it involves a huge amount of time and effort.

A better approach: Data Vault

It’s complicated keeping up with today’s pace, which is why some clever people came up with the Data Vault discipline. It’s a method of separating the core data structure from the data content. The result is that you can quickly and accurately adapt to change without suffering the lag times and inaccuracies of traditional data warehouse methodologies. You build the structure ONCE and then hang your data off of it in ‘satellites’. Need to change? No problem: just add additional structure without breaking what was already there. If you need to include more data, you simply hang more off of the core structure (‘hubs’ and ‘links’).
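The ‘add structure without breaking what was already there’ idea can be sketched concretely. In this hypothetical example (all names are illustrative), a new data requirement is met by creating a new satellite on an existing hub; the hub and the satellites already loaded are never altered.

```python
import sqlite3

con = sqlite3.connect(":memory:")
# Existing core structure: a hub with one satellite already loaded.
con.executescript("""
CREATE TABLE hub_product (product_hk TEXT PRIMARY KEY, product_code TEXT);
CREATE TABLE sat_product_pricing (
    product_hk TEXT REFERENCES hub_product(product_hk),
    load_date TEXT, price REAL,
    PRIMARY KEY (product_hk, load_date));
""")
con.execute("INSERT INTO hub_product VALUES ('p1', 'PRD-42')")
con.execute("INSERT INTO sat_product_pricing VALUES ('p1', '2024-01-01', 9.99)")

# A new requirement (say, sustainability data from an acquired business)
# becomes a NEW satellite hung off the same hub: no ALTER statements,
# no migration of the pricing data already captured.
con.execute("""
CREATE TABLE sat_product_sustainability (
    product_hk TEXT REFERENCES hub_product(product_hk),
    load_date TEXT, co2_kg REAL,
    PRIMARY KEY (product_hk, load_date))
""")
con.execute("INSERT INTO sat_product_sustainability VALUES ('p1', '2024-06-01', 1.2)")

# Old and new data coexist around the same business key.
price = con.execute("SELECT price FROM sat_product_pricing WHERE product_hk='p1'").fetchone()[0]
co2 = con.execute("SELECT co2_kg FROM sat_product_sustainability WHERE product_hk='p1'").fetchone()[0]
print(price, co2)  # 9.99 1.2
```

The design choice is the point: because descriptive attributes live in satellites keyed to the hub, change is always additive, which is why existing reports and loaded history are untouched.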
Data Vault is a methodology, not a technology. It is a way of thinking about data, rather than a shiny new trend. It involves separating the structure of your data from the changes. It simplifies and stabilises your data model, so that it becomes a fixed entity into which your raw data is locked and never changes. It is a vault of truth of your business – warts and all.


Regulation often drives evolution in organisations, but never before has regulation been so close to shaping every organisation’s approach to data. The General Data Protection Regulation (GDPR), the EU’s answer to the increasingly fast evolution of data collection and processing in our modern world, intends to govern the collection and usage of personal data through a global approach. All public and private organisations must prepare for the scheduled start date of 25 May 2018.

A data protection regulation? What exactly does it protect?

The GDPR intends to protect privacy by applying rules to the collection, safekeeping, processing and usage of personal data, substantially altering the terms of previous European and national regulations on the topic. Personal data is defined as information permitting direct or indirect identification of individual persons. This encompasses the obvious names and identification numbers, but also covers location data and specific characteristics of persons such as physical traits, health information or socio-economic data. Anything that links to a person is considered personal data, whatever the context and whatever the reason the information was collected in the first place.

This means that information on the private or professional context of a person is still personal data, and as a consequence not only is client or prospective client data concerned, but also data on human resources, providers and partners, be it on a commercial or non-profit basis.

Processing of personal data

As is now commonplace, the regulation encourages organisations to enforce privacy management through a risk-based approach, meaning that the level of control and protection must be commensurate with the sensitivity of data.

The key objective of the data controller is to ensure that data subject rights are protected on an on-going basis, as well as through specific actions and participation in environment-altering projects for assessment and safeguard setup purposes. Even though GDPR aims at protecting privacy, the regulation remains mindful of the importance of data analysis in today’s connected world, allowing some leeway through pseudonymisation, encryption, management of client consent and privacy-by-design concepts. The data controller will be responsible for safekeeping personal data as well as demonstrating compliance upon request from a data subject or regulator.

Where relevant, i.e. in public institutions or certain organisations handling sensitive personal data, a data protection officer must be appointed. The data protection officer has an overarching responsibility for privacy protection, and should be considered for all intents and purposes the GDPR compliance officer. The role encompasses the on-going training of, and advice to, concerned functions across the organisation, as well as a front-line role towards the relevant regulatory bodies.

Roadmap to compliance

Achieving regulatory compliance usually starts with a gap analysis. This will help identify the key items to address and pave the way for remediation implementation. In the case of GDPR, the data-centric approach to privacy protection means that any future data source or repository underlying existing or new infrastructure will in turn be subject to the same regulatory requirements.

Building a future-proof infrastructure today seems like a vain promise, and it probably is, due to the increasing pace of regulation review, the speed of technological evolution and the relative complexity for organisations to marshal their forces into projects. dFakto offers an easily updated, easily connected solution to the existing infrastructure with the sole purpose of cataloguing and analysing all data in the light of the regulation, shifting the challenges from the entire organisation to just maintaining a single, dedicated application.

Platform-based analytics

Maintenance of compliance standards in an environment where both the infrastructure and compliance rules may rapidly evolve constitutes probably the greatest challenge of GDPR for any private or public organisation.

Maintaining compliance in a traditional project steering environment implies privacy-centric functions in each and every project, substantially altering the momentum of project roll-out through additional governance, compliance gap analyses and potential re-engineering. Even though these constraints are common to project management across most sectors, they do not constitute a sound base to grow a future-proof privacy-minded business and technological environment.

Experience tells us all that both the infrastructure and regulation will evolve over time, probably even faster than we would expect. With that in mind, it seems reasonable to assume that the most relevant action today is to implement a platform, the role of which is limited to connecting all data sources across the organisation to detect personal data, store an audit trail and issue actionable reporting to data controllers.

Today’s technology and process expertise can help us move beyond the limitations of the past, as today, we can build a living catalogue of data able to:

  • connect easily to all existing and future data sources,
  • apply rules categorising data depending on their compliance risk level,
  • store encrypted or pseudonymised data,
  • report to the data controllers their list of required actions based on compliance rules,
  • and structure the follow-up to ensure that actions are executed in due time.
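To make the idea of such a living catalogue concrete, here is a minimal sketch of how a record could be classified, pseudonymised and turned into actions. All names (the rule set, risk levels, field names) are hypothetical illustrations, not the actual dFakto implementation:

```python
import hashlib

# Hypothetical rule set: field names mapped to a compliance risk level.
RISK_RULES = {
    "name": "high",
    "email": "high",
    "health_status": "high",
    "postcode": "medium",
    "order_total": "low",
}

def pseudonymise(value: str, salt: str = "example-salt") -> str:
    """Replace a personal value with a salted hash so records stay linkable."""
    return hashlib.sha256((salt + value).encode()).hexdigest()[:16]

def catalogue_record(record: dict) -> dict:
    """Classify each field, pseudonymise high-risk ones, and list required actions."""
    entry = {"fields": {}, "actions": []}
    for field, value in record.items():
        risk = RISK_RULES.get(field, "unknown")
        stored = pseudonymise(str(value)) if risk == "high" else value
        entry["fields"][field] = {"risk": risk, "stored": stored}
        if risk == "unknown":
            # Unmapped fields become an action item for the data controller.
            entry["actions"].append(f"classify field '{field}'")
    return entry

entry = catalogue_record({"name": "Jane Doe", "postcode": "1000", "loyalty_tier": "gold"})
```

The point of the sketch is the shape of the process, not the rules themselves: detection and categorisation are driven by a central rule table, so updating compliance rules means updating one place rather than every project.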

In this way, we can move from regular reporting and one-off analyses to a dedicated, continuous monitoring process, limiting the operational and project impacts on the whole organisation while enforcing a privacy-centric approach to data management.

Conclusion: the case for platform-based personal data management

Connecting any tool to a new source will always represent some IT work to integrate data. But that work is limited in scope and cost, and above all does not hinder the progress of new implementation projects and the maintenance of the existing infrastructure.

Likewise, updating the detection rules and the compliance rules defining actions to be taken always represents analysis work, sometimes even requiring dedicated regulation specialists. But that work is also limited in scope and time, and can be performed in a yearly review process, meaning that it does not need to be restarted for each new project in the organisation.

Shifting the burden of compliance implementation and maintenance from a large project team to a small taskforce of dedicated individuals makes sense not only from an organisational point of view, but also from a customer-centric one, since the cost of regulatory compliance usually ends up being borne by the client.

All these elements build the case for a platform-based answer to the issue of regulatory requirements on data management – a solution rooted in data management to answer the challenges of privacy protection.

You want to learn more about GDPR?

dFakto is organizing a workshop to review and discuss the main issues that need to be tackled when it comes to GDPR. For more info, please contact JOANA SCHMITZ at jsc@dfakto.com or +32(0)

More about the author: LinkedIn – Dorian de Klerk

Transformation has become a necessity in business rather than a nice-to-have; indeed, those businesses not thinking about it in one form or another are really just sticking their heads in the sand.

Research by the consulting company McKinsey confirms that only 26% of executives say their companies’ transformations have been very or completely successful at both improving performance and equipping the organization to sustain improvements over time. According to the research, no single action explained the difference in success rate; however, one thing was clear: the more change actions an organization undertook, the higher the likelihood that it would succeed in its transformation. The research also suggested that practices such as communicating effectively, leading actively, empowering employees, and creating an environment of continuous improvement were contributing factors.

16 years of transformation experience

Over the last 16 years, dFakto has gained a lot of experience about what makes a successful transformation, and is happy to share some insights. A good plan is obviously a good place to start, and our business consultants, who’ve seen a few transformation initiatives in their time, are well placed to give an opinion on the quality of the plan.

It’s usually pretty easy to spot the companies and organisations that are the most likely to succeed, as they all tend to have a clearer view on their future. Realistic answers to a couple of pertinent questions quickly tend to sort out the men from the boys, as it were.

  • What is the objective of your transformation? How long have you given for it to happen?
  • Have you got a team dedicated to the project, or are they combining other functions? Are they experienced?
  • Who is the sponsor of this transformation? What level are they at: Board of Directors, management, head of department?
  • What funds have been devoted? Is this Capex or Opex?
  • How will progress be monitored?
  • How will management be informed of the progress?

It is on these last two questions that dFakto is particularly able to advise and help. Too often the focus is just on ‘actual’ expenditure, thereby only looking backwards, when an eye on both current and future investments is required. It’s a tough nut to crack, especially as the ‘target’ is a moving one: companies tend to start out thinking they need to be at place A, but quickly have to plan ahead as to how they can arrive at places B and C. There is no doubt that the issue is even tougher when the company in question has no idea of its progression.

Numbers are not enough

Experience has taught us at dFakto that the numbers are not quite enough, and that a combination of qualitative and quantitative data is required to know where you are, how you are progressing and how much still has to be completed. Of course, financial expenditure and planned investment information is collected and verified, but so too is how people feel about how they are progressing.

The raw data to be collected can be made available in many different forms, including project management software, text files, information on websites, SAP databases, etc. Once sourced, it is checked and presented in such a way that the CEO and/or the board can quickly and easily get answers to their questions. This dashboard can be updated in just a few clicks, as the numbers are automatically and directly sourced from the client’s systems. Consolidation is quick, meaning that it is easy to play the ‘what if’ game: what would be the impact of adding this or that plan, for example?
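The consolidation and ‘what if’ steps can be sketched in a few lines. This is a hypothetical illustration of the principle only (the source names, figures and data shape are invented, not dFakto’s actual pipeline):

```python
# Hypothetical consolidation: each "source" yields (project, spent, planned) figures.
def consolidate(sources):
    """Merge per-project figures from several systems into one dashboard view."""
    dashboard = {}
    for source in sources:
        for project, spent, planned in source:
            row = dashboard.setdefault(project, {"spent": 0.0, "planned": 0.0})
            row["spent"] += spent
            row["planned"] += planned
    return dashboard

def what_if(dashboard, extra_plans):
    """Play the 'what if' game: overlay extra planned spend without touching the base."""
    result = {project: dict(row) for project, row in dashboard.items()}
    for project, planned in extra_plans.items():
        result.setdefault(project, {"spent": 0.0, "planned": 0.0})["planned"] += planned
    return result

erp = [("CRM rollout", 120_000.0, 200_000.0)]
pm_tool = [("CRM rollout", 30_000.0, 50_000.0), ("Data platform", 10_000.0, 90_000.0)]
base = consolidate([erp, pm_tool])
scenario = what_if(base, {"Data platform": 40_000.0})
```

Because the scenario is computed as an overlay rather than a rewrite of the base figures, the ‘what if’ question can be asked and discarded without polluting the single version of the truth.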

The naked truth

dFakto gives management the reassurance that the data they are looking at is reliable and has been cross-checked, and that the data being presented is the same as that being used all around the company… in other words, “one single version of the truth” with no trickery, no reworking or manipulation of the figures.

The automation of this data collection across the organisation means that the PMO and team can spend more time thinking about the issues and much less time simply organising the collection and verification of data. It means more time can be invested anticipating the next moves in the plan, and on the change management required.

A bonus is that there is no agenda behind the figures: they are what they are, and accepted as such, facts and figures. The happy by-product is that any discussions about change are non-confrontational, making it easier to identify, discuss and rectify the bottlenecks in the transformation.

Change and transformation are painful by definition, but they are a lot easier than having to make big decisions in a hurry. All the more reason why it’s nice to know where you’re at, where you’re going and when you should get there.

Obviously, this doesn’t mean that your plan should be as static as the figures can be. Linearity can be the reason for its failure. When implementing a transformation plan, special attention should be paid to its “non-linearity”.

Any transformation plan must allow some flexibility, so that it can be quickly modified to limit the damage if it turns out a bad idea is being implemented. On the other hand, if the plan overdelivers, it should be flexible enough to exceed initial expectations. This “non-linear” approach has many consequences: ideas can be tested out and resources can be reallocated much faster if needed.

In conclusion

With continuous improvement now a core competency of managers in business today, it is important that they stay on top of what is happening in their organization. With dFakto’s help, they can understand what’s happening and make the necessary transformations, thereby ensuring their business is sustainable and fighting fit for the future.

You want to learn more about Transformation?

dFakto is organizing a workshop to review and discuss the main issues that need to be tackled when it comes to Transformation. For more info, please contact JOANA SCHMITZ at jsc@dfakto.com or +32(0)

First of all, we would like to say that we are very proud that our “T360 solution” has functionalities that are completely in line with the expectations and trends for 2017! This point was recently validated twice: first in Forbes, and afterwards by Alexandra Levit, currently a writer for the NY Times, who mentioned dFakto’s achievements in VR and 3D printing on QuickBase as the number 1 trend in project management developments for 2017.

3D Printing and Virtual Reality for project management?

Many projects today take place in the cloud and via distributed teams, which can make it difficult for team members to feel fully immersed in their work experiences. Virtual reality and 3D printing technology reinvigorate the project lifecycle so that tasks and collaboration efforts resonate more strongly.

As stated by our CEO Thibaut de Vylder in the interview for the Forbes article: “We created a UX experience in project management that allows people to update their project status in 30 seconds or less each month just by focusing on what’s most important.”

The virtual reality experience provides full immersion with an app that allows Project Managers to see progress charts in a personal theatre style. Focus your eyes on a particular chart, and it grows. You can dive into elaborations the same way. The 3D printing component, similarly, facilitates the sense of touch. It creates something you can put on your desk, unlike a digital report that you can’t see once you close it.

What we believe

We definitely believe that these technologies reduce the information gap by providing information and insights the way people expect them. Not only do future data-driven platforms need to be open, modular and techno-agnostic, but they also need to deliver value to the stakeholders.

We strongly believe that the following principles are the key to success:

  • Open: to any useful technology that provides value.
  • Modular: the considered technology must integrate smoothly and can be replaced on demand.
  • Techno-agnostic: no religion. Just cost effectiveness!

Take a look here if you want to know more about T360.

As you’ll surely know, the overall objective of the BCBS239 regulation is to strengthen banks’ risk data aggregation capabilities and risk reporting practices as they relate to credit risk, operational risk, market risk and liquidity risk. This regulation is mandatory for the ‘systemically important banks’, known as G-SIBs, such as BNP Paribas, ING and Belfius (in Belgium). A full listing of these banks is available here. It is, however, also strongly recommended that national regulators apply the same regulatory principles to domestic systemically important banks (D-SIBs) three years after their identification as such.

The new regulations insist that G-SIBs should be able to produce their aggregated risk numbers ‘within a short time, like in a crisis mode’. The overall aim is that a better understanding of their risks will help improve decision-making processes at banks. Remember 2008? Well the financial authorities still remember it well too … BCBS239 is a step to ensure that history doesn’t repeat itself.

What is the BCBS239 regulation?

A regulation in reaction to the 2008 crisis, BCBS239 outlines 14 principles under 4 main chapters: I. Overarching governance and IT infrastructure; II. Risk data aggregation capabilities; III. Risk reporting practices; and IV. Supervisory review, tools and cooperation. Even though it is a principle-based regulation with few clear, defined metrics that can be used to monitor compliance, it is designed to improve the reporting and supervision of risk within large banks.

Do you have time to wait for someone to give you one single version of the truth?

In the long term, the solution seems simple: streamlined data aggregation as an input for all BCBS239 requirements. However, this is easier said than done, as many banks have legacy IT systems that are not adapted to the improved financial regulation reporting demands. Moreover, the data necessary to feed the reporting needs is not available from one single data source but is historically spread throughout the organisation.
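The core of the aggregation problem can be sketched simply: roll up exposures by risk type across every source system, and fail loudly when a system reports something unmapped. This is a hypothetical illustration; the source names and figures are invented, not any bank’s or dFakto’s actual data model:

```python
# Hypothetical aggregation: each source system reports (risk_type, exposure) pairs.
def aggregate_risk(sources):
    """Roll up exposures by BCBS239 risk type across all source systems."""
    totals = {"credit": 0.0, "market": 0.0, "operational": 0.0, "liquidity": 0.0}
    for system in sources:
        for risk_type, exposure in system:
            if risk_type not in totals:
                # An unmapped risk type is a data-quality issue, not a number to hide.
                raise ValueError(f"unmapped risk type: {risk_type}")
            totals[risk_type] += exposure
    return totals

trading = [("market", 2.5e6), ("credit", 1.0e6)]
lending = [("credit", 8.0e6), ("liquidity", 0.5e6)]
report = aggregate_risk([trading, lending])
```

The hard part in practice is not this roll-up but getting every legacy system to feed it reliably; the sketch only shows why a single, verified aggregation point makes the ‘crisis mode’ reporting demand tractable.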

Making faster and more accurate reporting a reality is one reason large banks have appointed a Chief Data Officer. This CDO will need to make sure that the risk numbers are regularly available and that they are accurate and verified. Legacy IT systems, with data spread over a multitude of sources, and additional pressure on operational and commercial use of data is just adding to the difficulties facing the CDO.

We need those numbers… now

Let’s imagine that you have all the data available, and it has been aggregated and verified. In itself, that already seems like a great achievement. But now imagine you need to produce the necessary dashboards and insights on the spot. The dream seems a bit further away now, doesn’t it? The reality is that many G-SIBs cannot currently deliver accurate numbers within a limited time frame, and it would be even harder in a difficult ‘crisis’ period. This capability will inevitably take time to develop, and even longer to set up and put in place. Can the G-SIBs wait for IT to deliver concrete results in aggregating risk across operations?

If you think the answer is yes, then that’s fine: you don’t need to read any further. But if you think the answer is no, then dFakto has a cost-effective solution for you that could be worked out and delivered within just a short time frame.

dFakto are experts in managing data and aggregating results. They have a long track record of sourcing and storing confidential data from within IT systems, verifying it, enhancing it as necessary and then presenting it in a way that is easy for everyone to understand. Their data factory and data vaulting techniques allow for real-time dashboards, meaning that if you have a risk aggregation question tomorrow, you can also answer it tomorrow.

If you are looking for a real answer to this problem in a short time frame, then don’t just put more consultants on the job: hire people who can deliver an effective solution that combines business intelligence and technical know-how.

Ask dFakto to explain how their expertise and experience could be put to good use and help you solve a pressing regulatory issue within just a few months.