So you actually bought into that new technology stack that was really going to improve your analytics and help you make better decisions? The truth is that while it may well help, resolving the conflict between flexible enterprise change and accurate historical reporting will always be a matter of discipline rather than technology in a successful modern business. Sadly, no technology is the holy grail. After all, none of them has yet managed to fully put you at ease, has it?

Just as you cannot expect a project to run itself simply because you have the latest project management software, the same is true of data warehouse management: it is the methodology that counts. And just like agile project management methodologies, Data Vault compartmentalises change to encourage flexibility.

Pitfalls and inadequacies of steady state

In the best of cases it probably takes months to organise changes within your data warehouse. And the crazy thing is – even for simple changes – it may be incomprehensible why ‘just adding a few fields’ (or updating one, heaven forbid!) can take such a long time.

The main problem is the methodology of storage, not the technology. The fact is that the old Star Schemas, 3NF (Third Normal Form) systems and Snowflakes just aren’t cut out for change, as they were (and still are) designed for analysis of consistent data rather than data capture. So while Star Schemas and Snowflakes are especially good at some analytical tasks, and 3NF is great for enforcing cascading point-in-time structure, none of these methods copes well with change. And none is made to accommodate (and reconcile) the data structure from 5 years ago with the data you have today and the requirements of tomorrow … they are simply too prescriptive, designed for a single, current way of doing things. As a consequence, when that way changes, you essentially have to (carefully) throw out the old and start again – hence all that (expensive) time.

The best answer is to accept change, and embrace it.

Enter Data Vaulting. (Applause.) It can capture data from anywhere, and extract virtualised views as moving snapshots whenever required.
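
To illustrate the idea of virtualised, moving snapshots, here is a minimal sketch in plain Python (the record layout and names are invented for illustration, not part of any Data Vault standard): every change is kept with its load date, and a ‘view’ of the business as of any date is just a filter over that history.

```python
from datetime import date

# Toy satellite history: every change is kept, nothing is updated in place.
# Each row: (customer_ref, load_date, descriptive attributes)
history = [
    ("C001", date(2015, 3, 1), {"segment": "retail"}),
    ("C001", date(2019, 6, 9), {"segment": "premium"}),
    ("C002", date(2017, 1, 15), {"segment": "retail"}),
]

def snapshot(rows, as_of):
    """Virtualised view: the latest known state of each key as of a date."""
    latest = {}
    for key, loaded, attrs in sorted(rows, key=lambda r: r[1]):
        if loaded <= as_of:
            latest[key] = attrs  # later loads overwrite earlier ones
    return latest

print(snapshot(history, date(2016, 1, 1)))  # {'C001': {'segment': 'retail'}}
print(snapshot(history, date(2020, 1, 1)))  # both customers; C001 now premium
```

The same stored history serves the 2016 view and the 2020 view; nothing was thrown away to produce either.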

‘But our data is always changing.’

It always seems to be about ‘new’ and ‘the next big thing’, but glancing backward at the ‘history’ or considering ‘change’ is often an afterthought, best left to ‘others’ to reconcile. How are we to understand how well the business is doing over time without some consistency amidst the technology transformation? Of course it is important to adapt so that you can be interoperable with the best standards, but how can we understand (or allow!) real growth and change while still accurately tracking history?

The older methods like Kimball and Inmon (Star Schema, 3NF and Snowflake, anyone?) were created a long time ago, when change wasn’t so rapid and we didn’t have such volumes of data. Back then, you were designing a data model for a specific, single solution and could afford the time to throw out the old data model and start again. Their continued appeal is that lots of people use them – but they are being used for the wrong purpose! They are effectively buckets being used to grab rivers of information, reading snapshots of a data stream as if it were a fixed data model rather than a continuum of change. Putting those snapshots together into some sort of continuous historical view of business information all begins to look a bit odd! Data Vault changes the game from these old-school disciplines and bends to the flow of the river, making the underlying technology almost irrelevant.

Distinct purpose: capture raw history, flexibly

Emerging from these older disciplines, Data Vault is a hybrid evolution, specifically designed from the ground up for change management and organising diverse sources. Where the older methods attempted to shoehorn data streams into their models to make the data analytically usable, Data Vault is all about embracing change across the entire enterprise. The old methodologies are still very useful – they are good at transforming data into insights – but they are not good as raw systems of record. Data Vault’s sole purpose is to structure heavy workloads of changing, disparate data sources, handle them in an agile way, and promote onward data quality and processing. Its purpose is primarily as its name suggests: to take raw ‘Data’ (rather than information) and put it in a ‘Vault’ (captured and safe) that stores everything in its purest original state – an immovable, yet flexibly structured, system of record.

By borrowing modern social data concepts like ‘relationships’ and ‘entities’ and combining them with the older methodologies, the structure of your business concepts is separated out from the sources and forms the skeleton on which the data is hung. It is possible, and recommended, for a knowledgeable business user to design the intuitive core data concepts behind the Data Vault on a whiteboard before involving the data professionals to fill in the details. Data Vault is a ‘business first’ methodology, organising data around the business ideas rather than conforming the business ideas to the data model.

Businesses may change superficially, but the concepts that underpin them – customer, product and so on – do not. It is these that form the backbone of the Data Vault.
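
As a minimal, hypothetical sketch (plain Python, invented names), here is how that whiteboard backbone maps onto the three Data Vault building blocks: Hubs for the stable business concepts, Links for the relationships between them, and Satellites for the changing descriptive detail.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Hub:
    name: str           # a stable business concept, e.g. 'Customer'
    business_key: str   # the key the business already uses

@dataclass(frozen=True)
class Link:
    name: str
    hubs: tuple         # the Hubs this relationship connects

@dataclass(frozen=True)
class Satellite:
    name: str
    parent: str         # the Hub or Link the descriptive data hangs off
    attributes: tuple   # descriptive attributes that change over time

# The backbone: stable business concepts...
customer = Hub("Customer", "customer_ref")
product = Hub("Product", "product_code")

# ...the relationship between them...
order = Link("Order", (customer, product))

# ...and the changing detail, kept separate so it can evolve freely.
customer_details = Satellite("CustomerDetails", "Customer",
                             ("name", "address", "segment"))
```

A new source later simply becomes another Satellite hung off an existing Hub or Link; the backbone itself is untouched.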

You want to learn more about Data Vault?

Find out more from our friendly team of business and technical experts at: info@dfakto.com or +32(0)2.290.63.90.


dFakto asked Guido Dauchy, former Head of Transformation at BNP Paribas Fortis, to share his experiences of working with dFakto on the transformation plans at BNP Paribas Fortis. BNP Paribas Fortis is a long-term client of dFakto, and we were fortunate to work with them even before the transformation plans of 2012 were announced. In short, dFakto has been instrumental in keeping senior executives abreast of what was really going on with the programme, on an almost daily basis. For as Guido explains, every company needs as much help as it can get.

Starting in June 2012, the “Simple & Efficient” initiative launched with two objectives:

Simplify the Group, particularly after several successful integrations and several years of growth
Improve efficiency, in light of accelerating regulatory pressure and an uncertain economic environment
The reality is most transformation projects are not successful; indeed the probability of delivering as expected is low (~30%). When you look at the numbers in detail, about 70% of failures are due to poor execution rather than poor strategy.

In Guido’s opinion, two aspects contribute to this: the first is a lack of commitment to deliver by management, and the second is the fact that plans are outlined at too high and too strategic a level, and are therefore not actionable. He recommends ensuring that governance meetings are fed with “valuable, fresh, frequent & qualitative insights” from the field, and that actuals and forecasts are monitored frequently. This ensures an automatic measurement of strategic gaps ‘to date’ and ‘at completion’, and, as a consequence, the ability to follow the evolution of these gaps and close them.

In 2012, BNP Paribas launched the Simple & Efficient transformation project, with the aim of generating some 2 billion € of savings by 2016. Five years later, the programme has not only reached its initial objective but exceeded it, delivering 3.3 billion € of savings – 65% above the initial target. How did this happen?

with Guido Dauchy, former Head of Transformation at BNP Paribas Fortis, 20th April 2017

Four key success factors

Four key success factors were identified: evolving governance, a centralised budget, a simple transversal methodology and a dedicated resource team.

It is important to have governance that can evolve. At BNP Paribas, the governance changed according to the phase of the programme. Initially, the executive committee oversaw the planning of the work, while the heads of the different business units worked on the plan. The former gave support to the latter, who then had the responsibility of monitoring what actually happened in the field. This ensured that top management had the right information in the right format as soon as possible, and as directly as possible.

The fact is, however, that it’s not always easy in big organisations to have the right information at the right time. This is why there is a need for an incentive, and it is where the centralised budget came to play an essential role: by funding specific projects, you trigger the feedback you need for those projects. It amounts to a kind of mutual agreement between the steering committee and the heads of business units.

Note, though, that top management should not be made to play the role of police; their role is to make sure that the projects that have been promised are actually executed. Furthermore, it was important that reporting be as simple as possible: only the details that are relevant, including whether or not the business unit was on track to reach the objectives of the overall programme.

A third condition for success in any transformation is using the right methodology. The reality is that in big organisations, you don’t have the time to convince people to be part of something. You therefore have to create a dynamic that gets the organisation working within the thinking of the programme from the very beginning. In the case of BNP Paribas, this methodology was based on 3 phases. Initially, of course, the overall goal had to be defined. Once this was agreed, the 1st phase was a top-down overview plan that included all the potential savings for each business, function and territory. The goal of this phase was to decide what kind of initiatives could be taken to reach the overall objectives – in other words, to pinpoint the savings that could be delivered and at what cost. Ideally this phase should last only about 6 to 8 weeks; the recommendation is not to take too much time. Make it easy for those doing the job, and because transformation is not their normal job, give them the support they need. The 2nd phase consisted of implementing the ideas from phase 1. More people are involved in this phase; indeed, it’s where you bring in the people capable of turning the initiatives into projects.

At BNP Paribas, it took 12 weeks to translate everything into project programmes. The reality is that you have to know what you’re doing, how you’re doing it and when you’re doing it, otherwise the initiatives simply won’t move forward. You also shouldn’t allow initiatives outside the scope of the plan. The last phase is the implementation phase. It is longer than phases 1 and 2, and it’s in phase 3 that you make the changes in governance.

The last key success factor identified in the BNP Paribas case is the necessity of having a dedicated team. The beating heart of the programme, they are the people who bring everything together: managing the programme, giving information to the governance, and presenting possible solutions. They were also the people who made sure that decisions were acted upon on the ground.

These 4 factors helped BNP Paribas exceed the initial objective of 2 billion € of savings and reach 3.3 billion €. Projects that exceed their initial objectives represent only a small percentage compared to those that merely reach them (and an even smaller one compared to those that fail). So how did the programme exceed its objectives?

3 ways to exceed objectives

The first is the extension capability. It refers to the dynamics created once phases 1 & 2 of the methodology have started. During these phases, you see lots of questions being raised and many people starting to communicate with each other; all begin to be more and more convinced of the real purpose of the programme. The result is that they begin to think in another way: their mindset changes to be in line with the dynamics of the programme. At that moment it’s important to start a new wave of phases 1 & 2, because people will now come up with better and more valuable ideas and solutions.

The second is the compensation capability. If a project fails to deliver for whatever reason, the loss needs to be compensated by a new project or by extra effort in another project in order to respect the initial commitment. This implies that all parties are fully transparent about the progress of their projects, identifying any potential problems at the earliest possible moment so that they can respond quickly and limit the risk of “non-achievement” of the objectives. With transparency and non-judgemental communication it’s possible to admit that a project will deliver less than expected. And experience showed us that people always find compensation elsewhere, because they don’t want to show failure before the executive committee!

The third is continuous transformation. A transformation doesn’t happen in a vacuum; it occurs in a world that is changing and disruptive. This is why a dedicated team is essential: it ensures continuous transformation within the company, because if you don’t adapt, you’ll disappear.

At dFakto, the way we see transformation involves several things. First, we encourage pro-active management. We believe that if you can identify issues before they have an impact, it’s easier to resolve them (hence the real value of supplying valuable, fresh, frequent & qualitative insights). When people realise they won’t succeed on their part of the programme, they will report it, and by doing so avoid negative figures in the results. Furthermore, if you can identify what’s wrong as early as possible, there is a strong incentive to find ways to solve the problems and deliver.

In short, as Guido says,

“We have a more organic approach to transformation. We see it as a living organism. It’s something reactive: you can do something, or you can do something different. We used to plan, execute, and then give feedback. Today, we no longer have this purely linear approach to transformation. We see it, thanks to dFakto data, as something much more cyclical, with a higher reporting frequency and, consequently, a higher planning frequency.”

 


Data Vaults are agile, but what does that mean in simple terms?

In simple terms, it means you don’t have to ‘eat the whole whale’ in one go. Indeed, small bites are more effective for achieving data visibility. You can even share the meal among a collection of people all at once, and the pace can increase without introducing errors. It’s because the Data Vault is highly structured that you can split it up into pieces and (without any difficulty) join them back together again. It means you can do it all at once, in small increments, or something in between, at your own pace and according to your own resource constraints.

Just like Lego (snap it together, build out)

Your business decides the core structure of business concepts, and the data snaps onto these concepts rather than being forced into more traditional styles of data modelling. You can start right now with as little or as much data as you have, and add to it as you figure out your future requirements instead of designing a full data model first. And because Data Vault is atomically structured, you can use the tiniest piece in a dashboard from the moment that smallest bit of agile data is available. If your data grows, it won’t affect what you have already measured.

Want to trial it on a small scale to evaluate the benefits before expanding to the entire enterprise? No problem: Data Vault won’t waste your time, because whatever you build now will not have to change, allowing you to grow and expand your enterprise data warehouse once and one time only. You may choose to scale quickly by setting up parallel teams, each building separate agile data stores. With Data Vault, these can all be easily synchronised – the core structure is separate from the data – and merged together afterwards. If teams agree on a simple core structure of business concepts and relationships, they can each develop on top of the shared construct. It isn’t a model, it’s an agreed way of connecting business entities – something you can achieve on a whiteboard!

  • Start anywhere, evolve elsewhere, bring it all together anytime: Integrate business domains driven by real business priorities. Relations between different business domains can be established at any later point in time (they don’t need to be thought of upfront).
  • Start small, grow any size: It doesn’t matter whether you’re building a small data warehouse for some Master Data or a full enterprise data warehouse. The advantage of Data Vault projects is that they bring results very soon after the project starts, and because they are based on business concepts more than on data, they remain highly flexible as they grow.
  • Automate from the start for fast iterations: The key to Data Vault is deconstructed activities which are rapidly repeatable. Data Vaults and automation go hand in hand; indeed, the simplicity and consistency of the approach encourages automation. Standardise your business concepts (on a whiteboard), generate the structure, automate the integration routines and then start growing your agile data warehouse – see the sketch after this list.
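
To make the ‘generate the structure’ step concrete, here is a hypothetical Python sketch (illustrative table layouts and names only – not dFakto’s actual tooling): because hubs and links all follow the same shape, their definitions can be generated mechanically from the whiteboard list of concepts and relationships.

```python
# Hypothetical generator: from the whiteboard's concepts and relationships,
# emit the repetitive table definitions mechanically.
concepts = ["Customer", "Product"]
relationships = [("Order", ["Customer", "Product"])]

def hub_ddl(concept: str) -> str:
    t = concept.lower()
    return (f"CREATE TABLE hub_{t} (\n"
            f"  {t}_key CHAR(32) PRIMARY KEY,  -- hash of the business key\n"
            f"  business_key VARCHAR(100),\n"
            f"  load_date TIMESTAMP,\n"
            f"  record_source VARCHAR(50));")

def link_ddl(name: str, members: list) -> str:
    t = name.lower()
    cols = "\n".join(f"  {m.lower()}_key CHAR(32)," for m in members)
    return (f"CREATE TABLE link_{t} (\n"
            f"  {t}_key CHAR(32) PRIMARY KEY,\n{cols}\n"
            f"  load_date TIMESTAMP,\n"
            f"  record_source VARCHAR(50));")

for c in concepts:
    print(hub_ddl(c))
for name, members in relationships:
    print(link_ddl(name, members))
```

Add a concept to the list and rerun: the new structure appears alongside the old, which is exactly why the approach iterates so quickly.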

Deconstructed data models improve automation

Once the key components of the business and their relationships are understood (keys or IDs, like the customer reference number on an invoice), you simply hang data off them as you find it. It means that a disjointed approach to data gathering, if that’s all that is possible, doesn’t impact the final Data Vault, because that’s how the method is meant to work! Fast, slow, big or small – it doesn’t matter. Work on different areas independently and then bring it all together, or start by attaching all of your existing data marts as sources and keep going from there, but with agility. It’s a very intuitive and flexible process for the business, who can follow the data modelling with no requirement other than understanding how the business processes work and being able to operate a whiteboard.

Breaking down and segregating the work in this way makes it a very repetitive process which scales well with automation. In fact, automation is highly recommended, as the core loading routines of the Data Vault simply extract data and hang ‘Satellites’ onto the lattice-work of ‘Hubs’ and ‘Links’. By segregating the methodology into a network of data tables, the system almost requires automation to fulfil its ultimate goal of near-real-time business data for analytics. You’ll be pleased to hear that dFakto has already developed the key automation features you need to build your first Data Vault in just a few weeks!
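
As an outline of what such a repeatable loading routine might look like, here is a hedged Python sketch of a common Data Vault pattern, the insert-only ‘hash diff’ satellite load (in-memory lists stand in for tables; all names are illustrative): a new row is appended only when the descriptive attributes have actually changed, so history is never updated or lost.

```python
import hashlib
from datetime import datetime, timezone

def hash_diff(attrs: dict) -> str:
    """Fingerprint of the descriptive attributes, in a stable order."""
    payload = "|".join(f"{k}={attrs[k]}" for k in sorted(attrs))
    return hashlib.md5(payload.encode()).hexdigest()

def load_satellite(satellite: list, key: str, attrs: dict, source: str):
    """Insert-only load: append a row only if the data actually changed."""
    new_hash = hash_diff(attrs)
    current = [r for r in satellite if r["key"] == key]
    if current and current[-1]["hash_diff"] == new_hash:
        return  # unchanged: nothing to do, history stays intact
    satellite.append({
        "key": key,
        "load_date": datetime.now(timezone.utc),
        "record_source": source,
        "hash_diff": new_hash,
        **attrs,
    })

sat_customer = []
load_satellite(sat_customer, "C001", {"segment": "retail"}, "crm")
load_satellite(sat_customer, "C001", {"segment": "retail"}, "crm")   # skipped
load_satellite(sat_customer, "C001", {"segment": "premium"}, "crm")  # new row
print(len(sat_customer))  # 2
```

Because every load follows this one pattern regardless of source, it is exactly the kind of step that can be generated and scheduled rather than hand-written.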

You want to learn more about Data Vault?

Find out more from our friendly team of business and technical experts at: info@dfakto.com or +32(0)2.290.63.90.


dFakto explains how you can grow profits while lowering costs by responding more rapidly to your customers’ needs

Many companies are under constant pressure, and this is felt right throughout the organisation. While boards and CEOs continue to insist on increased revenues and lower costs, the need to delight customers, multiply customer growth and put them at the centre of your preoccupations is becoming increasingly critical to the well-being of the business.

The problem is that while companies are trying hard to become better, the results often fall short. dFakto believes that to be successful in building value and providing compelling customer experiences at a lower cost, companies need to use the power of digital and data to transform themselves into a more agile and more effective organisation devoted to customer growth.


Boosting customer growth – It’s not about big data, it’s about the right data

Sadly, a company’s internal systems are too often disconnected from the reality of the customer. This is because most of these processes grew from internal requirements and a traditional focus on siloed products and services. This inside-out focus is at odds with winning over the customers of today, who are increasingly demanding and tech-savvy.

Even those companies that have taken the decision to build a ‘live’ picture of their customer have run into difficulties. There is a natural desire to link legacy ERP systems with a new CRM approach, yet the results have been patchy to say the least. This kind of decision often means years of development, and decisions on technology that are quickly superseded and behind the times. In short, there is quite a distance between the internal systems and processes (with legacy systems and rigid IT infrastructure) and the demands of the marketing department and ‘customer experience’ management.

Customer growth thanks to a fast, evolutive and tailor-made data management platform

A rapid and efficient solution to this problem is a dynamic data warehouse from dFakto. It sources the required data, no matter what type of system it comes from, and puts it in a single “data factory”. There the data is enriched as necessary, quality-controlled, and converted into advanced reports that offer genuine customer insights.

The result is near-real-time data on your customers and far greater marketing agility, letting you make decisions and respond rapidly to changing customer expectations. The system offers insights based on real, customer-centric data and is highly secure.

The payback to you is operational excellence and a far more nimble organisation: dFakto can develop a proposal in days and deliver first data insights within a matter of months. We’ve done it before for the civil engineering, aerospace, finance and automotive sectors, and we can do it for you.

Ask us for a confidential conversation and see how we can help you generate growth beyond your expectations: Thibaut Ladry – tla@dfakto.com, m: +32 473 514 730.


Is your company’s data warehouse working the way you want it to? Are you happy with your dashboards? Do all the sources of data you produce feed into it? If you’re using the ‘old methods’, the answer is ‘probably not’. Sadly, this isn’t going to change anytime soon if your business changes with any pace. Indeed, you likely have a series of overlapping change requests and a reluctance to introduce more until you can have assurance of some progress, or even some semblance of currency.
Rising above the change requests, a more pertinent question is: ‘Does your data warehouse reflect the current reality of your business?’ Certainly not if you are still waiting on those pending changes. In fact, I’ll bet you have probably resigned yourself to all of this lag and labelled it ‘business as usual’. You might, if you are proactive, be looking out for technology solutions, but it need not be like that.

Old dogs, new tricks?

Using just the ‘tried and tested’ approaches, any time you change one thing in your data warehouse there are usually knock-on effects in dependent systems. As a result, you must usually alter (and test!) your data warehouse iteratively until its data model can cope. This happens a lot, and many have resigned themselves to accepting it as the price of change in ‘modern’ data warehouse technology. We know it’s not true.
New regulations, compliance requirements, connected external systems, business mergers, process changes or simply expanding requirements all create a need to modify how you capture, store and analyse your data. In fact the most significant challenge of any data warehouse is to capture change while maintaining continuity with historical data, so that reporting stays consistent and accurate. Sounds impossible? Not really.
Making changes using traditional methods will nearly always require a review of data models, impact assessments and even complete rewrites when new requirements are implemented – it’s all very slow, tedious and sometimes painful. And when you are done, the model changes must be tested to assure that nothing has been broken before moving from staging to live analytics. Weeks turn into months, and it all takes a huge amount of time and effort.

A better approach: Data Vault

It’s complicated keeping up with today’s pace, which is why some clever people came up with the Data Vault discipline. It’s a method of separating the core data structure from the data content. The result is that you can adapt to change quickly and accurately, without suffering the lag times and inaccuracies of traditional data warehouse methodologies. You build the structure ONCE and then hang your data off it in ‘satellites’. Need to change? No problem: just add additional structure without breaking what was already there. If you need to include more data, you simply hang more off the core structure (‘hubs’ and ‘links’).
Data Vault is a methodology, not a technology. It is a way of thinking about data, rather than a shiny new trend. It separates the structure of your data from the changes, simplifying and stabilising your data model so that it becomes a fixed entity into which your raw data is locked and never changes. It is a vault of truth for your business – warts and all.
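
A tiny, hypothetical sketch of what that means in practice (plain Python, invented names): a new requirement arrives as a new satellite placed alongside the existing ones, and nothing already live is altered.

```python
# The structure that is already live and feeding reports:
model = {
    "hubs": ["hub_customer"],
    "links": [],
    "satellites": ["sat_customer_crm"],   # loaded from the existing CRM
}

def add_satellite(model: dict, name: str) -> None:
    """Change is additive: a new source or requirement becomes a new
    satellite. No ALTER statements, no rework of existing loads or reports."""
    assert name not in model["satellites"], "satellite already exists"
    model["satellites"].append(name)

# A new regulation demands extra customer attributes from a new system:
add_satellite(model, "sat_customer_compliance")
print(model["satellites"])  # the old satellite is untouched; the new one sits alongside
```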

You want to learn more about Data Vault?

Find out more from our friendly team of business and technical experts at: info@dfakto.com or +32(0)2.290.63.90.