Thibaut de Vylder, CEO at dFakto, explains to CIO Applications Europe how dFakto’s GDPR solution is a real opportunity for companies. Read the full article here!

The General Data Protection Regulation has raised the data protection bar for companies holding data on citizens of the European Union (EU). At dFakto, however, we believe GDPR is a great opportunity to improve data management and processing, above all in the relationships with data subjects, clients and partners.

In the process of attaining GDPR compliance, many businesses face an unexpected GDPR-related workload that must be tackled promptly after the initial GDPR assessment.

To address these challenges, dFakto offers a data-driven management ecosystem and complete end-to-end products and services that minimise customers’ GDPR compliance issues while letting them focus on their competitive data advantages.

If you want to know more about dFakto’s GDPR solutions, click here!


dFakto is proud to be part of the TOP 10 GDPR solution providers list established by CIO Applications!

The all-in-one GDPR solution does not exist! That is why dFakto has developed an agile and innovative solution based on its recognised expertise in governance and steering!

To be effective, GDPR compliance needs to be assessed, driven at different levels of the company, and measured continuously.

dFakto’s GDPR solution integrates every essential module for a successful evaluation and ensures your compliance over time!

Together with strategic partners, dFakto’s solution integrates the mandatory modules to manage GDPR and the other regulations that affect your business.

By combining recognised expertise with a flexible and scalable solution, dFakto has earned its place in the TOP 10 GDPR solutions!

Check out the interview with dFakto CEO Thibaut de Vylder for CIO Applications.


What is the link between data management and artificial intelligence?

Thibaut de Vylder, CEO of dFakto, answered this question during the DI Summit 2018! Watch his talk on video!

For more than 18 years, dFakto has been supporting companies in the management of their data: from steering their projects to analysing the results, through to the digitalisation of their operations.

Across different business lines, we enable our customers to gain productivity while consolidating the quality data that is essential to running their activities.
This quality data is the fuel that makes AI run!

By creating high value-added indicators and analysing the past to anticipate the future, we are convinced that artificial intelligence can only be truly intelligent when it supports the management and performance of the company.


The General Data Protection Regulation will come into effect on May 25th, which underlines the importance of establishing a strong data culture within each company.

For Thibaut De Vylder, CEO of dFakto, the findings are not encouraging: “Many organizations are still doing gap analysis, and too few have started the actual shift to implementation. It is only the implementation planning and rollout that makes it possible to understand the scope of the work to be performed.”

In this type of regulatory activity there are three obvious phases: “Assessment”, “Implementation” and “Operationalisation”. “Currently, almost all the activity we’re seeing is organisations focusing on the assessment, in particular on the legal analysis of their customer contracts and the right to use the data. At the same time, security specialists analyse the infrastructure, the network and the way the data is stored, while many generalist consultants help produce the gap analyses between the current and target regulatory compliance situation. The net result of all these typical analyses is that each company produces only an inventory of the management of personal data within its activities.”

Beyond the theory

Aligned to these three types of assessment actors, there are tools to support their outcomes, but these are confined to the theory rather than the practice of actionable GDPR. “Hence the interest in moving past this to the action stage,” continues Thibaut De Vylder. “There are implementation guidance solutions, such as GDPR360, that are built for lean simplicity and designed in particular for tracking the tasks to be performed and the associated risks. In addition, controls are carried out on the basis of the data deemed sensitive, which makes it possible to detect non-compliant data and suggest the precise actions needed to bring it within acceptable tolerances.”

The goal is to facilitate and drive a true data culture: the lean nature encourages each employee to contribute, become a “data citizen” in the company and seamlessly take on his or her responsibilities in relation to the data. Further regulations such as ePrivacy will only reinforce the requirement for these responsibilities.

A strong signal

Nevertheless, it is clear that “there will be a transitional period; the initial communication was not necessarily very clear,” says Thibaut De Vylder. “GDPR is good practice and sends the signal that we will no longer be able to do anything without asking people for their consent. The digital image of people is increasingly used for decisions that concern them. Hence the importance of restoring the management of this image to the people in whose hands it belongs.”

Source: Thibaut De Vylder interviewed by Olivier Clinckart translated from http://www.infosentreprendre.be/conseils-it/bien-aborder-le-virage-du-gdpr


Nixxis’s Advanced Contact Suite is a visionary and flexible solution designed to anticipate all future needs of contact centres in terms of customer interactions (phone, email, chat, SMS, social networks). 80% of Nixxis’s customers report up to a 20% increase in productivity and profitability. By promoting dFakto’s GDPR 360 application suite, Nixxis is probably one of the first to turn GDPR compliance into a competitive advantage!

GDPR concerns businesses worldwide that handle private data on EU citizens.

Whether you are a Data “Controller” or a Data “Processor”, you now have obligations and rights towards Data Subjects and Authorities.

As a Data Processor, Nixxis needs to be aligned with its customers’ obligations as Data Controllers, and has therefore decided to implement dFakto’s GDPR 360 to manage the day-to-day operations that maintain GDPR compliance.

A Data Processor may only operate on behalf of, and according to, the Data Controller’s instructions. The Data Processor may suggest process improvements and data quality checks regarding GDPR compliance, but may not implement them without the approval of its Data Controller.

 

When facing GDPR, most organizations are focusing on the assessment rather than on the GDPR implementation & operations.

dFakto is proud to announce a new partnership with Nixxis regarding GDPR compliance.

Those organisations are supported by law firms for their legal framework, security firms for their security frameworks and/or GDPR consultants to produce DPIAs (Data Protection Impact Assessments).

Where are you in your process to be GDPR compliant? Is your assessment done? Have you already set up an operational solution to manage your daily actions once GDPR applies?

The regulation comes into effect in less than 3 months, just 80 days from now! After that, the full force of the law applies.

Before May 25, 2018, performing an impact assessment was adequate, but after this date every bit of Personally Identifiable Information (PII) across the business must be accounted for and managed to avoid very real fines.

This is a game changer. It is no longer adequate to simply have ‘accounting-like’ procedures in place – the actual PII data must be managed and the business is accountable for it directly.

GDPR is fundamentally a fast-evolving data management challenge. The GDPR solution therefore requires an agile data approach.

 

Turn the GDPR obligation into an opportunity and a competitive advantage with dFakto’s GDPR 360!

With more than 17 years of experience, dFakto is a company that specialises in sourcing and collating data so that senior managers can more easily identify actionable insights to drive their business forward.

Discover dFakto’s GDPR 360 operational solution to effectively manage daily activities and keep risk and cost exposure under control.

dFakto provides the best agile GDPR 360 solution and services available on the market, tackling this impactful legal obligation in an innovative, cost-effective, incremental and sustainable way.

Our innovative GDPR 360 solution addresses the GDPR post-assessment implementation & operational challenges.

GDPR is coming fast now. Are you ready?

The dFakto ecosystem provides a complete GDPR value chain, from assessment and implementation to the daily operations that maintain compliance.

Visit our website, download our white paper or contact us to get a demo and a GDPR 360 pricing.

www.dfakto.com

www.nixxis.com


FACING THE NEW EUROPEAN DATA PROTECTION REGULATION, MOST ORGANIZATIONS FOCUS ON THE AUDIT, NEGLECTING IMPLEMENTATION AND ONGOING OPERATIONS. WITH ITS ‘GDPR360’ SOLUTION, DFAKTO OFFERS A MANAGEMENT CAPABILITY FOR THIS NEW REGULATION.

Many organizations are still in the assessment or gap analysis stage. “These are the plans to build the house: a beautiful architectural sketch, but no more,” illustrates Thibaut De Vylder, CEO of dFakto. “Build the foundations, mount the walls, choose the colours and the moving date? No specific plan, no budgets, no people assigned. Everything remains operationally vague once the assessment is complete. With GDPR360, dFakto offers a solution to support the implementation and structuring of the necessary GDPR operations.”

The proposition was born from an observation: in the analysis phase, organizations are most often supported by law firms for the legal framework and by cybersecurity specialists, or even GDPR consultants, to produce assessments and DPIAs (Data Protection Impact Assessments). “In short, so far we produce paper, a lot of paper,” continues Thibaut De Vylder. “And often, I find that these analyses are not very concrete and lack granularity. To do this well, for each processing activity one must at least identify the risks and the actions to reduce them, and allocate responsible persons and deadlines. I come back to the image of the sketch: it is not with a sketch that we will build the walls of the house in which we will live every day. You need an execution plan and good tools to follow up on it!”

For dFakto, the presentation of the GDPR seen in most seminars does not provide an answer. There are principles such as notification deadlines or the right to be forgotten, as well as controls and sanctions, and these are indeed new realities. Unfortunately, it is the idea of risk and constraint that dominates and, therefore, freezes initiatives. Yet getting in line with the GDPR can be a great opportunity to regain control of organisational data collection and its exploitation, improving not only the quality of processing but especially its value. In this sense, the European regulation is a governance project and an opportunity for proper data management.

EMBEDDING GDPR IN A CONTINUOUS IMPROVEMENT CYCLE

GDPR is fundamentally a fast-evolving data management challenge. The GDPR360 solution applies an agile data approach to implement technical or organisational measures, with human responsibilities to define, apply and verify that they are met. This is a central point of data governance, because data governance is not an IT problem, far from it! And so you need an easy-to-use solution. “In GDPR360, we use various intuitive, highly visual indicators, such as weather pictograms,” adds Thibaut De Vylder. “The goal is to get users to adopt the GDPR and embed it within a continuous cycle of improvement.”

Anomaly reports, analyses and dashboards are evaluated and re-evaluated; new tasks emerge as new risks are identified, assessed and communicated in an updated version of the DPIA(s). The progress of the tasks and the evolution of the risks are tracked permanently and continuously. The solution is agile and easily extensible to other data sources and their respective compliance issues; the dynamic architecture ensures that new features will not jeopardise what is already in production.
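To make this cycle concrete, here is a minimal sketch assuming a simple task-and-risk model with a weather-style roll-up; the field names, statuses and thresholds are illustrative assumptions, not dFakto’s actual GDPR360 data model.

```python
from dataclasses import dataclass
from typing import List

# Illustrative sketch only: field names, statuses and thresholds are assumptions,
# not the actual GDPR360 schema.

@dataclass
class RemediationTask:
    description: str
    owner: str            # person responsible
    deadline: str         # e.g. "2018-05-25"
    done: bool = False

@dataclass
class Risk:
    treatment: str        # the processing activity concerned
    severity: int         # 1 (low) .. 5 (high)
    tasks: List[RemediationTask]

    def weather(self) -> str:
        """Roll remaining work and severity up into a weather-style indicator."""
        open_tasks = [t for t in self.tasks if not t.done]
        if not open_tasks:
            return "sunny"       # nothing left to do
        if self.severity >= 4:
            return "stormy"      # high risk with open actions
        return "cloudy"          # lower risk, work in progress

risk = Risk(
    treatment="newsletter mailing list",
    severity=4,
    tasks=[RemediationTask("document legal basis (consent)", "DPO", "2018-05-01")],
)
print(risk.weather())   # "stormy" until the task is closed, then "sunny"
```

Re-running such a roll-up after every DPIA update is what turns a one-off audit into a continuously refreshed indicator.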

Following a quality assessment, the GDPR360 solution can be deployed in less than one month on the basis of a fixed price for 4 data sources (more as an option), followed by a monthly fee covering hosting, licence (SaaS) and support; a budget that can be shared among small businesses. In addition to companies managing the personal data of their customers and suppliers (Data Controllers), dFakto has already brought on board many providers in marketing, human resources and IT services, as well as actors in the world of associations and trustees (Data Processors). And, of course, DPOs, who can operate the solution in as-a-service mode on behalf of their own customers; they can therefore focus their activity on steering and serve several customers in parallel, at scale.

The GDPR360 solution is fully aligned with and complementary to the methodological recommendations of the CNIL (Commission Nationale de l’Informatique et des Libertés) in France, the Commission for the Protection of Privacy in Belgium, the National Data Protection Commission of Luxembourg and, of course, the European Regulation.

The GDPR360 application provides a point-by-point response to GDPR requirements. For example, the requirements for mapping sensitive data, maintaining a processing register and setting up internal procedures to guarantee data protection are among the founding principles of data governance; the priority management system to comply with present and future obligations is covered by the data management strategy, which consists precisely of prioritising and focusing data management efforts; the risk management system, the privacy impact assessment and the documentation needed to prove compliance are also included by default in a data governance approach.
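As an illustration of what “maintaining a processing register” means in data terms, here is a minimal sketch of an Article 30-style record; the field names are assumptions chosen to mirror the regulation’s wording, not GDPR360’s actual schema.

```python
from dataclasses import dataclass, field
from typing import List

# Minimal sketch of one entry in a record of processing activities (GDPR Article 30).
# Field names are illustrative assumptions, not the GDPR360 schema.

@dataclass
class ProcessingRecord:
    purpose: str                          # why the data is processed
    data_subject_categories: List[str]    # e.g. customers, employees
    personal_data_categories: List[str]   # e.g. contact details, payment data
    recipients: List[str]                 # who receives the data
    retention: str                        # how long the data is kept
    security_measures: List[str] = field(default_factory=list)

register = [
    ProcessingRecord(
        purpose="payroll",
        data_subject_categories=["employees"],
        personal_data_categories=["identity", "bank account", "salary"],
        recipients=["payroll provider", "tax authority"],
        retention="7 years after end of contract",
        security_measures=["encryption at rest", "role-based access"],
    )
]
print(len(register), "processing activity recorded")
```

Keeping such records as structured data rather than as documents is what allows them to be prioritised, measured and re-evaluated over time.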

A REAL “CULTURE OF DATA” IN THE COMPANY

To hear Thibaut De Vylder, this is therefore a continuous project: May 25, 2018 is not the finish line but the starting point of a constant process of improvement. The primary mistake would be a “stir the nest once” approach, which has only short-term impact. In summary: you should not ask each service involved in data processing to inventory all of its systems once, compile everything and enact risk-reducing rules once, as quickly as possible. That is a limited approach. At first glance it may seem sufficient, but it has many weak points: a “siloed” view of things, an “audit” that must be repeated periodically (in reality, with every new processing activity as it is imagined, if you want to stay compliant), no analysis of the impact of these changes on the operational data flows and therefore real difficulty in carrying out privacy impact studies. In the end, you will no doubt be within the bounds of the European regulation, but at the cost of significant effort if the work is done manually, without the least return on investment expected from this tedious exercise. Worse: the “repeated audit” approach risks sterilising the company’s data strategy. And it misses the principle of “GDPR by design”.

DFAKTO, SPECIALIST IN GOVERNANCE AND STEERING

Data governance is not just another way to get ready for May 25, 2018. Its benefits extend well beyond the scope of the new EU Data Protection Regulation. Adopting this approach will enable organizations to take an evolutionary approach to the exploitation of personal data. In case of new regulatory developments, it is important to have a sound basis “in the knowledge of personal data, tools and methods” to comply smoothly. There will be no need in the future to start identifying personal data, its location or its path through the systems: it is already done, and kept up to date by design through data governance.

Good governance also helps to instil a true “data culture” in the company and, especially, to move data out of the sphere of influence of IT alone. “A data governance deployment is considered successful if everyone’s behaviour becomes natural and virtuous with respect to data,” says Thibaut De Vylder. This of course goes beyond personal data. The corollary: there is more to data management than having a DPO on one side playing ‘Mr Compliance’ and, on the other, departments that use data in their own isolated way. We need empowered users, ‘data citizens’ at the service of their organisation, who on the one hand use data more efficiently (simplifying processing, automating manual work, reducing costs) and, on the other, can more easily grasp new opportunities for using data in a digital world (impact on sales and revenues, co-creation of new products and services).

TURNING AN OBLIGATION INTO AN OPPORTUNITY

Data governance is a subject in which dFakto has excelled for 17 years. In addition to GDPR360, which responds to GDPR’s post-audit implementation and day-to-day operational challenges, the same application principles can also handle other types of compliance such as ISO 27001, ePrivacy and more. On the same principle, dFakto has also developed Transformation360, a globally recognised solution for managing and analysing complex program and project portfolios, and Client360, which provides a complete ‘client-centric’ view of the truth about customers. “We are specialists in steering cross-functional initiatives in large institutions like BNP Paribas Groupe, with more than 15 major programs run in parallel, some of which have more than 3,400 projects and 1,200 programs monitored at high frequency”, insists Thibaut De Vylder.

GDPR360 B2B VERSION FOR DATA CONTROLLER OR DATA PROCESSOR

With its GDPR360 capability, dFakto offers a real solution to companies acting both as Data Controllers (customers) and as Data Processors (suppliers, for example software publishers that are not yet compliant), both of which have inherited new rights and obligations towards Supervisory Authorities and Data Subjects alike. Thus, all providers who manage private data on behalf of a customer can assume their responsibility in accordance with the legislation and also offer this assurance to their customers. In order to be compliant themselves, many customers are currently verifying the compliance of their subcontractors, and this solution makes it possible to manage several data sources for both parties, even when distributed or relocated.

ESTABLISHING A NEW DATA MANAGEMENT VALUE CHAIN

Starting from this new legislation, different types of actors have positioned themselves to meet the first assessment needs, and dFakto is building a wider ecosystem, an end-to-end solution bringing together several complementary, quality players. dFakto offers its solution to customers both directly and indirectly via resellers, either under its own name or as a white label. For small businesses, a shared (mutualised) implementation is also possible. Finally, DPOs are not left behind: the proposed technology allows them to act as an “augmented DPO”, managing their customers more effectively and providing additional added value by reducing the cost of managing their customers’ compliance.

Source: Thibaut De Vylder interviewed by Alain de Fooz translated from https://www.solutions-magazine.com/dfakto-gdpr360-anticipe-gdpr/

 

 


The financial crash of 2008 inspired the first cryptocurrency: Bitcoin, developed under the pseudonym Satoshi Nakamoto. In simple terms, Bitcoin works as an open-source, public distributed ledger, meaning the data, in this instance a digital currency, is decentralised. Bitcoin was developed to offer an alternative to centralised, government-controlled currencies. But how does Bitcoin work in a safe, transparent and immutable way? By running transactions through a system called blockchain.

Why Blockchain?

Trust and security are central to much of civilisation, especially in finance but also in property ownership, supply chains, contracts and identity records, healthcare, government-citizen interaction and even democratic election processes. Since the Sumerians first inscribed a clay tablet with cuneiform (and probably before), trusted records of reference have been our best recourse for proving what is and what is not, in a quantifiable way, to resolve any potential disputes of opinion. Back with the Sumerians, keeping those clay records safe was probably the job of a king and his army or some similar organisation; similarly, banks emerged to store and keep a record of the monetary assets, like gold, that a person legitimately owned, and this evolved into the issuing of paper money (effectively IOUs) instead of a person carrying around the actual gold.

With Blockchain, all of that changes.

With a complex network of electronic records holding many, many duplicate records for comparison and validation of a single fact – like whether or not you have the ability to pay for something – the shared knowledge of whether something is true or not becomes the trust factor. If someone attempts to change or hack the network, they must effectively change every single duplicate record in order to make a change – the power of blockchain lies in the consensus of the many. The more the merrier.

Encryption on top of the blockchain provides additional security against tampering, leading to an efficient and trusted method of recording the most precious aspects of our lives.

What is Blockchain?

A blockchain is a ledger of digital information in which that information is aggregated into data “blocks”. These blocks are made up of binary code that can represent anything from a value to an image or even a simple sentence. The blocks are then “chained” together through cryptography so that every block connects to the next. As such, a blockchain is built as an increasingly long and ordered string of digital information that has been verified and validated against the previous blocks in the chain. It is a bit like a highly encrypted and verified shared Google document, one in which each entry depends on a logical relationship to all its predecessors. In short, blockchain technology offers a way to securely and efficiently create a tamper-proof log of sensitive activity, anything from international money transfers to shareholder records.
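The “chaining” is easier to grasp with a toy example. The sketch below, in plain Python with a SHA-256 hash, shows how each block embeds the hash of its predecessor; a real blockchain adds consensus, digital signatures and peer-to-peer networking on top of this.

```python
import hashlib
import json
from dataclasses import dataclass

# Toy illustration of hash-chained blocks. Real blockchains add consensus,
# signatures and networking; this only shows the chaining idea.

@dataclass
class Block:
    index: int
    data: str
    previous_hash: str

    def digest(self) -> str:
        payload = json.dumps(
            {"index": self.index, "data": self.data, "previous_hash": self.previous_hash},
            sort_keys=True,
        )
        return hashlib.sha256(payload.encode()).hexdigest()

def build_chain(entries):
    chain = [Block(0, "genesis", "0" * 64)]
    for i, entry in enumerate(entries, start=1):
        chain.append(Block(i, entry, chain[-1].digest()))
    return chain

def is_valid(chain) -> bool:
    """Each block must reference the digest of the block before it."""
    return all(chain[i].previous_hash == chain[i - 1].digest() for i in range(1, len(chain)))

chain = build_chain(["Alice pays Bob 5", "Bob pays Carol 2"])
print(is_valid(chain))   # True: every block still matches its predecessor
```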

How does it work?

As a distributed database, blockchain storage is not limited to a single server. Rather, the totality of the information in the blockchain is simultaneously stored and updated by nodes, defined as any computer connected to the blockchain that processes the information exchanged. An asymmetric (public/private key) encryption system ensures that the owner and the recipient of any piece of transferred information are recognised through the use of the appropriate private keys.

The digital information submitted to the blockchain is processed through a cryptographic hash function. This means that all the digital information inputted is transformed into a fixed-length output that is unique for every input: even if two inputs differ by only one byte, a completely different output is created. The different nodes that interact with the blockchain then verify these fixed-length outputs independently by way of an algorithmic challenge. Whichever computer cracks the code first and verifies that the encrypted information can be validated updates the other nodes working on the problem, which then confirm the validation on their end. Once a majority of nodes have verified the information, the block is consolidated, added to the blockchain and immediately updated across the board. Different batches of data can thus be consolidated through the cryptographic hash function into a single fixed-length output that can be traced back to previous blocks in the chain. This structure is called a Merkle tree.
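The Merkle tree mentioned above can also be sketched in a few lines: leaf hashes are repeatedly combined in pairs until a single root remains, so the root changes if any underlying record changes. This is an illustrative sketch, not any particular blockchain’s implementation.

```python
import hashlib

# Illustrative Merkle root: hash the leaves, then repeatedly hash adjacent pairs
# until one root hash remains. Real implementations differ in detail.

def sha256(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(records):
    level = [sha256(r.encode()) for r in records]
    if not level:
        return sha256(b"")
    while len(level) > 1:
        if len(level) % 2 == 1:          # duplicate the last hash if the level is odd
            level.append(level[-1])
        level = [sha256(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

transactions = ["Alice pays Bob 5", "Bob pays Carol 2", "Carol pays Dan 1"]
print(merkle_root(transactions).hex())
# Changing any single transaction produces a completely different root hash.
```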

Secure and transparent

With every new block being validated independently by different nodes through algorithms and verified against earlier blocks, security is tight and the risk of fraud minimal. If a malicious user wanted to change even a single byte of information in a block, the validation system would spot the change because the fixed-length output would come out differently, making it almost impossible to falsify information.
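The “single byte” claim is easy to demonstrate: a cryptographic hash changes completely when even one character of the input changes, which is what lets validating nodes spot tampering immediately.

```python
import hashlib

# One changed character yields a completely different fixed-length output,
# so re-validation immediately exposes the tampering.
original = "Alice pays Bob 5"
tampered = "Alice pays Bob 6"

print(hashlib.sha256(original.encode()).hexdigest())
print(hashlib.sha256(tampered.encode()).hexdigest())
# The two digests share no visible relationship despite a one-character difference.
```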

Furthermore, as the information is immediately propagated to all nodes once validated, a blockchain is a very transparent system. As the chain grows longer with each block, the information held in each remains available to the blockchain users with the right access key. A blockchain will keep on growing as a result, with previous blocks being an integral part of the chain that cannot be amended. This makes it easy to implement procedures such as audits or digital paper trails in an automated way.

One word of warning: while blockchain technology has proven to be impervious so far, this does not automatically mean full data protection. To avoid data being manipulated, blockchain does need an application layer where data security questions are dealt with; you therefore remain dependent on how well this layer is designed and on the purposes for which you want to use blockchain.

No intermediaries and the concept of smart contracts

As a blockchain is a decentralised system, there are no intermediaries required to relay information from point A to point B. In the case of Bitcoin, for example, this means that money transfers no longer need the banking system to finalise a transfer, as they would in a normal banking network with a state-mandated currency. In fact, the only delay between instruction and release is the time needed for the network of nodes to solve the algorithm and validate the block, which for Bitcoin is about 10 minutes. Said simply, there is no need to trust that intermediaries will do their job, as the process followed by a blockchain is software-based. Nodes work together to transfer, document and safeguard digital information as perfect strangers, with no strings attached.

Through this intermediary-free, peer-to-peer interaction backed by software, it is possible to implement computer protocols known as ‘smart contracts’. These smart contracts are governed by the same cryptographic rules, ensuring the clauses of a contract are implemented and enforced in a timely and automated manner.
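In code terms, a smart contract is simply deterministic logic that every node executes identically, so the outcome cannot be disputed. The deliberately simplified escrow-style sketch below only illustrates the idea; real smart contracts run on platforms such as Ethereum and are written in dedicated on-chain languages.

```python
from dataclasses import dataclass

# Deliberately simplified "smart contract": deterministic rules that every node
# can execute and verify identically. Only an illustration of the concept.

@dataclass
class Escrow:
    buyer: str
    seller: str
    amount: float
    goods_delivered: bool = False
    released: bool = False

    def confirm_delivery(self):
        self.goods_delivered = True

    def release_funds(self) -> str:
        # The clause is enforced by code: funds move only once delivery is confirmed.
        if self.goods_delivered and not self.released:
            self.released = True
            return f"{self.amount} transferred from {self.buyer} to {self.seller}"
        return "conditions not met, funds stay in escrow"

contract = Escrow(buyer="Alice", seller="Bob", amount=100.0)
print(contract.release_funds())   # conditions not met, funds stay in escrow
contract.confirm_delivery()
print(contract.release_funds())   # 100.0 transferred from Alice to Bob
```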

Blockchain applications

The blockchain system, though first popularised by Bitcoin, has many other real-world applications. Information can be shared much more securely over the Internet than in the traditional way (Excel files, EDI, etc.), thanks to blockchain cryptography, which requires anyone who wants to access the data to hold the correct token.

Already, large transport companies like Maersk have started using blockchain solutions to simplify their shipping procedures. But many other fields could benefit from adopting blockchain-based data-management solutions.

  • Governments could solve many problems relating to the silo approach different departments still adhere to. For example, automatically linking and validating citizen data using blockchain technology could simplify different civic services through a unified, decentralised database. Using a blockchain, it would also be possible to better control government expenditure in real time and combat corruption, as the myriad of intermediaries involved in budgeting would be rendered obsolete.
  • Healthcare could benefit by regulating access to sensitive data using keys, letting different industry bodies, from pharmaceutical companies to hospitals and insurance companies, manage patient information. Say a patient’s blood results showed a deficiency: uploading that data could automatically alert an insurance or medical service provider and start a procedure to get the adequate treatment to the patient.
  • Blockchains and smart contracts could help ensure compliance protocols are adhered to and that any transfer of information or goods is done in accordance with the governing legislation, again in a decentralised and impartial way.

As the world’s computing power continues to increase, and thus the calculating capabilities of nodes evolve, more and more data will be stored in a decentralised public ledger of sorts, increasingly ensuring immutability of data, transparency and impartiality along the way.



As the rules of business are being rewritten on a quarterly basis and available data grows exponentially, organisations are feeling the need to become data-driven.

They want short-term results using data today and to produce new value propositions for tomorrow. They also want to decide using facts and analysis instead of gut feelings.

They are looking for game-changing approaches to make this happen, such as:

  • a data-driven framework that encapsulates existing and future technologies and copes with upcoming challenges (GDPR, AI…)
  • the new data-driven management skills required to implement, run and maintain these solutions
  • doing all of this in a cost-effective, incremental and sustainable way

During the Data Summit, which took place on 17 November 2017 in Brussels, Thibaut De Vylder shared how we have used these approaches to drive innovative data-driven solutions in organisations, independent of their size, complexity or industry.


Joana Schmitz, Patrick Esselinckx and Arnaud Briol of dFakto were recently interviewed by Arnaud Martin, an independent journalist for the newspaper Le Soir, Références, and Trends-Tendances.

In this interview, they share their secrets for success as well as the difficulties SMEs have in finding their place next to large companies, especially when it comes to finding new collaborators.

It’s not all bad news however, as SMEs offer plenty of advantages to attract new candidates (like Arnaud Briol, for example).

Read the entire interview published on Saturday October 7, in the newspaper supplement “Le Soir”.

Also consider visiting our Jobs & Careers page if, like Arnaud, you choose to grow with a company rather than in a company: https://www.dfakto.com/jobs/


You have reviewed the latest reports and the numbers are close, but something has changed? You are tempted to review the data model to try and correct the perceived error? But you know that it is going to take some time to trace, and you have no guarantee of finding an answer. Perhaps someone has changed something in the model, somewhere, and it has rippled out? Maybe it’s just a small change? Who knows?

This is not an uncommon scenario. The fragility of old data modeling approaches, treated as data stores, has often resulted in pesky errors being introduced into systems.

This is usually not the fault of the people managing the data but an inherent problem with the way that the old methods were designed.

They were not intended for the rapid change and transformation that we experience in today’s business environments. If you change something as innocuous as a field name, or add something like a new field to a dimension, you will have to cope with anomalies possibly appearing. Changes that the requesting department thinks are small will, for another department, mean the kind of impact on their figures that they really don’t need to be worrying about this quarter.

It’s a moving feast and sometimes, you can’t please anyone!

Times are changing; data modeling, too.

In yesteryear, very clever computer scientists designed methodologies for data capture that were the leading technology of their day. Those days were typically slower-paced and less changeable. Expectations were that change would take some time, time that wasn’t as commoditised as it is now. If a system took a few months to get the data modelling right before it was useful, well then, they had better take that extra time.

This is unthinkable these days, yet the same models are being used!

These old methods are still useful, but underneath we now need something more. They are useful as ‘Data Marts’ in our new world, repurposed for what they are good at: capturing a snapshot view of data. However, the data itself needs to be held in structures that are more suited to, and built for, change!

Yet you will still see many attempts to persist with this old way of thinking. Star schemas and 3NF architectures are at breaking point in business environments: they are pushed to analyse history and are constrained by regulations like GDPR. They have their place, but it is not in high-performing data warehouses. These older data architectures are more suited to stability and consistency than to evolving history and change capture!

Decompose structure to master change in data modeling

HUB

  • Data Vault separates the structure of data from the myriad of data sources that are attached to it.
  • The model is fixed and never changes. So is the data that you attach to it.
  • Once added, you cannot remove it. Initially this concept sounds restrictive; however, the intention of the Data Vault is to ‘capture’ data and hold it in a fixed state, and the trade-offs are profound.
  • It pulls data from multiple sources around a single reconcilable set of identifiers called a ‘Hub’ (e.g., a business entity, like a customer or product).
  • You can attach as many sources as you like, because the ‘Hub’ is a central point of management.
  • This becomes ideal if you are looking to understand discrepancies in your data while keeping a system of record. Master Data is also a possibility, where you can compare and contrast each source into a derived ‘golden record’ further on (see the sketch after this list).
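A minimal sketch of a Hub with attached source records (“satellites” in Data Vault terms), assuming simple in-memory Python structures; real Data Vaults are relational tables with hash keys, load dates and record sources, so treat this only as an illustration of a fixed business key that many sources attach to.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import Dict, List

# In-memory sketch of a Data Vault Hub with attached source records.
# Illustration only: real implementations are append-only database tables.

@dataclass
class Hub:
    business_key_name: str                          # e.g. "customer_id"
    satellites: Dict[str, List[dict]] = field(default_factory=dict)

    def attach(self, business_key: str, source: str, attributes: dict):
        """Append-only: new facts are added, nothing is ever updated or removed."""
        self.satellites.setdefault(business_key, []).append({
            "source": source,
            "loaded_at": datetime.utcnow().isoformat(),
            **attributes,
        })

customers = Hub("customer_id")
customers.attach("C001", "crm", {"name": "Acme SA", "country": "BE"})
customers.attach("C001", "billing", {"name": "ACME", "vat": "BE0123456789"})
print(customers.satellites["C001"])   # two source views of the same customer, side by side
```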

LINKS

  • ‘Links’ form the second part of the core structure of a Data Vault, and these are where the flexibility and agility come into play.
  • You can have different teams working on different ‘Hubs’ that are unaware of each other if need be.
  • They may be working on data cleansing or master data, or whatever.
  • You can keep them separate by design or by schedule, and still hold it all together by building the links separately.
  • The Links are effectively ‘many-to-many’ tables, so the relationship scenarios are whatever you choose to make them. There are no constraints, as long as the business entities are well thought out (see the sketch after this list).
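A Link can be sketched as a plain many-to-many association between hub keys; again this is an illustrative in-memory structure rather than actual Data Vault tables.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

# Illustrative many-to-many Link between two hubs (e.g. customers and products).
# In a real Data Vault this is a link table keyed on the related hubs' keys.

@dataclass
class Link:
    name: str
    pairs: List[Tuple[str, str]] = field(default_factory=list)

    def relate(self, left_key: str, right_key: str):
        self.pairs.append((left_key, right_key))   # append-only, like the hubs

orders = Link("customer_orders")
orders.relate("C001", "P042")
orders.relate("C001", "P007")
orders.relate("C002", "P042")

# Teams loading customer data and product data never touch each other's hubs;
# the relationship is held separately in the link.
print([pair for pair in orders.pairs if pair[0] == "C001"])
```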

BUSINESS VAULT

If you want to clean and enrich data, derive data, build business rules and data quality checks, or even build out your ‘golden record’ Master Data, that happens in another stage called the ‘Business Vault’. The Data Vault itself remains a single source of unchanging truth, warts and all.
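As an illustration of the Business Vault idea, the sketch below derives a ‘golden record’ from the raw, untouched source records without ever modifying them; the source-precedence rule is an assumption made for the example.

```python
# Illustrative "Business Vault" rule: derive a golden record from the raw,
# append-only source records without modifying them. The precedence order
# below is an assumption made for the example.

raw_records = [   # as captured in the Data Vault, warts and all
    {"source": "crm",     "name": "Acme SA", "country": "BE", "vat": None},
    {"source": "billing", "name": "ACME",    "country": None, "vat": "BE0123456789"},
]

precedence = ["billing", "crm"]   # business rule: trust billing first, then CRM

def golden_record(records, order):
    merged = {}
    for source in reversed(order):               # lower-priority sources first...
        for rec in records:
            if rec["source"] == source:
                merged.update({k: v for k, v in rec.items() if v is not None})
    merged.pop("source", None)                   # ...so higher-priority values overwrite them
    return merged

print(golden_record(raw_records, precedence))
# {'name': 'ACME', 'country': 'BE', 'vat': 'BE0123456789'}
```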

There are benefits to this approach:

  • You know that what is in your data warehouse is truly a historical record;
  • It is an audit-able trail of consistency in your business;
  • You can derive an unlimited number of Data Marts from it that will be absolutely consistent over time;
  • If you build sympathetic business rules, they will also be consistent with each other;
  • The reports and analyses that you conduct on this data will remain consistent over time, even if you add more data, as nothing is EVER deleted from a Data Vault, unless it is specifically designed to do this by regulatory constraint.

In conclusion, the Data Vault is built from the ground up to manage growth, while maintaining consistency.

The magic happens because of ‘separation of concerns’.

Or find out more from our friendly team of business and technical experts at info@dfakto.com or +32(0)2.290.63.90.